## EPS-HEP

Today is the first day of the EPS-HEP conference in Stockholm, the largest particle physics conference of the year. In recent years such conferences have been awaited with great anticipation because of the prospect of new results in the latest LHC and Tevatron reports, but this year things are a little more subdued. We will have to wait another two years before the LHC restarts and we can again follow every talk expecting the unexpected. Perhaps there will be some surprises in a late LHC analysis or something from dark matter searches, but otherwise this is just a good time to look back and ask: what have we learned so far from the LHC?

## Nightmare Scenario

The answer is that we have learnt that the mass of the Higgs boson is around 125 GeV and that this lies near the minimum end of the range of masses that would allow the vacuum to be stable even if there are no new particles to help stabilize it. Furthermore, we do indeed find no evidence of other new particles up to the TeV range and the Higgs looks very much like a lone standard model Higgs. Yes, there could still be something like SUSY there if it has managed to hide in an awkward place. There could even be much lighter undiscovered particles such as those hinted at by some dark matter searches, if they are hard to produce or detect at colliders, but the more obvious conclusion is that nothing else is there at these energies.

This is what many people called the “nightmare scenario” because it means that there are no new clues that can tell us about the next model for particle physics. Many theorists had predicted SUSY particles in this energy range in order to remove fine-tuning and have been disappointed by the results. Instead we have seen that the Higgs sector is probably fine-tuned by at least some small factor. If no SUSY is found in the next LHC run at 13 TeV then it is fine-tuned at about the 1% level.
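For readers who want to see where percent-level numbers like this come from, here is the standard back-of-envelope estimate (a sketch of the usual naturalness argument, not a precise calculation): the top-quark loop shifts the Higgs mass-squared by an amount that grows with the scale Λ of new physics, and the fine-tuning is measured by the ratio of that shift to the observed value.

```latex
\delta m_H^2 \sim \frac{3 y_t^2}{8\pi^2}\,\Lambda^2 ,
\qquad
\Delta \sim \frac{\delta m_H^2}{m_H^2}
\approx \frac{3}{8\pi^2}\,\frac{\Lambda^2}{(125\ \mathrm{GeV})^2}
\quad (y_t \approx 1)
```

Pushing Λ (roughly the superpartner mass scale) up to around 5–6 TeV, as a null result in the 13 TeV run would do, gives Δ of order 100, i.e. tuning at roughly the 1% level.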

## Fine-tuning

Many physicists dislike fine-tuning. They feel that the laws of physics should be naturally derived from a simple model that leaves no room for such ambiguity. When superstring theory first hit the street it generated a lot of excitement precisely because it seemed to promise such a model. The heterotic string in particular looked just right for the job because its E8 gauge group is the largest exceptional simple Lie algebra and it is just big enough to contain the standard model gauge group with suitable chiral structures. All they needed to do was figure out which Calabi–Yau manifold could be stabilised as a compactification space to bring the number of dimensions down from 10 to the 4 space and time dimensions of the real world. They would then quickly see how the symmetry gets broken and the standard model emerges at low energy, or so they hoped.

The problem is that there has been evidence for fine-tuning in nature for a long time. One of the earliest known examples was the carbon resonance predicted by Hoyle at precisely the right energy to allow carbon to form in stellar nucleosynthesis. If it were not there the cosmos would not contain enough carbon for us to exist. Hoyle was right and the resonance was soon found in nuclear experiments. Since then we have realized that many other parameters of the standard model are seemingly tuned for life. If the strong force were slightly stronger, two neutrons would form a stable bound state, providing a simple form of matter that would replace hydrogen. If the cosmological constant were much larger, galaxies would not have formed; if it were negative and large, the universe would have collapsed before we had time to evolve. There are many more examples. If the standard model had fallen out of heterotic string theory as hoped, we would have to accept these fine-tunings as cosmic coincidences with no possible explanation.

## The Multiverse

String theorists did learn how to stabilize the string moduli space but they were disappointed. Instead of finding a unique stable point to which any other compactification would degenerate, they found that fluxes could stabilize a vast landscape of possible outcomes. There are so many possible stable states for the vacuum that the task of exploring them to find one that fits the standard model seems well beyond our capabilities. Some string theorists saw the bright side of this. It offers the possibility of selection to explain fine-tuning. This is the multiverse theory that says all the possible states in the landscape exist equally and by anthropic arguments we find ourselves in a universe suitable for life simply because there is no intelligent life in the ones that are not fine-tuned.

Others were not so happy. The conclusion seems to be that string theory cannot predict low energy physics at all. This is unacceptable according to the scientific method, or so they say. There must be a better way out, otherwise string theory has failed and should be abandoned in favor of a search for a completely different alternative. But the string theorists carry on. Why is that? Is it because they are aging professors who have invested too much intellectual capital in their theory? Are young theorists doomed to be corrupted into following the evil ways of string theory by their egotistical masters when they would rather be working on something else? I don’t think so. Physicists did not latch onto string theory just because it is full of enchanting mathematics. They study it because they have come to understand the framework of consistent quantum theories and they see that it is the only direction that can unify gravity with the other forces. Despite many years of trying, nothing else offers a viable alternative that works (more about LQG in another post).

Many people hate the very idea of the multiverse. I have heard people say that they cannot accept that such a large space of possibilities exists. What they don’t seem to realize is that standard quantum field theory already offers this large space. The state vector of the universe comes from a Hilbert space of vast complexity. Each field variable becomes an operator on a space of states and the full Hilbert space is the tensor product of all those spaces. Its dimension is the product of the dimensions of all the local spaces and the state vector has a component amplitude for each dimension in this vast multiverse of possibilities. This is not some imaginary concept. It is the mathematical structure that successfully describes the quantum scattering of particles in the standard model. The only significant difference for the multiverse of string theory is that many of the string theory states describe different stable vacua, whereas in the standard model the stable vacua are identical under gauge symmetry. If string theory is right then the multiverse is not some hypothetical construct that we cannot access. It is the basis of the Hilbert space spanned by the wavefunction.
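To put a rough number on “vast complexity” (a toy illustration, not a claim about any particular field theory): dimensions multiply under the tensor product, so even a model of N two-state variables spans an exponentially large space.

```latex
\mathcal{H} \;=\; \bigotimes_{i=1}^{N} \mathcal{H}_i ,
\qquad
\dim \mathcal{H} \;=\; \prod_{i=1}^{N} \dim \mathcal{H}_i \;=\; 2^N
\quad \text{(if each } \dim \mathcal{H}_i = 2 \text{)}
```

Already N = 300 gives 2^300 ≈ 10^90 basis directions, each with its own amplitude, and a quantum field contributes such a factor at every point of space.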

Some skeptics say that there is no such fine-tuning. They say that if the parameters were different then life would have formed in other ways. They say that the apparent fine-tuning that sets the mass of the Higgs boson and the small size of the cosmological constant is just an illusion. There may be some other way to look at the standard model which makes it look natural instead of fine-tuned. I think this is misguided. During the inflationary phase of the universe the wavefunction sat in some metastable state where the vacuum energy produced a huge effective cosmological constant. At the end of inflation it fell to a stable vacuum state whose contribution to the cosmological constant is much smaller. Since this is a non-symmetrical state it is hard to see why opposite-sign contributions from bosons and fermions would cancel. Unless there is some almost miraculous hidden structure the answer seems to be fine-tuned. The same is true for the Higgs mass and other finely tuned parameters. It is very hard to see how they can be explained naturally if the standard model is uniquely determined.

People can complain as much as they like that the multiverse is unscientific because it does not predict the standard model. Such arguments are worthless if that is how the universe works. The multiverse provides a natural explanation for the unnatural parameters of physics. We do not say that astrophysics is unscientific because it does not give a unique prediction for the size and composition of the Sun. We accept that there is a landscape of possible stellar objects and that we must use observation to determine what our star looks like. The same will be true for the standard model, but that does not stop us understanding the principles that determine the landscape of possibilities or from looking for evidence in other places.

## What does it mean for life in the universe?

If the landscape of vacua is real and the world is naturally unnatural, it may take many centuries to find convincing evidence, but it will have consequences for life in the universe. If you think that life arises naturally no matter what the parameters of physics are, then you would expect life to take a very diverse range of forms. I don’t just mean that life on Earth is diverse in the way we are familiar with. I mean that there should be different solutions to the chemistry of life that work on other planets. On Earth there is just one basic chemistry, based on DNA and RNA. This also includes the chemistry of metabolism, photosynthesis and other biochemical processes without which life on Earth would be very different. If we find that all higher lifeforms on other planets use these same processes then we can be sure that physics is fine-tuned for life. If any one of them did not work there would be no life. Either this fine-tuning must arise naturally from a multiverse, or we would have to accept that the existence of life at all is an almost miraculous coincidence. If on the other hand we find complex lifeforms based on molecules unlike DNA and supported by completely different mechanisms, then the argument for fine-tuning in nature is weaker.

Theorist Nima Arkani-Hamed recently suggested that it would be worth building a 100 TeV hadron collider even if the only outcome was to verify that there is no new physics up to that energy. It would show that the Higgs mass is fine-tuned to one part in 10,000 and that would be a revolutionary discovery. If it failed to prove that, it would instead find something less exciting, such as SUSY. I don’t think this argument will raise the funding required, but if the LHC continues to strengthen the case for fine-tuning we must accept the implications.

**Update 29-Jul-2013**: I am just updating to add some linkbacks to other bloggers who have followed up on this. Peter Woit takes the usual negative view about what he continues to call “multiverse mania” and summarised my post by saying “Philip Gibbs, … argues that what we are learning from the LHC is that we must give up and embrace the multiverse.” To respond, I don’t think that recognising the importance of the multiverse constitutes giving up anything except the failed idea of naturalness (in future I will always couple the word “naturalness” with the phrase “failed idea” because this seems to be a successful debating technique). In particular, phenomenologists and experimenters will continue to look for physics beyond the standard model in order to explain dark matter, inflation etc. People working on quantum gravity will continue to explore the same theories and their phenomenology.

Another suggestion we see coming from Woit and his supporters is that the idea that physics may be unnaturally fine-tuned is coming from string theory. This is very much not the case. It is being driven by experiment and ordinary TeV scale phenomenology. If you think that the string theory landscape has helped convert people to the idea you should check your history to see that the word “landscape” was coined by Lee Smolin in the context of LQG. Anthropic reasoning has also been around since long before string theory. Of course some string theorists do see the string theory landscape as a possible explanation for unnaturalness but the idea certainly exists in a much wider context.

Woit also has some links to interesting documents about naturalness from the hands of Seiberg and Wilczek.

Lubos Motl posted a much more supportive response to this article. He also offers an interesting idea about fermion masses from Delta(27), which I think tends to go against the idea that the standard model comes from a fine-tuned but otherwise unspecial compactification drawn from the string landscape. It is certainly an interesting possibility though, and it shows that all philosophical options remain open. Certainly there must be some important explanation for why there are three fermion generations, but this is one of several possibilities, including the old one that they form a multiplet of SO(10) and the new one from geometric unity.

Interesting article. Nature is subject to its own laws, which we are still unravelling. It is impossible to walk on water unless one is a water beetle; there are no miracles – or none reported that I am aware of, except for the religious propaganda stories.

If we exist (life forms here on this planet) it is logical to assume nature is using the same laws on other planets, assuming the physical conditions permit it to occur. I am pretty sure Nature does not have different rules in different parts of the universe.

The world is full of string theorists as it is a popular subject in which to receive a grant and write a PhD paper.

Yes I am with you on the subject of searching for new ideas; there is no Nightmare Scenario, as one of the obvious areas to research has never really been considered. That is, we are all completely confused regarding the association and relationship between matter and the 3 spatial dimensions. Let’s think on the idea that the universe only has 1 dimension, but just happens to contain a very small amount of matter which we then measure as having the 3.

Thanks for this informative, entertaining, and very nice to read text Phil :-)

Concerning the “multiverse” I have no problems if one just wants to name the mathematical space of possible solutions to ST (there is absolutely nothing wrong with this being large!) by this term. But I think the points in this space of solutions should not too literally be interpreted as additional universes that exist somewhere …

Cheers

In my view you go too far if you want to deliberate about the results of contemporary physics. In the early days of quantum physics its foundation guided it in a direction that differed significantly from the direction that is now taken by QED and QCD. In those days physicists and mathematicians worked together in order to define a basis that could serve as a foundation of physics as a whole. Von Neumann and Birkhoff suggested using what they called quantum logic as a foundation. The set of quantum logic propositions is lattice isomorphic with the set of closed subspaces of a separable Hilbert space. This presents only a partial isomorphism, and the resulting difference has never been well comprehended. However, this lack can be repaired by a slight refinement of quantum logic. Another outcome is that neither quantum logic nor the Hilbert space is naturally equipped to represent dynamics. Both structures can only represent a static status quo. This does not forbid constructing dynamic models by using an ordered sequence of such static sub-models. However, that is not the way the physics community went.

The suggested foundation classifies quantum physics as fundamentally countable. This contradicts the need in physical theories for the availability of continuums. That need is supplied by the existence of a Gelfand triple for each separable Hilbert space.

Instead the physics community took a shortcut, skipped the notion of a separable Hilbert space and concentrated on continuous function spaces. The community also ignored the results of Constantin Piron and Solèr, who discovered that the number systems for specifying the inner products of a Hilbert space must be taken from a suitable division ring. This poses quaternions as the most elaborate number system. Among physicists little is known about the peculiarities of quaternionic number systems.

These facts are all ignored by string theorists and LQG supporters.

Contemporary theories are heavily based on gauge transformations, while in quaternionic theories gauge transformations are only allowed in exceptional cases.

Probably the physical community ran past itself when it tried to pursue quick results.

The notion that a monoverse is more ontologically parsimonious than a multiverse has always struck me as philosophically obtuse. There is always a “why” when choosing between possible alternatives, whereas the symmetry between possible alternatives is unbroken if all alternatives are realized. Answers to “why” questions require axioms; the “why”‘s never end, and demand ontological excess. The fact that nature is not complex enough to intelligently choose to reify itself uniquely among possible alternatives is already manifest in quantum mechanics, and should be a strong hint to opponents of a multiverse that their philosophical intuition is suspect.

Why would any rational scientist pay the slightest bit of attention to what Mr. Arkani-Hamed says anymore?

Like Kaku, he has become a stand-up comic.

Such a “natural” reduction occurs in our mental organization of models of the world. What is on the other side of the Looking Glass, quantum mechanics, new discoveries in thermodynamics, exponentiation as hyperbolic geometry. Perhaps, depending on what sort of bias a person has to pursue a truth or theory, there are simpler explanations that seem so close to any of the traditional paths. Of these the comprehension of it is something that we may still be blind to see and is the source of self deceptive myths. We in fact need more than the geometry of string theory and the paradox on which we have erected logically the calculus of variations. More than the use of groups and exceptional groups that were supposedly unified in the brane theories. Can our better machines prove within (the parallel description by the statistical side of things) some accuracy anything about the nonexistence of a particular theory? For a start with new mirrors, complex models or not, it should be clear and common knowledge that in what we consider measures, constants and values, involves the subtle distinction between what is continuous and what is discrete.

This reduction in theory making, an ultimate faith in the sensibility of matching theory to the experimental facts, finely tweaking the models or not, is a subjective form of higher supersymmetry breaking – for in such simplistic physics, like the six colors of a tetrahedral Rubik’s cube without a mirror, one can by a few moments of chance solve it even blind.

The reason the mass is at 127 GeV is that the interactive stresses in the components in space (space cannot be empty) create an expansive mode stress of 3.6E-25 kg/cu.m. It cannot go any lower because the maximum compressive stress density is 4.5E+35 kg/cu.m. That product leaves Newton’s gravity constant as a ratio with this instantaneous field potential. It’s an axiomatic derivation. Study it carefully on the website www dot kapillavastu dot com under the PHO.PDF file and you will see why physics is still stumbling despite the 27 km CERN tunnel!!! Laws of the Universe are based on axioms, not man’s left-brained derivations.

It’s starting to seem like every science blog or news story I go to, there’s Robert L. Oldershaw with something uninsightful or misunderstood or fantasised. Perhaps someone should have a quiet word.

Apologies, Philip. This is an excellent post.

I wish I could be around to see developments over the next century or more. I’m intrigued by the lack of SUSY, and simply clueless as to what will happen next. There are so many frontiers where anything or nothing could happen over the next 20 years.

Hopeful for some surprises in neutrino / DM searches, though.

If the answers to the BIG questions are that the fundamental structure of the cosmos was decided in regimes that are in principle out of reach, then so be it. It’s not up to us to claim that we can keep on probing deeper indefinitely or until we find the answer – all we can do, if we want to keep ourselves honest, is to keep on trying to look and to live with whatever comes.

Yes, as soon as certain BSM keywords or cosmological topics are mentioned in any physics blog, news article, or more generally anywhere in the internet, you can bet a large amount of money that RLO will pounce on it FTL to do what he always does … :-D. There is a technical term for this behavior starting with t*******, but I forgot it :-/

I have stopped reading these comments because they are too self similar or scale invariant respectively, and the Amazon feature that lets one ignore certain commenters would be helpful in physics blogs and below news articles etc too …

There is no reason to tear our hair out over fine-tuning. If the Higgs potential has no mass term, all will settle down fine. But we have to wait for the restart in 2015 to be sure.

I would not be surprised if at the end of the road, Dark Matter Black Holes and Dark Energy Higgs particles rule a cyclic raspberry shaped Multiverse.

So people who do not see things the right way and are trouble-makers should be shown the instruments of readjustment and given the “quiet word”?

You forget that free speech and thought are still freedoms in science.

I think the high priests and prophets of theoretical physics are clowns running around in clown suits, bopping each other over the head and honking their own oversized horns.

Meanwhile the credulous sycophants in the peanut gallery view it as highbrow science when it is really lowbrow comedy.

So far the LHC supports my view and history will bear it out even more completely. Deal with it!

“Many people hate the very idea of the multiverse. I have heard people say that they cannot accept that such a large space of possibilities exists. What they don’t seem to realize is that standard quantum field theory already offers this large space. The state vector of the universe comes from a Hilbert space of vast complexity. Each field variable becomes an operator on a space of states and the full Hilbert space is the tensor product of all those spaces. Its dimension is the product of the dimensions of all the local spaces and the state vector has a component amplitude for each dimension in this vast multiverse of possibilities. This is not some imaginary concept. It is the mathematical structure that successfully describes the quantum scattering of particles in the standard model.”

Phil, are you serious? And how about cluster decomposition? How about decoherence? There is no wave function of the Universe; it is not serious!

The sheer quantity of it, though…

and the cheeeesiness…

And the the amount of uninsightful et cetera.

By the findings made by the LHC, supersymmetry and superstrings are just manipulations of the human mind; correct?

The strings in 4-dimensional manifolds are possible. The Kähler–Einstein metrics are stable constant curvatures. Then is the time connected to the space in spacetime part of the smooth (topological) curvatures indicated by Donaldson?

Each particle vibrates at only one frequency. Then each particle is generated by a different spacetime continuum. Then the spacetime has the metric given by the conjugation of particles and antiparticles, each one transforming into the other. This gives spacetime as a measure of the energy of each particle. That is, the relativistic velocity is the measurement of the spacetime continuum, which demonstrates that there is a connection between continuity and discreteness.

Then the particles and antiparticles that are generated by the breakdown of rotational invariance (left- and right-handed spins) are the strings with different vacua states, with vector states that are linearly independent, or approximations to nonlinear fields with infinitely many variables that are the distinct energy fields.

Not living in (generated by) different “spacetime continuities”, different fields.

I wonder if the Monster Group stands a chance in the other 10^500 – 1 Universes? Would this be a nondeterministic randomization of physics values only with Maths unaffected? Could 10^500 be some kind of super-sized symmetry group?

I agree that the article is a good exploration of currently fashionable ideas in theoretical physics and related problems.

I’ve got a major problem with the “fine-tuning/multiverse” crap that takes up a lot of space in the article.

If life evolves to fit nature (i.e., the cart is put AFTER the horse’s ass) then all this “fine-tuning” and “multiverse” rationalizing is the product of twisted pretzel logic.

I cannot believe that mature scientists could be sucked into the “fine-tuning” arguments. Is it not more reasonable to assume that there is one fully unified Universe, with one set of principles/laws, and life evolves to fit those prior constraints?

Not more reasonable, no. And seeing that it is less parsimonious, likely less reasonable.

Does OM mean “obviously mad”?

I think fine-tuning is only needed in the current state of physics, i.e. there is no accepted TOE around.

I don’t think DNA is an example of a fine-tuning requirement. First, even regular DNA can be left- or right-twisted, and the genetic code is probably arbitrary. Second, there are many variations on the DNA theme that use different sugars and stuff. Also there could be systems so totally different that we have no clue. After all, if we didn’t have DNA to study would we have invented it? Along with all the enzymes without which it would be useless? There is no reason to think, and I very much doubt, that there is anything universal about DNA.

I think this highlights the danger of thinking something is fine tuned. Very often it is just a failure of imagination. It would be ironic if there were a bazillion string theory universes and life existed in all of them.

The logic of a universe that in a sense changes and fine-tunes itself, its constants on the whole remaining the same, as in the quantum world in the main existing more than not existing, is the same as the logic of DNA in its general range of forms and expression. A cosmic code or natural code analogous to DNA, RNA, proteins and so on, somewhere between our ideas of universe or multiverse – or a multiverse of string landscapes as one landscape in totality, statistical or absolute or not.

Perhaps we should consider all as speculation until we put thermodynamics and its symmetry on firm ground beyond its first two laws (and the zeroth law if you want to include it) so even to understand any such laws beyond the third and how far this may be extended.

Fine tuning, that is if we accept the more mass the more compact the object, would be observable in the case where such ideas may show the expansion of space is not the explanation as recently proposed but a focusing of more massive atoms. Did not Weyl consider this (an idea Einstein doubted as unlikely)? Are we to fine tune our theories or does our vision of them become more dense – that is more spaced out?

DNA, or rather its predecessor RNA, is an excellent example of fine-tuning, see my longish comment below.

For DNA there are alternatives, but for metabolic reasons DNA is a likely pathway (an easily arrived-at metabolite of RNA, and more chemically stable to boot).

A nice essay, Phil. A few words of mine about it:

http://motls.blogspot.co.uk/2013/07/naturalness-and-lhc-nightmare.html?m=1

Very good thanks

Lubos, of the stances of Bohr, Einstein, Bohm, and Everett, in the multiverse of quantum interpretations, by what unnaturally natural higher physics can you assert and discern who has the crackpot stance?

Very good article. Well done Phil, good to see you taking the fine tuning issues head on, and treating them as clues, instead of sweeping them under the carpet, as many do. Oldershaw and others who say that life develops to fit whatever universe exists have simply not allowed for the nature of life as we know it – it is highly complex, and has complex and very unlikely requirements. The vast majority of universes arising from random sets of laws would not provide anything approaching those.

It’s one thing to talk about a range of possibilities, but it’s a very big step to say that they all have to exist, just to remove the fine tuning problem.

Looking at part of the quote Vladimir Kalitvianski commented on:

“This is not some imaginary concept. It is the mathematical structure that successfully describes the quantum scattering of particles in the standard model.”

There is nothing like enough of a solution to the puzzles of QM to say that we have an explanation that neatly removes both problems. People find Everett’s interpretation of quantum theory attractive for exactly that reason: they hope to be able to get rid of both the fine tuning and the difficulty interpreting QM with a single idea. But neither aspect of this has any solid physics underneath it, and that approach is in danger of looking like sweeping the problems under a different and more sophisticated carpet.

One way to tackle both problems is by resolving real physical mechanisms behind gravitational, EM and strong interactions. There is one common aspect: spinning. With spinning there is no need for strict margins for life, and on the other hand a spinning mechanism resolves current anomalies in physics. You can find more on that from my site (toebi.com).

Most promising (and scariest) aspect of spinning based theory is understanding and usage of antimatter.

Life evolves to become complex at times, but it didn’t start out that way.

What is needed is an arrow of time, a free energy source and something like the CHNOPS elements (which are homologous with the most frequent elements, only C is the chemical engine there) in a liquid regime (preferably water).

Fusion energy, CHNOPS, and liquids derive from the SM, so we are back to arguing about fine-tuning, not complexity.

It was always fine tuning, not complexity – “complex and unlikely requirements” was in answer to a point from Oldershaw, and was about the complexity of the requirements, not of life.

But what is needed for any life is a lot more than what you mention. Stable environments (at a viable temperature) need a surprisingly long list of precise physical laws to produce them. And talking about complexity, it is intelligent life – which is always complex – that needs explaining, for several reasons.

Modularization is a very strong mechanism and needs no fine tuning. Its only requirement is that sufficient resources are present and that components can generate more complex components. It is a rough and very powerful mechanism.

It works slowly with stochastic design and can work fast with intelligent design.

Hans, can you give me an example of modularisation without fine-tuning?

Elementary particles & hadrons => atoms => molecules => cells => organs => biological individuals => intelligent species

In the last steps evolution plays a significant role.

Modularization applies reduction of the number of relevant relations via interfaces, and reuse of interfaces and components. It means more efficient use of resources and a simpler “design”.

See: “A law of nature” http://vixra.org/abs/1101.0064

Hans, what you outline in your post of July 27, 2013 at 12:50 pm unavoidably has fine-tuning in it, see my post of July 23, 2013 at 10:33 (+10.53) am

@Bill Evans

Modularization can work with any set of physical laws that does not forbid its principles (coupling & encapsulation). It does not need special fine tuned physical laws.

You say “Modularization can work with any set of physical laws that does not forbid its principles”. The vast majority would certainly forbid its principles. An enormous number of universes arising from the set of possible sets of physical laws would contain space and nothing else. Anything with what you need would be rare, and the principles you talk of contain fine tuning.

This is like arguing whether there is a God or not. No prizes here, folks. But we should always choose the simpler explanation (Occam), hence no to fine tuning… right? ;)

The Hilbert Book Model is based on quantum logic. Quantum logic will be valid in all sets of physical laws, except for empty sets. When extended with pure mathematical methods this foundation automatically leads to a system in which modularization fits.

I’ve simply made the point that it has to be one or the other. Chance (many sets of laws elsewhere), or intention. Apart from enormous coincidence, which in philosophy is generally ruled out, there is no third way. As to which one looks better if bringing Occam’s razor, that is far from obvious, and it could be argued both ways.

The third way is obvious. Physical laws will emerge no matter what. No need for multiple sets. You better check this out -> http://toebi.com/documents/ToEbi.pdf

You say: “quantum logic will be valid in all sets of physical laws, except for empty sets”. Apart from the fact that empty sets alone are enough to put fine tuning in, a very large number of physicists would disagree with you there. Even with theories where we have a good interpretation, we can’t necessarily say that what accompanies them applies across a range of sets of laws. But with QM, we have no interpretation at all – there’s no consensus on that, we simply don’t know what’s going on. That takes it another step away, because we don’t even know what physical setup quantum logic arises from.

Robert,

Just my opinion, but it would seem you’re the one with the pretzel logic! I mean no offense, but there is no fixed “nature” which “life” evolves within; “nature” = “life” and “life” = “nature”! The supposed “fine-tuning” would seem to be just “nature” adjusting to the EMERGENCE of novel composite parts; it’s a two-way street – nature, as in the unified whole, is constrained by its composite parts, and the composite parts are constrained by the unified whole. Humans have a distinct tendency to separate themselves from the evolutionary process – evince the word “artificial!” I once knew some hippies who wouldn’t drop acid because it wasn’t “natural” – Ha, Ha, Ha, . . . Everything humans do – “Artificial Intelligence” (both general and specific), Genetic Engineering, cloning . . . it’s all part of the evolutionary process; how could it be otherwise?

You, Robert, would say that the horse completely constrained the emergent structure of the cart but that the cart has no effect on the future evolution of the horse. I say this view is fallacious – the cart DOES affect the future evolution of the horse. This is the central thesis of Buddhist Philosophy, and Buddhists call this the Law of Interdependent Co-origination; it leads straightaway to Emptiness. If one tries to isolate any one aspect of “nature” then one is left with emptiness – no composite component can be reduced to an irreducible self-nature; its “beingness” depends on and is affected by the other components! For a more scientific/mathematical treatment of this logical phenomenon see Kevin Knuth’s work – specifically his current FQXi essay and references therein (http://fqxi.org/community/forum/topic/1831) . . .

Shit! What I meant to say above is simply that the “Universe/Multiverse” evolves along with its composite parts; the so-called “laws” of nature are, following Charles Peirce, just habits and subject to evolution themselves. In my opinion, Robert and many scientists are suffering from the fish-in-the-fishbowl syndrome – they view their global environment as unchanging, but this is a function of our limited perspective, both temporal and spatial . . .

And speaking of “cheesiness”, the Higgs Mechanism surely sets a very high bar for cheesiness.

The clumsy and inane explanations in the popular literature (‘particles swimming through molasses and gaining mass’) are on a par with the ugliness and inanity of the actual just-so model in the scientific literature.

Those are among the worst explanations of the Higgs mechanism, I agree. Perfect straw for your man there. Good stuff.

Everyone who disagrees with little Robert is cheesy? You do nothing but bitch and complain about the state of affairs in the scientific community; you suggest people are on psychotropic medication, which is nothing to joke lightly about; you bitch and complain about scientific popularizers and call them silly names . . . I have a question: if you’re so fucking brilliant why don’t YOU become a scientific popularizer and DO what they do (i.e. write books, engage grad students, START THEIR OWN BLOG)? Failing that, consider starting a yoga and meditation practice; chances are you’ll lose some of your self-induced bitterness.

I’ve been finding myself enjoying reading/re-reading the Yoga Sutras lately. Yoga is a practice outside of space and time. (paraphrased)

OK enough of this, stop before I find myself having to delete lots of comments please

Can a Hilbert space support operators that have an affine space as their eigenspace? Or can the Gelfand triple of a Hilbert space support operators that have an affine space as their eigenspace?

In that case a single Hilbert space or Gelfand triple can easily support a single universe that consists of multiple disjoint compartments that each have their own enumeration or coordinate system.

Gravitation fields may pass the compartment borders. Every compartment may sense its own (never ending) history.

It seems to be a meta-thinking way that is conceptually different from what Science (causal) usually dictates. Is the multiverse finite and so big that it may as well be infinite? If you look at 10^500 random solutions (approximations) then the probability of our universe occurring is 1/10^500, which might as well be zero. If those solutions are random then there is no spectrum found within the 10^500, as there could be many repeats of the same solution. If you look at the finite 10^500 solutions as a whole with a spectrum, then the probability of our Universe existing is close to one. The 10^500 is then a saturation of values that makes workable physics and chemistry, including the carbon resonance. Our physics hints strongly at gauge unification. Is the unification energy then an anthropic value too? Then there would be no GUT or TOE that had any mathematical structural meaning, and in other universes there would be near misses and physics values that collapse into inelegant messes; perhaps those never appeared at all as part of the 10^500 solutions. It seems that if these universes survived as solutions, then there must be a translation between all solutions, and that dictionary is pure mathematics. It is hard to think of the beta running of gauge values in the Standard Model as just being a near miss, when it almost appears elegant.

The universe appears to have a negative potential. Then the universe could be generated through spatio-temporal evolution, and could be created ad infinitum. Then multiple universes are possible, and the universe has the cardinality of the continuum.

Mott, then all sets with the cardinality of the continuum potentially make up the dictionary of pure maths, structures between solutions. How do you pick a ‘form’ from that which is ‘unique’, say a Calabi–Yau space?

It is much more likely that the universe is a mixture of countable sets that are embedded in one or more continua.

The superposition of the potentials that are emitted by distant particles forms a continuum or for every type of potential a separate continuum.

I wonder then if a solution for a universe is related to the nature of the graviton (a quantum particle) embedded in a Euclidean space R^n, where n can be a set of different dimensions and is in itself a set. Collapse of the super potential, or a decoherence, just means incompatibility of a particular or peculiar dimension with the existence of quantum matter and different values of the graviton. Some of the gauge values then may fit in well with certain dimensions and not others.

A massive particle owns a coherent set of placeholders, which are locations where it can be detected. That set is created by a stochastic process that distributes them over an embedding continuum. The particle uses these step stones one by one. At each arrival at a step stone the particle emits a wave front that transmits information about its presence and about the properties of the particle. This happens at an ultra-high frequency and the wave fronts move away from their source with light speed. The wave fronts slightly fold and thus curve the embedding continuum. Since the step stones take slightly different locations, that also holds for the source locations of the wave fronts. Due to their ultra-high frequency the corresponding waves cannot be observed directly, but they can act as carrier waves for modulation waves that have a much lower frequency. Together the wave fronts form the potentials of the particle. This superposed form of the emitted wave fronts can be observed as gravitation potential or EM potential.

The emitted wave front corresponds to one or more Green’s functions. The Green’s function defines the contribution of the wave front to the corresponding potential function.

The emission and the persistence of the wave fronts is governed by the Huygens principle.

Photons are examples of modulation waves. They can be emitted by oscillating or rotating particles.

This is how “Nature” works; “ready, get-set, go”, no tuning of “any kind” at all.

If the axiomatic expression of a (any) physics theory (not Nature) cannot “directly’ make contact with “Nature” but relies on some “tuning”, it is simply “wrong”.

Luboš Motl used a dimensionless parameter (Alpha, fine structure constant) as an example of this fine-tuning issue. Is the smallness of Alpha (= 1/137.03599) unnaturally too small?

Richard Feynman asked another question, “Why is this Alpha so damn mystically unnatural? No ‘formula’ of any kind (physics equation or otherwise) can calculate it”.

Phil Gibbs wrote, “The problem is that there has been evidence for fine-tuning in nature for a long time. One of the earliest known examples was the carbon resonance predicted by Hoyle at precisely the right energy to allow carbon to form in stellar nucleosynthesis. If it was not there the cosmos would not contain enough carbon for us to exist. … Since then we have realized that many other parameters of the standard model are seemingly tuned for life.

Is Alpha the Hall-mark of unnaturalness and of the fine-tuning? Of course, not, as the Alpha is not damn mystically unnatural after all; (Feynman, Sir). It can be easily calculated with the following “physics equation”.

Beta = 1/Alpha = 64 (1 + first order sharing + sum of the higher order sharing)

= 64 (1 + 1/cos A(2) + 0.00065737 + …) = 137.03599…

A(2) = 28.743 “degrees” is the Weinberg angle (θW), the most important quantum parameter in the Standard Model.

The sum of the higher order sharing = 2 (1/48) [(1/64) + (1/2)(1/64)^2 + … + (1/n)(1/64)^n + …] = 0.00065737 + …
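Purely as an arithmetic check of the formula as quoted above (an illustration of the quoted numbers, not an endorsement of the underlying physics claim), the value can be reproduced numerically; summing the series in closed form as -ln(1 − 1/64) is my substitution:

```python
import math

# Quoted formula: Beta = 1/Alpha = 64 * (1 + 1/cos(A2) + higher-order sum)
A2 = math.radians(28.743)  # the quoted angle, given in degrees

# "Sum of the higher order sharing" = 2*(1/48) * sum_{n>=1} (1/n)*(1/64)^n.
# The series sum_{n>=1} x^n/n equals -ln(1 - x) for |x| < 1, so:
higher_order = (2.0 / 48.0) * (-math.log(1.0 - 1.0 / 64.0))

beta = 64.0 * (1.0 + 1.0 / math.cos(A2) + higher_order)
print(beta)  # comes out near 137.036
```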

Alpha is “precisely” derived, not fine-tuned. Alpha is not just a coupling constant for electromagnetism but is a “central” lock which locks the DNA of the universe, the three nature constants [e (electric charge), c (light speed) and ħ (Planck constant)]. After this DNA is firmly locked, the universe is allowed to evolve with total “freedom”. Thus, carbon resonance and all parameters of the standard model which are seemingly tuned for life are the “precise (not fine-tuned)” outcome of this locked DNA; no fine-tuning at all. Nature is precise, not fine-tuned.

Luboš Motl wrote, “If there’s no experimental breakthrough and no theoretical revolution that will immediately and convincingly change our opinion about what is right around the corner behind the Standard Model, the status quo will simply continue whether you like it or not.”

A theoretical revolution will definitely face the “debate” for it being right or wrong. Yet, there is no argument of any kind needed for the above “physics equation” for Alpha, as its correctness can be verified by any 8th grader with a piece of paper and a pencil. Furthermore, no LHC dancing (or any other whatnot steps) of any kind is able to “alter” the Alpha formula, that is, to challenge the “physics” which underlies beneath the Alpha formula. In this Alpha-physics, SUSY (with s-particles), M- and F-theory are all wrong. There is no nightmare but death-sentence for them.

“If the axiomatic expression of a (any) physics theory (not Nature) cannot ‘directly’ make contact with ‘Nature’ but relies on some ‘tuning’, it is simply ‘wrong’. “

The following are the “direct” outcomes (via the axiomatic expressions) of the Alpha-physics, no tuning of any kind.

1. The Alpha equation.

2. The G-string (available at http://blog.vixra.org/2013/05/16/why-i-still-like-string-theory/#comment-32550 ), which produces the 48 known elementary particles.

Note:

a. Phil Gibbs wrote, “The conclusion seems to be that string [M-] theory cannot predict low energy physics at all.”

b. Peter Woit wrote (http://www.math.columbia.edu/~woit/wordpress/?p=6002 ), “… since the story of the last thirty years is not one of evidence for string [M-] theory unification [getting the Standard Model out of it] accumulating, but the opposite: … String [M-] theory unification is an idea now discredited in the scientific community … The most common attitude I hear among string theorists is that the ways people used to hope to connect it to the SM have failed.”

Yet, every 8th grader (not knowing any physics) is able to make a proofreading check between the G-string and the 48 known elementary particles.

3. The Super-unification (including the gravity) with the unified force equation.

Force (degenerated) = K (degenerated) F(unified), K is the coupling constant.

F (unified) = ħ / (delta T * delta S) ; T, time; S, space.

4. Uncertainty principle: Delta P * Delta S = Force * Delta T * Delta S = K (degenerated) ħ

The “strength” of the quantum effect is determined by K (the coupling).

5. The expansion of universe is accelerating, see page 50 of the book “Super Unified Theory (US copyright # TX 1-323-231, issued on April 18, 1984)”.

The five above are only some examples of the “direct” consequences of this Alpha-physics; no tuning of any kind is involved.

The so-called G-string is just a classification algorithm that combines the symmetry sets of two continuous quaternionic functions. Each continuous quaternionic function exists in 16 versions that differ in their discrete symmetry sets. If the real part is ignored, then 8 different symmetry sets result. The coupling of the two quaternionic fields delivers 64 possibilities. Not all of them can be differentiated by observations. This is due to the fact that color charge cannot be measured. It can only be deduced via the Pauli principle.

The physical meaning of alpha will not be solved via numerology.

Many have tried and nothing lasing has ever come of it.

Alpha needs to be explained conceptually, as well as quantitatively.

Here is the explanation: http://arxiv.org/abs/0708.3501 .

Unfortunately one must give up a couple of untested but deeply entrenched assumptions before one can see that this is the definitive explanation for the physical meaning of alpha.

I am patient; I can wait.

Is this the deeply entrenched assumption that one should reject arguments that blatantly go around in a circle?

No. The major assumption is the completely untested assumption that gravitational interactions WITHIN atomic scale systems are “very weak”. I would argue that G is not an absolute constant for all of nature’s discrete hierarchical scales (using the fixed units approach as opposed to the relative units approach), and that it is very strong WITHIN atomic scale systems.

I am familiar with the “circular argument” criticism and it is discussed in the cited paper. I do not think the argument is valid, but feel free to put it into a detailed and specific scientific form that I can attempt to refute.

Perhaps in the pseudo-science world refutation by innuendo is considered sufficient?

Robert L. Oldershaw refers to arXiv 0708.3501, in which he said “… the fine structure constant is the ratio of the strengths of the unit electromagnetic interaction and the unit gravitational interaction within atomic scale systems …”.

If you regard the strength of the gravitational interaction as the product of a 1/(Planck Mass)^2 Mass Factor and a Geometrical Factor related to the anti-de Sitter group, similarly to how Armand Wyler related the strength of electromagnetism to its U(1) gauge group, then the fine structure constant is the ratio of the strength of the electromagnetic interaction and the Geometrical Factor part of the strength of the gravitational interaction (see viXra 1108.0027, particularly pages 96-104 in the pdf pagination).

Whether or not the gravitational Mass Factor becomes ineffective within the atomic scale, as Robert L. Oldershaw proposes, is irrelevant with respect to my calculation of the fine structure constant, but the same basic conceptual idea of a ratio of electromagnetic to gravitational strengths underlies both approaches.

Tony

Oops, “nothing lasting”

The question of whether alpha can be derived is irrelevant – you have not understood the issue at all. The fine tuning we find is not related to our ability to link one thing with another. It’s about whether we can take the laws of physics to have arisen randomly in the first place. It turns out that we have a very unlikely set of laws, given what they do. Most sets of laws would not even produce gas, let alone stars. So we need to explain the existence of our set of laws, and it can be a clue as to what’s out there. There is only one general mechanism that allows our set of laws to have arisen randomly, and that is that many other sets of laws have appeared elsewhere. There are different versions of that, but they all have a common element, which allows us to narrow things down. So maybe there are many other sets of laws out there.

@ Bill Evans, “The question of whether alpha can be derived is irrelevant – you have not understood the issue at all.”

Thanks for your great insight. It will be a nice thing to send an email upstairs to Dr. Richard Feynman on this great news right away.

“Dear Dr. Feynman:

I knew that you went upstairs with a great regret, without resolving the damn mystery of Alpha which bugged you all your life. But, a great gentleman just assured me that the damn mystery of Alpha is totally “irrelevant” after all (as you simply did not understand the issue at all), that is, all your fuss was the result of a hysteric anxiety. Just stay calm, and you will be fine.

Furthermore, this new Alpha-equation is predominantly determined by a weak-mixing-angle which is the rock-bottom base for Electroweak Unification, which was published right before your journey upstairs. That is, you did not have the time and energy to evaluate it. Otherwise, you would surely have resolved that damn mystery before your long journey.

Yet, the most comforting news is that one physicist has assured me that this new Alpha-equation is purely numerological, having nothing to do with physics. The weak-mixing-angle in the equation is only numerological smoke-screen. So, really, nothing you have missed.

Yours truly,

Tienzen Gong”

I of course didn’t say that the question of whether or not alpha can be derived is irrelevant to everything, just what was being discussed.

Higgs physics is not complete. 2592 zeros = 2048 + 512 + 32. First and third generation spin sum as well. There is also continuous numerology. Robert.


Is it, Thomas?

The fine tuning vs. computability war has not been ruled on yet, and so far a lot is still on the table until these things become falsifiable. Until then it is meta talk of a sort. No computations of dimensionless physics parameters have yet yielded anything worth looking at. Obtaining a value very close to, or right on top of, a NIST CODATA value is currently meaningless, and there are many reasons why this is so. I have a saying: “Deriving your target from introduction of the target information is not a principled calculation of the target.” You have to extract or derive a value from a context (it has to be relatively clear) that is embedded in accepted theory and that would have a falsifiable character. In addition, the empirically derived fundamental physics targets at our low energy may not be the best choice for determining a ‘pure math’ calculation of a dimensionless physics constant.

In an infinite eternal Universe, the truly fundamental laws and principles of physics do not “arise”. Not randomly or any other way.

Rather the truly fundamental laws and principles of physics have always existed in the present form and always will.

The job of physics is to identify those fundamental laws and principles, not to generate ugly and complicated models that fail Occam’s razor by an unseemly number of orders of magnitude, not to mention requiring an endless series of epicycles to keep them viable.

How pagan, and no closer to current convention than the classical potential theory. Still, the old axiom Reality = Nature + Reason applies (e.g. Galen, On Natural Faculties), with Nature understood as “growth and increase”, i.e. a scaling factor. I actually agree we are missing a great deal in this line (biology, for starters), but see the problem through the Gibbs paradox. Some now talk of a ‘static entropy’ in contrast to dynamic entropy, and, of course, the entropy of mixing (= the Schroedinger information!) does not reduce.

So Nima seems to ignore that a laser e-e+ collider would be the real next need in HEP? Anyhow, fine-tuning is just a consequence of having a critical point in the SM plus asymptotic safety of gravity: http://physics.stackexchange.com/questions/41199/was-the-higgs-mass-correctly-predicted-by-asymptotic-safety-of-gravity

It’s possible to look at why something exists separately from the question of whether or not it had a beginning. When I said the fine tuning is “about whether we can take the laws of physics to have arisen randomly” I wasn’t implying that they necessarily did or didn’t have a beginning – the word ‘arisen’ wasn’t that literal – I was just talking about the question of why these laws of physics exist at all.

And what we now know is if their existence is a chance, random thing, then there are many other sets of laws out there. But if there are not many other sets of laws out there, then the existence of our set of laws is not a chance, random thing. So if at some point in the future we find evidence that there are or aren’t other sets of laws out there, it will have direct bearing on this other question.

I think religion is largely nonsense, by the way, and has little or no bearing on these questions anyway. All it has done is to make people fail to look at these clues properly. It is ridiculous to think that either the universe exists by chance, or else religion is right, but many people assume that automatically, before they even start thinking.

Shouldn’t life still need a large enough number of particles interacting for a long enough time to exist? A too low or too high CC should prevent this regardless of what particle content and forces exist.

Is finding the vacuum with the largest possible CC “easy” in some way compared to one with a near zero one? That vacuum should (eventually) be almost all of the multiverse, right?

Is the multiverse a spectrum which would allow for a saturation of states to exist or is it random which could be problematic? M theory seems to suggest homeomorphism of all of these states.

Homeomorphic is not the right word; maybe duality. Oh, I see: if it is a supermap then the word Landscape is appropriate, and then it is more likely to be a continuum (of non-connected states?) and therefore a spectrum. So the Landscape is an object subject to more physics and mathematics scrutiny.

One of the requirements on the long list of them (which is also a long list of coincidences, as they were all there), is a long enough timeframe for life to evolve into intelligent life. The initial expansion rate looks very finely tuned – a little faster and matter might not have held together as it did, a little slower and it might have recollapsed without giving us enough time in stable environments to evolve. Some of these coincidences are more certain than others, we don’t know all there is to know about the expansion. But there are so many that together they’re unambiguous. Phil mentioned one of the crucial ones, the resonance that allowed carbon to form, which is the basis of all known life.

The power of the modularization mechanism is so strong (read: efficient) that when sufficient resources are present, more complex modular systems are automatically created. Finally it reaches a state in which intelligent creatures are formed.

In this sense the modularization mechanism must be classified as a physical law.

The quaternions, with their noncommutative properties, are equivalent to the relativistic spacetime continuum, explaining the continuity and the discreteness of the wavefunctions for different quantum vacua in the superstring theories.

If you model with quaternions, then the progression step conforms to a proper time step and the size of an infinitesimal quaternionic step conforms to an infinitesimal coordinate time step. The result is that space-time has Minkowski signature while quaternions have Euclidean signature. Space-time is used by contemporary physics and coordinate time is our usual notion of time. Proper time ticks at the location of the observed item. Coordinate time ticks at the location of the observer. In a quaternionic model it is sensible to synchronize all proper time clocks. In that case the universe steps with universe-wide proper time ticks from one static status quo to the next static status quo.

I should add that being able to derive one thing from another doesn’t alter this much at all. Mark Thomas talks about “the fine tuning vs. computability war”, and others also seem to think this makes a large difference. It’s true that we can sometimes reduce the number of coincidences on the list by showing that two are really just one and the same, but there are so many that even though we can reduce the number somewhat in that way, the overall situation is still exactly the same, and they still need an explanation.

What do you think of a hypothesis that there is no separate forces hence eliminating bunch of fine-tunes? Every force is based on same phenomenon, spinning in “ether”.

Out of boredom I am going to take a guess at alexsisxela’s “circular reasoning” criticism, and then refute it.

Circular Reasoning Argument: RLO uses the definition of the Planck mass to get an expression for hc, then he plugs this definition into the FSC equation that he claims to explain. This is not acceptable.

My refutation would be as follows. I regard the equation: M = (h-bar c/G)^1/2 as an important equality between 4 fundamental constants of nature. This equation can be rewritten: h-bar c = G(M)^2.

[Note: M is not the mass of a particle, according to Discrete Scale Relativity, but rather the base of the baryonic hierarchy, and perhaps more importantly the boundary between horizon-possessing particles and horizonless particles. It is just below the mass of the proton.]

Given my interpretation of the equation relating the 4 fundamental constants, I see no problem with using h-bar c = G(M)^2 in the FSC equation.
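Taken purely as algebra, the identity quoted above holds by construction: if M = (ħc/G)^(1/2), then ħc = GM² exactly. A quick numerical check with conventional SI values (a sketch only; it says nothing about the Discrete Scale Relativity reinterpretation of M):

```python
import math

# Conventional SI values (CODATA-style)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # Newtonian gravitational constant, m^3 kg^-1 s^-2

# Planck mass from the quoted definition M = sqrt(hbar*c/G)
M = math.sqrt(hbar * c / G)
print(M)  # about 2.176e-8 kg with these conventional values

# The rearranged identity hbar*c = G*M^2 holds to floating-point precision.
residual = abs(hbar * c - G * M**2) / (hbar * c)
print(residual)
```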

Perhaps I have guessed wrong about alexsisxela’s argument, or not appreciated its additional components.

I am willing to listen to and respond to criticism of the reasoning presented in the cited preprint, but first I need an explicit discussion of the specific problem(s).

You talk about eliminating a bunch of fine-tunes by unifying the forces, as if you had forgotten that the fine tunes are about biology as well as physics. They’re about the physics requirements of biology, so just simplifying the physics, although that is a worthy goal, has very little to do with it.

As I said, we can change the odds a little by simplifying the physics, but nothing like enough to alter the overall situation, and the questions remain unchanged.

I disagree strongly. Biology is based on physics; it emerges from physics. If there is no need for fine-tunes in physics there won’t be any need for fine-tunes in biology.

Yes, biology emerges from physics, but not too easily! The issue with the fine tuning is why the physics contains all that was needed for that to happen. And simplifying the laws doesn’t alter much the odds on that – the set of all possible laws that could exist in a random way is still vastly larger than the set of laws that happen to allow the emergence of intelligent life.

Bill et al, Physics can be seen as a branch of Biology too.

Everything which is needed for current lifeforms emerges without fine-tuning. You shouldn’t investigate a single force interaction alone and say, for example, “if this force were 10% bigger then bla bla…”

Because all forces have the same basis they “walk” hand-in-hand.

@Phil Gibbs: “If you think that life arises naturally no matter what the parameters of physics are then you would expect life to take a very diverse range of forms. … I mean that there should be different solutions to the chemistry of life that work on other planets. … If we find that all higher lifeforms on other planets uses these same processes then we can be sure that physics is fine-tuned for life.”

This is the issue of massive confusion entanglements on “multiverse, preciseness and fine-tuning”.

First, even with a set of absolutely precise laws of life (with “zero” wiggle room), there will still be zillions of different human faces, as there are zillions of different boundary conditions. The absolute preciseness of a law will not necessarily kill the diversity of the boundary conditions.

Second, the “chemistry” of life is a very high “tier”-expression of life. The laws of life underlie many tiers below the chemistry. I will discuss only two laws of life here.

1. The “key” mission of life is about “individuality”, that is, distinguishing a self from the other. Even “identical” twins are two “individuals”. When “a” virus moves into two cells, there are two individual viruses. There is a “four color theorem” which is able to produce zillions of “distinguishable” balls. Thus, the life of “this” universe uses four color codes, (A, G, T, C), as a way for the manifestation of the essence of life (the individuality). Thus,

Law of life (1) in “this” universe – the essence of life (the individuality) is expressed with the “four color theorem”.

2. The key functions of life (reproduction and/or metabolism) are carried out by processing information. Any information can be processed with a computing device, such as the Turing computer, the abacus or a set of counting straws. Thus,

Law of life (2) in “this” universe – life must carry a computing device.
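Law (2) invokes computability in the Turing sense. A minimal Turing-machine sketch shows what “carrying a computing device” means at its barest (all names here are illustrative, not from the comment):

```python
# Minimal Turing machine sketch: a one-state machine that inverts a binary tape.

def run(tape, rules, state="S", halt="H"):
    """Run a Turing machine. rules maps (state, symbol) -> (write, move, next_state)."""
    tape = list(tape)
    pos = 0
    while state != halt and 0 <= pos < len(tape):
        write, move, state = rules[(state, tape[pos])]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape)

# Flip every bit, moving right; halt at the blank end marker "_".
rules = {
    ("S", "0"): ("1", "R", "S"),
    ("S", "1"): ("0", "R", "S"),
    ("S", "_"): ("_", "R", "H"),
}
print(run("1011_", rules))  # -> "0100_"
```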

If a life in the (multiverse + 1)th-verse,

a. not about “individuality”, such as, only as a gigantic blob for the whole-verse,

b. not using 4-color-theorem for individuality but using another way (not available and/or known in “this” universe) for the manifestation of individuality,

c. needs no computing device or uses a device not available (or known) in “this” universe, (note, the life of this universe uses “Turing”-computer as the computing device, see http://www.prequark.org/Biolife.htm ).

then, that (multiverse + 1)th-verse life is “different” from the life of “this” universe. Otherwise, it will be no different from the life of “this” universe even if it is “silver”-based instead of the carbon-based life of this universe, as they two are still governed by the same two laws of life.

So, whether the life forms (high or low) in another -verse use the same (or completely different) processes (at the chemistry level) as in “this” universe need not be the consequence of any “tuning”.

Furthermore, any-verse is governed only by some measuring rulers (which are the sources for the axiomatic expressions of laws). Those rulers are “nature-constants” for that any-verse. Yet, there must be a “lock” to lock those rulers absolutely tight. In the case of “this” universe, the “final lock” is the Alpha.

If the (multiverse + 1)th-verse uses four rulers (instead of three), such as (Turtle-speed, dog-charge, tail-spin and the whatnot-else), is it different from “this” universe, that is, not a part of this universe?

The answer is in its “lock”. If its lock is different from our “Alpha”, then, it is definitely not a part of “this” universe. If its lock is also a dimensionless number with the numeric value identical to Alpha of “this” universe, then it is still a part of this universe even if it has completely different “measuring rulers” (the nature-constants). Multiverse or not is not about those zillion parameter spaces but is about the “lock”. If the lock is different, there are different-verses. The “Lock-physics” ends the story.

The physical law behind the creation of very complex structures such as intelligent species is the modularization mechanism.

The modular creation process has great advantages over the creation of monoliths because it uses resources far more efficiently and its configuration process is much simpler.

The number of interrelations that must be considered can be reduced by orders of magnitude. Especially when components can be configured from other components, the improvement grows exponentially.

The result is that if sufficient resources are present, then very complicated structures will emerge nearly automatically.

This is a question of time and resources and not a question of fine tuning.

I do not believe that life as we know it is continuous with physical transformations; there is a discreteness between them.

Nature is not a continuum between physics, chemistry, and biology…

Physics has no place for beliefs or religions.

The modularization law is more a mathematical law than a physical law, but it has great influence on the hierarchy of physical objects.

Remember: N objects have N × (N − 1) potential relations. Complexity goes with the square of the number of participating objects. Interfaces help to reduce relations. Reuse of components can work out even better than reduction of relations.
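The counting argument above can be sketched in a few lines. The particular numbers below (100 parts, modules of 10) are illustrative choices of mine, not figures from the comment; the sketch only shows how grouping parts into modules behind interfaces cuts the number of pairwise relations:

```python
def pairwise(n):
    # Unordered pairwise relations among n objects: n*(n-1)/2
    return n * (n - 1) // 2

def modular(n_parts, module_size):
    # Relations inside each module, plus relations between module interfaces.
    n_modules = n_parts // module_size
    return n_modules * pairwise(module_size) + pairwise(n_modules)

# A 100-part monolith versus ten 10-part modules:
print(pairwise(100))     # prints 4950
print(modular(100, 10))  # prints 495, an order of magnitude fewer
```

Nesting the trick (modules built from sub-modules) repeats the reduction at every level, which is the exponential improvement the comment mentions.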

If the physics of (multiverse + 1)th-verse does not provide a computing device while there is something called “life” in this-verse, that “life” is definitely different from the life of “this”-universe.

If the physics of (multiverse + 2)th-verse provides an abacus computing device, then its “life” is not fundamentally different from the life of “this” universe. Yet, I do not know how an abacus can be embedded in its physics laws.

It is our great fortune that the physics of “this” universe has G-strings, and a set of Turing computers is embedded in the G-strings. That is, when a boundary condition of “this” universe is able to stretch its legs, those legs can immediately stand up and walk, as a computing device is readily available [embedded in its (life’s) building blocks] free of charge. There is no fine-tuning of any kind needed for the rise of life in “this” universe. Furthermore, the G-strings provide the following.

1. It is the only theoretical base for quark-color (red, yellow, blue, white), and it is the manifestation of the four-color-theorem in physics. That is, the “individuality” of life is guaranteed by physics laws.

2. Tommaso Dorigo’s post of July 21, 2013 is “Do Measurements Of The B_d Decay To Muon Pairs Indicate Four Generations Of Matter? (http://www.science20.com/quantum_diaries_survivor/do_measurements_b_d_decay_muon_pairs_indicate_four_generations_matter-116800 )”. In G-strings, there is simply no room to house a 4th generation of matter. Neff is neither 3.1 nor 2.99 but exactly 3 in G-strings.

@Phil Gibbs: “The answer is that we have learnt that the mass of the Higgs boson is around 125 GeV and that this lies near the minimum end of the range of masses that would allow the vacuum to be stable even if there are no new particles to help stabilize it. … Instead we have seen that the Higgs sector is probably fine tuned at least by some small factor. If no SUSY is found in the next LHC run at 13 TeV then it is fine-tuned at about the 1% level.”

Again, the LHC new particle should have a mass exactly one-half (1/2) of the spacetime “vacuum” energy. That is, the “minimum end” calculation is simply wrong. The fact is that the entire “Higgs saga” is wrong, merely a fairy-tale and a hallucination. A detailed story about this fairy-tale is available at (http://profmattstrassler.com/articles-and-posts/particle-physics-basics/the-known-forces-of-nature/the-strength-of-the-known-forces/#comment-61734 ). There is no need of any fine-tuning for “life”, as the entire Higgs saga is just a fairy-tale.

It has also been pointed out that even if all the laws of physics were eventually reduced to one ‘superlaw’, that superlaw would also contain fine tuning, and we would have to ask ourselves why it was so suitable for intelligent life. What needs to be faced up to is that however you look at it, the set of all possible sets of laws is far bigger than the set of all sets of laws that lead to intelligent life. This simply means that if these laws exist by chance, then there are other sets of laws elsewhere.

We are in a deadlock situation here. I can’t prove my claim with the tools given by the current physics paradigm. And your claim is based on the current paradigm, which obviously isn’t the right one. The future shall be our judge ;)

http://lblogbook.cern.ch/Shift/72410 count-down to… construction or destruction?

Hi Stephen,

What’s behind that link of yours? I don’t have an account for it.

Kimmo, I’m surprised you don’t know about the LHC logbook! It shouldn’t require an account; at least it doesn’t for me, although that may be one of the “jokes”. You can read all about the fairy-tale in those logbook entries (just rows in an Oracle database somewhere, actually).

Ok, now I got it… I thought there should be something special in that last database row :)

I have no idea what you guys are bantering about, so I’m just gonna leave these links here. http://scienceasia.asia/index.php?journal=ama&page=article&op=view&path%5B%5D=61 Cohomological Induction on Generalized G-Modules to Infinite Dimensional Representations and http://mathoverflow.net/questions/102839/what-is-the-relationship-between-motivic-cohomology-and-the-theory-of-motives “What is the relationship between motivic cohomology and the theory of motives?” Peace, Stephen

Number is not generic in human languages: it arises in the Economy of Nature, when you’ve got something to lose. So I really wouldn’t assume that number is equivalent to physics.

Grothendieck was just trying to tame topology, like a circus-master, or a feudal lord with a civilizing mission. The Galois connection looks interesting, in view of the knot theory found in the Absolute Galois Group.

Numbers arise when you enumerate a set.

Hans, you mean cardinal numbers; when do rational vs. irrational numbers arise?

A 1D rational number is any number that can be expressed as the quotient or fraction p/q of two integers, with the denominator q not equal to zero.

Rational hyper complex numbers have rational numbers as their coefficients.

These hyper complex rational numbers can be seen as embedded in a corresponding hyper complex continuum.

This continuum has the cardinality of the real numbers.

The hyper complex rational numbers have the cardinality of the natural numbers. These number sets are countable.
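The countability claim can be made concrete: the positive rationals can be listed one by one with no repeats, so they have the cardinality of the natural numbers. A short sketch using the Calkin–Wilf sequence (a standard construction I am supplying for illustration, not part of the comment above):

```python
from fractions import Fraction

def calkin_wilf(n_terms):
    """List the first n_terms positive rationals, each appearing exactly once,
    via the Calkin-Wilf recurrence q' = 1 / (2*floor(q) - q + 1)."""
    q = Fraction(1, 1)
    out = []
    for _ in range(n_terms):
        out.append(q)
        q = 1 / (2 * (q.numerator // q.denominator) - q + 1)
    return out

# First terms: 1, 1/2, 2, 1/3, 3/2, 2/3, 3, ...
print(calkin_wilf(7))
```

Because every positive rational appears exactly once, the map n ↦ q_n is a bijection from the naturals to the positive rationals, which is exactly the countability asserted above; tuples of rationals (the coefficients of a hypercomplex rational) stay countable for the same reason.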

In my opinion the situation in fundamental physics should be looked at from a wider perspective than that given by the last forty – not very successful – years of theoretical high energy physics. Physics is definitely in crisis. The multiverse scenario and the view about necessity of fine tuning are conclusions from sticking to certain basic dogmas and refusal to admit that some of them might be badly wrong. I do not believe in all these dogmas and therefore do not share these pessimistic conclusions.

I believe that standard model symmetries have fundamental meaning, being selected by their very special mathematical and physical character. The GUT approach denied this possibility and led theoreticians onto a wrong track leading to standard SUSY and eventually to the M-theory landscape. Also the fact that the observed space-time is 4-dimensional very probably contains a very important message. But the idea about 4-dimensional space-time became old-fashioned as superstring revolutions followed each other. Sociological factors played a key role in the process. The attitude that thousands of brilliant theoreticians cannot be wrong allowed the situation to develop into a catastrophe made manifest by the findings at LHC. Even in this situation we are told that we should continue to follow the leaders and now give up even the belief that theoretical physics can explain and predict – the very motivation of superstring theory originally. And this only because a few generations of theoretical particle physicists became victims of mass psychosis. I will not eat this cake!

Concerning the fine tuning of coupling parameters: I believe that fine tuning of dynamical parameters is a basic aspect of quantum evolution leading to life as we identify it, but that standard model symmetries and space-time dimension are mathematical necessities rather than outcomes of evolution in some region of the multiverse. One should therefore accept the obvious: the superstring model describes physics of 2-D space-time but – as has become clear – the attempts to deduce real physics from it are doomed to fail. Nature does not love tricks. Super-conformal symmetry remains the genuine contribution of string models to physics and the natural next step is to finally generalize this symmetry to four dimensions.

I have spent much time during the last decades trying to understand why we have gradually ended up in this dead end and why the professionals are not able to see that there is no way out except radical rethinking of fundamentals. The history of physics is a history of bold and often wrong generalizations. Naive length scale reductionism is one of the most influential of these wrong assumptions. It has been raised to the level of a dogma and together with the materialistic world view more or less defines nowadays what it is to be scientific. Fractality is a very natural candidate for replacing reductionism, and quantum theory strongly encourages giving up materialism, but both are still taken as givens by particle physicists. Reductionism is indeed responsible for many far-reaching and probably wrong dogmas in present-day physics.

Reductionism forces us to believe that the strange findings at RHIC and LHC about heavy ion collisions and proton-heavy ion collisions are consistent with QCD, although here we would have the new physics that we are so desperately searching for. This relates also to naturalness. In my opinion, the attempt to understand the mass ratios of the various fermion generations group theoretically is doomed to fail. If one accepts the notion of a length scale hierarchy implied by fractality there is no need to extend standard model symmetries. The fact that separate B and L conservation is consistent with experimental facts provides an additional strong constraint.

Length scale reductionism also forces us to believe that biology and brain science are just complexity, consciousness is just an epiphenomenon, and free will is an illusion. Theoretical physicists thereby lose a huge treasure trove of anomalies which could help to achieve the sought-for unification. As a consequence of this isolation from experiential reality, theoretical physicists have divided into half-religious sects such as super-stringers and loop gravitists. Feynman talked about general relativists gathering at their yearly meetings and discussing again and again the same old dead ideas. Sadly, Feynman’s characterization seems to apply quite well also to Strings 2013 and Loops 2013.

Length scale reductionism guides us to search for dark matter at elementary particle length scales. This direction might be completely wrong: TGD suggests a generalization of quantum theory by introducing a hierarchy of effective Planck constants, and in this framework dark matter as quantum coherent phases would emerge in long length scales. Ironically, already Tesla made observations which one might interpret as indications for the existence of something behaving much like dark matter in the TGD sense. Tesla spoke of “cold electricity” not seen in the ampere-meter but, as a child of his time, assigned to it what he called aether particles. Did Tesla discover dark matter more than a century ago? One cannot exclude this possibility, since his experiments typically used high voltages, low frequencies, and sudden pulses resulting from the switching on of electrical circuits, in this manner testing the boundaries of Maxwell’s theory in long rather than short scales (as particle physics does). In this context one must mention also the strange quantum-like effects of ELF radiation on the vertebrate brain and the fact that the cell membrane resting potential corresponds to an electric field above the dielectric breakdown in air. Tesla’s vision about future technology was also surprisingly far reaching and he saw a possible connection between energy technology and biology: his ideas are still revolutionary. To me the example of Tesla demonstrates that the history of science is not a steady linear evolution but a continual fight between mediocrities and visionaries, and the mediocrities quite too often win in the short run.

I agree with some of what you say, including the need to let go of some basic assumptions in physics. But the fine tuning issues are nothing to do with that. This part of what you say is entirely wrong:

“…the view about necessity of fine tuning are conclusions from sticking to certain basic dogmas and refusal to admit that some of them might be badly wrong.”

The fine tuning issues are philosophical ones that have very often been misunderstood and misapplied by physicists. Some of them apply in any paradigm, not just the present one. Whatever physics we find in the world around us, it is impossible to say that it could not have been otherwise. No theory of everything, however complete and all encompassing it may be, will arrive with a note attached saying “this is the only way that it could possibly have been, if the universe exists by chance”.

And that has implications for when we estimate the odds relating to our universe, or to the environment we find ourselves in. It simply means, as I said, that the set of all possible sets of laws is far larger than the set of sets of laws that allow intelligent life. This philosophical point applies whatever physics you happen to believe in. It means, unavoidably, that if our set of laws exists by chance, then other sets of laws exist, or have existed, elsewhere.

(When I say ‘unavoidably’ I mean excluding massive coincidence, which in philosophy is generally excluded.)

@Matti Pitkänen

Many good points, thanks.

Especially, “…the view about necessity of fine tuning are conclusions from sticking to certain basic dogmas and refusal to admit that some of them might be badly wrong.”

The “Possible Universe” is a very old subject in philosophy, but it does not mean that the Universe can have different measuring rulers (nature constants). Those nature constants are “precisely defined” and cannot be “otherwise” indeed.

The Hilbert Book Model (HBM) allows very pictorial representations of its fundamental concepts. Let me give you a small preview.

In the model, generators produce coherent groups of discrete objects that are spread over an embedding continuum. The density distribution and the current density distribution of these coherent groups are continuous functions that describe and categorize these groups.

Depending on a suitable Green’s function, the distributions of discrete objects also correspond to potential functions. Depending on the way in which the potential is generated, the potential function corresponds to a local curvature of the embedding space. This can be comprehended when the groups are generated dynamically at a rate of one element per progression step.

During its very short existence the element transmits a wave front that slightly folds and thus curves the embedding space. The wave front keeps floating away with light speed from its previous source. It represents a trace of the existence of the element. This trace survives the element when that element is long gone. These traces can be observed without affecting the emitter.

For each coherent group, the elements are generated at a rate of one element per progression step. In other words, the wave fronts form ultra-high frequency waves that move with light speed away from their source. However, each wave front is emitted at a slightly different location. Already at a small distance it appears as if they originate from the same center location. The coherent group forms a building block. These waves together constitute the potential function(s) of this building block.

The elements act as step stones and together they form a micro-path for the corresponding group. This micro-movement can be considered as a combination of a quasi-oscillation and a quasi-rotation. Indirectly, the generator influences space curvature. The descriptors only describe the influence of the potentials on the local space curvature. The ultra-high frequency wave cannot be observed. Only its averaged effect is observable. The resulting potential is an integral and therefore a rather static effect. Modulations of this wave that are due to oscillations of the emitter can be observed. These modulation waves possess a much lower frequency than the ultra-high frequency carrier wave has.

The element generator can be described by the convolution of a sharp continuous function and a low scale spread function that blurs the continuous function. In this way, the spreading part can be seen as the activator of local space curvature, while the derivative of the sharp part defines a local metric that can be considered as the descriptor of the local curvature. The two parts must be in concordance. In this way two kinds of descriptors of local curvature exist. The first is the density distribution that describes the spread of the discrete objects. It corresponds to a potential function. The second descriptor is the local metric. Since these functions act on different scales, they can usually be treated separately.

The origin of the local curvature is the dynamic stochastic process that produces the low scale spread of the discrete objects. As described above, these objects transmit waves that curve the local space. The HBM suggests the combination of a Poisson process that is coupled to a binomial process, where the attenuation of the binomial process is implemented by a 3D spread function. The stochastic generator process will generate according to a standard plan. In principle, at each location where it is active the generator produces locally the same kind of patterns. In undisturbed (natal) format, these patterns may only differ in their symmetry properties. However, these patterns cause space curvature. The local curvature is generated by the considered group and by neighboring groups. Due to an existing uniform move of the building block and due to the variance in space curvature, the center location of the pattern may become displaced. Both effects disturb the natal state of the distributions that are generated by the generating process. Since the patterns are generated with a single element per progression step, the generation has a large chance of not generating the target natal shape but instead a distorted shape that in addition is spread over the path that the center location decides to follow. The produced distribution can still be described by a continuous function, but that function will differ from the continuous function that describes the undisturbed natal state. So the generation process is characterized by two functions. The first one represents the characteristics of the local generation process. It describes the natal state of the intended distribution. It is more a prospector than a descriptor. The second one describes the actually produced distribution that is distorted by the local space curvature and spread out by the movement of the center location.
Further, the generation of the distribution may not be completely finished, because not enough elements were generated since the generation of the pattern was started. The generated element only lives during the current progression step. In the next step a newly generated element replaces the previous object. At any instant the generated distribution consists of only one element. Thus, for the most part, the distribution can be considered as a set of virtual elements that lived in the past or will live in the future. The virtual distribution together with its current non-virtual element represents a pattern. The local curvature is partly caused by the pattern itself, but for another part it is caused by neighbor patterns.

The previous description of the natal generation can be imagined visually. At a rate of one element per progression instant the generator produces step stones that are used by the generated building block. The step stones are located randomly in a coherent region of 3D space. The building block walks along these step stones. As a consequence even at rest the building block follows a stochastic micro-path. Any movement of the building block as a whole, will be superposed on the micro-path. At every arrival at a step stone, the building block transmits its presence via a wave front that slightly folds and thus curves the embedding continuum. These wave fronts and the transmitted content constitute the potentials of the building block.

Nobody said that the undercrofts of physics behave in a simple way!

The HBM is the name of my own personal project.

The Hilbert Book Model is a simple fully deduced model of the lowest levels of fundamental physics. It is strictly founded on traditional quantum logic. That foundation is extended with trustworthy mathematical methods. The model development is not based upon, but instead it is guided by the findings of contemporary physics. Traditional quantum logic can only model a static status quo. The HBM uses an ordered sequence of these static sub-models in order to construct a dynamic model.

Thus the HBM steps with universe-wide progression steps.

fwiw, I really like your model, Hans; it makes more sense than a lot of things.

I am very selfish. I have built the HBM for my own fun and in order to feed my curiosity to what exists in the lowest levels of physics. By just reflecting, I discovered more than I expected to find.

[…] more conventional wisdom along these lines, see Naturally Unnatural from Philip Gibbs, which also argues that what we are learning from the LHC is that we must give up […]

Peter Woit, give it up.

M-theory is now an established religion with two gospels. Gospel is absolutely correct and goes way beyond the reach of reasoning. The following two gospels cannot be refuted in any way.

1. Frank Wilczek: “If people as clever as us haven’t explained it, that’s because it can’t be explained – it’s just an accident.”

2. Philip Gibbs: “People can complain as much as they like that the multiverse is unscientific because it does not predict the standard model. Such arguments are worthless if that is how the universe works”

Gospel is the shining light of the absolute truth which “blinds” every other “fact”. The fact that the Standard Model is reproduced theoretically must be blinded. The fact that the laws of “this” universe can be derived axiomatically must be blinded. Indeed, all facts (including those learned from the LHC) are what we must give up. Give it up, Peter.

Color confinement is a principle that is as strong as the Pauli principle. Most particles of the SM list cannot be created as individuals.

So to put it all into layperson terms, the universe and us are here by either: 1) chance (multiverse) or 2) design (fine tuning).

Yes, but instead of ‘multiverse’, one should say many sets of physical laws, as the word multiverse is sometimes used in a more specific way. And one should not say ‘fine tuning’ as that has a different meaning, and is often used to mean ‘apparent fine tuning’. And one should not say ‘design’, as a bunch of idiots use the word a lot. You can say chance (many sets of physical laws), or intention.

@Bill Evans: “Yes, but instead of ‘multiverse’, one should say many sets of physical laws, … You can say chance (many sets of physical laws), or intention.”

Fine, very fine; there are many sets of physics laws.

But, as soon as the Alpha was derived, “one set” among those many sets was “derived” axiomatically, that is, starting with a formal system (with definitions, axioms, and procedures, from which come sentences and theorems (laws)). This derived set no longer uses “predictions” but has “axiomatic consequences”. Yet, there is one coincidence; the laws of this set are “identical” to the “discovered” physics laws. For this “Axiomatic-physics”, it is absolutely not the result of any “chance”. Furthermore, no other set of physics laws can manifest from this “Axiomatic-Physics”, while it does encompass zillions of (infinite, not any finite number of) boundary conditions (or parameter spaces).

As far as I know, the multiverse doctrine is about a big number of parameter spaces. I do not know that they have shown any new “set” of physics laws. There are only “two” sets of physics laws; one was “discovered” while the other is “derived”, but these two sets are “identical”. If you know another new set of physics laws, please give me some info on that.

Could the multi-universes be given by the cardinalities invented by Cantor, the transfinites? Could one measure the different multi-universes by their potencies?

Take a suitable foundation for your physics or member of physical multiverse. For example select quantum logic. How many different mutually independent sets of physical laws can be derived from that foundation?

@ Hans van Leunen: “Take a suitable foundation for your physics or member of physical multiverse. … How many different mutually independent sets of physical laws can be derived from that foundation?”

In this Axiomatic-physics, the foundation of the formal system (the definitions, the axioms and procedures) is

a. Precisely defined, and

b. Must not contain any known (discovered) physics laws, that is, no quantum logic or relativity principles.

Yet, its sentences (theorems of the axiomatic expressions) must make contact with all known (discovered) physics laws, such as,

i. Alpha value,

ii. quantum principle,

iii. Standard Model particles,

iv. Super unification (including gravity) with the “Super Unified Force Equation”,

v. etc., such as, the “rise” of e-charge, the spin, the mass, as those are not allowed in the “foundation” of this axiom-system.

But, most importantly, the direct consequence of the “Supersymmetry” is “spin”, not any s-particle. The SUSY (with s-particle) is not allowed in this Axiomatic-physics. Finally, there is only “one” set of physics laws as the consequences of this Axiom-system.

“People can complain as much as they like that the multiverse is unscientific because it does not predict the standard model. Such arguments are worthless if that is how the universe works”

This argument can equally well be applied to religion: People can complain as much as they like that the hypothesis “God” is unscientific because it does not predict the standard model. Such arguments are worthless if that is how the universe works.

Yes – it MAY well be that the universe has an ultimate nature which cannot be reached by the scientific method – and in that case science should and will never reach it …

One difference is that the multiverse predicts the Higgs mass (and the cosmological constant).

Another and IMHO the main difference is that magic predicts everything, and so predicts nothing, while the multiverse has a lot of physical constraints.

And really, a magic that must be responsible for less than 10^-22, or in other words 0, of the curvature flattening wasn’t much to begin with. =D

But yes, the offered analysis isn’t good enough.

I should say “of the remaining curvature”.

Comment to Bill Evans about fine tuning.

Thank you for a comment. I formulated my view about fine tuning somewhat sloppily.

As I say later in the comment, I believe in fine tuning of certain dynamical parameters as a result of quantum evolution implying also biological evolution (in TGD inspired theory of consciousness I formulate it as what I call Negentropy Maximization Principle (NMP) saying that the information content of conscious experience associated with state function reduction creating a new quantum Universe is maximal). Quantum jump sequence gradually selects the values of these parameters to produce maximally intelligent universe (very optimistic view but consistent with the second law!). NMP means a replacement of anthropic principle with a variational principle for the dynamics of consciousness identified as a sequence of moments of consciousness=quantum jumps.

I however do not believe that standard model symmetries would be a result of this kind of selection. Standard model symmetries would be forced by the existence of geometry for infinite-dimensional space – “world of classical worlds” (WCW) consisting of 3-surfaces defining the analog of Wheeler’s superspace.

In the case of much simpler loop spaces the mere existence of this Kahler geometry fixes it uniquely for given group G defining the loop group (the existence of Riemann connection requires infinite-dimensional Kac-Moody group as isometries as shown by Freed). Standard model symmetries fix WCW (equivalently, the imbedding space H =M^4xCP_2 containing space-times as 4-surfaces) and the conjecture is that the mathematical existence of WCW Kahler geometry implies the same WCW. Standard model symmetries would have also number theoretical interpretation in terms of classical number fields. For instance, color group would correspond to isometries of CP_2 and subgroup of automorphisms of octonions.

Standard Model of Particle Physics is just a proved hoax. Please circulate the following message.

The world should force all the physicists of the world to accept the open challenge (which can be seen at http://www.worldsci.org/php/index.php?tab0=Abstracts&tab1=Display&id=6476&tab=2 and also at http://www.gsjournal.net/Science-Journals/Essays/View/4018 ), which is standing till date, and produce the rebuttal article to the scientific articles on the basis of which the open challenge has been put forward; the very well known & only accepted procedure of denying scientific findings published in peer-reviewed journals.

However, for the information of all, I would like to state here that every concerned physicist of the world is in total knowledge of the open challenge and they also know that they can do nothing about it technically & scientifically. They would have easily accepted the alternative paradigm of physics which emerges as the consequence of my scientific publications, but that would reduce their degrees to trash and they would have to quit their jobs on the basis of morality. Besides, there is a fund flow of trillions of dollars annually for research in thousands of research institutions all over the world, and once the world comes to know that all these institutions have been knowingly wasting trillions of dollars of public money, then many (millions of) heads would have to roll.

Thus all the mainstream physicists are maintaining a deliberate silence about and ignorance of the open challenge, but how long can they deceive the world? I have taken a vow that I will take this whole issue to its logical conclusion.

Relax! I’m just about ready to prove ToEbi as the ultimate TOE ;) http://toebi.com/documents/ERE.pdf

There won’t be left room for further speculations after my experiment.

It is very hard to prove the string theories, or superstrings with extra dimensions and supersymmetry. So are strings or superstrings part of the mainstream of contemporary physics, or are they a beautiful theory with beautiful and deep mathematical structures? And is that because some other branches of mathematics have connections with the mathematics made for strings and superstrings?

I don’t quite get it. Can you rephrase your question?

It is simple: is string theory part of the mainstream of physics?

I suppose so.

The best article I’ve seen on multiverses and fine-tuning. I would also add that classical EM with its bias potential is another open-ended model where observations are used to decide free parameters.

The “doesn’t predict the standard model” is tedious bullshit. As you say, models of stars predict our Sun, and it has been a major test for such theories. We call such theories predictive and testable.

“There could even be much lighter undiscovered particles such as those hinted at by some dark matter searches, if they are hard to produce or detect at colliders, but the more obvious conclusion is that nothing else is there at these energies.”

There may be something there, but not something that explains cosmological DM. Unless the astronomical > 40 GeV WIMP DM bound, set by two independent methods in 2011, is somehow rejected. [ http://news.brown.edu/pressreleases/2011/11/wimps ]

“If it was not there the cosmos would not contain enough carbon for us to exist.”

AFAIK the low-temperature carbon production of large stars (by the more unlikely pathway of three helium nuclei colliding, IIRC) would still supply a hefty amount.

“more about LQG is for another post”. It is enough to see that there is no dynamics in LQG – no way to construct harmonic oscillators – so no theory of physics.

“On Earth there is just one basic chemistry based on DNA and RNA.”

I happen to have an interest in astrobiology. There are known thermodynamic reasons why lifeforms that evolve toward heredity will use RNA first.

RNA is the only known nucleotide material that satisfies the thermodynamic bound on replicators: it has enough stability to make them (a four-year half-life) but not so much that it is unable to replicate. [“Statistical Physics of Self-replication”, England, arXiv:1209.1179, TBP AFAIK.]

There are also ways around the no-go situation, as so often, but they are less likely. Then you need to evolve robust proteins before having RNA, to increase the chemical energy of the bound.

In my book, with or without the go-around pathway, that counts as fine-tuning.

I have been meaning to make a general post for you to link to which offers a speculation… Now that I can access the internet at least from the library and have found my passwords (sorry for the tech problems), I will post it soon there, Higglets and so on… It includes a link to Lubos and his speculations, on which it seems to me he is coming closer to my long-standing view… The post will be called something along the lines of Higglets… In all your partial but brilliant views, reading to me here as a sort of metaphysics really before we fine-tune our theories, things are complementary in the standard duality: the more we have of supersymmetry, the less renormalization, and conversely for one unified theory. One should not confuse, without understanding, the indefinite with the infinite. Perhaps when the library opens Monday…

Burma Shave Messages by the Data Stream

Roadside

L. Edgar Otto Sunday, 21 July, 2013

God, neither created nor destroyed

lathers up His Occam’s razor

Your existence does not guarantee your rights,

your rights guarantee your existence

Turing machine Universe erasing bits is

a multiverse Being by printing

A bankrupt soul persists as parallel reality,

a system and not your hearts can be bankrupt

Free will is the uncertainty of possibilities

when going off-line

– – – Burma Shave

* * * *

If SUSY is not confirmed, how will the failings of the standard model, such as the breakdown of CP and CPT, be compensated for by other theories?

My theory shows that reality arises from a unique mathematical structure; it is the only structure that gives rise to a dynamic universe like ours. The system shows how QM comes about, and the constants of nature are computed.

Nevertheless it is mind-boggling that such a unique structure naturally arose and then led to atoms and life. That is the ultimate miracle.

http://fqxi.org/community/forum/topic/1877

I have added new programs that show how alpha (the Fine Structure Constant) appears as the ratio of the number of probability hits on the electron Compton wave to the total number of throws. I think the programs (section 11) are simple enough and can be understood by anybody who knows some basic JavaScript programming.

http://www.qsa.netne.net
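The hits-over-throws construction described above is ordinary Monte Carlo sampling. The sketch below is not the linked JavaScript program, only a generic Python illustration of estimating a constant as a hit ratio, with π as the target:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def estimate_pi(throws):
    """Estimate pi as 4 * (hits inside the unit quarter-circle) / throws."""
    hits = 0
    for _ in range(throws):
        x, y = random.random(), random.random()  # uniform point in the unit square
        if x * x + y * y <= 1.0:                 # "hit": point lies in the quarter-circle
            hits += 1
    return 4.0 * hits / throws

estimate = estimate_pi(100_000)
print(estimate)  # approaches 3.14159...; the error shrinks like 1/sqrt(throws)
```

Whether such a ratio has physical meaning is exactly the point Tienzen raises below; the construction itself is standard.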

@ adel sadeq, “I have added new programs that shows how alpha (Fine Structure Constant) appears …”

“Every” pure number can be numerologically approached (to any degree of accuracy) with at least one formula. Is it a pure numerological formula? That can be recognized at first sight. A numerological formula often has no physical meaning.

Alpha is defined with e (the electric charge), c (the speed of light) and ħ: in SI units, alpha = e^2/(4 pi epsilon_0 ħ c) ≈ 1/137.036. Using these three constants to calculate alpha is getting the answer “with” the answer. The only meaningful way in physics to get alpha is to calculate it from something completely unrelated to its definition, that is, with no e, c, or ħ in the equation.

@Tienzen, In my theory, c, h and e are themselves deduced from the theory. Moreover, I have mentioned the program that you can run to see how alpha also arises as a probability, in perfect agreement with standard physics, which regards alpha as the probability of emitting a photon.

All these results are automatic in my theory. The whole point of the theory is that once the initial design was recognized, everything else came automatically, including the interaction law, which is line intersection; i.e., the only thing that lines can do is intersect or not.

It seems like this thread has devolved into a bunch of tautologies.

As I said in the post above from July 23, 2013 at 10:33 am: “Whatever physics we find in the world around us, it is impossible to say that it could not have been otherwise. No theory of everything, however complete and all-encompassing it may be, will arrive with a note attached saying ‘this is the only way that it could possibly have been’, if the universe exists by chance. And that has implications for when we estimate the odds relating to our universe, or to the environment we find ourselves in.” No arguments have been put forward against this point.

@ Bill Evans, “No arguments have been put forward against this point.”

There is no way to convince you if you take your position as a religion.

Indeed, for “discovered” physics laws, we have not discovered a “law” which states that all discovered laws are unique and cannot be otherwise.

But, for an axiomatic system, its “sentence set” is always “unique” to the system. Even Gödel’s incompleteness theorem does not give rise to two different “sets” of sentences. So, the argument is very simple.

a. For any axiomatic-physics, its laws are unique and cannot be otherwise.

b. “All” discovered-physics laws are identical to the laws of “one” axiomatic-physics. Then, this discovered set is a sub-set of the axiomatic-physics.

Even if you deny that there is such an axiomatic-physics encompassing the discovered-physics, the above argument stands true if such an axiomatic-physics is found. For your argument to stand, such an axiomatic-physics must not exist in nature at all. But that is a different subject.

Such axiomatic foundation exists. It is quantum logic.

Those ideas are all within a set of laws of physics. You talk about things that cannot be otherwise within a particular universe, or within a particular physical environment. Of course within that setup there may be things that cannot be otherwise.

But the issue we’re discussing here goes outside that, and we simply can’t say that it could not have been otherwise. If a set of laws exists by chance, then in philosophy anything at all could have existed, and the selection of what exists is random.

Taking it to have existed by chance, which is one of the two possibilities I’ve mentioned, implies an origin at the deepest level where nothing can affect what exists, and what exists is selected in a purely random way. The set of physical laws could be any set at all, but not all will produce anything tangible or with duration.

It’s either that, or the other possibility – intention, where what exists is not random. When we look at the odds for the first possibility, we find we need a multiverse or something similar to make sense of the existence of intelligent life. From there one can use anthropic reasoning to explain the fact that we find ourselves in a habitable universe, which we do of course, but there must also be many other uninhabitable ones, if this one came about by chance. That way there is no enormous coincidence, which we need to rule out. It can only be ruled out in one of two ways – chance (many sets of laws), or intention.

sorry, it should have been: “…but there must also be many other UNinhabitable ones, if this one came about by chance.”

@Bill Evans, since I know you are not going to read my theory and contemplate it deeply, I will make my point this way. What I am saying is that the design of reality has nothing to do with chance. It is a MATHEMATICAL STRUCTURE. Is the circle there by chance? Are the natural numbers there by chance? Etc. What is unique about it is that it has led to reality, and there is no other structure that can do similar stuff, that is all.

You say “there is no other structure that can do similar stuff, that is all.” I have no idea how you could possibly know that, seems very unlikely. But that’s not the issue anyway.

The issue is how a structure or a setup is selected out of all possible ones, and to that you say: “it is mind boggling that such a unique structure naturally arose and then it lead to atoms and life. That is the ultimate miracle.” So I think you’re implying intention, which is one of the two alternatives I’ve mentioned anyway.

@ Bill Evans, “If a set of laws exists by chance, … Taking it to have existed by chance, which is one of the two possibilities I’ve mentioned, implies an origin at the deepest level where nothing can affect what exists, and what exists is selected in a purely random way.”

There is no way to argue with your above statement unless you agree to the following reduction about this issue.

1. Is there an axiomatic-physics (AP)?

2. Is the entire discovered-physics (DP) a sub-set of this AP? If a single DP-law is outside of this AP, then your point stands.

If there is any negative answer on either of the questions above, your point stands.

There is no point in getting to the bottom of these two questions right here at this moment. But I have shown a few “things” in this thread (or on this website).

a. G-strings

b. Neff = 3

c. Alpha equation

d. Super Unified Force equation (leads to uncertainty principle)

e. Energy distribution [67 (dark energy), 28 (dark matter), 5 (visible matter)] with an iceberg model,

f. etc.

All of the above are consequences of an AP, having absolutely nothing to do with any gadget data of DP. In fact, the prerequisite of this AP is that no DP is allowed in the base of this AP (definitions, axioms, and procedures); that is, the above AP sentences can never be the result of putting the answers into its base.

@ Bill Evans, “we find we need a multiverse or something similar to make sense of the existence of intelligent life. … but there must also be many other [un-]inhabitable ones,”

There are many un-inhabitable places in this Solar System, such as, the Sun, the Moon, the …

@ Bill Evans, “It can only be ruled out in one of two ways – chance (many sets of laws), or intention.”

The existence of an AP could be a chance or intention. But, what is the issue? If such an AP exists, the laws of nature are unique regardless of being chance or intention.

It sounds like you think these axioms apply within any set of laws. Surely you realise that with different laws you’d arrive at different axioms. I’d say you need to think out of the box a bit more.

Different sets of physical laws mean different sets of mathematics and, finally, a different set of logic.

In currently known physics an AP builds on classical logic, quantum logic and a refinement of quantum logic that is even closer to the structure of a Hilbert space. It is not too difficult to show that many aspects of fundamental physics follow from this foundation. However, the result is not deterministic. At small scale it is stochastic.

@Bill Evans, “… with different laws you’d arrive at different axioms. I’d say you need to think out of the box a bit more.”

We don’t derive axioms from laws.

Of course, a different axiom-system will have a different set of laws. For physics, there is a litmus test: if an axiom-system cannot encompass the DP (discovered physics), it will simply be deemed wrong. M-theory is indeed an axiom-system but failed to encompass the Standard Model. If we can “prove” that no axiom-system of any kind is able to encompass the SM, then nature might very well be structured with two disjoint sets of laws. But if one axiom-system is able to encompass the SM, that is the end of the story.

@Bill Evans, If I had implied an intention, it would not have been mind-boggling, would it?

Of course, at this stage of development of my theory I cannot be sure. But all indications from the theory are that such variation is not possible. As I show, the constants (including their running phase) arise from the theory; it is not like standard physics, where you are free to change the constants. That, among many other reasons, is why I say it is unlikely for other realities to arise.

Moreover, my theory invalidates the multiverse interpretation of QM, because you need the probability density arising from the whole interaction picture to compute energy.

Anyhow, out of the infinite number of triangles you will find a set of equilateral triangles and sets of right-angled triangles (special properties), so there is nothing strange about natural “fine tuning”. Now, I suppose you could use the word “chance” to describe such a unique structure that leads to reality, since many other ones, like fractals and regular automata, could not match reality. But I don’t find the word descriptive in this case.

I am not sure how to state things so that I will be clear to you. But I will repeat: there is ONLY one design that leads to our reality; as a matter of fact, there is NO other design available to any other reality. That is my conclusion from my theory.

At least now you can see how our reality is designed; from that I don’t see any possible way to design a new dynamic universe. You can try all you want. Even in the unlikely event that you could find a design variation of my theory that leads to other universes, the fact remains that reality is nothing but a mathematical structure, and that suffices to say the problem is solved. I personally don’t like that, but my preference has nothing to do with it.

People sometimes have trouble facing that there are only these two alternatives even if they don’t think they’ve found the theory of everything. Most of the people I’ve been talking to here do, which doesn’t help.

Based on the latest “particle” accelerator data people have concluded that either:

a) we live in one proper sub-set of an uncountably infinite power-set, said power-set representing the full sample space, or

b) we live in an intentional universe.

The truth is we live in one proper sub-set of a fully intentional, uncountably infinite power-set whose universe of discourse is emptiness with said emptiness being primal consciousness. Controlled novelty is introduced via a process which closely approximates evolutionary theory. I believe modern primates refer to this as “approximate reasoning,” a mixture of the two “Evans poles.” In a thousand years I could prove it but no-one in the maths or science fields would accept my proof anyway; Goddess forbid, I’m a freaking mystic (ask any mystical adept and they will tell you that, not only do many worlds exist but, by the grace of the cosmic consciousness, modern primates can access many of these many worlds; Buddhists call them Pure Lands).

Now, I don’t believe pure-science and religion are compatible but then pure-spirituality and religion are not compatible either. The over-riding objective of religion is to maintain status-quo and this is accomplished by maintaining or maximizing information entropy – ignorance! The over-riding objective of both pure-science and pure-spirituality, on the other hand, is evolving status-quo and this is accomplished by maintaining and/or maximizing information negentropy – knowledge! This leads one to conjecture that perhaps pure-science and pure-spirituality are not only compatible but complementary as well.

There have always been two paths to knowledge: mythos and logos. Whether from the perspective of the individual or society, to expend all of your energy on one path at the expense of the other is to sell yourself short; it’s like trying to navigate the entirety of life by exploring half of it. If you only investigate half of life you can only build half a map! Materialists seem to find this acceptable but then they’re trying to maintain the status-quo; that’s not serving the best interests of pure-science, rather, it’s serving the best interests of pure-religion!

“I wanted to understand science because it gave me a new area to explore in my personal quest to understand the nature of reality. I also wanted to learn about it because I recognized in it a compelling way to communicate insights gleaned from my own spiritual tradition.” – His Holiness the Dalai Lama

His Holiness has started the Science for Monks program (http://scienceformonks.org/).

When it comes to these “theories of everything” I tend to agree with Stanford physicist, William Tiller. He talks about the “ladder of understanding” which has, potentially, an infinite number of rungs. He says modern humanity has done a fairly good job of fleshing out the VERY BOTTOM rung, Ha, Ha, Ha . . . Dr. Tiller’s goal, with his Psychoenergetic Science (http://www.tiller.org/), is to establish a bridge to the second rung but in no way, shape, or form would he suggest that he has the “theory of everything.” But then Dr. Tiller not only has a stellar scientific pedigree, he and his wife have also maintained a consistent spiritual practice (meditative) since the late fifties! Go figure . . .

Finally, I pretty much agree with everything Matti Pitkanen had to say . . .

Check, on a): http://arxiv.org/abs/1111.2704: how topos theory drives the multiverse

but on the track of axiomatics, he opts out: http://arxiv.org/pdf/1111.5854.pdf

also on fqxi: http://www.fqxi.org/data/essay-contest-files/Benavides_leavingcantorshea.pdf

Karl Popper opted out for the same reason, from Kolmogorov’s topology of probability.

The real problem arises when the full set of rational quaternions must be embedded in an affine continuum without the imaginary parts creating preferred directions.

Nature might have solved this by assigning a smallest rational number and filling the unused space by a stochastic enumeration process.

This approach also offers sufficient freedom to create some dynamics in the mapping process. However, in that case the map must be recreated at subsequent instances.

Sure, the world recreated (by God) at each instant was the rationalist (and fatalist) response to the radical Buddhism of Nagarjuna. No fractals, no freedom. It’s hard, I know, but not new in our time. The issue has been alive since Pelagius was excommunicated for preaching free will, and it returned with a vengeance in the 1640s. Interestingly, d’Alembert was schooled in the heresy at the College de Quatre Nations (they still play rugby together) and turned it to the service of knowledge. That was really great, but very rare.

Matti pointed out @ http://matpitka.blogspot.com/2012/04/how-to-build-quantum-computer-from.html some time back that Turing machines represent extreme levels of abstraction.

@Orwin

Both articles that build on the validity of the continuum hypothesis use varieties of logics and try to make them the foundation of new quantum physics. The problem with that approach is that these logics can easily represent a static status quo, but fail to handle dynamics. Via lattice isomorphism with topological spaces these logics can represent geometrical concepts. So the spatial part of physics can be handled, but not the temporal part.

However, the extension to a dynamical model can easily be accomplished by using an ordered sequence of these static sub-models.

In that case an extra mechanism must be added that takes care of establishing sufficient coherence between subsequent static sub-models. The coherence must not be too stiff, otherwise again no dynamics takes place. The vehicle uses the structure of the static sub-models, but is otherwise independent of them. Thus it is not a logic system. On the contrary, it plays with the enumeration of the propositions and with the embedding of these discrete objects in a possibly curved, affine embedding continuum.

I do not find any mention of these extra facets in the linked articles.

Normal operators in the separable Hilbert space can deliver enumerators as their eigenvalues. Via its Gelfand triple the separable Hilbert space connects to continuum eigenspaces.

Nature seems to have problems with this embedding process, and at small scales it uses a stochastic enumeration mechanism that is implemented by the correlation vehicle. It is the reason for the existence of quantum physics.

Yeah Orwin, I’m starting to refer to you in my internal dialogue as “the real deal” because of your tendency to plug logic; all things arise in the mind . . . but what do you mean by “Pitty (sic) he doesn’t know that all cells are pH buffered so no way is his effect psychic in its basis?” Tiller isn’t altering the pH of cells and there’s a plethora of empirical evidence, ignored by mainstream science, which not only demonstrates the mind’s ability to affect/alter matter but its ability to affect/alter the so-called “laws of physics” in real time! Thanks for the links, by the way, I’ll have to find the time to read them.

To you and Hans: I linked to this Kevin Knuth/Philip Goyal paper on Phil’s (Philip Gibbs that is) section of the FQXi forum but I think you all would find the paper interesting; it’s called “Quantum Theory and Probability Theory: Their Relationship and Origin in Symmetry” (http://www.mdpi.com/2073-8994/3/2/171). It shows the exact relationship between QM and PM but, just as importantly, it uses the Cox formulation of PM which is a generalization of the Boolean Logic of Propositions! Here’s a brief excerpt to pique your interest:

“In this paper, we seek to close the gulf between probability theory and quantum theory, and thereby to precisely establish their relationship. For example, we show that the apparent inapplicability of probability theory to quantum systems is due to the failure to identify assumptions external to probability theory which are rooted in classical physics, and we prove that Feynman’s rules are compatible with probability theory by explicitly deriving Feynman’s rules on the assumption that probability theory is generally valid.

Our approach is inspired by Cox’s pioneering derivation of probability theory [2,3]. The first modern formulation of probability theory was due to Kolmogorov in 1933 [4]. In this formulation based on set theory, propositions are represented by sets, and probabilities by measures on sets. The key components of Kolmogorov’s formulation are what we recognize as the sum and product rules of probability theory stated above, which, at the time of Kolmogorov’s formulation, were regarded as ultimately justified by recourse to the frequency interpretation of probability. However, in 1946, Cox showed that it was possible to derive these rules from much more primitive ideas, and to understand probability in a more general way. As a result of Cox’s development, the probability calculus can be regarded as a systematic generalization of the Boolean logic of propositions. In a nutshell, his line of thinking runs as follows. In Boolean logic, existing logical propositions (well-formed statements which are objectively true or false) can be used to generate new logical propositions by using the unary negation (or complementation) operator and the binary operators AND and OR [5]. The logic is solely concerned with propositions that are true or false, and formalizes the process of deductive reasoning. Cox showed that it was possible to systematically generalize Boole’s logic by quantifying over the space of propositions in such a way as to remain faithful to the symmetries of the logic, thereby formalizing the process of inductive reasoning (that is, reasoning on the basis of incomplete information).

In particular, to each pair, A, D, of propositions, he associates a real number, p(A|D), which is interpreted as quantifying the degree to which an agent believes proposition A is true given that the agent believes proposition D is true. Cox then requires that the quantification be consistent with the symmetries of the Boolean logic. For example, due to the associativity of the logical AND operator ∧, for any propositions A, B and C, one has that A ∧ (B ∧ C) = (A ∧ B) ∧ C, which leads to the constraint p(A ∧ (B ∧ C)|D) = p((A ∧ B) ∧ C|D). These constraints on p yield a set of functional equations whose solution yields the standard sum and product rules of probability theory. Thus, Cox showed that probability theory can be understood as a calculus that systematically generalizes the Boolean logic of propositions, and that probability could be interpreted as an agent’s degree of belief in a proposition on some given evidence. Very importantly, this view of probability recognizes that, from the outset, all probability statements are conditional in nature—one always speaks of the probability of a proposition given some other proposition—which greatly encourages explicit statement of the assumptions that, in application of Kolmogorov’s formulation, are oftentimes left implicit.

Apart from the importance of Cox’s work in establishing a new mathematical and conceptual foundation for probability theory, his work also offered a methodological innovation, namely to show how one can systematically generalize a logic to a calculus in a manner that respects the symmetries that characterize the logic. In recent years, Cox’s example has been expanded into a general methodology [6] that has been used to yield insights not only into existing areas such as measure theory [7] but also to aid in the construction of new calculi, such as a calculus of questions [7,8]. It is this methodology which we employ here to derive Feynman’s rules of quantum theory.”
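The associativity constraint in the excerpt can be checked numerically: on any finite probability space, conditional probabilities defined through the product rule satisfy p(A ∧ (B ∧ C)|D) = p((A ∧ B) ∧ C|D) identically. A minimal Python sketch, with a toy sample space and example events chosen only for illustration:

```python
import random

random.seed(0)

# Toy sample space: eight outcomes with random positive weights, normalized to 1.
outcomes = list(range(8))
weights = [random.random() + 0.1 for _ in outcomes]
total = sum(weights)
prob = {x: w / total for x, w in zip(outcomes, weights)}

def P(event):
    """Probability of a set of outcomes."""
    return sum(prob[x] for x in event)

def cond(a, d):
    """Conditional probability p(a|d) defined via the product rule."""
    return P(a & d) / P(d)

# Example events (sets of outcomes); D is the whole space, so conditioning is defined.
A = {0, 1, 2, 5}
B = {1, 2, 3, 6}
C = {2, 4, 5, 6}
D = set(outcomes)

# Associativity of AND forces p(A and (B and C) | D) == p((A and B) and C | D).
lhs = cond(A & (B & C), D)
rhs = cond((A & B) & C, D)
assert abs(lhs - rhs) < 1e-12

# The product rule also chains consistently:
# p(A and B and C | D) = p(A | B and C and D) * p(B | C and D) * p(C | D)
chain = cond(A, B & C & D) * cond(B, C & D) * cond(C, D)
assert abs(lhs - chain) < 1e-12
print("associativity constraint satisfied:", lhs)
```

The chained product at the end computes the same quantity a second way; both reduce to P(A ∧ B ∧ C ∧ D)/P(D), which is the symmetry Cox exploits.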

And finally, I think Dr. Knuth makes a great point on his homepage with: “I bring to this work my experience in machine learning, which amounts to effective and efficient problem-solving. The more that is assumed in a theory, the more likely it is to be wrong. And perhaps more importantly, what is assumed cannot be understood. For example, studying the foundations of quantum mechanics by assuming all of the mathematics of a Hilbert space, basically assumes half the problem, and in doing so prevents one from achieving deep insight” (http://knuthlab.rit.albany.edu/index.php/Program/Foundations).

Best regards . . .

In my opinion, logics are only suitable for representing a static status quo. If you want to use one as the foundation of a dynamic physical model, then you must use an ordered sequence of such static sub-models. Further, you should add a mechanism that ensures sufficient coherence between the subsequent static sub-models. On the other hand, the coherence must not be too stiff, otherwise again no dynamics will take place.

A solution might be to add sufficient low scale blur between the static sub-models. This indicates that dynamics as well as blur is added from outside the logic systems. It is controlled by the extra mechanism.

Hans,

The dynamics are expressed in the calculus, that’s why one generalizes a logical structure, which is basically an algebra, to a calculus; in fact, in his FQXi paper, Dr. Knuth refers to such as a “process calculus.” To me, it’s the same problem everyone confronts when trying to take the discrete to the continuous. It’s basically the problem of how to maintain faithfulness to the underlying discrete structure in a continuous process. In the work referenced symmetries play a huge part in maintaining this faithfulness which, to me, says a bit about the it from bit question.

Symmetries are nothing but patterns; David Deutsch points out that information can be morphed to fit a variety of media and then asks the question, “What is the general form of information?” I think Ben Goertzel makes a rather formidable argument with his Pattern Theoretics that pattern is the general form of information and what is mathematics? It’s the study of pattern, whether discrete or continuous. So the question becomes: Are patterns generated by reality or does reality emerge due to pattern dynamics? Ben Goertzel, a mathematician, argues that pattern dynamics give rise to reality and has developed a rather cool model of such involving his aptly named magicians and anti-magicians. These magicians and anti-magicians (mathematical functions) run around casting spells on each other creating in the PROCESS, structural conspiracies, which are simply patterns conspiring to maintain themselves (i.e. complex adaptive systems)! Interestingly enough, given that both men have or still do work in machine learning, Kevin Knuth takes the opposing view – reality generates pattern!

To me, the importance of the work referenced is in exposing implicit assumptions inherent in the foundations – a continuation of the seminal work carried out by Cox and Jaynes. But of course I’m a process philosopher at heart so . . .

Anyway, to all of the musicphiles out there, I caught this on the blogosphere: a new album by @c called “Up, Down, Charm, Strange, Top, Bottom” (http://thestaticfanatic.blogspot.com/2013/08/c-up-down-charm-strange-top-bottom.html). I have not yet listened to it, but I have never been disappointed by the static fanatic…

Installing dynamics is a complicated action. Logic gives relational structure and via isomorphism with Hilbert spaces it gives also geometric structure. It is possible to use both isomorphic sub-models to represent a static status quo and install dynamics by an external control mechanism.

It seems to me that the continuum limit of Hans’ kind of model is just Huygens’ Principle. His generation assumed that continuity is integral to reality, because of inertia, for the sake of the calculus… and that conviction took hold in the nineteenth century, with wave mechanics (Hamilton, Rankine, Mach), even as Huygens was tripped up by representation issues. But the Standard Model has no continuity principle, and the current view in quantum philosophy is certainly that Hans’ position cannot be excluded:

http://takingupspacetime.wordpress.com/2011/02/01/objective-chance-and-anti-humeanism/#more-542

Now in fact Lawvere intended Category Theory for no kind of string speculation, but for the Continuum Mechanics of Coleman and Noll…. meanwhile they deconstruct the continuity principle into Constitutive Relations and Jump Conditions which span the gaps between materials…

The SM and String lobbies are way isolated in the reality of current math and applications…

WEIRD

Hi Wes,

If you like Knuth, you should look at this intuitionistic quantum logic, built on the poset approach: arxiv:0902.3201v2. Background here: http://arxiv.org/abs/0709.4364

These guys are impressive, but the road winds on: I want a closer logical analysis of conditional probability, to match current Monte Carlo statistics, with look-elsewhere, etc. And I want fine-grained analysis of contingencies, especially accessibility.
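The look-elsewhere correction mentioned here is easy to illustrate numerically. Below is a minimal toy sketch (my own, not from any of the papers linked in this thread) comparing the standard trials-factor formula for a global p-value against a direct Monte Carlo estimate; the bin count and local p-value are made-up illustrative numbers.

```python
import random

random.seed(1)

def global_p_value(p_local: float, n_bins: int) -> float:
    """Trials-factor formula: probability that at least one of
    n_bins independent bins fluctuates past the local threshold."""
    return 1.0 - (1.0 - p_local) ** n_bins

def monte_carlo_global_p(p_local: float, n_bins: int, n_trials: int = 20000) -> float:
    """Estimate the same quantity by simulation: each pseudo-experiment
    scans n_bins independent uniform 'p-values' and asks whether any
    one of them beats the local threshold."""
    hits = sum(
        1 for _ in range(n_trials)
        if any(random.random() < p_local for _ in range(n_bins))
    )
    return hits / n_trials

p_local, n_bins = 0.01, 30
print(global_p_value(p_local, n_bins))        # ~0.26: a "1% local" excess is not rare globally
print(monte_carlo_global_p(p_local, n_bins))  # simulation agrees closely
```

The point of the toy: a locally impressive excess becomes unimpressive once you account for how many places you looked, which is exactly the conditional-probability bookkeeping at issue.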

Working with Chris Heunen you find Samson Abramsky, who is hot on that trail:

Sequence & Concurrence in Games and Logic: http://arxiv.org/abs/1111.7159 – Structuralism reborn!

Relational Databases and Bell’s Theorem!! http://arxiv.org/abs/1208.6416

Abstract Scalars: http://arxiv.org/abs/0910.2931 -getting warm on the Gibbs Paradox problem

Knot theory to computation: http://arxiv.org/abs/0910.2737

I begin to see the question here as one that can be resolved by simple mathematical, arithmetical considerations with which everyone should be more familiar. Tienzen… from a wider axiom perhaps we can sort out the one theory in a landscape of axioms that defines what is real and how we see it—such as the standard particles, grand unification, and so on…

Axiom of Quasi-choice

L. Edgar Otto 29 July, 2013

Grand

Singlets

Unified

Gravity waves

Many paths the real

Collapse into one choice

Messages in Klein’s bottles

Indefinite Infinity

Dimensionless constants transfinite

Sleeping ducks hide their necks inside their wings

* * * * *

@ L. Edgar Otto, “… Tienzen…from a wider axiom perhaps we can sort out the one theory in a landscape of axioms that defines what is real and how we see it— such as of the standard particles and grand unification and so on… ”

I have made enough comments on this subject. But you have just pointed out the key point of this issue (the landscape of axioms). Thus, I will beg Phil’s permission to answer this question. By the way, this blog is one of very few physics blogs that gain my respect for its professionalism and honesty. Thanks Phil.

The term “Multiverse” is now creating massive confusion both in physics and in linguistics. Thus, I will first make some clear definitions below.

“Nature” is the nature of “this” universe. The nature of an other-verse will be written as “nature of other-verse”. The nature of this universe encompasses at least three parts.

1. Physical universe (Earth, Solar system, galaxies, etc.), excluding life,

2. Life,

3. Numbers (natural numbers, rational, irrational, real, and imaginary).

Are there three different sets of laws for these three parts? Are these three parts governed by a unified set of laws? These are the fundamental questions, but I will not discuss them now. For the present issue, I will exclude the “Numbers” from this nature for the convenience of discussion.

A. For the physical universe, only fundamental physics which is discovered (excluding all non-verified theory) will be considered, and it encompasses the following key points (only).

i. Standard Model particles

ii. Nature constants, such as Alpha

iii. Uncertainty principle

iv. Gravity (Newton and Einstein)

v. Cosmology (Planck data)

B. For life, only three issues will be considered.

i. Life processes (reproduction and metabolism),

ii. Conscientiousness (the ability of knowing self from other, about individuality),

iii. Intelligence.

The above (A and B) is considered the “reality” of the nature of “this” universe. Yet, up to this point, the discovered physics laws have very little direct “connection” with the reality of life, especially in terms of conscientiousness and intelligence.

With the above “reality” clearly defined, we are now able to launch a “beauty contest” — who can “design” the most beautiful universe? And there is no restricting rule for the design. If the design (not the real one) is an axiomatic system, then there is no restriction in the landscape of axioms. Yet, there is one and only one rule for this beauty contest — our only competitor is “nature”. Can we come up with a better design than nature’s? At the least, our design should be as good as nature’s. If not, then we are the losers.

M-theory is of course an axiomatic-system. Yet, up to this point, M-theory has failed on every contest (the SM particle, the Alpha calculation, etc.).

In this thread, I have discussed an AP (axiomatic-physics), and it encompasses,

a. Alpha equation

b. G-strings

In fact, the Alpha equation is also the consequence of the G-strings. The G-strings give the following consequences.

1. Standard Model particles,

2. Super unified force equation (then, uncertainty principle),

3. Neff = 3, as there is only exactly enough room for housing 3 generations of quarks,

4. Planck data.

The above makes a great contest with nature in the part of “physical universe” in terms of the “discovered” physics. Yet, there are a few very important contests in the “Life” arena, as follow,

i. G-strings give rise to four-color system (red, yellow, blue and white) which ensures the “individuality” can be implemented somehow.

ii. G-strings show that a Turing computer is embedded in both proton and neutron, and it ensures that a computing device is available for … .

That is, only with the G-strings “design” can the Standard Model particles give rise to life. Without this G-strings feature, the SM particles are dead stone, without life.

Multiverse-ists are welcome to this beauty-contest. There is no restriction on choosing the “base (axioms)”. There is no need for any experimental test, just a “design”. Wait, there is a restriction about the base, no known discovered physics can be a part of the base.

I have shown the consequences of this AP (axiomatic-physics) here. If anyone is interested in its base, here is the gateway (http://www.prequark.org/inte001.htm ).

Tienzen… I rather like the idea of seeing a proton as a Turing machine… universal or multiversal… this must be the part where we include the numbers in the various designs of what to include in expressed paths for the best of probable or possible worlds (to use concepts of earlier times). Much of the ideas of modern physics at the foundations derive mass or gravity from some finite excess in the chirality or spin beyond the fine tuning of an imagined balance. Recently the science magazines noted that schizophrenics cannot see the difference in face-reversal images… a deeper understanding goes beyond this initial confusion if one survives it or has evolved to do so; thus one finds something of what an individual is in a din of voices chasing the spin velocity of mental light.

What sort of mediator, wave-like or zero-point-like, would be required to entangle or transfer energy between Higgs-like particles? To use the quantum paradigm?

In this foundational sense the idea of branes is useful if we generalize them more; thus our concepts of physics, including the loop and gauge views, correspond, if we survive the confusion… Then maybe we can get to the hard questions, or a little closer. On the way there are many useful applications of this wider sense of physics.

I think that new physics will appear only with the violation of PT, while maintaining Lorentz invariance of the spacetime continuum. There are infinite continuities of space-time, which could also explain QM.

CPT is not violated; the charge is generated by the deformations of spacetime when PT is violated, but with Lorentz invariance preserved.

The suppositions generated by the Bs decay suggest that SUSY does not exist in nature.

Tienzen Oh… the symmetry of the four-color idea is interesting and something I considered long ago… so we concur… but structurally we note that a fifth color enters as a possibility when things are colored regions arbitrarily. In this case we do need more elementary proofs than recursion or exhaustion… and that is not out of reach for us all who have met here.

Peirce wanted to open this trail, and now Marni finds herself way out there on it.

@ L. Edgar Otto,

For physics, we are about reaching our gadget-building limit. The LHC will trash itself very soon (in 10 to 20 years). The lifetime of any new gadget will not last very long either. Thus, many theories are hiding behind this situation. Their predictions are simply beyond the reach of any gadget in the foreseeable future (the next 50 to 100 years). This is why the “beauty-contest” can come in. For this beauty-contest, we can “design” a universe with infinite degrees of freedom, without consideration of any prediction or test verification of its base. The only criterion is that the outcomes of our design must encompass the nature of “this” universe, not less. Again, we must not put any known (discovered) physics into the base of our design; that is, do not put the answer into the design at the beginning. Obviously, the base of this G-strings is not experimentally verified. But for the beauty-contest, no experimental verification of the “base” is required. Only its “consequences” matter.

I have all the respect for those zillion-unknown-verses. But, please find one-verse (out of those zillion-verses) which is able to come to this beauty-contest, that is, compete with the nature of “this” universe. If the Multiverse-ists cannot find one-verse for this simple task, I will simply drop the multiverse idea.

In the G-strings, a Turing computer is embedded in them. If multiverse can show that life needs no computing device, it will be a great knowledge to know. If it can show that the laws of life have nothing to do with the laws of physics, it again is an Amen.

In addition to this beauty-contest, there is something verifiable. In G-strings, there is no room for any s-particle (besides, it is not needed). And LHC might be able to provide some info on this very soon.

Turing machines have nothing to do with it. You people are too obsessed with computers. Also, the word “schizophrenics” is unscientific as this “diagnosis” is made based entirely upon what the “patient” tells the “doctor”.

Physics is perhaps also a diagnosis, like schizophrenia or fibromyalgia, that is given when the diagnosis escapes them… Anyway, Stephen, I was meaning to connect with you, for you seem to understand the implications of numbers and structures… to say kudos to your stand… and to ask what you may think of the idea of dimensionless constants as part of the natural description of sets of transfinite numbers… and so on…

What do you-all make of 4-quark charmonium? Does it pre-empt the 4th color with an N=4 solution? Can G-string theory give a complement within Confinement? Is that different from the white color? You can see I’m still plugging logic here…

Because I want to turn it all inside-out, Moebius-style, to match topological charges and other subtleties in solid-state (real naff in the hep-th orbit).

Post=Turing=Church etc. is just the whole of first-order model theory, which is everywhere and (almost) everything in the mainstream. 4-color graphics of God’s plan are also two a penny: Carl Jung used to play the glass bead game with them. So right now (ignoring Rausher rousting Matti) maybe Edward Tiller has the last laugh on us just trying to take the first step to rung two of the ladder. Pity he doesn’t know that all cells are pH buffered, so there is no way his effect is psychic in its basis. The projection, of course, remains as mysterious as ever, but has a palpable function in the way a mother does help to stabilize the immune system of her newborn. This is a classic example of Tienzen’s Conscientiousness, which is a memorable idea.

Stephen, Edgar, I see a new wave rising with Anderson’s log-algebraicity – logs for algebraic numbers – with a huge interest reflected at the noncommutative geometry blog: http://noncommutativegeometry.blogspot.com/ and more of an inside story from Rudy Perkins: http://rudyperkins.files.wordpress.com/2013/07/perkinslatalk.pdf.

There are some challenging continuations, and a whole new view of transcendental numbers. I rate this as the new way to tackle the Gibbs paradox, and specifically why it takes two entropies to yield one temperature, which then comes out complex, and as a dimensionless number.

Very interesting, Orwin… it seems a good direction, but I do not think they are looking deeply enough into the idea of what a scalar operation is as a particle; it is not an ending to unification of the physics… along with wider ideas of even new operations in terms of supersymmetry as a general principle (one that could suggest an interpretation of a double entropy as mentioned here)… Why is there a directed complex of asymmetry? It is not enough to suggest complex numbers contribute or match what we can get by merely doubling zero or positive groups in their algebras, which as in E8 appears overwhelmingly the case (as Rowlands suggested as a possibility).

In the general logic of the quasifinite picture, with such transcendental numbers or ideas of angle laws, the connections may be intermittent, much as we seem to observe the independence of fields of dark matter. This simple physics (and those links were rather hard to read and decipher), relying on simple physical models such as the distinction between percolation and diffusion, only needs a little more generalization as to how we define n-dimensions and n-symmetry… see my last pesla.blogspot for the photo I worked out testing my new camera…

If you mean some sort of log as exponentiation, that is most likely not enough to describe all matrix operations, any more than it can for the usual tensor wave scalars multiplied into the 4×4 matrix. Of course it should be at least 16×16. Just as meaningless division can be called meaning-free, dimensionless numbers can be seen at the heart of distinctions as dimension-free.

Fair comment Edgar, although it’s early days yet with log/algebra. As for directed complex asymmetries, that’s pretty much the mystery posed by the CPT theorem and its violations, which now spills out into the vacuum. Closer to your way is this very readable work from Zihua Weng at a Chinese tech university: arxiv:0811.0066v5.

His octonion representation yields invariants in linear and spin magnetic momentum, and power density from angular momentum (like a Poynting Theorem), all with gauge-related anti-commutative uncertainty terms (which the dude wants to throw away).

The WEIRD part is that *> the anti-commutators are symmetry-broken internal representation <* and possibility of consistency proof, just where you'd think it impossible.

Meanwhile, it occurs to me, Edgar, that your original spin-varied crystal matrix allows schematic representation of semiconductor possibilities, and thereby the electronic basis of computation capacity.

I must say, it’s bugged me for years that there is no accessible introduction to semiconductor electronics: the original corporate lock-in. Now that Carver Mead has retired to mull over fundamental constants and the like, he should put some money into opening up the area: after all, it’s like the crunch-zone after the shock of the aborted quantum revolution that harried his early career…

So the WEIRD PART got folded into the inner hunger of our ersatz reality, the yawning maw of materialized, plasticized ignorance. Now I understand why Norbert Wiener thought entropy was EVIL, although I still don’t agree…

The WEIRD part is that *> the anti-commutators are symmetry-broken internal representation <* and possibility of consistency proof, just where you'd think it impossible. etc.

I don't know whether to laugh or rant.

I mentioned you Orwin, on my pesla blogspot com in relation to superconductivity – if I understood what you saw in relation to it- also a call for speculation as what some biological implications may be from such loop and string models where they seem to match or apply…

Having offered you a speculation as theory model building, and getting used to the sense of it… I extended the idea into what I tentatively call Higgy Biological Conjugation… if you care to refute, read, or add to the wide and dramatic new idea of such physics. The photos at my blog http://www.pesla.blogspot.com include a general statement concerning this dynamic mechanism in relation to questions here on life’s rarity and the universe, the numerology of maximum symmetry groups, and the idea of complex conjugation over the octonions in the photo of the spider suspended on the hidden strings of a web, a physical idea at least of the subconscious and perhaps some relation to “nightmare scenarios”. The duck asleep suggests to me the Klein bottle and all that implies in code reading or the intersection of the two Moebius strips, knots and so on. At last, an arrangement of magenta flowers in acknowledgement of hope in the future of physics, and praise for your flower words that are also a passion and joy as pure poetry.

I think that string theory, special relativity and the octonions are intrinsically connected and generate the spacetime continuum out of structures of continuity and discreteness. There the imaginary part (time) appears, which deforms space and gives a temporal orientation to the fundamental entities that generate the manifolds: non-commutative matrices, quaternions, octonions in 8D, through to curved Riemannian manifolds and other hyperbolic structures.

I really agree about imaginary parts and deformations: therein lies the possibility of freedom – a cell morphing its way through the surrounding water. The tricky part is to represent morphable bodies: this seems to require taking a limit of surrounding polygons, giving the transfinite diameter. Interestingly, it is this feature of Donaldson-Witten theory (arxiv: hep-th/9705138, hep-th/9811198v2) that requires Kripke-Joyal forcing semantics and intuitionist logic (arxiv: hep-th/9712241v2): if the object is “not the surrounds”, the logic requires disproof in place of proof.

Folks. So much great stuff here the past few days. I feel y’all are definitely on to something. I’ve just realized that there is no linear order to these ideas here… so between this and programming and chasin’ tail it’ll take me a while to digest the relevant bits of infinite wisdom dropped here and formulate a coherent reply. Thanks for the thoughts Edgar… understanding is one thing, being able to communicate it, another. Peace. I’m being rather quiet because I’m contemplating the 4d body representation thing… someone very close to me told me the film Logan’s Run was filmed not far from where I grew up and live.

I am sorry to ask such a silly question but what does geometric unity say about the three generations of fermions?

You unlock this door with the key of imagination. Beyond it is another dimension: a dimension of sound, a dimension of sight, a dimension of mind. You’re moving into a land of both shadow and substance, of things and ideas; you’ve just crossed over into the Twilight Zone.

About generations:

All elementary particles are generated by a mechanism that is based on a stochastic spatial spread function. (a combination of a Poisson process and a binomial process, where the binomial process is implemented by a 3D spread function).

The three different generations correspond to different statistical characteristics of the corresponding generators.
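Hans’s description leaves the generator unspecified, so purely as a hypothetical illustration of “a Poisson process combined with a binomial 3D spread function”, here is a toy sampler; the rate and step count are invented parameters, not anything taken from the HBM.

```python
import math
import random

random.seed(7)

def sample_poisson(lam: float) -> int:
    """Knuth's algorithm for a Poisson-distributed event count."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def spread_3d(n_steps: int, step: float = 1.0):
    """Binomial 3D spread: each axis takes n_steps +/- moves, so each
    coordinate is a (shifted, scaled) binomially distributed displacement."""
    return tuple(
        step * sum(random.choice((-1, 1)) for _ in range(n_steps))
        for _ in range(3)
    )

def generate_particle_cloud(rate: float, n_steps: int):
    """Toy 'generator': a Poisson count of generation events, each one
    placed in space by the binomial 3D spread function."""
    return [spread_3d(n_steps) for _ in range(sample_poisson(rate))]

cloud = generate_particle_cloud(rate=5.0, n_steps=16)
print(len(cloud), cloud[:2])
```

On this picture, “different statistical characteristics of the generators” would just mean different rates or spread widths per generation; the sketch only shows the sampling mechanics.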

I thought Geometric Unity was geometric rather than statistical.

Fermions exist in 3 generations.

About geometric unity:

What if a black hole represents a well-ordered state of greatest density? This can be the same state that the universe was in at the start of its evolution. This would mean that the activity of the BH can be interpreted as a local return to its natal state.

The HBM sees QP as a form of fluid dynamics, where QP uses the differential equations and cosmology uses the integral equations.

These integral equations cover volume integrals and surface integrals. The condition that the surface integrals are equal to zero divides the universe into compartments. Particles do not pass these surfaces, but potentials can.

Inside the compartments the BHs suck in all matter until finally only a single huge BH results. After quieting, this final BH is disrupted and the never-ending story goes on with a new episode.

I am a huge fan of the never-ending story model but the meaning of those acronyms eludes me.

The driving force behind this never-ending story is the gravitational potentials that can pass the borders of the compartments. The gravitational potentials are raised by all the compartments that exist in this divided universe. This potential must be enormous.

I definitely agree that the potentials can go through the boundaries. That is a foundational principle of my research. Since potential is relative, I don’t see an enormousness (compared to the scale of the universe). I like to think of this potential as a golden spiral circling through Fibonacci cells.

In the HBM potentials are emitted as spherical wave fronts. The wave fronts that are emitted by a given particle combine in ultra-high frequency carrier waves. The contribution of a wave front to the potential is given by a dedicated Green’s function. The transport of the carrier waves is governed by the Huygens principle.

In the HBM both the particles and the wave fronts are (re)generated at every progression step. The size of the progression step defines the ultra-high frequency of the carrier waves. (Photons are low-frequency modulations of these ultra-high frequency carrier waves.)
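Since the “dedicated Green’s function” of the HBM is not spelled out here, a toy sketch using the familiar static free-space Green’s function 1/(4πr) can at least illustrate the superposition idea: each emission event contributes a spherical-front potential, and the contributions of all sources simply add. This is my own stand-in, not Hans’s actual function.

```python
import math

def free_space_green(r: float) -> float:
    """Static free-space Green's function of the Laplacian, 1/(4*pi*r):
    the potential contribution of one spherical wave-front source."""
    return 1.0 / (4.0 * math.pi * r)

def potential(point, sources):
    """Superpose contributions of many emission events (Huygens-style):
    each source point contributes via the Green's function, weighted
    by its strength."""
    total = 0.0
    for (x, y, z), strength in sources:
        r = math.dist(point, (x, y, z))
        total += strength * free_space_green(r)
    return total

# two unit 'emitters' straddling the origin
sources = [((1.0, 0.0, 0.0), 1.0), ((-1.0, 0.0, 0.0), 1.0)]
print(potential((0.0, 0.0, 2.0), sources))
```

The modulation/carrier-wave part of the HBM story would replace the static kernel with a retarded, oscillating one; the additive structure stays the same.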

Hans, and Jonathan (I like your website, Jonathan, so I thought I might try a little deeper explanation in an illustration from my pesla blogspot com for this issue of particle generation). I also mention some of the ideas others have offered here in a general context. It is combinatorics 101, but how many of you have time, as Marni once remarked, to learn a whole new language? Yet here I speak of what should be our common terms; well, at least a thousand words in a picture.

Thanks! That is an interesting way to look at it. I also have a model that explains the three generations, albeit a bit more simply.

Have you read “The Truth About Geometric Unity?”

http://vixra.org/abs/1307.0075

Can we tone down the WEIRDness level? that would be greeeeeeeat. http://www.computerworld.com/s/article/9241371/Spy_agencies_want_low_energy_system_to_solve_interesting_problems_

Hooke’s law is coming to mind

Scientific American have a Future of the Web piece by Markus Hoffman of AT&T’s Bell Labs – sounding like a complete dwerp, but AT&T have been on the skids for decades. It’s all about putting Metadata (library catalog tabs) in internet packets for easier snooping. Cringe, that’s the beat that saw old Ireland back into the bog.

For chrissake, can’t we have an open system for posting your shopping list and shopping range online?

Where you live has nothing to do with it, in a world where people will fly to HongKong or Singapore just to shop. And income is again irrelevant, given huge overhanging debt – its your credit rating that matters, which is online anyway. So we need a secure open system for linking shopping lists and credit ratings. Period.

Orwin, sounds about right; it’s incredible why anyone in their right mind would think this “meta-data” sewer stream would have anything of value whatsoever. In my view, they are just points in a bunch of point processes… now, the human beings triggering insertion of points into this cacophony… well… just be careful what kind of feedback you entertain. Some of the things people think are “real” are just downright bizarre. I’ve been trying to get people to use RetroShare, retroshare.sf.net… I think trust will be a big issue in perpetuity… whether it be domains, people, institutions, computers, animals, whatever. The vast majority of web users have no clue or interest in these things, however.

http://phys.org/news/2013-08-obama-met-telecoms-chiefs-surveillance.html buncha tail tucked between their legs shit-eating bastards. Namaste, and whatnot. (Waiting for Robert to tell me to take my meds again)

So my friend has this to say….

——————-

I’m going to comment on observed behaviors which have been extensively analyzed. I’m not making up shit, much.

These guys believe they’re brilliant, therefore anything they don’t understand must be totally brilliant, and so intrinsically good, as they are good. Quantum theory is amazing. Therefore, if their process complies with QED theory, by its incomprehensibility, it must be great. Consider cargo cults. They mimic in their shrines the technologies that gave them great wealth. Your people raise digital shrines. Lead them in prayer.

———————-

What say ye?

My friend does have a fairly pessimistic worldview, I must admit. Does it have anything to do with effing the ineffable?

@Orwin.

Your last post here had interesting links to Wes – I added more visuals to my last pesla.blogspot with some interesting ideas, more intuitionist than, say, Phil’s interest in random graph theory. What am I seeing here, I sometimes wonder, sorting thru the numerical coincidences… thanks.

How else do you taste molecules, which are always rolling, with six primary tastes, by tradition? I’ve been looking out for a lead on that one for a long time. Now wading through Heath on Euclid for leads on diorismi – distinctions, the other, older way in geometry, a play of contrast as in Spencer and the chiaroscuro of Expressionism. So Paul Tannery, failed tobacco executive and academic opportunist, had a fundamentalist rant on the subject and started a racket called Philosophy of Science, where too many today are still imprisoned.

One had four levels of abstraction:

*position (locale, point if you insist)

*magnitude (circle)

*species (polygon)

*ratio (instances, with symmetry-breaking)…

Orwin, that is a most interesting and fresh approach. After all, when it comes to the sense of smell it is more complicated, as if we can make very fine distinctions on chemical structures. It is amazing to me that the sixth taste, named I think by a Japanese term, as in MSG sensing, was not recognized until rather recently. It just cannot be as simple as six degrees of freedom in an ensemble of gases. I imagine that within a small system, the spacious now or the breadth of a quantum bra-ket, we have causation of what is preceded by x, and a range of z’s may follow. But outside the brackets, or the now, causation is logically just an historical fiction. I wonder also if ideas of a difference of dimension not only define the boundary of a material object but constitute its materiality… a theme used in science fiction. In any case, in such a developing system, if the terms are set down, these can result in closed loops which, short of a full higher space, limit the information to spinning entities… perhaps a cycle here deeper than whatever is broken that defines entropy in isolation as cyclic temperature (or time). The speculation that the 20 involved in Dirac’s algebra corresponds to the 20 amino acids, as in Peter Rowlands (and my own independent ideas starting with n-D n-symmetry in 1964, the bilateral symmetry important), is now common speculation as to the unity of biology and physics. We are as organisms a compressed shadow of such higher structures.

From the mental or natural physical processes here considered, I have imagined implications of strong applications of my last posting’s ideas… the fact is, nets should not be thrown into the sea at random, nor should we constrain things by the strength and weave of our nets and limited vision. We have free access, beyond anyone who would meter and charge for it, to such physics in general space to do most anything we can imagine; the world to come could be stranger even than we have dared to dream. But the idea is too new to post anywhere for now.

Fellows,

Finding the right place in the technology, that and independent development, has shown me where my education is lacking, the steps I assumed were close to the foundations and elaborated on, and that the way science is taught can be an arbitrary bias if not outright wrong for some projects. I have had to go back to the books to catch up and understand our basic electronics projects, where so much of our era of science began (think of Einstein in the early days of the patent office, around new inventions). Although my Dad was a great engineer, he felt hampered by the higher math, so in that sense his best teaching was having me take apart various electronic components, or building antennas. I wish I had more time and, yes, budget, but with greater theory things go rapidly.

Two things have come up synchronously in my current concerns, which I regard as hints of what could develop and are very much understated. These are connected in the brain and nature blog threads. Some are concerned with the observation effect, be it a matter of quantum stuff or some form of consciousness; for that, some have suggested the solution is to make your own internet.

The first is that form of particles that looks like my last musings, and of which Lubos has a post also (in which he tries to cite people who in the past were the origins of the idea, now that it is published and became fashionable; this precedence for special purposes, cited other than that it gives us a common name for effects, I do not think is historical or viable outside the brackets of our current spacious now in principle). Let us blame Harvard perhaps, and science as political or social, or economic, collective or private.

http://io9.com/mystery-particle-could-revolutionize-personal-electroni-1077218876

The other aspect to be considered, concerning the brain as computation and parallel ideas from nature, is a bold attempt from IBM at a new programming language.

http://io9.com/new-computer-programming-language-imitates-the-human-br-1080026417

In these things at the frontier we surely can go deeper, where they are intimately related: the question of what is needed to manufacture or program the hardware and software, as well as a better understanding of what the mind is (perhaps more than a quantum induction system). Given this we can go further than the dramatic statement of a ten-fold miniaturization, and so on to higher spaces. If we know, and can discern, where the natural or artificial differences show up, there will be ways to compensate for this great ambiguity of material or mental dualism: autistic cures, for example, in the intelligent tinkering with, say, what is lacking in particular genes.

But I do not mean to add to the abstracts of abstracts of news, nor be a node in how the knowledge is presented or used, passed as news from my time, place and circumstance. One of the mysteries of QM is the sense that we all share something of science and wisdom. As was said, and I forget by whom… the Neanderthal gazing at the beauty of a sunset is in some ways the same person we are – at least for now, and we can be so much more than that.

Thank you all for focusing my thoughts and for your inspiration.

Edgar, I always enjoy your posts here even if I find it hard to interpret such abstract geometric representations of such concepts; you’ve intuitively expressed ideas which have been floating around in the back of my mind without a firm manifestation in the form of a sequence of words to express them. I just ordered http://www.amazon.com/gp/product/0821851993/ref=ox_ya_os_product – it looks interesting…

Peace,

Stephen

So kind of you, sir

I added some drawings to show the space a little better in the idea of the geometry and algebraic patterns: my first attempt to see, in nerdy terms, the ASCII code for the plane in seven dimensions of color (thus an image of three-space). As I close this session thru a hot spot on my phone, with but a few posts, I checked Science Daily and found this:

http://www.sciencedaily.com/releases/2013/08/130810063645.htm

Portal? Well, the vanishing point that the authors suggest leads to something deeper than the Higgs field (dark matter; and would this not extend things, as a supersymmetry discovery would, for the Higgs and the standard theory?). I did post, not that far back, the idea of the Higgs as a dynamic terminator and initiator akin to biological mechanisms, which suggests there is something more, or missing, in all this. I am amazed just how far our electronics has progressed based on fudged values and uncertain foundational ideas within a tolerance. Perhaps old Ben Franklin is relatively right after all, as the formulas are still workable if the lightning does not go through us to the ground when we all tie keys to our kites. All this since Maxwell, and his displacement current was imagined as leaks through the ether!

Again, our theoreticians come so close with not enough imagination, then hit a brick wall so to speak… Still, what we are is the frontier of what amazes me if I think about it… Caution is perhaps better than smugness that we know what is going on, when it comes to what most any of us can find with our doubts as a barrier, of which surely our minds act as portals to wisdom.

Peace

What do y’all think of the Wikipedia article at https://en.wikipedia.org/wiki/Electromagnetic_mass ? I read that thing about the “Higgs portal” … bewildering.

Elementary particles do not possess internal kinetic energy, but composites have internal kinetic energy that is carried by their constituents. That kinetic energy, and any electromagnetic energy, adds to the mass of the composite.
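As an aside (my own illustration, not from the comment above): in special relativity the invariant mass of a composite is computed from the summed four-momenta of its constituents, so internal kinetic energy does contribute to it. A minimal sketch in Python, in natural units (c = 1), with hypothetical photon four-vectors:

```python
# Invariant mass of a composite from its constituents' four-momenta
# (E, px, py, pz), in natural units (c = 1): M^2 = (sum E)^2 - |sum p|^2.
def invariant_mass(particles):
    E = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return (E**2 - px**2 - py**2 - pz**2) ** 0.5

# Two massless photons of energy 1 (GeV, say) each, moving back to back:
# neither has rest mass, yet the pair has invariant mass 2, carried
# entirely by the constituents' energy of motion.
photons = [(1.0, 0.0, 0.0, 1.0), (1.0, 0.0, 0.0, -1.0)]
print(invariant_mass(photons))  # 2.0
```

The same formula gives the mass of any bound system from its parts, which is the point of the comment: the composite's mass exceeds the sum of the constituents' rest masses.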

You see just how esoteric the correct understanding of special relativity is: relativistic appearances, as I prefer to call them. All true idealists following Poincaré remain entangled in the illusions.

And Feynman’s work on self-energy is then lost behind the veil, leaving zero chance of getting through to Yukawa, who shared the Nobel but thought differently. That’s where the longitudinal wave gets placed, with solitons.

It all goes back to ignoring Ernst Mach and the simple fact that a sound wave is NOT a simple harmonic/sine wave but a shock-wave, which breaks the symmetry of the wave-form so introducing a directionality. The only continuation at all easy to follow is jump conditions in continuum mechanics.

In any case, there’s this prior asymmetry still to be accounted for in the Higgs hunt, hidden in the isospin concept. Way to go, way to go…

Hansvanleunen,

In the black box, which we can access by dual particles, we assume there is some sort of structure or principle in the vacuum, such as vibrations and so on… or perhaps a brick wall of absolutely nothing… who is to say how many levels such Higgs-like fields may go, containing black boxes and so on? I understand that the contribution includes the internal (and perpetual motion, as if we can imagine them carrying heat) ensemble of particles to the energy, which may be a displacement current of zero in an ether.

My question is how this is achieved or done, as it limits us to the standard theory, and why does it seem not enough to account for the total energy? If we are discussing wave forms here, and the differentiation of a wave is a wave, how can that tell us of things that are outside it, as if some black-box measure of ignorance in the quantum sense? What would you add that your program as a method may explain of this… Energy is also expressed as structure (say the volume of the universe) as well as action-frequency or mass-lightspeed.

@Edgar

The HBM derives geometric structure from its foundation, which is quantum logic. Quantum logic can be refined to Hilbert logic, whose structure is much closer to the structure of a separable Hilbert space. Together these logics form a hierarchy that is reflected in the Hilbert space. Quantum logic propositions represent closed subspaces of the Hilbert space. Atomic Hilbert propositions that span a quantum logic proposition correspond to Hilbert base vectors that span the corresponding Hilbert subspace. The quantum logic proposition represents a building block, such as an elementary particle. The atomic Hilbert propositions represent the constituents of this building block. They represent the stepping stones that together configure the building block.

Thus, via these isomorphisms, the logical systems obtain their geometric explanation.

Allocation operators can be used to enumerate the Hilbert base vectors via the eigenvalues of the operator.

hansvanleunen

If the eigenvalues can, by quantum logic, describe the structure, and that is further refined by Hilbert base vectors so as to specify an elementary particle, why do these need to be connected to describe a unified system, and why can they not be discrete themselves? Perhaps in the totality of this vision we again expect a totality from a closed system of probable closed systems. Moreover, is there a sequential (digital) logic here, as quantum logic? Cannot such a hierarchy of distinct levels be imagined as a hierarchy of quantum logic systems, as surely nature seems to do in actuality with particle generations? Or can there be something more general than Hilbert space for what we focus on as a potential unity of natural laws? Within a closed system it does not matter how far we confound the issues with the expanding terms of a tower of Babel and a particle zoo.

But there is nothing wrong with this viewpoint as a good statement of what we have established as the standard and known body of wisdom; I just expected something new to be noticed or arise in the test or method of your system. To say that a Hilbert entity contains the information of all other such entities, as if the world is assumed everywhere connected, is still a mystical or metaphysical principle, one that, given our quantum tautologies, contradicts the very idea of what is isolated, separate, multiversal so to speak, of indefinite origins and without the mirror of meaningful teleology.

Perhaps the more classical values or ideas apply as such bricks, for which we have to find a finer definition of the glue or mortar between them, so as to adjust classes of integer numbers (why does QM theory respond to such numbers?) as we compute mass or energy. That which is classical in the model should not vanish into the vagueness of uncertainty at the foundation of quantum logic.

So, does the geometry or structure determine the mathematical ideas here, or does the physics, as far as we can take it, determine the math and logic?

If we take the QM idea of spin literally, then elementary particles can possess intrinsic energy and asymmetric directions of operations. But if the idea of, say, supersymmetry, higher dimensions, even higher QM or other logic operations cannot be seen in the depths when it is near and all around us, perhaps, like light, this is invisible save only in reflection.

We must take the idea of spin literally. It’s the only way to a TOE.

Laplace’s Lemon: they all thought he was mad and had exactly nothing to say, but it was the odd birth of symmetry analysis. Odd.

http://www.maa.org/sites/default/files/pdf/upload_library/22/Ford/Grabiner3-18.pdf

Edgar

The Hilbert Book Model does its best to explain all details of the process of deriving a simple model of physics that is completely deduced from its foundation. The HBM selects quantum logic as its foundation.

See: http://www.e-physics.eu/PhysicsOfTheHilbertBookModel.pdf

Hans Van Leunen,

That was a masterpiece of an e-book PDF. I liked the personal introduction, and it places you at the front of inquiry in this young century’s generation. It addresses, or is aware of, new concerns, ones moreover that fit our new observations.

Where our models need to be further generalized, much of the way is marked here in directions intuitively, at least, implied. As logic it needs to be generalized also beyond induction and deduction, as to matters in between (abduction? the transfer of material of higher complexity?).

But to combine these ideas as a logical stance at the beginning of theory is, well, a brilliant stroke. It handles the seven dimensions in terms of the technicalities of gluon theory. The continuum of the power set 2^n, in the world of 16 natural dimensions… is a basis for my view of a third physics branch in the unification.

Such ideas of logic definitely justify, to my mind, the need for a better description of dimensions and how they relate, are many or one, or are superimposed in the lattice of physical things.

Thank you for the link. What is science but noble attempts at trying and testing? You, sir, show that we can intelligibly reach for the truth of things,

L. Edgar

@Edgar

One thing is a pity: I am no longer young. My age is 72 and I am a retired physicist. Not even a good one.

However, I hope to stimulate young ones to take up the courage and enter yet unexplored directions by using trustworthy methodology. It certainly makes sense and can bring great satisfaction.

At the same time I want to warn the elder and experienced physicists that physics is not yet established. Instead it looks as if the development of fundamental physics has come to an unwanted stagnation.

I undertake the Hilbert Book Model project for fun and in order to feed my curiosity into the crypts of physics.

Orwin,

Some interesting new ideas on how flies sense molecules in the depths (not just our idea of surface physics) of molecules, in New Scientist.

I had thoughts put into a picture and rather compact words on my pesla.blogspot tonight, which does relate to the foundations. In general this idea, as you mentioned (and Dirac saw no evidence at the time), the effects of shock waves, is so important in nature also.

Lubos has a good post, with his own poetic words, for the idea of even smaller Higgs-like particles, which I see follows also; even the electron may be abstractly composed of an integral number of them, as stated in 1969 at the 8D discovery of my Quasic grid. In effect it is hard to distinguish, if at all, a unit cell singularity from the number of points, as points are a class; that is, we really cannot distinguish a crystalline background from a more centered one.

Shock waves are a kind of 3D impulse response. They are wave fronts. If they are emitted at a high rate by a source at rest, then they form high-frequency waves. Usually they are emitted by a moving source. A tsunami is also a kind of shock wave.

Edgar, I think you are missing something there, because we do distinguish body-centered and face-centered crystals. It’s a subtle distinction, and it’s now said that the face-centered organization introduces a kind of fifth dimension, like the Penrose tiling and related discoveries again in crystallography. That suggests an opening on holography, and things yet more subtle: Newton grew the crystal tree of antimony as a demonstration of alchemy, and I’ve seen acetyls seed a similar fractal in soap formation, which is in the semi-crystalline, organic range, like some Penrose-related discoveries. Recently I tracked the acetyls in herbology, not into the tacky, weedy psychotropic range. In these ways I’m very glad my work is not confined to digital representations.

Interesting. From an analog viewpoint, could you make clearer what I am missing (yet when is a theory ever complete beyond some level)? The general idea is not one that emphasizes the digital… our senses and mind are both. In reading Coxeter from the n-dimensional Euclidean viewpoint, even the Penrose tiles in three-space are a possibility (a suggestion before the fact, of which the admissions at NC State had no comprehension of what I was saying at all. Hey, I would not make a good football player, in the dark loving the wrong woman, and famous enough, or afraid of riots and such, to cover it up with a scholarship to New Mexico to continue a career. I mean, the advisers at the Veterans Office back then said “we all cannot be O.J. Simpson”.)

The Penrose tiling, with many good patterns discovered as a hobby by an Aussie housewife, can contain decagons in which the internal structure of the rhombs is independent of the surrounding quasicrystal patterns… in space also, the 4D case having the golden ratio in the essential dihedral angles. Now we stack molecules in layers based on that tiling and ratio. In the five-space case I think it is not so removed from the general structure when a subtle change of state occurs, being analogous to the square root of 11 as the quadratic irrational distance involved. Surely it is for an intuitive reason that so many theoreticians have enjoyed these sorts of patterns aside from their work.
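One concrete fact behind the golden-ratio remark (my own sketch, not part of the comment above): the golden ratio φ = (1+√5)/2 that governs Penrose rhomb geometry is the limit of consecutive Fibonacci ratios, and the 36° angle of the rhombs satisfies cos 36° = φ/2. A quick check in Python:

```python
import math

phi = (1 + math.sqrt(5)) / 2  # the golden ratio, ~1.618

# Consecutive Fibonacci ratios converge to phi...
a, b = 1, 1
for _ in range(40):
    a, b = b, a + b
assert abs(b / a - phi) < 1e-12

# ...and the 36-degree angle of the Penrose rhombs satisfies cos(36 deg) = phi/2.
assert abs(math.cos(math.pi / 5) - phi / 2) < 1e-12

print(round(phi, 6))  # 1.618034
```

The same φ fixes the limiting ratio of fat to thin rhombs in a Penrose tiling, which is why the number keeps reappearing in quasicrystal diffraction patterns.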

Perhaps something thermodynamic is missing here (of which your experiments hold thrilling possibilities when considered for life properties; commendable and refreshing for those who really seek to understand and have a disciplined yet open mind. Thanks for a most valuable reply.) If we grow a crystal of germanium shaped like a horn in the slice, or perhaps a pseudosphere… the current through it heats the flat end and freezes the small end (now, which way does this temperature difference stop, or which way need it be asymptotically open?). Consider too the Hilsch vortex tube, used by firemen to cool their suits, which so distributes or decides, Maxwell’s-demon-like, the mechanics of separating molecules by vibration.

Hans, we have to redefine fundamentally what we mean by a moving source, or motion itself, in terms of our natural simple dimensions. One man’s exotic topology can be another man’s flatland.

Well, Orwin… that is as far as I can comment for now; life beckons and cooking is on my mind, as adding salsa vastly improved my gumbo chicken celery rice (by which I am tempted to try the induction ovens for temperature control; makes sense). Also I have to stop running into all my crazy neighbors trying to engage me in intricate political commentary or some such stance that is an odd language between them when they occasionally escape the group homes. Hey, I will give them cigarettes or dinner if they are hungry.

You know, Orwin, my old string theorist friend who did not make tenure and moved on to make more money in his own business used to recount how the physics faculty met outside of school and had deep discussions of things; most rarely talk to each other passing through the floors and halls. Again, thank you all.

Edgar

I selected a rather lazy viewpoint. I always use the Hilbert Book Model as the background of my deliberations.

That is a strong restriction, but it is also a safe restriction.

You may have the wild end of Galois theory there: http://www.neverendingbooks.org/index.php/galois-last-letter.html, a story which opens on Scottish Balls, which is rhetoric pitched at deflating the myth of Platonic solids (uh, Pythagorean in some range). I rather think they were used as agitators in washing sheepskins, but the point then is also what all was known about sheep grease after some thousands of years of distilling (from c. 3.5k BCE). I mean, they recently discovered a 40-acre stone-age distilling factory on Cyprus, destroyed by an earthquake. But when the tale turns up that Euclid wrote a book on the Light and the Heavy with an ace definition of specific gravity, it gets huffed away as bunkum. Yawn. i scheme you were lucky to get rejected: academia is soiled with these prissy lies about the past and other cultures, and blunders on with their best ideas stashed away in the crypts. i swear, their libraries should carry a health warning:

!DISINFORMATION MAY INDUCE SEMANTIC NAUSEA!

Speaking of Penrose tilings… I briefly stumbled across this in a fractal-string thing I was wildly chasing in foolish hopes of proving the Riemann hypothesis (I don’t suggest anyone else try, and do not, whatever you do, do NOT read the book Prime Obsession): http://scienceasia.asia/index.php?journal=ama&page=article&op=view&path%5B%5D=55 . Look on page 4, right after Equation 12: “An interesting fact is that the density of a motif in a certain noncommutative space described in [23, 5.1] must necessarily be an element of the group Z + ϕZ.” That reference is: [23] Michel L. Lapidus, In Search of the Riemann Zeros: Strings, Fractal Membranes and Noncommutative Spacetimes, American Mathematical Society, 2008.

Marni reckons motivic gravity is now hot, but uncertainty relations/non-commutative options are not. So that looks just like a gauge-related uncertainty, like that weird stuff from China. i love the deep quiet of these ideas: the sense of being utterly, metaphysically insulated from academic blather. Peace.

i noticed recently that the formula for Pythagorean triples works from an expression (c+b)(c-b), which has the signature of a Poisson bracket and thus uncertainty. The triples are thus like error terms in the orientation of a line, as within a fiber bundle. And in the Classical debate as it unfolded in the wreckage of the Athenian empire and Academic ambitions, the formula yields a series converging on sqrt(2), known as Theon’s Ladder. in a word, the irrational intrudes in Classical discourse as the possibility of freedom, but gets pasted as disobedient/evil. likely the zeta function diverges symmetrically so that the zeroes are lost in uncertainty again. the Bessel function is absolutely like that, and so becomes the representation of shock waves and processes of discontinuity. The very idea of continuity then loses its metaphysical bearings, and all the Empires fall over again.
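The convergence mentioned here is easy to exhibit (a sketch of my own, assuming only the classical side-and-diagonal recurrence, nothing specific to the comment): Theon’s Ladder iterates x' = x + y, y' = 2x + y, and the ratios y/x approach sqrt(2); and for a Pythagorean triple such as (3, 4, 5), a² indeed factors as (c+b)(c-b):

```python
import math

# Theon's Ladder: "side" numbers x and "diagonal" numbers y, with
# x' = x + y and y' = 2x + y; the ratios y/x converge to sqrt(2)
# because each pair satisfies y^2 - 2x^2 = +/-1.
x, y = 1, 1
for _ in range(10):
    x, y = x + y, 2 * x + y
assert abs(y / x - math.sqrt(2)) < 1e-6

# The (3, 4, 5) Pythagorean triple: a^2 = c^2 - b^2 = (c + b)(c - b).
a, b, c = 3, 4, 5
assert a * a == (c + b) * (c - b)

print(y, x)  # 8119 5741
```

After ten rungs the ladder reaches 8119/5741, already within about 10^-8 of sqrt(2), which is the sense in which the irrational "intrudes" on a purely integer construction.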

….and i feel fine…

Peace

Edgar, don’t feel bad; I tried to go to school once and realized I couldn’t learn the things I wanted to learn there, it would cost a lot of money and time, etc. Food and cigarettes are good, but cigarettes aren’t the healthiest of things… maybe some other herb is in order… mmmm, salsa… I’m from Texas… the home of Tex-Mex… you’ll find some of the best salsa ever here… where do you live? I’m coming over :)

Orwin, I also feel that sense of deep quiet… almost austere… it’s very nice… not easy to transmit, however.

Peace,

Stephen

I added a couple of drawings to my last post… I should put this method more formally, program it or something. Many ideas seem to occur in counting the numbers in the pictures. Now it does seem like we enter the realm just beyond the wild end of Galois theory, and I wonder what took us so long. What makes things move in some direction over the degeneracy of the DNA code meanings? I suggest these number-theory levels are akin to this idea of how the polarity is formed, say, in the nerves (or even the total head or tail of an organism): http://www.pnas.org/content/early/2013/08/08/1301588110 but it certainly seems more than just the biochemistry of nerves.

Initiators are indefinite and open-ended when one begins somewhere in a sequence, but the range of the terminators brings things to a stop in a definite state. 240 can be found in there somewhere, as well as in the pattern of 11 (also a binary 2); the 11zies can be nodes, or the remainder of the field can be nodes in the physical definition. That some things are defined at rest is like the 11th space, so that strings can vibrate: the same but mirrored principle.

Ahhh, going near Laredo with my eyes beginning to water from the great chili dishes as I walked into our Spanish friend’s mom’s house for dinner! A quiet but vivid memory of life unbound, like a tumbleweed.

I have often wondered why it seems that half the people speak out of the wrong end when trying to explain something… 50/50 statistics from some beginning.

Interesting, I’ve put out a request for the PDF… can’t get it through my damned university account… one of those things where they say they are subscribed but actually aren’t.

http://phys.org/news/2013-08-cosmologist-universe.html

Cosmologist Christof Wetterich of the University of Heidelberg has uploaded a paper to the arXiv server in which he claims it’s possible that the theory of expansion of the universe might be incorrect. He suggests instead that the redshift observed by researchers here on Earth might be caused by an increase in the mass in the universe.

*shrug*

therealsac

I am not that fond of the usual redshift explanations, including Hoyle’s iron-whisker dust; nor the idea that a photon takes as long to escape from the center of the sun as the first big-bang light takes to reach us… expansion, it has been said lately, can be an illusion.

In the familiar world more matter makes a bigger pile and occupies more space; how is it that the more mass, the more compact an object in the deeper scales of space? Or it seems that the first light from the big bang is only now reaching us in our era? Or, lately, that the structure of galaxies as we see them came early in that timeline. My first instinct is to say we need to sort out what we mean by information (as in teleportation of quantum bits: are the minimum bits needed 4 or 8, as in Penrose quantanglement?).

Weyl had something to say about the mass of a particle depending on its history path, but Einstein rejected it from what was theory then as another thing too weird to be true. So in this great debate of what infinite or discrete things are, and how we feel about them, there is great room for passionate interest such as yours. We have to start somewhere, or restart, and I am certainly not against our academia; I just want to encourage them, as some can from within. So enjoy your experience and drink deeply. What is your main area of interest? The PDF link was rather deep into biology terminology and ideas, and that is certainly part of the picture where we understand how an organism develops and expands.

I made an early comment that, in terms of an active Higgs-like idea, the portal, someone would try to tie it into dark matter ideas, and they did. But new ideas are a couple of years beyond publication or research, and as technology they seem to last but a couple of years before something replaces them. Can we predict what some may come to predict?

Yet the fact that we are here at the center scale of all this says that any idea with the nature of scales as its main focus, such as the h and string models, is part of the big picture also.

The hot peppers help stop pain from the neck up, and in fact use the same receptors as those with which we feel actual heat. Learning can be a pain, or a struggle as in exercise: study hard, make friends and try to have some fun.

therealsac = Stephen; btw, accounts != identity, but I digress… some people get really bent outta shape yelling about redshift and the universe not “making sense”, but doesn’t the idea of not insisting on things “making sense” reduce the anxiety caused by a lack of sense? One person says the word “world”. wtf does that even mean? You really have to take it directly in the context of the personal meanings of the people you are referring to. I think some of this multiverse mania stems from the shift between representations in the on- and offline worlds… Peace, Stephen

and shifting gears… I’ve open-sourced 3 mathy-type projects which can be found at https://github.com/TheRealSAC and if anyone were to pick up where I left off that would be awesome, but as open-source volunteer bases go, I am not insisting on it at all, merely mentioning it so some younger idealist with lots of enthusiasm might do something cool with it. I’m an old-school open-source purist at heart… and being approximately 2^4 years old… something about point processes and “stopping times” in determinism… etc. I feel sorry for people who believe the universe is deterministic.

//Consciously “rambling” in this style, not caring what anyone thinks, they gonna judge me anyway, so whatever

Peace,

Stephen

:)

Stephen, that is a most interesting link but I am not quite sure I understand it… In general I think the world (World for universe, but I rarely capitalize it), the environs then, real or virtual or what have you, will prove open as a source of physics laws, and the technology, for ill or good, in principle open to everyone. It is a social or economic problem as to who can patent a sphere or its derivative variations under an open umbrella.

It does seem to me the universe does not forbid at least such social determinism as part of the intelligible or cloudy total picture. Does time, another great philosophy idea, not seem determined after the fact? Do self-designing computers need any clock circuits at all, or is that a human way to relate to information? How does one generation, or the same aging soul in one lifetime, relate to different eras? Would I have understood, or underestimated, what we know and can know if I were the same age as my sons when we debate how advanced or not our technology or civilization is?

Anxiety can be said to be the loss of the father, or hell the distance from such a meaning-unifying concept… but the paradox of open freedom to determine our lives and evolution is that, even for the less wise and informed among us, what some say is good volunteering takes away the work from those who can do it, or need to in order to support themselves and their families, and who have the right, it seems, as one desires, to so contribute to the world. A thousand points of light are all at the same flame temperature. So what sort of world will we soon inherit when work is a luxury for the few and the nature of post-economic wealth is not some imagined theory of ideals and deterministic history but the cold reality?

Multiverse for me was surpassed as a concept long, long ago… but to understand the next levels one must live the total belief in a determinism, even if but a necessary moment as philosophy, then find the next purpose-like era that reaches science again.
The man who knows how will get a job, but as things seem to go, the man who knows why will be his boss. A sustained and balanced economy would work if people did not wage war so much, or if they did not have to pay for those who take from others by force or stealth; that much remains but an idealism.

Thanks for the interesting post. I am not to judge anyone, but without a judge of at least physical law what should we do with the forever unknown, so as to adjust to the freedom and the endless, never-to-return loops and rambling, when that can be a rewarding walk of inquiry… how can we judge ourselves unless we own up to the responsibility? What is the source of light is both a scientific and a philosophic question and concern, where inquiry seems to me very much wide open. Peace to all

beautiful. I got all sentimental and misty-eyed on that one, or, gee… it got real dusty in here all of a sudden… must have been one of those pesky UFOs. Seriously though. Nice. I like to think of it in terms of the WCW (World of Classical Worlds) in Matti’s universe. Peace to all as well (fwiw)

I added a drawing summarizing the state of our physics models, issues and vision. Such intelligible operations persist in our descriptions even over arithmetic. Leo Vuyk posted on his raspberry universe an explanation for one principle that tries to relate higher symmetries, and an overview of chirality as a statement of motion or direction in physics; such a connection to chirality seems central to standard particle models, but we have to dig deeper into the foundations. http://www.pesla.blogspot.com will continue the postings, as too many drawings were on one page… Leo’s drawing was sent to me on Facebook (as well, I discovered that viXra.org has a Facebook page), where Leo, with a comment on locality and supersymmetry, grasped the general idea of a problem to solve in the general span of physics phenomena. Also the realization again of models of doubling as an effect of his double lattice… I would only ask of his model whether the black-hole-like entities can have qualitative differences beyond size, or a great initial one, as if they too have generations.

lol. I just realized these months/years of discussion amount to various forms of great thoughts but are kind of like… metaphysical grandstanding, or show-offeyness, or something; can’t quite put my finger on it…

Life’s short… anyway, I went ahead and posted what was on my mind, too new to post, as experiments to try, inventions… sort of like science fiction… replicators and so on for us species of subgeniuses. An MIT article I saw today comes close… well, not that close…

http://web.mit.edu/newsoffice/2013/superfluid-turbulence-through-the-lens-of-black-holes-0725.html

I strongly disagree that life is short… that’s just an expression people say to express some general sentiment… usually indicating their conditional mood at that moment in time.

I hesitate to bring this up, but what do y’all think of https://www.goodreads.com/book/show/13378612-solving-the-ufo-enigma ? It seems to be the last true mystery left, what with all the big-brother boogeyman ruckus out in common public knowledge now, even given the public’s apparent lack of interest in the big-brother stuff… what I’m saying is, UFOs are far more strange, and there is an entire astroturfing industry dedicated to “debunking”. Peace

therealsac

I hesitated to reply to your last post; we are taking up too much of the space… especially since in my time there was no evidence whatsoever, and I saw things not even Congress knew about. But lately this “mystery” is raised in the science magazines as to what this phenomenon, at least mentally, means to us: our equivalent of the angels and ghosts of a bygone age. One article asks if we can understand such things better in light of the more advanced physics of our era… Then again, if such beings landed I probably would not be that impressed, but I would not assume it was a trick of some kind. Science is useful to soothe any such fears we may have of this experience of life, for me anyway. Cheers, and yes, may the billions of us among the billions of galaxies of billions of stars know Peace.

It’s a numb moment. Apparently Grothendieck doesn’t want his papers posted free online, but is too low-profile to let us know why:

http://homotopical.wordpress.com/2013/01/06/grothendieck-anagram/

Then I found this rare Bourbaki blog, where they discuss symplectic geometry and Poisson brackets:

http://amathew.wordpress.com/tag/symplectic-geometry/

I got category theory shoved at me in the name of Bourbaki as an undergrad, and bothered to find Bourbaki in the library and read. I sensed something completely different and had no idea where to look for it. So I walked out on the whole show. Who isn’t an alien by now?

my gawd!!! that’s brilliant. I’m gonna meta-pirate that as a template… it’s the sincerest form of flattery…

“Many people who do not closely grasp the idea of a ‘homonym’ as it relates to argument have been mixed up and confused…” That’s Galen, nearly 2,000 years ago, on the state of medicine after some geometers in the Academy started preaching the Elements and the Synthesis of Plato’s precious triangles. No, it’s not the homonym/homeomorphism as a mathematical plaything that counts, but the same as a factor in argument, in logic. Or computation, for that matter.

Orwin, shades of Marni and angles and all, category theory:

http://www.sciencedaily.com/releases/2013/08/130821152106.htm

and synchronicity, as I tried to address this with new insights to my system on my blog, but at a foundational level… I saw the article tonight after posting… Now, the new nova in Sagitta will give us more neutrino data… with, hopefully, our finer eyes.

Interesting, https://en.wikipedia.org/wiki/Psychophysical_parallelism

Edgar, the van der Waals force/classical virial was recently measured at long last, and analyzed at R^6, decaying with distance into an R^3 dipole-moment flux.

http://prl.aps.org/abstract/PRL/v110/i26/e263201

The dimensionality of this problem is sure odd (and even!). I mean, the Higgs problem points to the Yukawa coupling, where the solitons are only stable in two dimensions. I can only think we meet a touch of supersymmetry and the hologram mediating between odd and even dimensionalities. As for what that means in number theory, your guess is better than mine.

Orwin, with any clear history or change at all from some initial singularity state, we do observe a certain arithmetical asymmetry that is not an illusion. Consider the checkerboard as a six-dimensional plane of two-space. If we can break it into two three-spaces we can have the so-called illusion of Escher’s water flowing downstream and upstream. Why would arrangements in random drawings of letters tend to mean something, let alone your post above as an anagram, homonyms and so on?

Certainly this suggests some general process or way we think in logical principles. Not to say, as in therealsac’s link above, that this can be a matter of science or consciousness as psychology, but the idea of metalanguage does seem to depend on proofs by the uniqueness of prime numbers (and that so far seems still unsolved in higher symmetries, as complicated as they are). Yet this result is at the foundations; btw, I am not taking sides on whether science can only find or be proven as materialism. At duality indeed we touch the first realm of supersymmetry, for after all 4D breaks into 2x2D, on which a lot of the work of theory depends, loops or strings and so on. But dimensionality is a whole new world beyond that if properly understood.

Now in the language of the article from the symplectic link above, deep in it, at a link at the bottom, I read in the principles, as if a metalanguage with all the coincidences or similarities (say parallel proofs by elliptic curves), the same sort of principles as I mentioned in my Arithmetic of Viriality post linked above. Of course I am not fluent in the common number-theory language, but was it not Einstein who hoped that after all this tremendous work we would find a simple formula, one he thought probably algebraic? What would the theoreticians do for work then? Such proofs may remain in a mystery world at the foundations, as if a visual illusion.

That Van der Waals force (hopefully nature's laws and constants cannot change much, or we just define them by fiat) is still a challenge in its deeper meaning; some have related it to the Casimir effect as a different way to see it, rather than classical points in the equivalent parallel interpretation.

So we continue, thanks for the dialog and links. I have only yesterday discovered the long running conversations on other threads here with all the great personalities on the standard and alternative fronts.

I cannot find the link; it is recent, and I think from a Japanese guy, but it is there somewhere, maybe in the comments… I have it saved somewhere.

TheRealGroth Department: (You are clearly not supposed to know, believe or conceive this, but Grothendieck is clearly not into that Paris hype, where he actually complained and agitated about the lack of wide interdisciplinary networking. No, truth will out: he is verily into the Weyl action, where quite a lot is happening.)

Variation on a Theme of Grothendieck:

http://arxiv.org/abs/1210.8161

I can now see my way clear to motivating three kinds of gravity conspiring to constitute a sign, so addressing Edgar’s concern precisely with metalanguage. Back of this I turned up a passage in the Corpus Hermeticum/Poemandres where fire rises above the waters to constitute the Logos…

http://arxiv.org/abs/1211.1020

Origins of geometrization:

http://arxiv.org/abs/1302.0900

Breakthrough by Lagrange multipliers:

http://arxiv.org/abs/1307.2229

http://m.phys.org/news/2013-08-limits-phenomenology.html

Big data, with all its power to amass information about everything and anything captured by the second, does not provide answers that are useful.

I whole-heartedly concur

Lubos has an interesting article today: the idea put into metaphors of a Boltzmann brain, with some statements about number theory and empty de Sitter universes and so on. This dialog out there is new to me in its terminology but not its concerns. I view it as creative rank metaphysics or religious concepts more than clear science… Lubos's comments on the situation appear to be a countering of the learned arguments, which seems to me on the same level of assumptions, or missing assumptions, in how this applies to science or how we reason, individually or collectively, ideologically or not.

Otherwise we may already assume super-string theories and higher super-symmetry to be empty. The laws of the universe, as useful information and methods, are only quasi-uniform locally.

Can we predict the next 9 digits anywhere along the random-looking digits of pi from the previous 9, in base ten?
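
Empirically one can at least generate and inspect the digits; a minimal Python sketch (my own illustration, not anything from the thread) using Gibbons' unbounded spigot algorithm, which emits each decimal digit of pi exactly, without needing any later digits:

```python
def pi_digits(n):
    """First n decimal digits of pi (3, 1, 4, 1, 5, ...) via Gibbons'
    unbounded spigot algorithm: each digit is produced exactly,
    with no dependence on the digits that come after it."""
    digits = []
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    while len(digits) < n:
        if 4 * q + r - t < m * t:
            digits.append(m)
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)
    return digits

print(pi_digits(16))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3]
```

For contrast, in base 16 the Bailey-Borwein-Plouffe formula can compute a digit at a given position directly; no analogous base-ten shortcut, let alone a prediction of the next digits from the previous 9, is known.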

I also like the comments on the viXra blog lately between Orwin, Stephen, and me. In the case of orienting axes in three-space, where something follows or is contiguous to what can be considered the information we may store in a shorthand of six colors, we cannot simply assert that the universe is so random that nothing logical and intelligible can be learned from considerations of a wider whole. This of course involves our concepts of Triality, such as of particle flavors, and sensible patterns of Higgs-like objects which are close to the counting logic of number theory, especially the recurring patterns of 24.

Big data may in fact supply us with some useful information, Stephen, if we have a slightly better grasp of these combinatoric structures underlying the physics of this world.

And Orwin, the references to the Greeks are still worth learning from, as is the triality you mention for Van der Waals; interestingly you suggest it relates to gravity. But in my charts, where things are distinctly vertical or horizontal in two dimensions, we do not halve the plane into even and odd values along both axes; rather I imagine something like gender applied to the unique number rows, which suggests a deeper asymmetry in the universe that comes before the physics of ratios of matter and antimatter.

This is on my pesla blog and fb with illustrations and notes… The tablecloth fractal superposition can assign a unique coordinate of binary information, encoded as Abel's 15-group cycle in bicolors, with a definite way to see and view the orientation of 720 particle-state objects where the colors match, and so on, n!

BTW… this post was lost when net service was interrupted while trying to post it, and the backup copies did not save… I am not sure I can write it as well again… Essentially it was about applying some of these ideas to make sense of all the information and perhaps judge its worth. Some points I recall: each of us is at some stage of awareness, local or distant, of imagined influences, just as with that possibility in numbers and metalanguage, and how far the universe can be quasifinite.

The Big Bang as the first miracle, and all the rest left as science? Math, unlike other faiths, able to prove it is based on faith? One big miracle of the prophets, and then the practical world left that we can explain? Are we distracted, kept inside a certain cultural or scientific paradigm, by hidden or accidental dark matter influences? Was the miracle at Jerusalem not the Dante experience of exploring the cosmos in that older cyclic formulation, which then meant a closure of such prophecy, or in an individual an awakening to higher civilization and understanding that must have seemed divine?

We are each as valuable as a universe, that or just a speck of dust, not that unique or worth much at all… worrying about today's miracle of our being in the world, or that somewhere it may forever end…

Our parallel thoughts that seem real each to his own as his own or his faith in the collective vague totality of a miracle. So many metalanguages vertical and horizontal in integration.

We do not need some sense of conspiracy to see what may be coming as we work half blind, hoping it is good in equifinality. This exposure to information: the more we experience, the more we can hold… But how do we know simple things, like what is safe to eat, the artificial man-made or that from nature? Protocells, with their need for vast energy compared to us humans… sometimes the unnatural can outdo or enhance what we consider natural, where the geometry of dimers and clay jump-starts universal life. Are such nanoparticles already in our food, like gene modification, toxic for any individual genome, say? Does any theory tell us about this? The claims would cure many things, but when? Just promises of hope?

Certainly today we can only watch with sorrow how cultures caught in a loop, as if with some certainty of religion or science, continue the tribal wars and the lust for place and obtainment. Our spiritual manuals say otherwise, as with all disinformation and lies, I suppose.

Hi guys! Check this out http://www.toebi.com/blog/theory-of-everything-by-illusion/black-holes-by-toebi/

Some of you asked me about black holes a few months ago. At that time I had something more important ongoing. But today I really pondered BHs :)

Hey Philip! What are you doing? No blog posts for a long time.

[…] of these conversation was posted at http://blog.vixra.org/2013/07/18/naturally-unnatural/#comments […]

If a sense of balance is important for our sense of understanding and well-being, the idea that grounds our faith in truth, then systems of sufficient dimensions to allow asymmetry, as well as the easily seen symmetry of breaking into equal complementary parts, may contain sub-symmetries hidden in the vague imbalance of the rest, from which we isolate our vision and insights from the next reachable physical and mental truth. This at 4D is important for the expression of organic parts and systems, and means there are more than just orthogons, anti-orthogons, and simplexes in some low dimensions.

What kind of data are you thinking about, Edgar?

Perhaps something to do with this… (although, as in the movie Dark Matter, outside the team effort the hero did not graduate, without finer data, as if taking the frontier of an intuition and adding to that a delusion of a Nobel Prize) http://cerncourier.com/cws/article/cern/53091

This question of what is natural or artificial: "everyone's model is right…" Or perhaps we reach a point in physics where some analog to atoms cannot be seen. (There is an article on how science should not underestimate or constrain evolving future trends, but I cannot find it at the moment.)

In the movie the hero sees spots in a foam on a boiling-over pot of food, and this represents a eureka moment… Just as N. Otto, with the 4-cycle engine made possible by the age of steel, did not get the patent, staring at smokestack clouds; nor the knowing to add lead to reduce the knock. I mean hard and simulated digital data, but I also mean intuitive data in the end.

My interpretations are Euclidean by default, and from that we can explore the rest: a visual grid which would require simple programming to sort, post, and study the numbers in relation to this Quasic grid. A visual aid to number theory and geometry. But should we really trust such simulations, and how? It is interesting that with the Z code in chips we can see rather soon in a program, so as to project whether some errors will arise after a long computation project, so the method is at last consistent. Do you ask so that you can set up such programs from this side-view approach?

In my social poetry blog the discussion came up on the Fibonacci syllable form, so I put up some diagrams and poems to show the richness of such a series, its paths, and where the growing complexity begins. One may say such sums start with zero and one, but what comes before that? 0+1=1, 1+1=2 … 3, 5, 8. (Or is it 7, as in the partitions, obvious from a simple count where we sum successive terms of the p(n) numbers after the initial one? And does the flow begin anywhere in particular, singularity-like, and how do the higher curves sometimes end in such contexts?)
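
The two series run together here can be checked side by side; a small Python sketch (my own illustration, not code from the thread) comparing the Fibonacci sums with the partition counts p(n):

```python
def fibonacci(n):
    """First n Fibonacci numbers: each term is the sum of the previous two,
    starting from 0 and 1 (so 0+1=1, 1+1=2, not 0+0=1)."""
    seq = [0, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def partitions(n):
    """Partition counts p(0)..p(n) by the standard dynamic program:
    for each allowed part size, add the ways of reaching each total."""
    p = [1] + [0] * n
    for part in range(1, n + 1):
        for total in range(part, n + 1):
            p[total] += p[total - part]
    return p

print(fibonacci(8))   # [0, 1, 1, 2, 3, 5, 8, 13]
print(partitions(7))  # [1, 1, 2, 3, 5, 7, 11, 15]
```

The point of divergence is visible at the sixth step: Fibonacci gives 8 where the partitions give 7, so the "7" in the question belongs to p(n), not to the Fibonacci flow.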

Damn good question you raised!

http://www.sciencenews.org/view/generic/id/352421/description/Belief_in_multiverse_requires_exceptional_vision was the first link I referred to, thanks to Leo Vuyk on fb

http://www.dreamikins.blogspot.com for the recent Fibonacci poetry.

Edgar, if you like number sequences you might find some good material to explore at oeis.org the Online Encyclopedia of Integer Sequences

Thanks, TheRealSac, these sorts of programs are helpful and fun as far as they go, but I am all about a wide picture, a matrix, interconnections of a thousand words… A score of years ago someone said to me we could not picture every formula, or that some ideas are too abstract. Got a really cool idea and posted it concerning Landau poles and my flatland…

Huygens' Principle got modified to encompass diffraction, as Huygens-Fresnel, and then got worked over by Kirchhoff, with a derivation from Maxwell's equations via Green's functions on the boundary.
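
The Huygens-Fresnel construction is easy to try numerically; a Python sketch (my own toy setup with made-up parameters, not Kirchhoff's derivation) that superposes point-source wavelets across a single slit and recovers the familiar diffraction minima:

```python
import cmath
import math

def slit_intensity(x, wavelength=1.0, slit_width=5.0, screen_distance=1000.0,
                   n_sources=200):
    """Huygens-Fresnel by brute force: divide the slit into n_sources point
    emitters and superpose their wavelets' phases at screen position x.
    Intensity is normalized so the central maximum is ~1."""
    k = 2 * math.pi / wavelength
    step = slit_width / n_sources
    amp = 0j
    for i in range(n_sources):
        y = -slit_width / 2 + (i + 0.5) * step     # emitter position in slit
        path = math.hypot(screen_distance, x - y)  # distance to screen point
        amp += cmath.exp(1j * k * path)            # secondary wavelet's phase
    return abs(amp / n_sources) ** 2

center = slit_intensity(0.0)       # central maximum, close to 1
near_min = slit_intensity(204.1)   # near the first minimum (sin θ ≈ λ/a)
print(center, near_min)
```

With these numbers the first minimum sits where sin θ = λ/a = 0.2, i.e. x ≈ 204 on the screen, matching the Fraunhofer sinc² pattern the full Kirchhoff treatment reduces to.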

The interesting point is that this boundary is internal to the propagation process, and that matches Roman Jakobson’s analysis of the functional features of phonetics: the sounds of speech are tempered and fitted together strictly to optimize enunciation and recognition.

Bringing Huygens' Principle under the Action principle of Maupertuis, and admitting for once (against Voltaire's cheap Newtonian sarcasm) the metaphysical resonances, you have a complete account of the subtle body of speech, which is at once material and teleological.

That's history now, but there's a sequel playing out in the physics. Kirchhoff's solution extends to complex spheres, and there appears a limit or uncertainty which is very like isospin, the factor yet to be confronted in the Higgs hunt…

http://www.wavelets.com/pages/documents/HK11.pdf

There’s place here for recognizing the relevance of what Hans van Leunen is saying, but at the level of neutrons rather than atoms. And that makes the final error term just the neutrino, where the whole logic of polarization collapses into a ‘mirror-symmetry’ as Marni has it.

Edgar: for that 'gender' factor, think metaphor and metonymy, Jakobson on the other face of language, where meanings are too complex and entangled for any regular optimization…

In the Hilbert Book Model nature steps with universe wide steps from one static status quo to the next static status quo. This goes together with a minimum (discrete) progression step, a maximum frequency and corresponding features, such as ultra-high frequency waves that cannot be observed and whose propagation is governed by Huygens principle.

Orwin, good points… gender and other first musings by the Greeks on number theory (not sure we can optimize the gender of words in so many languages, let alone human diverse desires and relationships). Marni is likely the case for what we deal with in physics in the standard way we do it now; a step progression, as Hans says, is a possibility (how far the laws extend in our locality, and what we can ever observe as we reach higher levels). If there is not much to gender, thus perfect numbers, can there be a wider world of physics where, say, Matti's p-adic ideas tell us little from Mersenne primes? (I know I am missing something in understanding him, but I think it is just a detail or step in reasoning.)

I am approaching things as pebbles, figurate numbers and so on, counting on my fingers so to speak, as if something to touch, so I am looking for patterns in the data to tell something fixed in all the changes and model systems of the world. I have not posted it, but aside from other things I am drawing representational models that would, say, give us a novel way to arrange something as fundamental as a hypercube, or any orthogon beyond it. It does optimize the logical simplifying of circuits compared to Karnaugh or topological maps. It is colorized to a spectrum of various representational dimensions, perhaps a weird gender confusion.
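
A Karnaugh map is essentially the n-cube's edge structure flattened onto paper, and the circuit-simplification idea can be sketched directly; a toy Python illustration of mine (one merging pass in the Quine-McCluskey style, not Edgar's own construction):

```python
from itertools import combinations

def merge_adjacent(minterms, n_bits):
    """Treat each minterm as a vertex of the n-cube. Two vertices joined by
    an edge (Hamming distance 1) merge into one implicant, written with '-'
    in the bit where they differ: the same grouping a Karnaugh map shows
    as adjacent cells."""
    implicants = set()
    for a, b in combinations(minterms, 2):
        diff = a ^ b
        if diff and (diff & (diff - 1)) == 0:    # exactly one differing bit
            bits = format(a, '0{}b'.format(n_bits))
            i = n_bits - diff.bit_length()       # string index of that bit
            implicants.add(bits[:i] + '-' + bits[i + 1:])
    return implicants

# f(a,b,c) with minterms 5 (101) and 7 (111): b drops out, leaving a AND c.
print(merge_adjacent([5, 7], 3))  # {'1-1'}
```

Iterating this merge until nothing changes gives the prime implicants, which is the standard algorithmic counterpart to reading groups off a Karnaugh map.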

Sabine on fb posted this article link, which is many words and few formulas and a long read. It seems to summarize the state of things and raises the questions without definitely working things out in the direction of their suggestions.

http://arxiv.org/abs/1308.5097

What do you think of this article, guys? Does it tell us anything new, really or is just educational and historical?

Orwin, Shannon said so much with the idea that, in terms of information, words may express entropy, and that information and meaning are conjugate… OK, don't ask what is in that space above the mercury barometer; it does not match up with "Nature abhors a vacuum" (kids sometimes struggle with the slogans), nor whether the south pole of a magnet can be weaker than the north pole.

What is the point of publishing such an article, unless for the references' sake? Yet in it I can see the germ of hopeful directions of which the authors themselves are not aware.

The representation hypercube is skin only rather like a hollow triangular cylinder… a quasic wormhole. Otherwise it would physically collapse in the complexity of it all.

That article matches the sober mood among experimentalists after LHC run 1. But to me it's all too Statistical Mechanics/Mean Field Theory, after all these years, when all mean field theories are wrong and renormalization is the name of the new game. As for insight into the regularization problem in renormalization, well, that's clearly asking too much. Which leaves a mathematician like Peter Woit feeling out of it.

I'm glad to have found a way of relating to the Baryon Acoustic Oscillation, which is our big picture of the universe, and more accurate than the endless Hubble guesstimates. Interestingly, Piron, Hans' touchstone, was known as a French Empiricist, with Baire in measure theory, which mattered after Cantor and Dedekind. We hear so much of the new rationalism, the a priori of symmetry analysis, that these roots of logical empiricism get lost.

Did you know Minkowski wrote on the Geometry of Numbers? It was evidently his last work before he died unexpectedly, which was such a pity. But I only find it in German…

Thanks, Orwin, informative, filling some gaps in how this has developed. Minkowski is underrated in the breakthroughs of things. At one time or another renormalization is either called a limited idea or is defended as an article of faith, along with excluded middles, mathematical induction, and the like (the article I posted touched base with this issue; usually it is solved by some insight regarding the exponential, and pi). The simple horizontal and vertical distinction certainly has led to a few novel ways to do calculus: Newton-Leibniz, Riemann, Weierstrass, Dedekind, Lebesgue and others, in which we never quite say that in the analysis of such a plane Zeno's ideas are solved rather than avoided. What after all are some of our ideas on velocity, if for some particles these are not visible to others of further differentiation but a vague idea of half-ghosts of departed quantities? Is Empiricism a sort of Rationalism turned on its head? The Greeks, Zeus bless them, did try to measure the stars, concluding from the data that they were fixed. Or is radical empiricism just the opposite of Nominalism, and so on in philosophy, of which you certainly are aware, but maybe someone may be spurred to thoughts on this. Descartes only used the positive quadrant, the rest dismissed; angles and such, and all the complex analysis and Fourier of course, are accepted, but what of the properties of e, say the measure of charge in Lisa Randall's brane models? Can it be that the deep difference between the hyperbolic and the spherical (both can magnify light as lenses) is not as foundational as we now imagine? Of course the axiom of choice and the continuum hypothesis might not be as independent from those of Peano arithmetic, some way. So the name of the game on its deepest level today is these ideas of asymptotic freedom and the Landau pole (now there is the shadow and wake of a scientist dying young, and the usual paradoxical pity).
I don't buy the triviality of a string of evenly spaced gluons, nor logs of them as in sound or music, of which Dirac said in his day there were no such wave phenomena in his physics of that time. Nor should we imagine at some vertex a Triality, or even a vertex, but we can colorize the singularities according to a spectrum, ultraviolet and red shift and so on; again something to do with the vacuum structure or content of dispersed or binary expanded singularities. What has always amazed me is just how things in numbers and geometry will be limited, as if in our familiar dimensions we can see light focus only so far through them, as if a limited inverse square law is closer; and structures of which, other than the simple ones, there are no higher analogs, which at first glance I vaguely feel should abstractly be there. I did post a parallel representational dimensional geometry which I first intuitively extended in the soma cubes, but at the time did not know why. Surely we have a wide key to reach into the hidden things science makes clear already, a priori, a posteriori, or perhaps something in our matters of judgement. All I know for sure is that the more advanced math software cannot handle the way I see things. Oh well, it all continues onward, but awareness of the question fails us if we think it is a lead to an answer. As to the solving of puzzles, and being bored with the known, where even a small part may describe the whole or the whole define the part, even when the shape is identical or the geodesic measures the curvature of an egg, I am always searching for a new principle or something we missed along the way. Thank you for those missing pieces… my crude sketch is up on the pesla blogspot, and little else not suggested by the drawing and the words there.

On hearing of the fall of the Berlin wall I spontaneously broke out in the old Austrian-German anthem, recalling that I can read the blackletter very easily; morphogenesis is surely involved, by which we can all read such natural languages more readily than difficult new artificial ones. There things stand at the moment. Graphene… certainly that flow of free electrons over Lord Kelvin's solid is a shadow of higher forms, with the hex cells slightly indented for that space filler. Is it not the same, and how fast, as in heat diffusion, can they go; at the speed of light? And so on… Look, I am embarrassed to have written down all these historical names… Edgar

Minkowski and Euclidean signatures can be seen as two views of the same, possibly curved, space. At infinitesimal scales they are related by Pythagoras' law. They both lead to their own local metric, which specifies space curvature. They both relate infinitesimal space, proper time and coordinate time steps. The infinitesimal spacetime step is in fact an infinitesimal proper time step. The Euclidean signature conforms to a quaternionic space-proper-time view. There the infinitesimal coordinate time step corresponds to the length of an infinitesimal quaternionic step. This makes our common notion of time (which is coordinate time) a mixture of space and proper time.
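
The Pythagorean relation between the two signatures can be written out; a minimal sketch (my own notation, setting c = 1 and one space dimension for brevity):

```latex
% Minkowski view: the invariant interval is the proper-time step
d\tau^2 = dt^2 - dx^2
% Rearranged, the same relation is Euclidean Pythagoras,
% with coordinate time as the hypotenuse over proper time and space:
dt^2 = d\tau^2 + dx^2
```

So the coordinate time step is the length of the step with components (dτ, dx), which is exactly the sense in which common time mixes space and proper time.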

It is not for nothing that the Hilbert Book Model restricts its scope to levels underneath and including hadrons. It only touches some generalities of cosmology. Other items quickly become far too complicated to grasp in a confident way. In any case you must restrict yourself to abstractions in which the boundary and starting conditions become simple enough. Only at the mentioned lower levels is it possible to devise a model that is consistent in itself.

Those that have a preference for cosmological phenomena make use of the fact that the environment can only be vaguely estimated and the discussed rules cannot be tested by precise measurements. The results of those deliberations are always questionable. The discussion easily degenerates into balderdash ad infinitum.

Hans, nice explanations… but infinite regress is not necessarily a defective view, unless we are forbidden to actually touch or reach the absolute zero, the infinitesimal. If there is an actual restriction, and not something ad hoc like cosmic censorship, then there may be a good reason why. I think we have all outgrown the logical distinctions between Euclidean and non-Euclidean geometries and their (true as geometry) analogies. Let us hope such a restriction gains us clarity in the method while not fossilizing the scope and language of science in some cumbersome thoughts and scripts. Entropy, perhaps, in this sense is nature's own balderdash. What is outside or inside of something finite, really? Even the inverse square laws may be universally consistent and true and yet vary between local regions; this limits the range of such forces within so many dimensions, hidden or way beyond what can reasonably be interrelated in our familiar low natural dimensions. Of course, that nature seems to so restrict things, this issue of what can be in a standard theory, is the source of such soul searching in these threads concerning the standard model.

A close to tautological path exists from quantum logic to a quaternion-based space. This path runs via the isomorphism between this logic and a Hilbert space, the relation between that Hilbert space and its Gelfand triple, and the discovery of Constantin Piron that these latter structures must use division rings as their number system.

You can claim more freedom, but then you must either use a different foundation than quantum logic or you must specify a close to tautological path from the foundation to this more freedom-rich model.

That's a telling hit, Edgar, on Descartes as the source of all our scientific realisms. And he was mightily surprised to find Desargues in projective geometry with a completely different way into the same mathematics. To me it's projective geometry that lifts one out of flatland, into perceptual or perspective space, so the issue is also about consciousness and hence pluralism. Beyond the horizontal and the vertical are the horizons, if you allow that they may change: there are glimmerings of this in Husserl, and Hegel already asked what we mean by the infinitesimals at the end of the day, so there's no going back on such questions.

By the way, Jakobson was very seriously engaged with information theory, and indeed placed phonemes as redundancy in linguistic terms, for all their functions in the speech process. And there's a powerful distinction between behavior and meaning, which breaks past radical empiricism!

And on the cutting edge of programming there’s Cabri and Cinderella (!) which will take you cleanly round the phases of things – but you have to pay for such value, although not much.

Edgar, it looks like you are right about an octonionic representation surface:

http://arxiv.org/abs/0812.0212

There’s something similar in Jakobson on the aphasias, which fascinated Lacan, and always struck me as the deepest mystery yet pinned down in hard data. That took a lot of cracked heads, in the desperate resistance to Hitler…

Increasing back-scattering is commonly referred to as “glory”. <————- fantastic

In science and fantasy we debate "invisibility cloaks". I saw this interesting article just now and made it the status on my L. Edgar Otto facebook: http://www.theverge.com/2013/8/28/4668540/scientists-grow-a-miniature-human-brain-in-a-lab The issue is what is natural or unnatural, perhaps philosophy or religion, and when we need the fiction of the distinction. Here experiment trumps speculation, rather than promoting an idea against a stance by saying something like "you will know a theory is wrong if someone proposes baby universes", rather than giving us a concrete thought experiment (Lubos's recent posting on blogspot com).

In case you do not click on one of my sites, here is the main comment I placed on that article:

Doesn't pain help direct the development of things, as in the teeth and bones? Perhaps what is not seen here is the location of some principle like consciousness or the soul; not merely a physical reason like lack of blood flow or connectivity of neurons that generate the hidden geometry of tunnels entering the light. What is to be interpreted here is that the brain system is much more complicated than the physical system, more than simple physics with its DNA and virus encoding of clay. Could such a blob be used to make an organic memory processor, or is the idea of memory itself somehow part of this code? The emotions, laughter for example, are not just the brain of a Sheldon on the Big Bang but a wider gift of a God- or universe-given science; for if He did not want us to use our brains, why would He have created them? This experiment is too important for the usual mediocre cues of canned laughter by which we as the audience are induced to clap.

I agree. All is for naught if there is no humor and smiling. See Darwin's The Expression of the Emotions in Man and Animals… good stuff. I'm not really trying to become invisible though… http://www.newscientist.com/article/mg21929322.700-why-your-brain-may-work-like-a-dictionary.html interesting stuff. Peace, Stephen

Stephen, in a sense we are already invisible, or partly so… that link is a very good one in the context of these discussions, where the ideas of words and memories and so on do seem to fit, at some higher totality, our various models proposed as partial pictures.

If we do not make a distinction between matter and dark matter, or their relationships as physics, we can imagine a state matrix of quasi-illusion. That is to say, the vast structures of stars, the new explanation of a layer of counter-currents which Einstein said was not covered by his musings, and in general everything under rotating things, aside from when and why matter is considered concrete or abstract: a black hole, a planet, a star, even an atom, from a quasifinite and dimension-free view, is each the same creative entity. This generalization applies also to a sea of singularities, as well as point continua erected at each point in an Omnimultiverse. Our minds and bodies, subject to the same model, are not as complicated as this generalization.

TheRealGroth Department: Bad Old Days at Goettingen U.

So Max Born ripped off Gustav Mie, and David Hilbert ripped off Max Born, and then Einstein ripped off David Hilbert, to bring you that blessed Cosmological Constant. And Lo, it was haunted by the Ghosts of rip-offs past, and Einstein wasted decades trying to prove that he hadn’t just cosmologised statistical mechanics.

What you also don't get to know is that the later Einstein-Cartan model is no longer Minkowski but Weitzenböck, and that Cartan was allegedly a Bourbaki. So if you try to put your spinors back in Minkowski space, all you get are old Pauli-Fierz spinors, which are massless and won't vibrate (Matt's problem).

Also, David Hilbert claimed a variational derivation which he did not and could not prove, and spoke of a Hamiltonian when it was a Lagrangian, and the gossip didn't stop until Emmy Noether tidied up the Lagrangians by their symmetries.

But that left a spherical symmetry which Gödel then sprung on Einstein, so proving that he was assuming Newton's rotationally-gauged absolute space, and that didn't prove his head wasn't spinning…

And Newton hadn’t solved the inverse problem (that was Lambert), so the problem still reached back to Kepler,

http://arxiv.org/abs/nlin/0011011

and Kepler was using old alchemy passed down by Arabs to regularize away the, uh, ghosts, which sure as Nuts came back to haunt them all and still do.

http://arxiv.org/abs/0810.5772

That's Matti's problem, of course. The complex number/operator analogy is specifically fragile to the distinction between prime and maximal ideals, and this matters in Kähler QFTs.

Ghosts arise just as easily as endothermal reactions, or the influx of energy into a system, which then mixes into the probability distribution and is not easily distinguished.

So, life’s like that, and the Earth’s like that, and if you don’t like that, *** off to Mars, where its worse.

The latest House-type music from Korea is like this too:

Men will want her,

Because Life won’t haunt her…

Tori Amos

she appears perfectly self-sufficient, which is, of course, an illusion.

The question is, do ghosts help attract mates, or increase survival skills, or increase quality of life? http://serendip.brynmawr.edu/exchange/node/336 There is a danger in "being too abstract". Peace, Stephen :)

If the idea of a negative sign is to simplify calculations, and not an indicator of ghosts or holes in some idea of physicality… if we can sum an infinite positive series and arrive at, say, minus one third… how can the ghosts of a quantum cat's 9 lives represent anything real, save their endlessly falling knots of four-space? In a sense, is the use of complex numbers not just an elaborate extension of this idea of negativity? (I mean that there is relative negative distance only, or rotations seemingly infinite yet contained by the velocity of light.) Do we only concern ourselves with the leopard's spots, or with what is inside the mind of these stealthy panthers as we monkeys stand up and throw rocks at them?
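
The "positive series summing to minus one third" is presumably the geometric series 1 + 4 + 16 + …, formally assigned the value 1/(1−4) = −1/3; a small Python check of mine on why the formal value shadows every partial sum:

```python
def partial_sum(n):
    """Partial sum 1 + 4 + 16 + ... + 4^(n-1); closed form (4^n - 1)/3."""
    return sum(4 ** k for k in range(n))

# Each partial sum splits into a divergent piece plus the formal value:
#   (4^n - 1)/3  =  4^n / 3  +  (-1/3)
for n in range(1, 6):
    s = partial_sum(n)
    assert s == (4 ** n - 1) // 3     # closed form checks out
    print(n, s, s - 4 ** n / 3)       # residue is -1/3 (up to float rounding)
```

The −1/3 is the constant left over once the divergent 4^n/3 piece is formally discarded; the same move that zeta-style regularization makes, rather than a literal limit of the partial sums.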

http://www.slate.com/articles/health_and_science/science/2013/08/symmetry_in_the_universe_physics_says_you_shouldn_t_exist.single.html

That was a fun article, even if it begins with our being lucky, or that things are so finely set in balance that the "God particle", suggesting a perfect initial Being for the masses, has something to do with the local evidence of design. So, if as I have logically shown, at least on a small level, the universe is foundationally asymmetric as well as balanced on all scales as far as structure and information go, can I naively reason that, despite the risky free will in the illusion of this nest of what appears remote and non-violent stars, the creative force is actually beneficial in the main to life and thought, against the deterministic nihilism of cancelled symmetries? This quite aside from the shared question of why things exist rather than not. For now at least we can have effects on our evolution to bring things into the world, perhaps worlds, that seem unnatural or artificial, if we accept the responsibility of such freedom; while, put in simple terms for science and philosophy, it seems our debates interact such that our theories reach some ever-falling entropy, or, when all things are understood and balanced, wisdom as well as creation vanish, regardless of the mysterious source of our ultimate concerns, such as the universe that persists despite us.

For those who may care to follow, my last pesla.blogspot.com post was on the Omnimultiverse, but this morning, considering the ideas in the air, I offer a further generalization I meant just as a speculation. I have found the ideas and terms I need… So I posted on the Teleomniverse as a generalization of my quasic principles. While doing so I found two links to other blogs I follow… one leads to the views of Marni, for her alternative interpretations. This also follows from my post here, inspired by the article therealsac listed… What if, I asked myself, the doom and gloom of symmetry has a mirror image, where it seems a goal or teleology of a more general universe state toward which we tend in higher complexity to find stability, so that on the final, Planck, or hierarchical set of such action, energy, and thermodynamic structural levels the universe can be described as the flatland of the Pythagorean brane? After all, in a quasifinite universe, while states are different, all states, at least as physics, may in a sense be present.

Sincerely, I cannot believe in the existence of Higgs particles, nor in the existence of antimatter as a complete symmetry of nature, nor in the existence of antiparticles. Those are by-products of the asymmetry of matter and of spacetime, just approximations of our human mind in observing events in the universe. The proper inertial frame serves to analyse the movement of bodies in relation to the “fixed stars”, which are surely at infinity.

carlmott… I think I catch what you are saying… My terms should be qualified as making no assertions about some things (thus Higgs-like, and opaque matter rather than dark matter in general). We certainly seem to raise the issue of real and virtual mirrors, and yes, IrreducibleGhost below, the vortex and tori things can reach all the way back to Descartes’s or Poe’s general Theory of Everything. Do we have an image or not in a mirror, and if real, is it symmetrical energy-wise (or, as in Alice in Wonderland, is the fire in the fireplace different)? It certainly seems unlikely that the objective world and the mind do not eventually match in the laws, rather than one ultimately predominating over the other; so can this philosophic debate and interpretation as physics, say quantum physics, trust an argument from a too-restricted theory? Or are hypotheses of perception, as well as the physicality of laws, in some ways part of an at least slightly more unified scheme? Perhaps our mind-brains can be stated in the terms of, say, particle physics… A brain may have (in totality or in parts, with or without gaps or ghosts in the spectrum of light) a place for Fermi or Bose statistics, with the structures that implies. Yet between them, just as Brownian motion seems a compromise between classical and quantum formulations (as ways to describe the same phenomenon by formulas), is there not Boltzmann, and perhaps his image as never there ultimately? Binary-wise we can imagine the top and bottom of scales as a spectrum, in vague psychological terms, from autism to schizophrenia… a centered-only universe or something many-fold or even other-worldly… or some sort of cyclic mania and depression… Even here more general symmetry and operations are needed than a random choice of which direction to turn, and which direction in entropy or time to find surety in development.

Can it not be as simple as this: some constants, as logs or as simple subtractions of inverses, are thought coincidence, but as artifacts of approximate or exact geometry each may be its own derivative.

Genus is a much wider thing and can be reduced to linear ideas as well as loops and curves, so which way do the higher tori aim us?

Strangely, the Einstein-Cartan (Weitzenböck) relativity gives soliton-like eddies in the manifold, which appear just like the Abelian Higgs model (those vortices are surely full of spin, palpably vortical), so the symmetry-breaking “mechanism” must be just gravity waves (quasar/ic dynamics):

http://arxiv.org/abs/1304.3386v1

And these “vortices” are just the old smoke rings, tori in math, and yes, you can then simplify the Dirac mess to something like Elizabeth Rauscher’s toric protons: it’s really just a matter of making some space for imaginary (ghost) values, for negative refraction (yes, it’s happening! Next-gen lenses!) and such.

It was much simpler than one thought, but sufficiently weird for ghosts. Like they sang,

You can get anything you want

At Alice’s Restaurant…

Irreducible… that link does fill in a lot of steps, colors in our paint-by-number dreams, but it is not general enough for comprehensive foundations… We need such details even if they may not support one side of our ideas as to what in physics is foundational. For example, frame dragging seems good evidence for the view that such fields are physical in effect (among so many possibilities). By the way, on Facebook I left a message for Sabine asking her what the hell gravity is anyway, in all modesty… It does not seem to me as straightforward a concept as it so obviously has been taken to be, even if we take GR as a cue. But does string stuff tell it any better?

Are you ready for the Juno Earth flyby? :) Here is a prediction for the flyby anomaly: http://www.toebi.com/blog/applications/juno-flyby-anomaly/

With more accurate mission data during the flyby I could calculate even better predictions, but those predictions in my blog post are good enough. It will be interesting to see how ToEbi passes this experiment :)

It is perhaps not in the scientific spirit to express feelings informally, as passion for the frontier of technology and science. My first reaction to the suggestion that I could no longer post on the scienceandphilosophy chat forum, and that I really should have a blog, resulted in my abandonment of that forum, for I understood blogs to be a marginalization of content not worthy of the established journals and institutions of science. Yet has that not been the case for every new medium? Radio, television, the internet: at the beginning, promises are made and expectations raised that it will inspire the young and make education accelerated and accessible to all. Something that feels right but periodically fails, even in a program of clear agendas.

The spring of the new physics, influenced by social media, has resulted in a hopeless loop of civil war within the theory community, a virtual but moral equivalent of religious war. The system has less tolerance for moderate compromise; those not for “us” are said to be against us.

But do we not need passion? Not so much to protect our turf against weapons of mass consumption, as if the world’s end in certain doom were the ultimate prophecy, or, as Einstein suggested, that once weapons are invented it is inevitable they will be used. What we need passion for in the business of science is to uplift our civilization beyond its narrow present concerns. We may be naive to imagine that if a human being could understand, and so be inspired, much of the less noble passions would be abandoned freely and compassionately for all who cling peacefully to their ways of belief and life. Given a chance, the mainstream on the whole shows the rewards for all who can see the better paths and the natural desire awakened in this risky but benevolent universe.

There is intermediate work to be done as we explore various models that some have dedicated their lives to, filling in the missing ideas as best they can if their science is adaptable. This direction may give us unexpected and correct new ideas as well as useful and careful technologies.

In the end we should strive to judge our own art over the whims of others, if in such matters we know, and know on what level our comprehension or concepts are right, and insist as much even against mass fact telling us they are wrong. But in this intuitive cloud of reasoning, if it can crystallize from the vagueness, we can make fundamental and original mistakes… A measure of how rare they are is perhaps a better measure of what is genius or crackpottery, or of which in consensus seems right.

When the top quark was discovered long ago, I considered it evidence of a mistake of my intuition, where in informal numerology I thought there could be no more than five quarks, symmetry senses aside. Well, the gifted Einstein was said to be on a futile search for unity after his first revelations, and his self-evaluation (now ignored, and pursued as a fact) held the cosmological constant a mistake. I myself, not finding the things I called quasic points in the universe, a distinction from the uniformity of galaxies, almost abandoned my pursuit of science against the fact that no such entities could I find in the library. So I enlisted, for the nation, with hopes of learning a practical skill such as radar, in both respects following my father’s footsteps.

Now I see on the snarxiv blog the idea of five light quarks, by someone in good standing. After all, I reasoned, why throw away the fifth and greater levels of acceleration, save for the fact that these ghosts of departed quantities did not contribute to what is physical, thus invisible, with nothing to gain for our cliché, taken as a philosophic axiom, that “time is the fourth dimension”? Even the young Riemann said as much, but he knew a lot of things he did not say or explore; he left them for new theorists.

So, geometrically, we can make connections from the number patterns up to and beyond echoes in five-space. The other numbers would follow in theory, such as the triality of threes and other time-like or ghost-like dynamic portals in the count of particles, even within the same level or theory model. But make no mistake: I do not believe we can put the physics beyond it into simple integer ratios (although getting around this apparent paradox may require new ideas on numbers, even on the idea of complex number systems).

What seems to come out and appear in my writings, from my subjective view, does not always match what is within; a subjective time thing, I suppose. In my stages of aging I simply do not like what I see in the mirror, not that it matters once you arrive at some level of a shining star, one that has come so close yet failed, this generation around, to reach and understand our collective life’s possibilities and goals. That is the dedication and sacrifice science promises and extracts from us, the wide sea of isolated and failed stars. For I do paste onto the universe, as I asked as a younger soul, when things begin, how we work it out, and when and if they end… this classical triune pattern where, in fact or dreams, we age or stay young; a nine-fold thing really, in a wider sea, thus my Alpha, Mu, and Omega Omnium periods.

I say some work comes to an end, but it rarely does. With that in mind I will try to promise to conclude my comments here (it is hard to access this from the so-called “smart phones”). So if anyone desires to continue comments, corrections or dialog, please use my pesla.blogspot.com. I hope that from here, as from the popular and formal science news, some of you continue to share with me your breakthroughs on the frontiers.

* * * *

Edgar, I hope one day I have the patience to try to make sense of your posts… I find it really hard to follow… In the meantime, http://www.aeonmagazine.com/world-views/can-the-multiverse-explain-the-course-of-history/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+AeonMagazineEssays+%28Aeon+Magazine+Essays%29 :) Peace, have fun, –Stephen

Stephen, that link was a very clear read, and it does raise the issues of what this means for philosophy and the foundations of science as to how much we can know… Of course, a few people know or want to seek what is abstractly out there, and the lack of knowing is to risk a real irrelevancy of our being and theories, in the vague sense in that link. (So many here do have that basic curiosity, the kind that kills so many quantum kittens, essential to science.)

Is it enough in such a debate to state things formally and clearly, considering this climate (for example, the very post links that follow me here: the Great Vindications as a survey of the issues, yet still with the question “Is there nothing else offering a viable alternative…”)? For me there are at least a few things more which the debates have not yet found. Over the years, in conversation, people did not understand but did not say so until years later, when something kicked in and they were excited to tell me so. We expect, even disdain, the lack of honesty in the pursuit of experiment and observation. (Forgive my special wording here, force of habit, words grown in depth of meaning; I can speak Basic English and sometimes use it for those for whom this is not their native language.)

Deo Vindice… well, maybe time or something like that, as your link suggested with an optimistic notion of Leibniz. I do not claim to have all the answers, just a few more colors filled into our paint-by-number dreams…

So let me knock it down a notch: consider Sabine’s arXiv post on supertasks, posted yesterday… It presents a comprehensive picture of one stance toward theory. I think it also includes some social issues, debated hidden between the lines, for it is a view of the role of gender, of feminist environments, of some mysterious matrix or Mother. Most everyone can understand the symbolism of sex, if physics can be put into those terms. Active and passive is the issue here: is there a force, or a fall, or neutrality in gravity? But sex is much more complicated than our limited understanding and experience; like particles, it is at least a four-way deal. It is even humorous if we imagine, in cyclic theories, what anatomy may arise in the topology of three-space.

http://backreaction.blogspot.com/2013/09/what-is-special-relativity.html?spref=fb

http://arxiv.org/abs/1309.0144

http://lblogbook.cern.ch/Shift/72477 “End-Of-Fill Calibration Starting”: what’s going on over there? Here’s a stupid thing that I’ll surely get flamed for… but what if the whole notion of the biblical devil’s tail is about the nasty stuff in the tails of distributions we can’t see, just out of range? http://answers.yahoo.com/question/index?qid=20071031094605AAvp39K

Flamed? Serpents are linear and lily pads are radial as natural evolving forms… a double vector, an active current in balance with the reverse flow of holes, makes the world go ’round. Even Zeus could throw a lightning bolt… Physics and religion are still chasing their tails… but love has its tragedy, for the force from one object to another, or from one to the other, is the same description vector-wise… :-)

https://en.wikipedia.org/wiki/Slippery_devil%27s_staircase aka the Minkowski question mark function :) Peace, Have fun, etc.
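For readers who have not met it, the Minkowski question mark function ?(x) linked above translates the continued-fraction expansion of x into a binary expansion, sending rationals to dyadic rationals. A minimal exact-arithmetic sketch (an assumed implementation of the standard formula, not code from any of the linked pages):

```python
from fractions import Fraction

def question_mark(x):
    """Minkowski ?(x) for a rational x in [0, 1], computed exactly.

    Uses ?([0; a1, a2, ...]) = 2 * sum_k (-1)^(k+1) * 2^-(a1+...+ak),
    where the a_k come from the continued-fraction expansion of x.
    """
    x = Fraction(x)
    assert 0 <= x <= 1
    result = Fraction(0)
    sign, total = 1, 0
    while x:
        inv = 1 / x                            # next step of the CF algorithm
        a = inv.numerator // inv.denominator   # partial quotient a_k
        x = inv - a                            # fractional remainder
        total += a
        result += sign * Fraction(2, 2**total)
        sign = -sign
    return result

# Classic values: ?(1/2) = 1/2, ?(1/3) = 1/4, ?(2/3) = 3/4
print(question_mark(Fraction(1, 3)), question_mark(Fraction(2, 3)))
```

Since rationals have terminating continued fractions, the loop always halts, and the symmetry ?(x) + ?(1−x) = 1 falls out of the formula, which makes a handy sanity check.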

therealsac… thanks, this goes way back, I see… a most interesting wiki post… I understand the p-adic view and some of Matti’s terminology now… Of course, being blind to each other’s minds, it is hard to know what is original, or what was lost or discovered again, when and where and by whom… Look, I mentioned you in a partial drawing on my pesla.blogspot.com, toward the bottom, this morning… It is but a first-blush informal sketch… You certainly seem to be fluent on the frontiers here, math-wise. It was fun, and will be more so when I put it into binary and figure out why the pattern is so irregular in my ordering and the natural one.

[…] following is my comment (http://blog.vixra.org/2013/07/18/naturally-unnatural/#comment-33501 […]

[…] following is my comment at (http://blog.vixra.org/2013/07/18/naturally-unnatural/#comment-33594 […]

[…] the big gadget (such as LHC), physics can be done with paper and pencil. In the Phil Gibbs’ blog (http://blog.vixra.org/2013/07/18/naturally-unnatural/#comment-33919 ), a new methodology of physics (beauty-contest) was discussed. Furthermore, SUSY (with […]

[…] the search must go on. Thus, this is, in fact, a physics issue, and a discussion at a physics blog (http://blog.vixra.org/2013/07/18/naturally-unnatural/#comment-33594 ) did discuss […]

Fine-tuning is evidence for the existence of a complete particle physics theory (CPT). Within such a theory all parameters are related and calculable, so no single parameter can be varied independently; hence nature appears to be fine-tuned.

I cannot believe that CPT is completely conserved and relates all the interactions of the physical world.

Antimatter does not exist in nature, nor do antiparticles; those originate in the deformations of the relativistic equations of motion in the transformations of mass into energy and vice versa, as well as in the asymmetry of space and time invoked to explain the symmetry of the spacetime continuum through the appearance of antiparticles.

You misunderstood; I was referring to a Complete Particle Theory or a Unified Field Theory, not the CPT symmetry. I do not like the term ‘Theory of Everything’.

What about the qualification “Many Aspects Theory”?

The physics theories QM and GTR are beyond CPT, which is broken into CP, PT and others.

Then we have several series that define different physical laws, which are inserted in several sets with transfinite classes.

These splits concern symmetries. Thus in fact they concern boundary/start conditions of models/sub-models. They do not concern different theories. At least not if you include symmetries as parameters.

For example, the Hilbert Book Model introduces special indices for discrete symmetries. Continuous quaternionic functions exist in 16 versions that differ only in their discrete symmetry. For the HBM the corresponding formulas still represent a single theory. In fact the versions can and will be coupled, and they do so for elementary particles.

I think that antiparticles appear from STR with orthochronous and antichronous Lorentz transformations, with the speed of light constant and invariant across reference frames in relative motion. The constancy of the speed of light is due to PT-symmetry breaking in the conjugation of space and time into the spacetime continuum.

Then the anisotropy of the speed of light is due to the breakdown of PT, and when PT is renormalized in the 4-dimensional spacetime manifold, the isotropy of the speed of light appears.

I have something to say regarding this “fine tuning”… what if the fundamental parameters of our universe change over long periods of time? What if something like the Weinberg angle can also change over time? If the Weinberg angle changes, then does not the electron charge also change over time? If the electron charge enters a phase in time where its magnitude is conducive to aiding the structural formation of life, and we look at it over our short lifespans of detection potential and naively call it “fine-tuned”, then is it really “fine-tuned” after all?
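The link the commenter invokes is real: in the electroweak theory the electric charge satisfies e = g sin θ_W, where g is the SU(2) coupling, so a drift in the Weinberg angle would indeed drag the electron charge (and the fine-structure constant) with it. A numerical sketch using the standard relations and rough Z-scale values; the drift scenario itself is the commenter's speculation, not established physics:

```python
import math

def fine_structure_constant(g, sin2_theta_w):
    """alpha = e^2 / (4*pi) with e = g * sin(theta_W), in natural units."""
    e = g * math.sqrt(sin2_theta_w)
    return e**2 / (4 * math.pi)

# Rough values at the Z mass: g ~ 0.65, sin^2(theta_W) ~ 0.231
alpha_today = fine_structure_constant(0.65, 0.231)
print(1 / alpha_today)  # ~ 129, close to the measured alpha(M_Z) ~ 1/128

# If theta_W drifted while g stayed fixed, alpha would drift with it:
for s2 in (0.20, 0.231, 0.26):
    print(s2, 1 / fine_structure_constant(0.65, s2))
```

The point of the toy calculation is only that the electroweak parameters are linked: one cannot vary the Weinberg angle in isolation without moving the charges that chemistry, and hence any life-harboring phase, depends on.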

As conditions in our universe change, and the parameters enter the life-harboring phase long enough for us to assume a fine-tuning, we are shortchanging ourselves. We are failing to look at a system where there is no deterministic fine-tuning, but rather an ad-hoc appearance of fine-tuning based solely on the fact that passage through this life-harboring phase of universal parameterization is simply INEVITABLE.

I haven’t followed all of the discussion between there and here, but can answer this point. In philosophical terms, this is more or less identical to the multiverse, or many sets of laws. The sets of laws are just spread across time instead of across space, or they’re spread across both. An individual area (or era) still has the same apparent fine-tuning, and the two possible explanations are still the same two, as in posts above.

But in the version with many sets of laws, there is certainly no inevitability, unless you postulate an infinite number of sets of laws, which seems to involve adding rather a lot of excess baggage. It also seems a potentially unscientific way to solve a problem – science narrows things down, rather than widening them out until everything one needs to explain is predicted.

Different sets of boundary conditions and starting conditions can have similar effects as different sets of laws.

Mathematics does not occur in different versions.

It is possible to build completely deduced physical models that are based on a solid and well accepted mathematical foundation. At least our universe seems to correspond to such a model.

Hans, you are right, for the idea is that the universe seems to be its own laws, as if fine-tuned (though we need a few better things in our mathematics; your treatment as a pure abstract geometry feels an even more solid foundation on this issue of laws, on multiple universes or just one). Evidently there will be a few surprises in finding that some things are universal, inevitable and predictable (this is almost too obvious to see, or to predict to some end point, as I now feel part of a wider picture, even alongside those whose reality cannot, from so widening a view, find solid ground nor see).

So Bill, these issues do need the methods of philosophy as well as science… but note that science also asks why as well as how.