If supersymmetry is found or excluded at the Large Hadron Collider, how will it affect your opinion of string theory as a unification of gravity and particle physics? This is a hard question and opinions differ widely across the range of theorists, but at the least any answer should be consistent with the laws of probability, including Bayes’ law. What can we really say?

A staunch string theorist might want to respond as follows:

*“I am confident about the relevance of superstring theory to the unification of gravity and the forces of elementary particles because, amongst other reasons, it provides a unique way to accomplish this that is consistent in the perturbative limits. Unfortunately it does not have a unique solution for the vacuum and we have not yet found a principle for selecting the solution that applies to our universe. Because of this we cannot predict the low energy effective physics and we cannot even know whether supersymmetry is an observable feature of physics at currently accessible energy scales. Therefore if supersymmetry is not observed at the TeV scale even after the LHC has explored all channels up to 14 TeV with high integrated luminosity, there is no reason for that to make me doubt string theory. On the other hand, if supersymmetry is observed I will be enormously encouraged. This is because there are good reasons to think that supersymmetry would be restored as an exact gauge symmetry at some higher scale, and gauged supersymmetry inevitably includes gravity within some version of supergravity. There are further good reasons why supergravity is not likely to be fully consistent on its own and would necessarily be completed only as a limit of superstring theory. Therefore if supersymmetry is discovered by the LHC my confidence in string theory will be greatly strengthened.”*

On hearing this, a string theory skeptic would surely shake his head vigorously and say:

*“You cannot have it both ways! If you believe that the discovery of supersymmetry will confirm string theory then you must also accept that failure to discover it falsifies string theory. Any link between the two must work equally in both directions. You are free to say that supersymmetry at the electro-weak scale is a theory completely independent of string theory if you wish. In that case you are safe if supersymmetry is not found, but by the same rule the discovery of supersymmetry cannot be used to claim that superstring theory is right. If you prefer, you can claim that superstring theory predicts supersymmetry (some string theorists do), but if that is your position you must also accept that excluding supersymmetry at the LHC will mean that string theory has failed. You can take a position in between, but it must work equally in both directions.”*


**The Tetrahedron of Possibilities**

What does probability theory tell us about the range of possibilities that a theorist can consider for answers to this problem? Prior to the experimental result he will have some estimate of the probability that string theory is a correct theory of quantum gravity and of the probability that supersymmetry will be observed at the LHC. In my case I assign a probability of *P*_{ST} = 0.9 to the idea that string theory is correct and *P*_{SUSY} = 0.7 to the probability that SUSY will be seen at the LHC. These are my prior probabilities based on my knowledge and reasoning. You can have different values for your estimates because you know different things, but you can’t argue with mine. There are no absolutely correct global values for these probabilities; they are a relative concept.

However, these two probabilities do not describe everything I need to know. There are four logical outcomes I need to consider altogether:

*P*_{1} = the probability that both string theory is correct and SUSY will be found
*P*_{2} = the probability that string theory is correct and SUSY will not be found
*P*_{3} = the probability that string theory is wrong and SUSY will be found
*P*_{4} = the probability that string theory is wrong and SUSY will not be found

You might try to tell me that there are other possibilities, such as that SUSY exists at higher energies or that string theory is somehow partly right, but I could define my conditions for correctness of string theory and for discovery of SUSY so that they are unambiguous. I will assume that has been done. This means that the four possible outcomes are mutually exclusive and exhaustive. We can conclude that *P*_{1} + *P*_{2} + *P*_{3} + *P*_{4} = 1. Of course the four probabilities must also be between 0 and 1. These conditions map out a three-dimensional tetrahedron in the four-dimensional space of the four probability variables with the four logical outcomes at each vertex. This is the tetrahedron of possible prior probabilities, and any theorist’s prior assessment of the situation must be described by a single point within this tetrahedron.

So far I have given only two values that describe my own assessment, so to pinpoint my complete position within the three-dimensional range I must give one more value. If I thought that string theory and SUSY at the weak scale were completely independent theories I could just multiply as follows:

*P*_{1} = *P*_{ST} · *P*_{SUSY} = 0.63

*P*_{2} = *P*_{ST} · (1 – *P*_{SUSY}) = 0.27

*P*_{3} = (1 – *P*_{ST}) · *P*_{SUSY} = 0.07

*P*_{4} = (1 – *P*_{ST}) · (1 – *P*_{SUSY}) = 0.03

The condition that the two theories are independent falls on a surface given by the equation *P*_{1} · *P*_{4} = *P*_{2} · *P*_{3} that neatly divides the tetrahedron in two.
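For concreteness, the independence case can be checked numerically. A minimal Python sketch (the variable names are mine, not from the post):

```python
# Joint prior probabilities for string theory (ST) and LHC-scale SUSY,
# assuming the two propositions are statistically independent.
p_st, p_susy = 0.9, 0.7  # the author's prior marginals

p1 = p_st * p_susy              # ST correct, SUSY found
p2 = p_st * (1 - p_susy)        # ST correct, SUSY not found
p3 = (1 - p_st) * p_susy        # ST wrong,   SUSY found
p4 = (1 - p_st) * (1 - p_susy)  # ST wrong,   SUSY not found

# The four outcomes are mutually exclusive and exhaustive,
# so the point (p1, p2, p3, p4) lies in the probability tetrahedron.
assert abs(p1 + p2 + p3 + p4 - 1) < 1e-12

# Independence puts the point on the dividing surface P1*P4 = P2*P3.
assert abs(p1 * p4 - p2 * p3) < 1e-12

print(p1, p2, p3, p4)  # approximately 0.63, 0.27, 0.07, 0.03
```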

As I already explained, I do not think these two things are independent. I think that SUSY would strongly imply string theory. In other words, I think the probability of SUSY being found and string theory being wrong is much lower than the independence value of 0.07 for *P*_{3}. In fact I estimate it to be something like *P*_{3} = 0.01. I must still keep the other constraints fixed, so *P*_{1} + *P*_{2} = *P*_{ST} = 0.9 and *P*_{1} + *P*_{3} = *P*_{SUSY} = 0.7. This means that all my probabilities are now known:

*P*_{1} = 0.69

*P*_{2} = 0.21

*P*_{3} = 0.01

*P*_{4} = 0.09

Notice that I did not get to fix *P*_{1} separately from *P*_{3}. If I know how much the discovery of SUSY is going to affect my confidence in string theory then I also know how much the non-discovery of SUSY will affect it. It is starting to sound like the string theory skeptic could be right, but wait. Let’s see what happens after the LHC has finished looking.
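The correlated assignment can be reproduced by solving the two marginal constraints once *P*_{3} is chosen by hand. A small sketch in the same spirit (the helper name is mine):

```python
# Given the marginals P_ST and P_SUSY plus a hand-picked P3,
# the remaining joint probabilities are fixed by the constraints:
#   P1 + P3 = P_SUSY,  P1 + P2 = P_ST,  P1 + P2 + P3 + P4 = 1
def joint_from_p3(p_st, p_susy, p3):
    p1 = p_susy - p3       # ST correct and SUSY found
    p2 = p_st - p1         # ST correct and SUSY not found
    p4 = 1 - p1 - p2 - p3  # ST wrong and SUSY not found
    # A legal assignment must stay inside the tetrahedron:
    assert all(-1e-12 <= p <= 1 for p in (p1, p2, p3, p4))
    return p1, p2, p3, p4

print(joint_from_p3(0.9, 0.7, 0.01))  # approximately (0.69, 0.21, 0.01, 0.09)
```

This makes the point about the loss of freedom explicit: after choosing the two marginals, a single extra number pins down all four joint probabilities.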

Suppose SUSY is now discovered; how does this affect my confidence? My posterior probabilities *P’*_{2} and *P’*_{4} both become zero, and by the rules of conditional probability *P’*_{ST} = *P*_{1}/*P*_{SUSY} = 0.69/0.7 = 0.986. In other words my confidence in string theory will have jumped from 90% to 98.6%, quite a significant increase. But what happens if SUSY is found to be inaccessible to the LHC? In that case we end up with *P’*_{ST} = *P*_{2}/(1 – *P*_{SUSY}) = 0.21/0.3 = 0.7. This means that my confidence in string theory will indeed be dented, but it is far from falsified. I should still consider string theory to have much better than even odds. So the skeptic is not right. The string theorist can argue that finding SUSY will be a good boost to string theory without string theory being falsified if SUSY is excluded, but the string theorist has to make a small concession too: his confidence in string theory has to be less if SUSY is not found.
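Both branches of the update are ordinary conditional probabilities; a sketch using the joint values given earlier:

```python
# Bayesian update of confidence in string theory after the LHC verdict.
p1, p2, p3, p4 = 0.69, 0.21, 0.01, 0.09  # joint priors from the post
p_susy = p1 + p3                         # marginal probability SUSY is found

# SUSY discovered:  P'(ST) = P(ST and SUSY) / P(SUSY)
post_if_found = p1 / p_susy              # 0.69 / 0.7, about 0.986

# SUSY excluded:    P'(ST) = P(ST and not SUSY) / P(not SUSY)
post_if_excluded = p2 / (1 - p_susy)     # 0.21 / 0.3 = 0.7

print(round(post_if_found, 3), round(post_if_excluded, 3))
```

The asymmetry of the two posteriors, despite the single correlation parameter, is the whole point of the argument.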

Remember, I am not claiming that these probabilities are universally correct. They represent my assessment and I am not a fully fledged string theorist. Someone who has studied it more deeply may have a higher prior confidence, in which case excluding SUSY will not make much difference to him at all, even if he believes SUSY would strongly imply string theory.

Phil,

Probabilistic arguments are not a substitute for science. I don’t see the real value of either one of these opinions, as they spark controversy but lead to no real progress in understanding Nature.

Oh, really, Ervin? Probabilistic arguments are not a substitute for science? All arguments in natural science are probabilistic.

Aren’t you exactly the layman with whom Richard Feynman discussed flying saucers here?

His comments seem to be perfectly designed to address your bizarre ideas about what science means. Or were you just joking?

Sorry if this comment is posted twice; you may erase one copy.

Lubos,

Your rant makes no sense. I was referring to arguments that substitute hard objective evidence with “would be” scenarios.

No, you were not talking about “would be” scenarios, as you explicitly described the target of your criticism as “probabilistic arguments”, which are a very important class of arguments in science. One could say that they’re the only class.

These arguments will have to be made and the data will have to be evaluated once we know what the LHC says, much like we do the same thing once we know that the LHC hasn’t found gluinos in a few inverse femtobarns of MET 7 TeV collisions, once we know the Higgs mass, and many other things.

The point is that “nothing qualitatively new and sharp is seen” changes the picture and opinions less dramatically than sharp discoveries and turns, which is why Bayes’ theorem and rational reasoning in general imply that the reactions to a Yes-discovery and a No-discovery may have different magnitudes.

Also, Phil wasn’t really talking about would-be scenarios. He was talking about the right evaluation of certain observations for a broader picture and these methods already have to be partially applied at this point because the LHC isn’t a thought experiment. It has already told us a non-negligible part of what it will have told us in 2020 when it’s stopped.

It seems to me that the string theorist here is on much more solid ground. The thing that seems pretty much certain about string theory is that it needs SUSY at some scale; but it need not be the electroweak scale. An LHC discovery of electroweak-scale SUSY is a discovery of SUSY in general, but an LHC exclusion of electroweak SUSY is not an exclusion at all scales, meaning that it ought to have less effect on a probabilistic argument about string theory than a discovery would.

Hope I didn’t jump too much on your continuation here…

Feel free to comment. I will add the probability argument tomorrow and you can see if it changes or confirms your view.

It doesn’t seem like the LHC can definitively refute SUSY, let alone string theory, so the real question is whether it discovers something that gives a big boost to some other compelling theory (and are there even any candidates here??) Barring that, it seems like theorists will just keep on working on whatever seems like the most mathematically compelling idea that isn’t clearly excluded by data.

Of course, I subscribe to the words of Phil’s string theorist.

There’s no symmetry between Yes and No – because the prior probabilities aren’t equal to each other and to 50 percent and because the conditional probabilities aren’t balanced and 50 percent, either. It’s often the case in science that the proposition A may be much more easily proven than the proposition non(A).

In particular, we often say that specific hypotheses may be falsified but they can never be proven.

A discovery of SUSY would falsify the idea that there can’t be any new physics of the stringy type; this falsification would be pretty much irreversible. On the other hand, a non-discovery of SUSY would provide us with much less far-reaching information.

The LHC may be business-as-usual that just extends the range of energies where the Standard Model is the only experimentally established theory of particle physics. But it may also be the device that totally changes the game. The latter is a paradigm shift so from this viewpoint, its prior probability is much smaller.

In reality, there are good reasons that something new beyond the Higgs should be ultimately discovered by the LHC, too. The odds are much more comparable than either extreme story suggests. At the end of the collider’s life, however, the odds that the LHC finds something new won’t be comparable at all because we will learn the answer.

I agree that the non-discovery of anything new would just turn the LHC era into an intermezzo and physics would return to the regime we’ve known for two or three decades now, the regime in which the consistency and mathematical rigidity and beauty are the primary referees. Of course, in this no-discovery case, everyone will have to take new, stricter limits into account, too.

Of course you can have it both ways (you who are a little philosophically disabled), whichever way the experiment might decide. Nature is asymmetric as well as symmetrical; the handedness may change but the color is conserved. Probability has its place in nature for laws. Let us not assume the opposite of the possible is impossible, but it is not necessary, nor is what is imagined mandatory (if not forbidden, as the QM saying goes).

Lubos, you are probably wrong, and all the shouting down of some vague opposition will not make it true. You have a tenable position only in the case where there are no necessary realities.

That, from a pragmatic existentialist view, is the case but one of more generalization than these first blush theories- on any scale of say energy (can we really not do it by theory?) the collider will awaken us to the fact of this wider scope of science and I agree with Ervin real progress (in understanding and falsifying our stages of ignorance as well as science.)

May the black holes in your mind in forcing things down to a false level of materialist reductionism feel the paradox that such isolation from dialog is a ground, even probabilistic, for the causes and aesthetics of Global Warming- the world is simply wider than string theory.

The PeSla

The key to supersymmetry and unification is the establishment of a principle of equivalence between space, time and energy. Indeed, space, time and energy-mass are the same: dimensions. Due to certain symmetry breakings they appear different to the observer, but this is only an appearance produced by the interaction of the observer with the breaking of the symmetry. This mathematical concept is provable from the extension of the Heisenberg uncertainty principle to extra dimensions, 11 dimensions. A prime number of dimensions is not factorizable and involves quantum entanglement. But the same uncertainty principle is equivalently factorizable into four terms: delta X (space), delta M (mass), delta X2 (space), delta T (time). The Heisenberg uncertainty principle extended to 11 dimensions, with quantum entanglement, factorizes the prime number of dimensions, 11, into the sum 4 + 4 + 2 + 1. In turn, this factorization implies the 32 supercharges, because 4 × 4 × 2 × 1 = 32. Going into technicalities: the equivalence principle meets special relativity, so that SQRT[(sinh(1/[2^{1/8}]))^2 + (cosh(1/[2^{1/8}]))^2] = (1/2) · ln(Msusy/mt), where mt is the top quark mass. I will write this article very soon, demonstrating that the scale of SUSY is 2 TeV or so. The mass of the lightest Higgs boson is 126.23 GeV, and this mass is consistent with the mass renormalization equation of the Higgs boson in the MSSM or NMSSM models. The angle beta is exactly Pi/2 – (2Pi/60), or 84º.

SQRT[(sinh(1/[2^{1/8}]))^2 + (cosh(1/[2^{1/8}]))^2] = (1/2) · ln(Msusy/mt)

mt = top quark mass

SQRT[(sinh(1/[2^{1/8}]))^2 + (cosh(1/[2^{1/8}]))^2] = 1.791465134

exp(1.791465134) = 5.9982

mSUSY = 172.7 GeV × 5.9982 × 2 = 2071 GeV

We should recognise that there is a minimum diversity in nature, of which one part can’t be expressed or substituted by another. If we have, say, 4 fundamental forces which can’t be reduced further, and which describe the greater part of the behaviour of nature (at least in the sense that physics expects), then I think it’s OK. Then nature is neither too complicated nor too simple.

If the future isn’t predetermined (and I hope it isn’t), then an EXACT model of nature would require as many parameters as linearly independent events have happened (and these are potentially about e^10⁶¹). With a smaller number one could describe it only approximately. But even such a TOE would be short-lived. Each event that happens ‘tomorrow’ will not be exactly predictable by all the existing rules (i.e., the news will not be ‘included’ in the past), and forms its own little new rule or natural force, expressing just that difference or deviation from the inert, dead continuation of the past, the difference which realizes and individualizes that new fact.

Thus, anyway we have to work with approximations. There is no need nor any possibility for a theory which includes everything.

To a high degree of precision, most processes can be described with a small set of rules. The only logical conclusion is that this set of ‘bigger’ rules (or cooling effect in an initial chaos) originated first, before all the ‘smaller’ rules. They also appear to us subjectively different, as the most basic natural forces and dimensions (up to space itself, as we are made of matter, and matter seems to ‘belong’ to space in some sense, perhaps because matter is a bunch or grouping of informations with just enough storage capacity for localization in space but not in more dimensions). We can be glad that, of the very large number of ‘significant big’ and ‘less-significant small’ rules or forces which model the past (a number which almost triples itself roughly every elementary time unit), just a handful are so relevant for matter and for us that we feel them as essentially different aspects (forces and dimensions); the rest appear to us merely as unpredictable derivations from them. But we can’t reduce this further.

It’s true that physics has succeeded in explaining subjectively very different and ‘complicated’ effects with a few rules (up to a certain accuracy). However, this has its limits.

I think a Bayesian analysis on the presence of the Higgs would be a better topic – there is real data to support prior probabilities and the results will come soon enough to validate the calculation.

Before trying to answer the question of the posting one should specify what one means with SUSY.

The standard SUSY is only one possibility and it looks more and more probable that Nature has not realized it. For me this would not be surprising: it is simply too brute. The conflict with separate conservation of B and L characterizing also GUTs and string models is perhaps the worst shortcoming. Here new thinking free of old dogmas is required. My opinion: if SUSY is found at LHC, its realization very probably turns out to be very different from that in mainstream theories.

Whether or not SUSY is found does not affect my views about string theory. In many respects it is an ugly brute-force attempt at unification and lacks deeper principles. If we want to make progress, we must ask what are the deep and beautiful mathematical ideas in it that allow a generalization.

Super-conformal invariance is certainly such an idea and has extremely beautiful generalization obtained by replacing 2-D string world sheets with light-like 3-surfaces: this also explains why space-time must be four-dimensional. Also the generalization of super-conformal algebra and Kac-Moody algebras to D=4 is required (http://matpitka.blogspot.fi/2012/06/about-deformations-of-known-extremals.html). I believe this is the path we must follow sooner or later.

The basic problems are caused by the massive sociological inertia of Big Science. An epoch that lasted for about four decades in particle physics – the era of GUTs – is about to end. How much time it takes for a new generation to take the lead and get rid of all the dead stuff?

Lubos says:

“No, you were not talking about “would be” scenarios as you explicitly described the target of your criticism as “probabilistic arguments” which are arguments and they are a very important class of arguments in science. One could say that they’re the only class.”

Again, you are misinterpreting my point. I did not say that probabilistic arguments have no merit. They obviously do in the sense that you are describing.

But my specific point is that, regardless of the utility of probabilistic arguments, they are not substitutes for objective evidence in science. You may have a different view on this matter, but I believe that real progress in understanding Nature can only be claimed when predictions are matched by experimental observations.

Dear Ervin, *objective evidence in science* and *probabilistic arguments in science* are the very same thing. All arguments are probabilistic and they only differ in the confidence level of how certain a conclusion about Nature these arguments imply. That’s why your comments completely misunderstand the nature of evidence in science.

For example, when we see quadrillions of photons coming from the Moon to a telescope every second, chances are that it’s because there is a localized source somewhere in that direction. But strictly speaking, the evidence for the existence of the Moon is still probabilistic because the photons have a nonzero probability to come from that direction by chance.

In the case of the Moon, it would be silly to talk about the possibility that the photons come from that direction by chance; the confidence level for the existence of the Moon is billions of sigmas. However, for particle physics near the cutting edge, one never gets this overwhelming certainty. Pretty much by definition of the cutting edge, all the evidence comes in a small number of sigmas and it matters whether it is 3 or 6 sigmas etc.

When the evidence becomes 20 sigma, one is already a latecomer in noticing it. It’s already textbook stuff. The actual research that decides the fate of theories and changes scientists’ minds is almost always comparable to 5-sigma statistical evidence. Million-sigma proofs only mattered at the beginning of science, when people had made no previous systematic efforts to do science right. But the low-lying fruit has been picked, and now when a scientist wants to find a new thing, it inevitably starts as a rather weak signal that gets stronger, so statistical evidence is always needed and decisive in modern science.
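For orientation, the correspondence between “sigmas” and chance probabilities mentioned here can be computed from the normal distribution. A sketch, assuming the usual one-sided convention:

```python
import math

def sigma_to_pvalue(sigma):
    """One-sided tail probability of a standard normal beyond `sigma`."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

# 3 sigma is roughly 1-in-700 evidence; 5 sigma, the conventional
# particle-physics discovery threshold, is roughly 1 in 3.5 million.
for s in (3, 5, 6):
    print(s, sigma_to_pvalue(s))
```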

Sorry if this is posted several times. Could you please erase the previous copies?

Of course all measurements are probabilistic in nature and involve a signal-to-noise ratio. But the identification of probabilistic argumentation with experimental testing is certainly not “black and white”, at least in my opinion.

In particular, when observations are not directly accessible, can one confidently use probabilistic arguments as a substitute for experimental proof in physics? Are the methods of cognitive science and knowledge representation legitimate tools for physics research, where one seeks falsification of a theory through laboratory measurements?

Just look what you started Phil!

My votes for the Most Memorable Comments so far are:

‘in [the] no-discovery case, everyone will have to take new, stricter limits into account’ (Lubos)

‘the massive sociological inertia of Big Science’ (Matti)

I wish my stats classes (boring) had been as interesting and relevant as this.

I look forward to your next instalment.

Thank you

To me the problematic aspect is that we are taught that there are only a few scenarios to be considered: standard SUSY, M-theory and superstring models, the Higgs or a SUSY variant of the Higgs, etc. We are told that we can select between options for these scenarios. This kind of setting of the stage is how the establishment uses its power.

I do not like this: all these scenarios provided for us could be, and very probably are, failures, and a discussion about that is what would be needed, rather than pondering whether we can continue to believe in M-theory if SUSY fails.

This isn’t simply a sociological issue, although sociological issues surely do disfavor some viable options. The tremendous experimental successes of the SM3 with massive neutrinos and of GR have profoundly limited the nature of BSM and GUT theories that are experimentally possible. New experiments (and not just at the LHC) threaten to exacerbate this limitation. And there aren’t all that many problems to which BSM and GUT theories would aspire to offer solutions either. There are only so many extensions of SM physics that can be devised that solve the problems and don’t run afoul of the experimental limitations. We have computer programs these days that can actually derive laws of physics simply from raw data, and those kinds of programs generally have stable solutions for given sets of data. There are only so many ways to skin this cat that make any sense.

It is like those parlor games where the only way to solve the problem is to violate some assumption you didn’t even know you were making because it isn’t expressly stated. Until every possibility that is consistent with that assumption is definitively dead, it is very hard to figure out which implicit assumption must be dumped to solve the outstanding problems your solution hopes to resolve. I suspect that until experiments tell us in no uncertain terms that an entire class of theories with certain features (perhaps, e.g., B−L conservation rather than separate B and L conservation) simply must be rejected, we will cling to them and won’t be able to make the leap that devises the final solution.

The trouble is that fundamental physics wouldn’t be possible at all without intentionally indoctrinating all physicists into a particular kind of groupthink with some deeply embedded assumptions, many unstated and unrecognized. And so one can’t simply adopt approaches that discourage groupthink; you can’t do big science without it. But the very groupthink which makes physics, and especially big physics, possible may well contain the source of our inability to come up with a better solution. Sooner or later, I have faith that the logjam will break, but the barriers to alternatives, both fundamental and sociological, run very deep.

“Sadly, I cannot imagine a single experimental result that would falsify string theory. I have been brought up to believe that systems of belief that cannot be falsified are not in the realm of science.” — Sheldon Glashow, quoted by Matthew Chalmers

http://download.iop.org/pw/PWSep07strings.pdf “Stringscape” by M. Chalmers, 2007

Supersymmetry cannot be refuted. Superstring theory cannot be refuted. The reason for the two preceding statements is that bizarre, contrived brane interactions can explain anything. However, modified M-theory with the finite nature hypothesis, i.e., Wolfram’s automaton, seems to predict the Rañada-Milgrom effect and the Space Roar Profile Prediction. M-theory cannot be refuted, but versions of M-theory with additional physical hypotheses are definitely refutable. Note that infinitesimal calculus can be used to create models of miracles and cartoon physics.

SUSY seems already to be in the morgue.

http://www.science20.com/quantum_diaries_survivor/new_cern_results_rare_b_decays_tombstone_susy-90861

Let’s wait to see the results of the autopsy.

Supersymmetry is a mathematical structure. SUSY is categorically similar to general relativity, where by way of comparison general relativity is almost just pure Riemannian geometry, and the physics is inserted as various constants or phenomenology. So as Phil’s string theory panegyric says, there is no unique vacuum solution and the physics of SUSY involves lots of phenomenology that is applied to SUSY. So failure to find SUSY at the LHC will be disappointing, but it does not rule it out. Finding SUSY at the LHC does not confirm string theory, but it would be an update which increases the probability that string theory is correct.

String theory is likely a theory of something, even if it is not a TOE. If strings exist then supersymmetry must be correct. We clearly do not live in a 26-dimensional bosonic string world. In the M-theory construction there must be some equivalent to the superstring theories, e.g. type IIA etc., which have supersymmetry. The logic is then “if string theory, then SUSY.” The modus tollens is then: if not-SUSY, then not-string-theory. However, not-SUSY will not be determined by the LHC. All that we might get is not-SUSY up to 14 TeV, where SUSY could still be lurking at, say, 25 TeV.

If SUSY is found this would be a Bayesian update, or new prior, which would lend support to the probability of string theory. There are reasons now to think that SUSY is theoretically reasonable, in particular how it provides a way around the Coleman-Mandula theorem. Without SUSY there is no manner by which gravitation can be quantized alongside gauge theories, for internal and external gauge(-like) fields are not interchangeable. This of course is no proof of SUSY, and I know a couple of people who think gravity simply is not quantized. It would be disappointing to find we live in such a world, and it would be disappointing if by some means we found that SUSY is completely false. However, the LHC will not completely falsify SUSY, but simply test for its falsification up to an upper bound in energy.

I recommend the book “The Trouble with Physics” (Lee Smolin). It gives an actual description of the mental closure of some scientific elites and of the unproven myths (they call them conjectures) at the base of modern theories in physics. Whoever points out the problems of the theory is automatically out of the community; whoever proposes something new is an enemy. The trouble with string theory, SUSY, and extra dimensions also affects phenomenology. The models have so many parameters that they have lost any predictability. They are often just a mix of myths. I hope the LHC will put a stop to this, but I fear it will not give a definitive answer this year.

Take the English wiki and compare the properties/categories of these people; then you will see for yourself what’s going on. But they merit being a crackpot community, preventing themselves from staying behind whilst others go forward. Read the Edda’s Lokiglepsa.

String theory is for one thing not really a theory. It should better be called a hypothesis. Even more it is a framework for hypotheses. The entire system is huge and it contains a vast number of possible constructions. For a number of reasons I think it is likely to be at least a theory of something, if it turns out not to be a theory of everything. Aspects of stringy structures have turned up in other areas of physics. In particular AdS stringy holographic physics has emerged in solid state physics. I suspect there is likely some emergent AdS-string-M-holographic physics going on in low energy nuclear physics. Nature has a way of displaying the same structure or symmetry in different guises at different energy scales. It would honestly be surprising if it should turn out that string theory is found to be completely wrong.

Smolin is a loop quantum gravity (LQG) theorist. What is funny about LQG is that its formalism stays close to general relativity, and on the face of it it seems a very natural way of thinking about quantum gravity. However, LQG has failed to be workable even to one-loop correction. The whole thing seems utterly unworkable, in spite of its so-called background independence. As John Wheeler put it, the complement of a great truth can itself be a great truth. The fact that LQG fails to work may well be telling us something very deep. There seems to be some fundamental obstruction to LQG, and understanding that could be very important. So LQG may in this sense be a very important development; its falsehood could point to something very deep.

For all these discussions about the correct theory, I repeat:

If the world started from a very simple initial situation (no matter whether one understands this as ‘physically’, ‘geometrically’ or ‘number-of-independent-informations/parameters’ simple), and there are several indications that this was the case, more exactly that it started out from just one information, namely the ‘yes’ of its own existence (unconditionally valid within the world and for us), whose ‘existence’ means its ‘action’ and the successive production of news, then any theory of time, space etc. can’t be complicated but must be very simple, because it can’t be more than an implicit representation of the number of independent events at that time!

If we assume that time and space originated at the very beginning (even in classical physics these dimensions are supposed to exist before anything else), then at least the global part of their theory (that is, of time and space as such, independent of later-arising local deformations) should have originated, with its general properties fixed, by just the next 1, 2, 4 effected events.

Rather obviously, the dimensions that originated in this way were: 0) discrete events/world-points, with their action as a second aspect or "necessary effect" of existence; 1) time, originated by one single event at the second evolution step/rank, thus having no relation to another "sideways" time-point but only to its causing action, so that time acquires the property that we cannot move within it and is closely tied to the sequence of events; 2) a kinematically defined extension, produced by the next rank of 2 events, so that the first space direction acquires the property of equally entitled coexisting space-points and possible spatial movements; 3) isotropy, or two further space directions defined geometrically transverse to the kinematic one by the next 4 events, likewise permitting movement in those directions. Each of these events was arbitrary; their "values" can be interpreted physically as natural constants (either "absolute", as their elementary units, or dynamically relative to their predecessor, formally as a quotient (e.g. of space relative to time, lpl/tpl = c) or geometrically as a new "unfolding direction" or aspect of nature), and the individual properties of the earlier dimensions become preconditions and thus generic properties of all the later ones. As matter is a "grouping" of 6-tuples of information arising later, it is tied to time and space and cannot store more dimensions, which is why we do not perceive further dimensions as such.

Thus there is no place for the complicated theories people propose. For each such theory, one should test whether it still works with a number of "space-points" and/or "time-points" of, say, eight or fewer. Most of these theories fail. Thus, complicated theories of space and time are garbage.

And we see immediately that special relativity, and the general shape of general relativity, pass this test. For example, the relationship between the one-dimensional spatial direction and time gives, geometrically, special relativity and, physically, inertia, fixed by just lpl as one new characteristic alongside the tpl produced one step before (or by c as their quotient), together with the quadratic metric, which enters as a generic property of all further dimensions.

The LHC is not the only tool in the barn. In a nutshell, LHC results can determine that SUSY behavior is not observed in reactions at a given energy level or below with considerable precision. In theory, one can always simply assume that it takes more energy to see SUSY with this tool, and that absence of evidence is not evidence of absence.

But there are other tools in the barn, most notably neutrinoless double beta decay, which is one of the few phenomenological consequences that can be predicted quite generically for SUSY at a given characteristic energy scale. SUSY that is distinguishable only at high energy scales produces more neutrinoless double beta decay; low energy SUSY produces less.

LHC pushes up the bottom plate of the vise. Neutrinoless double beta decay studies push down the top plate of the vise. We are very close to reaching a point where there is no space between the two plates.
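The "vise" picture can be made numerically concrete with a toy sketch. All bounds below are hypothetical placeholders I have invented for illustration, not real experimental limits:

```python
# Toy illustration of the "vise" on the characteristic SUSY energy scale.
# All numbers are hypothetical placeholders, not actual experimental limits.

def susy_window(lhc_lower_bound_tev, nldbd_upper_bound_tev):
    """Return the remaining open window (in TeV), or None if the vise has closed."""
    if lhc_lower_bound_tev >= nldbd_upper_bound_tev:
        return None  # no allowed scale remains between the two plates
    return nldbd_upper_bound_tev - lhc_lower_bound_tev

# Collider exclusions push the bottom plate up over successive runs...
lower_bounds = [0.5, 1.0, 1.5]   # TeV, hypothetical
# ...while double beta decay limits push the top plate down.
upper_bounds = [3.0, 2.2, 1.6]   # TeV, hypothetical

for lo, hi in zip(lower_bounds, upper_bounds):
    w = susy_window(lo, hi)
    print(f"window: {w:.1f} TeV" if w is not None else "vise closed")
```

The point of the sketch is only that the allowed interval shrinks from both ends at once, so a class of models can die without either experiment alone excluding it.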

Of course, one can create new versions of the theory that post-dict the data. But if a rate of neutrinoless double beta decay is observed that is low enough to imply a characteristic SUSY energy scale below the LHC exclusion threshold, then the entire class of SUSY models in which neutrinoless double beta decay is influenced by the characteristic SUSY energy scale is gone, and that is the lion's share of those models (far broader than merely the parameter space of the MSSM, for example).

Astronomy provides yet more tools for SUSY searches. Higher characteristic energy scales generically imply that the lightest stable supersymmetric particle (often just one or two of them) is very heavy (100 GeV and up) in theories that have a stable lightest supersymmetric particle. The existence of a heavy stable supersymmetric particle to provide dark matter is indeed a major remaining motivation for SUSY. But a wide variety of methodologically varied astronomical measurements are putting constraints on the properties of dark matter that increasingly exclude particles that look like a stable lightest supersymmetric particle of 100 GeV or more, basically ruling out R-parity preserving SUSY models. And if R-parity is violated, SUSY parameter space is narrowed as proton half-lives get longer, as upper bounds on neutrino masses get lighter, and as experimental limits on baryon number and lepton number conservation tighten; a variety of experiments, many outside the LHC, are tightening these screws.

And, of course, any result that provides positive evidence of a particle (perhaps via the narrowing of the dark matter parameter space) or a force that SUSY does not provide for at all also falsifies SUSY.

The key point is that in every "vise" situation, whole classes of SUSY theories, and not merely values of SUSY constants, can be ruled out. Bayesian expectations about SUSY, and by implication M-theory, should be affected much more by how deformed SUSY models have to be to fit the emerging data, as generic features of most models are stripped away one by one, than by mere adjustments of parameter spaces. As SUSY models start to look like epicycles and cease to solve the issues they were devised to resolve, they start to make less sense. Likewise, if other candidate theories for tying up the loose ends of the SM + GR world emerge, SUSY's status as the only game in town offering unification, which keeps it going, diminishes.
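The Bayesian point can be sketched with a single toy update over model classes rather than parameter values. The priors and likelihoods below are purely illustrative numbers, not anyone's actual credences or real physics:

```python
# Toy Bayes update over classes of SUSY models (all numbers hypothetical).
# Posterior: P(class | exclusion) = P(exclusion | class) * P(class) / P(exclusion)

priors = {
    "generic low-scale SUSY": 0.4,
    "deformed / tuned SUSY":  0.2,
    "no SUSY below LHC reach": 0.4,
}
# Hypothetical likelihood of the observed LHC exclusion under each class:
likelihoods = {
    "generic low-scale SUSY": 0.05,  # exclusion very unlikely if this class is true
    "deformed / tuned SUSY":  0.50,
    "no SUSY below LHC reach": 0.95,
}

evidence = sum(priors[c] * likelihoods[c] for c in priors)
posteriors = {c: priors[c] * likelihoods[c] / evidence for c in priors}

for c, p in posteriors.items():
    print(f"{c}: {p:.3f}")
```

Even with these made-up numbers the pattern is the argument's: a generic class that strongly disfavors the observed exclusion loses most of its probability mass, while the "deformed" class survives only at the cost of its genericity.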

I deleted some comments that risked leading to a flame war.

This whole article (not bad in itself 😉 …) bore a high risk of starting a flame war in the comments right from the start …

Phillip,

I see you assigned probabilities as if to a logical and geometric abstract structure (one which can bear on our interpretations of probabilities as applied to physics models), saying p1 + p2 + p3 + p4 = 1. Of course, cutting a tetrahedron in half through its edges leaves a square. So far so good, but is it not suggestive of higher dimensions and symmetries, given that geometrical analogies may apply as general truths to a physical system (objective or otherwise, if the distinction is made)? Should there not be a p5 here, even at the very low natural dimensions of groups and so on?

Should we not see a carbon atom as five-dimensional rather than as a tetrahedron? Does it not vibrate, logically, in 15 of the expected ways under microwaves, indicating at least an abstract four-dimensional object? Now, even such circular reasoning, which in effect assumes aspects of string theory, is shown from one point of view not to be capable of encompassing the logical axioms of arithmetic. This seems to me as fuzzy as the fuzzy logic of the plane between zero and one, between the possible and the impossible (where electrons are most certainly forbidden in some regions as impossible).

On the general logic of a more unified theory (if this is possible; quantum theory was once thought to make it unlikely), I find Poincaré close to a more advanced and wider realist view, and yet in matters of geometry his theorem, while true, is too rigid or limited a view, as against the insight into dimensions we owe to Riemann.

It is just a point of view in the age-old philosophical debates as to whether we can take, from a formalist position, a complete but logical choice of possible worlds (string landscape choices as unique as numbers, if we can indeed so define laws).

In these matters of the 26-dimensional bosonic theory, which deals with an extreme of vacua and of singularities, we might do well to find some beautiful mind that solves them with odd-sounding words based on the English alphabet, and then ask why such a jumbled mind found such solutions, as Nash did. Or is that sort of mental coincidence a little more improbable than chance?

The flame wars are as old as our history of core debates, if one is aware of alternative positions and not closed-minded; but the fire is contained in the pistons so as to do work, until we run out of such fossil fuel. If the theorists one day cannot find work, they will be overqualified but competent to ask, "How many gallons will you have, Sir?"

Of course, practically, it takes the technology of steel to make the gasoline engine possible as an advance over iron and steam, and maybe a little lead in the mix for anti-knock.

Now, if experience (experiment) and predictability mean the same thing, and strings or supersymmetry cannot be experienced, what does that mean for ideas of measures of ignorance, or for so-called falsification by these inductive methods, for any solid or higher-dimensional theory of prediction?

The PeSla

Whether SUSY is found should also be indexed to a time axis. Whether SUSY is found in 5 years, 10 years, or 100 years makes for very different probabilistic estimates in people's minds. Your argument could be expanded with a P_SUSY(t).
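One hedged way to sketch such a time-dependent estimate: assume, purely for illustration, that credence in low-energy SUSY decays exponentially as null-result years accumulate. Both the starting credence and the decay rate below are invented parameters, not anyone's measured opinion:

```python
import math

def p_susy(t_years, p0=0.5, decay_rate=0.1):
    """Illustrative P_SUSY(t): prior credence p0 decaying exponentially
    with accumulated null-result time. Both parameters are hypothetical."""
    return p0 * math.exp(-decay_rate * t_years)

# Credence after 5, 10, and 100 years of null results under these assumptions:
for t in (5, 10, 100):
    print(f"P_SUSY({t} yr) = {p_susy(t):.4f}")
```

Any monotonically decreasing function would make the commenter's point; the exponential form is just the simplest choice.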

How about this …

1. We live in a 3D universe because three is the minimum number of dimensions required to create massive particles.

Why …

2. I believe there is no such condition as a rest state (zero motion); every joule of energy is in motion (in transition). So when any point of energy expands (high density to low density or vice versa), whether it is the Big Bang or electron-positron annihilation, the process is not spontaneous, as Dirac theorized, but occurs in finite time intervals, Planck time, tp (or lower?). The energy-time uncertainty principle, Delta-E x Delta-t ~ h, also points to this time-interval mechanism, i.e. the time in the energy-time uncertainty is the interval over which the quantum state remains the same, unchanged. …

So …

If the time interval is not zero then there will be an expansion of "space", Delta-X ~ Delta-t, i.e. motion and hence velocity. I speculate that c is the velocity that "came out of the wash" at the Big Bang, i.e. proportional to the initial conditions at t = 0. Hence if space (variable energy densities) is expanding at a finite velocity, c, then the distances between the isothermal "rays" will increase and hence create normal rays; the radiation is no longer collimated. It is this mechanism that creates vortices, "rotation", and hence the oscillatory waves that lead to standing waves and "massive" particles.

So …

If I can push my speculation a bit further, I would say that, as we have seen so far through the SM, quanta, repetitive and stable states of energy, will exist from the first fundamental "particle" created by the expansion, the Higgs boson.

Hence my question to you, Sir: could the Higgs boson = Dirac's particulate aether?

————————————————————————————————

How close am I in my interpretation of a “particle”?

In the standing spherical wave concept, the energy in that sphere (packet) is E = h * c / lambda, where h is Planck's constant, c is the speed at which the peak moves in the sphere, and lambda is 2r (r being the spatial radius of the sphere).
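For what it is worth, the quoted relation is just the standard photon-energy formula E = hc/lambda; a quick numeric check, adopting the comment's identification lambda = 2r (which is the commenter's assumption, not standard physics), might look like:

```python
# Photon-energy relation E = h*c/lambda, with lambda = 2r per the comment's
# standing-sphere picture. Constants are exact SI (CODATA) values; the
# identification lambda = 2r is the commenter's assumption, not standard physics.
H = 6.62607015e-34   # Planck constant, J*s
C = 299792458.0      # speed of light, m/s

def packet_energy(r_meters):
    lam = 2.0 * r_meters   # wavelength taken as the sphere's diameter
    return H * C / lam     # energy in joules

# Example: a sphere of radius 250 nm gives lambda = 500 nm (visible light)
E = packet_energy(250e-9)
print(f"E = {E:.3e} J")
```

For a 500 nm wavelength this gives roughly 4e-19 J (about 2.5 eV), the familiar energy of a visible-light photon.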

It takes the “peak” energy (density?) 720 degrees to make one cycle around the sphere (oscillating 90 degrees at a time from the center to the “surface” (amplitude?) of the sphere and back to the center).

The spin is the intrinsic rotation of the peak around an axis to complete one cycle (through x, y and z, i.e. 720 degrees). This intrinsic rotation is what gives the particle its angular momentum.

The electric charge is a measure of the effect by the “electric” field created by the peak oscillating between the center of the sphere to the “surface”. The electric field is the gradient of energy created in a grid of all the particles in the universe.

The mass is the measure of the momentum transferable from one particle to another and is created by oscillatory motion of the peak confined in a spherical space (quantum confinement, quanta space).

Speculations from my interpretation:

1) The radius of the sphere for any type of particle is derived by the principle of least action, the resultant effects of all the fields acting on the particle.

2) The attractive force, quantum gravity, is created by the oscillatory nature of the "wave" within the spherical space. When the peak moves to the surface it creates a negative pressure (tending towards "empty" space in the center), and by the principle of least action it must return to the center. Like all other fields, gravity is likewise the summation of these (quanta) negative pressures from all the particles in the universe. Hence, the gravity "wells" are greatest where there is a dense coalescing of particles: galaxies, stars, planets, etc.

3) The oscillations that have coalesced into "particles" (standing waves) were created by the expansion of the energy, space, and time system. The expansion of the universe (energy and space) could not be isotropic because of the time factor, i.e. instantaneity is not possible, and hence energy expanded in non-uniform densities. These variations in energy density patterns grow more and more complex, leading to the "coalescing" of space (the formation of "particles").

4) The fields and particles have a duality in the sense that all the particles create the fields and each particle affects the others through these fields.

————————————————————————————————————-

Fields and Particles

Are fields the interaction of particles of the same characteristics (quantum numbers)?

In reality the universe is a collection of different particles at different densities and in different arrays, the fundamental one being either the Higgs (or similar) or the graviton (or similar). In other words, as the universe cooled down, the first array of particles (and hence the first field) was created (coalesced): Higgs, graviton, or something else. As the temperature dropped further, more types of particles were created (coalescing at different quantum numbers); some interacted with the fundamental field and some did not (the reason could be a coincidence of Nature, with nothing to do with matching human mathematics).

So, I ask the question, if everything is made of energy at different densities, then what is energy?

PS: What is energy?

I would like to quote Narendra Katkar in one of his papers, "The Speed of Light, A Fundamental Retrospection to Prospection":

“The Universe is a process of Absolute transformation,

from Cosmic Primal Energy, CPE to Quantum to

Radiation and back to CPE Vacuum State.

CPE → QE → RE→ CPE

Energy is never created neither lost.

“Everything essentially is Energy”

What is Energy? …!!! ”

http://www.jofamericanscience.org/journals/am-sci/am0705/18_4719am0705_113_127.pdf