Questioning the Foundations: 4th FQXi Essay Contest

The Foundational Questions Institute has announced its 4th essay contest on the question “Which of Our Basic Physical Assumptions Are Wrong?” Scientific American are co-sponsors again, along with Gruber and submeta. In the third contest I managed a “4th” prize, so I will probably have another go. Anyone can enter, and past contests have seen a range of authors from amateurs to well-known professionals. Last year several viXra authors made it into the final cut of 37, and it would be great to see more this time.

The subject this year is very open and will suit anyone interested in foundational questions. If your ideas are well outside the mainstream of physics don’t be afraid to enter but don’t be disheartened if you don’t get good results. The important thing is making your contribution and joining in with the comments on the essays.


This entry was posted on Friday, May 25th, 2012 at 1:13 pm and is filed under Physics, Prizes. You can follow any responses to this entry through the RSS 2.0 feed.
Both comments and pings are currently closed.

29 Responses to Questioning the Foundations: 4th FQXi Essay Contest


I think it’s of key importance to find out, experimentally through astrophysical observations, whether the mass or energy of the world stays constant or grows, i.e. we have to observe the world’s past density more accurately. The fact that at least currently the age, size and density of the world fit together (or, in cosmological terms, that the observed density is near the critical density), if not just occasionally and only currently (the flatness ‘problem’), is a strong indication that the mass grows.
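For reference (this is standard cosmology, not part of the original comment), the “critical density” invoked here is defined as

```latex
\rho_c \;=\; \frac{3H^2}{8\pi G}, \qquad \Omega \;\equiv\; \frac{\rho}{\rho_c},
```

and the flatness problem is the observation that Ω is close to 1 today even though Ω = 1 is an unstable point of the standard Friedmann evolution.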

If we know that the mass grows, then we have to consider that the world started, roughly, with zero or one elementary mass. In more detail: neither the dimensions nor the physical forces, nor logic, geometry, physics and their laws were pre-fixed; they originated in the first development steps of the world.

Then, when time and space arose, geometry and mathematics were still extremely simple. More exactly, they were no more complex than the, say, 8 events / facts already created at that time.

Thus, the essential (and for us now physically relevant) theory of time and space cannot be as complicated as suggested by all the models discussed here and elsewhere! This is why I consider many of these theories junk.

If, for example, we restrict ourselves to time and one space dimension, then the essential physics is expressed by the simplest rules of quantum mechanics and special relativity. For example, the whole of special relativity is expressed by one individual characteristic (the light speed c as the proportion between the elementary units of spatial extension and of time), besides one generic characteristic inherent in all dimensions (that the metric is quadratic). With present-day mathematics, theoretical physics can describe things unlimitedly more complicated, but the realistic, essential content is just what I said. Time, which obviously arose before space, is much simpler to describe than space; more exactly, proper time is very similar to a causal sequence or world-line with events, and the difference between the two is fixed by only one characteristic.
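The “quadratic metric” claim can be written out explicitly. In one time and one space dimension the invariant interval of special relativity is the standard

```latex
ds^2 \;=\; c^2\,dt^2 \;-\; dx^2 ,
```

where c indeed enters only as the conversion factor between the units of time and of spatial extension, as the comment says.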

The characteristics of time and space were fixed definitively at the time when they arose, and this had to be at the very beginning, before everything else. Later, additional arbitrary events could have changed only details, such as the ‘figure’ and the ‘time development’ (i.e. the metric coefficients); in this process, dimensions that arose later should not act back on already existing ones, so that the global metric coefficient of the action should not depend on time, that of time not on the scale factor of space, and that scale factor not on the two other space directions, i.e. the angular directions. However, as said, the dimensions themselves, and the relationships among them (like inertia, expressed by special relativity as the relationship between time and extension), were fixed at the very beginning, when almost no mathematics existed yet, and those should be simple, not complicated. Special relativity is an example of this simplicity, and so is the principal part of general relativity (leaving aside details caused by local mass / energy accumulations and local space deformations).

This we should separate from the next problem, the theory of the elementary particles. These arose later, when a bigger number of occasional events had occurred, and mathematics / geometry / physics could be more complex, without this, however, being able to change back the already fixed general and global characteristics of time and space.

David Brown,
Can you think of fundamental questions about M-theory? The most fundamental one I can think of is criticality. And of course the questions of time and scaling?
Not to mention gravity and negative matter? Or negative gravity?
There are lots of questions that would need a reformulation?

Observable expanding space/energy is unlimited and therefore discrete as one builds up fractals. As fractals transform due to interference of particles/strings/waves, physical states compound on – the reason behind pattern formation. There are some unknowns we don’t know, and yes, certain assumptions by brainwashed ‘scientists’ do determine stagnation of the process of perceptual evolution. We do not care about clear fractals, we care about the process (forces/potential) behind time as matter compresses. Matter may be measurable via assumed/adopted scales due to convenience, albeit matter self-interference is not perceived outside of the light of interference of particles/strings/waves. Reality can be seen only through anti-real space, that is, an oscillating anti-meta reality is within the core of philosophical discreteness of measurable quanta. As Newton’s balls get smaller and smaller, and therefore their surface ratio increases exponentially, the magnitude as a complex vector gets highly netted – that is, emergence can be foreseen only and only if we look at reductionism as a reciprocal matter/time vs energy/space. That is, space cannot unify with time. And space as a fundamental definition does not equal mass.

I think that this is a typical question literally inviting crackpot activity, the kind of foundations of FQXi. There are many people who are “brave enough” to spit on any essential principles or foundations and trash talk them (or, ad hominem, trash talk top physicists of the past and present).

But all of this is worthless nonsense up to the moment when someone actually finds a theory that can achieve at least everything good that the theories that obey the principles achieved – and maybe something extra – out of a starting point that drops these principles.

It’s completely counterproductive to separate these two phases of the research – the dropping of principles from the finding of new constraints. One may also see that, historically, people probably never had the right list of “which principles should be dropped” to start with. They first had patterns and glimpses of a new theory, and while trying to make the picture involving these new patterns more complete, they were forced to accept conclusions that also implied that some older principles were not true at the fundamental level.

The instructions say this:

“Successful and interesting essays will not use this topic as an opportunity to trot out their pet theories simply because those theories reject assumptions of some other or established theory. Rather, the challenge here is to create new and insightful questions or analysis about basic, often tacit, assumptions that can be questioned but often are not”

Your statements might make an opening to a good essay :)

agreed Philip

Wilhelmus

Well, good luck to them. Is there a bookmaker that allows me to bet against the emergence of a later appreciated paper within this FQXi framework? ;-)

I lost faith in this competition when Julian Barbour won in 2008 with his essay The Nature of Time. Talented guy no doubt, but why do amateurs believe that they can compete on the same level as young professionals just out of grad school? Still, I suppose it’s all harmless fun at the end of the day.

Dear Carla, it could be a harmless game, except that an increasing percentage of the attention, interest, and resources is circulating in the game instead of the real thing. So it’s like a music industry in which you may earn the same money by singing out of tune as by singing nicely. What will the resulting mix of music in the air be?

I would prefer dreaming up pet theories in combination with proposals for doable experiments in professional labs

The essay contest is funded by relatively small amounts of private money. Of course there is always a larger controversy about whether public funding is well spent but this is a different matter.

Dear Phil,

As you know I am an architect without scientific training on physics.

I would like to enter the 4th FQXi essay contest with an adapted version of my viXra paper called:

Experiments to determine the mass related Lightspeed extinction volume

around the Earth and around spinning objects in the Lab.

http://vixra.org/pdf/1102.0056v1.pdf

However I have the feeling that

A: I had better split the 4 experiments mentioned in the paper into 4 separate letters to the contest.

B: my (Dutch-based) English should also be polished into more natural English.

I would be very pleased if you could give me your reaction to both of these feelings.

Best regards,

Leo Vuyk.

The only advice I can give is to read the rules and guidelines. You can only submit one essay and there is a maximum size.

Well, what happens is that the historical lists were mostly on the right, orthodox side.

“Infinitesimal changes of angular momentum” was on the list of suspects from the start. It was probably the motivation for Newton to defer and rewrite some initial chapters of the Principia, after comments from “two friends” (unnamed, but my guess is Barrow and Collins). The issue was solved by the Planck constant.
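For reference, the resolution by the Planck constant amounts to the statement that angular momentum along an axis changes in discrete steps rather than infinitesimally:

```latex
\Delta L_z \;=\; \hbar \;=\; \frac{h}{2\pi} \;\approx\; 1.055 \times 10^{-34}\ \mathrm{J\,s}.
```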

Divisibility of space by atoms is also a complex topic; we can read of it in Galileo. The best answers, up to now, are the creation of new particles in quantum field theory, the splitting of strings in QCD and string theory, and the renormalisation group connecting multiple scales of length.

The finiteness or infiniteness of the speed of light was always an open topic, finally settled on the finite side. To be noted, the Epicureans preferred to use a neutral wording, saying that free elementary particles fly at the “maximum possible speed”, without relating it to the speed of light.

I hope that “Which of Our Basic Physical Assumptions Are Wrong?” really reflects the emerging recognition of the sad recent state of theoretical physics and the realization that something has to be done.

The period of stagnation has lasted for forty years now. It is high time for an attempt to understand what went wrong at the time when GUTs emerged and also what might be wrong even with the standard model (see this).

Personally I am not too optimistic about the contest. Same old names everywhere all the time both sharing and receiving the prizes: this is one of the symptoms of the disease.

Only a few people have won prizes more than once. Last year’s winner was someone not well known who does not have a position as a theoretical physicist. He just found an original way to present his ideas that many people liked.

Should basic physical assumptions incorporate the GR/SR debate? Why does SR not fit in with gravity? Or…?

Can it be biology/biophysic too?

It’s meant to be about the foundations of physics. If you think a biology/biophysics topic meets that condition you are fine to write about it, but it will be a tough case to make.

I am not sure what the GR/SR debate you are referring to is. The question will no doubt be taken up by many people who think mainstream ideas like GR/SR are completely wrong, but I don’t imagine that is what the organisers are looking for. If someone can make interesting and original points along those lines they might get some votes, but I cannot imagine it myself. If people have ideas about the way GR or SR break down at untested scales, that would be a different matter.

Yes, maybe I should write about Life? What is Life? That should be foundational enough?

I personally like this contest as something interdisciplinary, where non-professionals can also have their say. It is one of the few areas where they can bring forth their ideas. I think we need this.

The recent experimental situation at the LHC and the tragic fate of the superstring program certainly force one to ask whether something went wrong about four decades ago when GUTs emerged: later fashions have assumed GUT wisdom more or less as such.

GUTs force us to give up the separate conservation of B and L, but not the slightest experimental evidence has emerged for the decay of the proton. The identification of color as a spin-like quantum number is also questionable, so that even the standard model view of strong interactions could be wrong at very short scales. Also the relationship of QCD to the old-fashioned description based on hadrons, relying on the notion of strong isospin, is fuzzy: maybe we should have asked very seriously, does strong isospin really emerge from QCD?

We have debated the relation between special and general relativity here. The mainstream thinking is that Poincare symmetries are symmetries of general relativity. Noether’s theorem is however lost, and one cannot even define energy, momentum etc. as Noether currents; the attempts to identify them are completely ad hoc guesses. The situation remains the same for the long-length-scale limit of string models, which chooses the easy way and just assumes the existence of a QFT limit at long length scales (and ends up with the landscape catastrophe). This conceptual blunder might explain the failure to construct a quantum theory of gravity.
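For readers unfamiliar with the point about Noether currents: in general relativity the stress-energy tensor satisfies only the covariant conservation law

```latex
\nabla_\mu T^{\mu\nu}
\;=\;
\partial_\mu T^{\mu\nu}
\;+\; \Gamma^{\mu}{}_{\mu\lambda}\,T^{\lambda\nu}
\;+\; \Gamma^{\nu}{}_{\mu\lambda}\,T^{\mu\lambda}
\;=\; 0 ,
```

and because of the connection terms this is not an ordinary divergence, so it does not integrate to a globally conserved energy except in special cases (e.g. asymptotically flat or stationary spacetimes).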

Taking black holes as something given, without the slightest experimental indication that their interiors are what they are believed to be, also looks strange to me. Some people even suggest that they will be for the 21st century what the harmonic oscillator was for the previous one. It seems that many theoreticians are tinkering with just that part of GR which corresponds to the limit at which it fails. A thorough conceptual housecleaning is needed.

Particles are not particles, strings are not strings and waves are not waves, as there is only interference of ‘generations’ of particles/strings/waves. As limited states equal unlimited observables, I do not know what a generation is in terms of time – constant interference, as space is only functionally relevant to energy. Interference is absolute while observations may be relative. The reason may lie within the phase transitions of what one sees as an energetic field space.

For me it was an eye-opener to take part in the FQXi contest “Is reality digital or analog”. As a layman I thought that maybe my ideas were stupid and too simple, but I am really glad to have participated, and I learned a lot from the interventions of other laymen and professionals who encouraged me to pursue my quest. The new contest is difficult because it is omni-extensive, and 9 pages to explain your point of view is not much, but that just forces you to be to the point without too much blah-blah.

Wilhelmus

Here are 3 quick suggestions for fundamental assumptions that are holding physics back.

1. First and foremost, at the beginning of the 1900s physicists assumed that the Newtonian gravitational constant is absolutely the same on all scales of nature’s hierarchy. In a discrete fractal or conformal model of nature, this assumption fails badly. The value of G within an atom or subatomic particle has never been measured; it is purely assumed to be the conventional value. If G changes by large and discrete amounts for each cosmological scale [atomic, stellar, galactic], as predicted by Discrete Scale Relativity, then you get a whole new paradigm for understanding the structure and dynamics of nature.

2. We have assumed that strict reductionism is the “only game in town”. This is a bad assumption and flagrantly ignores the clear fractal and conformal properties of nature.

3. Physics has suffered because of its inability to bring the fundamental symmetry of relativity of scale into its theories. It has been wrongly assumed that scale is absolute. This is probably false and very misleading. Weyl, Einstein, Dirac and a host of others tried repeatedly to work relativity of scale into physics, but it never seemed to work quite right. However, if your emphasis is on studying nature, instead of studying Platonic models, then you can see how nature accomplishes this. Nature cannot have continuous conformal symmetry, because that strongly violates our empirical knowledge of nature. But discrete conformal symmetry does not need to conflict with empirical results. If the laws of physics, especially gravitation, are recast with discrete conformal symmetry, then you get a new and completely different understanding of nature in terms of a discrete self-similar hierarchy that has no bounds.

With this new paradigm you can unify GR and QM, explain the fine structure constant, demystify h-bar, resolve the vacuum energy density crisis, predict the exact nature of the dark matter, retrodict the masses of all particles (including the electron), and have a proper understanding of the hierarchy of Planck scales. This new paradigm predicted pulsar-planets, and it predicted the hundreds of billions of unbound planetary-mass objects recently inferred as roaming free throughout the Galaxy. It makes an exact prediction for the dark matter mass spectrum. I have a website that serves as a teaching resource for this new paradigm. RLO http://www3.amherst.edu/~rloldershaw Discrete Scale Relativity Fractal Cosmology
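The discrete-G idea in the comment above can be sketched numerically. This is only an illustration under assumed numbers: the scale factor `LAMBDA`, the exponent, and the function `g_at_level` are hypothetical placeholders invented here, not values or formulas taken from Discrete Scale Relativity.

```python
# Hypothetical sketch: G jumps by a fixed factor between cosmological levels.
# LAMBDA and EXPONENT are illustrative placeholders, not DSR values.

G_CONVENTIONAL = 6.674e-11  # measured Newtonian G in SI units (stellar scale)
LAMBDA = 5.2e17             # hypothetical discrete scale factor between levels
EXPONENT = 1.0              # hypothetical scaling exponent

def g_at_level(n, g0=G_CONVENTIONAL, lam=LAMBDA, p=EXPONENT):
    """G at cosmological level n, assuming G scales as lam**(p*n).

    n = 0: stellar scale (conventional G); n = -1: atomic; n = +1: galactic.
    """
    return g0 * lam ** (p * n)

# The conventional value is recovered at n = 0 by construction;
# neighbouring levels differ by the (hypothetical) factor LAMBDA.
```

The point of the sketch is only that "discrete" scaling means G is a step function of the level index, not a continuously varying field.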

In the last century Warren McCulloch wrote:

”As I see what we need first and foremost is not correct theory, but some theory to start from, whereby we may hope to ask a question so that we will get an answer, if only to the effect that our notion was entirely erroneous. Most of the time we never even get around to asking the question in such a form that it can have an answer.”

This statement is fully applicable to the topic of the 2012 Essay Contest.

I do not intend to participate in this competition, but I would read with great interest an essay on “Is gravity as a force of interaction a wrong assumption?”

I mean Erik Verlinde’s approach.

Well, then maybe we can only present questions? But the right questions are more valuable than answers. I think we must reformulate many questions. Nine pages is not much space to do that in.

Phil,

On a completely different topic…viXra is one submission shy of 3000!

Ready to celebrate this impressive milestone?

Cheers,

Ervin

No more this evening but hopefully tomorrow

“… finds a theory that can achieve at least everything good that the theories that obey the principles achieve …” The theory has already been found — it is M-theory. The whole deal is to add 5 or 6 new physical principles to M-theory.

MILGROM DENIAL HYPOTHESIS: The main problem with M-theory is that M-theorists fail to realize that Milgrom is the Kepler of contemporary cosmology.

Are Yoshio Koide, John P. Lestone, and Gerald H. Rosen underappreciated geniuses? What about Carl Brannen and Stephen Wolfram?

From a mathematical viewpoint, is M-theory the only game in town for the foundations of physics? Consider some claims: Milgrom, McGaugh, and Kroupa are among the best astrophysicists in the world. There are 4 main interpretations of quantum theory: (1) Copenhagen interpretation; (2) many worlds interpretation with the string landscape; (3) many worlds interpretation with the finite nature hypothesis; (4) local, realist interpretation with the universe hypothesis and with quantum randomness simulated by local, realist determinism. Bryan Sanctuary, J. Christian, J. F. Gourdes, M. Nagasawa, and others are correct in asserting that Bell’s theorem is somewhat incorrect in its physical assumptions. J. Christian’s idea that Bell’s SU(1) quantum states should be replaced by SU(8) quantum states is correct. The string landscape cannot be refuted, but modified M-theory with Wolfram’s automaton is the most plausible explanation for the space roar.

I think it is of key importance to find out, experimentally by astrophysical observations, whether the mass or energy of the world stays constant or grows, i.e. we have to observe the world’s density in the past more accurately. The fact that at least currently the age, size and density of the world fit together (or, in cosmological terms, that the observed density is near the critical density), if not occasionally and only currently (the flatness ‘problem’), is a strong indication that the mass grows.

If we know that the mass grows, then we have to consider that the world started, roughly, with zero or one elementary mass. More precisely, neither the dimensions nor the physical forces, nor logic, geometry, physics and their laws were pre-fixed; they originated in the first development steps of the world.

Then, when time and space arose, geometry and mathematics were still extremely simple. More exactly, they were no more complex than the, say, eight events/facts already created at that time.

Thus, the essential (and for us now physically relevant) theory of time and space cannot be as complicated as suggested by all the models discussed here and elsewhere. This is why I consider many of these theories junk.

If, for example, we restrict ourselves to time and one space dimension, then the essential physics is expressed by the simplest rules of quantum mechanics and special relativity. For example, the whole of special relativity is expressed by one individual characteristic (the light speed c as the ratio between the elementary units of spatial extension and time), besides one generic characteristic inherent in all dimensions (that the metric is quadratic). With present-day mathematics, theoretical physics can describe things of unlimited complexity, but the realistic, essential content is just what I said. Time, which obviously arose before space, is much simpler to describe than space; more exactly, proper time is very similar to a causal sequence or world-line of events, and the difference between the two is fixed by only one characteristic.

The characteristics of time and space were fixed definitively at the time when they arose, and this had to be at the very beginning, before everything else. Later, additional arbitrary events could have changed only details, such as the ‘figure’ and the ‘time development’ (i.e. the metric coefficients); during this, previously arisen dimensions should not act back on already existing ones, so that the global metric coefficient of the action should not depend on the time, that of time not on the scale factor of space, and that scale factor not on the two other space directions, i.e. the angular directions. However, as said, the dimensions themselves, and the relationships among them (like inertia, expressed by special relativity as the relationship between time and extension), were fixed at the very beginning, when almost no mathematics existed yet, and those should be simple, not complicated. Special relativity is an example of this simplicity, and so is the principal part of general relativity (leaving aside details caused by local mass/energy accumulations and local space deformations).

We should separate this from a further problem, the theory of the elementary particles. These arose later, when a larger number of occasional events had occurred, and mathematics/geometry/physics could be more complex, without this, however, being able to change back the already fixed general and global characteristics of time and space.

David Brown,

Can you think of fundamental questions about M-theory? The most fundamental one I can think of is criticality. And of course the time and scaling questions?

Not to mention gravity and negative matter? Or negative gravity?

There are lots of questions that would need reformulation.

In response to Robert,

Observable expanding space/energy is unlimited and therefore discrete as one builds up fractals. As fractals transform due to interference of particles/strings/waves, physical states compound, which is the reason behind pattern formation. There are some unknowns we do not know, and yes, certain assumptions by brainwashed ‘scientists’ do determine stagnation of the process of perceptual evolution. We do not care about clear fractals; we care about the process (forces/potential) behind time as matter compresses. Matter may be measurable via assumed/adopted scales of convenience, although matter self-interference is not perceived outside the light of interference of particles/strings/waves. Reality can be seen only through anti-real space; that is, an oscillating anti-meta reality lies within the core of the philosophical discreteness of measurable quanta. As Newton’s balls get smaller and smaller, and their surface ratio therefore increases exponentially, the magnitude as a complex vector gets highly netted; that is, emergence can be foreseen only if we look at reductionism as a reciprocal matter/time vs energy/space. That is, space cannot unify with time. And space as a fundamental definition does not equal mass.