Congratulations to the winners of the FQXi essay contest “Questioning the Foundations”. The results show an impressive and diverse range of ideas about common assumptions that need to be questioned to progress with foundational physics. This was the fourth contest of its type run by the FQXi institute. These contests provide a unique opportunity for professional and independent physicists to cross swords in a public forum about this kind of subject. I know there will always be criticisms of the results and the imperfect voting system, but the contest is still a very worthy exercise. This year there were 272 entries, significantly more than in previous contests, so the top 36 from the community voting who made the final cut should be extra proud of their success, even if they were not among the final winners. This year I narrowly missed out on joining them, but there were many other good essays that did not make it either, so there is no reason to feel left out. Taking part and having a chance to air our views on physics is much more important than winning. One last word of congratulations goes to the Perimeter Institute since the vast majority of the winners had strong connections with the centre, such as being past or present researchers there. The Perimeter Institute is well-known for its research on foundational issues so their success here is not surprising. They should also be applauded for their culture which seems to encourage taking part when many professional scientists from other centres are too shy to try it.

The winning essay, entitled “The paradigm of kinematics and dynamics must yield to causal structure”, was written by Perimeter Institute theorist Robert Spekkens. The idea of questioning the separation of kinematics and dynamics is very original. I never thought of it in this context myself, even though I had previously made a similar point in a physics.stackexchange answer about a year ago. Spekkens goes on to link this to causality and the use of posets (partially ordered sets) in models of fundamental physics. This aspect of his essay is a perfect example of what my essay on causality is against. In my view the concept of temporal causality (every effect has a cause preceding it in time) is not fundamental at all. It is linked to the arrow of time, which emerges as an aspect of thermodynamics. It is not written into the laws of physics, which as we know them are perfectly symmetrical under time reversal (or more precisely CPT inversion). I therefore question why it needs to be used in approaches to understanding the fundamental laws of physics. My point did not go down well with other contestants, and Spekkens was not the only prize winner who advocated the importance of causality as something to preserve while throwing out other assumptions. Of course this just makes me more pleased that I chose to make this point; winning is not what matters.

Aside from that there is something else about the contest that is of special interest on this blog. According to my count, exactly 50 of the 295 authors (17%) who wrote essays have also submitted papers to the viXra archive. The number who have submitted papers to the arXiv is 95 (32%). This provides a rare opportunity to do a comparative statistical analysis of the range of quality of papers submitted to these repositories. By the way, 11 of the authors can be found in both arXiv and viXra (including myself), leaving 161 authors (54%) who have not used either. The authors who use arXiv are mostly professional physicists, because the endorsement system used by Cornell to filter arXiv submissions makes it difficult, though not impossible, for most independent scientists to get approval, so we can conclude that about a third of the FQXi contest entrants are professionals. However I am more interested in what can be learnt about viXra authors.
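As a quick sanity check, the overlap arithmetic above is just inclusion-exclusion, and can be reproduced in a few lines of Python using the counts quoted in this paragraph:

```python
# Inclusion-exclusion check on the author counts quoted above.
total_authors = 295
vixra_authors = 50      # authors also found on viXra (17%)
arxiv_authors = 95      # authors also found on arXiv (32%)
both = 11               # authors found on both archives

either = vixra_authors + arxiv_authors - both   # used at least one archive
neither = total_authors - either                # used neither archive

print(either, neither)  # 134 161
```

The 161 figure is what gives the 54% of authors who have used neither repository.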

I started viXra in 2009 to help scientists who have been excluded from the arXiv, either because they do not know anyone who can act as their endorser or because the arXiv administrators have specifically excluded them. Many people at the time said that viXra would only support crackpots and this opinion persists in many places. When someone wrote an entry for viXra on Wikipedia some administrators actively campaigned (unsuccessfully) to have it deleted calling viXra a “crank magnet” and concluding that it had no scientific value. Last month the wave of censorship even reached Google who suddenly removed all viXra entries from Google Scholar. We only had about 3% of our hits coming from there so it was not such a great loss, but it leaves us with no way of tracking citations of viXra papers which is a great disservice. This development reflects the opinions of many professional scientists who have said that viXra at best provides no value to science and only serves to keep crackpots in one safe place. Some are even less charitable and believe that it only promotes bad research and is harmful to science. Are they right?

When viXra was launched I said that it would also serve as an experiment to see if arXiv’s moderation policy was excluding some good science. Nobody should be surprised that there is a lot of bad quality research on viXra because it does not have any filtering and makes no claim to endorse its individual contents (personally I am of the opinion that even bad research can have value as a creative work and may even contain hidden gems of knowledge), but does it nevertheless have work of high value that would otherwise be lost? A recent paper by Lelk and Devine, submitted to both arXiv and viXra, tried to carry out a quantitative assessment of viXra in comparison to arXiv. It found that 15% of articles on viXra were published in peer-reviewed journals (based on a very small sample). This may sound low, but you should take into account that many independent scientists are less interested in journal publications because they do not need to produce a CV. In any case 15% of 4000 papers is a non-negligible count (roughly 600 papers) if you do think this is a good measure of value.

How else then can the value of viXra be assessed if the papers are not being rated via peer review? One answer is to use the ratings of its authors as provided by the FQXi contest. Essays in the contest were rated using marks from the authors themselves. This is not a perfect system by any means. There were essays that were placed either much lower or much higher in the results than they deserved. Nevertheless, the overall ranking is statistically a good measure of the papers’ quality in the terms demanded by the contest rules, with mostly good papers ending up at the top and bad ones at the bottom. It can therefore be used to collectively analyse the range of ability of the authors using either arXiv or viXra.

Let’s start with arXiv, whose authors have been endorsed and moderated by its administrators. Given such filtering it is easy to predict that they should do well in the contest. Here is a graph of their placings counted in ten bins of 29 or 30 authors each. The lowest rated essays are in bin 1 on the left and the highest are in bin 10 on the right.

As expected the majority of arXiv authors have made it into the top bins. 87 were ranked in the top half and only 17 in the lower half.

How would you expect the distribution to look for viXra authors? If we are indeed all crackpots as many people suggest then the distribution would be the opposite with most authors doing badly and hardly any making the top bins that are dominated by the arXiv authors. Here is the actual result.

In fact the distribution is essentially flat within the statistical error bars (not shown) and there are plenty of viXra authors who did well. Indeed, six viXra authors made the final cut.
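The decile binning behind these graphs can be sketched in a few lines of Python. To be clear, this is a hypothetical reconstruction, not the actual analysis: the author sets below are placeholders, and only the binning logic follows the description above (295 ranked authors split into ten bins).

```python
# Hypothetical sketch of the decile binning described in the post.
# Ranks run from 0 (lowest-rated essay) to 294 (highest-rated).

N_AUTHORS = 295   # total contest authors, as counted in the post
N_BINS = 10       # bin 1 = lowest rated ... bin 10 = highest rated

def bin_counts(author_ranks, n_authors=N_AUTHORS, n_bins=N_BINS):
    """Count how many of the given author ranks fall in each of the ten bins."""
    counts = [0] * n_bins
    for r in author_ranks:
        counts[min(r * n_bins // n_authors, n_bins - 1)] += 1
    return counts

# A perfectly flat distribution: one placeholder author at every 6th rank.
flat_sample = range(0, 295, 6)          # 50 placeholder authors spread evenly
print(bin_counts(flat_sample))          # [5, 5, 5, 5, 5, 5, 5, 5, 5, 5]
```

A set of authors clustered at the top of the ranking would instead pile up in the last few bins, which is roughly what the arXiv graph shows.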

What should be concluded from this? If someone is identified to you as an author who submits papers to viXra, how should you judge their status? Is it justified to assume that they must be a crank with no useful knowledge because they apparently can’t get their research into arXiv? The answer according to this analysis is that you should judge them the same way you would judge a typical author who has submitted an essay to the FQXi contest. They may not all be good, but they can be of a similar standard to the authors who submit papers to arXiv. I don’t suppose this will change the opinions of our critics, but it should. Google are happy to index FQXi essays on Google Scholar, so why should they refuse to index viXra papers?

By the way, of all the essays that were written by viXra or arXiv authors, the one that got the lowest rating was an essay by a Cornell professor who has four papers on arXiv. I’ll let you judge.

Phil, you say:

“One last word of congratulations goes to the Perimeter Institute since the vast majority of the winners had strong connections with the centre, such as being past or present researchers there. The Perimeter Institute is well-known for its research on foundational issues so their success here is not surprising. They should also be applauded for their culture which seems to encourage taking part when many professional scientists from other centres are too shy to try it.”

I may be missing something here, but can you reference any single research work produced by the Perimeter Institute that has convincingly solved an outstanding problem of contemporary theoretical physics?

Cheers,

Ervin

I wasn’t defending their research. I just said they are well known for it. In any case, can you give me an example of a paper from anywhere in the last 13 years that convincingly solved an outstanding problem of contemporary theoretical physics? (Just so I know what kind of thing you are looking for)

A few examples of ground-breaking contributions, some older than 13 years:

Bardeen’s involvement with the BCS theory, Landau’s work on the behavior of liquid helium, Feynman’s seminal discoveries on QED, the Glashow-Weinberg-Salam model, ‘t Hooft’s and Veltman’s work on renormalization of Yang-Mills theory, the discovery of asymptotic freedom in QCD, Nambu’s work in explaining the mechanism of SSB and so on.

But these are all much older than 13 years. If there are no suitable examples that are more recent why pick on PI which has only been around for 13 years?

I could barely believe it, but several of my papers were accepted to arXiv without an endorser.. but I think that’s only because I had a .edu address? http://arxiv.org/find/all/1/all:+AND+stephen+crowley/0/1/0/all/0/1 In any case, I think viXra rocks.. keep up the good work

I can appreciate your surprise: you have just disrupted analytic continuation in potential theory. If different fields can do inconsistent things at the same singularity, we’d better stop overloading symplectic representations.

Did you know that the ancient Babylonians used “zig-zag” functions to approximate planetary orbits? Harmonics was their strength, and I can now see how those Pythagorean triples served to estimate the curvature. They got way ahead of Greek astronomy, but how they did it has been a taboo subject in the so-called history of science.

You might pick up a trail in Gauss’ Disquisitiones Arithmeticae which covered all the methods in astronomy up to his own work there.

Hi Orwin… Let us mention also the calculations of the orbit of the moon and their sacred number 240… which finally around 1964 we understood as the number of eight-dimensional spheres around another.

ThePeSla

Hi, Orwin, thanks for the interesting remarks. I’m not sure if the functions I wrote about are actually harmonic functions in the sense of potential theory, but that is something for me to look into! I know that the Dirichlet Laplacian is related to the spectrum of fractal strings and drums, so surely there is some connection. The real mystery for me personally is determining what forces I was actually sensing here in Texas from the LHC… There were an amazing number of coincidences and incidents over the last couple of years… I could sense when something was wrong before things would show up in the logbook. If the Higgs field has properties that manifest worldwide.. does that not leave a lot of grey area in the realm of cause and effect and perhaps even law… and what role do international fiber optic and satellite connections play?

Stephen, what struck me at the time was a gauge warp, and Tommaso Dorigo later came out with Pauli’s Other Exclusion Principle: “Bosons get discovered in Europe, and Fermions in the US.” That was another piece of lost intellectual history, until you spoke up! And there, I think, you have a true mark of the subtle: yes, it’s integral to probability itself. Like the one cert. Princeton result: men tend to run bullish, and women bearish. So anything you could tell us about hypergeometric potentials would be very interesting. There is a hypogeometric cosmology out on the Web, but right on the fringe of the fringe.

Bulls and bears, of course, are characters in the old sense of Aesop’s Fables. Sadly, he was murdered for his pains, and Athens couldn’t trace his relatives – and philosophy was cut adrift from cultural imaginings – until the Pe Sla hails us from there.

I don’t think information is a field ripe for regular funding: it needs ‘seeding’, and space where ideas can be knocked around into some workable shape. I’ve been watching this for years, and there is still no recognition of the very concept in philosophy… so people chase binary structuralism, or cognitive behaviorism, or the computational theory of mind, and call it information. Meanwhile it’s happening in biology and systems engineering.

Philip, why not open a section in viXra? There seems to be enough interest to get a peer dynamic going.

In a more focused mode, Stephen, here’s an unusual application of hypergeometrics, to Vertex Operator Algebras, which covers the Higgs branching, where the anomaly now appears on Resonaances.

http://arxiv.org/abs/1012.5443v1

This is a 26D solution, like Penrose twistors, and actually an answer to the problem Matti tries to approach through braids. Lemma 3.4, proved in the appendix, involves the singularities and continuations.

Now turn everything inside-out, as by the Riemann map, and you come to transfinite diameter problems, the external measure of an arbitrary polygon. This has very important application to intrinsic viscosity and conductivity:

http://fire.nist.gov/bfrlpubs/build96/PDF/b96082.pdf

If one can recover the entropy/viscosity relation, that’s really all string theory adds to physics, so why not call the transfinite diameter a geometric string?

There is a 2-5% error, and here’s an approximate solution for the boundary layer, applied to Dirichlet problems (fractional subdiffusion), but leaving hypergeometrics for the integral method:

http://arxiv.org/pdf/1103.1586v1

So what if there’s a problem, an ambiguity with the hypergeometric approach here? Something that runs curiously “under the skin”?

Hi Orwin, interesting.. Didn’t know about the links between Virasoro algebras and hypergeometric functions.. I must confess that I don’t know much about Virasoro algebras, but I have studied Clifford algebras briefly and I found a paper linking the two algebras… http://arxiv.org/pdf/math-ph/0310044.pdf My stuff on viXra can be found at http://vixra.org/author/stephen_crowley

Triangle diagrams in the standard model: http://wwwthep.physik.uni-mainz.de/~davyd/preprints/davdub.pdf

Feynman Diagrams, Differential Reduction, and Hypergeometric Functions http://arxiv.org/abs/0901.4716

Fully differential Higgs boson production and the di-photon signal through next-to-next-to-leading order… http://www.researchgate.net/publication/222393846_Fully_differential_Higgs_boson_production_and_the_di-photon_signal_through_next-to-next-to-leading_order

Hi Stephen, that figures: I’ve always thought dimensional analysis or regularization is essential for any ground-breaking research. And now a clear view of partons in Higgs production.

My guess is that hypergeometric functions are the “ladder” to large N factor-structure and thermalization. But to resolve the algebra questions is like the leap up to second-order model theory.

The whole of exact science has been stuck within the confines of Church=Turing=Post=Markov=Formal systems, but the NNLO work on the Higgs is a break-out.

If the hypergeometric approach leaves some parameters undecided, the Standard Model is like that. And I’m seeing that the total potential of material processes is greater than all the laws of physical equilibrium, allowing for non-equilibrium processes. This overthrows the Comte/positivist hierarchy of sciences at last.

Weyl went “wide” with the affine connection, and found his metric morphing into the Schrödinger wave: it was the hugest double-take, and nobody followed.

Vito Volterra ran “wide” of Newton with his orbits, in population dynamics. Out there are Riemann-Stieltjes integrals and Fourier-Stieltjes transforms. These transforms bring an underlying measure into harmonics, opening on all the complexities of measure theory.

I don’t see the Higgs mechanism or any substitute resolved short of that.

OO: “My guess is that hypergeometric functions are the “ladder” to large N factor-structure and thermalization. But to resolve the algebra questions is like the leap up to second-order model theory.”

Me: I have a similar feeling, see http://arxiv.org/abs/math/9803067 — Polylogarithmic ladders, hypergeometric series and the ten millionth digits of $ζ(3)$ and $ζ(5)$ … “These constants result from calculations of massless Feynman diagrams in quantum chromodynamics. In a related paper, hep-th/9803091, we show that massive diagrams also entail constants whose base of super-fast computation is $b=3$.”
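As a toy aside (nothing like the super-fast ladder algorithms in that paper, just a naive partial sum to show what the constant is):

```python
# Naive partial-sum estimate of Apery's constant zeta(3).
# The paper above reaches millions of digits with ladder series;
# a direct sum of 1/n^3 is only good to about 10 decimal places here,
# since the tail beyond N terms is roughly 1/(2*N^2).

def zeta3(terms=200_000):
    return sum(n ** -3 for n in range(1, terms + 1))

print(f"{zeta3():.8f}")   # 1.20205690  (zeta(3) = 1.2020569031595942...)
```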

OO: “Vito Volterra ran “wide” of Newton with his orbits, in population dynamics. Out there are Riemann-Stieltjes integrals and Fourier-Stieltjes transforms. These transforms bring an underlying measure into harmonics, opening on all the complexities of measure theory.”

Me: Funny you mention Volterra… I’ve been running into his integral equations in my recent work on (stochastic) point process modelling with Hawkes processes.. so-called ‘self-exciting processes’ http://vixra.org/abs/1211.0094 The paper is geared towards finance applications but the mathematics applies to just about anything.. say clicks from a Geiger counter for instance.. has a non-Poisson background ever been theorized or detected? Back to finance, the Volterra link: https://www.maths.unsw.edu.au/sites/default/files/seppparjap.pdf
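For readers new to self-exciting processes, here is a minimal simulation sketch using Ogata’s thinning algorithm with an exponential kernel; the kernel and all parameter values are illustrative assumptions on my part, not taken from the linked papers.

```python
import math
import random

def hawkes_intensity(t, events, mu, alpha, beta):
    """Conditional intensity lambda(t) = mu + sum_i alpha * exp(-beta*(t - t_i))."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

def simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, t_max=100.0, seed=7):
    """Simulate event times on [0, t_max) by Ogata's thinning algorithm."""
    rng = random.Random(seed)
    events = []
    t = 0.0
    while t < t_max:
        # Upper bound on the intensity until the next event: the intensity
        # only decays between events, and "+ alpha" covers the jump from an
        # event accepted exactly at time t.
        lam_bar = hawkes_intensity(t, events, mu, alpha, beta) + alpha
        t += rng.expovariate(lam_bar)          # candidate inter-arrival time
        if t >= t_max:
            break
        # Accept the candidate with probability lambda(t) / lam_bar.
        if rng.random() * lam_bar <= hawkes_intensity(t, events, mu, alpha, beta):
            events.append(t)                   # each event raises future intensity
    return events

events = simulate_hawkes()
print(f"{len(events)} events on [0, 100); branching ratio alpha/beta = {0.8/1.2:.2f}")
```

The key qualitative point is visible in the acceptance step: every accepted event raises the intensity, so events cluster, unlike a plain Poisson process (the non-Poisson background mentioned above). Stationarity needs the branching ratio alpha/beta to stay below 1.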

Apparently the Hawkes process arises in quantum theory by considering feedback via continuous measurements, where the quantum analog of a self-exciting point process is a source of irreversibility whose strength is controlled by the rate of detections from that source. “Quantum theory of continuous feedback” available at https://docs.google.com/open?id=0B5kp8BrW_9rdZl9mVEExMUZoLVU since it’s behind a paywall. The cool thing about this paper is that they pay extreme attention to the notion of measurement and observer and end up with a solution, but are unable to tell how fundamental it is or whether it’s a number trick.

I wish understanding the Higgs would make the headaches go away…

Orwin, thanks for the arXiv vertex operator paper, I will review it carefully. My first impression is that it is long-winded, in an impressive language which I do not claim to know. I mean, for a long time the outsiders have been suppressed by those in power with some tradition or dogma, some ideology of theory. These outsiders face a great chance of never being heard. But so what if what they have to say, no matter how fancy and clean and complicated, simply does not address the best model and the foundations of the case that such is communicated?

Recall, Peter Rowlands on the foundations was rejected by that group and now you can find his work in the Scientific American book club. But from an overview of the formalism- plus a few ways I see things and really archived for myself, I suggest that this approach is Not enough to sort out the Higgs mechanism. It is not because there is a more comprehensive model that is right under our noses. Now, in this view Higgs is at least a philosophic entity or mechanism. The real fun and new questions are at the extremes where such formulas break down.

Now this approach can be part of the picture, a small part. In general the simplex method is flawed as a general theory even if some model is translatable with invariants to the Euclidean sense. It does not cover what actually are the properties of branes in singularities. Hypernumbers cannot do the job especially if we include ideas of tachyon behavior and all the paradoxes that indicates. Heat is not a matter of complex number transfer only nor the complex plane enough.

If these are new operators then I must agree with that part of it. Some of it reminds me of my recent singularity notations.

thePeSla

The Broadhurst paper is part of the mainstream knot-QFT program initiated by Broadhurst, Kreimer, Connes and others, i.e. the Hopf algebra structure of renormalization.

Interesting, marni… the odd thing is that last night I tried to draw and see how it applied to a wider knot theory — perhaps beyond QFT, or a start.

The PeSla

Thanks for the Volterra lead, Stephen, very welcome: sounds like the “Economy of Nature,” which is what the Physiocrats were doing before Positivism.

Here’s what I found on the Fourier trail:

arXiv:1204.5026v1 [math.AP] spatial boundary problem with Dirichlet-Neumann condition and Appell hypergeometric in the Green’s function.

Konno H., Elliptic quantum group Uq,p(bsl2), Hopf algebroid structure and elliptic hypergeometric series, arXiv:0803.2292.

arXiv:1205.2458v1 [math.CV] hypergeometric identities from Hardy norm of conformal maps

There’s speculative work on Gromov-Witten invariants of the Grassmannian, which seems to be a big thing in M-theory: arXiv:1101.2746v2 [math.AG]

But I’m much more impressed by this approach which has a thermodynamic foundation (Stirling’s formula!) and gets you to… analytic continuations!

arXiv:math/0510486v1 [math.AG]

generalized hypergeometric system introduced by Gelfand, Kapranov and Zelevinsky and its relationship with the toric Deligne–Mumford (DM) stacks recently studied by Borisov, Chen and Smith. We construct series solutions with values in a combinatorial version of the Chen–Ruan (orbifold) cohomology and in the K–theory of the associated DM stacks. In the spirit of the homological mirror symmetry conjecture of Kontsevich, we show that the K–theory action of the Fourier–Mukai functors associated to basic toric birational maps of DM stacks are mirrored by analytic continuation transformations of Mellin–Barnes type.

Just goes to show how hard it is to pin down an analytic continuation with complete rigour!!

I reckon there’s enough for a hypergeometric quantum thermodynamics with boundary conditions, which is to say a non-linear thermodynamics. Ilya Prigogine was on this trail, but veered off into superoperators, tracking the superselectors in string theory.

Hi Edgar. I’m minded to assert at this point that a D-brane is a complete artifact, a fiction, and the closest you’ll get to it now is what they call a “cut”, back in the lab at the LHC. A cut slices through the continuum of events to define a pre-geometry of representation, like the canvas of a painter, or the material of a sculpture.

It would be neat to wrap the proton in a D-brane, but we don’t have all the spin, we can’t integrate the angular momentum, so again it’s effectively a cut, isolating a set of quarks and gluons.

In Fourier-Stieltjes theory they say you get particular space-time dynamics in the relativity. For me that’s enough to establish that reality as space-time with stuff happening is plural, not unitary. In principle gravity draws it all together, and they keep promising (or asserting) a TOE, but don’t deliver.

Steve Wolfram is nudging us to wrap, to deliver a standard for quantum engineering, to get people back to work. He can feel it’s there in his database, but that’s the platform, not the package.

To let this happen I’m ready to set gravity aside for a future metatheory, to realize what’s possible on the Fourier trail. Metatheory could be a cool lobby where second-order model theory can gel: work like that takes way-out meditative patience and detachment.

Interesting reply, thank you. I speak of foundations btw… my quasic brane existed before it could be seen as brane theory… All of these leads seem to come together. Anyway, I tried to color code things in this sort of abstract algebra and geometry on my pesla.blogspot.com which I hope is clear and interesting. I will try to follow the links you gave here, so thanks for them. I am not so much concerned with wrapping a proton with a D-brane (have not heard the term really) but with the universe itself, to wrap my head around. But these abstract things are satisfying, especially when I find others using ideas that not long ago I would have rather kept to myself as they seemed so radical. Kind of you to talk with me.

The PeSla

The subject is ‘periods and motives’ where periods are a natural class of complex numbers and motives also underlie the more physical twistor formulation of scattering amplitudes. Broadhurst, Kreimer et al originally turned a Feynman vertex into a knot crossing and empirically showed that many multiple zeta values (MZVs) match chorded knot diagrams.

http://www.icmat.es/congresos/periods-and-motives/PM2012-program.pdf

Once there are knots, there are q-hypergeometric series and not just hypergeometric ones, where as always q is a deformation parameter for the algebraic structure. These are the things that Ramanujan studied long ago when inventing mock modular forms – at the very heart of modern motivic mathematics. There is no ‘other way’ here. Understanding motives is the key to everything and every decent mathematician knows it.

You’re welcome Orwin… I always thought a good way would be to make money in order to break free of modern office slavery, which is not good for health. Anyway, the entire LHC budget is equivalent to only a few hours or days of the atrocious debacle of the Iraq war, so on some level haven’t we accepted that money isn’t even “real” any more even though it’s “necessary”? There’s got to be some link back to fundamental concepts and resources, beyond all this hype of the internet changing things or whatever; it’s mostly filled with trash. Gotta dig for the ‘good info’ but there is a cost to that…

Anyway I was about to mention K-theory the other day.. don’t know much about it.. and as far as mirror symmetry.. it’s self-evident pondering one’s own reflection in any reflecting material at all and then pondering the role of mirror neurons identifying self on a rotational/translational/scaling/probably-more invariant basis.. but these processes happen on some collection of frequencies and the bunch crossings at the LHC happen on some others.. and meanwhile I’m sitting in an office with fluorescent lights and a bunch of LCD monitors blasting out their useless frequency modulations.. and we infer things and read the log books which makes us take a trip on the international fiber etc…

As far as theory goes, I am interested in a lot of things, but solving the tinnitus and perception thing is the most important for me.. and also civil liberties.. but I know physicists probably steer clear of these things.

Actually, this is no conspiracy/rabbit-hole stuff.. this is real research.. http://cogprints.org/6146/1/Microwave_Congruence_SchizophreniaPub.htm … the results.. they can hardly tell the two apart (RF interference or actual schizophrenia).

Meanwhile, Google is giving out free Wi-Fi to New York City, aren’t they just wonderful! Oh, and they hired Kurzweil, so we can all upload ourselves into the cloud or something. Give me a farking break.

Back on the Fourier trail, I enjoyed reading this paper last year,

http://arxiv.org/abs/1111.2983v1 Small representations, string instantons, and Fourier modes of Eisenstein series

That’s useful, marni; what with sheaves, stacks, shelves and the like, the math of physics is now a thicket of obscurities, and F-theory does not offer a clear direction.

On making money, Stephen, high-speed trading is pin money that pays for the platforms. Serious money is still put on “fundamentals” on the biological and demographic side of the picture.

Talking of which, I see microwave ovens are finished: people are buying broilers and steamers and contact grills, which do infra-red.

Offices where the people count have ferns and such to freshen the air. Sesame oil on your face and neck has a similar effect, and now I find caraway seed oil adds a penetration, to get to your larynx or behind your ears.

On time dilation: any sense will intensify with concerted practice, but a vicarious intensity suggests that some biological pacemaker is erratic or obscured. The nervous system is known to be pretty immune to interference, but biological oscillators importantly fall outside of that.

The really big money in research funding goes to medicine, and the “cognitive revolution” has failed in psychiatry. I was impressed to notice Russians taking up “excitable media” a generation ago, but oligarchy is no way to go. This is the time for something to happen, so people charge in all directions and trip over each other.

PS to Edgar: Combinatorial medicine from two thousand years ago:

http://www.ucl.ac.uk/~ucgadkw/papers/trivandrum.pdf

Still, patients react differently, and docs prescribe off-label and get busted by the FDA.

Hi Orwin, interesting comments as usual :) Going back a few posts…

Orwin: “Steve Wolfram is nudging us to wrap, to deliver a standard for quantum engineering, to get people back to work. He can feel it’s there in his database, but that’s the platform, not the package.

To let this happen I’m ready to set gravity aside for a future metatheory, to realize what’s possible on the Fourier trail. Metatheory could be a cool lobby where second-order model theory can gel: work like that takes way-out meditative patience and detachment.”

Me: I was just thinking some sort of ‘standard’ was needed.. but have you *seen* Mathematica’s syntax? It’s arcane.. like a programmer made it (and I’m a programmer.. I should know). I tried Maple from Maplesoft (Univ. of Waterloo) several years ago and never looked back.. you can program in it, but it’s primarily a mathematical language.. beautiful.

Orwin: So what if there’s a problem, an ambiguity with the hypergeometric approach here? Something that runs curiously “under the skin”?

Me: you know.. the skin shields out most alpha particles.. can they convert to photons when they hit? :) Maybe we can make some new sparkly cosmetic and make money that way.

Orwin: …Prigogine…

Me: I have a good friend who went to MIT who studied Prigogine’s stuff extensively and we’ve had many interesting discussions.. the problem we found was, how do you communicate this stuff to the people who need to understand it? It kind of falls out of Maslow’s hierarchy of needs and no one can think of a business case for understanding entropy or life really..

Orwin: On time dilation: any sense will intensify with concerted practice, but a vicarious intensity suggests that some biological pacemaker is erratic or obscured. The nervous system is known to be pretty immune to interference, but biological oscillators importantly fall outside of that.

Me: I recall becoming mildly obsessed with Norbert Wiener’s “Control and Communication in the Animal and the Machine” and his whole cybernetics thing a few years ago.. at one job I worked with a neuroscientist who left the defense industry because the artificial nervous system integrations they were developing spooked him out; that, and the overwhelming presence of big brother.

I’m going to try to use a Kindle DX paper display for a main monitor instead of an LCD and see if that helps my eye problems…

Orwin, this is not on the hypergeometric approach.. but maybe indirectly.. anyway, I found this intriguing.

http://arxiv.org/abs/nlin/0405058

The result? The zeros of Riemann Zeta could be momentum or wave vector eigenvalues of a Helmholtz operator on a suitable domain.

Zeta appears in the Casimir effect.. would understanding these resonances have anything to do with real particle physics instead of these billiard models? Or, are those protons smashing together the billiards in this case.. heh, heh.

A whole lot of background:

Gelfand on nonlinear Fourier transforms:

http://arxiv.org/pdf/hep-th/9504042v1.pdf

How Gelfand tamed Hilbert space:

http://arxiv.org/pdf/math/0607548.pdf

Dirichlet problem by Gelfand transforms:

(generalized Fourier-Stieltjes)

http://arxiv.org/abs/1212.1099

Representation theory for Hypergeometrics:

arXiv:alg-geom/9711011v3 6 Apr 1998

Gelfand’s hypergeometric theory:

(Calabi-Yau singularities!)

arXiv:0711.0464v1 [math.AG] 3 Nov 2007

But are the continuations unique? I have a real problem with this heavy category theory: it’s like the Medievals reasoning by analogy, or the Victorians and their correlations. Correlation never proves causality and not all transforms are reversible. So uncertainty enters the picture.

Motivic renormalization:

http://arxiv.org/pdf/0804.4824v3.pdf

Note the geometric model is still speculative!

Operational Axioms for QM!

http://arxiv.org/abs/quant-ph/0611094

The Wigner approximation!!

http://arxiv.org/abs/0804.0259

Maybe (some of) the math Matti is missing:

http://arxiv.org/abs/0807.3514

Underlying issues in set theory:

arXiv:1110.2013v8 [math.FA] 2 Oct 2012

At last, a hard look at different kinds of morphisms.

On the zeta function: yes, I see something like that with the transfinite diameter, beyond all the ambiguities of boundaries and boundary conditions. For Descartes and Leibniz, momentum exists only in a real medium….

Orwin, an interesting post… I am not sure why you would ask about certain math missing in a theory like Matti’s- I would like to see just how he explains certain numbers, like the use of the prime 19 in his theory, so I still feel I am missing something in understanding that part of it- unless it is so evident that such space models are implied if not transcended. Can you explain it to me?

In any case I have summed up a lot of things from my own explorations, today in illustrations on my http://www.pesla.blogspot.com blog called simply Journal Omnium. The idea is to get closer to the grounding or foundations of our theories, the reason behind why they are possible and why we so imagine them. It is a recreational math exercise perhaps, but in the struggle to comprehend the universe I have finally solved two of my problems based on information that solved the color matching cube problem.

This most likely will satisfy my quest for a while so I may not post as much, moving on to more practical pursuits unless something comes up radically different. Thank you for the dialog and the documentation. And readers of these comments, pardon my intrusions as I did not realize how much I responded to notifications here by email… to the kind keeper of this blog. Yet the question remains: where is the proper place anywhere for any of us to present a thesis?

L. Edgar Otto Eau Claire, WI

Orwin: On the zeta function: yes, I see something like that with the transfinite diameter, beyond all the ambiguities of boundaries and boundary conditions. For Descartes and Leibniz, momentum exists only in a real medium….

Me: Ahh that’s what I needed…more philosophy less electrons

Schramm-Loewner Evolution and Liouville Quantum Gravity –

http://arxiv.org/pdf/1012.4800

Watersheds are Schramm-Loewner Evolution Curves

http://arxiv.org/pdf/1206.3159

More :)

Multiple Schramm-Loewner evolutions for conformal field theories with Lie algebra symmetries http://arxiv.org/abs/1207.4057 <– has 2F1 hypergeometric functions!

Schramm-Loewner evolution martingales in coset conformal field theory http://connection.ebscohost.com/c/articles/80436657/schramm-loewner-evolution-martingales-coset-conformal-field-theory liberated from behind paywall at

https://docs.google.com/file/d/0B5kp8BrW_9rdR0FaaDBwSEhmYms/edit

I'm guessing by now.. far.. "off topic" hah

There’s no off-topic around here: I’ve been thinking that Liouville gravity is what governs biomass – it’s too amorphous for the standard metric. And due to finite speed of nerve transmission, perceived reality is Poincare. Pluralism again.

Speaking of programming, have you looked at the Gibbs project? matforge.org/gibbs. Still wide open at this stage.

Edgar, I can’t explain Matti’s take on Mersenne primes except historically; Mersenne (a Dominican monk) enticed Descartes back into science after he withdrew in solidarity with Galileo. He wrote a paper on harmonic theory and Pythagoras, which in a way anticipates principal quantum numbers. But Kahler manifolds were developed for x-ray crystallography, and the 1/r distribution of scattering was one thing we could agree on. Also Fourier found himself in a furious argument with Biot, in which they both contributed to the theory of heat conductivity. That’s where dimensionless ratios come in, and in that way pure numbers.

In general I don’t buy any of these catch-all geometry theories.

There’s a paper on arXiv by Welling, Andrew Gelfand and Ihler (no arXiv number given) on iterating calculations over clusters: so the clustering gives the dimensionality, as in Renormalization Group theory, Ising-Onsager-K. Wilson. So space-time geometry can only set constraints, which show up in Hamiltonians, like the zeta function Stephen noticed.

But anything that gets us back to the language of natural philosophy is hugely welcome. Sidney Coleman saw that a generation ago, and now there’s a whole Natural Philosophy Association, full of Victorian nostalgists. We live in a slightly mad world, like Ken Wilber says.

Stephen – I see the Gibbs Project is object-oriented. Inheritance between objects ain’t no simple tree-structure: it’s a Galois lattice, and that points to the Madrid conference spotted by Marni, where Hidekazu Furusho spoke on Galois action on knots, and showed that the Absolute Galois group is just the real (really knotty) number line.

Then you need the ladder to 2nd-order models, basically mathematical induction. And a way of managing the Frobenius problem, representations and number systems.

Now that’s enough to overload a quad-core desktop; this isn’t CAD/CAM anymore. I’ll tell you one thing: serious engineering has always been done on dedicated mid-range machines: old DEC VAX, through OpenVMS to Wolfram Alpha. Now you’re looking at Linux desktop clusters (under 100,000 dollars), or the new Intel 50-core database chip. This IA64 architecture, the road less traveled, which Oracle wanted to block. While the commodity markets are dug in on 32-bit. Sigh.

Also at Madrid: Oliver Bouillot on “equivalence between properties/conjectures concerning multizeta values and conjectures concerning multitangent functions,” shows the kind of conjectural space where Matti can’t decide anything. It was Descartes who went off on the proverbial tangent, and Gauss zeta’d astronomy, and after that only Moebius followed. That’s why I look to inside-out/outside-in construals.

Orwin: There’s no off-topic around here: I’ve been thinking that Liouville gravity is what governs biomass – it’s too amorphous for the standard metric. And due to finite speed of nerve transmission, perceived reality is Poincare. Pluralism again.

Me: I can’t help but get a creepy vibe when I perceive people and animals as just blobby metrics.. they are so much more than that :) Nice to see the structure underlying it.. what is the goal of understanding such things?

Orwin: – I see the Gibbs Project is object-oriented. Inheritance between objects ain’t no simple tree-structure: it’s a Galois lattice, and that points to the Madrid conference spotted by Marni, where Hidekazu Furusho spoke on Galois action on knots, and showed that the Absolute Galois group is just the real (really knotty) number line.

Me: cool.. I wish I knew a use for it.. my day job is beyond boring.. working on the high-frequency trading thing.. the martingale theories are useful to bridge the language gap it seems

Stephen, those Schramm-Loewner papers are a real eye-opener! The watersheds give you Conrad Waddington’s epigenetic landscape, and crossover statistics are the basic genetic statistics (due to crossover of DNA strands during cell division): and I mean the constant, relentless recombination of bacterial genes driving epidemics, not to mention ageing and the Medicare overhead.

That’s the foundation of theoretical biology as we know it!

So we head into a confrontation: the cellular automaton is a silicon fiction (also a computer virus template!), simply because there is no copy process in real cell division! And Stuart Kauffman’s “candidate law” of information in evolution fails badly just at the bacterial baseline, where epigenetics took off!

That also takes out Steve Wolfram’s New Kind of Science, and Gerard ’t Hooft’s take on new physics….

On “the language gap”: That opened after Carnap retreated from the catch-all spacetime realism of his Aufbau to the symbolic foundation in linguistics. But now engineering and thermodynamics are across the gap, leaving Academia playing high finance, while industry wilts.

And the new opportunity in Open Source publishing gets snapped up in France (Phil Gibbs’ next theme) – at least that drives some investment in Africa.

PS most monitors are set way too bright as I see it – and you have to hunt for the brightness control in the Display manager.

Orwin: Stephen, those Schramm-Loewner papers are a real eye-opener! The watersheds give you Conrad Waddington’s epigenetic landscape, and crossover statistics are the basic genetic statistics (due to crossover of DNA strands during cell division): and I mean the constant, relentless recombination of bacterial genes driving epidemics, not to mention ageing and the Medicare overhead.

That’s the foundation of theoretical biology as we know it!

Me: Thanks.. I had a feeling there was something to it..

Orwin: So we head into a confrontation: the cellular automaton is a silicon fiction (also a computer virus template!), simply because there is no copy process in real cell division! And Stuart Kauffman’s “candidate law” of information in evolution fails badly just at the bacterial baseline, where epigenetics took off!

That also takes out Steve Wolfram’s New Kind of Science, and Gerard ’t Hooft’s take on new physics….

Me: It seems that way, unfortunately.. I’ve had glimpses of these cellular automata before but they seem to be fractals that cycle around in some limit sets.. does the internet have anything to do with natural sunlight? What about the scattering of the particles as they travel thru the smog? Where does this oppressive feeling emerge.. as if gestalt itself is shifting.. too many people forming their worldview thru television?

Orwin: On “the language gap”: That opened after Carnap retreated from the catch-all spacetime realism of his Aufbau to the symbolic foundation in linguistics. But now engineering and thermodynamics are across the gap, leaving Academia playing high finance, while industry wilts.

Me: I’ve noticed sometimes in office environments there will be a sudden increase in what feels like gravitational pressure or waves, and people will suddenly clear their throats in unison or make exclamations of some sort.. this is usually correlated with the automated functioning of equipment such as air conditioners, and fax machines, copiers, etc. When I noticed this coupling between ‘environment’ and people/equipment it was rather disturbing to say the least.. what is one supposed to do, suddenly start yelling about the dangers of air conditioners or how they affect the climate and general electrical systems?

PS most monitors are set way too bright as I see it – and you have to hunt for the brightness control in the Display manager.

Me: yes, but they attract attention which might distract from good thoughts or flow of qi :)

The original Schramm-Loewner paper, “Scaling limits of loop-erased random walks and uniform spanning trees”: http://arxiv.org/abs/math.PR/9904022
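For anyone wanting to play with the object in that title: loop-erasure is a simple procedure, walk at random and whenever you revisit a site, delete the loop you just closed. A minimal, purely illustrative sketch on the Z² lattice (this is my own toy code, not the paper’s construction; the boundary set and seed are arbitrary choices):

```python
import random

def loop_erased_walk(start, target_set, seed=0):
    """Random walk on Z^2, erasing loops as they form, until target_set is hit."""
    random.seed(seed)
    path = [start]
    index = {start: 0}          # site -> its position in the current path
    x, y = start
    while (x, y) not in target_set:
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
        if (x, y) in index:     # revisited a site: erase the loop just closed
            keep = index[(x, y)]
            for site in path[keep + 1:]:
                del index[site]
            path = path[:keep + 1]
        else:
            index[(x, y)] = len(path)
            path.append((x, y))
    return path
```

By construction the returned path is self-avoiding; the Schramm-Loewner result is about what the scaling limit of such paths looks like.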

Also something neat in the LHC status report, the graph on page 16 of

http://lhc-commissioning.web.cern.ch/lhc-commissioning/news-2013/presentations/Week3/p-Pb%20Morning%20Meeting%2021Jan2013%20Jowett.pptx looks very much like the harmonic sawtooth or the Gauss continued fraction map http://vixra.org/abs/1202.0079 I wonder if that is just a coincidence :)

Me: Also something neat in the LHC status report, the graph on page 16 of

http://lhc-commissioning.web.cern.ch/lhc-commissioning/news-2013/presentations/Week3/p-Pb%20Morning%20Meeting%2021Jan2013%20Jowett.pptx looks very much like the harmonic sawtooth or the Gauss continued fraction map http://vixra.org/abs/1202.0079 I wonder if that is just a coincidence :)

If the radio frequency modulations are following such a pattern.. then what physical interpretations does the Mellin transform of the mapping function have?

http://arxiv.org/abs/0707.1203 Eigenfunctions of transfer operators and cohomology
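For concreteness, the Gauss continued-fraction map is h(x) = 1/x - floor(1/x), and iterating it reads off the continued-fraction digits of x, which is where the sawtooth-like picture comes from. A quick sketch (whether any of this really connects to that LHC graph is, as I said, pure speculation on my part):

```python
import math

def gauss_map(x):
    """Gauss continued-fraction map: h(x) = 1/x - floor(1/x), with h(0) = 0."""
    if x == 0:
        return 0.0
    return 1.0 / x - math.floor(1.0 / x)

def cf_digits(x, n):
    """First n continued-fraction digits of x in (0, 1), via the Gauss map."""
    digits = []
    for _ in range(n):
        if x == 0:
            break
        digits.append(math.floor(1.0 / x))
        x = gauss_map(x)
    return digits
```

For example, cf_digits(math.sqrt(2) - 1, 5) keeps returning 2s, reflecting sqrt(2) = [1; 2, 2, 2, …].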

Hi Stephen, I’ve been schlepping sysadmin and thinking about something strange that came to light on Frank Visser’s Ken Wilber critique site, integralworld.net. One Jose Alvarez Lopes Lorentz-contracted Newton’s gravitational constant and derived Einstein’s orbital equation! That’s gravity waves for real, then, never mind the cosmology.

http://www.integralworld.net/piacenza25.html

Stephen: a sudden increase in what feels like gravitational pressure or waves, and people will suddenly clear their throats in unison or make exclamations of some sort…

Me: And dogs play up like someone opened a door… when it’s actually a domain wall passing in the air, with a discontinuity in temperature and pressure… so the dogs are making tin gods of their owners, poor things. But the masses of air involved are not trivial.

Stephen: does the internet have anything to do with natural sunlight?

I used to tell these wire-bound hardware fitters: “Ethernet is a broadcast medium,” and grill them on packet radio. But now Microsoft have “technical issues” getting muzak all over to their XBox.

On the LHC backroom stuff: that’s where the action is all right, where the beam fades against entropy. But folk who take sunlight as illumination and luminosity as destiny will never see that.

There’s also some Born-Infeld gauge effect on Lopes’ trail, and Poincare en route from the Argentine back through France to Spain. I guess Verlinde must be right in some way, about gravity as an emergent information dynamic, which people long took to be spirit. We don’t really know enough to say they were wrong: what else can act meta-theoretically?

One special function that does not come within the hypergeometric envelope is Lambert’s W, which is just the exponential form of Shannon’s channel capacity. And traffic flows follow the Newton field in population mass….

Orwin: One special function that does not come within the hypergeometric envelope is Lambert’s W, which is just the exponential form of Shannon’s channel capacity. And traffic flows follow the Newton field in population mass….

Me: Now *that* is uncanny.. literally minutes before I read your post here mentioning the Lambert W function it popped up in my Maple session as the result of an inverse equation I was studying used to predict/forecast/omen the next point of a Hawkes process given its entire filtration.
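For anyone meeting Lambert W here for the first time: it is defined by W(y)·e^(W(y)) = y, and on the principal branch (y ≥ 0) you can evaluate it yourself with a few Newton steps. A minimal sketch, certainly not what Maple does internally:

```python
import math

def lambert_w(y, tol=1e-12):
    """Principal branch W(y) for y >= 0: solves w * exp(w) = y by Newton's method."""
    w = math.log1p(y)                    # reasonable starting guess for y >= 0
    for _ in range(100):
        ew = math.exp(w)
        w_next = w - (w * ew - y) / (ew * (w + 1.0))
        if abs(w_next - w) < tol:
            return w_next
        w = w_next
    return w
```

For instance lambert_w(1.0) gives the omega constant, about 0.567.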

Also, earthquake near me the other night http://www.kvue.com/news/30-magnitude-earthquake-rocks-North-Texas-187997471.html

the Hawkes process is used in seismic modelling…
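Since the Hawkes process keeps coming up: it is “self-exciting”, each event temporarily raises the rate of future events, which is why it suits aftershock sequences. A sketch of the conditional intensity with an exponential kernel (the parameters mu, alpha, beta below are invented for illustration):

```python
import math

def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.2):
    """lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i))."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)
```

A recent event contributes more than an old one, and the process only stays stable when alpha/beta < 1, i.e. each event spawns less than one expected “child” event.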

I’m tired of thinking about gravity.. just trying to survive, it seems to be a force of nature emanating from without.. strange stuff.. it’s begun to cause other people around me to take pause at their notions of traditional logic and they always assign the source of the phenomena to some higher power, a big defense contractor, the government, the power company, the internet company, the NSA, etc, but in reality hardly any of those groups understand what the hell is going on… too busy glued to their monitors… it’s strange happenings in the ambient field of the electric grid… quite creepy and it has something to do with tracking obviously but to what end…

I’m also fagged out at this point, but I’ve learned not to drop something before it’s finished.

On inverses: I was surprised to learn that the notorious Bayes theorem in probability is just the inverse theorem, giving Type II error (false negative), i.e. signal/noise ratio or resolution of the phenomenon. The logic here would extend to any signal.
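That “inverse theorem” reading can be made concrete: given a detector’s sensitivity (one minus the Type II, false-negative rate) and its false-alarm rate, Bayes’ theorem inverts P(detect | signal) into P(signal | detect). A toy sketch with invented numbers:

```python
def posterior_signal(prior, sensitivity, false_alarm):
    """P(signal | detection) by Bayes' theorem.

    sensitivity = P(detect | signal) = 1 - Type II (false negative) rate
    false_alarm = P(detect | no signal) = Type I (false positive) rate
    """
    p_detect = sensitivity * prior + false_alarm * (1.0 - prior)
    return sensitivity * prior / p_detect

# A rare signal with a decent detector is still mostly false alarms:
# posterior_signal(0.01, 0.9, 0.05) is about 0.15
```

So the posterior really is a signal-to-noise statement: the numerator is true detections, the denominator all detections.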

On tracking: as it happens the folks in wireless telescopes have just got to the point of worrying about thermal response in the chips, which work through extremes of desert temperatures. It would be cool if the nextgen telescopes could track passing air masses on the fly, ‘cos they matter for airline safety and efficiency. Folk in sound engineering know their signals are affected.

So in the grey dawn of a long night, we see the open horizon of knowledge. FQXi get to manage a funding round for information research, ‘cos it matters for these nextgen projects, and I guess that’s what Steve Wolfram is twitchy about, expecting the information paradigm to mature soon.

But what do the 1930s nostalgists at FQXi know about information, the quintessential 50s buzzword? Cripes, they’re on the wrong side of WWII for this! Marni says she’s put in a proposal, in the midst of generally giving up on such, but that’s really how it is with this stuff.

I’m afraid Marni said on her blog that she withdrew her FQXi grant proposal. I applied for one too. Probably I do not have sufficient credibility as an independent and won’t be able to find suitable referees, but my fourth prize in the related essay contest provides a small glimmer of hope.

Orwin: “I’m also fagged out at this point, but I’ve learned not to drop something before it’s finished.

On inverses: I was surprised to learn that the notorious Bayes theorem in probability is just the inverse theorem, giving Type II error (false negative), i.e. signal/noise ratio or resolution of the phenomenon. The logic here would extend to any signal.”

Me: That’s an interesting observation.. I saw you mentioned Jaynes in another thread.. I went on an adventure reading his work several years ago back in 2006 or so when I was dreaming up trading strategies.. I was actually trying to understand entropy in all its manifestations, whether it be stock price fluctuations or daily experience of weather, etc.. also read this book http://www.amazon.com/Probability-Schrodingers-Mechanics-David-Cook/dp/9812381910 which I found interesting.. I’ve found that reading this sort of material sometimes alienates oneself from the surrounding culture due to the rather radical paradigm shifts from common perception that it presents… such are the perils of understanding.

“On tracking: as it happens the folks in wireless telescopes have just got to the point of worrying about thermal response in the chips, which work through extremes of desert temperatures. It would be cool if the nextgen telescopes could track passing air masses on the fly, ‘cos they matter for airline safety and efficiency. Folk in sound engineering know their signals are affected.”

Me: I can see how that would be useful..sometimes I feel like I’m being caught up in a vast electronic dragnet with respect to the bits I send out into cyberspace..so many levels of feedback involved, and of course involving all the conscious persons who read the results of those bits I cast out :)

Back on the wild excursion/trail we were on, here is where the Lambert W function popped up in my current work just as you mentioned it here… http://vixra.org/pdf/1211.0094v8.pdf Equation 55 on page 10

Now I know this is far from ‘physics’ but is it really? Even if the LHC were not running, how would things be any different?

I remember now feeling pretty terrified when the LHC was ramping up with the perception that the extreme electrostatic tension involved was causing various flaky wiring to fail. The most dramatic was an antique tall-sailed ship in the Thames – and now G. Srinivasan is talking up that stuff, and I can see for sure that the stability of those tall ships involves forces ranging to windpower that accelerates particles beyond the ranges we can dream of to produce lightning-storms, with gamma-ray bursts that are contained only by the earth’s magnetic field.

As my main interest this year was the question of the foundations of physics- and I have also concluded we should question these ideas of a sort of quasi-independence of the topic of the winning paper, causality perhaps better understood or grounded.

But as for the idea of crackpots or the sane, a recent article in the science magazines on the origin of our multi-gene copies, as to the gift from the same gene of higher wisdom at the cost also of incidences of mental illness as species evolve- maybe that is the case for our theoreticians. That I addressed today in a post I called “Frankensouls” on pesla.blogspot.com

You might find it entertaining from someone outside the survey of publishing or not- I note that Peter Rowlands’ submissions were rejected by arXiv as his was on the foundations, and that later his book for sale was offered from the Scientific American book club.

The PeSla

“Most theories which posit dynamics that are nonlinear also allow superluminal signalling, in contradiction with relativity theory . . .”

This essay by Mr. Spekkens is very interesting, especially given his emphasis on the de Broglie moiety interpretation of QM. One thing that I find of relevance: superluminal signaling is not necessarily contradictory to special relativity! Professor Emeritus, Stanford University, William Tiller and his associate Walter Dibble demonstrate the contrary with their Psychoenergetic Science http://www.tillerfoundation.com/White%20Paper%20V.pdf and http://www.tillerfoundation.com/White%20Paper%20VI.pdf (I believe Mr. Tiller would also be an excellent addition to your series, “crackpots who were proven right”, in that he is dismissed by the orthodox community even though his theory is backed by peer-replicated double-blind experiment).

The other thing I find troubling about “foundational physics” is the absence of retro-causation. Consider this passage from Murray Gell-Mann’s book, The Quark and the Jaguar:

“If the future condition were not one of complete indifference, violations of causality would result and events would occur that were inexplicable (or at least exceedingly improbable) in terms of the past but were required (or nearly so) by the condition specified for the distant future. As the age of the Universe increased, more and more such events would occur.”

I regard this statement to be somewhat in error. Causality is not violated it just appears as though it is to us Information Gathering and Utilization Systems (IGUSes) because, like Alice, our memories only work one way. In other words, relative to our reference frame causality appears violated because it seems as though the consequent exists without an antecedent but, in actuality, the consequent precedes the antecedent. How so? Causality is propagating backwards in time!

Here’s the kicker: if both the initial condition and the final condition are other than indifferent, any theory describing the system in question must accommodate bi-directional causality! And why is it assumed that the final condition is indifferent anyway? People assume the final condition is indifferent because the first and second laws of thermodynamics seem to point in that direction; it’s taken for granted. But the final condition is not indifferent.

If one applies the proper amount of coarse-graining, one can view the state of the Universe at any given time as a temporal equilibrium between competing causalities: the “initial condition causality” (differentiation) which propagates forward in time; the “final condition causality” (integration) which propagates backwards in time. And this relates to the Anthropic Principle, one of the most inexplicable of phenomena. IGUSes exist because the final condition demands it; more importantly, the final condition is other than indifferent because the very propagation of perceptual reality demands it. If not for the existence of IGUSes entropy would rule the day; eventually perceptual reality would dissipate into nothingness – the Void/Plenum. And it seems to me that The Law of Accelerated Returns definitively points in the direction of a distinct final condition – the infinite integration of information (aka Singularity).

So returning to Mr. Spekkens, I wholeheartedly agree with his emphasizing causal structure over the separate entities, kinematics and dynamics, but his treatment of causal structure is limited; it should be expressed as bisimilar equivalence relations on ill-founded sets! But then I’m a “crackpot” myself . . .

I didn’t realize when I made the above comment that a Tobias and Bob Coecke had already pointed Mr. Spekkens in the direction of bisimulation in the comments section of the FQXi post. This was done way back in Oct. so I am very much behind the curve. Anyway, Bob Coecke directs Mr. Spekkens to Wiki and, while I truly appreciate Wikipedia, there exist much better resources. To any who may be interested I would direct your attention to a good paper on bisimulation: http://www.cs.unibo.it/~sangio/DOC_public/history_bis_coind.pdf. This paper examines the history of bisimilarity in the context of computer science, Modal Logic, and set theory.

Hi, I have considered such foundational things and even if the speed of light is violated there need be no contradiction to Einstein, just better definitions and generalizations. Yours is an interesting comment, and I quite imagine you will find my recent posts most interesting (for example I mention the various alternative cosmologies, such as the comment in these pages of a world of the vast string theory landscape, at each sub space such a vast landscape, and so on…) It is highly symbolic.

ThePeSla, in the three posts, broken up, called What is Our Most General and Wildest Conception of Space, at pesla.blogspot.com

@ Phil,

Frankly, I don’t follow your logic. There are plenty of unsettled foundational issues in HEP and cosmology crying for answers. True, the PI is 13 years old, but given the reputation and concentration of bright minds there, one would expect at least one hint of a compelling resolution to these issues. I am afraid that the evidence to date tends to point to the contrary.

There are indeed plenty of problems but very few convincing solutions from PI or anywhere else in recent years. It is not the only place with a concentration of bright minds. This is not the fault of PI. It is because the problems are very hard and there is little new experimental input to help. There have however been many good ideas that might lead to solutions one day, from PI and elsewhere, but until the full solutions are reached it is hard to say who is on the right track.

I don’t disagree with you. From what I’ve seen, I think that their research (particularly in QG) is not pointing in the right direction. Only time will tell.

If I write up something and arXiv.org judges it as “too speculative”, as sometimes happens, then I can at least put it on viXra.org, where I can direct others to go for a free look and download.

Thus, I would argue that in spite of the weird stuff that appears frequently at viXra.org, there is also scientific content there and it serves as a venue for alternative ideas that are considered too far out for arXiv.org (although the last contribution posted there by Don Page surely blurs the distinction!) In fact sometimes when I am perusing a HEP section I am not sure which site I am at. :)

Robert L. Oldershaw

Discrete Scale Relativity

Some of the FQXi Essays in the last contest are published in Prespacetime Journal 3(12). See http://prespacetime.com/index.php/pst/issue/view/35

Prespacetime Journal 3(13) is published today. This issue also contains one article based on FQXi Essay. See http://prespacetime.com/index.php/pst/issue/view/36

The next issue of PSTJ features Peter Rowlands’ work. See: http://prespacetime.com/file/FocusIssue.html

Phil, you’re up against empiricism, which is so unoriginal it’s a yawn. But Hans Reichenbach wrote it into the interpretation of relativity, and there’s a Reichenbach fad on. Meanwhile, kinematics/dynamics is just how Heisenberg introduced complementarity/uncertainty, and that’s a target for the wider fad in determinisms and Bohmian stuff. The kinematics/dynamics contrast goes back to Jean Buridan and Albert of Saxony in the late Middle Ages, and to take it out through POSETS is to take out the whole algebra of logic tradition. Well, C.S. Peirce worked that way, and the Canadian Simon Newcomb wrecked his career, so there may well be an old axe grinding behind the PI, which would be real bad for them going forward.

The Established way is now Russell’s logicism, so here goes:

The proposition of Russell is ambiguous!!!

Right, does that mean the general type (illustrated), or Russell’s analysis of the definite article ‘the’, or some specific philosophical stance, like Russell’s claim that Leibniz was hopeless and he could save philosophy???

go figure.

Hooray for viXra’s creation and existence! With the dismal output coming from the Perimeter Institute since its inception, the creation of viXra was a very welcome ray of hope for a self-taught physicist such as me. Fortunately after 30 years of hiding in the closet, I had the good fortune to go for the raw cash and thus be temporarily freed from the approval of that vast horde of thoroughly over-trained and brain-damaged physicists that submit to both arXiv and viXra and people the halls of the Perimeter Institute.

In your statistical comparison, did you include the 11 people who you said submitted to both arxiv and vixra? If so, could you redo your analysis with those 11 removed, because I think it confuses things. Then compare arxiv-only authors to vixra-only authors in the FQXi contest. Can you show those 2 graphs? Thanks.

They were included in both plots.

Eight of the eleven are in the top half so it would make the viXra results look a little worse (although only one is in the top bin). However these are still people who had to use viXra. In some cases they could use arXiv in the past but are now excluded because the endorsement system was introduced or for similar reasons.

It is a reasonable query to make but it would be a misrepresentation of viXra to show the plots without these people so I won’t do it.

The plot without those 11 would be interesting in addition to what you already have. It is a shame you won’t do it. It means you are trying to present the data in a way to optimally make a point, rather than just letting the data speak for itself. I suspect that removing the 11 is not a huge deal, but your refusal to do so, makes it really seem like you are trying to hide something.

Bob,

It is a shame of mainstream dictatorship, not vixra independent researchership.

LHC results will kill supersymmetry theory, will kill string theory, will kill big bang theory. Truth will prevail and scientific dictatorship will be down as Hitler was.

Bob, I have told you why I won’t do that plot. The 11 authors use viXra because of difficulties submitting to arXiv so they should be included. All information I have used is from public sources so I am not hiding anything. I don’t see why I should produce a plot that would be misleading and which others might copy without explaining why it is wrong.

That is fine by me. But you should know that it makes the results look very bad, since the plots are intentionally hiding some information, i.e., if someone looks at your plots and sees that vixra performed somewhat worse than arxiv, and also realizes that the data is distorted in vixra’s favor, then they will decide that it is REALLY bad for vixra.

Bob,

The next physics revolution will be initiated by independent researchers not by mainstream elite.

The FQXi rating is secretly run, corrupted, and distorted.

FQXI contest rating is meaningless.

Bob, there are always people who will look for the most negative side that best confirms their preconceptions. It is pointless trying to convince them otherwise.

Bob, what about three plots? For arxiv only, vixra only and arxiv+vixra?
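For what it's worth, the three-way split being asked for here is easy to sketch. The following is a minimal illustration only; the scores and field names are made up, since the real data would have to come from the FQXi community ratings and the arXiv/viXra author lists.

```python
# Hypothetical (score, submits-to-arXiv, submits-to-viXra) records.
# Real values would come from the public FQXi contest results.
records = [
    (6.1, True,  False),
    (4.2, False, True),
    (5.0, True,  True),
    (3.7, False, True),
    (5.8, True,  False),
    (4.9, True,  True),
]

# Separate the dual submitters so arXiv-only and viXra-only
# distributions can be plotted and compared independently.
groups = {"arxiv-only": [], "vixra-only": [], "both": []}
for score, on_arxiv, on_vixra in records:
    if on_arxiv and on_vixra:
        groups["both"].append(score)
    elif on_arxiv:
        groups["arxiv-only"].append(score)
    elif on_vixra:
        groups["vixra-only"].append(score)

for name, scores in groups.items():
    mean = sum(scores) / len(scores)
    print(f"{name}: n={len(scores)}, mean={mean:.2f}")
```

Each of the three lists could then be fed to a histogram routine to produce the three plots discussed above.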

It is a pity that viXra is censored out of Google Scholar. There must be very active and influential academic lobbyists behind this kind of decision. This shows how desperately some academic circles hate freedom of thought and new ideas. In any case, in my opinion viXra does a wonderful service for science during a period when big money dominates science and freedom of thought does not prevail.

Here’s an off-the-wall analogy version of what happened at Perimeter Institute.

100 Ferraris, Aston Martins, Shelby Cobras, Harley Davidsons, etc. with their engines racing but going nowhere because they are all up on blocks.

The blocks are the old paradigms of cosmology and particle physics, and no one is going anywhere until a new unified paradigm is identified and people get their intellectual wheels back on the solid ground of nature.

And as Einstein said quite seriously: “If at first an idea does not sound absurd, then there is no hope for it.”

Let’s repeat that: “If at first an idea does not sound absurd, then there is no hope for it.”

Time to consider radically different comprehensive paradigms, so long as they are derived from empirical evidence, can make definitive predictions and can PASS definitive predictions.

Robert L. Oldershaw

Discrete Scale Relativity

I should have added that initially sounding absurd is a necessary, but not sufficient, criterion for any idea that can claim to be a paradigm-changing idea.

It is also necessary that the idea is based upon empirical evidence, that it can lead to definitive predictions, and that it can pass tests of those definitive, non-adjustable predictions.

Science is not supposed to be facile or plastic.

RLO

DSR

Phil,

You say:

“This aspect of his essay is a perfect example of what my essay on causality is against. In my view the concept of temporal causality (every effect has a cause preceding in time) is not fundamental at all. It is linked to the arrow of time which emerges as an aspect of thermodynamics. It is not written into the laws of physics which as we know them are perfectly symmetrical under time reversal (or more precisely CPT inversion). I therefore question why it needs to be used in approaches to understanding the fundamental laws of physics. My point did not go down well with other contestants and Spekkens was not the only prize winner who advocated the importance of causality as something to preserve while throwing out other assumptions.”

I haven’t yet read your essay (I’m heading that way now) but your position as stated here strikes me as the position Stephen Hawking took with his Euclidean approach to quantum spacetime, and when he tried to demonstrate the quantum structure of spacetime as emergent using Feynman’s Path Integral it was a total failure. After Hawking, Hartle, and company abandoned the approach it was taken up by Renate Loll and company with great success. What did they change? They added gluing rules which establish a causal direction. They call their approach Causal Dynamical Triangulations . . . (http://arxiv.org/pdf/0711.0273v2.pdf, sorry, it’s an arXiv post). I don’t know that any of this really means anything but it seems to point in a promising direction.

The methods used by Hawking et al were based on semi-classical approximations and reduced systems of quantum cosmology. These were always incomplete and bound to come up against a wall at some point. It does not mean that the underlying ideas such as emergent causality were wrong.

Understanding emergent causality will probably require the use of the holographic principle and other ideas such as “complete symmetry” to explain why singularities impose low entropy constraints. With models that do not include such phenomena you can’t expect to succeed. CDT may be useful to model other aspects of quantum gravity, but by putting in the causal flow by hand it can never be a complete model in my opinion. These points were covered in my essay http://fqxi.org/community/forum/topic/1369

For me the most promising approach at this time is the work that has grown out of scattering amplitude calculations in SUSY. You can listen to talks by Nima Arkani-Hamed about how he thinks space, time, causality and locality emerge from the mathematics of permutations of states, e.g. https://www.youtube.com/watch?v=Qdkjn-YdlBU

Thanks for your response; I look forward to reading your paper! This certainly has been a popular post generating a good bit of discussion!

I understand and agree with what you’re saying about CDT; it seems to be unnecessary manipulation. However, you could think of it as the “active information” inherent in the Bohm/Hiley interpretation of QM. Of course “active information” implies Universal Mind, implies determinism, implies manipulation, whether necessary or unnecessary. My mind is open one way or the other . . .

Okay Phil,

I read your paper and found it serendipitous. Why? Recently I read a paper by the philosopher, Richard Lucido, called, Towards an Understanding of the Mutual Dependency of Consciousness and Matter (http://www.cejournal.org/GRD/lucido.pdf), in which Mr. Lucido suggests that consistency is an ontological force of nature. He calls it a force of constraint. One could quite easily view gravity and electro-magnetism as agents of consistency; they either motivate, impede, or some combination thereof, thus constrain. In other words, consistency, as a force, subsumes and, hence, is fundamental to, gravity and electro-magnetism.

In his paper, Mr. Lucido coins the word essesential (as opposed to existential) to describe entities whose existence is completely contained in their essence; such entities have no temporal extension. He uses as an example the number 3 but then he extends this idea to the elementary particles and, hence, matter. Basically what he is saying is that matter is virtual; it exists in essence but not in fact. Matter doesn’t become factual until acted upon by consciousness. Consciousness is existential in that it has temporal extension and no amount of finite information can capture its essence (due primarily to its temporal extension). Consciousness is perpetually incomplete.

I find Mr. Lucido’s argument to be rather brilliantly executed, but I feel he errs in his treatment of energy. He suggests that energy is essesential and I fail to see how this is so. He uses the analogy, 2 + 3 = 5, with the plus representing energy, and states:

“The logical consequence of essesential matter is the loss of direct physical causation from one elementary particle to another. These interactions of essences fall within the domain of mathematics and a priori reasoning, while causation is born from existence beyond essence, being over time. By itself, information cannot interact with another piece of information to cause a third piece of information to exist. The result of such purely essesential interactions could only be a potential growth of the total amount of information and not a causal transformation of it. This is because there is no intrinsic force in the essences themselves that causes, for example, 2 + 3 to become 5. All that can be said is that 2 + 3 = 5, but this is merely a relationship without causal direction. […]Has the 2 interacted with the 3 to become 5 yet? This is an absurd question. The physical concept of energy provides us no assistance in this dilemma. Energy is quantized, a form of essesential being like matter. Energy may exist as the plus between the 2 and the 3, and while in this analogy it may provide a general causal impetus, it in itself, being essesential, cannot force the 2 to interact with the 3 via the nature of its plusness and cause the 5 to have an objective existence, not just a potential one. This is exactly what is seen experimentally. The measurement problem in quantum physics is the logical result of essesential matter. Since all of a particle’s existence is bound up in its essence, there is none left to interact with other particles. This is meant quite literally. The solid existential presence of matter is what our everyday experience would lead us to believe moves causation. Since we know that matter is solely essesential it cannot have in itself that which makes causation possible, objective existence across time. This existence over time is the sole property of existential being.”

To me, force or energy is what transforms the 2 + 3 into the 5. When one accelerates particles in an accelerator and causes them to collide, it is energy or force which is the causal impetus at work. Does this render Mr. Lucido’s argument invalid? I say no and suggest that all of the forces in physics can best be understood as Conscious Intent! Ha, Ha, Ha . . .

Shortly after reading Mr. Lucido’s paper I wrote a paper of my own (since destroyed) in which I suggest extending Francisco Varela’s 3-valued system to a 5-valued system: the mark, the not-mark, self-reference, simultaneity, consistency. Simultaneity manifests as the inseparability of Hilbert space and was motivated by the work of mathematician Louis Kauffman in virtual logic (http://homepages.math.uic.edu/~kauffman/VirtualLogic.pdf). Mr. Kauffman has a great sense of humor! He demonstrates simultaneity with his Flagg Resolution which is basically a Global Rewrite Rule which resolves all paradox in virtual logic:

“We divide and propagate dichotomies: system/observer, body/mind, object/process leading to object, self/other, and more. Only in the unity of such dichotomies are there classical truth values. In the sides of the division or distinction there are valid imaginary truth values exemplifying the strong non-locality of the Flagg Resolution. The world is one, and when it is broken into distinct parts these parts can partake of classical logic only in the non-local structure of the whole. Does this sound like a discussion of non-locality in quantum physics? I believe that indeed it does. In splitting the world, in making a distinction, we can continue to use classical logic only in a context of non-local or global relations. […]With the Flagg Resolution we see that the modification of a simple law of substitution in logic/algebra is sufficient to create an algebraic notion of simultaneity. This simultaneity is not in a state of time (or space), but it is the precursor, or archetype of timespace.” (http://homepages.math.uic.edu/~kauffman/Flagg.pdf)

Another thing that I found interesting about your paper regards the following excerpt:

“Entropy is a macroscopic statistical quantity that is not reflected in the underlying physics. In fact, the underlying laws are reversible. If we started a simulated physical system with an imposed initial low entropy condition and allowed it to evolve forward in time, then entropy would be observed to increase, just as in real life. We can understand from the theoretical work of Boltzmann why that is. However, if we evolved the same system backwards in time from the same initial conditions we would find that entropy also increased going backwards in time, very unlike the real world.”
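The point in this excerpt can be illustrated with a toy simulation. This is a minimal sketch under simple assumptions (free particles in a unit box, coarse-grained position entropy), not a model taken from the essay: the dynamics are time-reversible, yet entropy increases in both time directions away from the low-entropy initial condition.

```python
import numpy as np

def coarse_entropy(x, bins=10):
    """Shannon entropy of a coarse-grained position histogram."""
    counts, _ = np.histogram(x, bins=bins, range=(0.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log(p)))

def evolve(x, v, steps, dt=0.01):
    """Free flight in a unit box with elastic wall reflections (reversible)."""
    for _ in range(steps):
        x = x + v * dt
        over, under = x > 1.0, x < 0.0
        x[over] = 2.0 - x[over]        # bounce off the right wall
        x[under] = -x[under]           # bounce off the left wall
        v = np.where(over | under, -v, v)
    return x, v

rng = np.random.default_rng(0)
n = 5000
# Imposed low-entropy initial condition: particles bunched near the centre
x0 = np.clip(0.5 + 0.01 * rng.standard_normal(n), 0.0, 1.0)
v0 = rng.standard_normal(n)

s_start = coarse_entropy(x0)
x_fwd, _ = evolve(x0.copy(), v0.copy(), 200)    # evolve forward in time
x_bwd, _ = evolve(x0.copy(), -v0.copy(), 200)   # "backward": reversed velocities

# Entropy rises in BOTH time directions from the low-entropy condition
print(s_start, coarse_entropy(x_fwd), coarse_entropy(x_bwd))
```

Running it shows the final coarse-grained entropy is far above the initial value in both runs, which is exactly the time-symmetric behaviour the excerpt describes.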

Consider an initial condition correlated with a final condition wherein both initial and final conditions are maximum in negentropy, in such a system maximum entropy would occur at the midway point of system evolution and there would be no transitional discrepancies! This relates to my comment above.

Possibly very interesting but there is little connection between my essay and Lucido’s aside from the mention of consistency. They are otherwise quite different.

Ah yes, I realize they are quite different; I just found the emphasis on consistency as fundamental in both papers to be somewhat serendipitous . . . or perhaps serendipitous is too strong of a word. I just thought it worth mentioning – obviously . . .

Ervin asked, “I may be missing something here, but can you reference any single research work produced by the Perimeter Institute that has convincingly solved an outstanding problem of contemporary theoretical physics?”

Joy Christian’s “Disproof of Bell’s Theorem by Clifford Algebra Valued Local Variables” is one that I can think of, produced while he was a long-term visitor at PI.

http://arxiv.org/abs/quant-ph/0703179

It seems that Einstein was right after all. Dr. Christian is also the most active FQXi member on their blogs. For those interested, more can be read in his book,

http://www.brownwalker.com/book.php?method=ISBN&book=1599425645

Well… work is still being done to convince everyone but it is just a matter of time.

Fred,

Bell’s theorem undoubtedly plays an important role in understanding the foundations of Quantum Mechanics.

But my question was directed at the large list of unsettled challenges facing the Standard Model of particle physics and cosmology. For instance, the origin of neutrino masses and mixing, the Dark Matter and Dark Energy puzzles, the source of CP symmetry breaking and baryon asymmetry, the fine-tuning problem of the Higgs sector, the mass hierarchy problem, the source of anomalous magnetic moments of charged leptons and the unknown connection of the Standard Model to the low-energy manifestations of Quantum Gravity.

Hi Ervin,

Perhaps some of those “unsettled challenges” can be answered now that we have a more complete model of physical behavior thanks to Dr. Christian. As Dr. Christian says in the conclusion of the first chapter of his book, “In our view these correlations are the evidence, not of non-locality, but the fact that the physical space we live in respects the symmetries and topologies of a parallelized 7-sphere.”

If true, which I believe it is, Christian’s work is ground breaking new physics.

The gospel story of Jesus turning water into wine is a “proof” that Cinderella’s godmother could turn a pumpkin into a coach. Christian’s disproof of Bell’s theorem only remains a ground breaking revolution on the FQXI blog.

I tend to agree with you on this. It would be nice if you’d find the time to write a paper for, e.g., Prespacetime Journal, on your criticism of Christian’s “proof.”

In the spring of 2011 I commented on Joy Christian’s page posted on the FQXI blog. This was carried on by Florin Moldoveanu, who took this much further. He even posted a couple of short papers on this

http://arxiv.org/abs/1107.1007

http://arxiv.org/abs/1109.0535

Joy Christian uses geometric algebra and the definition of bivectors to construct his “disproof” of the Bell theorem. However, in doing this there is a sign change that occurs for no mathematical reason. It is a case of the old cartoon with a derivation on a blackboard where in the middle is “and then a miracle occurs.” All the rest of his stuff with the parallelizable 7-sphere and the rest amounts to additional stories built on top of a house already built on sand.

To “give the devil his due,” it has occurred to me that this might have something to do with decoherence and the reduction of a quantum state to a statistical outcome. Joy claims to reduce QM to a purely statistical ensemble. Yet there is no good reason to think this pertains to a pure quantum state. However, the sign change might be some aspect of how decoherent sets are established. I have, though, not bent metal on this, and frankly if I were to I imagine he would swoop in with all of his vituperative comments. In fact I give him half a chance of popping up here; he has quite an internet radar that detects negative assessments of his “disproof.”

To be honest, Joy Christian is not the most amiable person to encounter. Any question or objection that is raised is met with lots of ad hominem retorts. However, since he advanced this in 2007 and there have been no “takers,” I think it is fair to say this is a dead theory. Nobody from Anandan, to Fuchs, or heavy hitters like Witten has given a thumbs-up on this. For myself I would just as soon avoid the controversy that would come from writing such a paper. Also I would have to review all of this, which would take time that I really do not have. It would have all the pleasantry of kicking a porcupine, and to be honest, as far as I see things this is a dead issue, even though Joy keeps populating the arXiv with papers on this.

Mr. Crywell ,

Congratulations on winning the FQXi essay contest. You deserve it more than anyone else since—as you often point out—you are smarter than Ed Witten. You even knew what the error was in von Neumann’s theorem before Bell did. Not only that, you do not even have to read my reply to Moldoveanu

http://arxiv.org/abs/1110.5876

to know who is right and who is doing you-know-what. After all, you are in possession of truth itself. What is more, his prowess and brilliance in the foundations of quantum mechanics are no less than your own. And even though Anandan passed away many years ago, you know that without his approval there can be no truth in my argument. I must wait until he reincarnates and gives his approval to my disproof. And of course you are hobnobbing with all those big shots in the foundations of quantum mechanics all the time, so you know, I mean really really KNOW, that no one has given their approval to my disproof. You, Mr. Crywell, know it all.

Since this is degenerating into sarcasm and I am sure you two have had this discussion before I call a halt there thanks.

Lawrence said, “However in doing this there is a sign change that occurs for no mathematical reason.”

The sign change is physical, not necessarily of mathematical origin. I think everyone could or should agree that many things in nature have a 50-50 chance of being either left- or right-oriented upon creation, handedness-wise. Dr. Christian is simply exploiting the physics of Nature in his model.

What you say segues into my comment above about decoherent sets. Quantum mechanics is stochastic in the prediction of a measurement outcome. However, quantum mechanics is not a stochastic physics. The wave equations are all perfectly deterministic; the evolution of a quantum wave is completely determined by the initial conditions and unitary evolution. What is not deterministic is the outcome one obtains in a measurement.

The random assignment of those signs in the product expressions appears most likely to be a manifestation of the reduction of a quantum wave function. I don’t see it as appropriate for quantum physics and the evolution of a pure state. If one assigns these signs + or – by the flip of a coin this appears to reflect how a measurement changes a pure state into a mixed state or statistical distribution given by the diagonal of the reduced density matrix. The assignment of those signs appears to be something introduced from outside, which physically is the act of a measurement.

Crowell,

For heaven’s sake read the first two papers of Bell’s book before embarrassing yourself any further. I have never taken you seriously because—as I have repeatedly pointed out to you—you are clueless about foundations of quantum mechanics. What you are missing is a basic understanding of what is meant by a hidden variable in Bell’s local-realistic framework.

As if any of you seem to understand the deeper meaning of Bell’s theorem from, say, a topological view… Why pick sides in the quantum interpretation, which some say we do not or cannot understand, when from my more general view both stances do not have enough useful speculation, wisdom, or information. The great insight from the LHC is that we tend to discuss what is really outside the scope of our grounding possibilities by experiment, as what happens beyond the {particle} limits of all information is the same, thus conveying no useful information, or all information, in what we contain as random chaos. Life is somewhere in between. Things go a little deeper than what would be a modest breakthrough in physics such as the “seven-sphere,” an idea beyond which our alternative physicists, happy in their own frontiers, have already gone.

The PeSla (making a general comment here too)

I am not a researcher on quantum foundations particularly. I have of course studied Bell’s theorem and related matters. I do have suspicions about this “disproof,” and the mathematics does not appear to fit right. In addition the reviews of this “disproof” have largely been negative, where while I prefer not to appeal to authority it can at times have weight.

Hi, I know we list home page sites here but I would like to view anything you may have on line. I for one think Bell’s theorem has to be better understood and generalized for it can seem like a disproof. I have recent posts of a more philosophic nature of which that and not the issues of science may be referred to and not some idea of pumpkins and wine changes of state. I always catch hell for saying we need more than quantum foundations as well as anyone I have seen who tries to apply it to the scale of our brains.

The Pe Sla

Crowell,

There does not exist a single published criticism of my disproof. The only published work on my disproof is this one:

http://lccn.loc.gov/2012001131

I do not consider Internet gossip by some unqualified and uninformed individuals a “negative review.” On the other hand, I consider the opinion of someone as knowledgeable and prominent as Lucien Hardy a very positive review. Moreover, I would not care about your opinion except for the fact that you have been spreading unjustified negative rumors about my work on the Internet for a long time now, without understanding the first thing about either Bell’s theorem or my disproof.

I checked that link but could not see your work. I too once suggested something be published in the Library of Congress, but Bush declassified such knowledge in the name of its business potential, so who knows how much of any idea is known by the best of our schools and research institutions, most likely related to the military. A theory cannot be judged any better by experts; thus we see the great service of the subject of this debate. Let us not load the jury of peer review with those we consider as our peers. Such an appeal to authority is emotional, maybe… Thank you (from someone outside academia with no formal credentials, but I did expect more from our universities).

The PeSla

Hi Fred,

I am highly skeptical that a conclusion such as “physical space we live in respects the symmetries and topologies of a parallelized 7-sphere” can lead to a satisfactory resolution of the deep challenges I alluded to.

Please keep in mind that the Standard Model is a successful theory whose predictions have been confirmed to a dramatic level of accuracy. Coming up with any viable extension of this model is far from being trivial.

Hi Ervin,

The prospects for SUSY (and string theory) are looking bleaker every year that goes by now, so some new direction to proceed upon is sorely needed. Perhaps you might be interested in this discussion on FQXi between Joy Christian, Michael Goodband and others.

http://www.fqxi.org/community/forum/topic/1352#post_70909

Plus I highly recommend reading Michael’s paper on vixra in addition to his FQXi essay which can be found at the above link.

http://vixra.org/abs/1209.0034

Some more groundbreaking new physics, IMHO. Both of which pretty much accommodate the Standard Model as is.

Fred,

As much as I’d like to stay open-minded, I fail to see how the cited paper can convincingly resolve the serious challenges of the Standard Model and duplicate its predictive power.

Let’s agree to disagree.

Cheers,

Ervin

Ervin, you are aware that any quotient of S3xS5 via a U(1) action has the isometry group SU(3)xSU(2)xU(1), aren’t you? I mean, it is very different to be skeptical while knowing this fact and to be skeptical of this kind of fact.

Alejandro,

It seems to me that we ought to discuss these issues somewhere else. The conversation is sliding too much off-topic.

Cheers,

Ervin

Actually, this is not only a very good grasp of topology and traditional ideas of Euclidean geometry; 7 is only the beginning of a long series of such hidden dimensions and symmetries. How our modern physicists use groups to pigeonhole their strung-together theories tells us nothing new really, beyond group theory itself.

Phil,

Many thanks for your ongoing efforts in maintaining vixra. It would be personally demanding at times, I’m guessing. Your work is much appreciated. Thank you

Dirk

the eternal quest between viXra and arXiv….

yes yes while arXiv is for professionals viXra is for amateurs etc etc

but sometimes an amateur can exceed the knowledge of the established professionals

an example: Oliver Heaviside

in the 19th century Heaviside exceeded his contemporary physicists… transplanting both arXiv and viXra to the 19th century, guess where Heaviside would appear

quoting from Wikipedia

http://en.wikipedia.org/wiki/Oliver_Heaviside

————————————————————————–

Oliver Heaviside (18 May 1850 – 3 February 1925) was a self-taught English electrical engineer, mathematician, and physicist who adapted complex numbers to the study of electrical circuits, invented mathematical techniques for the solution of differential equations (later found to be equivalent to Laplace transforms), reformulated Maxwell’s field equations in terms of electric and magnetic forces and energy flux, and independently co-formulated vector analysis. Although at odds with the scientific establishment for most of his life, Heaviside changed the face of mathematics and science for years to come.

———————————————————————–

one of his contributions

the Heaviside step function

http://en.wikipedia.org/wiki/Heaviside_step_function

quoting from Wiki again

—————————————————————-

The Heaviside step function, or the unit step function, usually denoted by H (but sometimes u or θ), is a discontinuous function whose value is zero for negative argument and one for positive argument. It seldom matters what value is used for H(0), since H is mostly used as a distribution. Some common choices can be seen below.

The function is used in the mathematics of control theory and signal processing to represent a signal that switches on at a specified time and stays switched on indefinitely. It is also used in structural mechanics together with the Dirac delta function to describe different types of structural loads. It was named after the English polymath Oliver Heaviside.

It is the cumulative distribution function of a random variable which is almost surely 0. (See constant random variable.)

The Heaviside function is the integral of the Dirac delta function: H′ = δ.

—————————————————————-
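As a small aside, the quoted definition is easy to play with numerically. Here is a minimal sketch using NumPy's built-in `heaviside`, whose second argument is the convention chosen for H(0):

```python
import numpy as np

# H(x) = 0 for x < 0 and 1 for x > 0; the value at 0 is a
# convention, here the common choice H(0) = 0.5
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
H = np.heaviside(x, 0.5)
print(H)  # values: 0, 0, 0.5, 1, 1

# Used as a switch-on signal: zero before t0 = 0.5, one after,
# staying on indefinitely (here with the convention H(0) = 1)
t = np.linspace(0.0, 1.0, 5)
signal = np.heaviside(t - 0.5, 1.0)
print(signal)  # values: 0, 0, 1, 1, 1
```

The second print illustrates exactly the "switches on at a specified time and stays switched on" usage described in the quote.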

normally professional scientists from the mainstream of the academic community do not make errors in their investigations

while amateurs are liable to do so…

normally…

however !!!

however, recently one of the world’s top-class institutions with dozens of professional scientists shook the world with something that never happened… a broken fiber-optic cable ruined an experiment: the so-called “superluminal neutrino”

and their work has been on arXiv since the first version… and the so-called “superluminal neutrino” never happened!!!! The first version of an arXiv paper describes something that never happened!!

so the general feeling that arXiv papers are correct and viXra papers are wrong is not entirely accurate

professional scientists hesitate to submit to viXra because they fear losing academic prestige etc etc etc

as for me viXra is an excellent way to communicate scientific knowledge…

if a guy sends a wrong paper to viXra proving that 2 + 2 = 5, the “guilt” cannot be attributed to viXra… it must be the responsibility of the author, not viXra

viXra is a vehicle….an excellent vehicle by the way

professional scientists reading viXra papers would perhaps spot the “2 + 2 = 5” paper and would conclude that all viXra authors are cranks and viXra is a low-quality e-print server and all papers are wrong etc etc etc

not like that..not really like that

Fernando,

what does it mean, beyond this world of a sort of certain 1 + 1 = 2 arithmetic, if we can sum up fields of numbers such that 2 + 2 = 5 but the extra one is hidden, as some space structures and physics structures can be imagined to so comply?

the example 2 + 2 = 5 was to illustrate that if a paper is wrong the responsibility is the author’s, and the e-print server cannot be blamed for this.

…I am tired of seeing people blame viXra for the poor quality of some papers… but there are others of very good quality… and I am not speaking about my own papers

imagine that you have 2 dollars or 2 euros or 2 sterling pounds or 2 yen… or 2 yuan or 2 pesos

2 dollars + 2 dollars = 4 dollars …no hidden dollars in the subspace physical structures….

if you go to a supermarket in the US and an article costs 5 dollars and you have 4 dollars, no hidden space structure will provide the extra dollar

:-) we can just print more money, the conservation of the class of a number of units involved may never reach zero up or down as we seem to respect the limit of continuous compound interest.

The difference between 2^n euros and 10 euros is outside our normal concept of counting and the equivalences of bases. We after all do not dwell in the overwhelmingly likely transcendental universe.

The Pe Sla

[...] Our comparison of essays by viXra authors submitted to the 2012 FQXi essay contest which were independently rated, showed that the distribution of scores was similar to the overall distribution from all authors of which about a third were professional scientists who submit to arXiv. [...]

Happy New Year viXra!

Thank you for your portal and that opportunity you give independent researchers in advancing their research!

Also want to thank The Foundational Questions Institute (FQXi) for the opportunity to participate in the competition FQXi Essay 2012!

One proposal to improve the evaluation system: add a table to the ratings introducing the main ideas of the participants, and evaluate all essays on their new ideas, not just the first 36 essays.