A lot of people like to worry about the measurement problem and wave function collapse in quantum mechanics. How can a physical outcome depend in such a fundamental way on how we observe it? Many of us have been happy to accept that quantum mechanics works as described and that no real paradoxes arise from its interpretation, but ever since Einstein challenged Bohr’s Copenhagen interpretation, a minority of physicists from Bohm to ‘t Hooft have tried to find a hidden variable explanation that avoids the philosophical problems. Even if you don’t worry about such things, the maths and physics behind such ideas can be quite interesting. Last week I came across the paper arXiv:1103.6058 by Waegell and Aravind that relates a proof of the Kochen-Specker theorem to the 24-cell. The Kochen-Specker theorem is a no-go result for hidden variable theories, and the 24-cell is a unique mathematical structure that comes up in the context of systems of qubits, as I discussed just recently.

Any hidden variable theory must avoid certain no-go theorems of this type. The most well known is Bell’s inequality, which follows from the assumption of locality and has been shown to be violated in experiments. This is consistent with quantum mechanics but rules out local hidden variable theories. It is quite a strong refutation of Einstein’s philosophical stance against quantum mechanics, because he had a strong belief in locality that followed from his work on relativity, where he established the principle that no signal can be sent faster than light. It turns out that quantum mechanics does not violate this principle in the practical sense, yet it is formulated in such a way that wave-function collapse describes an apparent non-local effect. To some physicists including Einstein this seemed philosophically unsatisfactory. Too bad! Sometimes the universe does not respect our philosophical preferences, and this is one of those times. The experimental verification of the violation of Bell’s inequality shows that we have to accept non-locality in some form.

But non-locality is not the only philosophical objection that physicists have. The bigger problem is that the rules of wave-function collapse depend on what is measured. This seems to give observers a special role in the laws of physics, but there is no good way to define what an observer is from first principles. In a hidden variable theory we would postulate the existence of state variables with definite values that cannot be easily seen but which determine the outcome of quantum mechanical measurements. So could there be a hidden variable theory of quantum mechanics which is non-local but where no variable depends on the context of the measurement? The Kochen-Specker theorem, proved by Simon B. Kochen and Ernst Specker in 1967, rules out such a theory, and it does so in a very interesting way.

The original proof was quite complex, but in 1991 quantum information expert Asher Peres gave a simpler proof. Although he did not mention it at the time, his proof relies on the symmetry of the root system of the exceptional Lie algebra F4. This comprises 48 vectors in 4D space which can be interpreted as the vertices of a 24-cell and its dual. You don’t need to know anything about root systems or Lie groups or the 24-cell to understand the proof, so don’t be put off.

Each root vector is paired with its negative to define a line through the origin in 4D space. These 24 lines are the 24 rays of Peres. I listed these points in my previous post on the 24-cell, but here again are the 24 rays with numbers as in the new paper so that I can refer to them.

 1 | (2,0,0,0)  |  2 | (0,2,0,0)   |  3 | (0,0,2,0)   |  4 | (0,0,0,2)
 5 | (1,1,1,1)  |  6 | (1,1,-1,-1) |  7 | (1,-1,1,-1) |  8 | (1,-1,-1,1)
 9 | (-1,1,1,1) | 10 | (1,-1,1,1)  | 11 | (1,1,-1,1)  | 12 | (1,1,1,-1)
13 | (1,1,0,0)  | 14 | (1,-1,0,0)  | 15 | (0,0,1,1)   | 16 | (0,0,1,-1)
17 | (0,1,0,1)  | 18 | (0,1,0,-1)  | 19 | (1,0,1,0)   | 20 | (1,0,-1,0)
21 | (1,0,0,-1) | 22 | (1,0,0,1)   | 23 | (0,1,-1,0)  | 24 | (0,1,1,0)

Suppose these (when normalised) are 24 quantum states |ψ_{i}> in a 4-dimensional Hilbert space, e.g. it might be a system of two qubits. For each state we can define a projection operator

P_{i} = |ψ_{i}><ψ_{i}|

These are Hermitian operators with three eigenvalues of 0 and one of 1. They can be considered as observables, and we could set up an experimental system where we prepare states and measure these observables to check that they comply with the rules of quantum mechanics. One thing that we observe is that there are sets of 4 operators which commute because the 4 rays they are based on are mutually orthogonal. An example would be the four operators P_{1}, P_{2}, P_{3}, P_{4}.
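None of this has to be taken on trust: the rays and projectors are easy to construct numerically. Here is a minimal sketch in Python with NumPy (the variable names are mine, not the paper’s) that builds each normalised projector from the table above, confirms the eigenvalue pattern (0,0,0,1), and checks that P_{1}..P_{4} commute.

```python
import numpy as np

# The 24 rays, in the numbering of the table above.
rays = [
    (2,0,0,0), (0,2,0,0), (0,0,2,0), (0,0,0,2),
    (1,1,1,1), (1,1,-1,-1), (1,-1,1,-1), (1,-1,-1,1),
    (-1,1,1,1), (1,-1,1,1), (1,1,-1,1), (1,1,1,-1),
    (1,1,0,0), (1,-1,0,0), (0,0,1,1), (0,0,1,-1),
    (0,1,0,1), (0,1,0,-1), (1,0,1,0), (1,0,-1,0),
    (1,0,0,-1), (1,0,0,1), (0,1,-1,0), (0,1,1,0),
]

def projector(v):
    """P = |psi><psi| for the normalised state |psi>."""
    psi = np.array(v, dtype=float)
    psi /= np.linalg.norm(psi)
    return np.outer(psi, psi)

P = {i + 1: projector(v) for i, v in enumerate(rays)}  # 1-based labels

# Each projector is Hermitian with eigenvalues (0, 0, 0, 1).
for Pi in P.values():
    assert np.allclose(Pi, Pi.T)
    assert np.allclose(np.sort(np.linalg.eigvalsh(Pi)), [0, 0, 0, 1])

# P_1..P_4 are built on mutually orthogonal rays, so they commute.
for i in range(1, 5):
    for j in range(1, 5):
        assert np.allclose(P[i] @ P[j], P[j] @ P[i])

print("all projector checks pass")
```

The same loop will verify any of the other commuting quadruples if you swap in their labels.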

We know from the theory of quantum mechanics that if we measure these observables in any order we will end up with a state which is a common eigenvector i.e. one of the first four rays. The values of the observables will always be given by 1,0,0,0 in some order. This can be checked experimentally. There are actually 36 sets of 4 different rays that are mutually orthogonal, but we just need nine of them as follows:

{P_{2}, P_{4}, P_{19}, P_{20}}

{P_{10}, P_{11}, P_{21}, P_{24}}

{P_{7}, P_{8}, P_{13}, P_{15}}

{P_{2}, P_{3}, P_{21}, P_{22}}

{P_{6}, P_{8}, P_{17}, P_{19}}

{P_{11}, P_{12}, P_{14}, P_{15}}

{P_{6}, P_{7}, P_{22}, P_{24}}

{P_{3}, P_{4}, P_{13}, P_{14}}

{P_{10}, P_{12}, P_{17}, P_{20}}

At this point you need to check two things: firstly, that each of these sets of 4 observables is mutually commuting because its rays are orthogonal; secondly, that only 18 distinct observables appear, each in exactly two of the sets.
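Both checks are mechanical, so here is a short sketch (Python again, reusing the ray numbering from the table; the names are mine) that verifies them: every pair within each of the nine quadruples is orthogonal, and exactly 18 distinct rays appear, each in exactly two sets.

```python
import numpy as np
from collections import Counter

# The 24 rays, in the numbering of the table above.
rays = {
    1: (2,0,0,0), 2: (0,2,0,0), 3: (0,0,2,0), 4: (0,0,0,2),
    5: (1,1,1,1), 6: (1,1,-1,-1), 7: (1,-1,1,-1), 8: (1,-1,-1,1),
    9: (-1,1,1,1), 10: (1,-1,1,1), 11: (1,1,-1,1), 12: (1,1,1,-1),
    13: (1,1,0,0), 14: (1,-1,0,0), 15: (0,0,1,1), 16: (0,0,1,-1),
    17: (0,1,0,1), 18: (0,1,0,-1), 19: (1,0,1,0), 20: (1,0,-1,0),
    21: (1,0,0,-1), 22: (1,0,0,1), 23: (0,1,-1,0), 24: (0,1,1,0),
}

# The nine sets of four rays listed above.
sets = [
    (2, 4, 19, 20), (10, 11, 21, 24), (7, 8, 13, 15),
    (2, 3, 21, 22), (6, 8, 17, 19), (11, 12, 14, 15),
    (6, 7, 22, 24), (3, 4, 13, 14), (10, 12, 17, 20),
]

# Check 1: within each set, every pair of rays is orthogonal.
for s in sets:
    for a in s:
        for b in s:
            if a != b:
                assert np.dot(rays[a], rays[b]) == 0

# Check 2: 18 distinct rays, each appearing in exactly two sets.
counts = Counter(i for s in sets for i in s)
assert len(counts) == 18
assert all(c == 2 for c in counts.values())

print("orthogonality and double-counting checks pass")
```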

Imagine now that there is some hidden variable theory that explains this system and reproduces all the predictions of quantum mechanics. At any given moment the system would be in a definite state, and the values of each of the 18 operators would be determinate, even if it is hard for us to see what they are directly. The values must be 0 or 1, but they must also comply with the rule that exactly one observable in each of the nine sets equals 1; the other three values in each set will be 0. So there must be nine values set to 1 overall. But this is impossible! Each observable appears in exactly two sets, so whichever observables take the value 1, the total count of 1s over all nine sets will always be even, and nine is not even. This proves the Kochen-Specker theorem. (This version of the proof using only 18 of the 24 vectors is a later refinement due to Kernaghan, Cabello, Estebaranz, and Garcia-Alcaine; see the paper linked above for references.)
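If you don’t trust the parity argument, you can simply brute-force it: the sketch below (plain Python, using the same nine sets as above) tries every possible 0/1 assignment to the 18 observables and confirms that none gives exactly one 1 in each set.

```python
from itertools import product

# The nine sets of ray labels from above; 18 distinct labels appear.
sets = [
    (2, 4, 19, 20), (10, 11, 21, 24), (7, 8, 13, 15),
    (2, 3, 21, 22), (6, 8, 17, 19), (11, 12, 14, 15),
    (6, 7, 22, 24), (3, 4, 13, 14), (10, 12, 17, 20),
]

labels = sorted({i for s in sets for i in s})  # the 18 observables
assert len(labels) == 18

# Exhaustively try all 2^18 = 262144 hidden-variable assignments.
solutions = 0
for values in product((0, 1), repeat=18):
    v = dict(zip(labels, values))
    # A valid assignment puts exactly one 1 in every set of four.
    if all(sum(v[i] for i in s) == 1 for s in sets):
        solutions += 1

print(solutions)  # 0: no assignment works, as the parity argument shows
```

The search takes well under a second and finds nothing, which is exactly the Kochen-Specker contradiction.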

The conclusion is that if you want to believe in hidden variable theories, you had better find a way of implementing one that does not comply with the assumptions of this theorem. It is not enough to look for non-local hidden variable theories. You have to avoid the conditions that the variables have definite values or that they are independent of the context of the measurement. For my money that takes you away from the philosophical objections that hidden variable theories are supposed to answer, but if you want to dispute that you can do it here :).

Is the philosophical non-locality problem perhaps also a problem of time?

If we think of time as time quanta with a size of 5.39121×10^-44 s and couple this to the locality quantum of 1.616252×10^-35 m, an unobserved wave exists in all these quanta. Once observed (or can observed also mean that some interaction has taken place?) the wave function collapses at this very moment of time and the observed particle occupies X volumes of space quanta (depending on the kind of particle).

Because we cannot intervene with (observe?) ONE space or time quantum, the wavefunction is easily understood. There the analog essence of our 4-D Universe becomes apparent.

Nice post. These arguments seem to implicitly assume one is doing quantum mechanics over division algebras. There, one generally can decompose rank one operators into an outer product of a pure state and its dual. However, once one works over split composition algebras (i.e., the split octonions), there exist rank one operators which do not take this form. Such operators make the manifold structure of the corresponding projective spaces non-trivial.

Quantum mechanics over split composition algebras has already found applications in toroidal compactifications of M-theory (M-theory on T^k, k=5, 6, 7, 8), most notably in Duff’s qubit/extremal black hole correspondence (arXiv:1002.4223 [hep-th]).

The split composition algebras may seem like strange beasts, but they actually are special cases of the complex composition algebras. For instance, Ferrara, Gunaydin and Duff have investigated “magic N=2” supergravity theories based on the division algebras which so far do not have a clear M-theory interpretation (arXiv:0809.4685 [hep-th], arXiv:0712.2976 [hep-th]). However, using split composition algebras gives rise to N=8 supergravity theories that do have an M-theory interpretation. One can relate these two types of supergravities, at least mathematically, by constructing black hole charge spaces using the full complexified composition algebras. Then, one can construct U-duality transformations that can transform between black hole solutions in the two kinds of theories.

So if M-theory turns out to be the correct theory of quantum gravity, nature would appear to have more philosophical challenges up its sleeve.

So, QM is non-local. But classical phenomena act locally in the vast majority of circumstances. I wonder if there is a good heuristic way to quantify the extent to which non-locality exists and matters, in some way that would add more insight and understanding to the phenomena beyond the standard formulations of uncertainty and entanglement, and perhaps tie in the outside-the-light-cone contributions to the Feynman propagator.

I wonder if it is possible to articulate both of these as a measure of the extent of the non-locality of the topology of time-space itself, rather than as a property of interactions.

I really enjoyed this post, fascinating stuff! For another (non-group theoretic, but remarkable) perspective on the Kochen-Specker theorem check out Isham and Butterfield’s Topos theoretic proof:

http://arxiv.org/abs/quant-ph/9803055

For the latest developments in this direction see:

http://arxiv.org/abs/1102.2213

and for a review see:

http://arxiv.org/abs/0803.0417

There is in my opinion a “stringy” interpretation of this. F_4 is the isometry group of the projective plane over the octonions. There are extensions to this where the bi-octonions CxO have the isometry group E_6, HxO has E_7 and OxO has E_8. This forms the basis of the “magic square.” F_4 plays a prominent role in the bi-octonions, which is J^3(O) or the Jordan algebra, as the automorphism which preserves the determinant of the Jordan matrix.

The exceptional group G_2 is the automorphism group of O, or equivalently F_4xG_2 defines a centralizer on E_8. The fibration G_2 –> S^7 is completed with SO(8), where the three O’s satisfy the triality condition in SO(8). The G_2 fixes a vector basis in S^7 according to the triality condition on vectors V \in J^3(O) and spinors θ in O, t:Vxθ_1xθ_2 –> R. The triality group is spin(8) and a subgroup spin(7) will fix a vector in V and a spinor in θ_1. Having fixed a vector, spin(7) acts transitively on the 7-sphere with spin(7)/G_2 = S^7, giving the dimensions

dim(G_2) = dim(spin(7)) – dim(S^7) = 21 – 7 = 14.

The G_2 group in a sense fixes a frame on the octonions, and has features similar to a gauge group. The double covering so(O) ~= so(8) and the inclusion g_2 \subset spin(8) determines the homomorphism g_2 hook–> spin(8) –> so(O). The 1-1 inclusion of g_2 in so(O) maps a 14 dimensional group into a 28 dimensional group. This construction is remarkably similar to the moduli space construction of Duff et al.

The so(9) is not the most general symmetry of J^3(O). There exists a permutation on the three scalars z_0, z_1, z_2 and {\cal O}^3. This means there is an additional automorphism so(3). The more general automorphism is then F_4. The quotient between the 52 dimensional F_4 and the 36 dimensional so(9) ~ B_4 defines the short exact sequence

F_4/B_4:1 –> spin(9) –> F_{52\16} –> {\cal O}P^2 –> 1,

where F_{52\16} means F_4 restricted to 36 dimensions, which are the kernel of the map to the 16 dimensional Moufang or Cayley plane OP^2. Geometrically the F_4 defines the symmetry of the 24-cell, called the icositetrachoron or polyoctahedron, according to 24 octahedral cells. The B_4 also defines a more restricted symmetry on the 24-cell according to 16 tetrahedral cells and 8 octahedral cells. The 8 octahedral cells define the {\bf 8}_0, or so(8) in the J^3(O), while the 16 tetrahedral cells are mapped to the OP^2. This means on the algebraic level f_4 ~= so(8)(+)V(+)θ_1(+)θ_2 [here (+) = \oplus], which explicitly describes the triality condition among the three octonions with the so(8). More generally according to octonions f_4 ~= so(O)(+)O^3, and f_4 diagonalizes the Jordan cubic matrix.

The 24-cell has the largest group representation F_4 in 52 dimensions, of which the SO(9) in 36 dimensions defines a short exact sequence between spin(9) and the Moufang plane OP^2. B_4 ~ SO(9) defines the symmetry of the 24-cell by 16 tetrahedral and 8 octahedral cells. The exceptional Jordan matrix is composed of elements V_{ab} which are accompanied by 16 superpartners θ_{ab}, where the indices a and b indicate internal elements which transform these elements to N\times N matrices in SU(N). The SU(N) may be contained in the HxO, with E_7 structure, for N = 8, where SU(8) exists in the 64 dimensional quaternion-octonionic space. This obtains for a single D-brane, in particular here a D0-brane, where for N > 1 this gauge group is SU(8)^N, or the embedding group SU(8N). The Lagrangian assumes the form

L = (1/2)(tr(∂_μV_i)^2 – (1/2g)tr[V_i, V_j]^2 – 2{θ-bar}_iγ_j[θ^i, V^j]),

where the integer indices i, j denote the matrix indices. Here the superpartners to the vectors V transform as spinors under the SO(9) transverse rotations, and the matrices V_{ab}, θ_{ab} (vectors and spinors in J^3(O)) are components in a 10 dimensional super Yang-Mills space. This Lagrangian is applied as the SO(9) theory in the BFSS.

The 36 sets of 4 mutually orthogonal rays are contained in F_{52\16} above. The short exact sequence defines the f_4 ~= so(9)(+)S^9. This means the K-S theorem is a consequence of qubit structure which has the Eguchi-Kostant isomorphism with black holes.

Nice post and good questions asked.

http://www.nature.com/nature/journal/v449/n7158/abs/nature06118.html

http://iopscience.iop.org/1367-2630/12/2/025019

and many, many more.

Quantum information science involves the storage, manipulation and communication of information encoded in quantum systems, where the phenomena of superposition and entanglement can provide enhancements over what is possible classically.

no time? transferring and entangling the quantum information between memories that may be separated by macroscopic or even geographic distances.

no c? instantly? conditional quantum gates through a photonic channel, impossible at large distances (Tiller – http://www.tillerfoundation.com/Tiller%20-%20Human%20Psychophysiology%20Macroscopic%20Information%20Entangle.pdf)

2D sheets? a dedifferentiation?

http://www.articlesbase.com/nature-articles/quantum-entanglement-spooky-action-at-a-distance-4523339.html

entangled qubits = to know without measurement

There are some departures between the theoretical and experimental concepts of entanglement and in particular the teleportation of states. A part of this departure is due to the measurement issue.

“Any hidden variable theory must avoid certain no-go theorems of this type. The most well known is Bell’s inequality, which follows from the assumption of locality and has been shown to be violated in experiments. This is consistent with quantum mechanics but rules out local hidden variable theories.” – Phil

Please see Caroline H. Thompson, “Subtraction of ‘accidentals’ and the validity of Bell tests”, http://arxiv.org/PS_cache/quant-ph/pdf/9903/9903066v2.pdf:

‘In some key Bell experiments, including two of the well-known ones by Alain Aspect, 1981-2, it is only after the subtraction of ‘accidentals’ from the coincidence counts that we get violations of Bell tests. The data adjustment, producing increases of up to 60% in the test statistics, has never been adequately justified. Few published experiments give sufficient information for the reader to make a fair assessment.’

Dr Thomas Love, at California State University, kindly emailed me a preprint stating:

‘The quantum collapse [in 1st quantization QM, where a wavefunction collapse occurs whenever a measurement of a particle is made] occurs when we model the wave moving according to Schroedinger (time-dependent) and then, suddenly at the time of interaction we require it to be in an eigenstate and hence to also be a solution of Schroedinger (time-independent). The collapse of the wave function is due to a discontinuity in the equations used to model the physics, it is not inherent in the physics.’

Feynman’s path integral explanation of the double slit experiment in his 1985 book QED, firmly refutes any role of the observer in collapsing a wavefunction: the sum of histories for the two paths the light photon takes, each with a different action and thus different phase factor, results in the differing probability for the photon to arrive at any given place. Each path action depends on the diffraction of the paths of the photon by slit edges, which provides the randomness. There is no effect from an observer:

‘Light … uses a small core of nearby space. (In the same way, a mirror has to have enough size to reflect normally: if the mirror is too small for the core of nearby paths, the light scatters in many directions, no matter where you put the mirror.)’

– R. P. Feynman, QED, Penguin, 1990, page 54.

He also debunks 1st quantization, since the uncertainty principle must in 1st quantization (where the field is classical) be intrinsic to a particle’s position and momentum, rather than just describing the randomness of field quanta interactions with on-shell particles like orbital electrons. Feynman paints a picture of an electron behaving chaotically in the atom due to the fact that the field is composed of quanta, so that the Coulomb force binding the electron into its orbit of the positive nucleus is not an unvarying constant, but fluctuates as individual field quanta are exchanged at random. This is what produces indeterminacy.

Richard P. Feynman, QED, Penguin, 1990, pp. 55-6, and 84:

‘I would like to put the uncertainty principle in its historical place: when the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas … But at a certain point the old fashioned ideas would begin to fail, so a warning was developed that said, in effect, “Your old-fashioned ideas are no damn good when …”. If you get rid of all the old-fashioned ideas and instead use the ideas that I’m explaining in these lectures – adding arrows [arrows = path phase contributions of exp(iS) in the path integral] for all the ways an event can happen – there is no need for an uncertainty principle! … on a small scale, such as inside an atom, the space is so small that there is no main path, no “orbit”; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference [by field quanta] becomes very important …’

This isn’t “hidden variables” or an “alternative theory”, it’s quantum field theory! Either people accept quantum fields, which doesn’t do away with QM because the quanta of the fields produce all the observed chaos (like air molecules producing Brownian motion on observed pollen fragments), or they don’t! You can’t have your cake and eat it. 2nd quantization means we can get indeterminacy from quantum field randomness, with no need for the intrinsic use of the uncertainty principle.

The ignorance of this by the Board of Physical Review A was quoted in their 2004 email to Caroline Thompson (University of Wales):

“In 1964, John Bell proved that local realistic theories led to an upper bound on correlations between distant events (Bell’s inequality) and that quantum mechanics had predictions that violated that inequality. Ten years later, experimenters started to test in the laboratory the violation of Bell’s inequality (or similar predictions of local realism). No experiment is perfect, and various authors invented ‘loopholes’ such that the experiments were still compatible with local realism. Of course nobody proposed a local realistic theory that would reproduce quantitative predictions of quantum theory (energy levels, transition rates, etc.). This loophole hunting has no interest whatsoever in physics.”

Bell’s inequality implicitly assumes 1st quantization! Nobody has ever proved that there is any observer produced effect: it’s quite the opposite. It’s the observer who creates the illusion of an effect by misunderstanding physics, mistaking the wavefunction for a classical effect.

In both 1st and 2nd quantization, the “wavefunction” of a single particle is directly proportional to one phase amplitude exp(iS), but while 1st quantization uses only that factor, in 2nd quantization the field itself is quantized so you have to sum multiple contributions of exp(iS) from every field quantum that contributes to an event!

Therefore, the “wavefunction” is only intrinsically indeterminate in 1st quantization; in 2nd quantization the indeterminacy is supplied by the path integral, the summing of the many varying phases that add up with coherence or interference to produce the overall wavefunction. The wavefunction for a result from a path integral doesn’t depend on the observer, but on each of the discrete field interactions that contribute!

http://arxiv.org/abs/1104.0807

Yuri: thanks for that reference to “The Anna Karenina principle”. That paper is superficial, since the problem of physics goes much deeper, to the choice of careers in the first place.

It was Mach who first tried to rule out anything looking like “hidden variables” (after Maxwell’s light-carrying aether was falsified by the Michelson-Morley experiment of 1887). Then you have “principles” like relativity replacing mechanisms altogether.

Then you have Rutherford’s letter to Bohr, pointing out that Bohr’s atomic electron doesn’t “know where to stop” when it radiates, so Bohr’s atom is wrong or incomplete. Bohr then tries to ban such “objections” to theories, by inventing the correspondence principle that “nobody understands quantum mechanics”.

Finally, John von Neumann comes up with a proof that no hidden variables theories can exist, Einstein fails to convince Bohr of the EPR paradox, and “hidden variables” (or any mechanisms in QM) look very unfashionable if not wrong. At Pocono in 1948, Bohr actually had the stupid arrogance to tell Feynman that path integrals made no sense because of the uncertainty principle:

“Bohr … said: “… one could not talk about the trajectory of an electron in the atom, because it was something not observable.” … Bohr thought that I didn’t know the uncertainty principle. … it didn’t make me angry, it just made me realize that … [ they ] … didn’t know what I was talking about, and it was hopeless to try to explain it further.”

– Feynman quoted in Mehra’s 1994 Feynman biography, pages 245-248, http://www.tony5m17h.net/goodnewsbadnews.html#badnews

Compare that arrogant attack to Bohr’s own very similar flogging by Rutherford in 1916:

“There appears to me one grave difficulty in your hypothesis which I have no doubt you fully realize [conveniently not mentioned in your paper], namely, how does an electron decide with what frequency it is going to vibrate at when it passes from one stationary state to another? It seems to me that you would have to assume that the electron knows beforehand where it is going to stop.”

– Rutherford to Bohr, 20 March 1913, in response to Bohr’s model of quantum leaps of electrons which explained the empirical Balmer formula for line spectra. (Quotation from: A. Pais, “Inward Bound: Of Matter and Forces in the Physical World”, 1985, page 212.)

http://www.sciencedaily.com/releases/2011/04/110405084252.htm

Thanks for all the deep comments everyone. It will take me a few weeks just to digest some of them.