OPERA fail to find error in Faster Than Light Measurement

The OPERA experiment has failed to find an error in their measurement of neutrino speeds that shows them travelling faster than light. The earlier result was most strongly criticized for the statistical nature of the measurement, which involved fitting the timing profile of many events observed at Gran Sasso to the known shape of long-duration proton pulses at the source at CERN, 730 km away. This objection has now been quashed by using much shorter pulses, so that the effect can be seen with just a few neutrinos. While the previous measurement used data gathered over three years, this new confirmation took just a few weeks.
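
For readers who want to see what that statistical fit amounts to, here is a minimal sketch in Python. The pulse shape and event times are synthetic and highly idealized; this is an illustration of the method, not OPERA's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized 10,500 ns proton extraction with 500 ns rise and fall edges
# (purely illustrative; the real SPS waveform has far more structure).
t = np.arange(0.0, 10500.0)                      # ns
template = np.clip(np.minimum(t / 500.0, (10500.0 - t) / 500.0), 0.0, 1.0)
pdf = template / template.sum()

# Synthetic neutrino arrival times: drawn from the waveform, 60 ns early.
events = rng.choice(t, p=pdf, size=16000) - 60.0

def nll(shift_ns):
    # Negative log-likelihood of the events under the template moved by shift_ns.
    idx = np.clip(np.round(events - shift_ns).astype(int), 0, len(t) - 1)
    return -np.log(pdf[idx] + 1e-12).sum()

shifts = np.arange(-100, 101)
best = shifts[np.argmin([nll(s) for s in shifts])]
print(f"fitted offset: {best} ns")               # recovers roughly -60 ns
```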

The crucial new plot is this one:

The timing of the neutrinos is spread over a 50 ns window but is still clearly different from the zero mark that would be consistent with travel at the speed of light. The spread could be due either to inaccuracies in the timing or to differences in the speeds of the neutrinos themselves, if the effect is real. An interesting question is whether there is any correlation between the timing offset and the energy of the neutrinos, and I don't know if they have that data.
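
If per-event energies were available, the simplest version of that check would be a rank correlation between arrival-time offset and energy. A sketch with made-up numbers, since no such per-event data has been released:

```python
import numpy as np
from scipy import stats

# Hypothetical per-event values, invented purely for illustration.
offset_ns = np.array([48, 55, 70, 62, 51, 66, 59, 73, 45, 68,
                      57, 61, 52, 74, 49, 64, 58, 71, 54, 67])
energy_gev = np.array([14, 21, 9, 17, 25, 11, 19, 8, 28, 12,
                       20, 16, 23, 7, 26, 13, 18, 10, 22, 15])

rho, p = stats.spearmanr(offset_ns, energy_gev)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
# With only ~20 events, only a strong energy dependence would stand out.
```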

In fact this spread is the most exciting part of the new result. As far as I know it is bigger than the known systematic errors. If there were an unknown systematic error in the measurement of the distance between the experiments or in the timing, we would expect it to be constant. Here I am assuming that atomic clocks were used at each end to keep the timing stable, rather than constant referral to GPS time, which could vary. If that is the case then the spread actually rules out several other sources of systematic error.

Factoring in my prior probabilities from preconceived theoretical prejudices, I can now say that the probability of the result being correct has increased from 1 in a million to one in 100 thousand (numbers are illustrative :) ). This is sufficient to convince most of the collaboration to sign the paper, which may now go forward to peer review. To convince more theorists they may need to do more checks on the result. The strongest criticisms will now fall on the use of GPS. To eliminate this they should check the timing and the distance calculation independently. The timing could be checked by flying a portable atomic clock from CERN to OPERA and back at high speed on a helicopter to calibrate the clocks at either end. Portable clocks can be stable to within a few nanoseconds over the course of a day, so it should be possible to carry out this check with sufficient accuracy, and it would not be too expensive. The distance measurement also needs to be repeated, preferably using old-fashioned surveying techniques rather than GPS between the two locations.
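
For the curious, the odds-updating arithmetic behind those illustrative numbers is just this (the factor-of-ten Bayes factors are as made up as the probabilities):

```python
def update(prob, bayes_factor):
    """Convert a probability to odds, apply a Bayes factor, convert back."""
    odds = prob / (1.0 - prob) * bayes_factor
    return odds / (1.0 + odds)

p = 1e-6             # prior from preconceived theoretical prejudice
p = update(p, 10.0)  # the short-pulse rerun removes the statistics objection
print(p)             # ~1e-5, i.e. one in 100 thousand
p = update(p, 10.0)  # hypothetical: the clock and survey checks also pass
print(p)             # ~1e-4, i.e. one in ten thousand
```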

If this also fails to find an error, then the probability of the result being correct goes up to one in ten thousand. The next most likely source of error would be the timing measurement for the collisions that generate the neutrinos at CERN. This involves some electronics with a lag that may not be precisely known. To eliminate this they would possibly need to build a near detector to catch neutrino events in the path of the beam near CERN. If the beam is everywhere deep underground this could be an expensive addition to the experiment, but it would be a very significant check, taking the probability of the result being real up to one in 100 or better, depending on what other possible sources of error might be left.

To really confirm that neutrinos are faster than light requires confirmation from other labs using measurements that could not be subject to correlated errors. Hopefully this will arrive next year.

For more details see TRF where this was reported two days ago based on a tachyonic version of twitter, or AQDS using conventional light-speed technology from arXiv. The official press release is here.

Update: People are telling me that the timing calibration has already been done. An update from Dorigo makes some interesting points, including the fact that the timing depends on a 20 megahertz clock signal, whose 50 ns period explains the spread of the measurements over 50 ns. In fact it means that the time offset must be very sharp, which is not such good news. It makes a constant systematic error seem much more likely.
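
To see how a 50 ns clock period smears a perfectly sharp offset, here is a toy illustration (the detail that timestamps are floored to the preceding 50 ns tick is my assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
tick_ns = 50.0                               # period of a 20 MHz clock
true_offset = 62.0                           # ns, perfectly sharp by assumption

arrivals = rng.uniform(0.0, 1e6, size=20)    # true event times within the spill
stamped = np.floor((arrivals + true_offset) / tick_ns) * tick_ns
print(np.round(np.sort(stamped - arrivals), 1))   # offsets smeared across ~50 ns
```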

I think another essential upgrade to the experiment would be to record a timestamp for events with nanosecond accuracy.


101 Responses to OPERA fail to find error in Faster Than Light Measurement

  1. Simplicity says:

    This is very interesting information. Thanks to Dr. Gibbs for that. And it has been stunning to see how differently we picture the possible cause behind the possible FTL neutrinos. And I am truly shocked by how advanced all your thinking on this matter is.

    I myself think that we cannot yet know if the neutrinos really are faster than light on the path between CERN and OPERA until we are able to send a light pulse in vacuum along exactly the same path, and then measure how fast that light pulse moves in that local place, measured by a clock also placed at the same local place.

    Here is a little more information about my general view. I think that reality is only 3-dimensional, with no time. That there is only one 3-D space that we are moving within. No past, no yesterday, no now, no tomorrow, no future. Only space where we move in 3 dimensions. If I move faster relative to an observer at rest in this space then it will only affect, and slow, the motion/spin of the elementary particles in my body relative to the observer.

    See my TOE here:

    http://www.vixra.org/pdf/1010.0035v4.pdf

    My greatest thanks to Dr. Gibbs for letting me archive my article on viXra. I know that most of you will laugh at my article and find that most of my views and math are wrong. This I respect and understand. But maybe some of the content can be of non-zero interest to the community.

    After some more motions/spins of our bodies' particles, the atomic clock will advertise Saturday, and I hope that you all then will have a nice weekend :-)

  2. Nessuno says:

    Dear Philip,
    as they can identify from which 3 ns pulse each of the 20 neutrinos is coming, they should look at whether there is any correlation between the arrival time mismatch and a) the position of the pulse in the SPS proton train, b) the time interval since the last resynchronization of the OPERA detector (they said they resynchronize every second?), c) any other possible source of time error drift inside the detector, (and d) perhaps the position in the detector where the neutrino is detected, even if the size of the detector is maybe small compared with the effect they see). Also, do you know what error they claim on the individual neutrino arrival time?
    Best regards

  3. Luboš Motl says:

    Dear Phil, using tachyons, I also received the future v2 version of the paper which is not available via arXiv yet because they crazily posted the same v1 version of the paper as v2 as well although they changed the abstract. ;-) But I detected the neutrinos coming from the future and reconstructed the future paper by OPERA. See TRF.

  4. Stan says:

    Hi Phil,

    The original paper (and presumably the new version) already includes a timing cross-check with atomic clocks transported between CERN and Gran Sasso by the Federal German Metrology Institute. They found a systematic offset of 2.3 +/- 0.9 ns between the atomic clock standard and the stationary GPS-sync’ed atomic clocks used as time standards at each site.

    As for the baseline distance cross-check, I don’t think there are any survey techniques besides GPS that can be used over 750 km with the required accuracy. The most likely issue in the baseline measurement would be in the traditional survey that had to be used to find the position of the underground detector hall relative to the GPS anchor point on surface. That is a hard measurement to make since there is no line-of-sight between surface and underground, so many segments have to be carefully sighted and combined.

    (I was entertained to hear the suggestion after the OPERA colloquium at CERN that they should drill a small hole from surface down to the detector hall so they can laser survey the location relative to the GPS receiver station. The speaker felt that this was impractical for a number of reasons.)

  5. Joerg says:

    Just in case: Here is the PTB report about the time synchronization:

    http://operaweb.lngs.infn.it/Opera/publicnotes/note134.pdf

  6. Philip Gibbs says:

    Thanks for the correction regarding the time measurement. I am not convinced that a direct distance survey is not possible, but I can't find any info on that. If anyone knows, please do tell.

  7. ohwilleke says:

    Suffice it to say that it appears possible to rule out many potential sources of error with some modest adjustments to the experiment and a replication that shouldn’t be hard to set up.

    We can dramatically improve the certainty of the measurement within a couple of years.

    If that confirms the result, I have to think that the more sensible approach is to look for some effect that is causing the speed of light in the most sensitive experimental measurements of light to be slightly lower than “c” due to some systematic effect found in all of the most accurate measurements, of the order of the refractive index of air (e.g. impacts from Earth and solar magnetic fields, low densities of ions in the extreme Earth atmosphere, GR considerations), and that the neutrino measurement is the true Lorentz transform constant “c”, rather than resorting to tachyonic theories or the like.

  8. Ervin Goldfain says:

    Among other systematics, Tommaso cites on his blog a potentially large source of timing errors due to variations in the optical properties of the optical fiber (40,000 ns offset!).

    It is pretty common to register individual variations of both refractive index and dispersion along and across optical fibers. These must be carefully accounted for. The temperature also needs to be carefully monitored. Internal stressing of the fiber (or microcracks in the core) that develops over time can also be troublesome.

  9. ProfChuck says:

    When the Michelson–Morley experiment failed to disclose the presence of “luminiferous ether” the physics community began a scramble to account for the findings. It took Einstein to unravel the mystery, and a new era of physics was born. As a physicist I relish the idea that we may be witness to such a revelation in our lifetimes.
    A review of the most recent publications from the OPERA experimenters makes it clear that they take the potential consequences of their discovery very seriously. The rigor with which they are examining their experiment in a search for sources of error should serve as an illustration of how good science is done. If these findings are real, and it looks more and more like they are, is there an Einstein lurking in the wings who can make sense of it all?

  10. Ervin Goldfain says:

    If all systematic and statistical errors are ruled out, a field-theoretic solution for the OPERA anomaly is to consider that single-flavor ultra-relativistic neutrinos and photons behave as components of the same gauge doublet. The benefit of this approach is that Lorentz symmetry is automatically preserved regardless of the neutrino beam energy or its relative velocity:

    http://prespacetime.com/index.php/pst/article/view/288

    But this is just a speculation at this time…

  11. A naive question from a science writer: rather than drilling a hole or making other potentially expensive changes to the experiment, would it be easier to build a bare-bones new detector further downstream, where the neutrino beam resurfaces before going out into space?

    I imagine that having a detector on the surface would make estimating the distance a lot easier. Sure, the signal-to-noise ratio might get worse because of cosmic rays and other stuff. But for the purpose of picking up a few neutrinos for a velocity measurement, perhaps that would not matter too much?

    • Kea says:

      Davide, the difficulty is that the beam only points in a certain direction, and the beam is necessary to obtain a statistically significant number of events. Now it is unlikely that a new LHC type accelerator will ever be justified. Some people are thinking of a space based detector … now that could work.

      • Oh, but I wasn’t suggesting making any changes to the accelerator or to the beam. What I was saying is that the beam, which follows a straight line from the LHC to the underground lab, at some point must come out of the earth and then head into space. In principle one should be able to put a detector right there on the surface where the beam comes out.

        And yes, I imagine a satellite in geostationary orbit would work too, but by then the beam would be spread very thinly, wouldn’t it? (Who suggested that idea?)

      • Kea says:

        But that point on the surface is fairly close to Gran Sasso, so it wouldn’t help.

        As for the space based detector, now I can’t find it anywhere, but what I imagine is this: it would have to be a large array of strings, like IceCube or the ocean detectors. I think the Earth’s gravity could be used to position the base of each string, and the difficult engineering is in the stable holding grid at the top. Without thinking about it much, I’m guessing this would have to be at least 1km square (disadvantage of vacuum compared to ice/water, but we don’t aim for the same capabilities as the Earth based detectors).

      • My point was not to have a detector at much greater distance, just to have a detector on the surface, because it might be easier to accurately measure its distance from CERN.

      • Kea says:

        The problem is the lack of shielding from cosmic rays, which is why underground detectors are usually used. But I still think it is possible …

      • Kea says:

        … I mean, the space based scenario.

  12. So now, on page 15 of the report, we have the case of the miraculously accurate 100 MHz ethernet oscillator. AZSquib, posting here, and Marc Bevand, on his blog, have spotted this issue. At a minimum, something is unexplained here.

    http://blog.zorinaq.com/

    http://www.azr2.com/user/14

    • AZSquib says:

      I cannot find any documented accuracy for the local 100MHz clock sources used as the time bases for the fine counters in the OPERA Detector FPGAs on the Ethernet Controller Mezzanine (ECM) boards that are part of the DAQ system located with the Front End (FE) cards of the Target Tracker (TT). The best source of information I could find on this clock source is from reference [27] in the paper:
      “The OPERA global readout and GPS distribution system” by J. Marteau

      http://arxiv.org/abs/0906.1494

      As I understand from this paper, the fine counters in the FPGAs are slaved to the OPERA master clock, which increments a “coarse counter” in each FPGA every 0.6 seconds (the 1PPS signal). This 1PPS OPERA master clock signal that is distributed to the TT FE ECM cards is the one that is meticulously checked and time stamped against GPS, so its accuracy and position in time look to me to be well understood and accounted for. The fine counters are reset every 0.6 s by the arrival of the master clock signal that also increments the coarse counters. These two counters together provide the time stamp of the arrival of a detected event. The paper meticulously accounts for the delays in resetting the fine counter and the quantization error of the FPGA’s 100 MHz clock (10 ns period) appropriately (24.5 +/- 1 ns).
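
      Schematically, my reading of that description is the timestamp assembly sketched below (my own Python sketch, not code from the DAQ; treating the 24.5 ns reset delay as a simple additive correction is my guess):

      ```python
      # Schematic OPERA event timestamp assembly, as I read the description
      # above. The structure here is my inference, not documented DAQ code.
      PPS_PERIOD_S = 0.6      # master-clock interval that resets the fine counter
      FINE_TICK_NS = 10.0     # one period of the local 100 MHz clock
      RESET_DELAY_NS = 24.5   # documented fine-counter reset delay (+/- 1 ns)

      def event_timestamp_ns(coarse_count, fine_count):
          """coarse_count: 0.6 s master-clock ticks since the epoch;
          fine_count: local 100 MHz ticks since the last PPS reset."""
          return (coarse_count * PPS_PERIOD_S * 1e9
                  + fine_count * FINE_TICK_NS
                  + RESET_DELAY_NS)

      # Example: two coarse ticks plus 12,345 fine ticks into the cycle.
      print(event_timestamp_ns(2, 12_345))   # 1200123474.5 ns
      ```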

      However, I don’t see the local 100 MHz clock source described, its accuracy or compensation described, or this clock accounted for in any of the systematic error calculations.

      I had correspondence with J. Marteau on this subject and he stated: “Every distributed sensor received a 20 MHz clock derived from a 10 MHz oscillator (TEMEX 4842) with a very high frequency stability. The 100 MHz clock on each sensor is generated from this 20 MHz clock using a local PLL and therefore we are not adding any more delays and/or drifts in the local timing. So this local clock is in permanence locked on the precise oscillator, not relying at all on cheap quartz oscillators.” This is not described in any paper that I can find, but I could believe it except that it contradicts the following from his own paper, which states that the 20 MHz clock is only used at the “clock master cards” for distribution of the 1PPS signal to the individual TT FE ECM sensor interface cards:

      “The clock distribution system starts from the GPS control unit which synchronizes a 20 MHz clock with the signal of the GPS and sends the clock + encoded commands via an optical fibre. The signal is then converted into electrical format and distributed to the “clock master cards” through M-LVDS bus. Each of these cards deserialize the commands and the clock, and distribute both of them to the clock unit of each controller board through another M-LVDS bus. The PPS sent to each sensor is used to reset the local fine counters after correction of the time propagation delay. Since the PPS transmission time is recorded and locked with the GPS, the absolute event timestamp is assigned as the sum of that time and the value of the local 100 MHz clock running on the ECM.”

      This seems to again point to a local 100 MHz oscillator and perfectly explains resetting a fine counter upon arrival of the PPS signal. This re-synching of a local clock source to the PPS prevents any long term drift, but local short term drift exists, if it really is a local clock and not derived from the 20 MHz OPERA master clock (not documented).

      Clearly, over enough time (say, the 0.6 s of the 1PPS period sent to the sensors according to the documented descriptions), there is no clock drift from the OPERA master clock, so the overall accuracy is that of the master clock. However, for each and every event recorded, the local clock drift of the “fine counters” will come into play since their last re-sync with the master clock.

      After reading about the OPERA DAQ design, it is my understanding that the Target Tracker is based on a quantity of 1,984 of the 32-channel ROC ASICs and that there are two such devices used to readout each PMT. It is further my understanding that there is one FPGA on the ECM per two ROC ASICs, so there are 992 FPGAs and therefore there are also 992 local 100 MHz clocks.

      From the physical photograph of the ECM in the J. Marteau paper, it appears to me that the local 100 MHz clock MIGHT be derived from the quartz crystal oscillator seen in the photo. J. Marteau stated it is not, but I am not convinced based on the actual descriptions that are documented. The quartz crystal oscillators of the type shown in the photograph that are used for Ethernet communication have an accuracy of +/- 50 to 100 parts per million (ppm). Summing the local clock errors for the 992 local clocks could easily result in the -0.2 ppm average drift error that would be needed to account for the reported -60 ns ToF discrepancy. Particularly when the data is recording the EARLIEST arrival detection in the TT.

      My calculation for the -0.2 ppm mean drift error is as follows: there is 0.6 seconds between master clock resets of the fine counters. Therefore, randomly recorded events would arrive with the fine counters having a mean value of 0.3 seconds. For a reported discrepancy of -60 ns, the mean local clock error would then need to be -60 ns in 0.3 s, which is -0.2 ppm.
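
      The same arithmetic in Python:

      ```python
      pps_period_s = 0.6                   # fine counters reset every 0.6 s
      mean_elapsed_s = pps_period_s / 2.0  # mean fine-counter age: 0.3 s
      discrepancy_ns = -60.0               # reported early arrival

      drift_ppm = discrepancy_ns / (mean_elapsed_s * 1e9) * 1e6
      print(drift_ppm)                     # -0.2 ppm, tiny beside a +/-50 ppm crystal spec
      ```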

      I wrote to J. Marteau with my observations and concerns and he stated that they are collecting them and I have not heard back since.

      However, I have also written to him of a specific and easy test to see if this local 100 MHz clock is an error term, if the raw data exists.

      With the raw time stamp data (fine counter data in particular), bin the detected events into two bins: those detected events that have a time stamp with a fine counter of less than 300 milliseconds, and those that are greater than 300 milliseconds. The average detected event arrival in each bin (assuming a flat PDF of arrivals) would be 150 milliseconds and 450 milliseconds respectively, and each bin would have approximately the same number of events. The 0-300 millisecond bin would have a new “early arrival time discrepancy” of 30 ns and the 300-600 millisecond bin would be 90 ns (given that the flight velocity really was c). Of course, with fewer samples in each bin, the statistical error would grow, but it would still be enough to show whether this is an unaccounted error term.

      Now, with the new results, there are only 20 samples to bin. This check would be very easy, as long as the raw timestamp information is available (fine counter data for each of the 20 samples). If the local 100 MHz clocks are contributing error due to local drift with respect to the OPERA master clock, it will be obvious in the results of this simple test, as in the sketch below.
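
      Here is that test written out in Python, with made-up fine-counter values standing in for the real raw data:

      ```python
      import numpy as np

      # Made-up stand-ins for the 20 raw timestamps: fine-counter age in
      # seconds and the measured early-arrival offset in ns. Real data
      # from the collaboration would replace both arrays.
      rng = np.random.default_rng(2)
      fine_age_s = rng.uniform(0.0, 0.6, size=20)
      offset_ns = np.full(20, -60.0)

      lo_bin = offset_ns[fine_age_s < 0.3].mean()   # early-in-cycle events
      hi_bin = offset_ns[fine_age_s >= 0.3].mean()  # late-in-cycle events
      print(lo_bin, hi_bin)
      # A drifting local clock predicts roughly -30 ns vs -90 ns between
      # the bins; equal means (as here, by construction) would exonerate
      # the 100 MHz clocks.
      ```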

  13. Azzam AlMosallami says:

    The story began with me in 1996, when I finished my graduation project for my BA. It was about unifying quantum theory (Copenhagen school) and Einstein's relativity theory. I proposed in my research that it is possible to measure the velocity of a particle to be greater than light speed in vacuum, even though the actual velocity is not greater than c. In 2007-2008, the quantum tunneling experiments of G. Nimtz were good support for my proposal, but an experiment with particles whose rest mass is greater than zero was still required. The OPERA experiment now proves my theory. To understand how I could predict the possibility of measuring speeds greater than the speed of light, review my paper at http://vixra.org/pdf/1111.0001v1.pdf

  14. Excellent

    this is what I call…. GOOD NEWS

    I know that OPERA scientists are right and no error will be found

    CERN scientists know what they are doing

    any attempt to prove that the clocks were desynchronized etc etc etc

    is wrong…… CERN is right… OPERA is right

    see this one… no conventional explanation can discredit OPERA

    http://hal.archives-ouvertes.fr/index.php?halsid=rpiuvjasfrfdo82jgnstuqlpc4&view_this_doc=hal-00632059&version=1

    every day I check this link to see if the superluminal neutrino remains

    http://hal.archives-ouvertes.fr/index.php?halsid=abojdakqdbd6onbfaab9s8lde0&action_todo=home

    and it remains…

    • quoting Dr Gibbs
      ————————————

      The OPERA experiment has failed to find an error in their measurement of neutrino speeds that shows them travelling faster than light. The earlier result was most strongly criticized for the statistical nature of the measurement, which involved fitting the timing profile of many events observed at Gran Sasso to the known shape of long-duration proton pulses at the source at CERN, 730 km away. This objection has now been quashed by using much shorter pulses, so that the effect can be seen with just a few neutrinos. While the previous measurement used data gathered over three years, this new confirmation took just a few weeks.
      ———————————–

      the OPERA neutrinos are FTL without a shadow of doubt

      there are only 3 theories that can explain what happened to the neutrino to make it FTL

      1)-micro wormhole generated in the accelerator
      2)-micro warp drive generated in the accelerator
      3)-micro jump gate to an extra dimension generated in the accelerator

      I favor the micro warp drive

      micro wormholes would evaporate in a fraction of a second releasing Hawking radiation… even after the neutrino crossed the Einstein-Rosen bridge…. Kruskal-Szekeres or Penrose diagrams

      no Hawking radiation detected

      jump gates are even more difficult… the accelerator would need to generate energy to break the junction conditions

      the warp drive seems to be the viable option

  15. Azzam AlMosallami says:

    The problem is not in the OPERA experiment. The problem is how to understand special relativity according to the concepts, principles and laws of quantum theory (Copenhagen school). Einstein depended in the derivation of his equations of relativity on the objective existence of the phenomenon, but according to the Copenhagen school the observer plays the main part in the formation of the phenomenon. Quantum tunneling experiments have shown that 1) the tunneling process is non-local, 2) the signal velocity is faster than light, i.e. superluminal, 3) the tunneling signal is not observable, since photonic tunneling is described by virtual photons, and 4) according to the experimental results, the signal velocity is infinite inside the barriers, implying that tunneling instantaneously acts at a distance. We think these properties are not compatible with the claims of many textbooks on Special Relativity [1-9, 16]. The results produced by our modified special relativity theory are in agreement with the results produced by the quantum tunneling experiments as noted above, and thus it explains theoretically what occurs in quantum tunneling. It proves that events inside the tunneling barrier should occur at a faster rate than in the usual situation in the laboratory. It provides a new concept of time speedup which does not exist in special relativity theory.

    The concept of time speedup in our theory is supported by many experiments where some enzymes operate kinetically much faster than predicted by the classical G. In “through the barrier” models, a proton or an electron can tunnel through activation barriers [11, 12]. Quantum tunneling for protons has been observed in tryptamine oxidation by aromatic amine dehydrogenase [13]. Also, British scientists have found that enzymes cheat time and space by quantum tunneling – a much faster way of traveling than the classical way – but whether or not perplexing quantum theories can be applied to the biological world is still hotly debated. Until now, no one knew just how the enzymes speed up the reactions, which in some cases are up to a staggering million times faster [14]. Seed Magazine published a fascinating article about a group of researchers who discovered a bit more about how enzymes use quantum tunneling to speed up chemical reactions [15].

    The modified special relativity theory answers all the preceding questions. In the case of the OPERA experiment, a frame with time speeding up is produced; thus inside the tunnel in which the neutrino is moving, the clocks and events run faster than our clocks. From this point, when the neutrino reaches the end of its trip, having passed the length of the tunnel L for an observer stationary inside the tunnel, with that observer measuring a time separation t for this event, the observer on the earth will not yet see the neutrino reach the end of its trip; it is still at L’ = gamma^-1 * L, passed in a time separation t’ = gamma^-1 * t. Thus when the neutrino reaches the end of the tunnel according to an observer inside the tunnel, it has actually reached the end and it will be received by our detectors. We will therefore think that the neutrino jumped from L’ to L in zero time separation, and when we divide the passed distance L by t’, the time recorded according to our clocks, we will get a velocity for the neutrino greater than the speed of light in vacuum, since t’ is less than t. But this velocity is not actual. The actual velocity is given by L/t, where t is the time measured by an observer stationary inside the tunnel, and this velocity does not exceed the speed of light.

    Now, when we see the neutrino jump from L’ to L in zero time separation, we see the neutrino at two places at the same time. This is supported by the following: in March 2010 researchers at UC Santa Barbara provided the first clear demonstration that the theory of quantum mechanics applies to the mechanical motion of an object large enough to be seen by the naked eye. In a related experiment, they placed a mechanical resonator in a quantum superposition, a state in which it simultaneously had zero and one quantum of excitation. This is the energetic equivalent of an object being in two places at the same time. I want to mention here that gamma^-1 does not depend on the velocity of the neutrino, but on the difference of potential between the frame in which the neutrino moves and the frame in which the lab observer is located, and this depends on many factors like pressure, temperature, mass, etc.; in this case the observer in the lab is located at a higher potential. To understand more about my new relativity, review: http://vixra.org/pdf/1111.0001v1.pdf

    • Ervin Goldfain says:

      Quantum mechanical excursions outside the light cone are perfectly compatible with Einstein causality and are accommodated within Quantum Field Theory via pair creation and annihilation. Contrary to your claim, quantum tunneling does not require changing Special Relativity:

      http://arxiv.org/abs/quant-ph/9809030

      http://arxiv.org/abs/1110.1162

      • Azzam AlMosallami says:

        I mentioned in my paper the properties produced by the results of the experiments done by G. Nimtz, where G. Nimtz noticed from his results that “these properties are not compatible with the claims of many textbooks on Special Relativity”.
        I’m not changing special relativity because quantum tunneling requires it. I did my work in 1996 as my graduation research project. The aim of my project was to understand relativity theory according to the concepts, principles and laws of quantum theory (Copenhagen school), or in other words, to unify relativity and quantum theory under the same concepts, principles and laws. If you look carefully at the quantum tunneling results you will find there is a connection between those results and the result produced by the OPERA experiment, but the difference is that quantum tunneling is accepted according to quantum laws, while why it speeds up time is still mysterious, because space and time are interpreted only by relativity theory. The OPERA experiment can’t be interpreted by relativity theory, but the neutrino is seen to move faster than the speed of light. To understand what happened in both cases, we must connect spacetime with the wave function produced by quantum theory, and this requires unifying quantum theory and relativity under the same concepts and principles, which is what I did in 1996. According to my work, the wave function of Heisenberg and spacetime are two faces of one coin, which is the present.

      • Paul Hoiland says:

        It could be nature showing a mechanism that preserves causality at the expense of the local value of C. The neutrinos in their local bubble of a modified vacuum state would be subluminal, even though in the normal vacuum state they appear superluminal.

  16. It seems to me this result is not unrelated to another recent development:

    http://xxx.lanl.gov/abs/1111.3328

  17. Ervin Goldfain says:

    @Azzam AlMosallami,

    I still do not understand your claims about the need to reformulate Special Relativity. The validity of Lorentz transformations has been confirmed countless times in many different settings.

    The paper by Nimtz talks about violation of relativistic dispersion by quantum effects associated with evanescent light waves. There is nothing surprising here. Since Special Relativity is a classical theory, it automatically ignores quantum processes such as virtual particles, particle-antiparticle pairs and vacuum fluctuations. This is the whole point of Nimtz’s paper and of the paper I cited before on quantum tunneling:

    http://arxiv.org/abs/quant-ph/9809030

    • ProfChuck says:

      It’s all about causality. IF, and that is a very big IF, this discovery shows that it is possible to transmit information faster than the speed of light, then according to most interpretations of GRT it is possible to transmit information back in time. The consequences of this are enormous. Either this interpretation of GRT is wrong or we must come up with a trans-relativistic mechanism to accommodate the obvious paradoxes that would result.
      This observation appears to be a macroscopic phenomenon triggered by a quantum event. That is very rare. Helium-4 superfluid is one of the few such things that I can think of.

      • Ervin Goldfain says:

        It is disturbing that so many people erroneously interpret the relationship between Special Relativity and Einstein causality, on the one hand, and quantum theory on the other. They fail to realize that Quantum Field Theory in fact reinforces Einstein causality by bringing up antiparticles and virtual processes driven by vacuum fluctuations.

  18. Ervin Goldfain says:

    @Azzam AlMosallami,

    Here is an article that talks about erroneous interpretation of results discussed in Nimtz’s article:

    http://sitemaker.umich.edu/herbert.winful/files/comment_on_nimtz_stahlhofen_experiment.pdf

    • Azzam AlMosallami says:

      When I was studying physics for my BA, I believed the laws of physics that govern the micro world and the macro world must be the same. I found the concepts, principles and laws on which quantum theory (Copenhagen school) was built more convincing than the concepts, principles and laws of classical physics on which relativity was built. Especially since, when Einstein built his special relativity, the concepts of quantum theory did not yet exist. Because of that Einstein hated quantum theory, especially the Heisenberg uncertainty principle, and he tried to prove the inconsistency of quantum theory, but he failed. Einstein understood that these concepts, principles and laws would lead in the future to modifying his relativity theory according to them. When I reformulated the special relativity theory according to the concepts of quantum theory in 1996, I predicted that it is possible to measure the speed of a light beam or particle, depending on x and t, to be greater than light speed in vacuum, but the measured speed is not real according to the Heisenberg uncertainty principle. The OPERA experiment was good support for my work after 15 years. If you study my paper well, you will see how it agrees exactly with what was measured by the OPERA team; furthermore it agrees with the Cohen-Glashow effect, where the OPERA team measured the speed of the neutrino to be greater than c according to x/t, but this speed is not real. The real speed is measured by an observer located stationary in the frame in which the neutrino moves, and this speed is less than c. Both observers create their own pictures related to x and t of the neutrino, the same as in the Copenhagen school. I can’t summarize my paper in a few lines. Please will you review it, then we can discuss.

    • Azzam AlMosallami says:

      Dear Ervin Goldfain,
      Please review http://www.telegraph.co.uk/science/science-news/8905322/Speed-of-light-experiment-was-wrong-after-all.html
      and compare it with my last reply to you above, where I explained why the speed measured by OPERA is not the real speed of the neutrino.

  19. Jeroen Seebodlt says:

    One element I haven’t seen in either paper is a correction for the rotation of the earth itself. During the time of flight Gran Sasso is moving towards the neutrino beam, making it seem that the neutrinos are moving faster than the speed of light. This deducts a few nanoseconds from the measured flight time.
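
    A back-of-envelope bound on the size of this effect (the 42.5 degree latitude and the assumption that Gran Sasso's full eastward rotation speed lies along the beam are mine, so this is an upper limit):

    ```python
    import math

    c = 299_792_458.0                  # m/s
    baseline_m = 730_000.0             # CERN to Gran Sasso
    tof_s = baseline_m / c             # ~2.44 ms neutrino flight time

    # Eastward rotation speed at Gran Sasso's latitude (~42.5 degrees).
    v_east = 465.1 * math.cos(math.radians(42.5))

    dt_ns = v_east * tof_s / c * 1e9   # motion assumed fully along the beam
    print(f"at most ~{dt_ns:.1f} ns")  # ~2.8 ns: real, but far below 60 ns
    ```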

  20. [...] correct has increased from 1 in a million to one in 100 thousand,” wrote physicist Philip Gibbs on the viXra log (though he stressed that those numbers were merely illustrative and not actual calculated [...]

  22. ICARUS refute a superluminal interpretation of the OPERA result

    http://arxiv.org/abs/1110.3763

    • Philip Gibbs says:

      These days, by the time something reaches the arXiv it is old news :)
      In this case I agree with Strassler http://profmattstrassler.com/2011/11/21/why-icarus-doesnt-refute-opera/

      Icarus clearly does not refute it because it depends on a theoretical result that has a higher probability of being wrong than OPERA has of being right.

      • I agree with you 100%, Dr Gibbs

        … I checked the link you provided

        OPERA is superluminal without a shadow of doubt

        and I am no longer reading arXiv papers…. there are so many papers about OPERA and many of them have nothing to add on this subject; it is always the same… ICARUS vs OPERA etc etc etc

        so I prefer to follow the progress from this blog

        of course my warp drive interpretation of OPERA is among the possible theories….. perhaps the superluminality of the neutrino is due to some other physical reason, but I must confess that warp drive is my favorite theory and fits well to explain OPERA

      • Strassler treats «OPERA as deeply implausible» as I do.

        It seems to me that Strassler is being overly speculative when he says that OPERA is not strictly refuted. The key though is his:

        «ICARUS only proves that either OPERA is wrong or there’s a so-far unknown modification of relativity which allows an escape from Cerenkov-type radiation for neutrinos at the energies of OPERA’s beam.»

        One could always imagine the existence of some unknown modification of some known theory to explain virtually anything.

    • Azzam AlMosallami says:

      This paper http://arxiv.org/abs/1110.3763 is right and The opera result is right also. Both are right in their measurements.
      When I was studying physics in BA. I believed the laws of physics that govern the micro world and micro must be the same. I found the concepts, principles and laws that quantum theory (Copenhagen school) was built on are more convinced than the concepts, principles and laws of classical physics that relativity was built on. Specially, when Einstein built on his special relativity, the concepts of Quantum theory were not existed. because of that Einstein was hate quantum theory, specially Heisenberg uncertainty principle. and he tried to proof the inconsistency of quantum, but he failed. Because Einstein was understanding the concepts, principles and laws will lead in the future to modify his relativity theory according to its concepts. When I reformulated the special relativity theory according to the concepts of Quantum in 1996, I predicted it is possible that measuring a speed of light beam or particles depending on x and t to be greater than light speed in vacuum, but the measured speed is not real according to Heisenberg uncertainty principle. OPERA experiment was good support for me to proof my work after 15 years. If you study my paper well, you will see how my paper agrees exactly with what measured by OPERA team furthermore it agrees with Cohen-Glashow effect, where OPERA team measured the speed of the neutrino greater than c according to x/t, but this speed is not real. the real speed measured by an observer located stationary in the frame that the neutrino moving in, and this speed is less than c. both of the observers create his own picture related to x and t of the neutrino, same as in copenhagen school. I can’t summarize my paper in few lines, Please will you review it, then we discuss
      my paper http://vixra.org/pdf/1111.0001v1.pdf

  23. number26 says:

    The top speed of the muon neutrino is a quantum tunneling effect, produced by the interaction between the vacuum and neutrinos; the value of the negative energy of the vacuum is very close to the neutrino rest mass, so that the probability of exchange of energy between the vacuum and neutrinos has a real amplitude. In this way we can see that the square of the mass difference for the angle theta13 is 2.34 x 10^-3 eV² = (exp(3) x -2.39 x 10^-3 eV)², where -2.39 x 10^-3 eV is the energy of the vacuum.
    The vacuum tunneling jumps an energy barrier next to the rest energy of the neutrinos, and the neutrinos give the same energy back to the vacuum in a continuous loop, preserving energy conservation.
    This jump in the energy barrier creates virtual walls where the velocity of light is exceeded.
    For this reason there is no Cherenkov radiation.
    The neutrino oscillations are due to a phenomenon of interaction with this vacuum.
    The mass of the neutrino has a value very close to the Planck mass x exp(-(inverse fine structure constant / 2)), and the negative energy of the vacuum can be expressed as:
    [-Planck mass x c² x exp(-(70 + dark energy density))] / 1.602176565 x 10^-19 C = [negative vacuum energy] x sqrt(2) (eV)

    [-Planck mass x c² x exp(-(70 + dark energy density = 0.728))] / 1.602176565 x 10^-19 C = [negative vacuum energy] x sqrt(2) (eV)

  24. number26 says:

    70²= Sum 1 to 24 ( n² ), 24 dimensions, permutations of 4 dimensions

  25. Bill Phates says:

    Did they forget to take into account the curvature of the Earth when measuring the 730 km?

    • Simplicity says:

      Hello :-)

      I am currently working on my own to assess if and how FTL-neutrinos can be predicted by a mathematical/physical framework.

      Concerning this: does anyone know if astronomers have measured and found where the gravitational center in our “local galaxy group” is?

      I have searched the net and Google Scholar and I only find that it is located somewhere between the Milky Way and the Andromeda Galaxy.

      • Simplicity says:

        Ok, here is my wild speculation:

        http://c-and-gravitation.blogspot.com/

        Regards :-)

      • Paul Hoiland says:

        One problem is that the neutrinos are not massless, so even if we could assume a modification of relativity along such lines one would have to explain how a particle with mass manages to be faster than a particle without rest mass.

        As for the center of mass of our local group, the effect of such decreases over distance, and given the 1/R^2 rule any effect it has would be way smaller than even this increase in velocity, unless you invoke some type of Newtonian rest frame being at play.

      • ————————————–
        quoting Paul Hoiland

        One problem is that the neutrinos are not massless, so even if we could assume a modification of relativity along such lines one would have to explain how a particle with mass manages to be faster than a particle without rest mass. As for the center of mass of our local group, the effect of such decreases over distance, and given the 1/R^2 rule any effect it has would be way smaller than even this increase in velocity, unless you invoke some type of Newtonian rest frame being at play.
        ————————————–
        Even given that there need to be a lot more experiments, like over a longer baseline, before this case is verified, the explanations that at present could fit the situation all involve some form of extension to SR, a violation of SR, or possibly dimensional/time short cuts along quantum lines, micro warp bubbles, brane lensing, etc.
        ————————————–

        agreed

        if e=mc^2 is valid then a neutrino could never bypass a photon in the vacuum

        the mass of the neutrino is about 10^-9 GeV/c^2

        something happened….. and I favor these ones

        dimensional/time short cuts along quantum lines, micro warp bubbles, brane lensing, etc.

      • Paul Hoiland says:

        Even given that there need to be a lot more experiments, like over a longer baseline, before this case is verified, the explanations that at present could fit the situation all involve some form of extension to SR, a violation of SR, or possibly dimensional/time short cuts along quantum lines, micro warp bubbles, brane lensing, etc.

      • Paul Hoiland says:

        Actually, on a side question for Fernando: I happened to really look closely at the metric for the NWD idea and your older brane-lensed hyperdrive. These two metrics are rather similar except for the e field values in the hyperdrive one. Is there a tie there? And if so, what is the tie-in? I had asked a few others about that and no one seemed to know. In fact, I had noticed that if you transformed the NWD metric into a higher dimensional format you could get the same. So are the two metrics connected?

        The reason I ask is that a short look at what you predicted in the first, and the energy involved, tends to tell me you hit the nail on the head about how high energy particle experiments could display brane lensing effects like superluminal travel, and if there is a connection then the micro warp bubble is simply a 4D model of a higher dimensional short cut.

      • ——————————–
        quoting Paul Hoiland

        Actually, on a side question for Fernando: I happened to really look closely at the metric for the NWD idea and your older brane-lensed hyperdrive. These two metrics are rather similar except for the e field values in the hyperdrive one. Is there a tie there?
        ——————————
        In fact, I had noticed that if you transformed the NWD metric into a higher dimensional format you could get the same.
        —————————–
        The reason I ask is that a short look at what you predicted in the first, and the energy involved, tends to tell me you hit the nail on the head about how high energy particle experiments could display brane lensing effects like superluminal travel, and if there is a connection then the micro warp bubble is simply a 4D model of a higher dimensional short cut.
        —————————-

        no Paul, the metrics don’t have a tie…

        actually I abandoned 5D extra dimensions and my arXiv papers on that subject, gr-qc/0603106 etc, long ago…

        I came back to 4D for the Natario Warp Drive (NWD), not the Alcubierre one

        5D can indeed produce superluminal effects seen in 4D, the same effects that would be seen if an external observer saw a 4D warp drive passing by him….. he could not tell whether he is watching a 4D warp drive or a 5D superluminal geometrical effect seen in 4D passing by him only by watching the superluminal velocity

        as for “hit the nail on the head about particle experiments”, only the future will tell…. I chose the micro warp drive because I am “addicted” to warp drive theory; other theories are equally valid

        but although I have my fingers crossed around the micro warp drive I am not counting on the OPERA micro warp drive to “pay my bills”

      • Simplicity says:

        First of all I have to thank you for your feedback.

        I find that I am able to answer your following statement:

        “if e=mc^2 is valid then a neutrino could never bypass a photon in the vacuum”

        In my wild mathematical speculation I arrive at the conclusion that:

        C (here the maximum speed limit) inside the earth might be faster by 0.0027% compared to the light speed above the earth. That means that a photon in a thin, thin vacuum inside the earth should go up to 0.0027% faster than a photon in vacuum above the earth. In this frame, here inside the earth, the “OPERA neutrinos”, which run 0.0025% faster than the so far known light speed, are not bypassing the speed limit C for a photon inside that frame, inside the earth.

        But I know that my mathematical speculation here is far out.

        The speculation in my blog is this:

        “Neutrinos faster than light speed. A wild speculation:

        Ok. I will now try a wild, wild and, most likely, totally wrong speculation of how to explain why neutrinos sent from CERN, through the earth, to OPERA exceed the light speed by 0.0025%. My explanation is founded on the main equation in my theory “A philosophical and mathematical theory of everything”:

        See my TOE (Theory of everything, described both philosophically and mathematically) here:

        http://www.vixra.org/pdf/1010.0035v4.pdf

        Main equation there is:

        E = MC^2 / (1 + ((r(Ap1/Ap2)) / RS))^2

        Where

        r = Distance between particle 1 and 2

        Ap1/Ap2 = Surface area of particle 1 / surface area of particle 2

        RS = Distance between particle 1 and “All particles original state, the state of singularity, that (in my theory) surrounds / encircles the Universe (S)” for example, 46.5 billion light years”

        To understand why I believe that the above equation is correct, you really have to understand the mathematical chapter in my above-mentioned viXra article.

        OK.

        Neutrinos faster than light explained by my main equation and theory:

        First we actually have to use the fact that the gravitational center in our local galaxy group lies somewhere between the Milky Way and the Andromeda galaxy, about 1 250 000 light years from earth. See explanation below.

        Let us begin with the math based upon the main equation above:

        C=Maximum Speed limit

        Light speed measured is: 299792.458 km/s

        Let us see if C, the maximum speed limit, is faster than 299792.458 km/s:

        r is here the distance between earth and the gravitational center in our local galaxy group that lies somewhere between the Milky Way and the Andromeda galaxy, about 1 250 000 light years from earth. This fact must, in my wild bet here, be part of the equation, here in r, when we calculate light speed as measured above earth relative to the speed limit C, which might be faster inside the earth because the relevant r there is close to 0 (in this wild bet). The reason for this is that light speed above earth is, as far as I understand, mostly affected/bent by the gravitational center in our local galaxy group; other factors that might affect light speed above earth are in my opinion negligible. But, as I here wildly bet, C, the speed limit, should be higher inside the earth because the relevant r there is close to 0. But let us begin the math:

        E/M=C^2/(1+((r(Ap1/Ap2))/RS))^2

        C^2/(E/M)=(1+((r(Ap1/Ap2))/RS))^2

        C^2/(E/M)=(1+((1250000*(1/1))/46000000000))^2

        C^2/(E/M) = 1.00005434856451
        sqrt(C^2/(E/M)) = 1.00002717391304 =

        0.0027% faster than light speed

        Explained with mainstream terms:

        E/momentum of massless photon (P) = light speed = C/(1+((r(Ap1/Ap2))/RS))

        E/momentum of massless photon (P) = light speed = C/(1+((1250000*(1/1))/46000000000))

        Light speed (above earth) is measured to be 299792.458 km/s

        299792.458 = C/(1+((1250000*(1/1))/46000000000))

        299792.458*(1+((1250000*(1/1))/46000000000)) = C

        C = 299800.604534185

        C / light speed (above earth) = 1.00002717391304

        = in percent = 100.002717%

        0.0027% faster than light speed (above earth), and this result fits nicely with the OPERA neutrino result.

        Most likely I will throw all this speculation right into the litter box. “

      • Simplicity says:

        Ok.

        My speculative explanation, more thorough and detailed than what I explained here in this thread, is now online at viXra.

        http://www.vixra.org/pdf/1112.0081v1.pdf

        Regards

      • The reviewers of the journal to which I sent my theory shattered me completely.

        But one reviewer said about my main equation that “This is arithmetically ok but OTHERWISE the formula is completely ad hoc.”

        Regarding my life in physics: the reviewers’ verdict was so devastating that I now regard myself as completely shot dead.

        I will from now on completely stop thinking about physics.

        The fact is that I now hate all that has the slightest taste of physics.

        I will never more touch anything related to physics. My daily life contains a well-paid full-time job in finance and a very happy family life. As I’ve always done I will continue to focus on this. This was my final goodbye. Goodbye!

  27. David Brown says:

    “The strongest criticisms will now fall on the use of GPS.” Is the OPERA neutrino anomaly yet another confirmation of MOND (in this case via the Rañada-Milgrom effect)?
    Is Milgrom’s MOND empirically wrong? I conjecture that Milgrom, McGaugh, and Kroupa are plausible candidates for the Nobel Prize in cosmology.
    http://www.astro.umd.edu/~ssm/mond The MOND pages (McGaugh)
    http://www.astro.uni-bonn.de/~pavel/kroupa_cosmology.html Pavel Kroupa: Dark Matter, Cosmology and Progress
    Is the Lambda Cold Dark Matter Model 100% correct? I conjecture that the cuspy halo problem proves that something is seriously wrong with the Lambda CDM Model.

    http://wikipedia.org/wiki/Cuspy_halo_problem

    “I am saying Lambda CDM doesn’t work …” — Pavel Kroupa
    http://www.youtube.com/watch?v=EZF9bPVOsb4 Dark Matter — The Debate, 2010, YouTube

    • Paul Hoiland says:

      One could also suggest a lot of things: the post-Newtonian corrections approach, modified metric theories, brane lensing, and the list goes on.

    • Paul Hoiland says:

      Here is something that stems from John Archibald Wheeler’s own words about Mach’s Principle and GR. Everything out there in the Universe, and even beyond in some brane theory, has an effect on our motion. Basically, if you think about this along voting lines, every amount of matter/energy, perhaps even the vacuum itself, has 1 single vote, and as such it is on equal grounds.

      Now, having said that, it is also true that things closer to us have more direct control, like our sun for instance. This is because gravity follows the 1/r^2 rule.

      Added to this we have Lorentz invariance, proposed before Einstein and SR, and even backed up experimentally before him.

      • Paul Hoiland says:

        Under Lorentz invariance, well tested to major degrees of accuracy, any object with positive mass would need infinite energy to reach the velocity of light. That fact is not changed by our position in the cosmos, simply because it stems from properties of the vacuum or ZPF itself. The only known ways around this are:

        1. Lorentz violations.

        2. Exotic energy states.

        3. Switching mass to negative mass.

        The first has been well tested, and if it exists there would have to be a preferred frame of reference, or simply some energy level where it is broken.

        The second involves things like warp drive, wormholes, and the list goes on. The problem here is that certain energy condition factors from quantum theory generally prevent free unbounded exotic energy states.

      • Paul Hoiland says:

        There are exceptions, like inflation, which can generate an energy state less than the norm, and Casimir effects, both of which utilize an artificial barrier. The first is a domain wall and the second is generally a man-made or naturally occurring one. Other than that, all known energy in our cosmos has positive mass/energy.

        The third suffers from the same restriction as the second and, by the physics of the math involved, would end up with only the same exceptions.

        Now, your idea proposes that our local group’s motion offsets c a bit. For that to be the case there would have to be an absolute frame of reference. Such a state goes right back to the first case, which has been well shown not to exist, at least on any classical level.

      • Paul Hoiland says:

        Now, I said forbidden classically. To some extent the CMB itself is a sort of background frame. But it is for all purposes equal in all directions, so even here it fails as the sort of absolute time frame that would allow a Lorentz violation. The ZPF fails in that respect too, even though it can itself be viewed as a background in which motion takes place; there have been articles on that since the ’70s. In fact, I once used the same math reference source on something Fernando pointed out and, without knowing it, wrote an almost identical article to one a guy published back in the ’70s on such a frame of reference. Same idea, same math, and same conclusion. He was first and deserves the credit, even though I never knew his work existed. In fact, we started from different points and got the same basic answer.

      • Paul Hoiland says:

        Mine started from the idea of other branes themselves being a background frame of reference. His was based on the quantum ZPF being the same. But outside of the Planck scale there is nothing in this sort of aether that violates SR or Lorentz invariance. Even in an older article I once did about the ZPF, there would actually be no absolute time involved, since all the particles at the Planck scale, via the spread of their quantum waves, act as if they exist at all points in space with absolutely zero time. Again, there is no real direct Lorentz violation, because the conditions at the Planck level are different from those on a more macro scale due to entanglement effects.

        Also, later looks at Fernando’s RS model idea showed me that velocity there is not actually infinite either.

      • Paul Hoiland says:

        Even when you invoke the idea of a second, negative-energy brane alongside our own, velocity there is still limited by its own version of c, and Lorentz invariance still applies. The only difference between the two states (ours and the negative one) is that things happen faster there than here. History is still history and the outcome remains the same. Everything still runs forward in time. In fact, a friend of ours by the name of Todd once pointed out via the math that both times are actually entangled, and as such you still do not beat c.

      • Paul Hoiland says:

        Mach’s principle, which came before SR, actually prevents your idea of a change in c from being correct. It is a nice idea, though, and I respect that. God knows Fernando and I have had some arguments over math versus physics, but even when the math has a proper answer, that math may or may not be physically correct. Most of the people over here who hate string theory, for example, hate it because the math cannot at present be physically proved or disproved. Any theory has to be backed up with observational and experimental evidence when it comes to physics and science in general. String theory may yet be backed up; in fact, these neutrino events and the CERN finding on the Higgs could indirectly do so. But the theory will remain theory until backed by a lot more physical evidence.

        In the case you propose, there exists physical evidence against it from prior experiments, well tested to levels beyond the CERN case itself.

    • Paul Hoiland says:

      By the way, I think this all got posted under a different thread than the one Simplicity started; if so, it was referring only to his.

  28. Paul Hoiland says:

    I am going to give Fernando Loup’s NWD micro-warp bubble its due. There are a lot of articles, both published and unpublished, suggesting that an inflation field could take the place of the exotic-energy portion of a warp field. I have recently done an article that re-examines tachyons in general and submitted another that focuses on tachyon-drive reheat and inflation. If one substitutes our vacuum for the perfect fluid in those equations, you end up with a different vacuum state where c no longer holds. Now, several of us over the years have always thought the general idea that the field requires two parts to construct is wrong. In essence, if you make one part you get the other, in which case Fernando’s idea could apply to the CERN results.

  29. Paul Hoiland says:

    But it is not the only possible solution.


  30. Paul Hoiland says:

    I know that is still a long way from what some want him to explain about the how, but it does give an avenue toward that how, even if one would still have to explain why one half generates the other half.

    • Paul Hoiland says:

      And there is also the fact that if you search under domain-wall motion, you find, in what we do know, that motion of such a wall, in this case induced by energy differences, results in local shape-function changes not yet examined for the actual warp-metric shape function. My own short look at this has shown that the field slightly morphs from the NWD function towards something between it and an AWD one, and that only the initial static bubble appears fully NWD.

      Now, this could itself explain the slow decay of the field when it comes to velocity, and why the supernova signals are closer to c than those in more local regions.

      • Paul Hoiland says:

        But then again, similar looks at brane lensing tend to support a similar decay of the field over time unless the cause of the initial lensing is reinforced.

      • Paul Hoiland says:

        And there is the related issue that a negative-mass object behind a domain wall, because of overcompensation via energy-condition requirements, could appear to have positive mass outside the domain wall. The only signature of the negative mass would be FTL velocity.

      • Paul Hoiland says:

        In fact, that last issue can totally change what http://mathbin.net/74445 stipulates. The object could have an overcompensated positive mass value even though its real mass is negative.

      • Paul Hoiland says:

        But then again, if the SN signal measurements and this one both stand as correct, there would also have to be an explanation of the slowdown, since it would imply that the negative mass decreases towards positive over distance, which itself runs contrary to negative-mass equations under SR.

  31. Paul Chatfield says:

    I forget who said “the simplest explanation is usually the right one”, but to me the simplest explanation is that the distance measurement between the two points is wrong. A simple comparison between the distance between two points on the surface of a sphere and the chordal distance between the same two points in a straight line through the sphere gives a difference on the order of 400 m for the two sites in question, which is far greater than the difference implied by the timings (see the sketch below). It therefore seems obvious to me that the only thing to attempt to measure accurately is the distance.
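    A back-of-the-envelope check of this comparison, as a rough Python sketch. The numbers are illustrative assumptions: a spherical Earth of mean radius 6371 km, a 730 km surface separation, and the roughly 60 ns early arrival reported by OPERA.

```python
import math

R = 6_371_000.0    # assumed mean Earth radius, m
ARC = 730_000.0    # assumed surface (arc) distance CERN -> Gran Sasso, m
C = 299_792_458.0  # speed of light, m/s

theta = ARC / R                          # central angle subtended by the arc
chord = 2.0 * R * math.sin(theta / 2.0)  # straight-line distance through the Earth

print(f"arc - chord          = {ARC - chord:6.1f} m")  # ~400 m, as the comment says
print(f"60 ns at light speed = {60e-9 * C:6.1f} m")    # ~18 m, the size of the anomaly
```

    So a surface-versus-chord mix-up would produce an error more than twenty times larger than the observed effect, which is why the distance convention matters.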

    • ProfChuck says:

      You are referring to Occam’s razor, and it is generally a good rule to go by. However, a careful read of the OPERA reports shows that distance and timing errors were their first suspicion. It is clear that they have gone to great lengths to eliminate these sources of error from their analysis. It is possible that an error of this kind could be responsible, but that possibility is becoming less and less likely.

      • Paul Hoiland says:

        Yes, even the GPS issue was looked into and ruled out after examination. Distance, unless you consider quantum tunneling, is not the solution. The only way to test this further is to run the experiment, measure the results at different points, and do so over a longer baseline. One reason I suggest that is that the supernova results and these ones together suggest that, even if the velocity results are true, whatever causes this decreases over distance. There has been debate about the SN cases, and some analyses favored showing them as FTL. But the FTL there is smaller than here, which might help us better understand what is going on.

      • Paul Hoiland says:

        There really need to be better tests of this. The reason I suspect the effect can fall off is that both of the major alternatives that could explain it have a stability issue that could cause the FTL effect to decay over distance. Also, no one to date has even tackled the question of how one type of neutrino could do it and another not.

      • ProfChuck says:

        Hi Paul. The issue of neutrinos traveling at different velocities is at the heart of the question. According to the standard theory, lighter particles can propagate at velocities closer to the speed of light than heavier ones. Because flavor oscillation between mu and tau neutrinos has been observed in neutrinos during flight, they must have mass and so must be traveling slower than the speed of light. A principal function of the OPERA detector was to observe these oscillations directly, by detecting tau neutrinos appearing in the muon-neutrino beam from CERN. The experimenters expected velocities less than c and were astonished to find that some of the particles seemed to travel at superluminal speeds. Because this observation is at odds with both general and special relativity, they assumed the anomalous data was the result of some sort of error in the measurement method or apparatus. As a result, they have gone to considerable lengths to re-perform the experiment with a wide range of varying parameters, such as pulse width. In spite of their efforts, at least some of the superluminal velocity findings appear to remain intact. The straight-line (chord) distance has been verified by several different methods, and timing errors have been confirmed to be less than three nanoseconds. So the problem remains.
        Both general and special relativity have been repeatedly verified by other experiments, so the question of why a special exception should occur in the case of certain kinds of neutrinos presents a significant puzzle.
        I suspect that if the phenomenon is not due to a hidden assumption involving classical physics, such as gravitational redshift or time dilation of GPS clocks, the answer may lie in the difficulty of reconciling, or unifying, relativity and quantum mechanics.
        It is recognized that certain aspects of quantum physics appear to violate relativity and thus causality. This is one of the reasons Einstein did not like quantum theories. His famous statement “God does not play dice” is an example of his dislike of the concepts involved. However, his own discoveries regarding the photoelectric effect (for which he received the Nobel Prize) serve as powerful foundation material for quantum physics. Interestingly, while Einstein made it clear that he did not like the consequences of quantum mechanics, he never said it was not true.
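        A rough order-of-magnitude sketch of why this is such a puzzle (illustrative Python; the ~2 eV neutrino mass scale and the ~17 GeV typical CNGS beam energy are my assumptions, not OPERA’s published analysis). For energies far above mc^2, a massive neutrino’s velocity deficit is about (mc^2)^2 / (2E^2), vastly smaller than the anomaly OPERA reports, and of the opposite sign.

```python
mass_ev = 2.0     # assumed neutrino mass scale, eV (roughly an upper bound)
energy_ev = 17e9  # assumed typical CNGS beam neutrino energy, ~17 GeV

# Expected velocity deficit for an ultra-relativistic massive particle:
#   1 - v/c ~= (m c^2)^2 / (2 E^2)
deficit = (mass_ev / energy_ev) ** 2 / 2.0

print(f"expected 1 - v/c      ~ {deficit:.1e}")  # ~7e-21: indistinguishable from c
print(f"OPERA claimed (v-c)/c ~ {2.5e-5:.1e}")   # reported excess, faster than c
```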

    • Melloney says:

      This shows real expertise. Thanks for the answer.

  32. Paul Chatfield says:

    Hi ProfChuck, thanks for the reply. Yes, I looked up the quote after I posted and had a “doh!” moment of realisation about Occam’s razor.

    On the point of distance measurement, I realise they went to great lengths to measure the distance, but they used GPS and may not realise that GPS is (as far as I’m aware) based on a co-ordinate plot on an ellipsoid surface, such that the distance between two points is not the same as the chordal distance if you travel through the Earth. I’m just wondering if it’s one of those “so obvious it’s invisible” mistakes. It would be nice to see the actual distance calculations made.

    • ProfChuck says:

      Hi Paul. I suspect that if there is a distance-measurement or timing error, it is more subtle than geometry. I would consider relativistic effects such as Lorentz time dilation of the on-board atomic clocks due to orbital velocity, or gravitational redshift of signals to and from the satellites relative to a location on the surface of the Earth. It is important to consider that GPS clocks are sufficiently accurate that relativistic effects must be taken into account.
      In fact, there are “compensations” included in the GPS navigation algorithms that remove “confusing” relativistic phenomena in an effort to improve the accuracy of the process. I discovered this when trying to use GPS data to map the gravitational geoid. These relativistic compensations must be reversed out of the data to get a clear picture of the details of the orbital parameters of the satellites. Interesting stuff.
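      For a sense of scale, here is a quick Python sketch of the two relativistic effects mentioned above, using standard textbook GPS figures (these are my illustrative numbers, not OPERA’s or the GPS segment’s actual correction procedure). The well-known net result is roughly +38 microseconds per day.

```python
GM = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
C = 299_792_458.0    # speed of light, m/s
R_EARTH = 6.371e6    # mean Earth radius, m (assumption)
R_ORBIT = 2.6561e7   # GPS orbital radius (~20,200 km altitude), m (assumption)
DAY = 86_400.0       # seconds per day

v_orbit = (GM / R_ORBIT) ** 0.5  # circular orbital speed, ~3.9 km/s

# Special relativity: the moving clock runs slow by roughly v^2 / (2 c^2).
sr_rate = -(v_orbit ** 2) / (2.0 * C ** 2)

# General relativity: the higher clock runs fast by delta_phi / c^2.
gr_rate = GM * (1.0 / R_EARTH - 1.0 / R_ORBIT) / C ** 2

for name, rate in [("SR (velocity)", sr_rate),
                   ("GR (potential)", gr_rate),
                   ("net", sr_rate + gr_rate)]:
    print(f"{name:14s}: {rate * DAY * 1e6:+6.1f} us/day")
```

      Uncorrected, a drift of tens of microseconds per day would swamp a nanosecond-level timing experiment, which is why these compensations exist in the first place.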

  33. Paul Chatfield says:

    I lost the “back of a fag packet” on which I calculated the difference between the expected and observed arrival times, but it came out to about 17 metres quicker than light speed, which doesn’t seem subtle at all compared with their quoted +/- 20 cm distance “accuracy”. Not being part of the scientific community, I don’t know how to put the question to the team directly, whether or not they took the curve of the Earth into account. I’d like to know. Any advice on how I can find out? Thanks.

    • ProfChuck says:

      I suggest you read the actual reports from the OPERA team. They clearly have used the chord rather than the surface distance in their calculations. If there is a distance error, it is somewhere else. It has been suggested that an optical fiber line be installed between the two locations as an independent timing and distance measurement method. That sounds like a good idea to me.

  34. Paul Chatfield says:

    I’d love to read the actual report, but I cannot find it. Google just gives me page after page of newspaper links, none of which forwards me to the actual report. Do you know of a link I can use to get to it? Many thanks for your help :)

  35. ProfChuck says:

    Try

    http://operaweb.lngs.infn.it/

    This is published by the OPERA group themselves and contains the most up-to-date information on the project. Because of the controversy they are constantly modifying the content, but it is a good source of information. You might have to dig for the data you are looking for, but it is worth the search.

    As far as GPS and distance measurement go, it is a very complex problem when you try to achieve high accuracies. The GPS satellites do not follow a pure elliptical orbit because the Earth is not a homogeneous gravitational source. Instead, they follow a time-varying gravitational geodesic whose shape is determined by a collection of both static and dynamic influences. These include mass concentrations (mascons) in the Earth and tidal effects from the Moon and the Sun. The instantaneous shape of this geodesic determines the position of each satellite at any particular moment in time. Algorithms are employed to derive the “actual” location of the satellite in geocentric coordinates. The precision with which this is done is one of the determining factors in the accuracy of the final location coordinates. Many of the potential errors in this process show up as time-variant derivatives in the data and as such are readily recognized and subject to compensation.

    A critical issue regarding positional accuracy remains and has been addressed by comparing GPS and VLBI results. The following paper describes the process by which the location of the phase centers of GPS satellite antennas can be determined by astronomical very long baseline observations of pulsars. It is a bit technical but definitely worth a read.

    http://www.fs.wettzell.de/publ/publ/wtz136.pdf

    In my mind there still exists the potential for static systematic errors in this process that could account for the remarkable OPERA findings. From what I have been able to uncover in the OPERA team reports, they are well aware of this possible source of error and have used detailed procedures to eliminate it.

    If an error still exists after all these efforts my money is on uncompensated relativistic phenomena. Regardless of what is actually going on it is a very exciting time for theoretical physics.

  36. Paul Chatfield says:

    Thanks for the links and additional commentary – it’ll take me a while to digest it all :)

    Yes, it is a very exciting time – I just hope it isn’t an “oops” moment like the cold-fusion-in-a-bottle thing from a couple of decades ago.

    • Philip Gibbs says:

      With cold fusion the people who claimed the discovery were very sure of themselves. The difference here is that the collaboration realize that it is an extraordinary claim and are questioning it themselves.

  37. ProfChuck says:

    Indeed, the scientific community does not need another embarrassment like that one. It is interesting that some scientists are still pursuing the Pons-Fleischmann effect, and that the activity is being held under the closest secrecy. Dumm de dum dum, cue the Twilight Zone theme.

  38. Paul Hoiland says:

    Yes, it is an exciting time indeed. I suspect the FTL case will be borne out, and then come the really big questions about what exactly is causing it. As someone on the fringe at times, and an early supporter of VSL ideas and of things like warp drive and hyperdrive, I might actually live long enough to see some proof of such things being possible. However, before anyone runs out and books a trip anywhere, it’s still a long road from here to there.

    Hell, in some ways we’ve had previews of “Beam me up, Scotty” long before the drive to reach the stars.

  39. Paul Hoiland says:

    No, it won’t pay the bills, and even if it is valid it is still a long way from something that can be scaled up and used. But it is interesting anyway, and congrats on two possible valid solutions.
