Neutrinos exceed the speed of light?

September 23, 2011 • 10:28 am

This news is all over the physics blogosphere: a particle may have exceeded the speed of light.  This, of course, is big news because no object is supposed to be able to do that, though “action at a distance,” as suggested by violations of Bell’s inequality and supported by some experiments, implies that particle correlations (but not information) can be established faster than light.

According to the New York Times, neutrinos launched at the CERN accelerator in Switzerland seem to have gone faster than light:

Even before the European physicists had presented their results — in a paper that appeared on the physics Web site on Thursday night and in a seminar at CERN, the European Center for Nuclear Research, on Friday — a chorus of physicists had risen up on blogs and elsewhere arguing that it was way too soon to give up on Einstein and that there was probably some experimental error. Incredible claims require incredible evidence. . .

. . . According to scientists familiar with the paper, the neutrinos raced from a particle accelerator at CERN outside Geneva, where they were created, to a cavern underneath Gran Sasso in Italy, a distance of about 450 miles, about 60 nanoseconds faster than it would take a light beam. That amounts to a speed greater than light by about 0.0025 percent (2.5 parts in a hundred thousand).

Even this small deviation would open up the possibility of time travel and play havoc with longstanding notions of cause and effect. Einstein himself — the author of modern physics, whose theory of relativity established the speed of light as the ultimate limit — said that if you could send a message faster than light, “You could send a telegram to the past.”

Alvaro de Rujula, a theorist at CERN, called the claim “flabbergasting.”

Indeed.  Now how could this happen?  Wormholes!

John Learned, a neutrino astronomer at the University of Hawaii, said that if the results of the Opera researchers turned out to be true, it could be the first hint that neutrinos can take a shortcut through space, through extra dimensions. Joe Lykken of Fermilab said, “Special relativity only holds in flat space, so if there is a warped fifth dimension, it is possible that on other slices of it, the speed of light is different.”

Now I’m no physicist, but if neutrinos can go through wormholes, then information can be transmitted faster than the speed of light, which would truly be a monumental finding.

But of course—and this is my guess—the results could be an error or an artifact.  In fact, many physicists are dubious about this.  In the NYT piece, several of them immediately call for replication:

“My dream would be that another, independent experiment finds the same thing,” Dr. Ereditato [the research group leader] told the BBC. “Then I would be relieved.”


“If it is true, then we truly haven’t understood anything about anything,” Dr. de Rujula said, adding: “It looks too big to be true. The correct attitude is to ask oneself what went wrong.”


“These guys have done their level best, but before throwing Einstein on the bonfire, you would like to see an independent experiment,” said John Ellis, a CERN theorist who has published work on the speeds of the ghostly particles known as neutrinos.

Ahh. . . music to my ears: the pervasive doubt and calls for replication that follow a radical new claim about science.  Nothing is accepted until many independent observers can duplicate the observations.  Wouldn’t it be wonderful if claims of “religious truth” were subject to the same strictures?

110 thoughts on “Neutrinos exceed the speed of light?”

  1. And here I’ve been busy being caught up with having some lulz online (and as it happens, the Starbucks hilarity of today) when actual news is being made. Thanks for the article – time to get into this and see what’s shaking.

  2. I congratulate them on their caution. If they’re right, they’re Nobel-prize-winning game-changers.

    If they’re wrong, they’ve gained an enormous amount of respect from their colleagues.

    Contrast this with the cold fusionists. What did they gain by putting the publicity cart before the science horse? Nothing. And their credibility ended up being destroyed.

    1. Yeah, I’ve been pretty impressed with the researchers’ humility here. It’s almost like they are saying, “Look folks, we know this result can’t be right, but we did our damnedest and we can’t figure out what went wrong. Any thoughts?”

        1. I well remember hearing a lecture from Eugene Wigner. He was talking about some more philosophical aspect of quantum mechanics, and upon finishing, he said to the audience, “now, dispute this,” and he did it in a way that clearly showed he wanted to enter into a discussion which might have led to his being “proved” wrong.

  3. No matter how this turns out, it’s exciting and it’s good news.

    Most likely, the researchers will discover some subtle effect that they haven’t yet accounted for. Maybe something peculiar about the local geography is messing with the GPS measurements or the like. That in and of itself will be worth a paper.

    But I think we can rule out time travel, at least as popularly conceived. If time travel were possible, it wouldn’t take long before somebody realized the huge advantages to be had by exploiting an earlier version of the universe with less entropy. In no time at all, there’d be a race back to the Big Bang, and the universe would look far different as a result.

    Neutrinos are notorious in how little they interact with matter. It’d be really neat if we discovered that the current standard value for c is based on the “medium” normal matter travels in, and that neutrinos travel that hair’s-breadth faster because space is that much more transparent to them. Maybe virtual particles are slowing down photons?

    Any all y’all real physicists out there who want to correct my misconceptions, please have at it….



    1. It’s easy for photons to be slowed down by their interactions. That’s what causes refraction of light in matter. However, this kind of thing should be a straightforwardly wavelength-dependent phenomenon. Yet the vacuum speed of light does not seem to depend on frequency, to very high precision. Moreover, the maximum speed of electrons has been measured more than a billion times more precisely than the speed of these neutrinos, and it agrees with the speed of light. Any theory in which differently interacting particles exhibit different speeds would need to explain that remarkably precise coincidence.

      1. Ok, as I was thinking out loud below: “photons can spontaneously decay into an electron/positron pair which then annihilates, reproducing the ‘original’ photon. I don’t see why the pair could not interact with other photons (absorb then re-emit) in the interim. The absorb/re-emit process is one way light is slowed in a transparent medium.”

        So is it possible that electrons are also slowed by the same process, in a way dual to photons? Whereas neutrinos miss them both?

        1. Yes, both photons and electrons can be slowed down by interactions (although this normally requires a small amount of charged matter filling empty space, which leads to other effects that aren’t observed), but they are not going to be slowed down by the same amount.

          1. Thanks. Now a pretty trusted physicist on another thread contradicts your statement about electrons traveling at light-speed. The reasoning goes that electrons have a mass of 511 keV/c², so they must all go slower than light. Perhaps you just mean that the same permeability of free space constant is measured in experiments using both electrons and light (since μ₀ is a parameter in the formula for c predicted by Maxwell’s equations).

          2. I didn’t mean electrons actually traveled at the speed of light. In this context, “maximum speed of electrons” means the speed they can approach arbitrarily closely but never attain.

      2. Can you point me in the direction of any papers that were written about the measurement of the maximum speed of electrons? Thanks.

    2. Maybe (probably) I’m missing something, but, considering the fame of the Davis/Bahcall/Homestake saga, I wonder why I haven’t heard anybody talking about the fact that we think neutrinos have some mass in the first place because they experience time and, thus, must actually be travelling a bit slower than c.

      1. It’s my understanding that experiments have shown that neutrinos have mass, but it’s very small indeed. I don’t recall what the upper limit is. The three flavours of neutrino probably have similar masses.

        1. That’s just the thing, though. I’m pretty sure neutrinos were “shown” to have mass by virtue of the fact that they change states and must experience time whereas something travelling at the speed of light cannot. I don’t know what the implications would be if neutrinos were found to be travelling faster still.

          1. As I understand it, many (possibly all) of the experiments used to measure neutrino mass look at the spectrum of electrons emitted in beta decay. If the neutrino has mass, the maximum energy the emitted electron can carry will fall short of the total decay energy by at least the neutrino’s rest-mass energy.

    3. “But I think we can rule out time travel, at least as popularly conceived.”

      That’s not saying much, since the popular conception of time itself is incoherent.

      A coherent theory of time travel combines general relativity with many-worlds QM to create wormholes into alternate pasts. You can’t change your own past (the one you remember experiencing); that would be incoherent. But you can go back and create a branch point from which an alternate chain of events will then unfold.

      1. Also: GR time machines aren’t vehicles; they’re regions of warped spacetime. The farthest you can go back is the time at which the warpage came into existence. So no visits to the Big Bang, unless you can find some naturally occurring time machine dating from that era.

        1. James Hogan wrote some entertaining stories playing with the idea of that type of time travel. The punchline is that, the millisecond your time-travel oraculum gets switched on, it gets hacked from the future. If it allowed transport instead of just information, the first thing through from the other side would be the tip of an irresistible invasion.


      2. I never did care for many-worlds, and it seems to get less plausible the more I think about it…but this probably isn’t the place for that discussion.

        But to stay on topic…if it were possible to create a wormhole into our past from another “universe,” some future alternate resource-starved civilization would have long since done so — and would have done so to all other such universes.

        Remember, growth is exponential unless limited. Time travel raises those limits dramatically, and many-worlds time travel would be even more dramatically unlimited. It wouldn’t take all that many generations before it wasn’t just fossil fuels that got exhausted, before the Dyson Sphere wasn’t enough, before a galactic version of a Dyson Sphere wasn’t enough…before entire universes weren’t enough.



          1. According to GR, no time travelers can reach the era before time machines were invented. There has to be a working machine at the receiving end for them to step out of.

        1. “…if it were possible to create a wormhole into our past from another ‘universe’…”

          It isn’t. You can’t just choose an arbitrary universe from among the many worlds and hook a wormhole up to it. You must build the wormhole first, and the act of doing so creates a branch point from which many possible futures follow. In some of those futures, time travelers emerge from the wormhole. In some, they don’t. (But in all cases in which they do, they’re not from “another” universe; they’re your own future selves, using the time machine you’ve just built for that purpose.) All physically possible futures exist; that’s the whole point of many worlds.

          1. It’s that last sentence that’s the problem. The set of physically possible universes is overwhelmingly dominated by unstable ones in which everything has already spontaneously decayed. The probability that one should just happen to find one’s self in that 0.00000000000…% of stable universes…well, clearly, the odds against such a proposition are 100% (with any form of rounding ever used in the observational sciences).

            In this case, the proportion of universes in which time travel is exploited will approach unity at an exponential rate the instant a universe with time travel emerges. So what’re the odds we should find ourselves in that minuscule fraction of “natural” universes as opposed to all those other ones infested with time travel? Total the number of potential universes, count the number of each, and you have your answer: roughly 0%



          2. If I understand you right, this is just the Fermi paradox in another guise: if the universe is teeming with exponentially-expanding technological life, why haven’t we been overrun by now? I don’t see how allowing the aliens time travel as part of their technological arsenal alters the equation much, or makes a particularly compelling case against time travel in particular (as distinct from any other technological enabler of exponential expansion).

          3. Yes, that’s basically it. But the difference is that we already know that there exists at least one technological “alien” species that would have a profound interest in exploiting time travel: our future selves. That they haven’t exploited us is proof positive that they can’t. That the universe itself hasn’t been exploited means that it’s not possible for any putative civilization in the entire hundred-billion-year potential practical lifespan of life in the universe.



          4. As I’ve already explained, a GR-based time machine must be built before anyone can emerge from it. All you’ve proved is that we don’t have such technology yet — which we already knew.

      3. First, many-worlds QM has nothing to do with wormhole time travel. And further, you can’t connect a wormhole to a MW that has decohered out of contact; that is, after all, what QM decoherence means (irreversible pathway).*

        Second, there are all sorts of no-go results on time travel, and wormholes happen to be covered by one of them. There is a very simple classical-mechanics paradox tied to them (essentially, the outgoing object collides with the ingoing one, preventing wormhole travel). I am sure I can find the reference if it is asked for.

        Other no-go’s are that time travel computing would collapse the algorithmic tower of complexity classes, so all of physics would be trivially simple. That is not observed.

        * This is because MW theory is realistic, so it cuts off a lot of BS (physics speak; it means bullshit) that is allowed in other QM theories.

        1. “you can’t connect a wormhole to a MW that has decohered out of contact”

          I’m not claiming you can. You connect the wormhole first, and then decoherence happens depending on what (if anything) comes out of it.

          As for paradoxes, David Deutsch makes what I found to be a pretty convincing argument (and one that agrees with my own prior intuition) that MW solves those. You can’t prevent yourself from entering the time machine because by the time you get there, you’re by definition already on a divergent history from the one that got you there.

          But as I said, I’m not a physicist, so it’s quite possible I’m not doing Deutsch’s argument justice here.

      1. You’re not thinking big enough.

        Assuming the many-worlds variation, inject enough antimatter into a primitive universe to cause antimatter to predominate instead of matter. Open a bridge with a matter universe, and you’ve got two universes worth of E=MC² energy at your disposal.

        Even without many-worlds, it’s still insane. The initial conditions of the universe would be highly mutable. Dope the plasma with some fancy machinery, take advantage of all the energy floating around, and have it pre-bake your habitats for you instead of messy galaxies.

        And do it before your rivals in the Andromeda Galaxy get the idea, or else you’ll have to engineer a solution that works even earlier than theirs does.

        I’m fond of the quote, “Time is nature’s way of keeping everything from happening at once.” Time travel means everything not only can happen at once, but it must happen at once. And we simply don’t observe that happening; ergo, no time travel.



        1. I am not sure why you tie this to Many World QM theory.

          Is it because of the idea above, that it somehow makes time travel hypotheses easier? It doesn’t; see my comment on that.

          MW theory is the realistic, parsimonious QM theory. (It needs two fewer axioms, because they can be combined.) It is the least mysterious and least assuming of them all.

          Most of the time one finds people attacking strawman versions of it. It is like evolution in that regard. :-/

          1. Yes, that’s why I combined the two.

            Here’s my problem with many-worlds.

            Assume a very simple universe with two radioactive atoms and nothing else. Assume the atoms have a half-life of 100 seconds, and that Planck Time for this universe is one second.

            Normally, after 100 seconds, we’d expect to see one of the atoms decayed and the other one not (on average, of course).

            With many-worlds, every possible outcome happens. So, after one second, we have the following set of universes:

            universe a:
            atom 1: not decayed
            atom 2: not decayed

            universe b:
            1: decayed
            2: not decayed

            universe c:
            1: not decayed
            2: decayed

            universe d:
            1: decayed
            2: decayed

            Once an atom has decayed, it, of course, cannot un-decay. So, let’s see what happens one second later, at the two second mark.

            universe a: splits four ways as above.

            universe b-a:
            1. decayed
            2. not decayed

            universe b-b:
            1. decayed
            2. decayed

            universe c-a:
            1. not decayed
            2. decayed

            universe c-b:
            1. decayed
            2. decayed

            universe d:
            1. decayed
            2. decayed

            So, two seconds after our experiment has started, we have one universe where neither atom has decayed, four universes where one atom has decayed, and four universes where both atoms have decayed. The chance of neither atom having decayed is only 11%; the chances of one or both atoms having decayed are equal, both at 44%. After only two seconds!

            I can’t be bothered to carry out the math for a hundred iterations, but it should already be obvious that the inevitable result is that all radioisotopes would decay at the exact same, insanely rapid rate.

            That doesn’t even remotely match observations.

            If you can explain to me how many-worlds can therefore be consistent with observations, you’ll be the first to do so….
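The branch-counting in this comment can be checked mechanically. Here is a minimal sketch of the naive equal-weight split the commenter describes (a toy model only, following the comment's own assumptions; the replies below argue this equal weighting is exactly where the model goes wrong):

```python
from collections import Counter
from itertools import product

# Naive equal-weight many-worlds toy model: each tick, every universe
# branches into all possible (decayed / not-decayed) outcomes for each
# still-undecayed atom, with every branch counted equally.
def tick(universes):
    branched = []
    for atoms in universes:  # atoms: tuple of booleans (True = decayed)
        options = [(True, False) if not decayed else (True,)
                   for decayed in atoms]
        branched.extend(product(*options))
    return branched

universes = [(False, False)]  # one universe, two undecayed atoms
for _ in range(2):            # two clock ticks
    universes = tick(universes)

counts = Counter(sum(atoms) for atoms in universes)
total = len(universes)        # 9 universes after two ticks
print({n: f"{c}/{total}" for n, c in sorted(counts.items())})
# {0: '1/9', 1: '4/9', 2: '4/9'} — the 11% / 44% / 44% split above
```

As the comment notes, these proportions depend only on the branching structure, not on the half-life, which is the observation the rest of the thread addresses.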



          2. Try it this way:

            At t=0, there are a gazillion identical universes containing your two undecayed particles.

            At t=1, in 98.6% of those universes, neither particle has decayed; in 0.7% of them, particle A has decayed; in another 0.7%, particle B has decayed, and in roughly 0.005%, both particles have decayed. All possibilities are represented, but not in equal proportions.

            If you balk at the idea of so many parallel universes, remember that the underlying reality is a superposition of probability amplitudes in a single universe. “Parallel universes” is just a convenient way of visualizing it.
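Those proportions follow directly from the stated half-life; a quick numerical check, assuming (as in the example upthread) a 100-second half-life and one-second ticks:

```python
# For a half-life of 100 s, the probability an atom survives one 1 s tick
# is 2**(-1/100); the four joint outcomes for two independent atoms then
# occur in very unequal proportions.
p_survive = 2 ** (-1 / 100)    # ~0.9931 per tick
p_decay = 1 - p_survive        # ~0.0069 per tick

both_survive = p_survive ** 2  # ~98.6%
only_a = p_decay * p_survive   # ~0.69%
only_b = p_survive * p_decay   # ~0.69%
both_decay = p_decay ** 2      # ~0.005% — far from a 1-in-4 chance

print(f"neither decays: {both_survive:.1%}")
print(f"A only: {only_a:.2%}, B only: {only_b:.2%}")
print(f"both decay: {both_decay:.4%}")
```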

          3. But that’s exactly why I reduced the example down to something so small.

            Pick any finite number of initial universes you like, with any finite number of particles. Look at just one of those universes. After Planck Time, that one universe will have split into exactly as many different universes as there are possible outcomes. Each of those other universes will have done the same. Now, add up all of those and you’ve got your probability spread.

            Imagining the spread of outcomes with numbers like “gazillions” is impossible for humans, which is why I reduced it down to two particles in one universe, a number we can generally manage to handle. Even then, the exponential growth quickly results in unmanageable numbers.

            Go ahead and start with two universes with two particles instead of one universe with two particles. All you’ve done is pick a second universe from the set above, and delayed the cascade by one generation. You haven’t done anything to change the fundamental problem.

            Think of it another way. Phi, the Golden Ratio, can famously be approximated by dividing any two sequential Fibonacci numbers. 8 / 5 = 1.6; 13 / 8 = 1.625; 21 / 13 = 1.615…; 7778742049 / 4807526976 = 1.6180339887….

            But you can pick any pair of numbers, add them together, divide the result by the larger of the pair, and approximate Phi that way. And it doesn’t take very many iterations before it converges to a precision that far exceeds that necessary for human-scale engineering.
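The claim that any starting pair works can be illustrated in a few lines (the seed values below are arbitrary):

```python
# Ratios of consecutive terms of any Fibonacci-like sequence (each term
# the sum of the previous two) converge to the golden ratio, regardless
# of the positive starting pair.
phi = (1 + 5 ** 0.5) / 2  # ~1.6180339887

a, b = 7.0, 113.0         # arbitrary starting pair
for _ in range(40):       # a few dozen iterations is plenty
    a, b = b, a + b       # next term = sum; ratio is (a + b) / b

print(f"ratio after 40 steps: {b / a:.10f}")
print(f"golden ratio:         {phi:.10f}")
```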

            That’s all you’re doing by mixing in all those gazillions of extra universes. Regardless of how many you start with, each one will split into every possible (not probable) universe, and the proportions of subsequent universes will, in just a few generations, result in all radionuclides decaying in the exact same proportions at the exact same insanely-rapid rate, regardless of their observed half-lives.

            And, remember, there are no hidden variables to control whether or not one set of particles does or doesn’t decay. Each tick of the clock, you get a complete set of all possible permutations from each universe.



          4. The point of the gazillion universes is to get rid of the notion of binary splitting that’s tripping you up. Instead of one universe that splits at each clock tick, imagine a (possibly infinite) set of initially identical universes that differentiate (without splitting) with each clock tick. In some universes the atom decays at that tick, and in some it doesn’t, in unequal proportions according to the probability amplitudes. The result is that the proportion of universes in which the atom has not yet decayed declines over time exactly according to its half-life, just like with a mass of identical atoms in one universe.

            Or if you insist on splitting, why limit yourself to a binary split? At each tick, split the universe a gazillion ways, 98.6% of which see no decay, 0.7% see atom A decay, and so on. The reason you’re not getting the right decay rates is because you’re discarding all information about probability amplitudes in enforcing your logically minimal split. (And again, the amplitude is the reality; “splitting the universe” is just a way of thinking about it.)
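This proportional bookkeeping can be verified numerically; a minimal sketch, keeping the thread's toy parameters (100 s half-life, 1 s ticks):

```python
# Evolve the *proportion* of universes in which a single atom has not yet
# decayed, weighting each tick by the survival probability rather than
# splitting branches equally.  The proportion tracks 2**(-t / half_life).
half_life = 100.0                  # seconds (toy value from the thread)
p_survive = 2 ** (-1 / half_life)  # survival probability per 1 s tick

undecayed = 1.0                    # proportion of universes at t = 0
for t in range(200):               # run for two half-lives
    undecayed *= p_survive

print(f"undecayed after 200 s: {undecayed:.4f}")  # 0.2500, i.e. (1/2)**2
```

In other words, once the branches carry their probability weights, the decay statistics across the ensemble of universes match the ordinary half-life exactly.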

          5. Hmmm…I think I see where you’re going with this.

            With my simplistic example above, after one Planck second we don’t get just four universes, but instead we get, say, about a thousand universes in which neither atom has decayed, and a half-dozen or so universes each for the other scenarios. Right?

            I can see how that would solve the math problems…but it doesn’t match my earlier understanding of how many-worlds is supposed to work. A universe will really split into multiple perfectly-identical universes (in addition to multiple not-so-identical universes)? Does the math actually say that? Somehow, it doesn’t even seem to make sense to say that the universe is splitting if nothing is changing.

            It also stinks to high heaven of needlessly multiplying entities, as Ockham might have complained.

            Not that reality has to comply with any sort of human notion of elegance, but experience suggests that something so needlessly complex as “gazillions” of identical universes constantly being manifested just has to be a kludge.



          6. Depends what you mean by “really split”. At bottom, there’s a universal wave function with different probability amplitudes for various configurations. We can interpret those amplitudes as sets of universes in various proportions, but that doesn’t have to mean that anything is “really splitting”. A “universe” in this sense is just a particular history seen by a particular configuration of a particular observer. But all of those histories and configurations are described by the time evolution of the single universal wave function.

            At least, that’s my (non-technical) understanding of it.

            And Ockham doesn’t care how many universes there are or how many ways they split; he cares about how many assumptions you have to make to explain it all.

          7. What Gregory said, plus…

            Your model of your simple universe is incomplete, if not unphysical. Where did it come from? What else must have been created?

            If you have atoms, you have at least photons, if not Ws and Zs. Oh — and dark matter and dark energy. What are they doing? Why is radioactive decay the only thing that will split universes? Don’t forget vacuum fluctuations either.

            What happens to the decay products? You will also have an electron and a neutrino, just a neutrino (and an excited decayed atom), an alpha-particle, a proton or a neutron depending on the kind of decay. (Maybe doubled.) What is that/are those particle/s now doing? (You’ve even got the possibility of chemistry.)

            A double-decayed universe now has even more things going on in it that can lead to further splitting. It’s not a “dead end”.


          8. Oh — even worse: If you have radioactive atoms in the first place, you must have had a star, and sufficient numbers of atoms to have come from a supernova, and of all (? – most) atomic numbers less than your radioactive atoms. Suddenly it’s a very busy place.


  4. Wouldn’t it be wonderful if claims of “religious truth” were subject to the same strictures?

    “Vatican theologians today reported research that suggests the figure of Jesus, best known from the book The Bible, may in fact have been a half-god sent by the creator of the universe to expiate humanity’s inherent sinfulness. The lead researcher on the project , Fr. Sarducci of the Vatican’s Institute for Applied Soteriology, explained that ‘If this result is true, it would have tremendous implications for our understanding of the nature of humanity. Of course, much work will still need to be done in various other labs to see if this result holds up.’ Others in the field point to conflicting findings made at the Hebraic Center for High-Energy Torah Studies and the Mohammad Fundamentalist Particle Laboratory.”

  5. Since the emitter and the receiver were so close (for particles moving at essentially light speed), it seems the chance of error is pretty high. And no paper has been published yet, from my understanding. I suspect a mistake.

    Phil Plait pointed out that, if this were normal neutrino behavior, we would have seen the neutrinos from the recent supernova before we saw the visible light. We didn’t. That was a much greater distance, and thereby would have given us a better measurement.

    1. Victor, technical paper found here:

      Contra Phil Plait, Sean Carroll ruminates on ‘if it is actually true, then’ scenarios by noting that supernova neutrinos are electron neutrinos, which weren’t used in this experiment. These are muon neutrinos, which are substantially more energetic, and if they are actually breaking the light barrier, it would make sense that they do so in some energy-dependent fashion. Further, he writes that thought has already been given to how this might be. Say, Lorentz invariance violation, on which he wrote in 2008.

      Of course, Sean, like the rest of the world, is also pointing out that this is almost certainly a measurement problem, and not particles exceeding the speed of light.

  6. “Wouldn’t it be wonderful if claims of “religious truth” were subject to the same strictures?”

    What if they just claim that God made it go faster? Save them all that money and effort involved in verification.

  7. This virtual particle idea sounds good to me. Photons can spontaneously decay into an electron/positron pair which then annihilates, reproducing the “original” photon. I don’t see why the pair could not interact with other photons (absorb then re-emit) in the interim. The absorb/re-emit process is one way light is slowed in a transparent medium.

    Surely this effect, if it is possible, would have been calculated somewhere in QED. I’m off to ask a physicist…

  8. This is a good time to remember that it sometimes takes theory a while to catch up with apparent irregularities in reality. Magnetism was pretty much a mystery until the principles of electromagnetism were worked out; Newtonian physics was starting to get a little creaky until relativity stepped in (I’m thinking here of Mercury’s orbital precession), and quantum mechanics accounted for things that relativity found a bit hinky.

    First, verification. Next, discover the proper explanation. Understanding electromagnetism makes most of our world possible (think of all the things that need electromagnets) – makes you wonder what applications could come out of this.

  9. This recalls a remark from a physics professor my roommate once related to me. He was using optical traps to super-cool atoms, something relying heavily on the standard model. When asked what he thought his chances were, he said there were two possible outcomes, “It will work, or we’ll win a Nobel Prize.”

  10. The best news I’ve heard all week, particularly if it turns out not to be an error. I wouldn’t be surprised, however, if Alvaro de Rújula’s “If it is true, then we truly haven’t understood anything about anything” is quote mined and used against him.

  11. 60 nanoseconds is a huge time difference. Light travels 18 meters in that time, a value much larger than the remaining uncertainties after correcting for tidal effects at each site.

    Neutrinos are weird, and only recently did we find that they have mass. We’ve detected supernova neutrinos, and those arrived at the same time as the photons. If this effect held for them, they’d have arrived about 4 years before. So I have trouble attributing this to a faster-than-light effect. There is a mistake somewhere.
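For what it's worth, both figures in this comment hold up; a quick check, assuming the ~730 km CERN–Gran Sasso baseline and SN 1987A's ~168,000-light-year distance (both values from outside this post):

```python
# 60 ns at light speed is about 18 m, and the implied ~2.5e-5 fractional
# speed excess, applied over the ~168,000-light-year distance of SN 1987A,
# would put neutrinos roughly four years ahead of the photons.
c = 299_792_458.0                      # speed of light, m/s
slack = 60e-9 * c                      # distance light covers in 60 ns
print(f"60 ns of light travel: {slack:.1f} m")  # ~18 m

baseline_m = 730_000.0                 # approx. CERN-to-Gran Sasso distance
light_time = baseline_m / c
excess = 60e-9 / (light_time - 60e-9)  # fractional excess (v - c)/c

distance_ly = 168_000.0                # SN 1987A distance, light-years
lead_years = distance_ly * excess      # head start, to first order
print(f"fractional excess: {excess:.1e}; supernova lead: {lead_years:.1f} years")
```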

    1. I’m looking at the paper now. If there’s an error, I’m buggered if I can see where it is. I will wait to see if the results are reproducible or if somebody brighter than me can find what went wrong.

  12. I’m not a physicist, but to my knowledge the idea that neutrinos might be tachyons has been around at least since the 1980s. Tachyons (if they exist) are not inconsistent with special relativity, which says that you can’t accelerate a massive particle to the speed of light, since that would require infinite energy. However, a hypothetical particle with imaginary mass must travel faster than light for its energy to be real.

    Nor does this “open up” the possibility of time travel. That possibility has been open since 1949, when Gödel found a solution to Einstein’s equations of gravity (aka general relativity) that allows closed timelike loops. Additional general-relativity time-travel solutions were found by Tipler in the ’70s and Thorne in the ’80s (if memory serves).

  13. Yeah! I know how much most physicists HATE, HATE, HATE, HATE, HATE, HATE, HATE, HATE, HATE, HATE, HATE faster than light travel! But if this turns out to be correct, it could mean the creation of faster than light radio communications. Just like in that episode of Space: 1999 where they made contact with Earth in the year 2120 using neutrino transmissions!

    1. Presumably before ‘Space: 1999’, Gregory Benford’s novel ‘Timescape’ made use of neutrino communication (and multiple-worlds).

      My thoughts on this neutrino result as a vertebrate palaeontologist: supposing I were to find a fossil rabbit in Devonian rocks, I’d be looking pretty closely at the idea of ‘intrusive burial’, but either way it would make for a pretty neat publication.

  14. Isn’t it more likely that the scientists were slightly wrong about the distance? 60 nanoseconds sounds well within the margin of error.

    1. According to the paper, the distance measurements were made via GPS apparatus whose accuracy had been verified to ±2 cm by two independent tests by two independent agencies. That has been taken into account in the paper.

  15. Without reading everyone’s comments, I had wondered if there were tectonic forces at work, as Italy is still being pushed north by the African plate, is it not? But I suppose they would have taken those into account.

    1. I would think that wouldn’t be such a large error, given that much more mundane things like the length of the measuring apparatus (18m) are taken into account.

  16. When they get it wrong, they just ignore it: Jesus went on to say, “I tell you the truth, some standing here right now will not die before they see the Kingdom of God arrive in great power!”

    Mark 9:1

    2000 years of wrong, including Jesus himself, and we still have them saying the Kingdom is soon to come. I’m like, just give it up. You’re wrong. You’re wrong. You’re wrong.

    But, since religion isn’t about self-correcting truth, but about pandering to and bilking fools, there’s no real reason to correct one’s errors. Just pretend the plain text means something else…

  17. When I read about the “Fifth Dimension” I did some astrological investigation and here’s what I found.

    The moon is in the Seventh House and Jupiter is aligned with Mars. Peace is guiding the planets …

    And love will steer the stars.

    I tell you, this research is the Dawning of the Age of Aquarius.

    Peace out.

  18. “Wouldn’t it be wonderful if claims of “religious truth” were subject to the same strictures?”

    Stricter scripture stricture?

    I affirm a firm affirmative on that.

  19. Well, it’s probably best to wait until the experiment is actually repeated again and again before wildly speculating on its implications, because it probably won’t be too long before out-of-control speculation becomes some sort of theology.

    1. That might take some time. According to Huffpo, only Fermilab and a similar experiment in Japan can currently do it. Fermilab’s equipment is not as accurate, and the Japan experiment was slowed by the recent earthquake/tsunami.

      1. For years, there have been proposals to update neutrino experiments at other locations to do high-precision speed measurements. Perhaps this will push the funding agencies to support the idea. If the next generation of experiments is done, it will be quite a bit more precise than the OPERA measurement.

    2. This “experiment” wasn’t a discrete thing; it was the collection of ~15,000 neutrino events from a beam that fired pretty much continuously over several years. So repetition may not be the best conceptual check to apply.

      Also note that this detector was the one used to detect neutrinos from supernova 1987a, which is currently the experiment mentioned as refuting it. So they kinda already did their own internal check with another source.

      I’m not saying there aren’t statistical or systematic errors; there very well could be (I personally think it’s going to be found to be an error). But the problem is probably not at the “today our machine was wonky” level.

      1. That’s right, and presumably there’s similar data for Fermilab and the Japanese collaboration (I don’t know the name, but there’s probably a reference in the paper). If the Fermilab system is not upgraded, that leaves Japan with their earthquake related setbacks.

        As for supernova neutrinos, it has already been pointed out that that may not be a good comparison, given that they are low energy electron neutrinos and not high energy muon neutrinos used in OPERA/CNGS. Neutrinos come in 3 flavours.

      2. The reason to ask for a repeat is not to see if the physics or the experiment is repeatable, but to eliminate errors, both accidental and systematic. See my comment below for some examples of potential ones.

  20. “A rather complete theoretical structure has been shattered at the base and we are not sure how the pieces will be put together.”

    I.I. Rabi following the announcement of the first experimental evidence for parity violation.

    1. This is a rather different case. With parity violation, nobody had ever thought to test parity invariance in the weak interactions until Lee and Yang. When their suggested experiments were done, the parity violation was huge. On the contrary, people have been doing Lorentz tests (including with neutrinos) for some time, with no real evidence of it uncovered.

      1. Yes, it’s certainly different, and if it turned out that it really was not a result of an error, I think it would be far more disruptive than parity violation.

      2. By the way, I remember reading that before Lee and Yang had submitted their paper, a Russian graduate student of Lev Landau had the same idea. When the student put it in front of him, Landau said something like “Quatsch” (German for “nonsense”) or “pathology” and told him not to pursue it.

  21. So … has everybody caught where they goofed yet?*

    It is an easy one. According to the paper, the distance-measurement procedure uses the geodetic distance in the ETRF2000 (ITRF2000) system as given by some standard routine. The European GPS ITRF2000 system is used for geodesy, navigation, et cetera, and is conveniently based on the geoid.

    I get the difference between measuring distance along a perfect sphere of Earth radius (roughly the geoid) and measuring the actual distance of travel (for neutrinos, the chord through the Earth) as 22 m over 730 km. A near-light-speed beam would appear to arrive ~60 ns early, give or take.

    Of course, they have had a whole team on this for 2 years, so it is unlikely they goofed. But it is at least possible. I read the paper, and I don’t see the explicit conversion between the geodesic distance and the travel distance anywhere.

    Unfortunately, the technical details of the system and of the routine used to derive distance from position are too much to check quickly. But the difference is a curious coincidence with the discrepancy against well-established relativity.

    * Extraordinary claims need extraordinary evidence. Other outstanding concerns are:

    1. This needs to be repeated.

    2. It is not a clear photon-vs-neutrino race. Physicist Ellis and others here noted that the time differential for the supernova SN 1987A was a few hours, but at its distance of ~170,000 ly it should have been years if the suggested hypothesis were correct.

    3. Analogous to the experiments where light waves seemingly travel faster than the vacuum speed of light, they don’t measure travel times of individual neutrinos but averages over a signal envelope. That must be measured carefully before one can establish that particles (or information, for that matter) travel faster than relativity allows.

    Especially since the neutrino beam oscillates between different kinds of particles!

    1. There are numerous other issues as well, including:

      1. profile of the particle beam as it’s gated

      2. distribution of the lifetime of particles decaying into neutrinos

      3. reliability of time synchronization of events happening at such distances (if someone says they used NTP or got their time via GPS I’ll fall over laughing – some serious work needs to go into the time stuff)

      4. timing of signals through the instrumentation

      #2 is fairly well known, so I doubt it’s an issue. I’m not familiar with the design of the beam mechanism, so I can’t comment on #1. #3 and #4 require painstaking work, but I think one of the biggest issues is determining the physical distance.
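      As a rough illustration of why the timing chain matters so much, here is a back-of-the-envelope Python sketch (assuming the quoted ~730 km baseline; this is not the collaboration’s analysis) converting a net timing offset into an apparent fractional speed excess:

```python
# How a net timing offset translates into an apparent speed excess
# over the CERN -> Gran Sasso baseline (~730 km, assumed).
C = 299_792_458.0              # speed of light, m/s
baseline_m = 730e3             # baseline, metres

t_light = baseline_m / C       # light travel time over the baseline
offset_s = 60e-9               # net early-arrival / sync error, 60 ns
excess = offset_s / t_light    # apparent (v - c)/c

print(f"light travel time: {t_light * 1e3:.3f} ms")   # ~2.435 ms
print(f"apparent speed excess: {excess:.2e}")         # ~2.46e-05, i.e. ~0.0025%
```

      At this baseline, every unaccounted nanosecond of delay shifts the apparent speed by about 4 × 10⁻⁷ of c, so the entire 60 ns effect is within reach of subtle, systematic instrumentation delays.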

  22. “We don’t allow FTL neutrinos here”, said the barman. A neutrino walks into a bar.

    [HT Miscience]

    though “action at a distance,” as posited by Bell’s inequality and supported by some experiments, suggest that particle interactions (but not information) can exceed light speed.

    Excuse me? The very idea behind the inequality and its test is that relativity is taken to be preserved. So causality is preserved too, naturally. All physical observations and theories to date are causal.

    To suggest otherwise is not to have done due diligence. In biological terms it would be tantamount to claiming that microevolution happens but macroevolution is forbidden.

    The outcome of the Bell tests is consistent with the quantum prediction and shows that there are no local hidden variables. Hence relativity and quantum mechanics are compatible, and in particular quantum systems show entanglement.

  23. I’ve commented elsewhere that I need to see much more information before I’m even convinced that this is a novel observation. Faster than the speed of light in vacuum? Extremely unlikely. It’s far more likely that the experimenters have not properly accounted for something. Keep in mind that every inch of wire used in the instrumentation is of vital importance for the timing involved in this claim. The astrophysicists are sneering because neutrino events were observed within hours of the optical observation of a supernova, giving an unbelievably good measure of the speed of neutrinos vs. light, and those supernova observations are seriously at odds with this claim from OPERA.

  24. The blokes that have the time travel have already corrected this little leak……I can feel myself forgetting as I write….
    They were just adjusting things when the stock market shuddered (coincidence?)

    Well, how else do you think the collider was funded?

  25. Shades of Einstein’s doubters when his Special Theory of Relativity was published and Max Planck (through an aide, as I recall) verified his theory. (If memory serves.)

    Fortunately, there are enough Planck equivalents and the Internet to keep updates current.

  26. I think you need to brush up on the concept behind wormholes; extra dimensions do not mean wormholes. But my physics is basic, so perhaps I’m wrong. Although the use of wormholes is an interesting option, there are others.

  27. Has to, has to, has to be repeatable.

    That said, it’s a lot more fun to speculate on it being true, because, let’s face it, it really hasn’t been all that exciting in… a long time 😉

    The ‘time travel’ speculations seem weird. Faster-than-light might get past someone’s light cone, but it’s not going to go back beyond their ‘Minkowski ice’ as it were. The light cones are not bending, and it may be ‘in your past’ in GR terms but no number of FTL technologies like ansibles, warp drive, what have you will ever go even a minute into your current x,y,z,t past.

    One intriguing thing that would come out of a verifiable result would be some possible way to introduce a mechanism back into quantum non-locality. There can be no hidden variables theory that is not also non-local, and we have taken that to mean no hidden variables due to the sensible light speed limit.

    It might put neutrinos into a more interesting position in physics as well. Imagine if they had a place as a carrier?

    1. I remember my dad’s story – lol. My father and I were amazed to see it published in “Countdown to Midnight” as he was never consulted about his story’s publication!

      1. I have it in an old short-short story anthology titled Great Science Fiction By Scientists. I first read it in high school, and I gave a presentation on the theory of the neutrino bomb in my physics class, and the teacher never caught on.

        1. When the story was originally published in the Los Alamos Scientific Laboratory News, July 13, 1961, only the New York Times called to check out the real scoop, and they did write an article. The Bulletin of the Atomic Scientists asked my dad to write an official paper and give a talk on this bomb, which most people thought the Russians really had developed. My parents still get a kick out of the memories this satire produced.

  28. Light travels a whopping 30 cm in 1 nanosecond. The suggestion that neutrinos ‘exceeded the velocity of light by 1 nanosecond’ thus claims the race would be won by neutrinos by a massive 30 cm if the two were run at the same time in the same race.
    The short time required by light to travel 1 mm, the admitted precision error in the known distance of the race, is a mere 3 trillionths of a second.
    During daylight hours, as the earth speeds around the sun in its orbit, Eastern locations on earth race in front of Western locations, while at night locations to the West lead geographic locations that are to the East. Light does not participate in such motions as does physical matter, and instead travels a straight path from a source that must be intercepted later by a moving sensor at the position of intersection. The calculation of the actual time required by a light photon to traverse from CERN to OPERA is complex and depends on the time of day, among other factors. These considerations were not made in the published work.
    A better experiment would be to run the neutrinos and a light beam in concert at the same time to see who actually wins the race.
    Einstein wrote in 1905 that light speed (in the propagation direction) is fixed at c (from Maxwell), BUT the relative velocity between a beam front and a moving detector is either c – v or c + v when a detector moves away from, or toward, the light respectively. Since the earth orbits at 65,000 miles per hour around the sun, it is necessary to apply this correction in any experiment using a one-way direction of travel. The time required for a light photon to travel the CERN tube during the day, when the OPERA detector speeds away from the light, is longer than the time required by a light photon to travel through the tube at night, when the detector would be traveling in the direction opposite that of the photon. Albert Michelson performed his speed-of-light measurement at Mt. Baldy with a round-trip measurement that avoided this problem.
    Light does not pick up extra forward speed from a forward moving source, but matter with mass does so. The formula for the exact speed of light is known from the Calculus performed by Maxwell but can only be assigned a numeric value that is limited by the precision of instruments that measure the properties of the medium in which light propagates. Light speed is incredibly fast only because light has no mass–it is an EM field. Objects with mass cannot attain the speed of light. Although Einstein was wrong on many counts with aspects of special relativity (i.e. ‘time dilation’) the one thing he was correct about was that light speed cannot be exceeded by objects that have mass.
    Richard Sauerheber, Ph.D.

  29. We are all assuming that Einstein was 100 percent right. To me, time travel sounds about as fantastic as some say the concept of God is. There is so much that we don’t know about the universe. It’s funny to me that people are just discounting faster-than-light because of Einstein’s theory. How many theories have been disproved due to new information? We need to look at all experiments with an open mind.

    1. Every time you use GPS to discover your location, you are performing an experiment that confirms Einstein was right. Every time the almanac correctly predicts Mercury’s rise and set times, Einstein has been vindicated. One of the most carefully designed and rigorous experiments in all of human history, Gravity Probe B, had as its sole purpose to see if Einstein was right, and it found no flaw.

      If confirmed, the neutrino experiment will have significance similar to that of Mercury’s orbit. It’s not impossible, but it’s so unlikely that it really should be the last thing you should be betting your money on.

      One thing it won’t do is lead to time travel. We have lots of other ways of knowing that time travel isn’t possible — to continue the analogy, at this point, a discovery of time travel would be akin to discovering a species of apple tree whose fruit falls up. The universe would be a much different place if such were possible — not even remotely recognizable as the place we know and love.



  30. In my opinion, some particle or something else can exceed the velocity of light. So I’m standing with CERN. If you have the same opinion as me, please send your findings on this topic to my email.
