This Week’s Hype

It had to happen. New Scientist managed to find a physicist willing to describe the OPERA result as “evidence for string theory”:

So if OPERA’s results hold up, they could provide support for the existence of sterile neutrinos, extra dimensions and perhaps string theory. Such theories could also explain why gravity is so weak compared with the other fundamental forces. The theoretical particles that mediate gravity, known as gravitons, may also be closed loops of string that leak off into the bulk. “If, in the end, nobody sees anything wrong and other people reproduce OPERA’s results, then I think it’s evidence for string theory, in that string theory is what makes extra dimensions credible in the first place,” Weiler says.

Update: hep-ph is chock-a-block with papers purporting to explain the OPERA results, using theoretical models of varying degrees of absurdity. There is, however, one much more sensible paper this evening, from Cohen and Glashow, which points out that superluminal neutrinos would produce electron-positron pairs via bremsstrahlung and lose energy, which is not observed. The result is also incompatible with Super-Kamiokande and IceCube data. No matter what sort of extra dimensions you introduce for the neutrinos to travel in, the OPERA claim seems to be in violent disagreement with other observations.


69 Responses to This Week’s Hype

  1. Ben Martin says:

    I remember when New Scientist was a decent, reputable publication. This makes me feel very, very old.

  2. Peter Woit says:

    Ben Martin,

    And I can remember when the only people the press could find to make absurd claims like this were obvious crackpots, not respectable physicists…

  3. Bernhard says:

    It was easy to see this one coming. String theory is the explanation for anything new, right or wrong.

  4. Proudmemberofthecult says:

    Compared to the current flood of papers on hep-th, hep-ph and gr-qc with the word superluminal, this is but a small trickle. Sometimes it seems physicists are like locusts in search of food.

  5. MathPhys says:

    When I posted that link to Gubser’s paper, I thought it was the only one on the subject. How silly of me.

  6. Phil says:

    I’m more sorrowful about the decline of Scientific American… as I remember it, New Scientist was always sensationalist, albeit often in an entertaining sort of way. Anyhow, I wouldn’t beat them up too badly for this – if the OPERA results hold up, it will point to something very unexpected and bizarre. Can’t argue with that. Given how unlikely it is they’ll hold up, it’s admittedly a surreal discussion…

  7. Misha says:

    Who is this Weiler, after all? Is he a reputable physicist? I have not heard of him. The superluminal hype will inevitably continue for some time, since many theorists are indeed hungry and in search of ideas, but then it will fizzle out …

  8. I wonder ... says:

    If string theory explains the OPERA result, why didn’t any string theorist point to their work before the result came out?

  9. chris says:

    so… does that mean that when the OPERA result doesn’t hold, string theory is falsified?

  10. MathPhys says:

    Misha, I expect that Tom Weiler hasn’t heard of you either.

  11. Yatima says:

    “Neutrinos Faster Than Light, or Artifacts of the FPGA?: Comments About the FPGA Platform Used in the Data Acquisition System of the OPERA Experiment”

    After devouring everything I could read about the experiment, I speculate (“gut feeling”) that the explanation of these unexplainable numbers is variable timing delays introduced by the FPGA-based data acquisition system (DAQ), for the reasons stated below.

    I guess that’s a possible explanation.

  12. Shantanu says:

    Tom Weiler is a very respected physicist at Vanderbilt who has worked on neutrinos, including astrophysical aspects.

  13. Noah Smith says:

    Hi Peter, this may only be on-topic insofar as the hype occurred this week. I’m curious how you interpret this press release. It seems to almost ignore the possibility that there is no Higgs.

    http://www.spacedaily.com/reports/Could_the_Higgs_boson_explain_the_size_of_the_Universe_999.html

  14. Peter Woit says:

    Noah,

    I took a very quick look at their paper. It looks like they’re trying to put together the Higgs field with another scalar field (the “dilaton”), and get an underlying scale-invariant theory with known mass scales picked out by dynamical symmetry breaking. This certainly does require a conventional Higgs field, so if no Higgs shows up at the LHC, that rules this idea out. Like most ideas, it’s quite speculative, and not particularly convincing. They are trying to use it to make predictions about inflation; maybe this goes somewhere, but to evaluate that you need a cosmologist, not me. It doesn’t say anything new about LHC-scale physics.

    The story involves just the usual amount of hype always there whenever theorists try and promote ideas they’re working on. I suppose it’s worth noting sociologically that maybe now it’s considered much more promising to relate one’s ideas about a dilaton field in cosmology to the Higgs than to string theory…

  15. Noah Smith says:

    Thanks Peter, the sociology seems interesting. When I see this type of press release I wonder about the types of people involved. Last week’s neutrino news felt like a similar case. I wonder if I’m just noticing these things more often, or if the frequency is actually increasing… :)

  16. Peter Woit says:

    Noah,

    The Higgs/dilaton thing is something very common, an isolated story about some speculative theoretical idea, one that isn’t getting a lot of attention from anyone except the authors.

    The neutrino thing is quite different, there a major experimental collaboration is releasing an experimental result claiming to overturn fundamental principles of physics. If this were solid, it would be huge, and deserve the attention it got in the press. The problem here though is that it’s a very implausible result, and not that solid. Many experiments at one time or another appear to be giving exciting data that violates fundamental principles, but this is essentially always a mistake in the experiment. So experimentalists typically are more wary than the OPERA people seem to have been about going public, since the likely end is that this will be shown to be mistaken, in which case the fewer people who see your mistake, the better…

    Once the experimental result is out there, I guess it’s not surprising that lots of theorists try and jump on the bandwagon and get attention for speculative ideas of their own. That anyone these days puts out a story about “evidence for string theory” with a straight face is perhaps a bit surprising.

  17. Chris Austin says:

    This is the list of Weiler’s articles on arXiv.

  18. Daniel says:

    There seem to be a number of scientists coming up with a posteriori explanations of the OPERA results, but that’s a little too easy imho. Too many theories can be retrofitted.

    My question is, are there any theories that predicted superluminal speeds, and in particular, superluminal neutrinos?

  19. Bernhard says:

    Daniel,

    Perhaps this might be of interest to you:

    http://www.sciencedirect.com/science/article/pii/0370269385904605

  20. Coin says:

    Yatima: … hm. Having worked with FPGAs in the commercial engineering space, I would be *really* surprised if this turned out to be the problem, just because this is the sort of thing you’d normally expect any competent FPGA engineer to get right. FPGA engineers are usually (basically have to be) highly sensitive to timing issues, and it would seem that the kinds of (highly plausible) error the blogger proposes would be very easy to test and calibrate for without requiring actual neutrinos in the system. I’m sensitive to the argument that an experiment run by physicists might not think to worry about some of these issues even where they would be the first thing engineers would think of, but I’d assume that if an experiment like this is hiring someone to design this sort of thing for them, then that person is basically being paid to think of these sorts of things…

    I mean, I do think the blog article author raises some excellent questions that deserve answers but I’m not going to expect this to be the source of error.

  21. Daniel L. Burnstein says:

    Thanks Bernhard. Very interesting indeed.

  22. M says:

    Notice that already http://arxiv.org/abs/1109.5682 had an inconsistency argument against the OPERA anomaly:
    “We point out that, quite generically, electroweak quantum corrections transfer the information of superluminal neutrino properties into Lorentz violations in the electron and muon sector, in apparent conflict with experimental data.”

    The argument from Cohen and Glashow is probably cleaner.

  23. Shantanu says:

    Peter, this may be off-topic, but what happened to the unparticle idea of Georgi from a few years ago? Is it completely ruled out?
    Thanks

  24. Actually, one can in hindsight say that the results were expected:
    http://www.science20.com/alpha_meme/expect_tachyonic_neutrinos_have_their_higgs_and_smoke_it-83157
    Anyway, why so negative?

  25. Anon says:

    One thing that bothers me is that if neutrinos are tachyons, why should all extant measurements of their speed give a result so close to the speed of light, instead of being distributed over all possible velocities on their purported tachyonic mass hyperboloid? In an experiment like this one, maybe there is a kinematic reason, but what about the supernova observations? Would there also be a kinematic reason?

  26. somebody says:

    Me and none of my string theory colleagues have ever heard of this Weiler character. There are plenty of fossils with tenure in various universities. Not suggesting that Weiler is one such, but it also doesn’t mean that his opinions are the opinions of the string “community”.

    Gubser’s paper in fact says that it is not easy to explain superluminal neutrinos even with extra dimensions, if you only have ordinary matter. Anybody who wants some attention from the media just has to make some statement (positive or negative) about string theory. Shrug.

  27. J says:

    There is actually a paper focusing on the experimental issues

    http://arxiv.org/abs/1109.6160

    which seems to suggest “the effect of the synchronisation convention is not properly taken into account in the OPERA analysis and may well invalidate their interpretation of superluminal neutrino velocity”.

  28. M. Wang says:

    Thanks for the link to Cohen & Glashow. It is most pertinent. Now theorists can conclusively say that the OPERA result has to be wrong.

    Too bad that Sidney is no longer with us, but I am glad to see that Sheldon is there to cut through the clutter in 3 simple pages. They don’t seem to make physicists like them any more.

  29. DPB says:

    @M. Wang

    While I agree that superluminal neutrinos are incredibly unlikely, I have to disagree on one major point. It isn’t the job of theorists to conclusively say that experimentalists are wrong. Experiments must always have the final say, eventually.

  30. abbyyorker says:

    DPB

    According to Glashow and Cohen, the experimental evidence IS there, but the experimentalists could not parse it. Not dumping on them, but theorists have a bigger role than you imply.

  31. high school physics teacher says:

    Sir,
    Having Glashow’s tour de force “refutation” sit under the main heading of This Week’s Hype seems a bit amiss. I’m just saying…

  32. MathPhys says:

    “Me and none of my string theory colleagues have ever heard of this Weiler character. “

    If your string theory is as good as your grammar, I don’t think my colleagues and I will ever hear of you.

  33. D R Lunsford says:

    Glashow seems to keep a pretty low profile, probably because he’s just a nice man and doesn’t want to get involved in controversies. In this case, I get the strong feeling that “enough was enough”, and the world’s great expert on neutrinos had to say something correct 🙂

    -drl

  34. Shantanu says:

    It’s also nice to see 3 of my papers cited in Cohen and Glashow (refs. 8 to 10) 🙂

  35. Aaron Sheldon says:

    As a statistician, having reviewed the OPERA paper, I find the analysis seriously flawed. They fitted only a single MLE for a time offset against the empirical proton distribution; that is, they fitted assuming all the neutrinos have a uniform time of flight. They should have fit additional MLEs for the dispersal in neutrino velocities. This oversight should have sunk the paper out of the gate, as it makes the numerical analysis fatally flawed. It is very likely that including MLEs for velocity dispersal would have reduced the time-offset MLE to within the range of the systematic uncertainties and biases.

    Overall it is a bad sign for the quality of editorial statistical review in the literature.

  36. Aaron Sheldon says:

    Just to be clear about how the dispersal MLE would work: one would fit the parameters of a normal distribution convolved with the empirical proton distribution, where the mean is the time offset and the standard deviation is the velocity dispersal along the flight path.
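
    For concreteness, here is a minimal toy sketch of the two fits being described (Python with numpy/scipy; the smooth stand-in waveform, grid, and simulated event times are all hypothetical, not OPERA’s data or code):

        # Toy sketch (not OPERA's analysis code): fit a shift-only MLE and a
        # shift-plus-Gaussian-dispersion MLE to simulated arrival times, then
        # compare the two nested hypotheses with a likelihood ratio.
        import numpy as np
        from scipy.optimize import minimize, minimize_scalar
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        # Hypothetical smooth stand-in for the measured proton waveform.
        t_grid = np.linspace(-2000.0, 14000.0, 4001)   # ns; spacing 4 ns
        dt = t_grid[1] - t_grid[0]
        proton_pdf = norm.pdf(t_grid, loc=5250.0, scale=2000.0)
        proton_pdf /= proton_pdf.sum() * dt            # normalize to a density

        def shifted_pdf(theta):
            """Hypothesis 1: every neutrino arrives exactly theta ns early."""
            return np.interp(t_grid + theta, t_grid, proton_pdf)

        def convolved_pdf(mu, sigma):
            """Hypothesis 2: shift mu plus Gaussian time dispersion sigma."""
            kernel = norm.pdf(t_grid - t_grid.mean(), scale=max(sigma, dt))
            dens = np.convolve(shifted_pdf(mu), kernel, mode="same") * dt
            return dens / (dens.sum() * dt)

        def nll(dens, events):
            """Negative log-likelihood of the observed arrival times."""
            p = np.interp(events, t_grid, dens)
            return -np.sum(np.log(np.clip(p, 1e-300, None)))

        # Simulated arrival times: drawn from the waveform, shifted 60 ns early.
        events = rng.choice(t_grid, size=16000, p=proton_pdf / proton_pdf.sum()) - 60.0

        fit1 = minimize_scalar(lambda th: nll(shifted_pdf(th), events),
                               bounds=(0.0, 200.0), method="bounded")
        fit2 = minimize(lambda x: nll(convolved_pdf(*x), events), x0=[50.0, 20.0],
                        bounds=[(0.0, 200.0), (1.0, 500.0)])

        # Hypothesis 1 is nested in hypothesis 2 (sigma -> 0), so the statistic
        # 2*(nll1 - nll2) can be referred to a chi-squared table (with the usual
        # boundary-of-parameter-space caveat).
        print(f"shift-only: {fit1.x:.1f} ns; shift+dispersion: {fit2.x}; "
              f"LR = {2 * (fit1.fun - fit2.fun):.2f}")

    The point of the sketch is only the structure: the shift-only model is the sigma → 0 limit of the convolution model, which is what makes the likelihood ratio comparison at the end legitimate.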

  37. doctor physics says:

    Would I be far off in saying that this seems to be not-even-wrong theories seeking out faulty experimental results as life-blood?

  38. David Nataf says:

    Aaron,

    Would you care to elaborate on why that would matter?

    Are the mean and variance of a Gaussian not supposed to be uncorrelated?

  39. Aaron Sheldon says:

    This is basically the opposite of overfitting: they are underfitting the data. In the OPERA article they assumed they knew more about the velocity distribution than they actually did. If you assume you know less about how the velocity is distributed, then your uncertainty in the time offset is going to be much greater. Really, they assumed there is much less variance (and covariance) in the data than there actually is.

    The dirty little secret of statistics is that p-values and sigma counts are really just indicators of the likelihood of convergence of the MLE to a particular value, not of the validity of the model. If you choose the wrong distributions to fit to the data, the MLE will in most cases still converge (as long as the Kullback-Leibler divergence of the fitted distribution against the actual, unknown, distribution is finite), but the answer will not be meaningful.

    The best test would be to use the likelihood ratio test to compare the convolution-with-the-normal hypothesis to the linear-shift hypothesis, because the linear shift is a sub-space of the convolution hypothesis (normal with zero variance, i.e. the Dirac linear-shift functional).

    It will take a solid week to work out the differences formally, but basically it comes down to working out a couple of theorems like these:

    1. Given a family of probability density functions f(x - \theta) parameterized by \theta, if we find the MLE of the family of probability density functions parameterized by a convolution with the normal distribution, n\left(\frac{x-\mu}{\sigma}\right) \star f(x), then the distributions of the MLEs of \mu and \sigma are…

    2. (dual) Given a family of probability density functions parameterized by a convolution with the normal distribution, n\left(\frac{x-\mu}{\sigma}\right) \star f(x), if we find the MLE of the family of probability density functions f(x - \theta), then the distribution of the MLE of \theta is…

    The physics community should take some solace in the fact that this is an incredibly common mistake in the life sciences, my particular haunt, which is why I’ve become particularly attuned to this error.

  40. Aaron Sheldon says:

    PS,

    A literature search on my name will not turn up much of anything meaningful.

    Because I am a senior analyst for a large health care provider, much of the work I do is either privileged within the organization and its oversight body, or is restricted by the prohibition against the secondary use of patient information laid out in the health information laws of the jurisdiction in which I work.

  41. Reply to Zathras says:

    OPERA assumed that all neutrinos travel at the same speed. Theoretically there was no a priori reason to believe that neutrinos with an average energy of 17 GeV travel with any significant speed dispersion.

  42. Reply to Zathras says:

    One more point – it is difficult to be sure from the OPERA paper, but it does appear that the time interval from the earliest to the last detected neutrino event matches the duration of the proton pulse – i.e., there is no spreading. It bothers me – physical processes tend to spread pulses. A 60 ns spread in a 10,000 ns pulse is only 6 parts per 1000, and yet might swallow up the 60 ns that they found. On the other hand, it is difficult to think of a known physical process that would cause so much spread. So as far as we know there is no spreading of the pulse; and so, at least to me, it seems there can be no speed dispersion of significance.

  43. Aaron Sheldon says:

    Unfortunately the Law of Large Numbers is not as kind to such assumptions as one would hope, and in the case of the OPERA results, all the 6-sigma level tells me is that the N was more than large enough to get good convergence of the MLE, but little else.

    As I stated before, a statistical hypothesis that understates the sources of variability will overstate the statistical confidence of the MLE. Without writing a single equation down, I can tell you with certainty that if you do not include dispersion in your statistical fit, you will underestimate the variance of the fitted MLE.

    In information-theoretic terms, adding the dispersion “uses up” part of the information in the data set, and so the variance in the time offset will be larger because you have less information to constrain its value.
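
    A quick numerical illustration of that claim (every number below is hypothetical; this shows the statistical point, not OPERA’s actual resolutions):

        # Toy illustration: if arrival times carry extra dispersion but the fit
        # assumes a fixed, known timing resolution, the quoted error on the time
        # offset comes out too small. All numbers are made up for illustration.
        import numpy as np

        rng = np.random.default_rng(1)
        n, true_shift = 16000, 60.0      # events, ns
        true_dispersion = 30.0           # ns of unmodelled per-event spread
        assumed_resolution = 7.0         # ns the shift-only model believes in

        x = true_shift + true_dispersion * rng.standard_normal(n)

        quoted_error = assumed_resolution / np.sqrt(n)  # shift-only model's claim
        honest_error = x.std(ddof=1) / np.sqrt(n)       # once the spread is estimated
        print(f"quoted: {quoted_error:.3f} ns, honest: {honest_error:.3f} ns")
        # quoted ~0.055 ns vs honest ~0.237 ns: the tighter interval comes only
        # from pretending the extra variance isn't there.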

  44. David Nataf says:

    Zathras,

    1) You have a fantastic screen name for someone on a time travel discussion thread.
    2) If the neutrino has a mass, real or imaginary, does that not mean that an energy dispersion implies a velocity dispersion?

    In light of the very small effect they “measure”, even an energy dispersion of, say, 1% could give a velocity dispersion sufficient to yield the systematic effects mentioned by Aaron.

  45. Reply to Zathras says:

    David,
    Wiki says the muon neutrino has a mass of less than 170 keV. In conventional Einstein theory, at 17,000 keV the neutrino is already within 0.005% of the speed of light, and we are talking here of neutrinos with an average energy of 17 GeV = 17,000,000 keV. There is no room for a 1% speed dispersion here, if the speed of light is the upper limit of speed. Note also that the superluminal speed that OPERA finds is about 0.0025% faster than light.
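
    As a worked check (a sketch assuming the standard kinematic relation and the 170 keV mass bound quoted above):

        1 - \frac{v}{c} \approx \frac{1}{2}\left(\frac{m c^2}{E}\right)^2
          = \frac{1}{2}\left(\frac{170\ \mathrm{keV}}{17\ \mathrm{GeV}}\right)^2
          = \frac{1}{2}\,(10^{-5})^2
          = 5 \times 10^{-11},

    so at 17 GeV a conventionally massive neutrino lags light by five parts in a hundred billion, utterly negligible next to the 2.5 × 10^-5 excess OPERA reports.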

    The second point, on which I would like clarity, if anyone has it, is what I pointed out already – there seems to be no spreading of the pulse, and that would suggest no speed dispersion, even if the neutrinos were superluminal.

    Unless there is something new to say, I’ll be silent now; Peter does not like repetitions.

  46. Aaron Sheldon says:

    Actually, according to the paper, they have sufficient N to detect a dispersion of about 1 part in a thousand, but the cost would be that the 6-sigma intervals would be much larger (roughly, your MLE distribution is over an area instead of a length).

    Interestingly, this is also why political polls are so unreliable: they underestimate the variance, which is calculated for simple binary choices, when in fact political polls typically have four or more choices depending on the number of candidates. It comes down to the problem of volume: if you want 90% of a centimetre you only need 9 millimetres, but an area of 9 × 9 millimetres is only 81% of the square centimetre; you need ~9.49 × 9.49 millimetres to get 90% of a square centimetre. The logic continues inductively, remembering that probability is just a fancy way of saying volume.
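
    The arithmetic generalizes to any dimension d: the side fraction needed to enclose 90% of the volume is 0.9^(1/d). A one-line check (nothing here beyond the cube picture above):

        # Side fraction of a unit hyper-cube that encloses 90% of its volume:
        # s(d) = 0.9 ** (1/d); "probability is just a fancy way of saying volume".
        for d in (1, 2, 3, 10):
            print(d, round(0.9 ** (1.0 / d), 4))
        # -> 1 0.9, 2 0.9487, 3 0.9655, 10 0.9895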

  47. Aaron Sheldon says:

    About the restrictions due to being close to the speed of light, that actually adds one more hypothesis with one more unknown parameter that needs to be fit. To enumerate:

    1. A simple uniform time shift with a single parameter, the time shift.

    2. Convolution with the normal distribution, with two parameters: the time shift and the dispersion.

    3. Convolution with the normal distribution where the underlying variable is first Lorentz transformed by some unknown speed (which destroys the symmetry of the dispersion); this has three parameters: the time shift, the dispersion, and the speed of the Lorentz transformation.

    Hypothesis 2 is a sub-space of hypothesis 1 in the limit that the Lorentz speed parameter goes to 0.

    Hypothesis 1 is a sub-space of hypothesis 2 in the limit that the dispersion goes to 0.

    This makes all of these hypotheses ripe for the likelihood ratio test.

    And that is pretty much how a statistician thinks.
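
    A sketch of how such a nested comparison would be scored (via Wilks’ theorem; the function and the fitted negative log-likelihoods fed to it are hypothetical):

        # Likelihood ratio test for nested hypotheses: under the restricted model,
        # 2*(nll_restricted - nll_full) is asymptotically chi-squared, with degrees
        # of freedom equal to the number of extra parameters in the larger model.
        from scipy.stats import chi2

        def lr_pvalue(nll_restricted, nll_full, extra_params):
            """p-value for rejecting, e.g., hypothesis 1 in favour of hypothesis 3."""
            return chi2.sf(2.0 * (nll_restricted - nll_full), df=extra_params)

        # Hypothetical fitted negative log-likelihoods:
        print(lr_pvalue(nll_restricted=1234.5, nll_full=1229.8, extra_params=2))
        # ~0.009: at that level the two extra parameters would earn their keep.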

  48. Aaron Sheldon says:

    Oops: hypothesis 2 is a sub-space of hypothesis 3…

    I think you can see where I was going.

  49. Bob McElrath says:

    Aaron Sheldon,

    First, the assumption of velocity dispersion is not reasonable from a physics perspective. To have a delta-v large enough to affect the pdf, the energy of the neutrino would be so low that it would not be detected. One can easily calculate the expected dispersion from several sources, and it is vanishingly small. Moreover, from a statistics perspective, the assumption of velocity dispersion would broaden the observed distribution of arrival times, as well as possibly bias it toward later arrival times. A careful look at Figs. 11 and 12 in the paper shows that this is not supported by the data.

    Second, I think you are misinterpreting the quoted “six sigma” as a statistical p-value, and since that p-value would be very tiny, you’re attributing disbelief in that small value to an error in fitting. It is not a p-value. The experiment clearly does not have enough statistics to report such a p-value (they only have ~16000 events). In physics the distribution of errors in a counting experiment such as this is generally unknowable, and is generally not Gaussian. One can only know the actual arrival-time error function as well as the 16000 events will allow. So what happens instead is that a fit is done, and a central value for that fit, with 1-sigma error bars, is reported. When the 1-sigma error bar is separated from the expected value by a factor of six, we report “six sigma”. That is six times the 68% confidence interval, not a statistical p-value of 2e-9; quoting the latter would assume the unobserved tails of the arrival-time pdf are Gaussian. They never are. They are always much flatter than Gaussian far away from the central value.

    The 5-sigma used in physics to report a “discovery” is a rule of thumb, forced upon us by the fact that we cannot reasonably determine the pdf of the measurement to the precision required. In most cases we can simulate the pdf using a model of the detector and backgrounds, but simulating the tails to the precision required to report the p-value of a discovery is out of the question computationally. Furthermore, the tails tend to be dominated by extremely rare physics that is not accounted for in the simulation. There are a very large number of possible sources of 0.1% errors, most of which are not known, and chasing them all down in order to improve the simulation is impractical from a manpower perspective, and usually impossible physically.

    In our field, we see three- and four-“sigma” results disappear on a regular basis, even though any reasonable statistician would agree that the p-value corresponding to three sigma is sufficient to claim a discovery. Often the cause of an erroneous measurement is never actually discovered, but subsequent experiments fail to confirm it, and the older experiment is dismissed.
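
    For reference, the nominal Gaussian conversion being cautioned against here (a one-liner, assuming scipy):

        # Nominal two-sided Gaussian p-value for "six sigma": the ~2e-9 figure
        # that should not be taken literally, because the real tails of the
        # arrival-time distribution are flatter than Gaussian.
        from scipy.stats import norm
        print(2 * norm.sf(6.0))   # ~1.97e-9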

  50. Bernhard says:

    Off-topic: Witten will be in Ireland, talking about “The Quantum Theory of Knots”:

    http://www.irishtimes.com/newspaper/sciencetoday/2011/0929/1224304927736.html
