Physics News

Various physics-related news:

  • The LHC is back in business doing physics, with the intensity ramp-up for the 2018 run ongoing. Today the machine is colliding 1551 bunches of protons; the ultimate goal is 2556 bunches. They are at least a week ahead of the planned schedule, which would have reached only 1200 bunches next week.
  • There’s a conference going on at the KITP this week, discussing the latest state of dark matter theory and experiment. By the way, see here for how prominent theorists communicate these days instead of using email…
  • At the Atlantic and Knowable Magazine, Tom Siegfried provides yet more multiverse coverage. Seems that he’s at work on a multiverse book.
  • At the Edge, Sabine Hossenfelder has an on-target analysis of the situation in fundamental theoretical physics. The problems she points to are ones that motivated books by Lee Smolin and myself back in 2006. Things haven’t improved since then and I hope she’ll have better luck generating concern for these issues than we did.
  • At Alta, Jennifer Ouellette has a fascinating account of the maneuvering for credit among the many observers of last year’s neutron star merger. It sounds like some of those involved are suffering from the same disease as Brian Keating: a conviction, not based in reality, that they’re in the running for a Nobel Prize.
  • Director Claire Denis is now at work on a sci-fi movie entitled High Life that sounds more promising than the last black hole movie. She’s getting scientific advice from Aurelien Barrau and comments:

    If there are theories about me, I’d rather not know. Astrophysics – now that’s fascinating. String theory, worm holes, the expanding universe, the Big Bang versus the Big Bounce – those are the kind of theories that make you feel like living and understanding the mystery of the world. Film theory is just a pain in the ass.


Update: One more. David Gross will be back in Princeton this week, giving talks on gravity and particle physics. I may head down Thursday to revisit scenes of my youth.

Update: I just watched some of the KITP dark matter “debates”, see here. Highly recommended if you want to hear informative exchanges between experts on the subject, see especially the Dan Hooper talk (and the MOND/dark matter debate here).

Update: For machine learning experts who want to try their hand at an HEP data analysis problem, a Kaggle competition to build a track reconstruction algorithm opened yesterday. For details, see here and here.
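To give a flavor of the problem the competition poses, here is a minimal sketch of one naive track-finding idea: hits left by a particle flying out from the collision point line up along a direction, so grouping hits by their azimuthal angle already separates straight tracks. This is a toy illustration only, not the competition's actual data format or scoring, and the function and tolerance are made up for the example.

```python
import math
import random

def cluster_by_angle(hits, tol=0.05):
    """Greedily group (x, y) hits by azimuthal angle from the origin.

    Hits whose angle differs from an existing track's angle by less
    than `tol` radians get that track's id; otherwise a new track
    is started. Returns one track id per hit.
    """
    labels = [-1] * len(hits)
    centers = []  # representative angle for each track found so far
    for i, (x, y) in enumerate(hits):
        phi = math.atan2(y, x)
        for tid, c in enumerate(centers):
            if abs(phi - c) < tol:
                labels[i] = tid
                break
        else:
            labels[i] = len(centers)
            centers.append(phi)
    return labels

# Two synthetic straight tracks from the origin at different angles,
# sampled at several radii with a little measurement noise.
random.seed(0)
hits = []
for phi in (0.3, 1.2):
    for r in range(1, 6):
        hits.append((r * math.cos(phi) + random.gauss(0, 0.01),
                     r * math.sin(phi) + random.gauss(0, 0.01)))

labels = cluster_by_angle(hits)
```

Real detector tracks curve in the magnetic field and the hit counts are orders of magnitude larger, which is why the competition is a genuine pattern-recognition challenge rather than an exercise like this one.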


Update: Gross’s Princeton talk on the “Future of Particle Physics” was not much different from this one from a couple of years ago (with the enthusiasm for supersymmetry deleted). I was thinking of writing something about this, comparing it to similar talks from way back when (see for instance here). It’s probably better, though, to wait for a better opportunity to write something substantive about where the path followed by Gross and others over the years has ended up.


Update: Video of the David Gross Princeton talk on the Future of Particle Physics is available here.


10 Responses to Physics News

  1. Dave Miller says:

    Peter,

    Bee Hossenfelder’s analysis of the sociology of physics, to which you linked, is one of the clearest explanations I’ve seen: I urge everyone to read it.

    Incidentally, much of what Bee writes was anticipated by Szilard in his little satire, “The Mark Gable Foundation,” published in his The Voice of the Dolphins and Other Stories, which is also well worth reading.

  2. Azadi says:

    Peter,

    I was wondering if you had seen today’s latest news item about the late Professor Hawking’s final paper, in which he apparently disclaims the string-landscape multiverse (i.e. one with different laws of physics applying in each pocket universe, courtesy of M-theory) in favour of Tegmark’s “level 1” multiverse (an extension of our own universe, with the same physical laws and physical constants everywhere).

    I haven’t read the original text, so I am basing this upon the media accounts.

    The reports, somewhat irritatingly, appear to be exaggerating the significance of this: as if Hawking, in this paper, were making a ground-breaking assertion that significantly moved the field forward in explaining the so-called fine-tuning problems without recourse to a merely probabilistic selection effect that has no predictive power.

    But I’m glad, at least, that Hawking’s last piece of research appears to have taken a stab at the string-theory-multiverse mania.

    See:

    https://www.theguardian.com/science/2018/may/02/stephen-hawkings-final-theory-sheds-light-on-the-multiverse

    “…Reality may be made up of multiple universes, but each one may not be so different to our own, according to Stephen Hawking’s final theory of the cosmos.

    The work, completed only weeks before the physicist’s death in March, paints a simpler picture of the past 13.8 billion years than many previous theories have proposed.

    Published on Wednesday in the Journal of High Energy Physics, the new work is the result of a long collaboration with Thomas Hertog, a Belgian physicist at the Catholic University of Leuven. “We sat on this one for a very long time,” Hertog said. “I do believe he was really fond of it.”…

    In the latest work, Hawking and Hertog challenge that view. Instead of space being filled with pocket universes where radically different laws of physics apply, these alternate universes may not actually vary that much from one another…

    “In the old theory there were all sorts of universes: some were empty, others were full of matter, some expanded too fast, others were too short-lived. There was huge variation,” said Hertog. “The mystery was why do we live in this special universe where everything is nicely balanced in order for complexity and life to emerge?”

    “This paper takes one step towards explaining that mysterious fine tuning,” Hertog added. “It reduces the multiverse down to a more manageable set of universes which all look alike. Stephen would say that, theoretically, it’s almost like the universe had to be like this. It gives us hope that we can arrive at a fully predictive framework of cosmology.”…”

  3. Fred P says:

    “For machine learning experts who want to try their hand at an HEP data analysis problem…” – This limit shocked me:
    “The software … should run on a i686 processor with 2GB (tbc) of memory.” source: https://sites.google.com/site/trackmlparticle/

    The i686 is a more than 22-year-old processor (evidence: https://en.wikipedia.org/w/index.php?title=P6_(microarchitecture)&oldid=837000310 ) in a field where the main recent improvements rely heavily on much more modern hardware to take large problems and get results in reasonable timespans. Here’s a somewhat dated summary if you’re interested: https://blog.algorithmia.com/hardware-for-machine-learning/

    It is unclear to me whether they will get useful results from this competition. Either the environment they’ll ultimately need will be radically different from the one they’re testing in, skewing the results toward solutions that may be sub-optimal for their purposes, or they’ll have to run on toy datasets with dubious applicability to the problems they are actually interested in.

  4. Peter Woit says:

    Azadi/Doolittle,
    I was trying to ignore this, finally gave in (see the latest posting).

  5. Carlos says:

    Dave,
    Thank you for mentioning Szilard’s nice piece (from 1948!). It’s one of the best science satires I’ve read in a long time, and it should definitely be more widely known. It really makes you wonder how long ago science started to become sick. “Off your teeth!”

  6. fg says:

    to Fred P: my guess regarding this constraint is that this software would be run as part of very high level triggers, very close to the detectors themselves, where only radiation hardened electronics can survive. And the best way for electronics to be robust against radiation is to have rather big transistors (much larger than the current state of the art in miniaturization).

  7. Anon says:

    fg: I don’t know the reason for the i686 condition, but the higher-level triggers of the LHC experiments are on the surface, so radiation is not the reason. The experiments update their computing every year or two with several thousand of the latest computers, and are either already using GPU systems or developing them for the future.

  8. fg says:

    Anon,
    My bad… I meant low level triggers. Having tracking information in these early triggers would open up new ways of selecting some rare events.

  9. Jan Dybicz says:

    Peter,

    Is there a video of David Gross’s two Princeton talks available?

  10. Peter Woit says:

    Jan Dybicz,
    Not that I know of. At the talk I attended, video was being recorded; I don’t know where/when it might appear publicly.

Comments are closed.