Is the Standard Model Just an Effective Field Theory?

An article by Steven Weinberg entitled On the Development of Effective Field Theory appeared on the arXiv last night. Based on a talk he gave in September, it surveys the history of effective field theories and argues for what I’d call the “SM is just a low energy approximation” point of view on fundamental physics. I’ve always found this point of view quite problematic, and think that it’s at the root of the sad state of particle theory these days. That Weinberg gives a clear and detailed version of the argument makes this a good opportunity to look at it carefully.

A lot of Weinberg’s article is devoted to history, especially the history of the late 60s-early 70s current algebra and phenomenological Lagrangian theory of pions. We now understand this subject as a low energy effective theory for the true theory (QCD), in which the basic fields are quarks and gluons, not the pion fields of the effective theory. The effective theory is largely determined by the approximate SU(2) x SU(2) chiral flavor symmetry of QCD. It’s a non-linear sigma model, so non-renormalizable. The non-renormalizability does not make the theory useless; it just means that as you go to higher and higher energies, more possible terms in the effective Lagrangian need to be taken into account, introducing more and more undetermined parameters into the theory. Weinberg interprets this as indicating that the right way to understand the non-renormalizability problem of quantum gravity is that the GR Lagrangian is just an effective theory.
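Schematically (standard notation, my own choice of symbols here, with $M$ the scale of the underlying physics), the effective Lagrangian organizes this as an expansion in operators of increasing mass dimension:

```latex
\mathcal{L}_{\rm eff} \;=\; \mathcal{L}_{\rm ren}
  \;+\; \sum_{d>4}\,\sum_i \frac{c_i^{(d)}}{M^{\,d-4}}\,\mathcal{O}_i^{(d)}
```

Here the $\mathcal{O}_i^{(d)}$ are all dimension-$d$ operators allowed by the symmetries, and the $c_i^{(d)}$ are dimensionless coefficients. A dimension-$d$ term contributes effects of relative order $(E/M)^{d-4}$ at energy $E$, so at low energies only a finite number of terms matter, while probing higher energies brings in ever more operators and undetermined parameters.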

So far I’m with him, but where I part ways is his extrapolation to the idea that all QFTs, in particular the SM, are just effective field theories:

The Standard Model, we now see – we being, let me say, me and a lot of other people – as a low-energy approximation to a fundamental theory about which we know very little. And low energy means energies much less than some extremely high energy scale $10^{15}$–$10^{18}$ GeV.

Weinberg goes on to give an interesting discussion of his general view of QFT, which evolved during the pre-SM period of the 1960s, when the conventional wisdom was that QFTs could not be fundamental theories (since they did not seem capable of describing strong interactions).

I was a student in one of Weinberg’s graduate classes at Harvard on gauge theory (roughly, volume II of his three-volume textbook). For me though, the most formative experience of my student years was working on lattice gauge theory calculations. On the lattice one fixes the theory at the lattice cut-off scale, and what is difficult is extrapolating to large distance behavior. The large distance behavior is completely insensitive to putting in more terms in the cut-off scale Lagrangian. This is the exact opposite of the non-renormalizable theory problem: as you go to short distances you don’t get more terms and more parameters; instead all but one term gets killed off. Because of this, pure QCD has essentially no free parameters: the single coupling just sets the overall scale, so choosing it amounts to choosing your distance units (Sidney Coleman liked to call this dimensional transmutation).
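In standard one-loop language (pure gauge theory, $N_f = 0$; this is the textbook formula, nothing lattice-specific), the statement is that the bare coupling $g(a)$ at lattice spacing $a$ gets traded for a single dynamically generated scale:

```latex
\Lambda \;\sim\; \frac{1}{a}\,
  \exp\!\left(-\frac{1}{2\,b_0\,g^2(a)}\right),
\qquad
b_0 \;=\; \frac{11\,N_c}{48\pi^2}
```

Changing $g(a)$ just changes $\Lambda$, i.e. your unit of distance; no dimensionless free parameter survives.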

The deep lesson I came out of graduate school with is that the asymptotically free part of the SM (yes, the Higgs sector and the U(1) are a different issue) is exactly what you want a fundamental theory to look like at short distances. I’ve thus never been able to understand the argument that Weinberg makes that at short distances a fundamental theory should be something very different. An additional big problem with Weinberg’s argument lies in its practical implications: with no experiments at these short distances, if you throw away the class of theories that you know work at those distances you have nothing to go on. Now fundamental physics is all just a big unresolvable mystery. The “SM is just a low-energy approximation” point of view fit very well with string theory unification, but we’re now living with how that turned out: a pseudo-scientific ideology that short distance physics is unknowable, random and anthropically determined.

In Weinberg’s article he does give arguments for why the “SM just a low-energy approximation” point of view makes predictions and can be checked. They are:

  • There should be baryon number violating terms of order $(E/M)^2$. The problem with this of course is that no one has ever observed baryon number violation.
  • There should be lepton number violating terms of order $E/M$, “and they apparently have been discovered, in the form of neutrino masses.” The problem with this is that it’s not really true. One can easily get neutrino masses by extending the SM to include right-handed neutrinos and Dirac masses, no lepton number violation. You only get non-renormalizable terms and lepton number violation when you try to get masses using just left-handed neutrinos.
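For reference (in standard notation, not Weinberg’s own), the unique dimension-5 term he has in mind is the Weinberg operator, built from the lepton doublet $L$ and Higgs doublet $H$:

```latex
\mathcal{L}_5 \;=\; \frac{c_{ij}}{M}\,
  \big(\overline{L^{c}_{i}}\,\tilde{H}\big)\big(\tilde{H}^{\dagger} L_{j}\big)
  \;+\; \text{h.c.}
\quad\Longrightarrow\quad
m_\nu \;\sim\; \frac{c\,v^2}{M}
```

After electroweak symmetry breaking ($v \approx 246$ GeV) this gives Majorana masses suppressed by one power of $M$ and violates lepton number by two units; it’s the only way to get neutrino masses from the left-handed SM fields alone. Adding right-handed neutrinos instead gives renormalizable Dirac masses with no lepton number violation.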

He does acknowledge that there’s a problem with the “SM just a low-energy approximation to a theory with energy scale $M = 10^{15}$–$10^{18}$ GeV” point of view: it implies the well-known “naturalness” or “fine-tuning” problems. The cosmological constant and Higgs mass scale should be up at the energy scale M, not the values we observe. This is why people are upset at the failure of “naturalness”: it indicates the failure not just of specific models, but of the point of view that Weinberg is advocating, which has now dominated the subject for decades.
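Concretely, the standard estimate is that generic quantum corrections in an effective theory with scale $M$ push the Higgs mass-squared up to

```latex
\delta m_H^2 \;\sim\; \frac{g^2}{16\pi^2}\,M^2
```

(with $g$ some coupling of the Higgs to states at the scale $M$), so keeping $m_H \approx 125$ GeV with $M \sim 10^{15}$ GeV requires a cancellation fine-tuned to one part in roughly $(M/m_H)^2 \sim 10^{26}$. The cosmological constant has the same problem, with an even more dramatic mismatch.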

As a parenthetical remark, I’ve today seen news stories here and here about the failure to find supersymmetry at the LHC. At least one influential theorist still thinks SUSY is our best hope:

Arkani-Hamed views split supersymmetry as the most promising theory given current data.

Most theorists though think split supersymmetry is unpromising since it doesn’t solve the problem created by the point of view Weinberg advocates. For instance:

“My number-one priority is to solve the Higgs problem, and I don’t see that split supersymmetry solves that problem,” Peskin says.

On the issue of quantum gravity, my formative years left me with a different interpretation of the story Weinberg tells about the non-renormalizable effective low-energy theory of pions. This got solved not by giving up on QFT, but by finding a QFT valid at arbitrarily short distances, based on different fundamental variables and different short distance dynamics. By analogy, one needs a standard QFT to quantize gravity, just with different fundamental variables and different short distance dynamics. Yes, I know that no one has yet figured out a convincing way to do this, but that doesn’t imply it can’t be done.

Update: I just noticed that Cliff Burgess’s new book Introduction to Effective Field Theory is available online at Cambridge University Press. Chapter 9 gives a more detailed version of the same kind of arguments that Weinberg is making, as well as explaining how the Higgs and CC are in conflict with the effective field theory view. His overall evaluation of the case – “Much about the model carries the whiff of a low energy limit” – isn’t very compelling when you start comparing this smell to that of the proposals (SUSY/string theory) for what the SM is supposed to be a low energy limit of.

Posted in Uncategorized | 22 Comments

Various Links

Our semester at Columbia started earlier than usual this year, with first classes this week, my first class yesterday. This semester I’m teaching the second half of a year-long course on the mathematics of quantum mechanics. There’s a YouTube channel with the lectures for the first half of the course, and now also for the second half. The course is largely following the textbook I wrote based on teaching this in earlier years. The first lecture yesterday was a summary of a point of view on canonical quantization explained in the first semester and in the book. This point of view is essentially that Hamiltonian mechanics is based on a Lie algebra (functions on phase space with Poisson bracket the Lie bracket), and canonical quantization is all about the essentially unique unitary representation of (a subalgebra of) that Lie algebra. On Thursday I’ll start on the fermionic version of canonical quantization, which has a very much parallel structure, giving a super-Lie algebra and spinors.
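In formulas, the basic dictionary is (a textbook summary, with $\hbar$ kept explicit):

```latex
\{q,\,p\} = 1
\;\longrightarrow\;
[Q,\,P] = i\hbar\,\mathbf{1},
\qquad
\gamma_j \gamma_k + \gamma_k \gamma_j = 2\,\delta_{jk}
```

The left side is the Heisenberg Lie algebra, whose unitary representation is essentially unique by the Stone–von Neumann theorem; the fermionic version on the right replaces the Lie bracket with an anticommutator, giving a Clifford algebra whose irreducible representation is the spinor representation.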

A few other items:

  • John Baez’s This Week’s Finds in Mathematical Physics was an unprecedented project conducted over 17 years, providing a wealth of fantastic expository material on topics in math and physics. It started in 1993, and on its twentieth anniversary I wrote an appreciation (in an appropriate font) here. John has now announced that this material has been typeset (2610 pages!) and he is editing it, to be released in batches. The first part is now available, on the arXiv as This Week’s Finds in Mathematical Physics (1-50). As I find time, I’m looking forward to reading through these, and I encourage everyone interested in math and physics to do the same.
  • Frank Wilczek has a new book out, and there’s an interview with him at Quanta. You can see a conversation between him and Brian Greene here on Friday.
  • Another physicist with a new book is Jesper Grimstrup, whose Shell Beach: The search for the final theory I’ve just finished reading and enjoyed greatly. The book is quite personal and non-technical, its subject being Grimstrup’s life as a theorist pursuing a unified theory. His career story is quite interesting, giving insight into the ways academic theoretical physics is challenging for young theorists trying to pursue non-mainstream research programs. Several books have appeared in recent years aimed at putting this kind of physics research in a human and philosophical context, telling you what it has to do with the meaning of life. There’s some of that in this book too, of a much more compelling sort than what you see elsewhere. Grimstrup has a website here, and in recent years has ended up leaving academia and trying to fund his research with donations. I can think of a lot worse things you could do with your money than send him some.

    I’m quite sympathetic to the underlying theme that he describes pursuing (together with Johannes Aastrup) in the book, that of bringing together the insights of loop quantum gravity and non-commutative geometry. More recently they’ve been working on some new ideas for formulating QFT non-perturbatively that seem worth investigating. There’s a survey blog post here.

Update: Another bit of private math/physics funding news. The IAS has announced establishment of the Carl P. Feinberg Cross-Disciplinary Program in Innovation

Scientific research at the Institute is traditionally driven by the collaboration and independent projects of a full-time Faculty and a revolving class of more than 200 researchers at various stages in their careers. The Carl P. Feinberg Cross-Disciplinary Program in Innovation will build on this successful model with the recruitment of mid-career scholars who have pioneered foundational developments in new areas. Bringing together scholars with such unique insights—which may not be obviously connected to the existing themes of the past 20 or 30 or 40 years—ensures that IAS will remain agile and responsive to new intellectual developments that do not yet fit the mold of what graduate students and postdocs generally know. In order to close this knowledge gap, the program will feature intense, focused workshops and “master classes.”

“Since its founding, the Institute has served as a world center for investigations into the fundamental laws of nature. We are currently in the middle of a grand symbiosis of ideas, from the equations of general relativity to the quantum information of black holes,” stated Robbert Dijkgraaf, IAS Director and Leon Levy Professor. “This revolutionary program will provide a dedicated space and the necessary flexibility to accelerate these exciting developments, and will surely forge new connections across fields.”

Posted in Uncategorized | 3 Comments

Martin Veltman 1931-2021

I heard today of the recent death of Martin Veltman, a theorist largely responsible (with his student Gerard ‘t Hooft) for showing the renormalizability of non-abelian gauge theories, a breakthrough crucial to the Standard Model that won both of them the 1999 Nobel Prize. For the story of this work, the best source is likely Veltman’s Nobel lecture.

My one memory of meeting Veltman in person was when he visited Stony Brook at the time that I was a postdoc there (mid 1980s). There was a party at someone’s house, and I spent part of the evening talking to him then. What most struck me was his great passion for whatever it was we were talking about. One topic I remember was the computer algebra program Schoonschip (which Wolfram acknowledges as an inspiration for Mathematica). I vaguely recall that at that time Veltman had recently ported the program to a microprocessor and he was selling copies in some form. It also seems to me that one remarkable aspect of the program was that it was written in assembly language, not compiled from a higher level language. At the time I was doing computer calculations, but of a very different kind (lattice gauge theory Monte-Carlos). Since my own interests were focused on non-perturbative calculations, I wasn’t paying much attention to Veltman’s work, although I do remember finding his Diagrammar document (written with ‘t Hooft) quite fascinating.

A comment that evening that really struck me was about students, in particular that “you give your students your life-blood!”. This seemed likely to have some reference to Veltman’s relations with his ex-student ‘t Hooft, but I’m pretty sure I didn’t quiz him on that topic.

Many years later, when I was trying to get Not Even Wrong published, I contacted Veltman and he was quite helpful. At the time he had recently published his own popular book about particle physics, Facts and Mysteries in Elementary Particles, which contained his own version of the Not Even Wrong critique:

The reader may ask why in this book string theory and supersymmetry have not been discussed. . . The fact is that this book is about physics and this implies that theoretical ideas must be supported by experimental facts. Neither supersymmetry nor string theory satisfy this criterion. They are figments of the theoretical mind. To quote Pauli, they are not even wrong. They have no place here.

That book is quite good, I strongly recommend it. May its author rest in peace.

Posted in Obituaries | 13 Comments

DAU. String Theory

I first wrote here in 2015 about DAU, the unusual film project based to some extent on the life of Landau. Parts of the film were first shown in Paris early in 2019, and this past year began appearing on the DAU website. I’d been looking forward to seeing Gross, Yau, Rovelli and others in the film, so paid to watch one of the first parts, DAU. Degeneration, when it became available last year. It’s over six hours long; for a review, see here. I ended up doing a certain amount of fast-forwarding, and was disappointed to see only Nikita Nekrasov and Dmitri Kaledin, and none of the other math/physics world figures I had heard had participated.

DAU largely was funded by Russian oligarch Sergei Adoniev. For an excellent article discussing the project and its context in current Russian culture, see Sophie Pinkham’s article Nihilism for Oligarchs.

There wasn’t much physics in DAU. Degeneration, but evidently it plays a significant role in other parts of the film. According to the DAU website,

Real-life scientists, who were able to continue with their research in the Institute, included: physicist Andrei Losev; mathematicians Dmitri Kaledin and Shing-Tung Yau; string theorist Nikita Nekrasov; Nobel-Prize winning physicist David Gross; neuroscientist James Fallon; and biochemist Luc Bigé. “One group was researching string theory and another researching quantum gravity. These groups hated each other. One stated there were 12 dimensions, the other claimed there were 24. The string theory group believed there couldn’t be 24 dimensions. The quantum gravity group believed that the other scientists were narrow-minded,” explained Khrzhanovskiy.

Now available is a part which seems to more centrally involve physics, DAU. String Theory, which is described as follows:

Nikita Nekrasov is a scientist, a theoretical physicist who studies our world and other possible worlds. He refuses to make a choice between mathematics and physics, between one woman and another, as he ponders the existence of the multi-universe. At scientific conferences, attended by eminent foreign scientists and a rising younger generation of physicists alike, Nekrasov gets carried away debating the beauty of string theory. He attempts to explain to all of his women – Katya, the librarian, Zoya, the scientific secretary, Svetlana, the head of department – about the theory of his own polygamy, and the possibility of having enough feelings to satisfy everyone.

Multiple universes have always been advertised with “in some other universe you’re dating Scarlett Johansson”, relating the idea to multiple partners in this universe is an innovation.

I haven’t yet watched DAU. String Theory, will likely find time for that soon. I’m worried that I’ll still not get to see Gross, Yau, Rovelli and others though, and lack the time and energy to look through all the other parts of the film. I’d like to crowd-source a solution to this problem: if anyone watching these things can let the rest of us know in which parts (at what times) well-known math/physics personalities appear, that would be greatly appreciated.

Posted in Uncategorized | 9 Comments

20 Years Later

Almost exactly twenty years ago I started writing a short article about the problems with string theory. I had been thinking about doing this for quite a while, and the timing of entering the twenty-first century seemed appropriate for evaluating something that had long been advertised as “a piece of 21st-century physics that had fallen by accident into the 20th”. The piece was done in a week or two, after which I sent it around to a group of physicists to ask for comments. The reaction was mostly positive, although at least one well-known theorist told me that publicly challenging string theorists in this way would be counter-productive.

One person who wrote back was Phil Anderson; I’ve quoted some of what he wrote to me in this posting. He suggested I send it to Gloria Lubkin at Physics Today, and evidently talked to her about it. I did do this, and after not hearing anything back for a week or two, decided to go ahead and post the article to the arXiv, where it appeared as String Theory: An Evaluation.

Rereading that article today, there’s little I would change. Its argument is even more valid now than then. The problems of the theory and how it was pursued evolved over the next twenty years in ways far worse than what I could have imagined back then. In particular, the “multiverse” argument explaining away why string theory predicts nothing is something I could not have conceived of in 2001. The tribalistic sociology that has led to a large group of people calling themselves “string theorists” when what they do has nothing to do with string theory is also something I would have thought impossible.

In many ways, twenty years of further failure have had less than no effect. Lubos Motl is still arguing that string theory is the language in which God wrote the universe, and Michio Kaku has a new book about to appear, in which it looks like string field theory is described by the God Equation. Ignoring these extreme examples, string theory remains remarkably well-entrenched in mainstream physics: for example, my university regularly offers a course training undergraduates in string theory, and prestigious $3 million prizes are routinely given for work on the subject. The usual mechanisms by which a failed scientific idea is supposed to fall by the wayside have for some reason had no effect.

While string theory’s failures have gotten a lot of popular press, the situation is rather different within the physics community. One reason I was interested in publishing the article in Physics Today was that discussion of this issue belongs there, in a place it could get serious attention from within the field. To this day, that has not happened. The story of my article was that I finally did hear back from Lubkin on 2/21/2001. She told me that she would talk to the Physics Today editor Stephen Benka about it. I heard from Benka on 5/6/2001, who told me they wouldn’t publish an article like that, but that I should rework it for publication as a shorter letter to the editor. I did this and sent a short letter version back to them, but never heard anything back (a few months later I wrote to ask what had happened to my letter, and was told they had decided not to publish it; they hadn’t bothered to let me know). In 2002 an editor from American Scientist contacted me about the article, and it ended up getting published there.

Looking back at how Physics Today has covered string theory and related speculation over the past 25 years, I did a search of what they have published.

The only thing I could find anywhere during those 25 years indicating to Physics Today readers that none of this speculation had worked out was a short opinion column by Burt Richter.

It seems to me that those now in charge of Physics Today should be thinking about this history, their role in it, and what they might be able to do to make up for this heavily one-sided coverage of a controversial issue.

Posted in Uncategorized | 60 Comments

Various and Sundry

A few recent items of interest:

  • Martin Greiter has put together a written version of Sidney Coleman’s mid-1990s lecture Quantum Mechanics in Your Face, based on a recording of one version of the lecture and copies of Coleman’s slides.

    It’s often claimed that leading physicists of Coleman’s generation were educated in and stuck throughout their careers to a “shut up and calculate” approach to the interpretation of quantum mechanics. Coleman’s lecture I believe gives a much better explanation of the way he and many others thought about the topic. In particular, Coleman makes the crucial point:

    The problem is not the interpretation of quantum mechanics. That’s getting things just backwards. The problem is the interpretation of classical mechanics.

    He ends with the following approach to the measurement problem:

    Now people say the reduction of the wave packet occurs because it looks like the reduction of the wave packet occurs, and that is indeed true. What I’m asking you in the second main part of this lecture is to consider seriously what it would look like if it were the other way around—if all that ever happened was causal evolution according to quantum mechanics. What I have tried to convince you is that what it looks like is ordinary everyday life.

    While some might take this and claim Coleman as an Everettian, note that there’s zero mention anywhere of many-worlds. Likely he found it an empty idea that explains nothing, and so not worth mentioning.

  • For the past year the CMSA has been hosting a Math-Science Literature Lecture Series. The talks have typically been excellent surveys of areas of mathematics, given by leading figures in each field. To coordinate with Tsinghua, the talks have often been at 8am here in NYC, and several times in the last couple of months I’ve made sure to get up early enough to have breakfast while watching one of the talks. All of the ones I’ve seen – given by Edward Witten, Andrei Okounkov, Alain Connes, Arthur Jaffe and Nigel Hitchin – were very much worth the time. Jaffe’s talk covered some of the ideas on Euclidean field theory that I’ve been spending a lot of time thinking about. For a more detailed version of his talk I highly recommend this article.
  • Peter Scholze has posted at the Xena blog a challenge to those interested in formalizing mathematics and automated theorem proving (or checking): formalize and check the proof of a foundational result in his work with Dustin Clausen on “condensed mathematics”. As part of the challenge, he provides an extensive discussion of the motivation and basic ideas of this subject, which attempts to provide a replacement (with better properties) for the conventional definition of a topological space.

    Spending a little time reading this and some of the other expositions of the subject convinced me this is a really thorny business. Scholze explains in detail his motivation for making the challenge in part 6 of the posting. My suspicion has always been that most of the value of computer theorem checking lies in forcing a human to lay out clearly and unambiguously the details of an argument, with that effort likely to make clear if there’s a problem with the theorem. It will be fascinating to see what comes of this project.

    I see there’s also a blog posting about this at the n-category cafe.

  • On the topic of theorems that definitely don’t have a clear and unambiguous proof, supposedly PRIMS will be publishing Mochizuki’s IUT papers in 2021. Mochizuki and collaborators have a new paper claiming stronger versions of Mochizuki’s results.
Posted in Uncategorized | 43 Comments

Contemplating the End of Physics

In a remarkable article entitled Contemplating the End of Physics posted today at Quanta magazine, Robbert Dijkgraaf (the director of the IAS) more or less announces the arrival of the scenario that John Horgan predicted for physics back in 1996. Horgan argued that physics was reaching the end of its ability to progress by finding new fundamental laws. Research trying to find new fundamental constituents of the universe and new laws governing them was destined to reach an endpoint where no more progress was possible. This is pretty much how Dijkgraaf now sees the field going forward:

Confronted with the endless number of physical systems we could fabricate out of the currently known fundamental pieces of the universe, I begin to imagine an upside-down view of physics. Instead of studying a natural phenomenon, and subsequently discovering a law of nature, one could first design a new law and then reverse engineer a system that actually displays the phenomena described by the law. For example, physics has moved far beyond the simple phases of matter of high school courses — solid, liquid, gas. Many potential “exotic” phases, made possible by the bizarre consequences of quantum mechanics, have been cataloged in theoretical explorations, and we can now start realizing these possibilities in the lab with specially designed materials.

All of this is part of a much larger shift in the very scope of science, from studying what is to what could be. In the 20th century, scientists sought out the building blocks of reality: the molecules, atoms and elementary particles out of which all matter is made; the cells, proteins and genes that make life possible; the bits, algorithms and networks that form the foundation of information and intelligence, both human and artificial. This century, instead, we will begin to explore all there is to be made with these building blocks.

In brief, as far as physics goes, elementary particle physics is over; from now on it’s pretty much just going to be condensed matter physics, where at least there is an infinity of potential effective field theory models to play with.

Dijkgraaf ends with an argument indicating that human intelligence is outmoded, artificial intelligence is our future:

Science concerns all phenomena, including the ones created in our laboratories and in our heads. Once we are fully aware of this grander scope, a different image of the research enterprise emerges. Now, finally, the ship of science is leaving the safe inland waterways carved by nature, and is heading for the open ocean, exploring a brave new world with “artificial” materials, organisms, brains and perhaps even a better version of ourselves.

Along the same lines, today also brings an article in the New York Times by Dennis Overbye, Can a Computer Devise a Theory of Everything? The article discusses the new MIT Institute for Artificial Intelligence and Fundamental Interactions and Max Tegmark’s hopes that AI will “discover all kinds of new laws of physics”. My guess is that this will work just fine if you give up on the 20th century understanding of what a “law of physics” is and follow Dijkgraaf’s lead. The problem then may be not so much “will we understand the new laws of physics found by AI?”, but rather that of them not being interesting enough to be worth understanding…

Update: To clarify the point I was trying to make about the Dijkgraaf piece arguing against the “end of physics”, compare it to the similar 1996 piece Gross and Witten published in the Wall Street Journal (a summary is here, an extract here). Gross and Witten were strongly disagreeing with Horgan, whereas it seems to me that Dijkgraaf implicitly agrees with Horgan that fundamental physics has hit a wall and theorists are moving on to do something else.

Posted in Uncategorized | 31 Comments

Various Links, String Theory now Untethered

I’ve been spending most of my time recently trying to get unconfused about Euclidean spinor fields, will likely write something here about that in the not too distant future. Some other things that may be of interest:

  • I did an interview a couple days ago with Fraser Cain, who runs the Universe Today website. He had some excellent and well-informed questions about the state of HEP physics. I regret a little bit that I focused on giving an even-handed explanation of the arguments over a next generation collider, and didn’t emphasize that personally I think building such a thing is a good idea (if the money can somehow be found), since the alternative would be giving up and abandoning this kind of fundamental science.
  • On Monday, the Simons Center celebrated its 10th birthday, talks are here, giving a good overview of the kinds of math and physics that have been going on there during its first decade.
  • For the latest on the formulation of the local Langlands correspondence in terms of the geometry of the Fargues-Fontaine curve, Peter Scholze is teaching a course now in Bonn, website here.
  • Kirill Krasnov has a book out from Cambridge, Formulations of General Relativity. If you share my current interest in chiral formulations of GR and twistors, there’s a lot about these in the book. For a more general interest survey of what’s in the book, see Krasnov’s lectures last year at Perimeter (links and slides are on his website).
  • A couple weeks ago, a very well-done explanation of what’s been going on around the black hole information paradox written by George Musser appeared at Quanta Magazine. Periodically in recent years I’ve tried to follow what’s up with this subject, generally giving up after a while, frustrated especially at not being able to figure out what underlying theory of quantum gravity was being studied. All that ever was clear was that this was about low-dimensional toy model calculations involving some assumptions that had ingredients coming from holography and AdS/CFT.

    Musser’s article makes quite a few things clearer, with one striking aspect the news that:

    researchers cut the tether to string theory altogether.

    which I gather means that any foundation in AdS/CFT is gone, with what is being discussed now purely semi-classical. I don’t understand what these new semi-classical calculations are, or whether optimistic claims that the information paradox is on its way to a solution are justified (history hasn’t been kind to previous such claims). In recent years the pro-string theory research argument has often been that, while there were no longer any prospects that it would tell us about particle physics, it was the best route to solving the problem of quantum gravity. It will be interesting to see what the effect will be of that cord getting cut by leading researchers.

    If you think it’s a good idea to follow discussions of this kind of thing on Twitter, you might enjoy threads from Sabine Hossenfelder and Ahmed Almheiri.

Posted in Uncategorized | 27 Comments

Do Particle Physicists Continue to Make Empty Promises?

Blogging has been light here, since little worthy of note in math/physics has been happening, and I’ve been busy with teaching, freaking out about the election, and trying to better understand Euclidean spinors. I’ll write soon about the Euclidean spinors, but couldn’t resist today making some comments about two things I’ve seen this week.

Sabine Hossenfelder yesterday had a blog entry/Youtube video entitled Particle Physicists Continue to Make Empty Promises, which properly takes exception to this quote:

A good example of a guaranteed result is dark matter. A proton collider operating at energies around 100 TeV will conclusively probe the existence of weakly interacting dark-matter particles of thermal origin. This will lead either to a sensational discovery or to an experimental exclusion that will profoundly influence both particle physics and astrophysics.

from a recent article by Fabiola Gianotti and Gian Francesco Giudice in Nature Physics. She correctly notes that

They guarantee to rule out some very specific hypotheses for dark matter that we have no reason to think are correct in the first place.

A 100 TeV collider can rule out certain kinds of higher-mass WIMPs, but it’s simply untrue that such an exclusion will “profoundly influence both particle physics and astrophysics.” Very few people think such a thing is likely since there’s no evidence for it and no well-motivated theory that predicts it.

Where I part company with Hossenfelder though is that I don’t see much wrong with the rest of the Gianotti/Giudice piece and don’t agree with her point of view that the big problem here is empty promises like this and plans for a new collider. Twenty years ago when I began writing Not Even Wrong, I started out by writing a chapter about the inherent physical limits that colliders were starting to hit, and the significance of this for the field. It was already clear that getting to higher proton energies than the LHC, or higher lepton energies than LEP was going to be very difficult and expensive. HEP experimentalists are now facing painful and hard choices about the future, which I wrote about in detail here under the title Should the Europeans Give Up?. The worldwide experimental HEP community is addressing the problem in a serious way, with the European Strategy Update one aspect, and the US now engaged in a similar Snowmass 2021 effort.

Many find it tempting to believe that the answer is simple: just redirect funds from collider physics to non-collider experiments. The problem is that there’s little evidence of promising but unfunded ideas for non-collider experiments. For the last decade there has been no new construction of high energy colliders, with as much money as ever available worldwide for HEP experiments; this should have been a golden age for anyone with non-collider ideas to propose. The European Strategy Update and Snowmass 2021 efforts have both looked seriously for non-collider ideas to pursue, and that search will continue, since I see no evidence that anyone is going to decide to go ahead with a next generation collider and start spending money building it during the next few years. The bottom line of the European process was not a decision to build a new collider, but a decision to keep studying the problem, then evaluate what to do in 2026. As for the ongoing American process, as far as I know a new US collider is not even a possibility being discussed.

While HEP experiment is facing difficult times because of fundamental physical, engineering and economic limits, the problems of HEP theory are mostly self-inflicted. The decision nearly 40 years ago by a large fraction of the field to orient their research programs around bad ideas that don’t work (SUSY extensions of the Standard Model and string theory unification), and then to spend decades refusing to acknowledge failure, is at the core of the sad state of the subject these days.

About the canniest and most influential HEP theorist around is Nima Arkani-Hamed, and a few days ago I watched an interview of him by Janna Levin. On the question of the justification for a new collider, he’s careful to state that the justification is mainly the study of the Higgs. He’s well aware that the failure of the “naturalness” arguments for weak-scale SUSY needs to be acknowledged, and he does so. He is also well aware that any attempt to argue this failure away by saying “we just need a higher energy collider” won’t pass the laugh test (and would bring Hossenfelder and others down on him like a ton of bricks…).

The most disturbing aspect of the interview is that Levin devotes a lot of time (and computer graphics) to getting Arkani-Hamed to explain his 1998 ideas about “large extra dimensions”, repeatedly telling the audience that he has been given a \$3 million prize for them. This paper has by now been cited over 6300 times, and the multi-million dollar prize is real, with the prize citation explaining:

Nima Arkani-Hamed has proposed a number of possible approaches to this [hierarchy problem] paradox, from the theory of large extra dimensions, where the strength of gravity is diluted by leaking into higher dimensions, to “split supersymmetry,” motivated by the possibility of an enormous diversity of long-distance solutions to string theory.

At the time it was pretty strange that a \$3 million prize was being given for ideas that weren’t working out. It’s truly bizarre though that Levin would now want to make such failed ideas the centerpiece of a presentation to the public, misleading people about their status. The website for the interview also promotes Arkani-Hamed purely in terms of his failures, presented as successes:

Nima Arkani-Hamed is one of the leading particle physics phenomenologists of the generation. He is concerned with the relation between theory and experiment. His research has shown how the extreme weakness of gravity, relative to other forces of nature, might be explained by the existence of extra dimensions of space, and how the structure of comparatively low-energy physics is constrained within the context of string theory. He has taken a lead in proposing new physical theories that can be tested at the Large Hadron Collider at CERN in Switzerland,

This is part of the overall weird situation of the failed ideas (SUSY/strings) of 40 years ago: they still live on in a dominant position when the subject is presented to the public.

At the same time, the topics Arkani-Hamed is working on now are ones I think are more promising than most of the rest of what is going on in HEP theory. The interview began with a discussion of Penrose’s recent Nobel Prize, with Arkani-Hamed explaining Penrose’s fantastic insights about twistor geometry and noting that his own current work involves a fundamental role for twistor space (personally I see some other promising directions for using twistor geometry; more to come about this here in the future).

In contrast to Hossenfelder, what I’m seeing these days in HEP physics is not a lot of empty promises (which were a dominant aspect of HEP theory for several decades). Instead, on the experimental side, there’s an honest struggle with implacable difficulties. On the theory side increasingly people have just given up, deciding that it’s better to let the subject die chained to a host of \$3 million prizes for dead ideas than to honestly face up to what has happened.

In case anyone needs any reminder of how bad the propaganda problem is: it appears this kind of thing is driven by what’s on Wikipedia.

Posted in Uncategorized | 32 Comments

2020 Physics Nobel Prize

The 2020 Physics Nobel Prize was announced this morning, with half going to Roger Penrose for his work on black holes, half to two astronomers (Reinhard Genzel and Andrea Ghez) for their work mapping what is going on at the center of our galaxy. I know just about nothing about the astronomy side of this, but am somewhat familiar with Penrose’s work, which very much deserves the prize.

Penrose is a rather unusual choice for a Physics Nobel Prize, in that he’s very much a mathematical physicist, with a Ph.D. in mathematics (are there other physics winners with math Ph.D.s?). In addition, the award is not for a new physical theory, or for anything experimentally testable, but for the rigorous understanding of the implications of Einstein’s general relativity. While I’m a great fan of the importance of this kind of work, I can’t think of many examples of it getting rewarded by the Nobel Prize. I had always thought that Penrose was likely to get a Breakthrough Prize rather than a Nobel Prize, and still don’t understand why that hasn’t happened already.

Besides the early work on black holes that Penrose is being recognized for, he has worked on many other things which I think are likely to ultimately be of even greater significance. In particular, he’s far and away the person most responsible for twistor theory, a subject which I believe has a great future ahead of it at the core of fundamental physical theory.

In all his work, Penrose has shown a remarkable degree of originality and creativity. He’s not someone who works to make an advance on ideas pioneered by others, but sets out to do something new and different. His book “The Road to Reality” is a masterpiece, an inspiring, original and deep vision of the unity of geometry and physics that outshines the mainstream ways of looking at these questions.

Congratulations to Sir Roger, and compliments to the Nobel prize committee for a wonderful choice!

Posted in Uncategorized | 51 Comments