Should the Europeans Give Up?

The European HEP community is now engaged in a “Strategy Update” process, the next step of which will be an open symposium this May in Granada. Submissions to the process were due last month, and I assume that what was received will be made publicly available at some point. This is supposed to ultimately lead to the drafting of a new European HEP strategy next January, for approval by the CERN Council in May 2020.

The context of these discussions is that European HEP is approaching a very significant crossroads, and decisions about the future will soon need to be made. The LHC will be upgraded in coming years to a higher luminosity, ultimately rebranded as the HL-LHC, to start operating in 2026. After 10-15 years of operation in this higher-luminosity mode, the LHC will reach the end of its useful life: the marginal extra data accumulated each year will stop being worth the cost of running the machine.

Planning for the LHC project began back in the 1980s, and construction was approved in 1994. The first physics run was 16 years later, in 2010. Keep in mind that the LHC project started with a tunnel and a lot of infrastructure already built, since the LEP tunnel was being reused. If CERN decides it wants to build a next generation collider, this could easily take 20 years to build, so if one wants it to be ready when the LHC shuts down, one should have started already.

Some of the strategy discussion will be about experiments that don’t require the highest possible collision energies (the “energy frontier”), for instance those that study neutrinos. Among possibilities for a new energy frontier collider, the main ones that I’m aware of are the following, together with some of their advantages and drawbacks:

  • FCC-ee: This would be an electron-positron machine built in a new 100 km tunnel, operating at CM energies from 90 to 365 GeV. It would provide extremely high numbers of events when operated at the Z-peak, and could also be operated as a “Higgs factory”, providing a very large number of Higgs events to study, in a much cleaner environment than that provided by a proton-proton collider like the LHC.

    In terms of drawbacks, it is estimated to cost \$10 billion or so. The CM energy is quite a bit less than that of the LHC, so it seems unlikely that there are new unknown states that it could study, since these would have been expected to show up by now at the LHC (or at LEP, which operated at 209 GeV at the end).

    Another point in favor of the FCC-ee proposal is that it would allow for reuse of the tunnel (just as the LHC followed on LEP) for a very high energy proton-proton collider, called the FCC-hh, which would operate at a CM energy of 100 TeV. This would be a very expensive project, estimated to cost \$17 billion (on top of the previous \$10 billion cost of the FCC-ee).

  • HE-LHC: This would essentially be a higher energy version of the LHC, in the same tunnel, built using higher field (16 T vs. 8.33 T) magnets. It would operate at a CM energy of 27 TeV. The drawbacks are that construction would be challenging (appropriate 16 T magnets do not yet exist), and only a modest increase in CM energy (27 vs. 14 TeV) would be achieved. The big advantage over the FCC-hh is cost: much of the LHC infrastructure could be reused and the machine is smaller, so the total cost estimate is about \$7 billion.
  • CLIC: This would be a linear electron-positron collider, with the first stage of the project an 11 km-long machine that would operate at 380 GeV CM energy and cost about \$6 billion. The advantage of this machine over the circular FCC-ee is that it could ultimately be extended to a longer 50 km machine operating at 3 TeV CM energy (at a much higher cost). The disadvantage with respect to the FCC-ee is that it is not capable of operating at very high luminosity at lower energies (at the Z-peak or as a Higgs factory).

For some context for the very high construction costs of these machines, the CERN budget is currently around \$1.2 billion/year. It seems likely that member states will be willing to keep funding CERN at this level in the future, but I have no idea what prospects if any there are for significantly increased contributions to pay for a new collider. A \$10 billion FCC-ee construction cost spread out over 20 years would be \$500 million/year. Can this somehow be accommodated within CERN’s current budget profile? This seems difficult, but maybe not impossible. Where the additional \$17 billion for the FCC-hh might come from is hard to see.
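The back-of-the-envelope arithmetic above can be made explicit. A minimal sketch, using only the rough cost figures quoted in this post (these are ballpark estimates, not official CERN numbers):

```python
# Rough figures quoted above (USD); ballpark estimates, not official CERN numbers.
CERN_ANNUAL_BUDGET = 1.2e9   # current CERN budget, ~$1.2 billion/year
FCC_EE_COST = 10e9           # estimated FCC-ee construction cost
FCC_HH_COST = 17e9           # additional FCC-hh cost, on top of the FCC-ee
CONSTRUCTION_YEARS = 20      # assumed construction period

def annualized(total_cost, years=CONSTRUCTION_YEARS):
    """Spread a construction cost evenly over the build period."""
    return total_cost / years

fcc_ee_per_year = annualized(FCC_EE_COST)
print(f"FCC-ee: ${fcc_ee_per_year / 1e6:.0f}M/year, "
      f"{fcc_ee_per_year / CERN_ANNUAL_BUDGET:.0%} of the current budget")
# The FCC-ee alone works out to $500M/year, i.e. over 40% of the current
# annual budget; adding the FCC-hh would more than double that burden.
```

Even under the generous assumption of a 20-year construction period, the FCC-ee alone would absorb a large fraction of the current budget, which is the sense in which "difficult, but maybe not impossible" should be read.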

If none of these three options is affordable or deemed worth the cost, it looks like the only alternative for energy frontier physics is to do what the US has done: give up. The machines and costs being considered here are similar in scale to the SSC project, which would have been a 40 TeV CM energy, 87 km proton-proton collider, but was cancelled in 1993. Note that the capabilities of the SSC would have been roughly comparable to those of the HE-LHC (it had higher energy, lower luminosity). Since the SSC would have started physics around 2000, and an HE-LHC might be possible in 2040, one could say that the cancellation set back the field at least 40 years. The worst part of the SSC cancellation was that the project was underway and there was no fallback plan. It’s hard to overemphasize how disastrous this was for US HEP. Whatever the Europeans do, they need to be sure that they don’t end up with this kind of failure.

Faced with a difficult choice like this, there’s a temptation to want to avoid it, to believe that surely new technology will provide some more attractive alternative. In this case though, one is running up against basic physical limits. For circular electron-positron machines, synchrotron radiation losses go as the fourth power of the energy, whereas for linear machines one has to put a lot of power in, since one is accelerating and then dumping the beam, not storing it. For proton-proton machines, CM energy is limited by the strength of the dipole magnets one can build at a reasonable cost and operate reliably in a challenging environment. Sure, someday we may have appropriate cheap 60 T magnets, and a 100 TeV pp collider could be built at reasonable cost in the LHC tunnel. We might also have plasma wakefield technology that could accelerate beams of electrons and positrons to multi-TeV energies over a reasonable distance, with a reasonable luminosity. At this point though, I’m willing to bet that in both cases we’re talking about 22nd century technology, unlikely to arrive during the 21st. Similar comments apply to prospects for a muon collider.
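The fourth-power scaling mentioned above comes from the standard accelerator-physics formula for synchrotron energy loss per turn. For an ultrarelativistic electron of energy E in a ring of bending radius ρ, the loss per turn is often quoted in practical units as:

```latex
U_0 \;\simeq\; 88.5\,\mathrm{keV}\times\frac{\left(E\,[\mathrm{GeV}]\right)^{4}}{\rho\,[\mathrm{m}]}
```

Plugging in LEP2-like numbers (E ≈ 100 GeV, ρ ≈ 3 km) gives a loss of roughly 3 GeV per turn, a few percent of the beam energy on every revolution. Doubling the beam energy in the same tunnel multiplies the loss by 16, which is why circular electron-positron machines cannot be pushed much beyond LEP energies without an enormous increase in circumference.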

Another way to avoid the implications of this difficult choice is to convince oneself that cheaper experiments at low energy, or maybe astrophysical observations, can replace energy frontier colliders. Maybe one can get the same information about what is happening at the 1-10 TeV scale by looking at indirect effects at low energy. Unfortunately, I don’t think that’s very likely. There are things we don’t understand about particle physics that can be studied using lower energies (especially the neutrino sector) and such experiments should be pursued aggressively. One can hope that what we learn this way will substitute for what we could learn with an energy-frontier collider, but that may very well just be wishful thinking.

So, what to do? Give up, or start trying to find the money for a very long-term, very challenging project, one with an uncertain outcome? Unlike the case of the LHC, we have no good theoretical reason to believe that we will discover a new piece of fundamental physics using one of these machines. You can read competing arguments from Sabine Hossenfelder (here and here) and Tommaso Dorigo (here, here and here).

Personally, I’m on the side of not giving up on energy frontier colliders at this point, but I don’t think the question is an easy one (unlike the question of building the LHC, which was an easy choice). One piece of advice though is that experience of the past few decades shows you probably shouldn’t listen to theorists. A consensus is now developing that HEP theory is in “crisis”, see for instance this recent article, where Neil Turok says “I’m busy trying to persuade my colleagues here to disregard the last 30 years. We have to retrace our steps and figure out where we went wrong.” If the Europeans do decide to build a next generation machine, selling the idea to the public is not going to be made easier by some of the nonsense from theorists used to sell the LHC. People are going to be asking “what about those black holes the LHC was supposed to produce?” and we’re going to have to tell them that that was a load of BS, but that this time we’re serious. This is not going to be easy…

Update: Some HEP experimentalists are justifiably outraged at some of the negative media stories coming out that extensively quote theorists mainly interested in quantum gravity. There are eloquent Twitter threads by James Beacham and Salvatore Rappoccio, responding to this Vox story. The Vox story quotes no experimentalists, instead quoting extensively three theorists working on quantum gravity (Jared Kaplan, Sabine Hossenfelder and Sean Carroll). Not to pick specifically on Kaplan, but he’s a good example of the point I was making above about listening to theorists. Ten years ago his work was being advertised with:

As an example question, which the LHC will almost certainly answer—we know that the sun contains roughly 10^60 atoms, and that this gigantic number is a result of the extreme weakness of gravity relative to the other forces—so why is gravity so weak?

Enthusiasm for the LHC then based on the idea that it was going to tell us about gravity was always absurd, and a corresponding lack of enthusiasm for a new collider based on negative LHC results on that front is just as absurd.

Update: Commenter abby yorker points to this new opinion piece at the New York Times, from Sabine Hossenfelder. The subtitle of the piece is “Ten years in, the Large Hadron Collider has failed to deliver the exciting discoveries that scientists promised.” This is true enough, but by not specifying the nature of the failure and which scientists were responsible, it comes off as blaming the wrong people, the experimentalists. Worse, it uses this failure to argue against further funding not of failed theory, but of successful experiment.

The LHC machine and the large-scale experiments conducted there have not in any sense been a failure, quite the opposite. The machine has worked very well, at much higher than design luminosity, close to design energy (which should be achieved after the current shutdown). The experiments have been a huge success on two fronts. In one direction, they’ve discovered the Higgs and started detailed measurements of its properties, in another they’ve done an amazing job of providing strong limits on a wide range of attempted extensions of the standard model.

These hard-won null results are not a failure of the experimental program, but a great success of it. The only failure here is that of the theorists who came up with bad theory and ran a hugely successful hype campaign for it. I don’t see how the lesson from seeing an experimental program successfully shoot down bad theory is that we should stop funding further such experiments. I also don’t see how finding out that theorists were wrong in their predictions of new phenomena at the few hundred GeV scale means that new predictions by (often the same) theorists of no new phenomena at the multiple TeV scale should be used as a reason not to fund experimentalists who want to see if this is true.

Where I think Hossenfelder is right is that too many particle physicists of all kinds went along with the hype campaign for bad theory in order to get people excited about the LHC. Going on about extra dimensions and black holes at the LHC was damaging to the understanding of what this science is really about, and completely unnecessary since there was plenty of real science to generate excitement. The discussion of post-LHC experimental projects should avoid the temptation to enter again into hype-driven nonsense. On the other hand, the discussion of what to defund because of the LHC results should stick to defunding bad theory, not the experiments that refute it.

Update: Some more commentary about this, from Chris Quigg, and the CERN Courier. In particular, the CERN Courier has this from Gerard ‘t Hooft:

Most theoreticians were hoping that the LHC might open up a new domain of our science, and this does not seem to be happening. I am just not sure whether things will be any different for a 100 km machine. It would be a shame to give up, but the question of whether spectacular new physical phenomena will be opened up and whether this outweighs the costs, I cannot answer. On the other hand, for us theoretical physicists the new machines will be important even if we can’t impress the public with their results.

and, from Joseph Incandela:

While such machines are not guaranteed to yield definitive evidence for new physics, they would nevertheless allow us to largely complete our exploration of the weak scale… This is important because it is the scale where our observable universe resides, where we live, and it should be fully charted before the energy frontier is shut down. Completing our study of the weak scale would cap a short but extraordinary 150 year-long period of profound experimental and theoretical discoveries that would stand for millennia among mankind’s greatest achievements.

Update: Also, commentary at Forbes from Chad Orzel here.

Update: I normally try and not engage with Facebook, and encourage others to follow the same policy, but there’s an extensive discussion of this topic at this public Facebook posting by Daniel Harlow.

Posted in Experimental HEP News, Favorite Old Posts | 78 Comments

Michael Atiyah 1929-2019

While away on vacation, I heard last week the sad news of the death of Michael Atiyah, at the age of 89. Atiyah was both a truly great mathematician and a wonderful human being. In his mathematical work he simultaneously covered a wide range of different fields, often making deep connections between them and providing continual new evidence of the unity of mathematics. This unifying vision also encompassed physics, and the entire field of topological quantum field theory was one result.

I had the great luck to be at MSRI during the 1988-89 academic year, when Atiyah spent that January there. Getting a chance to talk to him then was a remarkable experience. He had one of the quickest minds I’ve ever seen, often grasping what you were trying to explain before the words were out of your mouth. At one point that month I ran into Raoul Bott walking away from an ongoing discussion with Atiyah and Witten at a blackboard. Bott shook his head, saying something like “it’s just too scary listening to the two of them”.

Any question, smart or stupid, would lead to not just an answer, but a fascinating explanation of all sorts of related issues and conjectures. For Atiyah, his love of discussing mathematics was something to be shared at all times, with whoever happened to be around.

The last time I met him was in September 2016 in Heidelberg. He was his usual cheerful and engaging self, still in love with mathematics and with discussing it with anyone who would listen. I did notice though that age had taken its toll, in the sense that he no longer would engage with anything that got into the sort of complexities that in the past he had been quick to see his way through. It’s unfortunate that near the end of his life far too much attention was drawn to implausible claims he started making that he could see how to solve some of the most difficult and intractable open problems of the subject.

There’s a lot more I could write here about Atiyah and his remarkable career, but I’ve realized that most of it I’ve already gotten to in one post here or another. So, for more, see some of the following older posts, which discussed:

Interviews and profiles here, here and here.

Atiyah and his work with Raoul Bott.

Atiyah and topological quantum field theory.

Update: In recent years Andrew Ranicki had been maintaining a page with Atiyah-related links.

Posted in Obituaries | 7 Comments

Roy Glauber 1925-2018: Notes on QFT

I saw today that Roy Glauber has passed away, at the age of 93. John Preskill speculates that Glauber was the last living member of the wartime T division at Los Alamos.

My only interaction with him was that he was the instructor for the first quantum field theory course I took, at Harvard during the 1976-77 academic year. The course was my first exposure to quantum field theory, and was taught from what seemed then (at the time of the advent of gauge theories and wide use of the path integral method) a rather stodgy point of view. It’s one however that I have later in life come to appreciate more.

I just located the binder of notes I kept from the class and plan to look over them. It occurred to me that if I want to look at these on vacation, the thing to do is to scan them. So, I just did this, and am making the scans available here in case others are interested:

Roy Glauber: Quantum Field Theory notes 1976-77

Roy Glauber: Quantum Field Theory problems and solutions 1976-77

Update: The New York Times has an obituary here.

Posted in Obituaries | 13 Comments

This Week’s Hype (and a couple other things)

For today’s university press release designed to mislead the public with hype about string theory, Uppsala University has Our Universe: An expanding bubble in an extra dimension. It’s the Swampland variant of string theory hype, based on this preprint, which is now this PRL. Marketing to other press outlets as usual starts here and here.

In the current Swampland hype, string theorists have “discovered” that string theory doesn’t really necessarily have that landscape of vacua making it untestable, and now we’re finally on our way to testing string theory. In this press release version:

For 15 years, there have been models in string theory that have been thought to give rise to dark energy. However, these have come in for increasingly harsh criticism, and several researchers are now asserting that none of the models proposed to date are workable….

The Uppsala scientists’ model provides a new, different picture of the creation and future fate of the Universe, while it may also pave the way for methods of testing string theory.

For some other things that may be of more interest:

Finally, I’m leaving tomorrow for a two-week or so vacation in France, blogging likely slim to non-existent.

Posted in Swampland, This Week's Hype, Uncategorized | 2 Comments

Tim May 1951-2018

I was sorry to learn yesterday of the death of Tim May, who had been a frequent commenter here on the blog. For more about his life, see here and here.

One can find his comments here for instance by this search. In some of these he told a bit of the story of his life. I’ll include here part of one such comment:

In 1970 I was accepted at MIT, Stanford, and Berkeley for college. I transferred my acceptance and Regents Scholarship from Berkeley to UC Santa Barbara. A lesser school compared to Berkeley, on overall grounds, but a more interesting fit to my interests. (College of Creative Studies, with many advantages.)

By around 1972 it was clear the Big Drought was unfolding. Tales of Ph.D.s driving taxi cabs, professors advising that the odds of the then-current Ph.D. candidates getting a real position were dwindling. (Besides the overall downsizing of HEP and other physics funding, there was a glut of physics professors who had been hired in the post-Sputnik boom era….and they were still 30 years or more from retirement.)

Fast forwarding, I decided to not apply to grad school and instead join a small semiconductor company. There, I worked on a bunch of “engineering physics” problems. Because we were the leaders in dynamic RAM memory, I had exposure to some interesting problems. One of them was the mysterious issue of bits sometimes being flipped, but not permanently. In fact, the bit flips were apparently random and occurred only once (or at least close to only once…).

My physics background served me well, as I knew about the physics of how the devices worked (more so than a lot of the EE folks, who thought in terms of circuits), and I knew some geology. I had a brain storm that maybe low levels of uranium or thorium or the like in our ceramic and glass packages were causing the problem. Some experiments confirmed this. And all of the physics calculations about charged particle tracks in silicon matched. A lot of stuff I don’t have the space here to describe.

So my career was launched. Lots of papers on this “soft error” phenomenon. (Oh, and the cosmic ray corollary was indeed obvious: but in 1978 when the first paper was presented, it was insignificant as a source as compared to alpha particles.)

Instead of spending until 1980-82 doing a Ph.D. and then 4-8 years or more as a post-doc, I had some fun and retired from Intel in 1986.

I’ve been pleasantly able to pursue whatever interested me ever since.

Posted in Obituaries | 2 Comments

The Chronic Incompleteness of String Theory

Ex-string theorist turned philosopher Richard Dawid has become known over the years for his arguments that string theory is a theory to be evaluated not by the conventional scientific method, in which experiment plays a role, but by “post-empirical theory assessment” methods. He has a book about this, and I’ve written about his arguments here, here and here.

Today he has a new paper out, entitled Chronic Incompleteness, Final Theory Claims, and the Lack of Free Parameters in String theory, which tries to address the “chronic incompleteness” problem of string theory’s claim to be a complete unified theory. This problem is starting to look very serious:

Rather than bringing the time horizon for the completion of fundamental physics from virtual infinity to somewhere within our lifetime, string theory’s final theory claim seems to be associated with an extension of the time horizon for the completion of this particular theory that may, once again, virtually reach towards infinity.

In other words, there’s a chronic problem of string theorists not being able to tell us what the theory actually is, and now it’s looking like they’ll never be able to do so. Dawid is right that this is the root of the problem, not usual excuses like “it predicts stuff, but you’d need an accelerator as big as the galaxy to test these predictions” or “the equations are just too hard to solve”.

One question here is that of defining what “string theory” even means anymore. Dawid does the best he can with this in a footnote:

Here as throughout the entire paper, the term ‘string theory’, if not specified otherwise, denotes the overall theory that aims at describing the observed world and is identified by the present knowledge on perturbative superstring theory, duality relations, etc.

This isn’t exactly a precise definition, it’s basically just “string theory is a conjectural theory with a certain list of properties which I’m not going to even try and describe, since that would get really complicated and different people likely have different lists”.

In the main text, Dawid explains that “string theory has no fundamental dimensionless free parameters”, a claim often made that I’ve always found kind of baffling. If you don’t know what the theory is, how do you know that it doesn’t have free parameters??? He makes a great deal of this assumption, adding in the argument that this lack of parameters means no classical limit, and I guess thus no formulation of the theory as “quantization” of something describable in classical terms.

I don’t really see what the big deal is about having a quantum system that is not defined as the quantization of some classical system. A simple example of such a system is the qubit that we often start teaching quantum mechanics with. Somehow Dawid wants to get from “not quantization of a classical system” to “we can’t ever hope to write down the theory”, but I don’t see how this follows.

He examines various possibilities for how the problem of no fundamental theory can be resolved. His alternative C is the obvious one: we just haven’t found it yet. He would like to argue that this might not be right, that string theory is a new and different kind of science:

…string theory and the conceptual context within which it is developed is in a number of ways substantially different from anything physicists have witnessed up to this point. Therefore, it is far from clear whether prevalent physical intuitions as to which kinds of questions can be expected to have a fully calculable theoretical answer are applicable in this case. It seems difficult to rule out that what seems to be a question that finds a fully calculable theoretical answer in fact rather resembles the case of the leaf carried by autumn winds and just defies calculation.

Dawid seems to argue that string theory may be an example of what he calls alternative A:

Even in principle, there exists no mathematical scheme that is empirically equivalent to string theory and generates quantitative results that specify the fundamental dynamics of the theory. In that case, the fundamental theory is conceptually incomplete by its very nature. It has no fundamental dynamics and no set of solutions that can be deduced from its first principles. The fundamental theory merely serves as a conceptual shell that embeds low energy descriptions (ground states of the theory) consistent with the principles encoded in the fundamental theory. Those low energy descriptions contain specified parameter values and do generate quantitative results. But there is no way to establish from first principles how probable specific ground states of the system are.

His summary of his vision of string theory is as follows:

Full access to a theory without free parameters thus might be expected to require representations that don’t have their own classical limit. The fact that they cannot be developed by generalizing away from a classical limit seems to impede the full formulation of a final theory even once one has found it. The resulting idea of a fundamental theory whose full formulation is hidden from the physicists’ grasp because its most adequate representation lacks intuitive roots has even more radical rivals, which amount to questioning the possibility of calculating the dynamics of the fundamental theory either within the bounds of human calculational power or as a matter of principle.

At one point Dawid acknowledges that some people have drawn the obvious conclusion about the current situation, the one consistent with our usual understanding of science:

It has been suggested by various exponents and observers of contemporary fundamental physics (see e.g. Smolin 2003, Woit 2003, Hossenfelder 2018) that the chronic incompleteness of string theory represents a substantial failure of the research program that is indicative of a strategical problem that has afflicted fundamental physics in recent decades.

He doesn’t like this conclusion, so argues that this time it’s different:

Considering the range and character of the very substantial differences that set the current state of fundamental physics apart from any previous stage in the history of physics, there is little reason to expect that theory building at the present stage can be judged according to criteria that seemed adequate in the past.

While he doesn’t say so, this argument takes him back to the problem of how one is to judge “string theory”, but taking a position even more radical than his earlier one. The argument now seems to be that we’re supposed to consider accepting as the final, fundamental theory of physics, a “theory” that is not just untestable, but is a “chronically incomplete” framework based on something we can never hope to define or understand. I’m having trouble understanding why this is supposed to be science rather than another human endeavor that it looks a lot more like, theology.

Posted in Uncategorized | 12 Comments

This Week’s Hype

A few months ago string theorists at Stanford had their university press office put out hype-filled promotional material about their research field, this was discussed here. One odd thing about this was that normally such PR efforts are made in connection with news of a supposed advance, but this press story had no news, just promotion of old and unsuccessful ideas.

This week it’s the turn of the string theorists at Princeton University, with Beyond Einstein: Physicists find surprising connections in the cosmos. Like the Stanford one, this story is not about any recent advance (there haven’t been any) but just a recounting of something from twenty years ago, without any acknowledgment that things haven’t worked out as hoped. To compare and contrast, another Princeton press office effort on the same topic back in 2007 (see here) at least had a specific new paper to advertise. The level of deception of the public though remains constant, no change from the 2007 highly misleading “Princeton Physicists Connect String Theory With Established Physics”.

The way this works, the misleading university press release gets picked up by others, who either just pass along the press release or write something based on it (with an even more misleading title). As an example of the first, phys.org yesterday had Gravity is mathematically relatable to dynamics of subatomic particles. For the second, today Science Alert has Here’s Why String Theory Might Actually Point Us Towards a ‘Theory of Everything’. I suspect we’ll see yet more of this in the next few days as the hype-diffusion process initiated by the Princeton theorists takes its usual course.

The basic, intentionally misleading, PR claim at the bottom of this is the characterization of AdS/CFT as:

The key insight is that gravity, the force that brings baseballs back to Earth and governs the growth of black holes, is mathematically relatable to the peculiar antics of the subatomic particles that make up all the matter around us.

What’s not mentioned is that this has nothing to do with either gravity as experienced by baseballs, or the subatomic particles that make up matter. The AdS/CFT conjecture relates gravity in the wrong space-time dimension (5), with the wrong space-time curvature (AdS), to a quantum field theory that doesn’t describe any known particles (N=4 super Yang-Mills). For the last twenty years there has been lots of speculation about the possibility of extending this to the real world cases, but this hasn’t worked out. There’s no known QFT dual to gravity in the physical dimension with the physical sign of curvature, and, from the other side, no known gravity theory dual to the Standard Model.

As with the recent Stanford effort, the great thing about having your own university press office do this and not involve journalists is that they just talk about you and let you say whatever misleading thing you want. No danger that this kind of story will raise embarrassing questions or that it will give a voice to or even acknowledge the existence of anyone likely to raise objections.

Posted in This Week's Hype | 3 Comments

Notes on Current Affairs

Blogging has been light recently, partly due to quite a bit of traveling. This included a brief trip the week before last to Los Angeles, where I met up with, among others, Sabine Hossenfelder. This past week I was in Washington DC for a few days, and gave a talk at the US Naval Observatory, one rather similar to my talk earlier this year in Rochester. In addition, I’ve been trying to spend my time on more fruitful activities, especially a long-standing unfinished project to make sense of the relationship between BRST and Dirac cohomology. Optimistically, I’ll have a finished (or at least finished enough to make public) version of something later in January, after taking a couple weeks at the beginning of the month for a vacation in France.

I’m completely in agreement with Sabine about the sad state of high energy particle theory, and glad to see that she has been forcefully trying to get people to acknowledge the problem. I don’t agree though with her “Lost in Math” characterization of the problem, and my talks in DC and Rochester tried to make the case that what is needed is more interaction with mathematics, not less.

Here are various items that might be of interest concerning the state of high energy physics theory:

  • A Y Combinator blog interview with Lenny Susskind. I think it’s fair to say that Susskind now admits that string theory, as currently understood, cannot explain the Standard Model, and that as a result he has given up on trying to make any progress on particle theory. He says:

    My guess is, the theory of the real world may have things to do with string theory but it’s not string theory in its formal, rigorous, mathematical sense. We know that the formal, by formal I mean mathematically, rigorous structure that string theory became. It became a mathematical structure of great rigor and consistency that it, in itself, as it is, cannot describe the real world of particles. It has to be modified, it has to be generalized, it has to be put in a slightly bigger context. The exact thing, which I call string theory, which is this mathematical structure, is not going to be able to, by itself, describe particles…

    We made great progress in understanding elementary particles for a long time, and it always progressed, though, hand-in-hand with experimental developments, big accelerators and so forth. We seem to have run out of new experimental data, even though there was a big experimental project, the LHC at CERN, whatever that is? A great big machine that produces particles and collides them. I don’t want to use the word disappointingly, well, I will anyway, disappointingly, it simply didn’t give any new information. Particle physics has run into, what I suspect is a temporary brick wall, it’s been, basically since the early 1980s, that it hasn’t changed. I don’t see at the present time, for me, much profit in pursuing it.

  • Susskind instead spends his time on highly speculative ideas relating geometry and quantum theory, with the idea that while this has no connection to particle physics, it might somehow lead to progress in understanding quantum gravity. Natalie Wolchover at Quanta has a new story about Susskind’s latest speculations.
  • This past week the Simons Foundation-funded “It from Qubit” collaboration has been having a two-part conference. This started at the IAS, talks available here, then moved on to the Simons Foundation headquarters in NYC (see here). Videos of the IAS part are available; for the NYC part, there’s Twitter. George Musser reports that Juan Maldacena has figured out how to construct (in principle) traversable wormholes, and that he’s arguing that “quantum computers are so powerful that they create spacetime”. For a tweet showing a summary of what has been achieved, see here.

    Personally I’ve always been dubious that we’ll ever have a useful “quantum theory of gravity” unless we have some sort of unification with the standard model, which would provide a connection to things we can understand and measure. Lacking such a connection, another way to go would be to try to evaluate a “quantum theory of gravity” proposal based on its mathematical consistency, coherence and beauty. My problem with the “It from Qubit” program is that, even ignoring the way it gives up on connecting to what we understand, I’ve never seen anything coming out of it that looks like an actual well-defined theory of quantum gravity that one could evaluate as a mathematical model consistent with quantum mechanics and what we know about 3+1d general relativity.

  • For something about quantum computation and its relation to fundamental physics that I can understand, John Preskill has a wonderful article on Simulating quantum field theory with a quantum computer. Nothing there I can see about quantum gravity.
  • Given that its founder Susskind and the other leading figures of the field at the IAS have pretty much given up on the project of relating string theory to particle physics, an interesting question is why so many researchers are still working on this failed project. The source of this pathology and the question of what can be done about it are, I think, at the center of Sabine Hossenfelder’s book and recent blogging. An important question that is getting raised here is that of the damage this situation is doing to the credibility of science. If you want to fight the good fight against those who, because it threatens their tribe, want to deny the facts of climate science, is it helpful if many of the best and brightest in science are denying facts that threaten their tribe? Scientific American has a story about Why Smart People Are Vulnerable to Putting Tribe Before Truth, but it doesn’t make clear the depth of the problem (i.e. that some of the smartest scientists around are doing this).
  • For an example of the problem, see an interview with Gabriele Veneziano about the history and current state of string theory. He’s in denial of the obvious fact that string theory makes no predictions:

    People say that string theory doesn’t make predictions, but that’s simply not true. It predicts the dimensionality of space, which is the only theory so far to do so, and it also predicts, at tree level (the lowest level of approximation for a quantum-relativistic theory), a whole lot of massless scalars that threaten the equivalence principle (the universality of free-fall), which is by now very well tested. If we could trust this tree-level prediction, string theory would be already falsified. But the same would be true of QCD, since at tree level it implies the existence of free quarks. In other words: the new string theory, just like the old one, can be falsified by large-distance experiments provided we can trust the level of approximation at which it is solved. On the other hand, in order to test string theory at short distance, the best way is through cosmology. Around (i.e. at, before, or soon after) the Big Bang, string theory may have left its imprint on the early universe and its subsequent expansion can bring those to macroscopic scales today.

    This take on how you evaluate a theory by comparing it to experiment is not one that will give the average person much understanding of the scientific method or much confidence in scientists and their devotion to it.

Posted in Uncategorized | 19 Comments

The End of (one type of) Physics, and the Rise of the Machines

Way back in 1996 science writer John Horgan published The End of Science, in which he made the argument that various fields of science were running up against obstacles to any further progress of the magnitude they had previously experienced. One can argue about other fields (please don’t do it here…), but for the field of theoretical high energy physics, Horgan had a good case then, one that has become stronger and stronger as time goes on.

A question I always wondered about was what things would look like once the subject reached the endpoint where progress had stopped more or less completely. In the book, Horgan predicted:

A few diehards dedicated to truth rather than practicality will practice physics in a nonempirical, ironic mode, plumbing the magical realm of superstrings and other esoterica and fret­ting about the meaning of quantum mechanics. The conferences of these ironic physicists, whose disputes cannot be experimentally resolved, will become more and more like those of that bastion of literary criticism, the Modern Language Association.

This is now looking rather prescient. For some other very recent indications of what this endpoint looks like, there’s the following:

  • In today’s New York Times, in celebration of forty years of the Science Times section, Dennis Overbye has a piece reporting that Physicists are no longer unified in the search for a unified theory. His main example is the recent Quanta article by the IAS director that got headlined There Are No Laws of Physics. There’s Only the Landscape. The latest from Dijkgraaf is that string theory is probably the answer, but we don’t know what string theory is:

    Probably there is some fundamental principle, he said, perhaps whatever it is that lies behind string theory.
    But nobody, not even the founders of string theory, can say what that might be.

  • Overbye also quotes Sabine Hossenfelder, who is now taking on the thankless role of the field’s Jeremiah. Her latest blog posting, The present phase of stagnation in the foundations of physics is not normal, is a cry of all too justifiable frustration at the sad state of the subject and the refusal by many to acknowledge what has happened. Well worth paying attention to are comments from Peter Shor here and here.

Another frightening vision of the future of this field that has recently struck me as all too plausible has turned up appended to a piece entitled The Twilight of Science’s High Priests, by John Horgan at Scientific American. This is a modified version of a review of books by Hawking and Rees that Horgan wrote for the Wall Street Journal, and it attracted a response from Martin Rees, who has this to say about string theory:

On string theory, etc., I’ve been wondering about the possibility that an AI may actually be able to ‘learn’ a particular model and calculate its consequences even if this was too hard for any human mathematician. If it came up with numbers for the physical constants that agreed (or that disagreed) with the real world, would we then be happy to accept its verdict on the theory? I think the answer is probably ‘yes’ — but it’s not as clear-cut as in the case of (say) the 4-colour theorem — in that latter case the program used is transparent, whereas in the case of AI (even existing cases like AlphaGo Zero) the programmer doesn’t understand what the computer does.

This is based on the misconception about string theory that the problem with it is that “the calculations are too hard”. The truth of the matter is that there is no actual theory, no known equations to solve, no real calculation to do. But, with the heavy blanket of hype surrounding machine learning these days, that doesn’t really matter, one can go ahead and set the machines to work. This is becoming an increasingly large industry, see for instance promotional pieces here and here, papers here, here, here and here, and another workshop coming up soon.

For an idea of where this may be going, see Towards an AI Physicist for Unsupervised Learning, by Wu and Tegmark, together with articles about this here and here.

Taking all these developments together, it starts to become clear what the future of this field may look like, and it’s something even Horgan couldn’t have imagined. As the machines supersede humans’ ability to do the kind of thing theorists have been doing for the last twenty years, they will take over this activity, which they can do much better and faster. Biological theorists will be put out to pasture, with the machines taking over, performing ever more complex, elaborate and meaningless calculations, for ever and ever.

Update: John Horgan points out to me that he had thought of this, with a chapter at the end of his book, “Scientific Theology, or the End of Machine Science” which discusses the possibility of machines taking over science.

Posted in Multiverse Mania | 43 Comments

Langlands/Frenkel and Some Other Things

The Canadian publication The Walrus today has a wonderful article about Robert Langlands, focusing on his attitude towards the geometric Langlands program and its talented proponent Edward Frenkel. I watched Frenkel’s talk at the ongoing Minnesota conference via streaming video (hopefully the video will be posted soon), and it was an amazing performance on multiple levels. A large part of it was a beautiful explanation of the history and basic conception of what has come to be known as geometric Langlands. He then went on to explain carefully some of the ideas in the recent Russian paper by Langlands, basically saying that they worked in the Abelian case, but could not work in the non-Abelian case. He ended by describing some alternate ideas that he is working on with David Kazhdan. Langlands was in the audience and at the end of the talk rose to comment extensively, but I couldn’t hear his side of this since he had no microphone (that Frenkel was sticking to his guns though was clear).

Besides giving the talk, Frenkel has made available a manuscript which gives a much more detailed version of the talk. See section 3.5 for an explanation of what he sees as the fundamental problem with what Langlands is trying to do: even in the simpler case of G/B over the complex field, you can’t successfully define a Hecke algebra in the way that Langlands wants.

The conference is finishing up right now, with final remarks by Langlands coming up later this afternoon.

A few more items, mostly involving my Columbia math department colleagues:

  • If you connect quickly to the streaming video from Minnesota, you may be able to catch Michael Harris’s talk on local Langlands.
  • Quanta magazine has an article about a recent proof of an old conjecture by Dorian Goldfeld about ranks of elliptic curves. This is due to Alexander Smith, now a fourth year graduate student at Harvard (he started working on this while an undergrad at Princeton, with Shouwu Zhang). His twin brother Geoffrey is also a math grad student at Harvard.
  • Andrei Okounkov has been giving some talks recently at various places about developments in geometric representation theory with some connection to physics, under the title New worlds for Lie Theory. The slides from the ICM version of the talk are here.
  • For those more interested in physics than mathematics, the new issue of Inference has some articles you might enjoy. In particular, Sheldon Glashow is no fan (neither is Chris Fuchs) of the book I reviewed here.

Update: Michael Harris is appearing via Skype from his home near here, since transportation out of NYC yesterday was mostly shut down (very early season unprecedented snowstorm, during rush hour…).

Update: I’m listening to the closing talk by Langlands. He is explaining his version of geometric Langlands, responds to criticism from Frenkel with “As far as I know there are no errors in the paper, no matter what you may see elsewhere”. He ends his talk with something like “At the last page I threw down my pen… It works and it works by a miracle. Don’t doubt it, it does work!”

Update: Another livestream, starting in moments: Alice and Bob Meet the Wall of Fire, a panel discussion with Quanta writers at the Simons Foundation.

Update: Videos from the Langlands Abel conference are now available, in particular Frenkel here and Langlands here.

Update: For another expository piece about the Langlands program, one that I somehow missed when it came out recently, see Sol Friedberg’s What is the Langlands Program? in the AMS Notices.

Update: An updated version of Frenkel’s notes is now available at the arXiv. Highly recommended for its lucid explanation of the form the geometric Langlands program has taken.

Posted in Langlands, Uncategorized | 14 Comments