The European HEP community is now engaged in a “Strategy Update” process, the next step of which will be an open symposium this May in Granada. Submissions to the process were due last month, and I assume that what was received will be made publicly available at some point. This is supposed to ultimately lead to the drafting of a new European HEP strategy next January, for approval by the CERN Council in May 2020.
The context of these discussions is that European HEP is approaching a very significant crossroads, and decisions about the future will soon need to be made. The LHC will be upgraded in coming years to a higher luminosity, ultimately rebranded as the HL-LHC, to start operating in 2026. After 10-15 years of operation in this higher-luminosity mode, the LHC will reach the end of its useful life: the marginal extra data accumulated each year will stop being worth the cost of running the machine.
Planning for the LHC project began back in the 1980s, and construction was approved in 1994. The first physics run was 16 years later, in 2010. Keep in mind that the LHC project started with a tunnel and a lot of infrastructure already built, since the LEP tunnel was being reused. If CERN decides it wants to build a next generation collider, this could easily take 20 years to build, so if one wants it to be ready when the LHC shuts down, one should have started already.
Some of the strategy discussion will be about experiments that don’t require the highest possible collision energies (the “energy frontier”), for instance those that study neutrinos. Among possibilities for a new energy frontier collider, the main ones that I’m aware of are the following, together with some of their advantages and drawbacks:
- FCC-ee: This would be an electron-positron machine built in a new 100 km tunnel, operating at CM energies from 90 to 365 GeV. It would provide extremely high numbers of events when operated at the Z-peak, and could also be operated as a “Higgs factory”, providing a very large number of Higgs events to study, in a much cleaner environment than that provided by a proton-proton collider like the LHC.
In terms of drawbacks, it is estimated to cost \$10 billion or so. The CM energy is quite a bit less than that of the LHC, so it seems unlikely that there are new unknown states that it could study, since these would have been expected to show up by now at the LHC (or at LEP, which operated at 209 GeV at the end).
Another point in favor of the FCC-ee proposal is that it would allow for reuse of the tunnel (just as the LHC followed on LEP) for a very high energy proton-proton collider, called the FCC-hh, which would operate at a CM energy of 100 TeV. This would be a very expensive project, estimated to cost \$17 billion (on top of the previous \$10 billion cost of the FCC-ee).
- HE-LHC: This would essentially be a higher energy version of the LHC, in the same tunnel, built using higher field (16 T vs. 8.33 T) magnets. It would operate at a CM energy of 27 TeV. The drawbacks are that, while construction would be challenging (there are not yet appropriate 16 T magnets), only a modest (27 vs. 14 TeV) increase in CM energy would be achieved. The big advantage over the FCC-hh is cost: much of the LHC infrastructure could be reused and the machine is smaller, so the total cost estimate is about \$7 billion.
- CLIC: This would be a linear electron-positron collider, with the first stage of the project an 11 km-long machine that would operate at 380 GeV CM energy and cost about \$6 billion. The advantage of this machine over the circular FCC-ee is that it could ultimately be extended to a longer 50 km machine operating at 3 TeV CM energy (at a much higher cost). The disadvantage with respect to the FCC-ee is that it is not capable of operating at very high luminosity at lower energies (at the Z-peak or as a Higgs factory).
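The energy figures in the list above hang together via a simple scaling law: for a circular proton machine, the beam energy is set by the dipole field and the bending radius, roughly E[TeV] ≈ 0.3 · B[T] · ρ[km] per beam. A back-of-envelope sketch (the bending radii here are my own rough assumptions, not official design numbers):

```python
# Rough scaling for circular proton colliders: beam energy [TeV] ~ 0.3 * B[T] * rho[km],
# where rho is the magnetic bending radius (less than circumference / 2*pi, since
# part of the ring is straight sections). Bending radii are rough assumptions.

def cm_energy_tev(b_field_t, bending_radius_km):
    """Center-of-mass energy of a pp collider: twice the per-beam energy."""
    return 2 * 0.3 * b_field_t * bending_radius_km

lhc = cm_energy_tev(8.33, 2.8)     # LHC: 8.33 T in the 27 km tunnel -> ~14 TeV
he_lhc = cm_energy_tev(16.0, 2.8)  # HE-LHC: 16 T in the same tunnel -> ~27 TeV
fcc_hh = cm_energy_tev(16.0, 10.4) # FCC-hh: 16 T in a ~100 km tunnel -> ~100 TeV
```

This is why the HE-LHC buys only a factor of about two: with the tunnel fixed, the CM energy can only grow as fast as the dipole field.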
For some context for the very high construction costs of these machines, the CERN budget is currently around \$1.2 billion/year. It seems likely that member states will be willing to keep funding CERN at this level in the future, but I have no idea what prospects if any there are for significantly increased contributions to pay for a new collider. A \$10 billion FCC-ee construction cost spread out over 20 years would be \$500 million/year. Can this somehow be accommodated within CERN’s current budget profile? This seems difficult, but maybe not impossible. Where the additional \$17 billion for the FCC-hh might come from is hard to see.
If none of these three alternatives is affordable or deemed worth the cost, it looks like the only alternative for energy frontier physics is to do what the US has done: give up. The machines being considered here, and their costs, are similar in scale to the SSC project, which would have been a 40 TeV CM energy 87 km proton-proton collider but was cancelled in 1993. Note that the capabilities of the SSC would have been roughly comparable to the HE-LHC (it had higher energy, lower luminosity). Since it would have started physics around 2000, and an HE-LHC might be possible in 2040, one could say that the SSC cancellation set back the field at least 40 years. The worst part of the SSC cancellation was that the project was underway and there was no fallback plan. It’s hard to overemphasize how disastrous this was for US HEP. Whatever the Europeans do, they need to be sure that they don’t end up with this kind of failure.
Faced with a difficult choice like this, there’s a temptation to want to avoid it, to believe that surely new technology will provide some more attractive alternative. In this case though, one is running up against basic physical limits. For circular electron-positron machines, synchrotron radiation losses go as the fourth power of the energy, whereas for linear machines one has to put a lot of power in since one is accelerating then dumping the beam, not storing it. For proton-proton machines, CM energy is limited by the strength of the dipole magnets one can build at a reasonable cost and operate reliably in a challenging environment. Sure, someday we may have appropriate cheap 60 T magnets and a 100 TeV pp collider could be built at reasonable cost in the LHC tunnel. We might also have plasma wakefield technology that could accelerate beams of electrons and positrons to multi-TeV energies over a reasonable distance, with a reasonable luminosity. At this point though, I’m willing to bet that in both cases we’re talking about 22nd century technology, unlikely to arrive in the 21st. Similar comments apply to prospects for a muon collider.
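To make the fourth-power scaling concrete, here is a sketch using the textbook formula for the energy an electron beam radiates away per turn (the bending radii below are rough assumptions for illustration):

```python
# Synchrotron energy loss per turn for electrons in a storage ring:
#   U[GeV] ~ 8.85e-5 * E[GeV]**4 / rho[m]
# The E**4 dependence is what makes circular e+e- machines impractical
# beyond a few hundred GeV. Bending radii below are rough assumptions.

def loss_per_turn_gev(beam_energy_gev, bending_radius_m):
    return 8.85e-5 * beam_energy_gev**4 / bending_radius_m

lep2 = loss_per_turn_gev(104.5, 3100)     # LEP2: ~3 GeV lost per turn, ~3% of beam energy
fcc_ee = loss_per_turn_gev(182.5, 10400)  # FCC-ee at 365 GeV CM: ~9-10 GeV per turn
ratio = loss_per_turn_gev(200, 10400) / loss_per_turn_gev(100, 10400)  # doubling E costs 16x
```

Even with a tunnel nearly four times the size of LEP’s, the FCC-ee at its top energy would lose several times more energy per turn than LEP did, which is roughly why 365 GeV is where the circular e+e- design stops.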
Another way to avoid the implications of this difficult choice is to convince oneself that cheaper experiments at low energy, or maybe astrophysical observations, can replace energy frontier colliders. Maybe one can get the same information about what is happening at the 1-10 TeV scale by looking at indirect effects at low energy. Unfortunately, I don’t think that’s very likely. There are things we don’t understand about particle physics that can be studied using lower energies (especially the neutrino sector), and such experiments should be pursued aggressively. But the hope that what we can learn this way will replace what we could learn with an energy-frontier collider may very well be wishful thinking.
So, what to do? Give up, or start trying to find the money for a very long-term, very challenging project, one with an uncertain outcome? Unlike the case of the LHC, we have no good theoretical reason to believe that we will discover a new piece of fundamental physics using one of these machines. You can read competing arguments from Sabine Hossenfelder (here and here) and Tommaso Dorigo (here, here and here).
Personally, I’m on the side of not giving up on energy frontier colliders at this point, but I don’t think the question is an easy one (unlike the question of building the LHC, which was an easy choice). One piece of advice though is that experience of the past few decades shows you probably shouldn’t listen to theorists. A consensus is now developing that HEP theory is in “crisis”, see for instance this recent article, where Neil Turok says “I’m busy trying to persuade my colleagues here to disregard the last 30 years. We have to retrace our steps and figure out where we went wrong.” If the Europeans do decide to build a next generation machine, selling the idea to the public is not going to be made easier by some of the nonsense from theorists used to sell the LHC. People are going to be asking “what about those black holes the LHC was supposed to produce?” and we’re going to have to tell them that that was a load of BS, but that this time we’re serious. This is not going to be easy…
Update: Some HEP experimentalists are justifiably outraged at some of the negative media stories coming out that extensively quote theorists mainly interested in quantum gravity. There are eloquent Twitter threads by James Beacham and Salvatore Rappoccio, responding to this Vox story. The Vox story quotes no experimentalists, instead quoting at length three theorists working on quantum gravity (Jared Kaplan, Sabine Hossenfelder and Sean Carroll). Not to pick specifically on Kaplan, but he’s a good example of the point I was making above about listening to theorists. Ten years ago his work was being advertised with:
As an example question, which the LHC will almost certainly answer—we know that the sun contains roughly 10^60 atoms, and that this gigantic number is a result of the extreme weakness of gravity relative to the other forces—so why is gravity so weak?
Enthusiasm for the LHC then based on the idea that it was going to tell us about gravity was always absurd, and a corresponding lack of enthusiasm for a new collider based on negative LHC results on that front is just as absurd.
Update: Commenter abby yorker points to this new opinion piece at the New York Times, from Sabine Hossenfelder. The subtitle of the piece is “Ten years in, the Large Hadron Collider has failed to deliver the exciting discoveries that scientists promised.” This is true enough, but by not specifying the nature of the failure and which scientists were responsible, it comes off as blaming the wrong people, the experimentalists. Worse, it uses this failure to argue against further funding not of failed theory, but of successful experiment.
The LHC machine and the large-scale experiments conducted there have not in any sense been a failure, quite the opposite. The machine has worked very well, at much higher than design luminosity, close to design energy (which should be achieved after the current shutdown). The experiments have been a huge success on two fronts. In one direction, they’ve discovered the Higgs and started detailed measurements of its properties, in another they’ve done an amazing job of providing strong limits on a wide range of attempted extensions of the standard model.
These hard-won null results are not a failure of the experimental program, but a great success of it. The only failure here is that of the theorists who came up with bad theory and ran a hugely successful hype campaign for it. I don’t see how the lesson from seeing an experimental program successfully shoot down bad theory is that we should stop funding further such experiments. I also don’t see how finding out that theorists were wrong in their predictions of new phenomena at the few hundred GeV scale means that new predictions by (often the same) theorists of no new phenomena at the multiple TeV scale should be used as a reason not to fund experimentalists who want to see if this is true.
Where I think Hossenfelder is right is that too many particle physicists of all kinds went along with the hype campaign for bad theory in order to get people excited about the LHC. Going on about extra dimensions and black holes at the LHC was damaging to the understanding of what this science is really about, and completely unnecessary since there was plenty of real science to generate excitement. The discussion of post-LHC experimental projects should avoid the temptation to enter again into hype-driven nonsense. On the other hand, the discussion of what to defund because of the LHC results should stick to defunding bad theory, not the experiments that refute it.
Update: Some more commentary about this, from Chris Quigg, and the CERN Courier. In particular, the CERN Courier has this from Gerard ‘t Hooft:
Most theoreticians were hoping that the LHC might open up a new domain of our science, and this does not seem to be happening. I am just not sure whether things will be any different for a 100 km machine. It would be a shame to give up, but the question of whether spectacular new physical phenomena will be opened up and whether this outweighs the costs, I cannot answer. On the other hand, for us theoretical physicists the new machines will be important even if we can’t impress the public with their results.
and, from Joseph Incandela:
While such machines are not guaranteed to yield definitive evidence for new physics, they would nevertheless allow us to largely complete our exploration of the weak scale… This is important because it is the scale where our observable universe resides, where we live, and it should be fully charted before the energy frontier is shut down. Completing our study of the weak scale would cap a short but extraordinary 150 year-long period of profound experimental and theoretical discoveries that would stand for millennia among mankind’s greatest achievements.
Update: Also, commentary at Forbes from Chad Orzel here.
Update: I normally try and not engage with Facebook, and encourage others to follow the same policy, but there’s an extensive discussion of this topic at this public Facebook posting by Daniel Harlow.
Nice article Peter!
One thing I would modify is “One piece of advice though is that experience of the past few decades shows you probably shouldn’t listen to theorists.” I think that the right lesson is not to listen to SOME theorists. I think you’ll find that the really great ones are dead honest about the LHC. Here, for example, is Steven Weinberg, speaking to the public about the LHC, pre-LHC (2010) without the slightest trace of getting over excited about speculative ideas. 52:08 https://www.youtube.com/watch?v=Gnk0rnBQrR0
Pick any field. It’s almost inevitable that if you interview lots of professors about their research, they will try to make what they are doing sound exciting by talking about the most exciting possible outcomes. That’s not really something bad about particle physics in particular.
HGB wrote: My main issue here is Sabine Hossenfelder’s argument, to wit, we have no compelling theoretical reasons to expect that those accelerators would find any new physics.
I hope that if a new accelerator can tack a few more digits onto a number of measurements — i.e., reveal existing physics to much greater precision — that would be considered worthwhile.
Peter, thanks for this: “…if you don’t want to give up completely, your only hope is the experimentalists” Yes.
Let’s consider the two arguments against a 100 TeV collider separately.
The money argument is specious. A few 10s of Giga$ will be easily repaid into the world economy just by the technical and human products of the project.
The other seems to be “theorists have not written a clever enough paper to warrant a bigger machine.” This is ridiculous given the two-decade lead time. While plenty of machines were driven by theory (Bevatron for the antiproton, SPS for the W/Z, PEP, PETRA, Tevatron for the top), there were many that were not (SLAC’s Linac and SPEAR, BNL’s AGS come to mind).
See the following for some examples of BSM physics that do not require higher energy colliders, and challenges in the funding of such experiments (and theorists who work on them):
with slides at http://online.kitp.ucsb.edu/online/hepfront18/safronova/pdf/Safronova_HEPFront18_KITP.pdf
These are discussions of using ultra-high precision experiments rather than higher energy experiments to discover and explore deviations from the SM and Dark Matter.
LHC experiments discovered the Higgs boson, therefore it didn’t fail, period.
On the other hand, the money involved is a lot, and the question is not science vs. military expenses, it’s science vs. science at this point. (Human Brain Project, anyone?)
Also, a lot of what is going on is ego games by aging academics who are used to being treated overly well. Apologies, but there is a societal/psychological side to almost everything; biases should be recognised even when accepted.
Even worse, have the consequences and the public repercussions of not delivering anything been considered? I hope they have by some. Because I’d consider such a scenario as catastrophic.
On the other other and more practical hand, I think there is an underrepresented aspect of the situation in public discussions: Accelerator-based science and new experiments are not going to leave us. It’s not “100 TeV or nothing”. Even CERN started inviting proposals for ingenious new projects based on smaller-scale accelerators a few years ago, precisely as good alternatives to a lack of higher energies.
Some higher energy frontier fans have been giving the impression that we know only one way forward and that is brute force; brute force is a lesser hacking tactic, it may well be a lesser nature hacking tactic as well.
What is the current status of muon colliders? They seem like an attractive idea to me — muons are elementary particles, giving “clean” collisions; their mass is high enough that synchrotron radiation is not a problem; and yet their mass is one-ninth that of a proton, so the collider can be smaller for the same energy. On the other hand, the mean lifetime of the muon at rest is 2.2 microseconds, so you have milliseconds at most to use them, even accounting for relativistic time dilation.
Fermilab has (or had) a “Project X” that included a muon collider but the information available on the web is out of date.
Perhaps money would be better spent on this than on electron or proton colliders?
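To put numbers on the time-dilation point: a muon of energy E lives gamma = E/m_mu times longer in the lab frame, so at collider energies the 2.2 microsecond rest-frame lifetime stretches into tens of milliseconds. A quick sketch (the 1.5 TeV beam energy is an illustrative choice, not a specific proposal):

```python
# Lab-frame muon lifetime: gamma = E / m_mu times the rest-frame lifetime.
MUON_MASS_GEV = 0.1057      # muon rest mass
MUON_LIFETIME_S = 2.197e-6  # mean lifetime at rest

def lab_lifetime_s(beam_energy_gev):
    gamma = beam_energy_gev / MUON_MASS_GEV
    return gamma * MUON_LIFETIME_S

# At 1.5 TeV per beam, gamma ~ 14000 and the muons survive ~30 ms in the lab:
# long enough for many thousands of turns around a km-scale ring, but cooling,
# acceleration and collisions all have to fit inside that window.
t = lab_lifetime_s(1500)
```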
I think one very important point to remember is that the LHC was sold to the public and the funding agencies using not just “we’ll find the Higgs” but also “we’ll find supersymmetry.”
In retrospect, it doesn’t matter. It’s clear that the cost was justified by finding the Higgs and by what we’ve learned about the Standard Model.
But suppose we try to sell this new collider using the argument “we’ll add a few more decimal points to our knowledge of the Standard Model, and there’s also maybe a tiny chance that a miracle occurs and we find a new particle that there’s currently no evidence for.” Do you think it’ll be funded? I don’t.
So in order to get the new collider funded, we’ll end up lying (maybe not lying, exactly, because there are string theorists who still believe that supersymmetry will be found just around the corner, but close enough). And then, if it actually doesn’t discover anything, there’s going to be a backlash against physics—probably all of physics.
So I think Sabine is ultimately right in that we shouldn’t be trying to build a new collider now. Let the LHC run a few more years, see whether it sees hints of anything new, and decide then.
I have no skin in this game other than as a taxpayer and interested bystander — and I am all for building the new collider, because frankly I’m curious what’s out there at the “energy frontier”!
But I have a (I think) a legitimate question, which I did not see raised in the discussions I’ve read. This is: if there is new physics at the new collider energy, what are the chances it will be actually detected without “guidance” from the theorists?
Ever since I learned about the triggers, and how most events in the collisions are _not_ collected and recorded, I have wondered about this. As far as I understand it, the logic of the triggers is based on Standard Model calculations. But can’t this lead to false self-confirmation and circularity?
I’ve recently read on an internet forum an anonymous post claiming that LHC may be already producing “tons” of glueballs and how these events would look very similar to some common decays, and so dismissed by the triggers, and how one needs to have the right theory to distinguish these events. I have no idea whether the poster was a real physicist or a crackpot, but isn’t there a danger of a similar scenario, and how big this danger is?
And if there is such a danger, it would seem to me that it increases with the size of the collider, because the bigger the energy, the larger the percentage of events that one needs to discard (please correct me if I’m wrong).
So, shouldn’t this also be a factor in assessing costs/benefits of a new collider?
Tupeloid, your comment “it’s science vs. science at this point” is dangerously naive.
Do you seriously think the few 10’s of G$ that would have gone to the FCC-like collider will be plowed into other science? Are you still waiting for the post-SSC-cancellation funding bump for brain science and condensed matter?
No, the money just disappears. Probably tax cuts for the well off. If we do not build the next collider science will shrink as a fraction of human endeavor.
And I am effing sick of all the “but the theorists promised us SUSY wah wah wah…”. You lot should be smarter than that. No one can “promise” that. What we promised, and delivered on, was *substantially* improved understanding of nature at ever shorter distances. Including the Higgs which wasn’t a promise, but a reasonable guess given top mass and electroweak asymmetries.
Nature is what nature is. If we stop studying it because it doesn’t conform to our preconceived notions, it diminishes us.
NoGo, your comment “LHC may be already producing “tons” of glueballs and how these events would look very similar to some common decays” is well taken. Probably not a crackpot (but you never know on the internet). This sort of physics is a personal interest of mine.
We throw away over 99% of the collisions due to filters that require high energy deposits, or objects like electrons, muons, missing momenta… There are serious concerns about this sort of “filter bias” and there are regular discussions of how the new physics signatures could be hiding from us. One of them is what you describe, colored objects that decay in a shower of soft pions and escape all filters. Other ways include long lived particles (longer than the 20ns data-capture window), or signatures that travel along the beamline rather than transverse to it.
One of the ways we are addressing this is to use cutting edge fast electronics to do data analysis at a rate fast enough to recognize these reactions.
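For a sense of the rejection factors involved, here are round, ballpark Run-2-era numbers (from memory, not official figures):

```python
# Rough LHC trigger cascade: 40 MHz of bunch crossings are cut to ~100 kHz by
# the hardware Level-1 trigger, then to ~1 kHz written to disk by the software
# High-Level Trigger. Round ballpark numbers, not official figures.
BUNCH_CROSSING_HZ = 40e6
LEVEL1_OUT_HZ = 100e3
HLT_OUT_HZ = 1e3

recorded_fraction = HLT_OUT_HZ / BUNCH_CROSSING_HZ  # 2.5e-5 of crossings kept
discarded_pct = 100 * (1 - recorded_fraction)       # >99.99% never recorded
```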
There certainly are experiments founded on good ideas that aren’t being *effectively* pursued alongside the big LHC and neutrino experiments. As an example, there are a number of promising new ideas in the search for Dark Matter which involve small investments – experiments costing ~\$1M-\$10M. The *community* has been actively pursuing these projects for many years, but funding support has been extremely thin because they are not called out by name in the P5 report (which simply calls for a portfolio of “small and medium sized projects”) and they have been squeezed, sometimes to the edge of viability or to extinction, by emphasis on the LHC and the neutrino program. Meanwhile, new proposals in this space have been slow-walked by the funding agencies while they figure out how to build a case to spend some relatively modest money within a system designed to shepherd much larger projects. The latest step in this now multi-year process was a recent “Basic Research Needs” workshop (formal DOE workshop used to define the “mission need” for a new research area): here is the glossy “show your congressperson” brochure shown at the recent HEPAP meeting based upon the workshop report:
…where the full report is due to be released by OHEP in the next several days. There is an intent – if it is decided that some money can be set aside for this – to call for proposals based on this report in the coming months, but I am certain that many good ideas will not make the cut even in the rosiest scenario. I believe this is an example of the kind of experiments that really aren’t being aggressively pursued, and could be if we were willing to say we don’t need every last dollar (and then a lot more) to build the next huge machine. Some of these ideas are indeed part of the European Strategy discussion.
It used to be that a smart lab director with a vision had the discretion to make experiments of this scale happen, and many great discoveries resulted. Whatever comes out of the discussion of what happens after the LHC, I hope there can be a solution, at least a partial one, to making a path forward for these kinds of small experiments – not just in this area, but across the field, because the system we have for funding, approving and managing \$1B experiments has proven unsuitable to dealing with these small experiments.
(It looks like my long link isn’t going to be preserved, but you can easily google basic research needs dark matter.)
Thanks for the comment. I don’t think though that this sort of problem (how the current system deals with funding for small investments) is really that relevant to the collider discussion. The US has no expensive collider program now, and the last couple years DOE HEP has seen huge increases (10% in FY2018, 8% in FY2019, due to the Trump era jettisoning of any restraint on spending). So if, with plenty of new money and no collider, the US is not funding such small investments, the problem lies elsewhere than in the size of the budget or the existence of an expensive collider construction program. If CERN decides not to build a collider, they’ll then likely be in the same sort of situation the US is now in, no reason to believe the problems of small investments will be any different.
> Tupeloid, your comment “it’s science vs. science at this point” is dangerously naive.
I understand that it might sound like it because of the unusual tone, but it isn’t really:
> Do you seriously think the few 10’s of G$ that would have gone to the FCC-like collider will be plowed into other science?
I’m describing the situation as it is, not as it should be. In an ideal world both a huge collider and many smaller projects should be funded, but now, yes, if this order of money goes into one project then the science budget of several countries will have to cut down elsewhere. We cannot simply forgo this fact when thinking about the proposals.
> Are you still waiting for the post-SSC-cancellation funding bump for brain science and condensed matter?
From the context I understand that you didn’t read correctly my mention of the Human Brain Project. I didn’t mention it as a nice funding target, and I’m definitely not talking about taking money from physics and putting it elsewhere.
If you look up the Project’s history you’ll see that it is a very centralized mega-project which drained European funding away from brain research in areas or labs not related to it. When it was under discussion many brain researchers opposed it on this ground. Now it is approaching the end of its designated duration and it has infamously produced almost nothing.
So, I used it as an example of a real, very centralized mega-project which did drain funding from others working in the same field.
> No, the money just disappears. Probably tax cuts for the well off. If we do not build the next collider science will shrink as a fraction of human endeavor.
Again, I’m describing the current situation, not the ideal one. (Still, I’m not sure that science will shrink as a result, if this money is spent on other HEP projects.)
You stated in a reply to Sabine that you had no evidence that there were promising new ideas – e.g. in Dark Matter – that weren’t being aggressively pursued. This is what I was replying to:
“Personally, I just haven’t seen any evidence that there are other (non-collider) equally promising and cheaper places to look that aren’t already being pursued. In particular, the search for “dark matter” has been a huge priority among theorists and experimenters for decades now, I’m dubious there are promising experimental avenues corresponding to well-motivated theory models that are not being actively and aggressively pursued.”
I was pointing out that this just isn’t the case. No – neutrinos are not colliders – but the neutrino program is the US mega-project, and pushing the schedule for that, along with the other large projects, has eaten up all of the big increases you mention (along with the QIS push in OHEP, which is funding a lot of ideas of very dubious quality just because there is a big pot of money and not many really interesting ideas for using quantum computing in HEP).
In any case, I never said, or implied, that there was a problem in the size of the budget, but I didn’t think that was what was being discussed here. I thought the discussion was about how to spend the money that can be available, and I do think there is a problem where only the biggest and highest visibility projects squeeze everything else out and there is no process appropriate to bring forward promising small projects (which absolutely do exist!) and see that they have adequate resources to succeed.
You’re right, I should have made the point I was trying to make in a different way. I’m not well informed about the specifics of dark matter experiments, you may very well be right that there are good experimental proposals not getting funded. If this is so though, it’s not because the issue of dark matter doesn’t get much attention, or because of a lack of money that could fund small-scale experiments. So, I just don’t think you can pin this problem on colliders, or expect that it will improve if collider proposals fail.
I suspect this is kind of a generic issue: non-collider scientists are aware of what they see as good science in their field which is not getting funded, think maybe if collider funding stops, then that good science will get funded instead. In most cases though, scientists thinking this way are likely to be disappointed. In this case, if there’s a problem with small-scale experiments not getting attention and funding now, that problem is still going to be there after collider projects are cancelled. If CERN changes direction, they’re not likely to decide “we’re going to spend the \$500 million/year we save by not building a collider on 100 new small experiments each year”. Instead they’ll pick some small set of larger initiatives to fund.
Interested to hear your take on the redirection of HEP funding to Quantum Information Science. I’m beginning to suspect that both on the theory and experiment side, this is a really dubious trend, one that physicists should be pushing back on instead of joining.
I should perhaps also have been more clear. I’m not arguing against a future big collider or trying to “pin the problem on colliders”, but simply that we have to find a way to make sure they don’t squeeze out new ideas for small experiments. It’s a problem, but not with colliders, rather more with the sociology that develops around big projects that often get oversold, to politicians, to the public, and to young people in the field who buy into the idea that these are the only things worth doing and are guaranteed to make huge new discoveries. I’ll admit to having a dog in this hunt – since I work on some of these small Dark Matter efforts – but I’ve worked on the big collider experiments too, so I see this from many sides.
As far as these new initiatives, the full report on Dark Matter Basic Research Needs should be out soon and you can judge whether you find the case compelling. Who knows, maybe this process will be a watershed moment in trying to solve the problem. I know this has been seen as successful so far, and there is an intention to repeat this process to define the programs for other new areas of inquiry.
As far as QIS for HEP, here are the awards for which an estimated $12M was available, and you can decide for yourself what you think:
My impression (I can really only comment on the experimental side) is that a few of these are interesting, but they may get lost among many others that are little more than a mashup of buzzwords, without any real justification for the use of quantum computing techniques.
Tupeloid, I understand your concern. Yes I would rather have a marketplace of several small experiments than one MegaProject. Socially, politically, financially it is definitely better.
But it appears Nature doesn’t care, and is holding cards close to the vest. If we want to go beyond the effective theory that is the Standard Model, we’re gonna need a bigger collider.
And since I grew up in Chicago let me quote Daniel Burnham: “Make no little plans; they have no magic to stir men’s blood and probably themselves will not be realized. Make big plans; aim high in hope and work.”
Amitabh Lath wrote:
“If we want to go beyond the effective theory that is the Standard Model, we’re gonna need a bigger collider.”
We keep approaching this as if “a bigger collider” is the only way to make progress in experimental, fundamental/particle physics. It’s all or nothing – either we keep building colliders or we just stop and give up. It’s not, and we shouldn’t.
It should be noted that, due to limitations in their very design, high-luminosity high-energy colliders may no longer be the best approach to advancing our knowledge. Without meaning in any way to diminish the outstanding accomplishments of the experimentalists, engineers and technicians at Fermilab and CERN over the last decades, we need to recognize some of the fundamental limitations of that approach, and include those in decisions on how to proceed – and what to ask to be funded. The LHC is a wonderful “theory confirmation” machine, but it has trouble discovering unexpected results that aren’t factored into the design of the hardware, software and analysis required to handle, to the level of precision sometimes needed, the vast flood of collision debris and data generated by those detectors. It may be time to start thinking about whether other approaches could actually produce better results – before asking for more money just to keep on doing what we’ve been doing for the last 75 years or so.
Part of this emphasis on “higher energy” being the only way to make progress is probably due to some of our assumptions/prejudices about how we think the Standard Model can be generalized/unified. It’s what “high energy physicists” do, and have done, both experimentalists and theorists, often (but not always) with great success. But it also can lead to blind spots about limitations in the way the Standard Model itself (and the underlying quantum theory) is put together, as well as the mathematical framework around which it has been developed.
To use a very crude and imperfect analogy: It may be time to start thinking about how to use scalpels and forceps to dissect the most subtle aspects of fundamental physics, rather than just hitting it harder with a bigger meat cleaver. Then we can talk about the cost(s) of those approaches vs. potential results. Just leaving those options off the table and hoping that somebody will do them sometime somehow just as long as we get more money for that bigger collider, when in fact they may be the preferable options, would be counterproductive scientifically as well as financially and politically.
Very nice post, thank you. No, of course we should not give up, especially since there are important questions (the baryon asymmetry of the universe, dark matter, and the origin of neutrino masses) that are very likely to have particle physics answers.
Let me make a couple more additions to your description of the FCC-ee:
-1- One thing which has not permeated the community very much is that the ‘new particles’ that could explain the above questions do not necessarily have the kind of couplings that would allow them to be seen at the LHC. In fact it is very possible that they have much smaller couplings — this would automatically explain why we see no hint of new physics in the precision electroweak measurements, while the top quark effect was a hefty 10 sigma shift — for sure there isn’t another top quark up there. (We also know this from the Higgs production cross-section at the LHC.)
A couple of examples are documented in the FCC CDR: a famous one is right-handed heavy neutrinos (also called “sterile neutrinos”, “heavy Majorana neutrinos”, “heavy neutral leptons”, etc.) — which are very much expected since the (left-handed) light neutrinos have been discovered to be massive. Another is ‘axion-like particles’ as dark matter candidates. Both are very well motivated and are excellent candidates to solve one or more of the above questions. In both cases there is significant (and relevant) phase space that can be probed in the Z exposure of the FCC-ee: with 5×10^12 Zs produced, rare events can be searched for with a sensitivity down to couplings of ~10^-11 of the weak coupling.
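As a rough sanity check of that figure (my own back-of-the-envelope estimate, not taken from the CDR), a nearly background-free search at the Tera-Z stage is sensitive to branching fractions of order a few over the total Z count, and since the rate for a Z decaying to a heavy neutral lepton N scales with the heavy-light mixing squared (written |U|^2 here), this reproduces the quoted coupling reach:

```latex
N_Z \approx 5\times 10^{12}
\quad\Rightarrow\quad
\mathrm{BR}_{\min} \sim \frac{\text{few}}{N_Z} \sim 10^{-12},
\qquad
\Gamma(Z \to \nu N) \propto |U|^2
\;\Rightarrow\;
|U|^2 \sim 10^{-11}.
```

The precise reach of course depends on the assumed mass and lifetime of N; this is only the statistics-limited scaling.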
-2- A second point concerns the ‘10 billion’ cost. As I am sure you know well, about 2/3 of the cost goes to infrastructure that is there to stay, e.g. for the (100 TeV) hadron collider (and possibly other facilities), and thus is a long-term (some 70 years or more) investment. This makes it much easier to argue for with the host states and local authorities.
On the HEP theory front, I very much agree with the argument that it is long past the time that theorists should give up on the main directions they have been pursuing, go back to fundamentals, and try a new direction. This is difficult to do, but the obstacles are sociological, not fundamental facts about physical reality or the allocation of expensive resources. Some theorists would like to claim that they should just keep doing what they have been doing, that the computations are just unfortunately getting harder. Maybe, magically, all of a sudden something will appear.
The situation with theory though is very different than the situation with experiment (and I think Hossenfelder’s essay doesn’t properly recognize this). While HEP theory has conclusively failed at what it has tried to do, HEP experiment has succeeded at doing exactly what it set out to do. The discovery of the Higgs and the on-going study of its properties has been a huge success. There are now carefully put together proposals for how to extend this success to smaller distance scales. Unfortunately they appear to be inherently expensive.
The idea that there is some different, much less expensive, promising route to progress would be wonderful if it were true, but one has to come up with a viable proposal. I assume a significant part of the ongoing European study will be to examine such proposals. If such proposals exist, quite possibly the result will be to not fund a new collider, but to fund such proposals instead. Absent such proposals, what I don’t think is a good idea is wishful thinking, i.e. deciding that since the known route is difficult and painful, let’s just not do it and hope that somehow an easier one will appear.
Regarding the costs, since the world is ever more full of billionaires who are richer and richer, what about some private funding? I completely agree that states should finance CERN, but in case they do not, is it totally excluded that some private philanthropists could contribute? If I remember well, CERN has already “accepted” some private funding by accepting the Breakthrough Prize, and there are some examples of private financing of fundamental research that are working well, no? (See PI, or the Simons Foundation.) 5 billion is just 5 per mille of Amazon’s value (or Apple’s a few months ago).
WTW, I agree with you. New physics could well be hiding in the collisions we already have. One of my personal interests is particles with strong interactions and low mass. They would be produced in huge quantities but this reaction would be overwhelmed by the huge rate of normal strong reactions in hadron collisions.
This is just one discussion. Another is long-lived particles. Think of a massive, slow moving BSM particle. It would be missed completely. We (experimentalists and phenomenologists) are constantly asking the question “Where could the new physics be hiding?”.
The next generation colliders will not only be higher energy, but have smarter hardware and software as well. Expect machine learning and AI to permeate all levels from trigger to final analysis.
Thanks for your informative post.
Due to the lack of direct discoveries of new physics so far at the LHC, the current strategy in experimental HEP is to search for extensions of the Standard Model indirectly, by viewing the SM as an effective field theory (SMEFT, e.g. arXiv:1706.03783 and arXiv:1812.08163) and thus performing the most precise measurements possible (reducing the statistical uncertainties by collecting more data, and the systematic uncertainties with better detectors, etc.). Of course, this is a challenge for computing, but also for theory, which must provide the most accurate possible calculations of SM quantities! This is the so-called “Intensity Frontier” program (a better name would be “Precision Frontier”), in contrast to the “Energy Frontier” program.
This is the idea behind the HL-LHC, HE-LHC and part of the FCC programs. Of course, if one finds new physics directly at the FCC, so much the better. It is thus no surprise that a recent Les Houches school was dedicated to Effective Field Theory.
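Schematically (my own summary of the standard SMEFT setup, not taken from the papers cited above), this framework parametrizes the effects of heavy new physics at a scale Λ as a tower of higher-dimensional operators built from SM fields:

```latex
\mathcal{L}_{\mathrm{SMEFT}}
  = \mathcal{L}_{\mathrm{SM}}
  + \sum_i \frac{c_i^{(6)}}{\Lambda^2}\,\mathcal{O}_i^{(6)}
  + \sum_j \frac{c_j^{(8)}}{\Lambda^4}\,\mathcal{O}_j^{(8)}
  + \cdots
```

Precision measurements constrain the ratios $c_i/\Lambda^2$, so shrinking the experimental and theoretical uncertainties extends the reach in Λ even without directly producing any new particles.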
Let me finish by quoting Burton Richter (arXiv:1409.1196)
Abstract: “The success of the first few years of LHC operations at CERN, and the expectation of more to come as the LHC’s performance improves, are already leading to discussions of what should be next for both proton-proton and electron-positron colliders. In this discussion I see too much theoretical desperation caused by the so far unsuccessful hunt for what is beyond the Standard Model, and too little of the necessary interaction of the accelerator, experimenter, and theory communities necessary for a scientific and engineering success. Here, I give my impressions of the problem, its possible solution, and what is needed to have both a scientifically productive and financially viable future.”
This Nima Arkani-Hamed talk from four months ago might be pretty relevant, IMO, for what’s being discussed here.
While I agree with much of what Arkani-Hamed has to say in that talk, he definitely was someone I had in mind when I wrote above: “experience of the past few decades shows you probably shouldn’t listen to theorists.” He’s smart and great at conveying enthusiasm, but he’s also now spent a lot of his career vigorously and enthusiastically promoting the LHC in unfortunate ways (large extra dimensions, SUSY (Split or not), the LHC will explain why gravity is weak, only SM physics at the LHC is evidence for the multiverse, etc., etc.).