# US Particle Physics Planning

Last week both SLAC and Fermilab hosted “Users Meetings”, providing a forum to discuss the current status and future plans of the two laboratories. The SLAC agenda is here, and talks from previous years are available here, with this year’s perhaps available later.

The Fermilab meeting was also celebrating the 40th anniversary of its first Users Meeting, which was held back in 1967 at a time when Fermilab was under construction, with plans for a 200 GeV fixed-target machine underway, led by director Robert Wilson. This year’s talks are available here. The status of the Tevatron is described in Roger Dixon’s talk. Already the machine has delivered nearly 3 fb-1 of luminosity to the two experiments there, half of this over the last year. They are projecting to have 6-7 fb-1 by the end of FY 2009 (a bit more than two years from now). The current plan calls for operation of the Tevatron only until the end of FY 2009, and a year or so ago there was even some discussion of shutting it down before then. With the machine operating well, a healthy US HEP budget, the LHC startup now not until 2008, and some cautious optimism that the Tevatron might be able to accumulate enough data to see the Higgs under some scenarios, it looks like no one is about to shut the Tevatron down early; rather, the question will be how much extra time to give it. There seems little point in shutting it down as long as the LHC is not producing results that make it obsolete, and no one knows yet how long that is going to take. While those running Fermilab would like to know what they will be doing several years in advance so that they can plan and budget, it may be difficult to do this since no one knows what will happen with the LHC.

Fermilab is in the middle of a long-range planning exercise, with a Steering Group meeting trying to put together a plan by August 1. They have many of their materials available on-line. Some of the discussion revolves around the question of the ILC, with talks showing that in principle it would be possible to start construction of the ILC in 2012 and have it built by 2019, but few people believe that things will happen this fast. Whether building the machine makes sense will depend on what is seen at the LHC. Other scenarios are under discussion, for example see here. Other than the LHC, the main things one could conceive of building at Fermilab would be a more intense proton beam (proton driver), or accelerating muons to provide a “neutrino factory” and perhaps ultimately a muon collider.

While US HEP has a difficult task ahead to figure out what to do after the Tevatron shuts down and the energy frontier moves to CERN, at least the budget situation is looking a lot better than it was a few years ago. At the Users Meeting, there was a presentation by the DOE’s Robin Staffin showing budget figures that included a 6.8% increase planned for FY2008, after a 5.9% increase from FY2006 to FY2007. For some reason the federal government seems to have decided to put significantly more money into fundamental physics research, and HEP is benefiting from this. For more about the general situation with the federal science research budget, see this recent talk by John Marburger, the director of the Office of Science and Technology Policy.

For the budget situation in mathematics, see this report in the latest Notices of the AMS about the NSF budget numbers. After flat budgets for the past couple of years, there was a 3.3% increase for mathematics research in FY2007, and the proposal for FY2008 has an 8.5% increase. Math is cheap compared to HEP, with NSF spending on math (which is the bulk of federal math research funding) only about a quarter the size of the HEP budget. The AMS Notices article also computes what fraction of the NSF budget goes to different fields, noting that in FY2004 18.3% went to math and 20.9% to physics, while the FY2008 proposal gives 17.8% to math and 23.6% to physics.

Update: Also at SLAC, this week the DOE is there to review the lab. Presentations prepared for the DOE are on-line. Michael Peskin gave a presentation about the work of the theory group. He highlighted (besides hopes about the LHC) the work of SLAC’s Lance Dixon on computing perturbative QCD amplitudes, including its relation to N=4 supersymmetric Yang-Mills and to the conjectural finiteness of N=8 supergravity.


### 23 Responses to US Particle Physics Planning

1. Eric says:

Peter, Thurston is celebrating his 60th birthday at Princeton:
http://www.math.princeton.edu/Thurston60th/

2. DB says:

Isn’t it interesting that while the US is the undisputed world leader in string theory research, it has not led the way in experimental HEP since the 1970s? Remember that the only reason the Tevatron is currently the world’s most powerful collider is that CERN’s LEP was shut down to build the LHC. Oh, and just before the LEP shutdown we were also told about events that “might” indicate that the Higgs had been seen there. MiniBooNE was interesting, but a pale shadow of Kamiokande and Sudbury. By substituting “cheap” math for real physics, is it such a surprise that we end up with string theory?

3. alex says:

You say the NSF spending on math is only a quarter of the HEP budget, but go on to quote figures making the NSF budget for math not much smaller than for physics as a whole (17/18 % compared to 21/24 %). Are you talking about a different budget the second time?

4. Peter Woit says:

alex,

Math is funded mainly by the NSF, and the NSF gives comparable amounts to math and physics. But HEP is mainly funded by DOE, and it is this funding that is much greater than math.

DB,

The US HEP problems can be traced back to the SSC debacle. I don’t think that’s something that can be blamed on the string theorists…

5. Zathras says:

Within physics, is HEP getting a smaller portion of the pie? Any other areas that are getting significantly more than they had before?

6. Intellectually Curious says:

“By substituting “cheap” math for real physics, is it such a surprise that we end up with string theory?” -DB

Now, I understand why math is cheap, since one needs only paper and pencil (and usually a trash can) to do pure research. But what is “cheap math”?

7. John H. says:

Peter, with so much money being spent on HEP research, if neither the LHC nor the ILC provides new and meaningful data, this may spell the end of particle physics… at least for some time to come. Then what?! What will all those who devoted half of their lives (and sometimes more, for senior researchers) do? What will they have to show for their lives’ efforts…

Just a thought…

8. Coin says:

Peter, with so much money being spent on HEP research, if neither the LHC nor the ILC provides new and meaningful data, this may spell the end of particle physics

Even if this is so, is it at all possible that this would be not the end of experimental particle physics, but just the end of collider experiments?

I mean, if it were ever absolutely, truly necessary to do so, surely there’s some way to do frontier fundamental particle research even if there’s no funding for new ubercolliders.

9. John H. says:

Sure, HES [High-Energy Speculation]… 😉

10. DB says:

Experimental HEP research is increasingly using the astrophysics channel – which has a long and distinguished history of discovery, from the muon (Anderson, 1936) to the charged pions (1947) to the neutrino oscillations at Kamiokande and Sudbury. Expect continued growth in the placing of sophisticated detectors in orbit – why the ISS is not being better exploited for this purpose is a mystery to me.
The US remains the world leader in astrophysics, with the infrastructure to design and deliver best-in-class detectors, so if, as I expect, in maybe ten years’ time non-terrestrial HEP once again becomes the dominant mode of particle physics research, it is perfectly placed to retake the lead. However, when I look at NASA’s funding priorities and its scaling back of its science budget, versus Europe’s and Japan’s resolve, I just wonder.

11. Peter Woit says:

John,

It’s not the experimentalist’s fault if it turns out Nature has decided to look exactly like the Standard Model up to 1 TeV or more. Finding out about Nature is what experimentalists do, and they have been and are doing it. Sometimes what they find out is dramatic and unexpected, sometimes not, but it’s still valuable to know what actually happens. The fact that the Standard Model is so good is in itself a very interesting experimental result.

What is worrisome about the post-LHC future is that, if the SM still holds, no one has a plausible idea about how to do what physicists have always done: keep trying to find out what happens at higher energy and shorter distance scales. The LHC will provide a jump of a factor of 7 in energy, but there’s no affordable technology out there that could give us another such factor anytime soon.

12. Matteo Martini says:

Peter Woit wrote:
” The LHC will provide a jump of a factor of 7 in energy, but there’s no affordable technology out there that could give us another such factor anytime soon ”

The maximum integrated luminosity increase of the existing options is about a factor of 4 higher than the LHC’s ultimate performance, unfortunately far below the LHC upgrade project’s initial ambition of a factor of 10.

13. Matteo Martini says:

Sorry,
I did not write that the sentence was quoted from Wikipedia.

14. Thomas says:

Matteo, SLHC gives higher luminosity, not higher energy. More collisions, but at the same energy.

15. Matteo Martini says:

Thomas,

16. Peter Woit says:

Matteo,

As the Wikipedia article mentions, the “VLHC” idea would require a machine hundreds of kilometers in size, and the cost would likely be prohibitive. There’s little chance of such a thing being built within the next few decades.

17. Matteo Martini says:

Peter,
I have read something about the VLHC in this file, http://vlhc.org/Limon_seminar.pdf, written by P. Limon, and it is stated that:
1) the technology needed to build a VLHC is available today; and
2) if we can afford a linear electron collider, we can afford a VLHC ( I do not understand the exact meaning of this sentence, though.. )

What do you think?

18. Peter Woit says:

Matteo,

The problem with the VLHC is not the technology, but its size. It would be bigger than the failed SSC project. Given that the SSC was going to cost more (in constant dollars) than the ILC is supposed to, I suspect the VLHC would be significantly more costly than the ILC to construct.

19. Matteo Martini says:

OK
Let’s hope something completely new comes out of the LHC, then.
Then there may be hope that the world’s governments could decide to go ahead, one day, with a bigger hadron collider.
Please note that last year the defense budget of the U.S. alone was over USD 400 billion ( http://www.whitehouse.gov/omb/budget/fy2007/defense.html ).
Since the estimated cost of the ILC is around USD 5 billion ( http://www.linearcollider.org/pdf/RDR_Machine%20Overview_v5-1.pdf ), I think there is still room, if there is any compelling reason for governments, to go ahead and build a machine significantly larger than the ILC.
Note that the world powers did not have any problem coming up with some USD 95 billion for the Space Station ( http://www.space.com/news/spacestation/gao_iss_inquiry_010619.html ).
I have not really read of any real practical outcome or scientific progress from that expensive program.
Please note that the figure I gave above ( USD 400 billion+ ) is the budget of only one year, and of only one country.
The cost of building any machine such as the VLHC would have to be spread over many calendar years, reducing the cost per year to a fraction.
Also, collaboration and cost-splitting between nations would further reduce the expense for each nation.
The big point, in my opinion, is: will we really find something new next spring?
Will some theorist come out with a brand new theory that requires a 50 TeV machine to be proven, but that could change our lives and spur progress?
If yes, there could be reasons, in the near future, to work toward an even bigger collider.
If not, maybe we will be stuck with the Standard Model for another 100 years.

20. Nuttata says:

>If not, maybe we will be stuck with the Standard Model for another 100 years.

No reason to complain; it could have been much worse. Look at high-Tc superconductivity, for example. Those guys have been stuck for more than twenty years, and they don’t even have a theory. At least the SM does work.

But there’s no way the LHC could possibly leave us entirely empty-handed. Some kind of EW symmetry breaking must be there, and/or some unitarity-restoring sector. “Nothing” is not on the menu, as soon as the experiment is up and running.

21. Matteo Martini says:

As far as I can see, human progress over the last 100 years has been mainly driven by breakthroughs in elementary physics.
The inventions of:
– semiconductors ( in the early 70s ), which led to the birth of the modern electronics and IT business ( Intel, AMD, Microsoft, Google and the modern IBM );
– the discovery of DNA, in the early 50s ( at the basis of genetics and modern biochemistry );
– laser optics ( which led to the development of current optical cables, and made possible the birth of companies like Cisco Systems and of the internet );
– the future quantum computing?
were almost all based on discoveries about the behaviour of the atom, and on general relativity and quantum mechanics.

If you look at the “old” industries ( such as avionics, the rocket industry, the car business, machines of all kinds ), there has been nothing really new in these fields in the last 40 years at least.
The car engine has been basically the same for the last 100 years, and after the man on the Moon, in 1969, there has been no big advancement in space exploration either.
Planes are not inherently new after the invention of the jet engine ( made in the sixties ).
This is because all the “old” mechanical industries basically use the three laws of physics discovered by Newton in the late 1600s.

The same is happening with the “new” industries ( informatics, the semiconductor business, .. ): there is still a lot of innovation there, but for how long??

In my opinion, if we do not advance to the next level of knowledge of particle physics, human progress will probably reach a halt in 20-30 years..

Just my opinion, though..

22. sinus says:

The car engine has been basically the same for the last 100 years, and after the man on the Moon, in 1969, there has been no big advancement in space exploration

I beg to disagree. It is true that the most basic ideas involved in the internal combustion engine have not changed. But today’s cars, and their engines, are very different from those of 100 years ago. The technology involved in those differences is highly nontrivial and, in many ways, revolutionary.

Regarding outer space exploration, we’ve been living in a golden age of cosmology for more than ten years now. Cosmological and astrophysical knowledge has undergone many revolutions in the last two decades or so, due to enormous advances in space exploration which produced large amounts of high-quality data. To quote but one example: several dozen extrasolar planets have been discovered, whereas none was known for most of the XX century.

23. Matteo Martini says:

Sinus wrote:
I beg to disagree. It is true that the most basic ideas involved in the internal combustion engine have not changed. But today’s cars, and their engines, are very different from those of 100 years ago. The technology involved in those differences is highly nontrivial and, in many ways, revolutionary.

Matteo:
Yes, there have been many substantial innovations in the car engine, as well as in the commercial aviation business ( look at the A380 ), but the basic structure of the car engine, as well as that of the plane, has not changed in the last decades.
Current car engines use the Otto cycle or the Diesel cycle to provide energy, and these cycles were discovered more than 100 years ago.

Sinus wrote:
Regarding outer space exploration, we’ve been living a golden age of cosmology for more than ten years now.

Matteo: