Why No “New Einstein”?

Lee Smolin has a piece in the latest Physics Today entitled Why no “new Einstein”?. Unfortunately it’s only available to Physics Today subscribers, although Lee tells me he will see if he can put it on-line on his web-page. Tony Smith previously mentioned this in a comment to an earlier posting.

The problem Lee addresses seems to me to be an extremely important one. Pretty much every knowledgeable particle theorist I talk to these days, string theorist and non-string theorist alike, agrees that current ideas about how to go beyond the standard model are not working very well. Everyone hopes that some big new idea will come along and show the way forward, and people often wistfully speculate that some bright post-doc out there may at this very moment be working on the needed new idea. The problem with this is that what is needed is probably something quite different from any of the currently popular research programs, and finding it may be difficult enough to require someone's concerted effort over quite a few years. If so, it's very hard to see how anyone on the standard career path in the US will be able to do this. A young post-doc here generally has only a couple of years before needing to apply for new jobs, and devoting those years to working hard on a very speculative new idea would most likely be suicidal for his or her career.

Some will argue that young theorists should just work on speculative ideas in their spare time, spending enough time on currently fashionable topics such as string theory to impress people and ultimately land a permanent job, at which point they can work more seriously on their speculative ideas. The problem with this is that getting up to speed and participating in the latest trendy string theory research is a very demanding task, one that isn't likely to leave much time or energy for other projects. In addition, it's not at all clear that being willing to work hard on an obviously failed research program like string theory is consistent with having the intelligence and drive needed to do something really new. Instead of working on string theory, a young theorist could try to work on one of the other popular topics such as cosmology or phenomenology, but these are very different subjects from fundamental work in quantum field theory. A young theorist would be more likely to find the necessary time by working as a night-time security guard.

Lee makes several excellent proposals about how to restructure the way hiring is done to encourage young people who want to try something new. I hope he has some success in getting the powers-that-be to realize what a serious problem the field is facing and take some of the actions he suggests.

Two completely unrelated topics:

Lubos Motl has a posting about the Harvard Commencement, where it seems they’re giving Witten an honorary degree (Columbia already did this in 1996). He also writes about a new web-site for the Sidneyfest, the conference in Sidney Coleman’s honor that was discussed here and on many other weblogs. The new web-site includes copies of letters to Coleman from people who couldn’t attend the conference. In one of them Greg Moore recalls and reproduces Coleman’s proof from the late eighties that string theory is the unique theory of nature.

For something pretty weird, see this from the latest Notices of the AMS. There’s more about the activities of its author on Robert Helling’s weblog. The new issue of the Notices also contains an article about the 2006 NSF budget request for mathematics.

Update: Lubos Motl has his own comments on Smolin’s article, together with a link to some site where someone seems to have posted the article without attribution.

Posted in Uncategorized | 86 Comments

Future of Fermilab

Nature this week has an editorial about Fermilab entitled All or Nothing at Fermilab, associated with a news article Fermilab: High-risk physics. The article and editorial are about the fundamental problem facing Fermilab: in a few years the high-energy frontier will move to the LHC at CERN, and many physicists will leave Fermilab. The future of the lab remains up in the air, since the only viable plan for a new high-energy accelerator is the ILC project, and this would require massive new funding which is still quite uncertain. While SLAC has diversified into X-ray physics, Fermilab remains committed to operating at the highest energies. Many people worry that if the ILC is not funded, or is delayed for many years, Fermilab will be in a difficult position, and a prime target for budget cuts.

This week the lab is hosting the annual “Users’ Meeting”. Presentations about current and future activities at Fermilab are available on-line.

Posted in Uncategorized | 1 Comment

Rutgers Workshop

I spent most of last week commuting down to Rutgers to participate in a workshop on “Groups and Algebras in M-theory”, organized by Lisa Carbone. Lisa was a student of Hyman Bass’s here at Columbia some years back, and in recent years has been working on Kac-Moody groups and algebras over finite fields.

Much is known about one special class of Kac-Moody algebras, the so-called affine Lie algebras. These are basically Lie algebras associated to loop groups, with a central extension. The study of the representation theory of these algebras is closely connected to quantum field theory in two space-time dimensions, and my first talk was about this topic. For more details about this, from the point of view I was taking, see the remarkable book by Pressley and Segal called “Loop Groups”, lecture notes from 1985 by Goddard and Olive at the Erice Summer School and Srni Winter School (see Int. J. Mod. Phys. A1:303, 1986), and Witten’s paper “Quantum Field Theory, Grassmannians and Algebraic Curves” in Communications in Mathematical Physics 113 (1988) 529-600.
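To give a sense of the objects involved (my summary, not anything specific to the talks): an affine Lie algebra is the central extension $\widehat{\mathfrak{g}}$ of the loop algebra $L\mathfrak{g} = \mathfrak{g} \otimes \mathbb{C}[t, t^{-1}]$ of a simple Lie algebra $\mathfrak{g}$, with bracket

```latex
[\,X \otimes t^m,\; Y \otimes t^n\,]
  \;=\; [X, Y] \otimes t^{m+n}
  \;+\; m\,\delta_{m+n,0}\,\langle X, Y \rangle\, K ,
```

where $K$ is the central element and $\langle\cdot,\cdot\rangle$ is an invariant bilinear form on $\mathfrak{g}$. The central term is exactly what makes the representations relevant to 2d quantum field theory possible: they are projective as representations of the loop algebra itself, with $K$ acting by a fixed scalar (the level).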

An elaboration of these ideas in one direction leads to the concept of a “Vertex Operator Algebra” (first introduced by Richard Borcherds), and the study of these was pioneered by Jim Lepowsky, who also participated in the workshop, together with his ex-student and now Rutgers faculty member Yi-Zhi Huang. Several other current and ex-students of Lepowsky and Huang were also there and gave talks. For more about vertex operator algebras, see the recent short review by Lepowsky, or the materials on Huang’s web-site. A VOA is essentially the same thing that Beilinson and Drinfeld call a chiral algebra, and these have applications in the geometric Langlands program.

What Lisa is really interested in is the non-affine case, where relatively little is known. Non-affine Kac-Moody algebras and groups seem to have no known tractable realizations, and many basic questions about both the algebras and the groups, as well as their representations, remain open. In recent years several of these algebras have been conjectured to have something to do with M-theory, most notably E11, and the study of this connection has been the main focus of the work of Peter West, who gave a series of talks at the Rutgers workshop. For some more about this, see his recent papers, especially one on The Symmetry of M-theories. West’s graduate student P. P. Cook also has a weblog, and recently wrote a posting explaining a bit about this topic.

Greg Moore was at many of the talks and kept the speakers honest. He gave a fast-paced talk covering some older work, roughly the same material as in his paper with Jeff Harvey entitled Algebras, BPS States and Strings. I gave a second talk explaining a bit about my point of view on the Freed-Hopkins-Teleman theorem and its relation to representation theory and QFT.

After the talks Thursday afternoon there was a discussion session on what is going on with string theory, supersymmetry, and mathematics. No one was willing to defend work on the “Landscape”, and I was surprised to find myself pretty much in agreement with quite a few people there about the way string theory has been pursued in recent years. On the whole the mathematicians are kind of bemused by the whole string theory controversy. The subject has certainly led to some very interesting and important mathematics, and they are happy to concentrate on that, though they are interested to hear about the controversy surrounding string theory in physics.

Posted in Uncategorized | 13 Comments

PITP Showcase Conference

The Pacific Institute for Theoretical Physics, based at UBC in Vancouver, held a Showcase Conference a couple of weeks ago, which was supposed to “celebrate the exciting new developments taking place in theoretical physics”. According to the organizers there are lots of exciting new developments in string theory, since six of the invited speakers (Myers, Ooguri, Randall, Schwarz, Shenker, Susskind) spoke on that topic, but no one at all spoke about elementary particle physics. There were also quite a few talks on condensed matter physics.

John Schwarz’s talk consisted mainly of the standard recounting of the history and basics of string theory that anyone who has been to conferences like this has heard a hundred times. This part stopped with Maldacena’s work of more than 7 years ago. On more recent topics, in particular the anthropic explanation of the cosmological constant, Schwarz says: “Is there another explanation? I hope so.” He ends by putting up a long list of questions about string theory, more or less the same list everyone has had for twenty years now.

Steve Shenker spoke on Emergent Quantum Gravity, with “emergent” the new buzzword of the field. There was a separate workshop on emergence overlapping with the Showcase conference, organized by Phil Anderson and others, with Susskind the only string theorist allowed to speak there. Shenker introduced a new terminology to justify string theory: it is “An algorithmically complete, consistent description of quantum gravity”, although he does add the caveat “In certain simple situations (like flat space)”. By this I guess he is trying to get around the problem of how to claim that your theory is complete and consistent when you don’t know what it is. The idea is that at least you have an algorithm for doing computations. Perhaps he means perturbative string theory, although that is neither consistent nor complete (the expansion in the number of loops diverges). Perhaps he means a non-perturbative formulation like a matrix model, which works in 11 flat dimensions, but then he really should note that he’s not talking about quantum gravity in four dimensions, which is what most people care about.

There was an interesting panel discussion on The Theory of Everything?, moderated by Steve Shenker. He seemed mainly interested in making the obvious point that string theorists aren’t actually claiming that their theory explains anything about, say, biochemistry. The panel was actually balanced between string theory enthusiasts (Shenker, Schwarz, Randall) and skeptics (’t Hooft, Unruh, Wald). Some of Shenker’s introductory remarks are inaudible, but he did repeat his claim about the “algorithmically complete” nature of string theory. ’t Hooft had some quite interesting comments. He recalled that at a conference back in 1985 he had been the only one who didn’t think that twenty years later string theory would have solved all the problems of particle physics; it is now 20 years later, and he had been right, everyone else at the conference wrong. His point was that string theory is still extremely far from solving any problems in particle theory, and one can’t tell if this situation will change in 20, 200 or 2000 years. He tried to say some positive things about string theory, but they were pretty half-hearted. For instance he noted that dualities are very interesting, but they link one ill-defined theory to another ill-defined theory. He also noted that in its present formulation string theory is only defined on-shell, which he takes as meaning that it doesn’t give a true local description of what is going on. He has reasons for being suspicious of people who claim that all one needs is an on-shell theory.

Schwarz attributed the TOE terminology to John Ellis. He said that he feels string theory is very far from explaining anything about elementary particle physics, that it was “almost hopeless to find the right vacuum”. He described what landscapeologists are doing in a skeptical tone, but didn’t actually criticize this. Answering ‘t Hooft, he claimed that back in 1985 he and Mike Green were actually more pessimistic than most other people about the prospects for getting quick results out of string theory.

Bill Unruh made the standard criticism that what is wrong with string theory is that string theorists are motivated by beautiful math, not physics. He doesn’t seem to have noticed that few string theorists are now doing math, since unfortunately most of them have taken to heart the criticisms of people like him. The failure of string theory has unfortunately reinforced the skepticism of many people like Unruh about the use of math in theoretical physics.

Wald quoted what sounded like a recent description of what string theorists think they are doing, then revealed that his quotes were from the 19th century, and referred not to string theory, but to the popular theory of the time that atoms were vortices in the ether. He deftly made the point that it is quite possible, if not likely, that string theory is just as wrong an idea as the vortex one.

Lisa Randall made some defensive comments about string theory as a guide for future research, even if it turns out not to work. These included the bizarre political analogy that it was wrong to worry about string theory ruining the credibility of physics, because, after all, the bogus WMD business didn’t seem to have hurt Bush’s credibility.

There were then some questions and comments from the audience. Susskind was in the first row, looking very peevish and defensive. He kept repeating that the field of theoretical physics had “no real choice but to track this down”, meaning to investigate the infinite landscape, and that this would take the efforts of many physicists. He explicitly worried that funding agencies would not give any grants to anyone working on the landscape, to which Unruh responded that the shoe was really on the other foot, with some NSF panelists refusing to fund anyone who wasn’t doing string theory.

The conference web-site also includes an explanation of string theory which claims that in recent years string theory has “evolved very rapidly”, that the reason it can’t be tested is because of the small distance scales involved, and that it may be testable by observing a “5th force”, all of which is a load of nonsense.

Lubos Motl has an interesting post going over all the possible ideas he can think of that might lead to the next superstring revolution. Needless to say, they all sound extremely unpromising to me. Judge for yourself. He also quotes the promotional material for Susskind’s book due out late this year. It seems that “the Laws of Physics as we know them today are determined by the requirement that intelligent life is possible”.

Posted in Uncategorized | 192 Comments

Smart People

Via Slashdot, an article that seems quite relevant to the current situation of string theory.

Posted in Uncategorized | 6 Comments

News From SLAC and Elsewhere

Earlier this week Jonathan Dorfan, the director of SLAC, announced a reorganization of the structure of the laboratory. The new structure involves four divisions, two scientific and two operational. One of the scientific divisions will bring together particle physics and astrophysics. It will be led by Persis Drell, who also will be a deputy director of the laboratory, a position previously held by her father, particle theorist Sidney Drell. The other scientific division will be called “Photon Science”, which will make use of the SLAC x-ray sources. At the moment SLAC produces intense x-ray beams at the SSRL, using synchrotron radiation from a ring which is a descendant of the original SPEAR electron-positron ring that was crucial in the “November Revolution” of 1974 (and which also provided me with a job one summer).

The main SLAC linac is being turned into a free-electron X-ray laser to be called the Linac Coherent Light Source (LCLS), which will be operational in 2009. At that time the plan is for SLAC to be out of the accelerator-based high-energy physics business, with the PEP-II collider also shut down. The last fixed-target experiment using the linac, E158, recently reported the most accurate measurement of the weak mixing angle at relatively low energies (at LEP it was very accurately measured at the Z pole). This measurement shows the running of the ratio of coupling constants predicted by the renormalization group. For more about this experiment, see an article in the latest issue of Nature.
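The “running” being measured is the standard renormalization-group effect: couplings depend logarithmically on the energy scale at which they are probed. Schematically, at one loop (a generic textbook formula, not the E158 analysis itself; $b$ is a theory-dependent coefficient):

```latex
\mu \frac{d\alpha}{d\mu} \;=\; \frac{b}{2\pi}\,\alpha^2
\quad\Longrightarrow\quad
\alpha(\mu) \;=\; \frac{\alpha(\mu_0)}{1 - \dfrac{b}{2\pi}\,\alpha(\mu_0)\,\ln(\mu/\mu_0)} .
```

The sign of $b$ determines whether the coupling grows or shrinks with energy; comparing E158’s low-energy value of the weak mixing angle with the LEP value at the Z pole tests the analogous prediction for the ratio of electroweak couplings.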

This week’s Science magazine also has an article about particle physics. It reports on the HEPAP meeting mentioned here earlier, where a plan to evaluate whether to shut down PEP-II or the Tevatron early was put forward. On a more positive note, the House Appropriations committee has restored some of the cuts in the FY 2006 DOE budget proposed by the White House. The House committee added $22 million to the high energy physics budget, bringing it back to the FY 2005 level (which, accounting for inflation, would still be a cut, but a smaller one).

An article in New Scientist about the same House bill explains that money is being taken away from the ITER international project to build a fusion reactor and used to bring funding for domestic fusion research also back to FY2005 levels. This may have something to do with the fact that the latest news about ITER is that a deal has been reached that will site it in France.

Posted in Uncategorized | 12 Comments

Running Scared

Last Wednesday night, a paper appeared on the arXiv that spelled very bad news for the whole “Landscape” scenario of how to get physics out of string theory. This paper produced what appears to be an infinite number of possible vacuum states for string theory, ruining hopes for getting predictions out of the Landscape by doing a statistical analysis of vacuum states.

Tonight a new paper by a prominent Landscapeologist (Michael Dine) has appeared. The abstract gives no hint of trouble, claiming evidence of “distinctive predictions for the structure of soft breakings”, but the beginning and the end of the paper tell a different story. The second paragraph of the paper admits that the infinite number of states destroys this research program, but deals with this by saying that the author will just ignore the problem for now:

“If this (infinite number of states) is true, many of the ideas discussed in this paper will have to be reconsidered…. the discussion of this paper will be predicated on the assumption that the number of relevant states in the landscape is finite and naive statistical ideas can be applied.”

In the paper’s conclusion, Dine states:

“There are many ways, as we have indicated, in which the ideas described here might fail. Perhaps the most dramatic is that the landscape may not exist, or alternatively that there might exist infinite numbers of states, whose existence might require significant rethinking of our basic understanding of string theory and what it might have to do with nature.”

I’m looking forward to Dine and others finally getting around to “rethinking what string theory might have to do with nature”. It’s about time.

Posted in Uncategorized | 55 Comments

Stalking the Riemann Hypothesis

My friend Dan Rockmore has a new book out, entitled Stalking the Riemann Hypothesis, which is quite good. Dan had the misfortune of starting work on this book at the same time as several other people had the idea of a popular book about the Riemann Hypothesis. For better or worse, his has appeared after the others, which came out last year. In solidarity with him, I haven’t read the others, so can’t directly compare his to theirs.

Dan’s book begins with a mixture of history and explanations of the math involved. In the sections having to do with more recent work, he concentrates on one particular approach to proving the Riemann Hypothesis, an approach that has interesting relations to physics. This involves an idea that goes back to Hilbert and Polya: one should look for a quantum mechanical system whose Hamiltonian has eigenvalues given by the imaginary parts of the Riemann zeta-function zeros. Self-adjointness of the Hamiltonian would force these eigenvalues to be real, which is exactly the statement that the zeros lie on the critical line, i.e. the Riemann Hypothesis. This conjecture has motivated a lot of the research that Dan describes in detail, including relations to random matrix theory, quantization of chaotic dynamical systems, and much else.
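The random-matrix connection can be illustrated with a small numerical sketch (a toy of my own, not anything from the book). Eigenvalues of random Hermitian matrices from the Gaussian Unitary Ensemble exhibit “level repulsion”: the probability of a small gap vanishes like the gap squared, and Odlyzko’s computations found the same statistics in the spacings of the zeta zeros. For 2×2 GUE matrices the eigenvalue gap has a closed form, so the effect can be seen with the standard library alone:

```python
import math
import random

random.seed(0)

def gue2_spacing():
    # 2x2 GUE matrix [[a, b+ic], [b-ic, d]]: real Gaussian diagonal entries
    # with variance 1, complex off-diagonal entry with variance 1/2 per
    # real component.
    a = random.gauss(0.0, 1.0)
    d = random.gauss(0.0, 1.0)
    b = random.gauss(0.0, math.sqrt(0.5))
    c = random.gauss(0.0, math.sqrt(0.5))
    # Eigenvalues are (a+d)/2 +- sqrt(((a-d)/2)^2 + b^2 + c^2),
    # so the gap between them is twice the square root.
    return 2.0 * math.sqrt(((a - d) / 2.0) ** 2 + b * b + c * c)

spacings = [gue2_spacing() for _ in range(20000)]
mean = sum(spacings) / len(spacings)
normalized = [s / mean for s in spacings]

# Level repulsion: tiny gaps are strongly suppressed (p(s) ~ s^2 for GUE),
# unlike independent random levels, where p(s) is flat near s = 0.
tiny = sum(1 for s in normalized if s < 0.1) / len(normalized)
print(f"fraction of normalized spacings below 0.1: {tiny:.4f}")
```

Running this gives a fraction of order 10^-3, far below the roughly 10% one would get from independent, uncorrelated levels.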

Philosophically, I’m very fond of the idea that quantum mechanics is basically representation theory, and that the way to produce interesting quantum mechanical systems is by using geometric constructions of representations via cohomological or K-theoretic methods. While I’m no expert on the Riemann Hypothesis, my favorite idea about it is that proving it will require combining the Hilbert-Polya search for a quantum mechanical system with the cohomological approach that worked in the case of function fields. In that case, the proof of the Weil conjectures famously rested on the idea of constructing an appropriate cohomology theory. This program was carried out by Grothendieck and others during the fifties and sixties, with Deligne finally using the technique to complete the proof in the early seventies.

For the number field case, the most developed conjecture that I know of about what might be the right sort of cohomology theory is due to Christopher Deninger. He has a very interesting recent review article about this, see also his lecture at the 1998 ICM.

Update: For another nice discussion of zeta-functions and the Riemann Hypothesis, see John Baez’s latest This Week’s Finds.

Update: There’s a nice article in the Washington Post about Dan and his book.

Posted in Book Reviews | 17 Comments

US HEP News

The US High Energy Physics Advisory Panel (HEPAP) has been meeting in Washington yesterday and today, and some of the presentations are already available on-line. These include one from the DOE Office of High Energy Physics which notes that, given budgetary constraints, the only way significant funds will become available for new projects (including significant work on the proposed ILC linear collider) is by shutting down operations at the Tevatron or PEP-II. The Tevatron is now scheduled to operate until 2009 (at which point it can’t compete with the LHC), PEP-II at SLAC until 2008. The DOE is asking the P5 committee to advise on whether it might be a good idea to shut these facilities down early and redirect the freed-up funds elsewhere.

There are also reports on the status of PEP-II and the Tevatron. PEP-II and other accelerators at SLAC were shut down after an accident last October and only turned back on last month. The plan now is to run the machine steadily until July 2006, with only a one-month break in October. Presumably it’s down today, since if you try to connect to the SLAC web-site, you get a message saying that power is out at SLAC due to a tree falling and severing the main power feed to the site.

The Tevatron is doing well this year, recently achieving record luminosity, and its integrated luminosity so far this year is running ahead of even optimistic projections. It seems highly unlikely to me that the P5 committee will suggest shutting it down early.

There’s also a report from the ongoing National Academy of Sciences EPP2010 study of the future of US particle physics. Presentations from a meeting earlier this week at Fermilab are now available. These include presentations dealing with what is going on outside the US, including ones from DESY in Germany and KEK in Japan.

The biggest issue facing US particle physics is what to do about the International Linear Collider (ILC) project. In the presentation of Michael Witherell (ex-director of Fermilab), he notes that the world is in a transition from having five major labs running the largest accelerators to possibly only two: CERN with the LHC, and wherever the ILC is sited, if it is built. For US experimental high energy physics to remain a world leader, it is crucial that the ILC be built, and built in the US. Witherell recalls how the US HEP budget has declined by $100-150 million in real dollars over the last few years, but then gives a plan for the future that involves this budget increasing by 4% over inflation every year, something I find hard to believe is going to happen. The EPP2010 site also contains feedback they have received from various members of the community in response to questions about plans for the ILC.

In other experimental HEP news, the Experimental High Energy Physics Job Rumor Mill has been revived, joining the Theoretical Particle Physics Jobs Rumor Mill. Send them both your inside information!

Posted in Uncategorized | 5 Comments

Game Over

Shamit Kachru (described by Lenny Susskind as the “master Rube Goldberg architect”) and collaborators have a new paper out this evening on flux compactifications, one that in a rational world should finish off the subject completely. Recall that Kachru is one of the K’s responsible for the KKLT construction of these flux compactifications that stabilize all moduli, and for the last couple of years debate has raged over whether this sort of construction gives 10^100, 10^500 or even 10^1000 possible string theory vacuum states.

Susskind, Arkani-Hamed, and other anthropic principle aficionados have argued that the fact that this number is at least 10^100 is a great triumph, because it means that there are so many vacua that at least some will have a small enough cosmological constant to be consistent with our existence. But if there are too many, all hope of getting predictions out of string theory disappears. With 10^1000 vacua, you can find not only the cosmological constant you want, but probably any values of anything particle experimentalists have ever measured or ever will measure, and the theory becomes completely unpredictive.

Even so, the study of these vacua has become more and more popular over the last year or two, with many arguing that, no matter how big the number is, at least it’s finite, so you have improved over the standard model, which has continuously tunable parameters. This argument was made in the panel discussion at the Perimeter Institute a month or so ago. Also, a finite number of vacua allows you to study their statistics, by assigning a weight one to each possible vacuum state and getting a probability measure by dividing by the total number. You can then engage in wishful thinking that this probability measure will be peaked about certain values, giving a sort of prediction.
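The weight-one counting measure described above, and the way a cutoff-regulated infinite set of vacua ruins it, can be seen in a hypothetical toy model (mine, not anything from the literature): label vacua by an integer flux n with “compactification volume” v(n) = n, give each vacuum weight one below a volume cutoff, and average some observable. The answer never settles down as the cutoff is removed; it is set entirely by the regulator:

```python
# Hypothetical toy landscape (for illustration only): vacua are labeled by
# an integer flux n = 1, 2, 3, ..., each with "compactification volume"
# v(n) = n.  Weight-one counting assigns each of the N vacua below a volume
# cutoff the probability 1/N.

def weighted_average(observable, cutoff):
    """Average of `observable` over all vacua with v(n) <= cutoff."""
    vacua = range(1, cutoff + 1)
    return sum(observable(n) for n in vacua) / cutoff

# Take the observable to be the volume itself.  Half of all vacua below any
# cutoff V sit in the range [V/2, V], so the "prediction" tracks the cutoff:
for cutoff in (100, 1000, 10000):
    print(cutoff, weighted_average(lambda n: n, cutoff))   # ~cutoff/2 each time
```

This is the simplest version of the problem quoted from the new paper below: with infinitely many vacua, any regulated statistic is dominated by vacua near the cutoff, so no cutoff-independent prediction survives.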

The new paper gives a construction of flux compactifications of type IIA string theory, and in this case the authors find an infinite number of possibilities. This should kill off any hopes of extracting predictions from string theory by counting vacua and doing statistics. The authors try to put a brave face on what has happened, writing:

“we should emphasize that the divergence of the number of SUSY vacua may not be particularly disastrous. A mild cut on the acceptable volume of the extra dimensions will render the number of vacua finite.”

but then they go on to puncture their own argument by noting that:

“one can legitimately worry that the conclusions of any statistical argument will be dominated by the precise choice of the cut-off criterion, since the regulated distribution is dominated by vacua with volumes close to the cut-off.”

With this new result, the infinitesimally small remaining hope of getting predictions out of the string theory landscape framework has vanished. It will be interesting to see whether this at all slows the ever-increasing number of string theorists working in this field.

Update: Lubos Motl has some comments about this same paper.

Posted in Uncategorized | 25 Comments