String Theory Fails Another Test, the “Supertest”

Wednesday’s CMS result finding no black holes in early LHC data has led to internet headlines such as String Theory Fails First Major Experimental Test (for what this really means, see here). At a talk today at CERN, yet another impressive new CMS result was announced, this one causing even more trouble for string theory (if you believe in purported LHC tests of string theory, that is…).

Back in 1997, Physics Today published an article by Gordon Kane with the title String Theory is Testable, Even Supertestable. It included as Figure 2 a detailed spectrum which was supposed to show the sort of thing that string theory predicts. Tevatron results have already caused trouble for many of these mass predictions. For example, gluinos are supposed to have a mass of 250 GeV, but the PDG lists a lower bound (under various assumptions) of 308 GeV. At CERN today, the CMS talk in the end-of-year LHC jamboree has a slide labeled “First SUSY Result at the LHC!”, showing dramatically larger exclusion ranges for possible squark and gluino masses. Over much of the relevant range, gluino masses are now excluded all the way up to 650 GeV. It looks like string theory has failed the “supertest”.

If you believe that string theory “predicts” low-energy supersymmetry, this is a serious failure. Completely independently of string theory, it’s a discouraging result for low-energy supersymmetry in general. The LHC has just dashed hopes that, at least for strongly-interacting particles, supersymmetry would show up just beyond the energy range accessible at the Tevatron.

Posted in Experimental HEP News | 35 Comments

Physicists Finally Find a Way to Test Superstring Theory

More than ten years ago, the New York Times ran a story explaining that Physicists Finally Find a Way to Test Superstring Theory. At the time, the test was scheduled to start in 2005-6:

In fact, it might be possible to concentrate so many heavy gravitons into a tiny volume of space that they would collapse in on themselves and create miniature black holes, those cosmic sinkholes from which nothing can escape. Experiments like this will be on the agenda when the Large Hadron Collider begins operation in five or six years at the CERN accelerator center in Geneva. “These black holes should be quite safe,” Dr. Giddings said, for they would rapidly evaporate.

Today CMS released the results of the long awaited test of superstring theory, based on 35 inverse picobarns of data. It failed.

Update: Since this is getting wider than usual attention via Slashdot, I suppose I should remove tongue from cheek and make clear what is going on here. Claims such as the one in the 2000 Times headline always were nonsense: string theory unification failed long ago because it can’t predict anything. Various physicists back then came up with “string theory inspired” models of extra dimensions that would in principle have observable effects at LHC energies. There never was any reason at all to believe these models (and they were no more “predictions of string theory” than anything else), but there was a lot of hype about them, often promoted to the media by people who should have known better. Now that the LHC is finally working, the result is exactly what everyone expected: these exotic phenomena that had no good reason to happen don’t actually happen. It’s great evidence that the LHC is working as expected, but not an experimental refutation of string theory.

Posted in Experimental HEP News | 27 Comments

This Week’s Hype

This week’s contribution to the long tradition of universities issuing press releases hyping non-existent “experimental tests of string theory” by their employees is from Duke University, which advertises “String Theory in a Lab”. This is based on a paper that just appeared in Science describing measurements of the viscosity of a Fermi gas. The paper explains the relationship of the measurements to string theory as:

The measurement of the viscosity is of particular interest in the context of a recent conjecture, derived using string theory methods, which defines a perfect normal fluid.

referring to this paper, which first suggested that gauge/gravity duality implies a value of 1/4π (in units of ℏ/k_B) for the ratio of shear viscosity to entropy density.
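For reference, the bound in question (stated here in my own gloss, with the factors of ℏ and k_B restored) is

\[
\frac{\eta}{s} \;\ge\; \frac{\hbar}{4\pi k_B} \;\approx\; 0.08\,\frac{\hbar}{k_B},
\]

with gauge theories that have classical gravity duals conjectured to saturate it. As noted below, the measured Fermi gas values come out at roughly 4 to 5 times this.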

In the press release, this connection to string theory has been promoted to a headline, as well as to the claim that:

The results may also allow experimental tests of string theory in the future.

which I suppose is better than the usual claim in these press releases that what is being promoted is already an experimental test of string theory. It seems likely that one reason this isn’t yet an “experimental test” is that the data comes out 4 to 5 times higher than the string theory value.

Posted in This Week's Hype | 30 Comments

Math Research Institute, Art, Politics, Transgressive Sex and Geometric Langlands

I learned from a colleague last night about recent events bringing together the topics of the title of this posting, something that one wouldn’t have thought was possible. Last Wednesday there was a showing in Berkeley of Edward Frenkel’s short film Rites of Love and Math, together with the Yukio Mishima film Rite of Love and Death that inspired it. Frenkel is a math professor at Berkeley, and one of the leading figures in geometric Langlands research (which he describes as a “grand unified theory of mathematics”). He’s also a wonderful expositor, almost single-handedly making the beauty of a subject initially renowned for its obscurity accessible to a much wider audience. Recently he has worked with Witten on relations of geometric Langlands to quantum field theory, and with Langlands and Ngo on relations to number theory. At the same time, while a visiting professor in Paris, he co-directed (with Reine Graves) and acted in this new film.

MSRI was one of the two sponsors of the showing of the film, but pulled out of this role recently, for reasons explained here by MSRI director Robert Bryant. He had found that some people in the math community were upset by the film and MSRI’s involvement with it, feeling that it glamorized an objectionable view of the relationship of women to mathematics. There’s a plan to organize some sort of event at MSRI to discuss the issues brought up by the film and the decision to withdraw sponsorship.

I still haven’t seen the film, although I gather that a DVD will soon be available. Congratulations to all involved in this for finding a unique way to make mathematics and mathematicians look interesting and worthy of media coverage. I had no idea it was still possible to stir up controversy in the Bay Area with art involving transgressive sex, and would never have thought that using research mathematics was the way to do it.

Update: Andrew Ranicki has written a review of the film for the London Math Society newsletter, available here. He identifies the notorious equation in question (5.7 of http://arxiv.org/abs/hep-th/0610149), and makes the comment that, sartorially, this film is a breakthrough, since, in other films:

By and large, male mathematicians are portrayed as crazies who are smart and lovable, but badly dressed. Likewise for female mathematicians, although they tend to be better dressed. This said, in the film under review, the actors are either very well dressed, or not dressed at all.

Posted in Uncategorized | 25 Comments

The BMO Financial Group Isaac Newton Chair in Theoretical Physics

I learned this morning from Matin Durrani’s blog that the Perimeter Institute today announced the first of what they expect to be five very well-funded Perimeter Research Chairs in theoretical physics. The next four will be named after Maxwell, Bohr, Einstein and Dirac (as well as after whatever other wealthy individual or organization comes up with the funding).

The BMO Financial Group is putting up $4 million and $4 million is coming out of the Perimeter endowment (which is mostly from Blackberry’s Mike Lazaridis). An endowment of $8 million for a chair is quite high. It seems that typical numbers for endowment payouts these days are around 5%, which would provide $400,000 or so a year to pay some prominent theorist. For comparison, the Simons Foundation has recently announced that it will fund endowed Math+X chairs aimed at mathematicians working at the interface with some other subject. Simons may be the wealthiest hedge-fund manager in the world, but he’s a piker compared to the Canadian financiers, with only $1.5 million going to each chair (to be matched by $1.5 million from the institution that gets the chair, for a total of $3 million). Then again, it just may be that prominent mathematicians are dirt-cheap compared to prominent theoretical physicists.

The Perimeter Institute in recent years has moved away from supporting non-mainstream topics in theoretical physics, while expanding dramatically. The only two conferences announced there for the next year or so are on the topics of LHC physics and AdS/CFT, about as mainstream as one can possibly imagine. If they manage to fund what might be the five highest-paid theoretical physics positions in the world and hire the people they want into them, they will be well on their way to a dominant position in the subject. While the tactic here, as in most industries these days, is to shower the top few people in the field with cash, they are also expanding their hiring at more junior levels. According to the rumor mill, last year three of the fourteen people hired to tenure-track positions in theoretical particle physics in North America went to Perimeter.

For more on this, see here, and an interview with Lazaridis and the BMO CEO here.

Posted in Uncategorized | 15 Comments

Assorted News

  • HEPAP is meeting in Washington today, presentations available here. The idea of this regular meeting is for the US HEP community and the funding agencies to meet and plan for the future, something that’s not easily done in an environment where these agencies have no budget at all for the current year, just an authorization to spend money at last year’s rate that expires a couple of weeks from now. No one seems to be sure what funding prospects are for the next few months, much less the next few years. Fermilab is dealing with this situation by offering 600 of its staff incentives to quit or retire next month (see here). There’s a new DOE Committee of Visitors report out here; it contains the bizarrely familiar recommendation of all such reports: the US needs to fund more HEP theory students (they don’t explain why, or where the money should come from).
  • In dark matter news, Princeton this week hosted a workshop on the subject, talks available here. Still no results from the latest Xenon100 run. This week’s Nature has a nice review of the various searches for WIMP dark matter, with the conclusion:

    With the advent of the Large Hadron Collider at CERN, and a new generation of astroparticle experiments, the moment of truth has come for WIMPs: either we will discover them in the next five to ten years, or we will witness their inevitable decline.

    (Update: a commenter points out that this article is also available on the arXiv here.)

    One new astroparticle experiment that is supposed to look for evidence of dark matter is Sam Ting’s Alpha Magnetic Spectrometer, set to be launched in February, and described in a front-page New York Times article yesterday.

  • Ten days after first collisions, ALICE already has two papers out (here and here) with experimental results on lead-lead collisions at an energy more than an order of magnitude higher than ever before. String theorists are very enthusiastic about this (see here and here), claiming that what is being observed is “properties of a type that can be nicely captured using string theory models”. I’d be quite curious to see any AdS/CFT-based predictions that could be compared to these new results (or to forthcoming ones).
  • For the latest from the LHC, see here. The current plan is to have proton-proton beams back around February 21, followed by at least two weeks of beam recommissioning. The proton run would end in November, followed by another ion run. First estimates for 2011 are that the run will be at 4 TeV/beam, and a “reasonable” estimate of total luminosity would be 2.2 inverse femtobarns, double the initial goal. Even more optimistically, the possible “ultimate reach” for next year would be a luminosity that would give a total of 7.2 inverse femtobarns if sustained over the hoped-for 200 days of running. This kind of higher luminosity would allow the LHC to see evidence of a Higgs over the entire expected mass range, as well as allowing it to finally overtake the Tevatron in the Higgs race. The experiments so far are reporting results that exactly match the Standard Model, with more announcements to come at the Winter Conferences early next year.
  • There’s an interesting trend of our LA-based theorist-blogger-media-stars starting to resist making dubious media appearances. A few months ago Sean Carroll described storming off the set of a TV pilot here. Now Clifford Johnson (whose media mishaps include appearing as a scientific expert on the question of how big women’s breasts need to be to crush beer cans, see here) tells us that Sometimes I Say No.
Posted in Experimental HEP News, Uncategorized | 28 Comments

A Geometric Theory of Everything

The December issue of Scientific American is out, and it has an article by Garrett Lisi and Jim Weatherall about geometry and unification entitled A Geometric Theory of Everything. Much of the article is about the geometry of Lie groups, fiber bundles and connections that underpins the Standard Model as well as general relativity, and it promotes the idea of searching for a unified theory that would involve embedding the SU(3)xSU(2)xU(1) of the Standard Model and the Spin(3,1) Lorentz group in a larger Lie group.

The similarities between (pseudo-)Riemannian geometry in the “vierbein” formalism, where there is a local Spin(3,1) symmetry, and the Standard Model with its local symmetries make the idea of trying to somehow unify these into a single mathematical structure quite appealing. There’s a long history of such attempts and an extensive literature, sometimes under the name of “graviGUTs”. For a recent example, see here for lectures by Roberto Percacci. The Scientific American article discusses two related unification schemes of this sort, one by Nesti and Percacci that uses SO(3,11), and another by Garrett that uses E8. Garrett’s first article about this is here, and the latest version is here.
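To give a rough idea of the group theory involved (a sketch of the standard embeddings, my summary rather than the article’s): in the Nesti and Percacci scheme the Lorentz algebra and an SO(10) grand-unified algebra sit inside so(3,11),

\[
\mathfrak{so}(3,11) \supset \mathfrak{so}(3,1) \oplus \mathfrak{so}(10), \qquad
\mathfrak{so}(10) \supset \mathfrak{su}(3) \oplus \mathfrak{su}(2) \oplus \mathfrak{u}(1),
\]

with the dimension count

\[
\dim \mathfrak{so}(3,11) = 91 = 6 + 45 + 40,
\]

the leftover 40 generators transforming as the (4,10) of SO(3,1)xSO(10). In Garrett’s case the large group is instead a noncompact real form of the 248-dimensional E8.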

While I’m very sympathetic to the idea of trying to put these known local symmetry groups together, in a set-up close to our known formalism for quantizing theories with gauge symmetry, it still seems to me that the major obstructions to this have always been there and still are, and I’m skeptical that the ideas about unification mentioned in the Scientific American article are close to success. I find it more likely that some major new ideas about the relationship between internal and space-time symmetry are still needed. But we’ll see; maybe the LHC will find new particles, new dimensions, or explain electroweak symmetry breaking, leading to a clear path forward.

For a really skeptical and hostile take on why these “graviGUT” ideas can’t work, see blog postings here and here by Jacques Distler, and an article here that he wrote with Skip Garibaldi. For a recent workshop featuring Lisi, as well as many of the most active mathematicians working on representations of exceptional groups, see here. Some of the talks feature my favorite new mathematical construction, Dirac cohomology.

One somewhat unusual aspect of Garrett’s work on all this, and of the Scientific American article, is that his discussion of Lie groups puts their maximal torus front and center, as well as the fascinating diagrams you get by labeling the weights of various representations under the action of these maximal tori. He has a wonderful toy to play with that displays these things, which he calls the Elementary Particle Explorer. I hear that t-shirts will soon be available…
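To spell out the simplest instance of what such a diagram shows (standard Lie theory, my example rather than Garrett’s): for SU(3) the maximal torus is two-dimensional,

\[
T = \left\{ \mathrm{diag}\big(e^{i\theta_1}, e^{i\theta_2}, e^{i\theta_3}\big) : \theta_1+\theta_2+\theta_3 = 0 \right\} \subset SU(3),
\]

so weights are points in a plane. The fundamental representation splits into three weight spaces, plotted as the vertices of a triangle, while the adjoint gives the familiar hexagon of roots \(L_i - L_j\) (for \(i \neq j\)) together with the two-dimensional zero weight space of the torus itself. The Elementary Particle Explorer draws pictures of exactly this kind for much bigger groups.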

Update: T-shirts are available here.

Posted in Uncategorized | 110 Comments

The Anderson-Higgs Mechanism

One reason for this posting is that exchanges in the comment section of the previous one led me to look into some history, and I found some odd and possibly interesting facts I hadn’t previously known. So, part of this will just be lifting some links from comments on the last posting.

Another reason is that while the history may seem obscure, what’s at issue is the central unsolved problem of particle physics: the nature of electroweak symmetry breaking, and no excuse for thinking more about this topic should be allowed to pass by. The work of Yang and Mills on non-abelian gauge theory, published in 1954, had one huge problem: in perturbation theory it has massless particles which don’t correspond to anything we see. One way of getting rid of this problem is now fairly well understood, the phenomenon of confinement realized in QCD, where the strong interactions get rid of the massless “gluon” states at long distances (they are relevant at short distances, visible in terms of the jets seen at colliders).

By the very early sixties, people had begun to understand another source of massless particles: spontaneous breaking of a continuous symmetry. If the vacuum state is non-invariant under a continuous symmetry, you expect to find one massless state in the theory for each broken generator of the symmetry. These are called “Nambu-Goldstone” particles, and pions provide an example (only approximately massless, since the symmetry is approximate).
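The counting here is standard textbook material, worth recalling for definiteness: if a continuous symmetry group \(G\) breaks spontaneously to the subgroup \(H\) preserving the vacuum, then

\[
\#(\text{Nambu-Goldstone modes}) = \dim G - \dim H .
\]

For the pions, the approximate chiral symmetry \(SU(2)_L \times SU(2)_R\) of the strong interactions breaks to the diagonal \(SU(2)\), giving \(6 - 3 = 3\) light states, the \(\pi^\pm\) and \(\pi^0\).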

What Philip Anderson realized and worked out in the summer of 1962 was that, when you have both gauge symmetry and spontaneous symmetry breaking, the Nambu-Goldstone massless mode can combine with the massless gauge field modes to produce a physical massive vector field. This is what happens in superconductivity, a subject on which Anderson was (and is) one of the leading experts. His paper on the subject was submitted to Physical Review that November, and appeared in the April 1963 issue of the journal, in the particle physics section. It explains what is commonly called the “Higgs mechanism” in very much the same terms in which the subject appears in modern particle physics textbooks and notes:

It is likely, then, considering the superconducting analog, that the way is now open for a degenerate-vacuum theory of the Nambu type without any difficulties involving either zero-mass Yang-Mills gauge bosons or zero-mass Goldstone bosons. These two types of bosons seem capable of “canceling each other out” and leaving finite mass bosons only.

All that is missing here is an explicit relativistic example to supplement the non-relativistic superconductivity one. This was provided by several authors in 1964, with Higgs giving the first explicit relativistic model. Higgs seems also to have been the first to explicitly discuss the existence in models like his of a massive mode, of the sort that we now call a “Higgs particle”, the target of active searches at the Tevatron and LHC.
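As a minimal sketch of what such an explicit relativistic model looks like (this is the Abelian Higgs model discussed below, written in conventions of my own choosing):

\[
\mathcal{L} = -\tfrac{1}{4}F_{\mu\nu}F^{\mu\nu} + |D_\mu\phi|^2 - \lambda\Big(|\phi|^2 - \tfrac{v^2}{2}\Big)^2,
\qquad D_\mu = \partial_\mu - ieA_\mu .
\]

Expanding around a vacuum with \(\langle\phi\rangle = v/\sqrt{2}\), the would-be Goldstone mode is absorbed by the gauge field, which becomes a massive vector with \(m_A = ev\), while the surviving radial mode is a massive scalar with \(m_h = \sqrt{2\lambda}\,v\), the “Higgs particle” of the model.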

Anderson tells his story here:

So it was probably completed summer ’62. Very little attention was paid to it except that in fact— well, Higgs reinvented it. In some ways the particle physicists tell me he had less understanding; in some ways he had more. He certainly made a real model out of it where I had only a mechanism…

about the Anderson-Higgs phenomenon, if I may use the word. In the paper that I wrote I definitely said people have been worried about the Goldstone boson in broken symmetry phenomena. The Goldstone boson is not necessary. Here is the possibility of removing the Goldstone boson, mixing it with a gauge boson, and ending up with zero mass. [should be “non-zero”, maybe a transcription error]…So I think I really understood the nature of the mechanism…

It was not published as a paper in the Condensed Matter Physics. It was published as a paper in Particle Physics. Brout paid attention to it. And he and Englert two years later produced a model of symmetry breaking, which if you’ll read carefully the summary of their work that ’t Hooft and Veltman give (Nobel Prize winner this year), they say that they took off very much from the Brout-Englert paper, and there’s no way Brout was not perfectly aware of my work and I would be surprised if the Brout-Englert paper doesn’t reference it rather than Higgs or along with Higgs. So in fact it didn’t fall completely on deaf ears.

Note added 5/15/2013: I’ve heard from Martin Veltman that at the time they were working on the renormalizability of Yang-Mills, he and ’t Hooft were not aware of the Brout/Englert work, or of the general issues about the Goldstone theorem and the Higgs mechanism that Brout/Englert and others were addressing. Veltman’s Nobel lecture describes the history in detail, and has nothing like what Anderson describes (neither does ’t Hooft’s).

Given the background Brout had in condensed matter physics and Anderson’s claim that “there’s no way Brout was not perfectly aware of my work”, it is quite surprising that no reference to Anderson occurs in the paper he and Englert published in Physical Review Letters. It arrived at the journal June 26, 1964 and came out in an issue dated August 31, 1964. In historical talks about this given back in 1997 (available here), Brout and Englert write:

We knew from our study of ferromagnetism that long range forces give mass to the spin waves and we were aware, from Anderson’s analysis of superconductivity [5], of the fact that the massless mode of neutral superconductors, which is also a Nambu-Goldstone mode, disappears in charged superconductors in favor of the usual massive plasma oscillations resulting from the long range Coulomb interactions in metals. Comforted by these facts, we decided to confront, in relativistic field theory, the long range forces of Yang-Mills gauge fields with the Nambu-Goldstone bosons of a broken symmetry.

The latter arose from the breaking of a global symmetry and Yang-Mills theory extends the symmetry to a local one [6]. Although the problem in this case is more subtle because of gauge invariance, the emergence of the Nambu-Goldstone massless boson is very similar. We indeed found that there were well defined gauges in which the broken symmetry induces such modes. But, as we expected, the long range forces of the Yang-Mills fields were conflicting with those of the massless Nambu-Goldstone fields. The conflict is resolved by the generation of a mass reducing long range forces to short range ones. In addition, gauge invariance requires the Nambu-Goldstone mode to combine with the Yang-Mills excitations. In this way, the gauge fields acquire a gauge invariant mass!

This work was finalized in 1964.

Very oddly, the only reference to Anderson’s work that they give (their [5]) is to a 1958 paper of his, not to the 1963 paper which had reached the same conclusions as theirs a year earlier.

Brout and Englert don’t give a full model; they just assume the existence of a scalar field with a spontaneously broken symmetry and specified couplings to the gauge fields. Working independently, Peter Higgs in July 1964 sent a paper to Physics Letters arguing that, even relativistically, Anderson’s argument worked and there is no need for massless particles in the case of spontaneous symmetry breaking with a local symmetry. This paper was published, but a paper he sent a week later in which he wrote down an explicit model (the Abelian Higgs model) was rejected. It was later submitted to Physical Review Letters (arriving August 31, 1964) and accepted there (published in the October 19, 1964 issue), where the referee (Nambu) made Higgs aware of the Brout-Englert paper, which Higgs refers to in a footnote. The Higgs paper does refer to Anderson’s 1963 paper, writing in the introduction:

This phenomenon is just the relativistic analog of the plasmon phenomenon to which Anderson [3] has drawn attention.

Higgs gives his version of the history here, and refers to the “Anderson mechanism”, writing:

During October 1964, Higgs had discussions with Gerald Guralnik, Carl Hagen and Tom Kibble, who had discovered how the mass of non-interacting vector bosons can be generated by the Anderson mechanism.

Guralnik, Hagen and Kibble had been working on what Higgs calls the “Anderson mechanism” and Anderson the “Anderson-Higgs mechanism”, writing a paper about it for submission to PRL. Guralnik gives his version of the history here (writing about the “Brout, Englert, Guralnik, Hagen, Kibble, Higgs phenomenon”, Higgs last, no Anderson); Kibble’s is here. In Guralnik’s version:

as we were literally placing the manuscript in the envelope to be sent to PRL, Kibble came into the office bearing two papers by Higgs and the one by Englert and Brout. These had just arrived in the then very slow and unreliable (because of strikes and the peculiarities of Imperial College) mail. We were very surprised and even amazed. We had no idea that there was any competing interest in the problem, particularly outside of the United States. Hagen and I quickly glanced at these papers and thought that, while they aimed at the same point, they did not form a serious challenge to our work.

His explanation for why they did not refer to Anderson is:

At the same time, Kibble brought our attention to a paper by P.W. Anderson [26]. This paper points out that the theory of plasma oscillations is related to Schwinger’s analysis of the possibility of having relativistic gauge invariant theories without massless vector particles. It suggests the possibility that the Goldstone theorem could be negated through this mechanism and goes on to discuss “degenerate vacuum types of theories” as a way to give gauge fields mass and the necessity of demonstrating that the “necessary conservation laws can be maintained.” In general these comments are correct. However, as they stand, they are entirely without the analysis and verification needed to give them any credibility. These statements certainly did not show the calculational path to realize our theory and hence the unified electroweak theory. It certainly did not even suggest the existence of the boson now being searched for at Fermilab and the LHC. The actual verification that the same mechanism actually worked in non-relativistic condensed-matter theories as in relativistic QFT had to wait for the work of Lange [28], which was based on GHK. We did not change our paper to reference the Anderson work.

See Guralnik’s paper for a detailed discussion of the points he feels Anderson, Brout, Englert and Higgs missed about all this. It remains true that the full understanding of how this works non-perturbatively is rather tricky, especially in the chiral context relevant to the Standard Model. It may very well be that there is some important piece of the story that has been missing and will someday lead to a final understanding of the origin of electroweak symmetry breaking.

Update: For two other recent expository articles about this subject and its history, see here and here.

Posted in Favorite Old Posts, Uncategorized | 31 Comments

Massive

There’s a wonderful new book about particle physics that has just come out, Massive: The Missing Particle that Sparked the Greatest Hunt in Science, by Ian Sample, a science correspondent for the Guardian. The topic is the huge open question currently at the center of particle physics: is the Higgs mechanism the source of electroweak symmetry breaking (and, at the same time, the source of the mass terms in the Standard Model)? The Tevatron and the LHC are now in a race to either detect the Higgs particle or rule out its existence, with one alternative or the other very likely to come through within the next few years.

Truly explaining what the Higgs mechanism is can only be done with mathematics and physics background far beyond that expected in a popular book, but Massive makes a good try at it. Sample does a wonderful job of telling the history behind this subject. He’s the first writer I know of who has gotten Peter Higgs to tell his story in detail. The original paper on the subject by Higgs was rejected by Physics Letters, but ultimately published by Physical Review Letters. There’s a complicated priority issue one can argue over, and one that someday soon a Nobel committee may need to resolve, involving Higgs, Englert, Brout, Guralnik, Hagen and Kibble. My personal opinion is that it was the condensed matter theorist Philip Anderson who first understood and described the Higgs mechanism, quite a while before anyone else.

Sample’s book is full of wonderful stories about particle physics, and alludes to some that he can’t give the details of:

On June 8, 1978, Adams marked the achievement in extraordinary fashion. He jotted down a poem about Rubbia and van der Meer’s efforts and sent it out as a memo. The poem — too offensive to reprint here — suggested that Rubbia had exploited van der Meer’s brilliance to further his own career.

The footnote to this says that the memo is in the CERN archive, dated June 8, 1978 and entitled “Approval of ppbar facility”.

One of the later parts of the story involves the discussion of Higgs rumors on particle physics blogs, and debates among the experimental collaborations about this. With a little bit of luck, we may hope to see more of this soon.

All in all, the book is a great read, by far the best popular book this year to recommend to lay people who want some idea of what’s going on in particle physics now and why it is exciting.

Posted in Book Reviews | 29 Comments

This and That

  • There’s a new preprint here explaining the scientific case for running the Tevatron past 2011. A couple of weeks ago the P5 subpanel came out with its report on the subject, generating news stories “Momentum builds for Tevatron extension” and “Panel Wants U.S. to Chase ‘God Particle’—If There’s Money”. The panel recommended extending the Tevatron run, but only if $35 million/year in additional federal funding was provided to do it. Prospects for this are very unclear, with the next stage in the process a decision about whether to include this in the President’s DOE budget request for FY2012, due next February.

    The US for a long time now has operated under a bizarre budgeting system, in which government agencies typically spend much of the time operating without a budget. FY 2011 began October 1, with the Congress still far from having come up with a FY 2011 budget, instead operating the government on a series of “continuing resolutions”, which allow spending at the FY 2010 level. The latest of these expires December 3, after which there will undoubtedly be another one. Sooner or later an “omnibus bill” setting the actual budget will presumably get passed, and one might think the result would correspond to the levels set by the appropriations committees of the House and Senate (which contain an increase over FY 2010 levels). Every other year, though, is an election year, so the Congress that had left town for weeks to campaign comes back after a post-election rest as a lame-duck organization, with lots of its members pushing to delay everything until next year if their side did well in the election. This year the Republicans did very well at the polls arguing that the government spends too much money on “discretionary” things. Unfortunately for HEP, it’s in the relatively small “discretionary” part of the budget. There will undoubtedly be a strong push from the Republicans not to wait for FY 2012, but to start cutting this year’s budget, in the middle of the fiscal year. Given the dysfunctional nature of the US political system, it’s anybody’s guess how this will turn out. For comments from the Fermilab director about this situation, see here.

  • One organization that doesn’t have to worry about federal funding is FQXI, which was initially funded by the Templeton Foundation with a grant that was supposed to take them through the end of last year. I don’t know where their money is coming from these days, but they have recently announced a new essay contest carrying $40,000 or so in prize money (the Gruber Foundation is one sponsor), on the topic “Is Reality Digital or Analog?”.

    Among other FQXI activities, next August its members will go on a cruise during which they will discuss foundational questions related to the nature of time.

  • The LHC proton-proton run has ended for 2010, with an integrated luminosity of about 45 inverse picobarns. The plan is to restart around February 4 and collide protons for 9 months in 2011. It’s possible that the energy of the beams will be raised from 3.5 TeV to 4 TeV. Detailed plans will be made at workshops at Evian and Chamonix (January 24-28). For recent news and some idea of long-term plans, see this talk at the US LHC users organization meeting. It has the LHC shutting down to replace splices from December 2011 to March 2013, and another long shutdown for a luminosity upgrade in 2016. In the very long term, there are discussions of upgrading the machine with new, higher-field magnets that would allow operation at 16.5 TeV/beam, but this is probably about 20 years off…
  • The theorists at CERN have been retreating; see here.
  • There’s an interview with Steven Weinberg in Scientific American, see here.
  • Nature Physics has a review by Eva Silverstein of the Yau/Nadis book (that I discussed here).
  • See Dennis Gaitgory’s website here for some notes in progress on geometric Langlands.
  • Update: One more: an interesting diavlog between Lee Smolin and Robert Wright at the Big Questions Online site.

Update: Tommaso Dorigo has a critical discussion of the paper making the case for an extended Tevatron run.

Posted in Experimental HEP News, Langlands, Uncategorized | 10 Comments