Sidney Coleman 1937-2007

I just learned today the sad news of the death of Sidney Coleman, yesterday at the age of seventy. Coleman had been in quite poor health in recent years. I wrote about him here back in 2005, after attending a conference held at Harvard in his honor.

Update: More from Betsy Devine, Lubos Motl and Sean Carroll.

Posted in Obituaries | 25 Comments

Project X and Flavor Physics

Last week Fermilab hosted two workshops on the so-called Project X proposal for building a linac designed to produce a high-intensity proton beam. The first workshop dealt with issues surrounding the proposed accelerator itself, the second with the physics it might be able to investigate. Project X is being discussed in the context of an increasing realization that prospects for the ILC getting approved and built anytime soon are slim, so the US particle physics community in general, and Fermilab in particular, need a viable plan B for what they will be doing during the next decade. DOE Under Secretary for Science Raymond Orbach, in a recent talk at Fermilab, made it clear that he thinks the ILC is still at the stage of an R&D project, not yet near the point where a decision about it can be made and a full engineering design developed. For commentary on this from Barry Barish, director of the ILC project, see here.

One argument for Project X is that it would help develop some of the linac technology needed for the ILC, but the main arguments for the machine revolve around a striking change of direction for US particle physics: from using colliders to do experiments at the energy frontier to fixed-target physics at lower energies. In some ways this would be a return to the older style of particle physics experiment that was the norm before the era of colliders. The point of Project X would be to produce a beam capable of generating more intense beams of neutrinos for neutrino physics, and to do what is now often called “flavor physics”. This is the study of phenomena involving heavy quarks and/or rare decays, with the hope of seeing beyond-the-Standard-Model effects that occur not in the lowest order approximation, but in higher order contributions to decay rates. There are quite a few decays one can look for that either can’t occur at all in the Standard Model, or can occur only at unobservably small rates. An observation of such a decay and a measurement of its rate would provide evidence of new physics. Many such studies already conducted provide strong bounds on quite a few possibilities, so one can imagine competing with colliders such as the LHC to either rule out or find new TeV-scale physics by doing this sort of experiment.

One interesting document to read about this is the account of a panel discussion on charm physics that occurred this past August. A participant emphasized how history has recently been running against the people working on flavor physics, telling the following story:

… over lunch we were talking about the future of the field, and I was drifting off, and ended up in a fantasy world where things were done the right way. And in this world the LHC was in fact built and came on the air, and found the Higgs, and found many new events that we couldn’t explain with the Standard Model. And people had realised that in order to interpret these possible signals of new physics, we would also have to have flavour physics studies of rare phenomena, so that we could start to see patterns emerging… and working symbiotically together, the LHC and the flavor sector would get to the root of what was happening, something that would be very difficult if not impossible to do with the LHC alone.

But then I woke up. And I thought about a colloquium I’d given recently, where one of the chief experimentalists there took me into his office and shut the door and said to my face, “Flavor physics is dead!” and apparently he’s not the only one who said it: some pretty important people have said it. And when something like that is said over and over it begins to have a truth of itself.

Deciding whether Project X makes sense will require figuring out exactly what kinds of experimental results it will make possible that would not be possible using existing or currently planned facilities. For more about this, see the introductory and wrap-up talks by Joe Lykken and a talk by Jon Bagger that summarizes the issues well. The workshop also featured an excellent talk by Michelangelo Mangano summarizing the current situation of particle physics, emphasizing what it might be possible to learn through other means than the LHC, which is what is getting almost all the attention these days. He pointed to the activities of the CERN Working Group on the Interplay Between Collider and Flavour Physics that are documented at this web-site.

Update: Alexey Petrov was at the Project X workshop, and has a very interesting posting about it.

Posted in Experimental HEP News | 35 Comments

Popularizing Science

While it’s not one of my main goals in life, I’m all in favor of the idea of popularizing science and making it as accessible as possible to as many people as possible. But sometimes I do wonder about the kind of things scientists get involved with when they try to do this. Just this morning I ran into these stories about science that make me ask myself:

  • Is it a good idea for physicists to appear on a radio show discussing what happened before the big bang, or does the lack of any evidence about this or of a convincing model mean that this is just inherently too speculative a topic to be sold as serious science to a wide audience? Should one perhaps leave this topic to the Bogdanovs?
  • Is it a good idea for physicists to promote to the public their work on time travel? Or might this also give the public some misleading ideas about science? (via i postdoc, therefore I am, but there seems to be a whole genre of “time travel” books written by theoretical physicists).
  • Is it a good idea for physicists to appear on a TV show explaining the forces involved in crushing beer cans, as part of a segment on whether women can crush beer cans with their breasts? Especially physicist bloggers known for attacking other physicist bloggers for their sexism and media-inflated nonsense? (via here and here)
Posted in Uncategorized | 70 Comments

An Exceptionally Simple Theory of Everything?

    It’s been unusually long since my last posting, with the main reasons being that

  • Not much has been happening on the math/physics front…
  • I’ve been busy learning more about geometric Langlands, which is a daunting subject. I keep intending to write something about recent work by Witten and others in this area, but saying anything both correct and intelligible seems a rather challenging task that I haven’t been quite up for.
  • Garrett Lisi has a new paper on the arXiv, with the rather over-the-top title of An Exceptionally Simple Theory of Everything. Sabine Hossenfelder has a typically excellent posting about the paper, and Garrett has been discussing his work with people in the comment section there. Lubos Motl has a typically, how shall I say, Lubosian posting on the topic.

I’m the first person thanked in the acknowledgment section of the paper, but at Sabine’s blog Garrett explains that this is just because he is using reverse alphabetical order. I’ve corresponded with him in the past about his research in this area, without being able to provide any real help other than a certain amount of encouragement. Two of the ideas he is pursuing are general ones I’m also very fond of. One is well known and has been tried by many people: bringing together the internal gauge symmetry and the symmetry of local frame rotations. The problems with this are also well known, and some have been brought up by the commenters at Sabine’s blog. I don’t think Garrett has found the answer to this, or that he claims to. I’m still hopeful that this line of thinking will lead somewhere, but I think some dramatically different new idea is still needed. The other idea he likes is that of trying to interpret the fermionic degrees of freedom of the BRST method for handling gauge invariance as providing the fermions of the Standard Model. I suspect there is something to this, but to get anywhere with it a much deeper understanding of BRST will be required. I’ve been spending a lot of time trying to understand some of the mathematics related to BRST in recent years, and am in the middle of writing some of this up. It seems to me that there is a lot that is not yet understood about this topic even in much simpler lower-dimensional contexts, so we’re a long way from being able to really see whether something can be done with this idea in a realistic four-dimensional setting.

    One idea Garrett is fond of that has generally left me cold is the idea of unification via a large simple Lie algebra like E8. While there may be some sort of ultimate truth to this, the problem is that, just as for GUTs and for superstring models, all you’re doing when you do this is changing the unification problem into the problem of what breaks the large symmetry. This change in the problem adds some new structure to it, but just doesn’t seem to help very much, with the bottom line being that you get few if any testable predictions out of it (one exception is with the simplest GUTs, where you do get a prediction, proton decay, which turns out to be wrong, falsifying the models).

Anyway, I’m glad to see someone pursuing these ideas, even if he hasn’t come up with solutions to the underlying problems. Garrett is a serious and competent researcher who has pursued a non-traditional career path, and was recently awarded a grant by the FQXi organization. You can read more about him in an article on their web-site.

Unfortunately, some of the reaction to Garrett’s article has been depressing. A commenter who sounds well-informed but hides behind anonymity goes on about “this nonsense” (although Garrett’s polite reaction to him/her did lead to a more sensible discussion). Early on in my experience with blogs I believed that no serious professional in particle physics would attack someone and try to carry on a scientific argument anonymously, so any such comments had to be coming from misguided students, or someone not in the profession. Unfortunately I’ve all too often seen evidence that I was wrong about this. Lubos Motl on his blog denounced the fact that Garrett’s paper appeared in the hep-th section of the arXiv, then later wrote in to Sabine’s blog to crow that it had been removed from hep-th. As always with the arXiv, how moderation occurs there is non-transparent, so I don’t know how or why this happened. My own experience with the arXiv over trackbacks to hep-th has been a highly disturbing one. The current hep-th policy seems to be to allow any sort of nonsense to be posted there if it fits into the current string-theory-based ideology (see for example here), while suppressing any criticism of this. A paranoid person might be tempted to wonder whether hep-th is being moderated by someone so ideological and petty that criticism of string theory, or including string theory critics in an acknowledgment section, would be cause for having one’s article removed from hep-th…

    Update: I hear from Garrett that the story of this paper at the arXiv is that it was submitted to gr-qc, not hep-th. Before it was posted, it was re-classified as hep-th, and appeared there. Later on (after the appearance of Lubos’s blog entry denouncing the arXiv for allowing the paper on hep-th I believe), it was re-classified again, this time as general physics (with cross-listing to hep-th).

    Update: Latest news about this is that the paper has now been reclassified again, to the perfectly appropriate hep-th, cross-listed as gr-qc, although no one seems to know why this happened. Another continuing mystery is the trackback situation: there are four trackbacks to the paper, to postings by Lubos, Bee, and to Physics Forums, as well as to an old TWF from John Baez that doesn’t even link to the paper. My postings still seem to be non-trackback worthy on hep-th, not that I can argue with this particular case, since the discussion elsewhere has been more substantive (except for Lubos’s, which is valuable for the way it accurately represents the hysterical reaction to speculation that is not string theory speculation all too common in certain quarters).

    Update: Garrett is making the news here. Whether this is a good thing is yet another question for debate on the next thread, I guess. A lot of the attraction for the media seems to be his personal story. Maybe it’s a good thing for physics for people to see that one can be a theoretical physicist while surfing in Hawaii…

    Update: Lisi-mania spreads. See stories in New Scientist, the Ottawa Citizen, Slashdot, and probably lots of other places I haven’t noticed.

    Update: Steinn Sigurdsson has an excellent posting summarizing the situation. As usual, blogs are the place to get the highest quality information about scientific issues…

    Update: I’ve given up on keeping track of the media stories on this. For some discussion of the representation theory involved, see this posting by Jacques Distler, and comments from Garrett.

    Update: The Angry Physicist examines the Distler critique in some detail.

    Posted in Uncategorized | 176 Comments

    Pressure Mounts to Tie String Theory to the Real World

    Last week David Gross was in New Mexico, giving an “unclassified talk” at Los Alamos, and one on The State of String Theory at the Santa Fe Institute. There’s a report on the Los Alamos talk from the Los Alamos Monitor, entitled Loose Strings: Pressure mounts to tie string theory to the real world. Unfortunately, pressure to tie string theory to the real world leads sometimes to reporters getting misled about such ties, since the article includes the information that:

    Located on the border of France and Switzerland, the LHC’s headline tasks include the potential discovery of a Higgs boson, a relatively massive particle known as “the god particle,” that would help explain how other particles have mass. Proof of its existence would tend to support string theory, according to the theorists.

Hermann Nicolai has an article in a recent issue of Nature entitled Back to Basics, on a more promising idea for tying string theory to the real world, one that has nothing to do with using string theory as an idea about unification. He reports on recent progress towards getting an exact solution of N=4 super Yang-Mills theory, allowing one to test whether it really is dual to a string theory.

    According to a blog posting on Superstring Theory and the End of Man, we better hope that superstring theory doesn’t connect to the real world, because if it does, a combination of the Anthropic Principle and the Doomsday Argument would show that humanity doesn’t have much time left. The author expresses the opinion that mankind better hope that I am right about string theory, an opinion I endorse even if I disagree with his logic.

    Finally, on a completely unrelated note, the latest issue of Symmetry magazine is out, featuring the results of a reader’s contest to invent new particles. Third place goes to Jacobo Konigsberg, the spokesperson for CDF, who postulates the blogino, which he describes as

    Particles created by non-abelian Blog-Blog interactions. Bloginos typically are produced in a very excited state and with a high degree of spin. Even though all their properties have not yet been determined, it is commonly agreed that they exhibit considerable truthiness. They also have the annoying ability to propagate into extra dimensions, away from the blogosphere, and generate lots of phone calls.

    Update: Lubos has a link to a Youtube video of a version of this talk by Gross that he gave in Berkeley on October 19, together with commentary. It appears to me essentially the same talk that Gross gave here in New York three and a half years ago, which I wrote about in my first real blog posting here. It is striking to note how little has changed in this field during this period.

    Posted in This Week's Hype | 15 Comments

    Is Science Near Its Limits?

The past two days I’ve been at a conference here in Lisbon organized by the Gulbenkian Foundation on the ostensible topic of Is Science Near Its Limits? The Gulbenkian is probably the best-known and best-funded cultural organization in Portugal, and it includes a world-famous museum housing the wonderful art collection of its founder, Calouste Gulbenkian, who made his fortune in the oil business early in the last century.

The conference was extremely well-run and well-attended, filling a large lecture hall where there was simultaneous translation of the talks into Portuguese. It was organized by literary critic, writer and polymath George Steiner, who gave the introductory talk. I hadn’t known that Steiner had started out studying mathematics, but was discouraged from pursuing a career in the subject at the University of Chicago by Irving Kaplansky, which led to his turning to the study of literature and philosophy. Steiner had quite a lot to say that was provocative to scientists, including questioning whether they had been able to justify to the public the large sums of money being spent on the LHC, and characterizing the lack of testability of string theory as strong evidence that science had hit a limit beyond which it could not progress.

On the whole the rest of the speakers actually didn’t have much to say about limits of science, taking the standard view of most scientists that their own field had a bright future, with no limits in sight. The final talk of the conference did return to the limits issue, with John Horgan giving an uncompromising defense of the thesis of his 1996 book The End of Science (although he did allow that possible advances in neuroscience, such as the decoding of a neural code, could be as revolutionary as previous advances). While the scientists in the audience took Steiner’s attacks in stride, partly because he was our host, they were less charmed by Horgan, who got a rather hostile reaction from many of them. I hope he’ll write about his point of view on the conference at his blog, or discuss it in one of his Bloggingheads discussions with George Johnson.

I was one of the few other speakers discussing the question of limits, with my talk emphasizing that particle physics is now in a new environment, different from that of the past, one in which progress, even revolutionary progress, is possible, but much more difficult. A written version of my talk is available here. I was paired with string theorist Dieter Lust, who gave a presentation of the case for string theory unification and the Landscape. We were introduced by Gustavo Castelo Branco of the IST, who emphasized recent advances in our understanding of neutrinos. Also speaking in another session was Luis Alvarez-Gaume of CERN, who gave a very upbeat talk on the prospects for particle physics, taking the point of view that string theorists, like anyone else, will give up on string theory if it doesn’t work out. He already sees a diminishment of interest in string theory among particle physicists, with people moving instead towards subjects that promise some sort of interaction with experimental data. The three of us were brought together later for an interesting small and very lively discussion of the issues surrounding string theory and recent media attention to it. This was taped, and may appear in some form or other in the future.

    Update: There’s an entertaining conversation between John Horgan and George Johnson about the Lisbon conference now up at Bloggingheads.

    Posted in Uncategorized | 61 Comments

    Comment on Technicolor/Extended Technicolor Models

    Robert Shrock of Stony Brook sent me the following to post as a response to one of the comments on the latest posting. With his permission I’m putting it here as a separate posting, since I think it’s a valuable informed summary of the current state of technicolor/extended technicolor models.

I would like to respond to Eric’s recent comment on Oct. 23 in which he said that “technicolor models were… eventually rejected due to some serious shortcomings. Namely, in order to generate fermion mass hierarchies for the SM fermions, one ends up with serious problems with FCNC’s,” and that these theories “led to a plethora of technimesons, for which there is absolutely no evidence.”

While it is true that FCNC’s are a relevant constraint on technicolor and extended technicolor (TC/ETC) theories and were viewed as very serious before the development of walking TC in the mid-1980s, they do not obviously exclude TC/ETC models where the TC sector has walking behavior. The walking (slow running of the TC gauge coupling over an extended range) results naturally from the presence of an approximate IR fixed point in the renormalization group equation for the TC gauge coupling. This walking has the effect of enhancing SM fermion masses for a fixed set of ETC breaking scales. Indeed, since the mid-1980s, the only viable TC models have been those with walking behavior. This walking allows one to use higher ETC breaking scales and still obtain the same SM fermion masses. It also enhances the masses of (pseudo)Nambu-Goldstone bosons. This was discussed in T. Appelquist and L. C. R. Wijewardhana, Phys. Rev. D 35, 774 (1987); Phys. Rev. D 36, 568 (1987) and reviewed already a number of years ago, e.g., in R. S. Chivukula, hep-ph/9503202, hep-ph/9803219 and K. Lane, hep-ph/0202255, as well as more recent reviews such as C. Hill and E. Simmons, hep-ph/0203079 (published in Phys. Repts.) and my brief SCGT06 review, hep-ph/0703050. PNGB’s in one-family TC/ETC models may still be a phenomenological concern, but the early estimates of their masses were substantially increased by walking TC.

Let me explain in more detail how TC/ETC models may be able to satisfy FCNC constraints. ETC models generically gauge the generational index and combine it with TC, so a simple $SU(N_{ETC})$ model has $N_{ETC}=N_{gen}+N_{TC}$. With three generations of SM fermions, $N_{gen}=3$, and using the minimal value of $N_{TC}$, namely $N_{TC}=2$, this yields an SU(5) ETC theory. The SU(5) ETC symmetry can break to an exact residual vectorial SU(2) TC gauge group in three stages, characterized by three different mass scales $\Lambda_j$, j=1,2,3. The ETC gauge bosons with masses $\Lambda_1$ mediate transitions between SM fermions of the first generation and the technifermions, and so forth for the other scales. With the values
    $$\Lambda_1 \simeq 10^3\ TeV$$
    $$\Lambda_2 \simeq 10^2\ TeV$$
    and
    $$\Lambda_3 \simeq few\ TeV$$
this model appears to be able to fit constraints on FCNC processes. Consider, for example, one of the most severe such constraints, arising from $K^0 – \bar K^0$ mixing. In early studies in the 1980s, in the absence of an explicit ultraviolet ETC completion, one simply wrote down a generic form for the low-energy effective Lagrangian for this process, $\simeq (c/\Lambda_{ETC}^2)$ times the relevant four-quark operators, where $\Lambda_{ETC}$ was taken to be “the” ETC breaking scale. But the key observation is the following: in the initial $\bar K^0$, the $d\bar s$ pair can annihilate to produce a $V^2_1$ ETC gauge boson, where the indices are the gauged generational indices. But this cannot directly produce the $s\bar d$ pair of the $K^0$, which requires a $V^1_2$ ETC gauge boson. Hence, in order for the $K^0 – \bar K^0$ transition to proceed, the actual ETC propagator factor is not 1 over the mass squared of the $V^2_1$ gauge boson, $1/\Lambda_1^2$, but instead
    $$(1/\Lambda_1^2)\ \Pi\ (1/\Lambda_1^2)$$
where $\Pi$ denotes the requisite nondiagonal propagator insertion that takes a $V^2_1$ to a $V^1_2$. Using a reasonably ultraviolet-complete ETC theory, in the paper hep-ph/0308061, published as Phys. Rev. D 69, 015002 (2004), Appelquist, Piai, and I showed, via explicit calculation of the ETC gauge boson mixings, that the nondiagonal ETC gauge boson mixing term is generically of order the square of a low ETC breaking scale, essentially as a consequence of a residual approximate generational symmetry in the ETC theory. This suppresses the $K^0 – \bar K^0$ mixing strongly, by a factor like $(\Lambda_3/\Lambda_1)^2$; i.e., the coefficient of the four-quark operator is not $1/\Lambda_1^2$ but the much smaller $\Lambda_3^2/\Lambda_1^4$, which is sufficient to satisfy the constraint from the experimentally measured mixing and resultant $K_L – K_S$ mass difference. In this and a series of other papers, taking account of the mixing between ETC group eigenstates of fermions to form mass eigenstates, we also examined many other FCNC constraints on TC/ETC theories and showed that they too appear to be satisfiable.
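
As a rough numerical illustration of the counting and the suppression just described (a sketch only, not a dynamical calculation; the value $\Lambda_3 = 3$ TeV is an assumed stand-in for “a few TeV”):

```python
# Toy bookkeeping for the SU(N_ETC) counting and the FCNC suppression
# discussed above. Lambda_3 = 3 TeV is an assumed stand-in for "a few TeV";
# nothing here is a dynamical calculation.

N_gen, N_TC = 3, 2
N_ETC = N_gen + N_TC                     # = 5, i.e. an SU(5) ETC theory

Lambda_1 = 1.0e3                         # TeV, highest ETC breaking scale
Lambda_3 = 3.0                           # TeV, assumed lowest breaking scale

naive_coeff = 1.0 / Lambda_1**2          # generic 1/Lambda_ETC^2 estimate
mixed_coeff = Lambda_3**2 / Lambda_1**4  # with the nondiagonal propagator insertion

suppression = mixed_coeff / naive_coeff  # = (Lambda_3/Lambda_1)**2
print(N_ETC, suppression)                # 5 9e-06
```

The point of the sketch is just that the propagator mixing replaces $1/\Lambda_1^2$ by $\Lambda_3^2/\Lambda_1^4$, a suppression of about five orders of magnitude for these representative scales.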

There are also FCNC processes that do not involve mixing of ETC gauge bosons. For example, the (conjugate of the) process $s \rightarrow d\ \mu^-\ e^+$ via exchange of a virtual $V^2_1$ ETC gauge boson gives rise to $K^+ \rightarrow \pi^+\ \mu^+\ e^-$, for which the upper bound on the branching ratio (from the E865 experiment at BNL) is $BR(K^+ \rightarrow \pi^+\ \mu^+\ e^-) < 1.3 \times 10^{-11}$ (90% CL). This is satisfied with the above value, $\Lambda_1 = 10^3\ TeV$.
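
For a sense of scale, one can compare the four-fermion coefficient $1/\Lambda_1^2$ to the Fermi constant that sets the strength of ordinary weak decays. This is only a crude dimensional comparison, not the actual branching-ratio computation confronted with the E865 bound:

```python
# Crude dimensional comparison: the ETC four-fermion coefficient for
# Lambda_1 = 10^3 TeV vs. the Fermi constant G_F. This only illustrates
# why such a high scale yields tiny rates; it is not the E865
# branching-ratio calculation.

G_F = 1.166e-5                       # GeV^-2, Fermi constant
Lambda_1_GeV = 1.0e6                 # 10^3 TeV expressed in GeV

etc_coeff = 1.0 / Lambda_1_GeV**2    # GeV^-2
ratio = etc_coeff / G_F              # ~ 8.6e-8

# Rates go like the coefficient squared, so the naive rate suppression
# relative to an ordinary weak decay is ratio**2, of order 10^-15.
print(ratio)
```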

    Although Eric did not mention the effect that technifermions have on Z and W boson propagators, these also serve as a stringent constraint on technicolor models, especially the electroweak S parameter. However, because technicolor is strongly interacting at the scale of a few hundred GeV, it is not possible to use a perturbative estimate of S, and nonperturbative estimates based, e.g., on spectral function integrals, are difficult to make reliably for a walking TC theory since one cannot just scale up results from QCD. (There have been a number of papers over the years giving estimates on this, and I can send a list to anyone who is interested, but the issue is not resolved yet.)

In the SM, electroweak symmetry breaking (EWSB) is produced by the vacuum expectation value of the hypothesized Higgs field. But this breaking is simply put in by hand, via an ad hoc choice of a negative coefficient for the quadratic Higgs term. No explanation is given in the SM of why this coefficient is not positive, when, a priori, it could just as well have been. Since the SM gives no explanation for the negative sign of this coefficient, it does not provide a satisfactory fundamental explanation for EWSB. Indeed, it is interesting to recall that in both of the previous two main cases where a scalar field was used in phenomenological models of spontaneous symmetry breaking, namely the Ginzburg-Landau free energy functional for superconductivity and the Gell-Mann-Levy sigma model for chiral symmetry breaking in hadronic physics, the microscopic physics did not involve the vacuum expectation value of a fundamental scalar field, but instead a bilinear fermion condensate: the Cooper pair in the BCS theory and the quark condensate in the case of QCD. In technicolor theories, it is precisely this type of bilinear fermion condensate, now involving technifermions, that is responsible for electroweak symmetry breaking. Furthermore, the quark condensate in QCD already breaks electroweak symmetry. Thus, the original construction of technicolor models by Weinberg and Susskind was quite well motivated.

    This is, of course, not to say that TC/ETC theories do not face many challenges. They are very ambitious, since they try to explain both EWSB and the spectrum of SM fermion masses, and no fully realistic model of this type has been constructed. Moreover, it is certainly true that far more people are currently working on various variants of SUSY models than on dynamical EWSB approaches such as TC/ETC. But at least readers should know that Eric’s comment refers to the old TC theories of the early 1980’s, which were, indeed, rejected; the results of more recent work indicate that modern walking TC theories appear to be able to satisfy FCNC constraints. PNGB’s and the S parameter are concerns, but, in the opinion of a number of us who work in this area, they do not obviously exclude these theories. In any case, we should know soon from the LHC whether dynamical EWSB via TC/ETC or some other possibility like low-energy SUSY is realized in nature.

    Posted in Uncategorized | 4 Comments

    String Theory’s Next Top Model

    I’ll write soon about the conference I spoke at today in Lisbon. A few hours ago I participated in an interesting discussion about some of the issues around string theory with two string theorists. One of them was quite vehement that a big part of the problem is things being hyped to the media, and that this is an American disease, something that doesn’t happen to the same degree in Europe.

    I think he may be right, since I certainly haven’t noticed as much media hype (either pro or anti-string) from European sources, although I follow US ones more closely. When I got back to the hotel this evening, I noticed that SLAC is promoting on its web-site a news story about String Theory’s Next Top Model. The story appears to be about this paper that was just published in Physical Review D. In it, the authors consider three toy models of inflation in string theory and find that they don’t work. Their conclusion:

    This may be an artifact of the simplicity of the models that we study. Instead, more complicated string theory models appear to be required, suggesting that explicitly identifying the inflating subset of the string landscape will be challenging.

    So, the gist seems to be that they went looking for toy models of string inflation, didn’t find a workable one, but decided that this was worth a SLAC press release, presumably because “string theory is currently the most popular candidate for a unified theory of the fundamental forces”, so one should go to the press with any result one gets, even if negative. I think the Europeans may be right that this sort of thing doesn’t happen here….

    Posted in This Week's Hype | 11 Comments

    Future Linear Colliders

    There are two highly active projects to design a linear collider that would collide electrons and positrons at energies higher than those achieved at LEP. Recently there were workshops discussing the state of the projects.

The ILC is the further along of the two and uses more conventional technology. It is a design for a 250+250 GeV collider, upgradeable to 500+500 GeV. There’s a very jazzy new web-site aimed at selling the idea to the US public. This week Fermilab is hosting a workshop on the ILC, talks available here. Michael Peskin gave an introductory talk with an unfortunate title (“The Physics Landscape”; I really think serious physicists should not be reminding people of this, especially when they’re making a pitch to the public for money). He argues that the LHC will see a spectrum of new particles (needed in order to solve the hierarchy and dark matter problems), and motivates the ILC as the machine to study them. This of course depends on their existence, at a mass low enough to allow production at the ILC, but high enough to evade bounds from LEP and the Tevatron (for some new results from the latter, see here). It now appears certain that no decision about building the ILC will be made until results are in from the LHC (2010?) that will resolve whether there are new particles in the mass range the ILC is capable of studying.

    DOE’s Ray Orbach gave a talk about the ILC project, emphasizing:

    It is critical that planning for the ILC takes into account the realities of the funding situation, the need to formalize the ILC arrangements between governments, the changing scientific landscape, the scientific capabilities at other facilities, and the health of our national scientific structure.

    Orbach seems concerned that the ILC project does not have a realistic schedule (“I judge that these arrangements will require more time than the currently proposed schedule of the GDE”), and does not have commitments from other countries. I’m guessing that he sees financing it out of the current and expected DOE budget as not doable without large contributions from other countries and a relatively long time-frame. He emphasized that there is now a well-defined process for projects like this: they have to survive a series of “critical decisions”. The ILC is not yet ready for the first critical decision, “CD-0, Mission Need”, and won’t be until after the LHC results are in. He also mentions “other planned international projects”, and the importance of not duplicating their activities (I take this to be a reference to CLIC). Finally, he is critical of the plan of the ILC project to move to an “Engineering Design Report” that would give detailed engineering plans for the machine, since he sees it as still in an R and D phase.

    Over at CERN, the Resonaances blog reports on a workshop devoted to CLIC, a more ambitious and less technologically developed plan for a 1500+1500 GeV collider (upgradeable to 2500+2500 GeV). If CLIC really turns out to be feasible, and buildable on a time-scale close to that of the ILC, it will not be possible to justify building the ILC, since it would operate at much lower energy.

    This week internet access is more iffy, since I’m in Lisbon, for a conference late in the week on “Is Science Near Its Limits?”, sponsored by the Gulbenkian Foundation. After it is over, I’ll write about it here, and I think I can post a copy of the talk I’ll be giving.

    Update: Science magazine has a short piece about the Orbach talk, see here:

    Orbach said physicists must follow the department’s protocol that requires a large project to pass five critical decision milestones. The ILC has not passed the first, which allows researchers to proceed from basic R&D to design, Orbach said. Previously, DOE officials had been “completely open” to a less formal approach, says Caltech’s Barry Barish, who leads the design team. What counts as “engineering design” remains to be determined, he says.

    Another piece of this story recently pointed out to me is that the new CERN director is from DESY and associated with the ILC project there. This might cause people at CERN to wonder how hard he’ll push for CERN’s CLIC project, which is in some ways an ILC competitor.

    Posted in Experimental HEP News | 35 Comments

    Susskind Joins PI

    I haven’t been able to confirm Lubos Motl’s claims that the Perimeter Institute offered him a job, but yesterday they announced that Lenny Susskind has accepted an offer to join them as an associate member. According to the announcement, this means that “he will spend focused time at PI each year to conduct research activities.”

    At the Frankfurt book fair, Backreaction’s Stefan Scherer took a picture of one of the displays, which featured a large poster advertising Susskind’s forthcoming book The Black Hole War, which carries the subtitle “My battle with Stephen Hawking to make the world safe for quantum mechanics”.

    Update: Marcus at Physicsforums points to this interview with Susskind about his forthcoming book:

    For two decades an intellectual war took place between Stephen Hawking, on the one side, and myself and Gerard ‘t Hooft on the other. The book is about the scientific revolution that the controversy spawned, but also about the colorful personalities and the passions that gave the story its drama. The story starts in Werner Erhard’s mansion in San Francisco, and eventually passes through all seven continents, including Antarctica.

    Posted in Uncategorized | 36 Comments