The NSA, NIST and the AMS

Among the many disturbing aspects of the behavior of the NSA revealed by the Snowden documents, the most controversial one directly relevant to mathematicians was the story of the NSA’s involvement in a flawed NIST cryptography standard (for more see here and here). The New York Times reported:

Classified N.S.A. memos appear to confirm that the fatal weakness, discovered by two Microsoft cryptographers in 2007, was engineered by the agency. The N.S.A. wrote the standard and aggressively pushed it on the international group, privately calling the effort “a challenge in finesse.”

The standard was based on the mathematics of elliptic curves, so this is a clearly identifiable case where mathematicians appear to have used their expertise to subvert the group tasked with producing high-quality cryptography. A big question this raises is what the NIST will do about it. In April they removed the dubious algorithm from their standards, and published the public comments (many of which were highly critical) on a draft statement about their development process.

At the same time a panel of experts was convened to examine what had gone wrong in this case, and this panel has (on a very short time-scale) just produced its report (associated news stories here, here and here). The rules of how such panels are set up evidently require that each panelist provide an individual report, rather than attempt to have a consensus version. The new NIST document gives these reports together with minutes of the meetings where the panelists were provided with information. It seems that the NSA provided no information at all as part of this process, and they remain unwilling to answer any questions about their actions.

Appendix E contains the individual reports. These include, from Edward Felten:

The bottom line is that NIST failed to exercise independent judgment but instead deferred extensively to NSA with regard to DUAL_EC. After DUAL_EC was proposed, two major red-flags emerged. Either one should have caused NIST to remove DUAL_EC from the standard, but in both cases NIST deferred to NSA requests to keep DUAL_EC…

at the time NIST had nobody on staff with expertise in elliptic curves. NSA’s vastly superior expertise on elliptic curves led NIST to defer to NSA regarding DUAL_EC, while NIST people spent more of their limited time on other parts of the standard that were closer to their expertise.

From Bart Preneel:

There is no doubt that the inclusion of Dual EC DRBG in SP 800-90A was a serious mistake…
The explanations provided by NIST are plausible, but it seems that not all decisions in the standardization process of SP 800-90A are properly documented; moreover, we did not have access to the source documents. This means that it is impossible to decide whether this mistake involved in addition to clever manipulation of the standards processes by NSA also some form of pressure on the technical and/or management staff of NIST. It is also not clear whether there would be any traces of such pressure in documents. Without access to the documents, it is also difficult to decide whether or not NIST has deliberately weakened Dual EC DRBG…

However, it seems that NSA (with its dual role) seems to be prepared to weaken US government standards in order to facilitate its SIGINT role. This undermines the credibility of NIST and prevents NIST reaching its full potential in the area of cryptographic standards. In view of this, the interface between NSA and NIST and the role of the NSA should be made much more precise, requiring an update to the Memorandum of Understanding. At the very least, the terms “consult”, “coordination” and “work closely” should be clarified. Ideally, NIST should no longer be required to coordinate with NSA. There should be a public record of each input or comment by NSA on standards or guidelines under development by NIST.

From Ronald Rivest (the “R” in “RSA”):

Recent revelations and technical review support the hypothesis that, nonetheless, the NSA has been caught with “its hands in the cookie jar” with respect to the development of the Dual-EC-DRBG standard. It seems highly likely that this standard was designed by the NSA to explicitly leak users’ key information to the NSA (and to no one else). The Dual-EC-DRBG standard apparently (and I would suggest, almost certainly) contains a “back-door” enabling the NSA to have surreptitious access. The back-door is somewhat clever in that the standard is not designed to be “weak” (enabling other foreign adversaries to perhaps exploit the weakness as well) but “custom” (only the creator (NSA) of the magical P,Q parameters in the standard will have such access).

NIST (and the public) should know whether there are any other current NIST cryptographic standards that would not be acceptable as standards if everyone knew what the NSA knows about them. These standards should be identified and scheduled for early replacement. If NSA refuses to answer such an inquiry, then any standard developed with significant NSA input should be assumed to be “tainted,” unless it possesses a verifiable proof of security acceptable to the larger cryptographic community. Such tainted standards should be scheduled for early replacement.

One way this goes beyond the now-withdrawn NIST standard is that the committee also looked at other current NIST standards now in wide use, at least one of which depends upon a specific choice of elliptic curves made by the NSA, with no explanation provided of how the choice was made. In particular, Rivest recommends changing the ECDSA standard in FIPS 186 because of this problem.

For a detailed outline of the history of the Dual-EC-DRBG standard, see here. Note in particular that this states that in 2004 when the author asked where the Dual-EC-DRBG elliptic curves came from, the response he got was “NSA had told not to talk about it.”
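To make Rivest’s “custom, not weak” point concrete, here is a toy sketch of the trapdoor structure he describes, transplanted from the elliptic-curve group into ordinary modular exponentiation so it fits in a few lines. Everything here (the prime, the base, the variable names) is invented for illustration; the real Dual_EC_DRBG works with elliptic-curve points P and Q and truncates its output, which complicates but does not prevent the attack.

```python
import secrets

# Toy multiplicative-group analogue of the Dual_EC_DRBG back door.
# The group is Z_p^* for a prime p (a made-up choice for illustration).
p = 2**127 - 1
Q = 3                               # a public element (toy choice)
d = secrets.randbelow(p - 2) + 2    # the designer's secret trapdoor exponent
P = pow(Q, d, p)                    # published alongside Q; P = Q^d is kept secret

def drbg_step(state):
    """One step of the toy DRBG: emit Q^state, advance the state to P^state."""
    return pow(Q, state, p), pow(P, state, p)

# An honest user generates "random" output.
s0 = secrets.randbelow(p - 1) + 1
out1, s1 = drbg_step(s0)
out2, s2 = drbg_step(s1)

# The designer, seeing only out1 = Q^s0, computes out1^d = Q^(d*s0) = P^s0,
# which is exactly the generator's next internal state...
recovered = pow(out1, d, p)
assert recovered == s1
# ...and can therefore predict every future output.
predicted = pow(Q, recovered, p)
assert predicted == out2
```

The design is “custom” rather than “weak” in precisely Rivest’s sense: against anyone not holding d, recovering the state requires solving a discrete logarithm problem, while the holder of d reads off the state from a single output.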

Also this week, the AMS Notices contains a piece by Richard George, a mathematician who worked at the NSA for 41 years before recently retiring. Presumably this was vetted by the NSA, and is a reasonably accurate version of the case they want to make to the public. Personally I’d describe the whole thing as outrageous, for a long list of reasons, but here I’ll just focus on what it says about Dual-EC-DRBG, since it now seems likely that this is all we will ever get from the NSA about the subject. It says:

I have never heard of any proven weakness in a cryptographic algorithm that’s linked to NSA; just innuendo.

The reaction from a commenter here (publicly anonymous, but self-identified to me) was:

As a member of a standards committee involved in the removal of the mentioned algorithm from a standard, none of the members believe the “innuendo” theory, and all believe it was deliberately weakened.

Read carefully (and I think it was written very carefully…), note that George never directly denies that the NSA back-doored Dual-EC-DRBG, just claims there is no “proven weakness”. In other words, since how they chose the elliptic curves is a classified secret, no one can prove anything about how this was done. All the public has is the Snowden documents, which aren’t “proof”. This is highly reminiscent of the US government’s continuing success at keeping legal challenges to NSA actions out of the courts, even when what is at issue are actions that everyone agrees happened, on the grounds that the plaintiffs can’t “prove” that they happened, since they are classified. Snowden’s release of documents may yet allow some of these cases to come to a court, just as they were the one thing capable of getting the NIST to acknowledge the Dual-EC-DRBG problem.

I hope that there will be more response to the NSA issue from the Math community than there has been so far. In particular, Rivest’s call for the removal from NIST standards of material from the NSA which the NSA refuses to explain should be endorsed. The innuendo from George is that the NSA may be refusing to explain because they used superior technology to choose better, more secure elliptic curves. If this is the case I don’t see why an official statement to that effect, from the NSA director, under oath, cannot be provided.

On the many other issues the George article raises, I hope that the AMS Notices will see some appropriate responses in the future. Comments here should be restricted to the NIST/NSA story, with those from anyone knowledgeable about this story most encouraged.

Update: The NIST has made available on its website the materials provided to the panel looking into this.

One remarkable thing about the panel’s investigation is that the NSA evidently refused to participate, in particular refusing to make anyone available to answer questions at the panel’s main meeting on May 29 (see page 12 of the report). This appears to be in violation of the Memorandum of Understanding that governs the NIST/NSA relationship, which explicitly states that “The NSA shall … Be responsive to NIST requests for assistance including, but not limited to, all matters related to cryptographic algorithms and cryptographic techniques, information security, and cybersecurity.” All evidence I’ve seen is that the NSA sees itself as above any need to ever justify any of its actions. I can’t see any possible argument as to why they did not have an obligation to participate in the work of this committee.

Update: A new development in this story is a letter from Congressman Grayson to Director of National Intelligence Clapper asking exactly the right questions about what happened at the NIST. It will be interesting to see if a member of Congress can get anything out of the NSA beyond the usual stone-walling.

Posted in Uncategorized | 23 Comments

Mathematics Items

  • For an Oxford conference last week, Langlands contributed a one-hour video talk, filmed in his office. One hour was not enough, so hours two and three are also available, as well as a separate text, and some additional comments.
  • The latest AMS Notices has a long section of excellent articles about Friedrich Hirzebruch and his mathematical work.
  • Also in the AMS notices is a long defense of the NSA, written by a mathematician who worked there for 41 years. About the main recent controversy here, the Snowden revelation of an NSA backdoor in an NIST standard, all the author has to say is:

    I have never heard of any proven weakness in a cryptographic algorithm that’s linked to NSA; just innuendo.

    This seems to me to exemplify pretty well the disturbing tactic of the US security establishment of claiming there is no problem while refusing to discuss anything problematic since it is classified.

  • Bhargava, Skinner and my colleague Wei Zhang have a new paper out proving that better than 66% of elliptic curves satisfy the BSD conjecture. It seems not implausible that they or others might in the not too distant future get to 100%. One should note though that showing 100% of elliptic curves satisfy BSD wouldn’t be the same thing as showing all elliptic curves satisfy BSD, so wouldn’t be eligible for the $1 million Millennium prize.
  • With the ICM less than a month away, I find it outrageous that no one has yet leaked to me the names of the Fields Medal winners. All I’ve heard is speculation, and the only name I’d bet any money on is Bhargava.
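To spell out the 100%-versus-all caveat in the BSD item above: “X% of elliptic curves” here means a density in an ordering of curves (by height, as I understand the Bhargava–Skinner–Zhang setup, whose details are an assumption on my part), so “100% of elliptic curves satisfy BSD” would be the statement

```latex
\lim_{X \to \infty}
\frac{\#\{\,E/\mathbb{Q} \;:\; H(E) \le X,\ E \text{ satisfies BSD}\,\}}
     {\#\{\,E/\mathbb{Q} \;:\; H(E) \le X\,\}} \;=\; 1 ,
```

a statement that is perfectly compatible with infinitely many curves violating the conjecture, hence the ineligibility for the Millennium prize.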

Update: For something both ICM and Langlands related, Michael Harris on his web site has his ICM contribution Automorphic Galois representations and the cohomology of Shimura varieties. Many of the ICM 2014 proceedings contributions are already available on arXiv, via this search.

Posted in Langlands | 23 Comments

String Theory and Post-Empiricism

Note: This is being published simultaneously here and at Scientia Salon. Discussion will be at the Scientia Salon site.

Last month’s Strings 2014 conference in Princeton included two remarkable talks by prominent physicists, both of whom invoked philosophy in a manner unprecedented for this kind of scientific gathering. On the first day, Paul Steinhardt attacked the current practice of inflationary cosmology as able to accommodate any experimental result and thus, on philosophical grounds, as no longer science. He included a video clip of Richard Feynman characterizing this sort of thing as “cargo cult physics”. On the final day, David Gross interpreted Steinhardt’s talk as implicitly applying to string theory, then went on to invoke a philosopher’s new book to defend string theory, arguing that string theorists needed to read the book in order to learn how to defend what they do as science.

The book in question was Richard Dawid’s String Theory and the Scientific Method, which comes with blurbs from Gross and string theorist John Schwarz on the cover. Dawid is a physicist turned philosopher, and he makes the claim that string theory shows that conventional ideas about theory confirmation need to be revised to accommodate new scientific practice and the increasing significance of “non-empirical theory confirmation”. The issues of this kind raised by string theory are complex, so much so that I once decided to write a whole book on the topic (Not Even Wrong). A decade later I think the arguments of that book still hold up well, with its point of view about string theory now much more widespread among working physicists. One thing I wasn’t aware of back then was the literature in philosophy of science about “progressive” vs. “degenerating” research programs, which now seems to me quite relevant to the question of how to think about evaluating string theory.

I’ve written a bit about the Dawid book and earlier work of his (see here and here), although as for any serious book there’s of course much more to say, even if I lack the time or energy for it. Recently an interview with Dawid appeared, entitled string theory and post-empiricism, which summarizes his views and makes some claims about string theory critics which deserve a response, so that will be the topic here. In the interview he says:

I think that those critics make two mistakes. First, they implicitly presume that there is an unchanging conception of theory confirmation that can serve as an eternal criterion for sound scientific reasoning. If this were the case, showing that a certain group violates that criterion would per se refute that group’s line of reasoning. But we have no god-given principles of theory confirmation. The principles we have are themselves a product of the scientific process. They vary from context to context and they change with time based on scientific progress. This means that, in order to criticize a strategy of theory assessment, it’s not enough to point out that the strategy doesn’t agree with a particular more traditional notion.

Second, the fundamental critics of string theory misunderstand the nature of the arguments which support the theory. Those arguments are neither arbitrarily chosen nor uncritical. And they are not decoupled from observation. String theory is indirectly based on the empirical data that drove the development of those theories string theory aims to unify. But more importantly for our discussion, the arguments for the viability of string theory are based on meta-level observations about the research process. As described before, one argument uses the observation that no-one has found a good alternative to string theory. Another one uses the observation that theories without alternatives tended to be viable in the past.

Taking the second part of this first, Dawid seems to be claiming that Smolin and I don’t understand what he calls the “No Alternatives Argument” (discussed in detail in his book, as well as in this recent paper). In response I’ll point out that one of the concluding chapters of my book was entitled “The Only Game in Town” and devoted explicitly to this argument. To this day I think that a version of such an argument is the strongest one for string theory, and is what motivates most physicists who continue to work on the theory. The version of this argument that I hear often privately and that has been made publicly by theorists like Edward Witten goes something like:

ideas about physics that non-trivially extend our best theories (e.g. the Standard Model and general relativity) without hitting obvious inconsistency are rare and deserve a lot of attention. While string theory unification hasn’t worked out as hoped, we have learned a lot of interesting and unexpected things by thinking about string theory. If they see a new idea that looks more promising, string theorists will shift their attention to that.

This is a serious argument, one that I tried to seriously address in the book. Beyond that, more naive versions of it seem to me to have all sorts of obvious problems. Of course, if you really can show that alternatives to a given model are impossible, that’s a convincing argument for the model, but this is rarely if ever possible. Working scientists beating their heads against a hard problem are always in the position of having “no alternatives” to some flawed ideas, until the day when someone solves the problem and finds the alternative. The only example I can recall seeing from Dawid of a successful example of the “No Alternatives Argument” is the discovery of the Higgs, and I find that very hard to take seriously. Pre-2012, the Standard Model was a very precise and exhaustively tested theory, providing a huge amount of indirect evidence for the Higgs. There were plenty of alternatives (technicolor, SUSY, etc.), all much more complicated and with no evidence for them. Making a “No Alternatives Argument” for a theory with overwhelming experimental evidence behind it is something completely different than trying to do the same thing for a theory with zero experimental evidence.

As for the other mistake that Dawid thinks string theory critics make, that of believing in some unchanging notion of empirical theory confirmation, the first thing to point out is that of course every theorist is well aware that one can’t just demand experimental predictions and confirmation for ideas, that one spends basically all one’s time working on better understanding ideas that are far from the point where empirical confirmation comes into play. The second thing to point out is that I agree completely with Dawid that as experiments become more difficult, one needs to think about other ways of evaluating ideas to see if they are going anywhere. The last chapter of my book was devoted to exactly this question, arguing that physicists should look carefully at how mathematicians make progress. Mathematics is certainly “post-empirical”, and while logical rigor is a constraint, it is not one that necessarily points mathematicians to fertile new ideas. There is a long history and a deeply-ingrained culture that helps mathematicians figure out the difference between promising and empty speculation, and I believe this is something theoretical physicists could use to make progress.

The epigraph of that last chapter, though, was something that kept going through my head when thinking about this, a line from Bob Dylan’s “Absolutely Sweet Marie”:

But to live outside the law, you must be honest.

Yes, theoretical particle physics is in a stage where empirical results are not there to keep people honest, and new and better “post-empirical” ways of evaluating progress are needed. But these must come with rigorous protections against all-too-human failings such as wishful thinking and Lee Smolin’s “groupthink”, and I just don’t see those anywhere in Dawid’s proposal for new kinds of theory confirmation.

I’d like to thank Massimo Pigliucci for the opportunity to write something here at Scientia Salon, and hope it will generate an interesting discussion. Contributions from philosophers to this kind of debate in physics I believe are very much needed, on this issue and others. Don’t even get me started about the multiverse…

Update: Frank Wilczek, unlike Gross and Schwarz, is not a fan of Dawid. From Twitter:

Wheeler: “Mass Without Mass”, “Charge Without Charge”, “Field Without Field”
Dawid: “Physics Without Physics”

Update: Sabine Hossenfelder blogs about this here, with a response from Dawid here. Dawid writes

My claim is that a strong record of empirical confirmation in a research field under certain conditions can increase the probability of the viability of other theories in the field that have not been empirically confirmed. The currently predominant philosophical conception of theory confirmation (Bayesianism) equates confirmation with the increase of the probability that a theory is true or viable. For that reason I speak of “non-empirical theory confirmation”.
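For reference, the standard Bayesian notion of confirmation Dawid is invoking is simply probability-raising, in textbook form:

```latex
E \text{ confirms } H
\;\Longleftrightarrow\;
P(H \mid E) > P(H),
\qquad
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)} .
```

In Dawid’s “non-empirical” variant, the evidence E is a meta-level observation about the research process (e.g. “no alternatives have been found”) rather than an experimental result.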

This seems to just be an argument that HEP theorists have been successful in the past, so one should believe them now; an argument in conflict with the claim that things have changed due to the difficulty of getting relevant experimental data.

Posted in Uncategorized | Comments Off

Quick Links

  • Jim Simons is profiled in the Wall Street Journal yesterday and the New York Times today. The WSJ piece is partly about a recent $50 million donation to the Simons Center for Quantitative Biology at Cold Spring Harbor, but it reports that Simons is moving away from “broad institutional support”, in favor of “collaborative, goal-driven science”. Recently Simons has funded the Simons Array of telescopes that will be looking at polarization in the CMB, and the NYT piece reports that he was talking to Stanford physicists working on experiments looking for the axion. Simons is estimated to have a net worth of $12.5 billion; the Simons Foundation now has $2 billion.
  • Quite a few years ago I started a trip to Paris by getting off the plane from New York and heading directly to attend talks at the Séminaire Bourbaki. The main thing I remember now of that is an epic struggle to stay awake, since I hadn’t slept on the plane, and the room was rather overheated. There’s now a much better way to enjoy talks from this historic program, which since its inception has been the source of some of the great expositions of new mathematics. Talks are on YouTube; links are on the latest program (learned about this from Emmanuel Kowalski’s blog).
  • In other news from France, this year’s Baccalaureat exam features questions about the Higgs and the LHC. They start off with a quote from Carlo Rovelli about the Higgs discovery being “as important for intellectual history as Newton’s law of gravitation”. Rovelli’s reaction: “I’ve never thought such a stupid thing.” For more on the mini-controversy, see here.
  • For more videos to watch, Oxford has an interview with Atiyah here, Penrose here. Cambridge has a large collection of such video interviews, including Peter Swinnerton-Dyer, John Coates, Martin Rees and John Polkinghorne.
  • The AMS has been encouraging discussion in the mathematical community of the implications of the Snowden revelations about the activities of the NSA, supposedly the largest employer of mathematicians in the US. This month’s Notices includes pieces from Keith Devlin and Andrew Odlyzko, introduced by Michael Harris and Allyn Jackson. Further contributions to this discussion are encouraged.
  • On the abc conjecture front, perhaps the planned lecture series this September by Go Yamashita will give mathematicians a fighting chance to understand Mochizuki’s claimed proof. While the talks will be in Japanese, presumably Yamashita will be producing something written in English.

Posted in Uncategorized | 8 Comments

Physical Mathematics and the Future

The “vision” talk at Strings 2014 that I found most interesting was that of Greg Moore, whose topic was “Physical Mathematics and the Future”. He has a very extensive written version of the talk here, which includes both what he said, as well as a lot of detail about current topics at the interface of mathematics and physics.

I think what Moore has to say is quite fascinating: he gives a wonderful survey of where this intellectual subject is, how it has gotten there, and where it might be going. I’m very much in sympathy with his historical discussion of how math and physics have evolved, at times closely linked, at others finding themselves far apart. He’s concerned about an issue that I’ve commented on elsewhere, the fact that physics and math seem to be growing apart again, with no mathematicians speaking at the conference, instead attending their own conference (“String-Math 2014”). Physics departments increasingly want nothing to do with mathematics, which is a shame. One reason that Moore gave for this I found surprising, the idea that

most mathematicians are not as fully blessed with the opportunities for pursuing their research that many theoretical physicists enjoy.

It seems there’s a perception among many physicists that research mathematicians labor under some sort of difficult conditions of low pay and high teaching loads, but I think this is a misconception. Moore may be generalizing too much from the situation at Rutgers, where very unusual positions were created for string theorists at the height of that subject’s influence. From what I’ve seen, the salaries of top research mathematicians and theoretical physicists are quite comparable (if you don’t believe me, do some searches in the on-line data of salaries of faculty employed by public universities). Senior mathematicians do sometimes have slightly higher teaching loads, although often with a freedom to teach what they want. At the postdoc level, it is true that theoretical physics postdocs typically have no teaching, while similar positions in math often do require teaching. On the other hand, the job situation in theoretical physics is much more difficult than in mathematics. I’d say that working in an environment where you know you’re likely to find a permanent job is much preferable to one where you know this is unlikely, with doing some teaching not at all a significant problem.

On the question “What is String Theory?”, Moore’s take was that the “What is M-theory?” question is no longer getting much attention, with people kind of giving up. There was a very odd exchange at the end of the talk, when Witten asked him if he thought that maybe people should be emphasizing the string question, not the M-theory question, and Moore responded that the emphasis on M-theory was something he had learned from Witten himself.

His main point about this though was one I very much agree with, that the more interesting question now is “What is QFT?”. The standard way of thinking about QFTs in terms of an action principle doesn’t capture much of the interesting things about QFT we have learned over the years. Moore emphasizes certain examples, such as the (2,0) 6d superconformal theories, but discusses in his written version the relation of QFT to representation theory of some infinite dimensional groups, which I think provides even better examples of a different and more powerful way of thinking about QFT.

The written version contains a wealth of information surveying current topics in this area, and is highly recommended to anyone who wants to try to understand what people working on “string theory” and mathematics have been up to. It appears that this document is a work in progress, with more material possibly to come (for instance, there’s a section 4.4 on Geometric Representation Theory still to be filled in). I look forward to future versions.

Posted in Strings 2XXX | 15 Comments

String Theory Visions

Strings 2014 ended today, with five separate “vision” talks, giving a good picture of where the leaders of the string theory community see the subject going. I saw the talks on streaming video; presumably they will appear on the conference website in the next few days.

  • Michael Green led off by noting that string theory had at one point been intended as a unified theory, but now has “blossomed into something much more significant”, a framework covering all sorts of things. He went on to say that he would avoid discussing the grand questions, and instead just give a summary of what he found interesting about work reported at the conference. The main area he covered in the talk was the many different kinds of results about scattering amplitudes that are under study. He seemed to be somewhat of a skeptic about Arkani-Hamed’s widely publicized “Amplituhedron”, saying it had some ways to go before it was useful for computations (unlike other methods).
  • Juan Maldacena gave a talk that had nothing to do with string theory, mainly about the conceptual issues of black holes and quantum theory. He claimed that the BICEP2 results showed that quantum gravity was now an experimentally-based subject, answering those who were skeptical about studying an untestable subject.

    At the end of the talk he gave an answer to the question “What is String Theory?”:

    Solid Theoretical Research In Natural Geometric Structures

  • Andy Strominger gave his own answer to the “What is String Theory?” question:

    anything that anybody in this room or any of their friends has ever worked on.

    He did note that there were hardly any strings anywhere to be seen at recent string theory conference talks.

    About the LHC and any conceivable follow-on higher energy accelerator, his comment was that it was now highly unlikely that string theory could make predictions relevant to them, and that he didn’t want this to be a defining goal of the field. Clearly, the failure to find SUSY at the LHC has now pretty much killed off most hopes that string theory unification is relevant to particle physics in any testable way. Like Maldacena, he pointed to BICEP2 as showing that quantum gravity was now an experimental subject.

    He ended by explaining that he had sent around emails to a hundred people asking for their suggestions about what problems there would be progress on over the next 5-10 years. He got 80 responses, and quickly put up some slides with them. There was no time to really read these, but he says that they’ll be posted online soon, and that should be a quite interesting document.

  • The last talk was by David Gross, who pointed out that he has given many of these things before. He then went on to discuss Paul Steinhardt’s impassioned talk earlier in the week, which included a video of Richard Feynman discussing “Cargo Cult Science” and theories that were too vague to be testable. Steinhardt had been arguing that inflation was so vague and flexible a theory that it could not be tested and so was not science. Gross, like many others, realized that you could replace “inflation” by “string theory” in Steinhardt’s argument. He then gave a long and very defensive discussion of why string theory might still be science, invoking the recent book by Richard Dawid and telling the audience they needed to read it so they could defend their subject against the accusations it is facing.

    I wrote here last year about the Dawid book, including an explanation of Dawid’s three main arguments for string theory. Gross went through these in detail, and I think what I wrote last year also responds to what Gross has to say. He did include in the “Meta-Inductive Argument” the argument that SUSY is a related research program to string theory and that it will be vindicated at the LHC in the next few years. It will be interesting to hear what he has to say at a future Strings 20XX after this hasn’t worked out. He announced that Strings 2015 will be in Bangalore, Strings 2016 in Tsinghua, Strings 2017 in Israel, Strings 2018 in Japan and Strings 2019 in Belgium.

    He explicitly addressed the fact that many in the field were experiencing depression and anxiety due to things not working out, pointing out that even if the first derivative of progress in a field is negative, there can be jump discontinuities. So, although things don’t look good, maybe a big piece of progress will come along out of nowhere.

There was one more vision talk, but I’ll discuss that in a separate posting.

Update: Strominger’s slides are available here, and include the 80 responses he got from others about the open problems of the field.

Update: Videos of the talks are now available.

Posted in Strings 2XXX | 18 Comments

Pierre van Baal 1955-2013

I only recently heard about the death late last year of Dutch particle theorist Pierre van Baal. Pierre was my office mate when we were both postdocs at Stony Brook during the mid-eighties, and he was one of the people I most enjoyed talking with about physics during those years. He arrived at Stony Brook after completing a Ph.D. with Gerard ‘t Hooft, and later went on to CERN and ultimately a professorship in Leiden. I last saw him at a conference in Stony Brook in 2008 (described here) where he told me that he had suffered a serious stroke in 2005. Pierre was quite modest, and always a cheerful and optimistic presence in the room. In 2008 he was still very much himself, but more halting in his speech. From what I remember, he told me that he was resuming teaching, but was frustrated that he was no longer capable of engaging in research work.

The lecture notes of his class on quantum field theory are available (an online version here, a published version here). Like Pierre himself, they’re a model of clear and concise thinking and exposition. Last year a book with a selected collection of his papers was published, see here.

A major theme in Pierre’s work was the study of quantum gauge theory via semi-classical methods, for the case of a system in a “box”, i.e. of finite extent in the space and time directions. In the Euclidean picture, periodic boundary conditions in time correspond to doing computations at non-zero temperature, with the temperature inversely proportional to the size of the time direction. As a result, this work is quite relevant to the study of QCD at finite temperature, including the expected deconfining phase transition.
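The relation between Euclidean time extent and temperature invoked here is the standard one from thermal field theory; as a sketch (in natural units, with a bosonic field for concreteness):

```latex
% The thermal partition function is a trace over states, which
% becomes a Euclidean path integral over field configurations
% periodic in imaginary time tau with period beta = 1/T.
Z(\beta) = \mathrm{Tr}\, e^{-\beta H}
         = \int_{\phi(\tau+\beta,\,\vec{x}) \,=\, \phi(\tau,\,\vec{x})}
           \mathcal{D}\phi \; e^{-S_E[\phi]},
\qquad \beta = \frac{1}{T}.
```

So a box whose Euclidean time direction has extent $L_t$ describes a system at temperature $T = 1/L_t$ (bosonic fields are periodic in $\tau$, fermionic fields antiperiodic), which is why finite-box semi-classical methods bear directly on finite-temperature QCD.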

While Pierre was not really a lattice gauge theorist, his work was highly relevant to lattice gauge theory, where computer simulations inherently take place in a finite box, and understanding the effects of this on the physics is crucial. As a result, Pierre was well-known in the lattice gauge theory community. This week Columbia University is hosting Lattice 2014, the big yearly meeting for lattice gauge theorists (plenary talks are streamed online). This morning I attended the talk by Michael Mueller-Preussker on Recent results on topology on the lattice, which was given in memory of Pierre. The slides have a lot more information about Pierre and his work, as well as surveying the latest on lattice results involving the topology of gauge fields.

For more about Pierre, see this site, which has an appreciation written by Chris Korthals Altes (in English here), as well as pieces by ‘t Hooft and Hans van Leeuwen (in Dutch).

Posted in Uncategorized | 2 Comments

2015 Breakthrough Prizes in Mathematics

The first set of winners of the $3 million Milner/Zuckerberg financed Breakthrough Prizes in mathematics was announced today: it’s Donaldson, Kontsevich, Lurie, Tao and Taylor. There’s a good New York Times story here.

When these prizes were first announced last year, I was concerned that they would share a problem of Milner’s Fundamental Physics Prizes, an emphasis on rewarding one particular narrow area of research. I’m happy to say that I was wrong: the choices made are excellent, including a selection of the absolute best people in the field, working in a wide range of areas of pure mathematics. The prize winners are mathematicians who are currently very active, doing great work. It’s clear that there was an effort to avoid making this a historical prize, i.e. giving this to people purely for great work done in the past (which to some extent the Abel Prize is doing). The recipients are on average in their 40s, at the height of their powers.

One oddity is the award to Kontsevich, who already received $3 million from the Fundamental Physics prize. Given my interests, I suppose I shouldn’t criticize a prize structure where physicists get $3 million, mathematicians $3 million, and mathematical physicists $6 million.

While this prize doesn’t suffer from the basic problem of the Physics prize (that of rewarding a single, narrow, unsuccessful idea about physics), it’s still debatable whether this is a good way to encourage mathematics research. The people chosen are already among the most highly rewarded in the subject, with all of them having very well-paid positions with few responsibilities beyond their research, as well as access to funding of research expenses. The argument for the prize is mainly that these sums of money will help make great mathematicians celebrities, and encourage the young to want to be like them. I can see this argument and why some people find it compelling. Personally though, I think our society in general, and academia in particular, is already suffering a great deal as it becomes more and more of a winner-take-all, celebrity-obsessed culture, with ever greater disparities in wealth, and this sort of prize just makes that worse. It’s encouraging to see that most of the prize winners have already announced intentions to redirect some of the prize money for a wider benefit to others and the rest of the field.

Update: Among the private reactions I’ve heard from prominent mathematicians this morning, one is the desirability of funding a new “sidekick” prize for collaborators of the $3 million winners…

Posted in Uncategorized | 24 Comments

Smoking Gun No Longer Smoking

The BICEP2 paper is now out in Physical Review Letters, with major revisions to its conclusions from the preprint/press conference version of last March. For another sort of associated revision, compare this (from a March 17 Stanford press release):

Linde, now a professor of physics at Stanford, could not hide his excitement about the news. “These results are a smoking gun for inflation, because alternative theories do not predict such a signal,” he said. “This is something I have been hoping to see for 30 years.”

to this (from an interview with Linde in the latest New Scientist):

I don’t like the way gravitational waves are being treated as a smoking gun.

If we found no gravitational waves, it wouldn’t mean inflation is wrong. In many versions of the theory, the amplitude of the gravitational waves is miserably small, so they would not be detectable.

Last month, Resonaances broke the news that there was a problem with the BICEP2 claims, specifically with the bottom line (and punch line) of their preprint abstract:

Subtracting the best available estimate for foreground dust modifies the likelihood slightly so that r=0 is disfavored at 5.9σ.

Back then the BICEP official reaction to the Resonaances claim that they were admitting to a mistake was “We’ve done no such thing.” Post-refereeing, there have been extensive changes in the paper (for example, the “DDM2” dust model based on scraped Planck data is gone), and the bottom line of the abstract has been changed to:

Accounting for the contribution of foreground dust will shift this value [non-zero r at 7.0 sigma] downward by an amount which will be better constrained with upcoming data sets.

If the BICEP collaboration is still not admitting a mistake in their treatment of Planck data or the bottom line of their preprint, then it seems that referees have told them they can’t publish these in PRL.

Back in March the BICEP2 results made the front page of the New York Times with a Dennis Overbye story Space Ripples Reveal Big Bang’s Smoking Gun, but today the NYT has Astronomers Hedge on Big Bang Detection Claim, which explains well what has been going on.

Update: Nature has a story out about this, which includes the news of a recent presentation at a Moscow cosmology conference by Jean-Loup Puget of the Planck collaboration:

Using for the first time the newest Planck maps available, Puget and his collaborators have directly examined the polarization of dust in these high galactic regions rather than extrapolating from dustier regions in the plane of the Milky Way. Averaging over some 350 high-galactic-latitude patches of sky similar in size to the region observed by BICEP2, Puget reported that polarization from interstellar dust grains plays a significant role and might account for much of the BICEP2 signal that had been attributed to inflation-generated gravitational waves. Puget told Nature that an article detailing these findings would be published in about six weeks.

Update: I’ve been watching Paul Steinhardt’s talk at Strings 2014, where he’s giving a dramatic attack on the way inflationary cosmology is being pursued as in violation of the scientific method. One thing he does is put up exactly the Linde quotes from this posting.

Posted in Uncategorized | 47 Comments

Strings 2014

This coming week and the next, Princeton will host both the big yearly string theory conference Strings 2014 and Prospects in Theoretical Physics 2014, a program designed to train young physicists in string theory.

Princeton is definitely the right place for this, since it is now very much a singular point in the world-wide theoretical physics research community. At the IAS the director is a string theorist, as is the past director (now a faculty member). Of the other four senior HEP theorists, three are string theorists and one might be described as a fellow-traveller. Over at the university, of the nine faculty in HEP theory, all are string theorists except for one junior faculty member.

One will be able to follow the Strings 2014 talks live here, and video and slides should be posted here.

The Strings 20XX conferences provide a good place to see what the latest trends in string theory are, with the talks chosen to highlight what some of the most influential people in the field consider the most interesting work. I’ve written posts on the blog here about previous such conferences, which one can compare to this year’s to see how the field has evolved. Looking at the list of over 70 talks and their topics, here are some things that strike me (in many cases, much the same things as in other recent years):

  • Talks actually about strings are a small minority (20%), something that has been true for quite a few years. The percentage may have grown from a minimum back in 2011 when some of the speakers were mentioning the small role strings were playing at a string theory conference.
  • AdS/CFT and holography remain a dominant theme, as they have for many years. Possible applications of this to condensed matter physics are a continuing hot topic. The previous hot topic of this kind, applying AdS/CFT to heavy ion physics, now seems to be dead, something people would rather not talk about anymore since it never worked out as advertised.
  • Amplitudes are the other big hot topic.
  • Discussion of the LHC and hoped-for LHC results in the past was often a major topic at these conferences. Now that the LHC results are in and a huge disappointment (no SUSY or extra dimensions), it looks like there’s a chance the LHC will not be mentioned at all at this year’s conference, with “string phenomenology” in general the topic of only a very few talks.
  • String phenomenology does have its own yearly conference (see here), but at least as far as the US participants go, the top US research institutions are not represented there, whereas they are heavily represented at the Princeton conference. Whatever “string phenomenology” is these days, it’s not popular at all among the Princeton crowd. It’s no longer being done at the most prestigious US institutions, and in Europe is concentrated in certain places (popular in England for some reason, not at all in France).
  • While research into string theory unification schemes now seems to be very unpopular at Princeton, for some reason it’s a topic that the young must still be trained in. The PiTP program includes a series of lectures on string compactifications, for which the Princeton people needed to bring in Martijn Wijnholt from Munich, one of the places still doing this kind of thing.
  • To the extent there’s anything about connection to experiment, B-modes are the hot topic.
  • There was a time when mathematicians were sometimes invited to Strings 20XX, but that’s over and done with. It seems most prominent string theorists no longer want to hear anything from mathematicians.
  • Finally, zero about the multiverse or the landscape. Clearly some on the organizing committee still have strong opinions and are not going to tolerate that kind of nonsense.
  • Witten will just give a 15-minute welcoming speech. In the past, David Gross has ended the conference with a “vision” speech. This year there will be five “vision talks”, and it may be interesting to see a wider range of opinions on where the field is heading.

Update: One more notable thing about this version of the yearly conference is that (as far as I can tell), it’s the first one in many years that has not included a promotional public talk about string theory. It may very well be that this was considered unnecessary in Princeton.

Update: Martijn Wijnholt’s lectures to the students and postdocs in Princeton about string compactifications are available here. Lots of nice material on Calabi-Yaus and algebraic geometry, nothing at all about extracting the standard model from all this. One thing that has always surprised me is how little most string theorists know about the state of the art of getting particle physics out of the theory. This is less surprising now after seeing the kind of lectures they get on the subject.

Posted in Strings 2XXX | 44 Comments