Mathematics Without Apologies

If you’d asked me ten years ago to describe a book I’d love to read that could be characterized as part of an “incredibly unlikely trend in books about math for the general public”, I might have chosen “brilliant meditations on the practice of mathematics and on mathematics at the deepest level, from first-rate mathematicians, focusing on the Langlands program, with expert-level discussion of the subject.” And yet, here we are, not much more than a year after Edward Frenkel’s Love and Math, with the publication last week of another very different but equally fascinating example of exactly this trend: Michael Harris’s Mathematics Without Apologies. If you are interested at all in what mathematics really is and what the best mathematicians really do (and you’re up for an intellectual challenge), I highly recommend that you get a copy and set some time aside for delving into this unusual book.

While Harris shares many of Frenkel’s themes and concerns, his style is very different, favoring density, indirectness, the post or post-post-modern, and deep engagement with history, philosophy and sociology. Only one of these two authors assumes a familiarity with Max Weber. Where Frenkel is ever guileless and straightforward, Harris has a whole chapter on the “trickster”, taking some pride in being known for “Harris’s tensor product trick.” While reading, more than once one wonders whether one is really supposed to take something seriously (for instance, there’s quite a long bit about Thomas Pynchon’s novels and conic sections…).

Normally when I’m reading a book I want to later write about, my practice is to fold down the corners of pages that contain something new, unexpected, especially insightful, or something I’d really like to argue with. Then I can start writing by reviewing those pages. My problem with this book is that I ended up folding down the corners of a large fraction of the pages, so when I sat down to write, my usual method would force me to reread pretty much the entire book. Not a bad idea, since I’m convinced I missed a lot the first time through, but other tasks beckon and it’s not a quick read.

I’m not sure I can do much better here than randomly list a few of the themes of the book: the pleasures of doing mathematics, the role of pure mathematicians in society (Wall Street!) and many forms of art and culture, how best to explain number theory to an insightful actress, the philosophy of mathematics and philosophy of Mathematics (two different things), Indian Metaphysics, n-categories, the yoga of motives, Voevodsky’s univalent foundations, the life and thought of Alexander Grothendieck and Robert Langlands, etc., etc. There are also serious doses of sex (including an extensive discussion of Frenkel’s film), drugs (from Erdős to Andreas Floer to late nights at Oberwolfach) and rock and roll (from the “Math Rock” genre, which I’d never heard of before, to the IAS house band “Do Not Erase”).

Harris manages to move back and forth between the deepest ideas about mathematics at the frontiers of the subject, insightful takes on the sociology of mathematical research, and a variety of topics pursued in a sometimes gonzo version of post-modern academic style. You will surely sometimes be baffled, but definitely will come away knowing about many things you’d never heard of before, and with a lot of new ideas to think about.

For some more about the book, including some early versions of some chapters, see Harris’s website here.

Update: Princeton University Press now has a Q and A with Harris about the book up here.

Update: The book now has a blog.

Posted in Book Reviews | 8 Comments

Snowpocalypse 2015

For the last few days the media in New York have been filled with continuous frantic warnings of the deadly storm of the century bearing down on the city. Grocery stores have been emptied, with long lines of desperate people trying to stock up on supplies.

Midday yesterday Columbia announced that classes were canceled starting at 3pm, Barnard went one hour better, canceling classes starting at 2pm. The city announced that it would be illegal to be in the parks after 6pm (a snow-covered branch might fall on you), the transit system would start shutting down at 7pm and by 11pm there would be no public transit, and all roadways in the entire tri-state area would be closed to non-emergency traffic. The mayor’s office warned people not to try and order takeout delivery since it would be illegal for the delivery people to travel on the streets to deliver it.

By late afternoon the university was deserted, and stores on Broadway had signs announcing early closing due to the impending disaster. Weather reports the day before had said the storm would start at 1pm Monday, but by early evening there hadn’t been much more than snow flurries, with maybe an inch or two total accumulation. When I went to sleep around midnight, the city was completely locked down, with the TV news channels filled with blaring warnings of the two to three feet of snow about to arrive, interspersed with press conferences from public officials telling people to barricade themselves in their homes and not go outside.

The strange thing about this was that if you actually looked at the weather report, they were now forecasting 3-5 inches of snow overnight. Waking up in the morning and looking out the window, all that was visible were more flurries, and a total accumulation of 2-3 inches, with the streets clear. Turning on the TV news, the huge “Blizzard of 2015” logos were still up, and camera crews seemed to have been sent out to search the region (mostly unsuccessfully) for a snow drift to put a reporter in front of. The contrast between looking out the window and watching TV was pretty dramatic.

Anyway, my class today is canceled, so students will have to wait until Thursday to hear more about the mathematics of quantization of the harmonic oscillator (complex structures, squeezed states, coherent states). Lecture notes still being worked on, but this is chapter 21 of the current notes.

Columbia never used to shut down at all, New York City never used to shut down the transit system, and the states never used to shut down all roadways. Until the past decade or so people tried to go about their business here in the winter, taking action to shut things down only once snow had arrived and was causing a problem. The US has now become a nation of hysterics, with media-driven hype frightening everyone about everything, and public officials desperately taking action to protect the citizenry from imaginary threats.

Luckily for us all, people cowering in their homes do have the internet and can still learn quantum mechanics. MIT has just announced that edX will have an online version of their quantum course, Mastering Quantum Mechanics, which looks quite good and will start February 10. The instructor will be Barton Zwiebach, and I’m glad to see that one of the topics covered will be squeezed and coherent states of the harmonic oscillator.

Posted in Uncategorized | 39 Comments

The NSA, NIST and the AMS, Part II

Last summer I wrote here about an article in the AMS Notices which appeared to make misleading claims about the NSA’s involvement in putting a backdoor in a NIST cryptography standard known as DUAL_EC_DRBG. The article by Richard George, a mathematician who worked at the NSA, addressed the issue of the NSA doing this kind of thing by discussing an example from past history when they were accused of doing this, but were actually strengthening the standard. He then went on to claim that:

I have never heard of any proven weakness in a cryptographic algorithm that’s linked to NSA; just innuendo.

This appears to be a denial of an NSA backdoor in the standard, while not saying so explicitly. If there is a backdoor, as most experts believe and the Snowden documents indicate, this was a fairly outrageous use of the AMS to mislead the math community and the public. At the time I argued with some at the AMS that they should insist that George address explicitly the question of the existence of the backdoor, but didn’t get anywhere with that. One of their arguments was that George was speaking for himself, not the NSA.

The question of fact here is a very simple and straightforward mathematical one: how was the choice used in the standard of points P and Q on an elliptic curve made? There is a known way to do this that provides a backdoor. Did the NSA use this method, or some other one for which no backdoor is known? The NSA refused to cooperate with the NIST investigation into this question. The only record of what happened when the NIST asked about how P and Q were chosen early on in the development of the standard is this, which indicates that people were told by the NSA that they were not allowed to publicly discuss the question.
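For readers who want to see the shape of the alleged backdoor concretely, here is a toy sketch. It is my own illustration, not the actual standard: elliptic-curve scalar multiplication is replaced by modular exponentiation, the truncation of outputs in the real Dual_EC_DRBG is ignored, and every numerical value is made up. What it shows is the known mechanism: if whoever chose the parameters knows the discrete log d relating P and Q, the generator’s internal state can be recovered from its public output.

```python
# Toy analogue of the alleged Dual_EC_DRBG trapdoor. NOT the real standard:
# elliptic-curve points are replaced by powers of a generator mod a prime,
# output truncation is ignored, and all values below are made up.

MOD = 2**127 - 1   # a Mersenne prime, standing in for the curve group

# Public "standard" parameters Q and P. The trapdoor: whoever generated
# them knows the secret d with P = Q^d mod MOD (analogue of P = d*Q).
Q = 3
d = 0xDEADBEEF          # known only to the parameter designer
P = pow(Q, d, MOD)      # published alongside Q; d itself is never revealed

def drbg_step(state):
    """One generator step: emit the analogue of x(s*Q), advance to x(s*P)."""
    output = pow(Q, state, MOD)       # visible to the world
    next_state = pow(P, state, MOD)   # supposed to remain secret
    return output, next_state

# A user seeds the generator and produces one block of "random" output.
state = 123456789
output, state = drbg_step(state)

# An attacker who knows d recovers the new internal state from the public
# output alone, since output^d = (Q^s)^d = (Q^d)^s = P^s = next state.
recovered_state = pow(output, d, MOD)
assert recovered_state == state
```

With the internal state in hand, the attacker can run the generator forward and predict all subsequent output. If P and Q had instead been generated verifiably at random, no such d would be known to anyone, which is exactly why the question of how they were chosen matters.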

Remarkably, the latest AMS Notices has a new article with an extensive discussion of the DUAL_EC_DRBG issue, written by mathematician Michael Wertheimer, the NSA Director of Research. At first glance, Wertheimer appears to claim that the NSA was unaware of the possibility of a backdoor:

With hindsight, NSA should have ceased supporting the dual EC_DRBG algorithm immediately after security researchers discovered the potential for a trapdoor. In truth, I can think of no better way to describe our failure to drop support for the Dual_EC_DRBG algorithm as anything other than regrettable.

On close reading though, one realizes that Wertheimer does not address at all the basic question: how were P and Q chosen? His language does not contain any actual denial that P and Q have a backdoor.

For a careful examination of the Wertheimer piece by an expert, see this from Matthew Green. Green concludes that

… it troubles me to see such confusing statements in a publication of the AMS. As a record of history, Dr. Wertheimer’s letter leaves much to be desired, and could easily lead people to the wrong understanding.

In a recent podcast on the subject Green states

I think it’s still going on… I think that the NSA has really adopted a policy of tampering with cryptographic products and they’re not going to give that up. I don’t think that this is a time that they want to go out admitting what they did in this particular case as a result of that.

Given that this is now the only official NSA statement about the DUAL_EC_DRBG issue, the Notices article has drawn a lot of attention, see for instance here. The Register summarizes the story with the headline NSA: So sorry we backed that borked crypto even after you spotted the backdoor.

The publication of the George and Wertheimer pieces by the AMS has created a situation where there are just two possibilities:

  • Despite what experts believe and Snowden documents indicate, the NSA chose P and Q by a method that did not introduce a backdoor. For some reason though they are unwilling to state publicly that this is the case.
  • P and Q were chosen with a backdoor, and the AMS has now repeatedly been used to try and mislead the mathematics community about this issue.

I’ve contacted someone at the AMS to try and find out whether the question of a backdoor in P and Q was addressed in the refereeing process of the article, but have been told that they won’t discuss this. I think this is an issue that now needs to be addressed by the AMS leadership, specifically by demanding assurances from Wertheimer that the NSA did not choose a backdoored P and Q. If there is no backdoor, I can see no reason why such assurances cannot be provided. If the NSA and Wertheimer won’t provide this, I think the AMS needs to immediately cut off its cooperative programs with the agency. There may be different opinions about the advisability of such programs, but I don’t think there can be any argument about the significance of the AMS being used by the NSA to mislead the mathematics community.

Update: There’s an Ars Technica story here, with a peculiar update of its own:

An NSA spokesperson emailed Ars on Friday to say Wertheimer retired in the fall of 2014 and submitted the article after he left his position. The Notices article made no mention of his retirement.

Another odd thing about the Wertheimer piece is that in a different part of it he seems to reveal what I would have thought the NSA considered a closely held piece of information about Taliban communication methods (see here). If he can discuss that publicly, why can’t he say whether P and Q were backdoored?

Update: This is getting international attention, with Le Monde reporting the AMS Notices piece as an admission by the NSA that they backdoored DUAL_EC_DRBG.

Update: The NIST has put out a revised draft on its cryptographic standards process and asked for comments. On the NSA problem, it says that no changes have been made to the NSA-NIST Memorandum of Understanding, and that

cooperation with NIST is governed by an MOU between the two agencies and technical staff meet monthly to discuss ongoing collaborative work and future priorities.

It seems (see the NIST VCAT report) that, despite its obligations under the MOU, the NSA has refused to explain what it did with regard to compromising the DUAL_EC_DRBG standard, and experts believe (see above) that the NSA is committed to continuing to tamper with cryptographic products. Under these circumstances I don’t see how the NIST can expect anyone to not be suspicious of their standards.

A promise is made to identify NSA contributions to standards, but a footnote says that names of some NSA staff cannot be revealed and that documents involving NIST-NSA collaboration provided in response to FOIA requests may be redacted. I don’t see anything here that would keep the NSA from misleading or corrupting NIST staff to produce a backdoored standard, while keeping their input out of any record available to the public.

Posted in Uncategorized | 23 Comments

Short Items

  • The latest issue of the New York Review of Books has an article about the new Turing film, explaining in detail how it gets pretty much everything completely wrong about Turing and his story (see my review here). In related news, this week it was announced that the film is one of the final Oscar nominees for Best Adapted Screenplay.
  • The DESY research magazine femto has a sequence of articles about the LHC, SUSY and BSM physics.
  • The Swedish Research Council has just announced a ten-year grant of 60 million SEK (about $7 million) to bring Frank Wilczek to Stockholm University.
  • Mike Duff has some complaints about the Dean Rickles “A Brief History of String Theory” (for mine, see here.)
  • Jim Stewart, a mathematician who became wealthy based on his popular Calculus book (which we use here at Columbia) passed away last month at the age of 73. For more about him, see here and here. I had the pleasure of meeting him a couple times, with one occasion including a tour of his remarkable home in Toronto, Integral House.
  • For a new book about a certain mathematical point of view on QFT, see Factorization algebras in quantum field theory, by Kevin Costello and Owen Gwilliam.
  • Quanta magazine has a nice article by Kevin Hartnett on Ciprian Manolescu’s work on the triangulation conjecture.

One more. The Yale Art Gallery now has an exhibition of prints based on equations chosen and drawn by well-known mathematicians and physicists. It’s called The Art of the Equation, and impresario of the project Dan Rockmore will be discussing it there at 5:30 on Thursday January 22.

Posted in Uncategorized | 23 Comments


Now back from vacation, and as far as I can tell, not much happened while I was away. Here are a few things I’ve seen that may be of interest:

  • Mochizuki has posted a long progress report on “activities devoted to the verification of IUTeich.” New Scientist has an article about this here, which quotes Minhyong Kim making comments I think most experts would agree with:

    Some mathematicians say Mochizuki must do more to explain his work, like simplifying his notes or lecturing abroad. “I sympathise with his sense of frustration but I also sympathise with other people who don’t understand why he’s not doing things in a more standard way,” says Kim. It isn’t really sustainable for Mochizuki to teach people one-on-one, he adds, and any journal would probably require independent reviewers who have not studied under Mochizuki to verify the proof.

    Lieven Le Bruyn has a less charitable take (see here and here):

    If you are a professional mathematician, you know all too well that the verification of a proof is a shared responsibility of the author and the mathematical community. We all received a referee report once complaining that a certain proof was ‘unclear’ or even ‘opaque’?

    The usual response to this is to rewrite the proof, make it crystal-clear, and resubmit it.

    Few people would suggest the referee to spend a couple of years reading up on all their previous papers, and at the same time, complain to the editor that the referee is unqualified to deliver a verdict before (s)he has done so.

    Mochizuki is one of these people.

    His latest Progress Report reads more like a sectarian newsletter.

    There’s no shortage of extremely clever people working in arithmetic geometry. Mochizuki should reach out to them and provide explanations in a language they are used to.

    Mochizuki’s progress report strikes me as quite an odd document, especially in its insistence that experts need:

    to deactivate the thought patterns that they have installed in their brains and taken for granted for so many years and then to start afresh, that is to say, to revert to a mindset that relies only on primitive logical reasoning, in the style of a student or a novice to a subject.

    He at times seems to be arguing that his ideas are nearly disconnected from the rest of known mathematics, and that they are the only way to understand why the abc conjecture is true. This is highly implausible, since the great beauty and strength of mathematics is the way in which deep ideas are interconnected, with many paths from one place to another. If he wants to convince people that he really has what he claims, the best way to do it would be to follow the conventional route: write a document giving an exposition of a proof of abc, in as clear and simple terms as possible.

    Unfortunately, that doesn’t seem to be what he has planned, with his efforts devoted to getting others to start from the beginning and master his long series of papers. If this works, at some point there will be others able to write up a proof of abc using his ideas, and when that happens, experts may have something they can work with. This looks now like a story that is going to go on for a long time…

  • The last couple weeks in Jerusalem there was a Winter School on General Relativity. It included a final session (video here) largely devoted to defending string theory as the one true path to quantum gravity. This included a panel discussion where Carlo Rovelli held his own in a battle of the LQG/string wars, while being ganged up on by Gross and Arkani-Hamed. Mostly I don’t think there were any new arguments, just a rehash of the tediously familiar. Gross did give an enthusiastic call for all students to read the Dawid book discussed here.
    For yet another promotional effort about strings, one that seems like it could have been written exactly the same way twenty years ago, see here.
  • One new argument from the Rovelli side was to point out that “Nature talks”, and what it has said at the LHC so far is that SUSY is not there, blowing a big hole in the expectations of the superstring theory community. The Economist has a piece about how the upcoming LHC run at 13 TeV will be:

    the last throw of the dice for the theory, at least in its conventional form.

    As is often the case though, the article misrepresents the strength of arguments for SUSY:

    But, though the Standard Model works, it depends on many arbitrary mathematical assumptions. The conundrum is why these assumptions have the values they do. But the need for a lot of those assumptions would disappear if the known particles had heavier partner particles: their supersymmetric twins.

    This is pretty much complete nonsense, since the problem with SUSY has always been that it doesn’t actually explain why the SM parameters take the values that they do, and this has always been the best reason to be skeptical about it.

    On the other hand, the Economist and Rovelli do get the basic story right: Nature talks, and if what it says in LHC Run 2 is that the theoretical physics community has been barking up the wrong tree for the last forty years, it will be interesting to see if theorists are still willing to listen.

Posted in Uncategorized | 32 Comments

Winter Break

Blogging here should be light to non-existent for a while, with family holiday celebrations tomorrow and departure for a trip to Europe the day after. Travel plans still in flux, but the general idea is to head south after arriving in Paris, spend a couple weeks on the road and mostly in Italy, end up back in Paris January 6, back to New York on the 11th.

In the last posting (see the comment section there) I somehow seem to have caused an eruption of an even odder version of the kinds of attacks from string theorists that were common in 2006, a period known to aficionados as the “String Wars”. The new version is more like the “Multiverse Wars”. From past experience I know that involvement in such things is not a good way to spend a vacation, so when I head to the airport I’ll likely shut off comments. In the meantime, I hope the holiday spirit will reign…

Update: Off on vacation, comments will be off. Some last-minute links that may be of interest:

Posted in Uncategorized | 6 Comments


There’s a very interesting new paper on the arXiv by Joe Polchinski, a survey article for Studies in History and Philosophy of Modern Physics, entitled just Dualities. It’s an unusually lucid summary of the story of dualities in quantum field theory and string theory. This is a very complex subject which has been a central one in theoretical physics for the last few decades, but most expository writing on the subject has tended to be either superficial promotional material or mired in technical detail obscuring fundamental issues.

One reason for this is that, as Polchinski does an admirable job of making clear, in a very real sense we still do not understand at all the fundamental issues raised by these dualities. He notes that “we are still missing some big idea”, and points to the same comments from Nati Seiberg last month that I blogged about here. For most of the dualities at issue, our current standard technology for dealing with QFTs (the Lagrangian and the path integral over classical fields) is capable of capturing the two QFTs that are in some sense “dual”, but we lack a viable larger framework that would give the two QFTs in two different limits and explain the duality relationship.

For an example of the problem, probably the oldest and most well-studied case where we are missing something is Montonen-Olive duality, a non-abelian duality between electric and magnetic charges and fields. A currently popular idea is to find the explanation of this in “Theory X”, a 6d superconformal QFT, with duality coming from compactifying the theory on a torus (for more about this, see talks last week in Berkeley). The problem with this is that we don’t have a definition of the “Theory X”.

Polchinski places this problem in the context of a conjectural “M-theory” with various string theory limits. This has been the dominant idea in the subject for nearly 20 years now, but we seem no closer now to finding an actual realization of this conjectural picture than we were back in the mid-90s. Twenty years and thousands of papers have just given better understanding that various possible ideas about this don’t work.

One place where I think Polchinski’s survey is weak is in the treatment of this conjecture, where at times he takes as solid result something highly conjectural. For instance he starts off at one point with:

String-string dualities imply that there is a unique string/M-theory.

and moves on to the conjecture that

In this sense it may be that every QFT can be understood as a vacuum state of string/M-theory.

The problem here is that he’s built a speculative view of the unification of physics, constructed on an assumption about a “unique” theory, when we don’t know at all that such a thing exists. One basic lesson of mathematical research is that you need to keep very clear the distinction between what you really understand and what is speculation, because your speculation is often wrong and if so will lead you in the wrong direction. I think particle theory of recent decades likely suffers from people forgetting that some ideas are speculative, not firmly grounded, and may be pointing in the wrong direction.

One wrong direction this takes Polchinski is to the non-predictive, pseudo-scientific landscape of supposed string theory solutions and the multiverse, which he blithely invokes as our best fundamental explanation of physics. Tellingly, unlike the clear explanations of other topics, here he makes no attempt to describe these ideas other than to note that

they rest on multiple approximations and no exact theory.

In a final section, Polchinski addresses the question of what all this tells us about what is “fundamental” and what is the role of symmetries. This is the crucial question, and I’d argue that our lack of understanding of where these dualities come from likely is due to our missing some understanding of how symmetries are realized in QFT or string theory. This has been the lesson of history, with the Standard Model only coming into being when people better understood how symmetries, especially gauge symmetries, could act in QFT. Polchinski largely takes the opposite point of view, arguing that the fundamental theory maybe has no symmetries, local or global. He quotes Susskind as suggesting that symmetries have nothing to do with fundamental equations, are just calculational tools for finding solutions. I think this is completely misguided, that a strong case can be made (and I do it here) that “symmetry” (in the sense of the mathematics of groups and their representations) lies at the very foundation of quantum mechanics, and thus any quantum mechanical theory, even string/M-theory, whatever it might be.

Wondering whether there will be an arXiv trackback to this, and whether Polchinski has something to say about it…

Update: The arXiv Monday evening has a large collection of excellent review articles entitled “Exact results on N=2 supersymmetric gauge theories”, edited by J. Teschner (first is arXiv:1412.7118, last arXiv:1412.7145). Some of the results reviewed are based on deriving implications of the existence of the 6d (2,0) models discussed here and in the comment section.

Update: I’ve put this blog posting in the Multiverse Mania category, not because of the posting content, but because of comments in the comment section from Polchinski and Bousso.

Posted in Multiverse Mania | 61 Comments

Quick Links

  • The Planck data release has been delayed yet again. December 22 is now off the table; the latest plan is “before the end of January 15”, see here. Some peeks at their results are in slides from the Ferrara conference, available here. The fact that the slides for the “Planck low-ell CMB power spectra” talk are unavailable correlates with the rumor I’ve heard that they have recently found serious problems with that part of their data analysis, which would explain why the data release keeps getting pushed back.

    This week there’s a conference in Paris, with no slides available yet. Streaming video has been available, which I watched for a while. I just managed to catch the tail end of questions about the state of their analysis of the crucial B-mode business, not enough to get the bottom line. Perhaps someone who was there or who watched the whole thing can report. The best source of information on cosmology these days seems to be Twitter, hashtag #planck2014. Something else of interest at the Paris conference was a debate about inflation featuring Steinhardt, Mukhanov, Linde and Brandenberger. Maybe video will be available someday, along with the slides.

  • Scott Aaronson has more here about the problems with the recent movie about Turing that I mentioned here. Despite (or maybe because of…) having little relation to reality, the screenplay of the film has been nominated for a Golden Globe award.
  • David Mumford and John Tate wrote a biographical sketch of Grothendieck for Nature. Unfortunately it seems that it won’t be published there because of being too technical. It is however available at Mumford’s blog.
  • There’s an interesting interview with Nikita Nekrasov at the artist Marina Abramovic’s MAI site.

Update: Shantanu points out that the Paris talk videos are available here. Looking a bit, I didn’t see anything from the Planck people about when they will release direct B-mode polarization results (next month? later?). Steinhardt gave a powerful talk arguing in detail that inflation does not predict anything, and that the usual claims for it are untenable. For the Steinhardt, Mukhanov, Linde, Brandenberger debate, see here.

Update: For yet another explanation of the problems with the Turing movie, see this at the New York Review of Books.

Posted in Uncategorized | 5 Comments

Defend the Integrity of Physics

This week’s Nature features a call to arms from George Ellis and Joe Silk, entitled Scientific method: Defend the integrity of physics. I’m very glad to see well-known physicists highlighting the serious problem for the credibility of science raised by the string theory multiverse and the associated ongoing campaign to justify the failures of string theory by attacking the scientific method. Acknowledging evidence that an idea you cherished doesn’t work is at the core of what science is, and physics now has a major problem with prominent theorists refusing to abide by this principle. Ellis and Silk do a great job of identifying and characterizing an important challenge the scientific community is facing.

The issue is however complicated, and while the Nature piece carefully and clearly addresses some of the complexities, there are places where things get over-simplified. In particular, the introduction frames the issue as whether a theory being “sufficiently elegant and explanatory” allows it to not need experimental testing. The problem with the string theory multiverse though is not this, since such a theory is the antithesis of “elegant and explanatory”. There’s just about nothing in science as inelegant as the various attempts (e.g. the KKLT mechanism) to make string theory fit with known physics, and “the multiverse did it” is no more an actual explanation of anything than “a big omnipotent turtle did it”.

Trying to cut through the complexities, Ellis and Silk write:

In our view, the issue boils down to clarifying one question: what potential observational or experimental evidence is there that would persuade you that the theory is wrong and lead you to abandoning it? If there is none, it is not a scientific theory.

This is at the heart of the matter, but there are subtleties. A common recent move among some prominent string theorists has been to argue that string theory is falsifiable: it is based on quantum mechanics, so if experiments falsify quantum mechanics, they falsify string theory. This just makes clear that the question of falsifiability can be slippery. Philosophers of science are experts at the intricacies of such questions and Ellis and Silk are right to call for help from them.

They also make the interesting call for the convening of a conference to address these issues. How such a thing would work and how it might be helpful seem well worth thinking about. As for one of their other recommendations though:

In the meantime, journal editors and publishers could assign speculative work to other research categories — such as mathematical rather than physical cosmology — according to its potential testability.

I’m leery of the impulse among physicists to solve their problem of how to deal with bad physics by calling it mathematics. Yes, there is good mathematics that has come out of untestable ideas about string theory, but no, this doesn’t include the string landscape/multiverse cop-out, which physicists need to face up to themselves.

For the specific arguments from Sean Carroll and Richard Dawid that Ellis and Silk address, I’ve written about them elsewhere; see for instance here, where I discussed Dawid’s arguments in some detail.

Update: Sabine Hossenfelder has commentary on this here.

Update: Taking the opposite side of the argument in January’s Smithsonian magazine is my colleague Brian Greene, with an article entitled Is String Theory About to Unravel?. As you might expect, Brian’s answer is “No”, and he gives a good account of the point of view Ellis and Silk are warning against. He mentions the possibility of encouraging news for string theory from the next LHC run, but says that “I now hold only modest hope that the theory will confront data during my lifetime.”

Update: Sean Carroll responds to the criticism from Ellis and Silk with a tweet characterizing them as belonging to the “falsifiability police”:

My real problem with the falsifiability police is: we don’t get to demand ahead of time what kind of theory correctly describes the world.

Update: Gordon Kane joins the fight in a comment at Nature, claiming that, before the LHC, string theory predicted a gluino mass of 1.5 TeV.

The literature contains clear and easily understood predictions published before LHC from compactified string theories that gluinos, for example, should have been too heavy to find in Run 1 but will be found in Run 2 (gluino mass of about 1.5 TeV).

As far as I can tell, this is utter nonsense, with Kane publicly claiming string theory predictions of a gluino mass of around 600 GeV (see page 22 of this) back in 2011, then moving the “prediction” up as Run 1 data falsified his earlier predictions. Kane at least makes falsifiable predictions, the problem with him only comes when they get falsified…

Update: Chad Orzel has his take here.

Update: Adam Frank has an essay on this here.

Posted in Multiverse Mania | 34 Comments

Weinberg on the Desert, Seiberg on QFT

Last week Steven Weinberg gave a Lee Historical Lecture at Harvard, entitled Glimpses of a World Within. There’s a report on the talk at the Harvard Gazette.

In essence, Weinberg argues in the talk for an idea that first started to dominate thinking among HEP theorists nearly forty years ago, one that is sometimes called the “Desert Hypothesis”. The idea is that by looking at what we know of the SM and gravity, you can find indications that the next level of unification takes place around the Planck scale, with no new physics over the many orders of magnitude between the scales we can observe and that scale, at least none that would affect, for instance, the running of coupling constants. The evidence Weinberg gives for this is three-fold (and very old by now):

  • He describes listening to Politzer’s first talk on asymptotic freedom in 1973, and quickly realizing that if the strong coupling decreases at short distances, at some scale it would become similar to the coupling for the other fundamental forces. In a 1974 paper with Georgi and Quinn this was made explicit, and he argues this is evidence for a GUT scale a bit below or around the Planck scale.
  • He explains the Planck scale, where gravity should be of similar strength to the other interactions. This idea is even older, well-known in the fifties, I would guess.
  • He refers to arguments (which he attributes to himself, Wilczek and Zee in 1977) for a Majorana neutrino mass that invoke a non-renormalizable term in the Lagrangian that would come from the GUT scale.
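The first and third hints above rest on simple arithmetic that is easy to reproduce. Below is a rough sketch, using the standard one-loop running of the gauge couplings (with illustrative input values at the Z mass, not precision fits) and the dimension-5 seesaw-style estimate for a neutrino mass suppressed by a GUT-scale mass M. The specific numbers are assumptions chosen for illustration only:

```python
import math

# One-loop SM beta coefficients, GUT normalization for U(1)_Y:
# b1 = 41/10, b2 = -19/6, b3 = -7
b = {"U(1)": 41 / 10, "SU(2)": -19 / 6, "SU(3)": -7.0}

MZ = 91.19  # Z boson mass in GeV
# Approximate inverse couplings alpha_i^{-1} at MZ (illustrative values)
alpha_inv_MZ = {"U(1)": 59.0, "SU(2)": 29.6, "SU(3)": 8.5}

def alpha_inv(name, mu):
    """One-loop running: alpha^{-1}(mu) = alpha^{-1}(MZ) - b/(2 pi) ln(mu/MZ)."""
    return alpha_inv_MZ[name] - b[name] / (2 * math.pi) * math.log(mu / MZ)

# Scan upward in energy until the U(1) and SU(2) inverse couplings cross;
# in the pure SM this happens around 10^13 GeV, within a few orders of
# magnitude of the Planck scale (~10^19 GeV).
mu = MZ
while alpha_inv("U(1)", mu) > alpha_inv("SU(2)", mu):
    mu *= 1.1
print(f"U(1)/SU(2) couplings meet near mu ~ {mu:.3g} GeV")

# Seesaw-style estimate: a dimension-5 operator suppressed by a heavy mass M
# gives m_nu ~ v^2 / M, with v = 246 GeV the electroweak vev.
v, M = 246.0, 1e15  # GeV
m_nu_eV = v**2 / M * 1e9  # convert GeV to eV
print(f"m_nu ~ {m_nu_eV:.2g} eV for M = {M:.0e} GeV")
```

With these inputs the crossing lands around 10^13 GeV and the neutrino mass estimate comes out near 0.06 eV, which is the kind of back-of-the-envelope agreement the talk points to; it is suggestive, not a precision test.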

Weinberg sees these three hints as “strongly suggesting” that there is a fundamental GUT/Planck scale, and that’s what will explain unification. Personally, though, I don’t see how three weak arguments add up to anything other than a weak argument. GUTs are now a forty-year-old idea that never explained very much to start with, their best feature being that they were testable, since they generally predicted observable proton decay (which we haven’t seen). We know nothing at all about the source of particle masses and mixing angles, or the reason for their very different scales, and there seems to be zero evidence for the mechanism Weinberg likes for getting small neutrino masses (including zero evidence that the masses are even Majorana). As for quantum gravity and the Planck scale, again, we really have no evidence at all. I just don’t think he has any significant evidence for a desert up to a Planck unification scale, and this is now a very old idea, one that has been unfruitful in the extreme.

Weinberg ended his talk with another very old idea, that cosmology will somehow give us evidence about unification and GUT-scale physics. That also hasn’t worked out, but Weinberg quotes the BICEP2 value of r as providing yet more evidence for the GUT scale (he gives it a 50/50 chance of being correct). Again though, one more weak piece of evidence, even if it holds up (which I’d give less than 50/50 odds for at this point…), is still weak evidence.

For a much more encouraging vision talk, I recommend listening to Nati Seiberg at the recent Breakthrough Prize symposium. Seiberg’s talk was entitled What is QFT?, and to the claim that QFT is something understood, he responds “I really, really disagree”. His point of view is that we are missing some fundamental insights into the subject, that QFT likely needs to be reformulated, that there exists some better and more insightful way of thinking about it than our current conventional wisdom. In particular, there seems to be more to QFT than just picking a Lagrangian and applying standard techniques (for one thing, there are QFTs with no known Lagrangian). Seiberg takes the fact that mathematicians (whom he describes as “much smarter than most quantum field theorists”…) have not been able to come up with a satisfactory rigorous version of QFT to indicate not that this is a boring technical problem, but that we don’t have the right definition to work with.

To make things more specific, he describes recent joint work (for another version of this see here) on “Generalized Global Symmetries” that works with global symmetries associated to higher-codimension spaces than the usual codimension-one case of Noether symmetries and Lagrangian field theory. Evidently there’s a forthcoming paper with more details. I’m in complete agreement with him that there must be better ways of thinking about QFT, and I think these will involve some deeper insights into the role of symmetries in the subject.

Update: The paper Seiberg mentions is now available here.

Posted in Uncategorized | 68 Comments