Robert Hermann 1931-2020

I was sorry to hear today of the recent death of Robert Hermann, at the age of 88. While I unfortunately never got to meet him, his writing had a lot of influence on me, as it likely did for many others with an overlapping interest in mathematics and fundamental physics. Early in my undergraduate years during the mid-1970s I first ran across some of Hermann’s books in the library, and found them full of fascinating and deep insights into the relations between geometry and physics. Over the years I’ve often come back to them and learned something new about one or another topic. The main problem with his writings is just that there is so much there that it is hard to know where to start.

While the relations between Riemannian geometry and general relativity were well understood from Einstein’s work at the beginning of the subject, the relations between geometry and Yang-Mills theory were not known to Yang, Mills or other physicists working on the subject during the 1950s and 1960s. The understanding of these relations is conventionally described as starting in 1975, with the BPST instanton solutions and Simons explaining to Yang at Stony Brook about fiber bundles (leading to the “Wu-Yang dictionary” paper). But if you look at Hermann’s 1970 volume Vector Bundles in Mathematical Physics, you’ll find that it contains an extensive treatment of Yang-Mills theory in terms of connections and curvature in a vector bundle. While I don’t know whether Hermann had written about the sort of topologically non-trivial gauge field configurations that got attention starting in 1975, he had by that point been writing in depth for a decade about the details of the relations between geometry and physics that were news to physicists in 1975.

Being ahead of your time and mainly writing expository books is unfortunately not necessarily good for a successful academic career. Looking through his writings this afternoon, I ran across a long section of this book from 1980, entitled “Reflections” (pages 1-82). I strongly recommend reading this for Hermann’s own take on his career and the problems faced by anyone trying to do what he was doing (the situation has not improved since then).

A general outline of his early career, drawn from that source, is:

1948-50: undergraduate in physics, University of Wisconsin.
1950-52: undergraduate in math, Brown University.
1952-53: Fulbright scholar in Amsterdam.
1953-56: graduate student in math, Princeton. Thesis advisor Don Spencer.
1956-59: instructor at Harvard (“Harvard hired me as an instructor in the mistaken belief that I must be a topologist since I came from Princeton”).
1953-59: “My real work from 1953-59 was studying Elie Cartan!”
1959-61: position at MIT Lincoln Lab, taught course at Berkeley in 1961.

Hermann ultimately ended up at Rutgers, which he left in 1973, because he was not able to teach courses there in his specialty, and felt he had too little time to conduct the research he wanted to work on. It appears he expected to get by with some mix of grant money and profits from running a small publishing operation (Math Sci Press, which mainly published his own books). The “Reflections” section of the book mentioned above also contains some of his correspondence with the NSF, expressing his frustration at his grant proposals being turned down. At the end of a letter from late 1977 (which was at the height of excitement in the physics community over applying ideas from geometry and topology to high energy physics) he writes in frustration:

However, when I look in the Physical Review today, all the subjects which people in your position so enthusiastically supported ten years ago are now dead as the Phlogiston theory – and good riddance – while the topics I was working on then are now everywhere dense. Does one get support from the NSF by being right or by being popular?

John Baez has written something here, and there’s an obituary notice here.

Update: I’ve been reading some more of the essays Hermann published in the “Reflections” section of this book. Especially recommended is the section on Mathematical Physics of this 1979 essay (pages 30-38). His evaluation of the situation of the time I think was extremely perceptive.

Update: For more about Hermann, see some of the comments at this old blog posting. Also, on the topic of his book reviews, see this enthusiastic review of the Flanders book Differential forms with applications to the physical sciences.

Posted in Obituaries | 8 Comments

Various

  • A few months ago I ended up doing a little history of science research, trying to track down the details of the story of the Physical Review’s 1973 policy discouraging articles on “Foundations”. The results of that research are in this posting, where I found this explanation from the Physical Review editor (Goudsmit) of the problem they were trying to deal with:

    The event [referring to a difficult refereeing problem] shows again clearly the necessity of rapid rejections of questionable papers in vague borderline areas. There is a class of long theoretical papers which deal with problems of interpretation of quantum and relativistic phenomena. Most of them are terribly boring and belong to the category of which Pauli said, “It is not even wrong”. Many of them are wrong. A few of the wrong ones turn out to be valuable and interesting because they throw a brighter light on the correct understanding of the problem. I have earlier expressed my strong opinion that most of these papers don’t belong in the Physical Review but in journals specializing in the philosophy and fundamental concepts of physics.

    I had heard that people studying foundations of quantum mechanics, frustrated by this policy, had started up their own samizdat publication during the 1970s, called “Epistemological Letters”. I tried to see if there was any way to read the articles that appeared in that form, but it looked like the only way to do this would be to go visit one or two archives that might have some copies. Unbeknownst to me, around the same time the University of Notre Dame had just finished a project of scanning all issues of Epistemological Letters and putting them online. They are now available here, with an article about them here and an introductory essay here.

  • There’s an interesting essay on the arXiv about the current state of BSM physics, by HEP theorist Goran Senjanović, entitled Natural Philosophy versus Philosophy of Naturalness.
  • Here’s an article about problems string theorist Amer Iqbal has been having in Pakistan.
  • The New York Times has an article about Cedric Villani and his campaign for mayor of Paris. The election is next month, and I’m having a hard time figuring out why Villani is running. There doesn’t seem to be a lot of difference in policy views between the current mayor (Hidalgo) and the Macronistas (Griveaux and Villani), with the main effect of Villani entering the race being a split of the Macron party vote.
  • I was sorry to hear recently about the death of mathematician Louis Nirenberg. Kenneth Chang at the New York Times has written an excellent obituary. Terry Tao has some comments here.

Update: Excellent rant on Twitter from Philip Ball about misrepresentations of the Copenhagen interpretation. For your own rants, please engage in them on Twitter rather than here.

Posted in Uncategorized | 20 Comments

London Calling with Career Opportunities II

If you’re a mathematician, you don’t need to go work for Dominic Cummings in order to have dramatically improved career opportunities in the UK. The British government has just announced a huge increase in funding for mathematical research: 60 million pounds/year (about \$80 million) for the next five years (see here and here). To get some idea of the scale of this, note that the US GDP is about 8 times the UK’s and the NSF DMS budget is about \$240 million/year. So funding on a comparable scale in the US would be roughly two and a half times the NSF budget for mathematics.
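
For what it’s worth, here is the rough back-of-the-envelope scaling behind that comparison, using the approximate figures quoted above (the GDP ratio of 8 and the \$80 million/year conversion):

```latex
% Back-of-the-envelope GDP scaling of the UK announcement to a US-sized economy
\$80\ \text{million/yr} \times 8 \;\approx\; \$640\ \text{million/yr}
\;\approx\; 2.7 \times \$240\ \text{million/yr}\quad\text{(the NSF DMS budget)}
```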

Many of my mathematician colleagues have at times seemed to be of the opinion that a huge increase in funding for math research is the best way to improve a society. We’ll see if this works for Britain.

While the new UK government ran on a nativist platform of restricting immigration, with the goal of keeping outsiders from taking bread out of the mouths of UK citizens, this doesn’t apply to mathematicians: all limits are off and we’re encouraged to flood the country. The law will be changed on Friday, with the changes going into effect Feb. 20. This will include an “accelerated path to settlement”, no need to even have a job offer, and all your “dependents [will] have full access to the labour market”, with no problem there about the taking-bread-out-of-the-mouths-of-the-locals thing.

Update: More here (except it’s mostly behind a paywall, but evidently Ivan Fesenko is quoted).

Posted in Uncategorized | 25 Comments

This and That

  • I was sorry to hear of the death a few months ago of Tony Smith, who had been a frequent commenter on this blog and others. Unfortunately my interactions with him mainly involved trying to discourage him from hijacking the discussion in some other (often unusual) direction. Geoffrey Dixon did get to know him well, and has written a wonderful long blog entry about Tony, which I highly recommend (Dixon’s newish blog also has other things you might find interesting).
  • On the Jim Simons front, the Simons Foundation has put together something to celebrate their 25th anniversary. It explains a lot about their history and what they are doing now, as well as giving some indication of plans for the future. On these topics, read pieces written by Jim Simons and Marilyn Simons. The Foundation has been in a high growth mode, having an increasingly large impact on math and physics research. Their main statement about the future is that the plan is for this to go on for a very long time:

    According to its bylaws, the Simons Foundation is intended to focus almost entirely on research in mathematics and science and to exist in perpetuity. If future leadership abides by these guiding principles, Marilyn and I believe the foundation will forever be a force for good in our society.

    My impression is that the Simons children have their own interests, and foundations with other goals to run.

    News from the \$75 billion source of the money (RenTech) today is that Simons is increasingly turning over control of it to his son Nathaniel, who has been named co-chairman. Simons has also added new directors to the board: four senior Renaissance executives and his son-in-law Mark Heising.

  • There are various IAS-related videos you might want to take a look at:

    Pierre Deligne explaining motives last night.

    Michael Douglas on the use of computers in mathematics.

    A Dutch documentary (not all of it is in Dutch…).

  • If you aren’t regularly reading Scott Aaronson’s blog, you really should be. Latest entries are a detailed report from Davos and a guest post with a compelling argument about a major factor in why women leave STEM careers at higher rates than men.
  • For the latest on the “It from Qubit” business, see talks at a KITP conference. John Preskill notes “lingering confusion over what it all means”, which makes me glad to hear that I’m not the only one…

Posted in Uncategorized | 7 Comments

Why the foundations of physics have not progressed for 40 years

Sabine Hossenfelder has a new piece out, making many of the same arguments she has been making for a while about the state of fundamental theory in physics. These have a lot in common with arguments that Lee Smolin and I were making in our books published back in 2006. The underlying problem is that the way theorists successfully worked up until the seventies is no longer viable, with the Standard Model working too well, up to the highest energies probed:

The major cause of this stagnation is that physics has changed, but physicists have not changed their methods. As physics has progressed, the foundations have become increasingly harder to probe by experiment. Technological advances have not kept size and expenses manageable. This is why, in physics today, we have collaborations of thousands of people operating machines that cost billions of dollars.

With fewer experiments, serendipitous discoveries become increasingly unlikely. And lacking those discoveries, the technological progress that would be needed to keep experiments economically viable never materializes. It’s a vicious cycle: Costly experiments result in lack of progress. Lack of progress increases the costs of further experiment. This cycle must eventually lead into a dead end when experiments become simply too expensive to remain affordable. A $40 billion particle collider is such a dead end.

I have a somewhat different view about a potential next collider (see here), but agree that the basic question is whether it will be “too expensive to remain affordable.”

What has happened over the last forty years is that the way HEP theory is done has become dysfunctional, in a way that Hossenfelder characterizes as follows:

Instead of examining the way that they propose hypotheses and revising their methods, theoretical physicists have developed a habit of putting forward entirely baseless speculations. Over and over again I have heard them justifying their mindless production of mathematical fiction as “healthy speculation” – entirely ignoring that this type of speculation has demonstrably not worked for decades and continues to not work. There is nothing healthy about this. It’s sick science. And, embarrassingly enough, that’s plain to see for everyone who does not work in the field.

This behavior is based on the hopelessly naïve, not to mention ill-informed, belief that science always progresses somehow, and that sooner or later certainly someone will stumble over something interesting. But even if that happened – even if someone found a piece of the puzzle – at this point we wouldn’t notice, because today any drop of genuine theoretical progress would drown in an ocean of “healthy speculation”…

Why don’t physicists have a hard look at their history and learn from their failure? Because the existing scientific system does not encourage learning. Physicists today can happily make career by writing papers about things no one has ever observed, and never will observe. This continues to go on because there is nothing and no one that can stop it.

This story brings up a lot of complex issues in the philosophy and sociology of science, but to me there’s one aspect of the problem that is relatively simple and deserves a lot more attention than it gets: how do you get theorists to abandon failed ideas and move on to try something else?

The negative LHC results about SUSY have had some effect, but even in this case it’s remarkable how many theorists won’t abandon the failed idea of a SUSY extension of the Standard Model. This was always a highly dubious idea, explaining nothing about the Standard Model and adding a huge number of new degrees of freedom and more than a hundred new undetermined parameters. Not seeing anything at the LHC should have put the final nail in the coffin of that idea. Instead, I see that this past fall MIT was still training its graduate students with a course on Supersymmetric Quantum Field Theories. You can try and argue that SUSY and supergravity theories are worth studying even if they have nothing to do with physics at observable energies, but it is a fact that these are extremely complicated QFTs to work with and have explained nothing. Why encourage grad students to devote the many, many hours it takes to understand the details of this subject, instead of encouraging them to learn about something that hasn’t been a huge failure?

The techniques one gets trained in as a graduate student tend to form the basis of one’s understanding of a subject and have a huge influence on one’s future career and the questions one has the expertise needed to work on. Besides SUSY, string theory has been the other major course topic at many institutions, with the best US grad students often spending large amounts of time trying to absorb the material in Polchinski’s two-volume textbook, even though the motivations for this have turned out to also be a huge failure, arguably the largest one in the history of theoretical physics.

To get some idea of what is going on, I took a look at the current and recent course offerings (on BSM theory, not including cosmology) at the five leading (if you believe US News) US HEP theory departments. I may very well be missing some offered courses, but the following gives some insight into what leading US departments are teaching their theory students. Comparing to past years might be interesting; possibly there’s a trend towards abandoning the whole area in favor of other topics (e.g. cosmology, quantum information, condensed matter).

The places not offering string theory courses this year seem to have had them last year.

Update: Something relevant and worth reading that I think I missed when it came out: Jeremy Butterfield’s detailed review of Lost in Math, which has a lot about the question of why theorists are “stuck”.

Update: There’s some serious discussion of this on Twitter. For those who can stand that format, try looking here and here.

Update: Mark Goodsell has a blog posting about all this here, including a defense of teaching the usual SUSY story to graduate students.

Update: A correspondent pointed me to this recent CERN Courier interview with John Ellis. Ellis maintains his increasingly implausible defense of SUSY, but he’s well aware that times have now changed:

People are certainly exploring new theoretical avenues, which is very healthy and, in a way, there is much more freedom for young theorists today than there might have been in the past. Personally, I would be rather reluctant at this time to propose to a PhD student a thesis that was based solely on SUSY – the people who are hiring are quite likely to want them to be not just working on SUSY and maybe even not working on SUSY at all. I would regard that as a bit unfair, but there are always fashions in theoretical physics.

Posted in Uncategorized | 48 Comments

Musings on the Current Status of HEP

To start the new decade there’s an article very much worth reading by Misha Shifman, entitled Musings on the Current Status of HEP. It’s somewhat of an update of something he wrote back in 2012, which I wrote about here. He starts off with:

Now, seven years later, I will risk to offer my musings on the same subject. The seven years that have elapsed since [1] brought new perspectives: the tendencies which were rather foggy at that time became pronounced. My humble musings do not pretend to be more than they are: just a personal opinion of a theoretical physicist… For obvious reasons I will focus mostly on HEP, making a few marginal remarks on related areas. I would say that the most important message we have received is the absence of dramatic or surprising new results. In HEP no significant experimental findings were reported, old ideas concerning Beyond the Standard Model (BSM) physics hit dead-ends one after another and were not replaced by novel ideas. Hopes for key discoveries at the LHC (such as superpartners) which I mentioned in 2012 are fading away. Some may even say that these hopes are already dead. Low energy supersymmetry is ruled out, and gone with it is the concept of naturalness, a basic principle which theorists cherished and followed for decades. Nothing has replaced it so far…

HEP, “my” branch of theoretical physics since the beginning of my career, seems to be shrinking. A change of priorities in HEP in the near future is likely as business as usual is not sustainable. The current time is formative.

I encourage you to take a look at the rest; there’s a lot more detailed discussion of the state of HEP and allied fields, especially about the central role of quantum field theory.

Shifman also includes a section very critical of Richard Dawid, the “non-empirical confirmation” business and talks given at the “Why Trust a Theory?” conference (discussed here):

With all due respect I strongly disagree with Richard Dawid and all supporting speakers at the conference and beyond… I object against applying the term “non-empirically confirmed” to science (the more so, the term “postempiric science”). Of course, we live in liberal times and everybody is entitled to study and discuss whatever he or she wants. But the word science is already taken. Sorry, colleagues. For “postempiric science,” please, use another word, for instance, iScience, xScience, or something else.

As for David Gross’s attempt to claim that string theory is, like quantum mechanics and quantum field theory, not testable just because it is a framework, not a theory, Shifman is having none of it:

David Gross is a great theoretical physicist, whose discovery of asymptotic freedom made him immortal, but I respectfully disagree with him. Framework or not, both QM and QFT have absolutely solid confirmations in all their aspects in thousands of experiments.

As for the once popular idea that string theory could provide a “theory of everything”, he writes:

Well… it never happened and – I will risk to say – never will.

Posted in Uncategorized | 13 Comments

London Calling with Career Opportunities

At some point within the past couple years I noticed that one blog that had Not Even Wrong on its blogroll was the blog of Dominic Cummings, who was often getting credited with masterminding the political campaign that got the British to vote (narrowly) for Brexit in 2016. Cummings has had further success recently as Chief Special Adviser to British Prime Minister Boris Johnson, with a blow-out election victory three weeks ago putting him securely in control of the British state.

Today on his blog Cummings has, invoking Grothendieck, posted a job advertisement: ‘Two hands are a lot’ — we’re hiring data scientists, project managers, policy experts, assorted weirdos…. He’s looking for mathematicians, physicists and others to join him to change British society, working

in the intersection of:

  • the selection, education and training of people for high performance
  • the frontiers of the science of prediction
  • data science, AI and cognitive technologies (e.g Seeing Rooms, ‘authoring tools designed for arguing from evidence’, Tetlock/IARPA prediction tournaments that could easily be extended to consider ‘clusters’ of issues around themes like Brexit to improve policy and project management)
  • communication (e.g Cialdini)
  • decision-making institutions at the apex of government.

For some other descriptions of who Cummings would like to hire, on the economics side there’s:

The ideal candidate might, for example, have a degree in maths and economics, worked at the LHC in one summer, worked with a quant fund another summer, and written software for a YC startup in a third summer!

We’ve found one of these but want at least one more.

He also wants “Super-talented weirdos”, with examples given from William Gibson novels, such as “that Chinese-Cuban free runner from a crime family hired by the KGB.”

The remarkable thing to me about this long document is what it doesn’t contain. In particular, I see nothing at all about any specific policy goals. Usually a new government would recruit people by appealing to their desire to make the world a better place in some specific way, but there’s nothing about that here. The goal is to control the government and what the British population believes, but to what end?

In addition, a more conventional hiring process would be asking for candidates of high ethical values, with some devotion to telling the truth. Cummings seems to be asking for exactly the opposite: best if your background is “from a crime family hired by the KGB.”

Best wishes to my British readers, now joining the US and other nations in a new dystopic post-truth era. It’s massively depressing to me to see how this has worked out here; I hope you do better. Maybe you should be sending in your applications to Cummings and hoping to sign up for a role in the new power structure. If so, tell him “Not Even Wrong” sent you…

Update: For more on Cummings, there’s a good Financial Times article.

Posted in Uncategorized | 30 Comments

Are Physical Laws Inevitable?

The last couple days have seen various discussions online generated by a piece at Quanta Magazine with the dubious headline Why the Laws of Physics Are Inevitable and an even worse sub-headline claiming “physicists working on the ‘bootstrap’ have rederived the four known forces” (this is utter nonsense). For some of this discussion, see Sabine Hossenfelder, John Baez and Will Kinney.

One reason this is getting a lot of attention is that the overall quality of reporting on math and physics at the relatively new Quanta Magazine has been very high, a welcome relief from the often highly dubious reporting at many mainstream science media outlets. The lessons of what happens when the information sources society relies on are polluted with ideologically driven nonsense are all around us, so seeing this happen at a place like Quanta is disturbing. If you want to understand where this current piece of nonsense comes from, there is an ideology-driven source you need to be aware of.

A major line of defense of their subject by string theorists has essentially been the claim that, while it may lack any experimental support, string theory is “the only consistent way to combine quantum theory and general relativity”. I’ve often explained what the problem with this is, and won’t go on about it again here. Nima Arkani-Hamed is at this point likely the most influential theorist around, for some good reasons. The roots of the problem with the Quanta article lie in taking too seriously the kind of arguments he tends to make in the many talks he gives. He’s trying to make as strong a case as possible for the research program he is pursuing, so unfortunately gives, all too convincingly, a very tendentious take on the scientific issues involved. For more about this, see a posting here about the problems with the earlier Quanta article that motivated the latest one.

Debates over generalities about whether the “laws of physics are inevitable” are sterile and I don’t want to engage in them here, but I thought it would be a good idea to explain what the serious ideas are that Arkani-Hamed and others are trying to refer to when they make dubious statements like “there’s just no freedom in the laws of physics”. Here’s an attempt at outlining this story:

Quantum mechanics and special relativity

A mathematically precise implication of putting together fundamental ideas about quantum mechanics and special relativity is that the state space of the theory should carry a unitary linear representation (this is the QM part) of the Poincaré group (this is the special relativity part). You also generally assume that the time translation part of the Poincaré group action satisfies a “positive energy” condition. To the extent you can identify “elementary particles”, these should correspond to irreducible representations. The irreducible unitary representations of the Poincaré group were first understood and classified by Wigner in the late 1930s. My QM textbook has a discussion in chapter 42. If you impose the condition of positive energy and for simplicity consider the case of non-zero mass, you find that the irreducible representations are classified by the mass and spin (which is 0, 1/2, 1, 3/2, etc.). Non-interacting theories are completely determined by the representation theory and exist for all values of the mass and spin.
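
As a reminder of how that classification is usually stated (this is the standard textbook version, included here just for orientation): on an irreducible representation the two Casimir operators of the Poincaré group take fixed values, and those values are precisely the mass and the spin.

```latex
% Casimir operators of the Poincare group, constant on an irreducible representation
% (massive case, positive-energy condition P^0 > 0, units with c = hbar = 1):
P^\mu P_\mu = m^2, \qquad W^\mu W_\mu = -m^2\, s(s+1), \quad s = 0, \tfrac{1}{2}, 1, \tfrac{3}{2}, \dots
% where W^\mu = \tfrac{1}{2}\,\epsilon^{\mu\nu\rho\sigma} P_\nu M_{\rho\sigma} is the Pauli-Lubanski vector.
```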

Extensions of Poincaré and the no-go theorem of Coleman-Mandula

To get further constraints on a fundamental theory, one obvious idea is to extend the Poincaré group to something larger. States then should transform according to unitary representations of this larger group, carrying extra structure. Restricting to the Poincaré subgroup, one hopes to get additional constraints on which Poincaré representations can occur (they’ll be those that are restrictions of the representations of the larger group). The problem with this is the Coleman-Mandula theorem (1967), which implies that for interacting theories the larger group can only be a product of the Poincaré group and an internal symmetry group. Representations will just be products of Poincaré group representations and representations of the internal group, with space-time symmetries and internal symmetries having nothing to do with each other. This is why the Quanta headline about “rederiving the four known forces” is nonsense: the three non-gravitational forces are determined by internal symmetries, and have nothing to do with what the Quanta article is describing, which is work on space-time symmetries.
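
In symbols, the conclusion of the theorem (for an interacting relativistic theory, under its technical assumptions such as a mass gap) is that the full symmetry Lie algebra can only be a direct sum of the Poincaré algebra and an internal symmetry algebra:

```latex
% Coleman-Mandula (1967): space-time and internal symmetries can only combine trivially
\mathfrak{g}_{\mathrm{symmetry}} \;\cong\; \mathfrak{iso}(3,1) \,\oplus\, \mathfrak{g}_{\mathrm{internal}}
```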

One way to avoid the Coleman-Mandula theorem is to work not with Lie algebras but with Lie superalgebras. Here you do get a non-trivial extension of the Poincaré group and a prediction that Poincaré representations should occur in specific supermultiplets. The problem is that there is no evidence for such supermultiplets.
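
The characteristic new structure in the super-Poincaré case is that the odd (fermionic) generators anticommute into translations. Written in the standard two-component spinor notation for N=1 supersymmetry in four dimensions (quoted just as a reminder of the general shape, not as anything specific to the work under discussion):

```latex
% N=1 super-Poincare algebra in four dimensions, two-component spinor notation:
% the supercharges Q close on the translation generators P under anticommutation
\{ Q_\alpha, \bar{Q}_{\dot\beta} \} = 2\,(\sigma^\mu)_{\alpha\dot\beta}\, P_\mu, \qquad
\{ Q_\alpha, Q_\beta \} = \{ \bar{Q}_{\dot\alpha}, \bar{Q}_{\dot\beta} \} = 0
```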

Another possible extension of the Poincaré group is the conformal group. Here the problem is that the new symmetry implications are too strong: they rule out the massive Poincaré group representations that we know exist. One can work with the conformal group if one sticks to massless particles, and this is what the methods advertised in the Quanta article do.

The idea that our fundamental space-time symmetry group is the conformal group is mathematically an extremely attractive one, with the twistor picture of space-time playing a natural role in this context. I strongly suspect that any future truly unified theory will somehow exploit this. Unfortunately, as far as I know, no one has yet come up with a way of exploiting this symmetry consistent with what we know about elementary particles. Likely a really good new deep idea is missing.
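
For orientation, the relevant standard facts (nothing new here): the conformal group of four-dimensional Minkowski space is, up to coverings, SO(4,2), its spin cover is SU(2,2), and twistor space is the fundamental complex four-dimensional representation of SU(2,2).

```latex
% Conformal symmetry of 4d Minkowski space and its relation to twistor space (standard facts)
\mathrm{Conf}(3,1) \,\sim\, SO(4,2), \qquad \mathrm{Spin}(4,2) \,\cong\, SU(2,2), \qquad
\mathbb{T} \,\cong\, \mathbb{C}^4 \quad \text{(twistor space: the fundamental representation of } SU(2,2)\text{)}
```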

Quantum field theory

To get stronger constraints than the ones coming from Poincaré symmetry, one needs to decide how one is going to introduce interactions. One way to go is quantum field theory, with a principle of locality of interactions. This gets encoded in a condition of (anti)commutativity of the fields at space-like separations, which then implies various analyticity properties of correlation functions and scattering amplitudes. The analyticity properties can then be used to prove things like the CPT theorem and the spin-statistics theorem, which provide some new constraints.
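
The locality condition referred to here is the usual microcausality requirement: field operators at space-like separation commute (for bosonic fields) or anticommute (for fermionic fields). Schematically, with metric signature (+,-,-,-):

```latex
% Microcausality: fields (anti)commute at space-like separation
[\phi(x), \phi(y)]_{\mp} = 0 \quad \text{whenever } (x-y)^2 < 0
% minus sign (commutator) for bosonic fields, plus sign (anticommutator) for fermionic fields;
% with signature (+,-,-,-), the condition (x-y)^2 < 0 means x and y are space-like separated.
```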

Given a method of constructing a Poincaré invariant quantum field theory, typically done by choosing a set of classical fields and a Lagrangian, one can try to realize the various possible Poincaré group representations as interacting theories. What one finds is that, for spins greater than two, one runs into various seemingly intractable problems with the construction. One also finds exceptionally beautiful theories in the spin 1/2 and spin 1 cases that exhibit an infinite dimensional group of gauge symmetries. An example of these is the Standard Model. Unfortunately, we know of no principle or symmetry that would provide a constraint that picks out the Standard Model. If we did, we might be tempted to announce that the principle or symmetry is “inevitable” and thus the “laws of physics are inevitable”. We’re not there yet…
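
To spell out what the infinite-dimensional gauge symmetry is in the Standard Model case (standard facts, stated here for concreteness):

```latex
% The Standard Model's internal gauge group at each point of space-time:
G_{\mathrm{SM}} = SU(3) \times SU(2) \times U(1)
% The infinite-dimensional gauge symmetry group is the group of maps from space-time M into G_SM:
\mathcal{G} = \mathrm{Maps}(M, G_{\mathrm{SM}})
```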

Amplitudes and the S-matrix philosophy

In the S-matrix philosophy one takes the analyticity properties as fundamental, working with amplitudes, not local quantum fields. The 1960s version of this program (also often called the “bootstrap” program) was based on the hope that certain physically plausible analyticity assumptions would so tightly constrain the theory of strong interactions that it was essentially uniquely determined. This didn’t work out. In his recent introductory lecture for his course at Harvard, Arkani-Hamed explains why. The research program he and others are currently pursuing is in some sense a modernized version of the failed 60s program. The hope is that new structures in amplitudes can be found that will replace the structures one gets from local quantum fields.

Amplitudes-based arguments about, for instance, why you don’t see fundamental higher-spin states, and why spin 1/2 particles have forces of the kind given by gauge theory, have a long history; see for instance work on massless particles by Weinberg in the mid-sixties and Weinberg-Witten in 1980.

As far as I can tell, the work referred to in the Quanta article gives new amplitudes-based arguments of this kind for massless particles, exploiting conformal symmetry. It’s not clear to me exactly what’s new here as opposed to earlier such arguments, or how strong an argument about real world physics one can make using these new ideas. One thing that is clear though is that the Quanta quote that what has been discovered implies that “There’s just no freedom in the laws of physics” is as much nonsense as the “we rederived the four known forces” business.

Update: For some discussion with the author of the Quanta piece, Natalie Wolchover, see the comments starting here.

Update: The Quanta article has been revised, see comments in the comment section here. There Daniel Baumann provides a link to a popular summary of the facts about massless particle interactions that his quotes were about.

Posted in This Week's Hype | 47 Comments

This and That

First, a few physics items:

  • Mark Alpert has a new novel out, Saint Joan of New York, a thriller subtitled “A Novel About God and String Theory”, which is an accurate description. It’s published by Springer, so you may be able to get access to it like I did through an institutional license here.

    The plot revolves around Joan, a talented high school student here in New York, who has been learning more advanced material through a mentor at City College, and in particular has learned about string theory and Calabi-Yaus. This Joan plays the role of a modern-day analog of Joan of Arc, using divine help to do battle not with the English, but with more modern dark forces. This divine help includes a revelation about Calabi-Yaus and the theory of everything. It’s a thriller, so I’ll avoid telling more about the plot so as not to spoil it.

    I quite enjoyed reading the book even though I’m not much of a fan of thrillers, although a lot of enjoyment was due to the fact that much of the action takes place here in New York on the Upper West Side, and that the main plot revolves around the question of string theory and existence of a TOE. Edward Witten plays a role in the story.

    If you like this one, you might also want to read some of Alpert’s other novels, a couple of which also involve themes of a TOE.

  • Most theorists have abandoned the search for a TOE, or the idea of explaining anything about the Standard Model, in favor of concentrating on hopes to find some sort of emergent theory of quantum gravity. For the latest on this, talks from the recent misleadingly titled Quantum Gravity in the Lab conference at Google might at some point be available. John Preskill’s slides are here. He indicates that the general idea is that quantum gravity will emerge from “Massive Entanglement, Quantum Chaos and Complexity.” This week the IAS will host a similar event, a workshop on Qubits and Spacetime. Wednesday evening many of the participants will be put on a bus to Manhattan, where they’ll continue with the 2019 meeting of the Simons Foundation-funded “It From Qubit” collaboration.
  • Also here in New York this week, Roger Penrose will be at Pioneer Works Friday night for a public program involving a conversation with Janna Levin. I have no idea whether his presence in New York at the same time as “It From Qubit” is a coincidence or not. If not, maybe the “It from Qubit” people will get back on the bus and head out to Red Hook Friday night.
  • Instead of being at the IAS, Nima Arkani-Hamed has been spending the past semester at Harvard, with activities that include teaching a course, Physics 283B: Spacetime and Quantum Mechanics, Total Positivity and Motives. Videos of his lectures are online here (first one here). It would be great if someone could put together a written set of lecture notes from these videos.
  • Finally, for some multiverse-related book reviews that have the unusual feature of showing some skepticism, see John Horgan here, Matt Leifer here, also Chris Fuchs here. Fuchs explains the problem with multiple worlds as a solution to the measurement problem:

    Its main shortcoming is simply this: The interpretation is completely contentless. I am not exaggerating or trying to be rhetorical. It is not that the interpretation is too hard to believe or too nonintuitive or too outlandish for physicists to handle the truth (remember the movie A Few Good Men?). It is just that the interpretation actually does not say anything whatsoever about reality. I say this despite all the fluff of the science-writing press and a few otherwise reputable physicists, like Sean Carroll, who seem to believe this vision of the world religiously.

Some mathematics items:

Update: In case you haven’t been getting enough hype about the multiverse recently, Scientific American has Long Live the Multiverse! for you, from Tom Siegfried. Siegfried assures us that “multiverse advocates have been right historically”. He also assures SciAm readers that multiverse theories are testable, in a way similar to the way Einstein demonstrated the existence of atoms in 1905 using Brownian motion:

For that matter, it’s not necessarily true that other universes are in principle not observable. If another bubble collided with ours, telltale marks might appear in the cosmic background radiation left over from the big bang. Even without such direct evidence, their presence might be inferred by indirect means, just as Einstein demonstrated the existence of atoms in 1905 by analyzing the random motion of particles suspended in liquid.

He doesn’t mention that his analog of the Brownian motion experiment has been done: people have looked for the predicted indirect effects of other bubble universes on ours, and found nothing. To the extent that the multiverse is testable, it has been tested and found to not be there.

Posted in Book Reviews, Multiverse Mania, Uncategorized | 23 Comments

News From HEPAP

For the presentations from the latest HEPAP meeting, held a couple of days ago, see here. One piece of news, from this presentation, is that there likely will be a delay in the scheduled startup of the HL-LHC, with the next LHC run (Run 3) extended for an additional year (through 2024), and the next shutdown (LS3) extended by a half year. The HL-LHC would then start physics in 2028.

Most of the HEPAP discussions have to do with funding. The pattern of recent years has been one of huge decreases in funding proposed by the Trump administration. These are completely ignored by both the Democrats and the Republicans in Congress, which passes large increases in funding (then signed into law by Trump). For FY2020 this continues: at DOE the HEP budget for FY2019 was \$980 million, while for FY2020 the White House budget request was a massive cut, to \$768 million. This was taken no more seriously by anyone than the last few of these, with the FY2020 House Mark at \$1,045 million and the Senate Mark at \$1,065 million. The FY2020 budget remains to be finalized and passed; in the meantime the federal government has been operating under a sequence of continuing resolutions.

Specifically on theory funding, JoAnne Hewett has a presentation on The State of Theory. It has no numbers in it, but the DOE numbers given here show an increase from \$60 million in FY2017 to \$90 million in FY2019 for Theoretical, Computational and Interdisciplinary Physics. But within this category, pure theoretical HEP is pretty flat, with big increases for Computational HEP and a huge new investment in Quantum Information Science (\$27.5 million in FY2019). There does seem to have been some sort of decision to de-prioritize conventional theoretical HEP in favor of newer trendy areas.

Hewett describes the general consensus on current problems with theory funding as

  • Universal concern on ever decreasing levels of funding for university groups: concern that university programs are dying.

    -Private institutions attempt to offset cuts with non-federal funding sources.
    -Cuts to program further accumulated in 2019. Many postdocs learned in May 2019 that their contracts would not be renewed for the fall. It was then too late to apply for new positions.

  • Lab theory programs are also losing researchers.
  • Even distribution of cuts across U.S. theory program has indirect proportional effect to small programs.
  • Large fluctuations cycle-to-cycle is making groups less cohesive and more inclined to opt for “safer” research projects.
  • There is the perception that the recent emphasis on QIS comes at a cost to more traditional HEP theory research.
  • Summer salary has been capped or reduced to 1 month in many cases. Removal of summer salary across the board is demoralizing.

and ends with

The situation is becoming increasingly unstable.
University-based theory is suffering its most serious crisis in decades.
Its future is in jeopardy.

It would be interesting to see some numbers on the size of new private research funding going to HEP theory (for instance funding from the Simons Foundation or the private funding of the CMSA at Harvard). I don’t know of such numbers but I’m curious whether what is happening is that the total funding level has seen reasonable growth, but increases in funding are going to a small number of elite institutions, with the rest of the field in decline.

On the question of caps or reductions in summer salary, I doubt that any significant number of researchers is reacting to only getting 1 month of summer salary by signing up for another job (e.g. teaching summer school) and not doing research during the other two months of the summer. There has been another huge influx of money to the field that in some sense replaces grant-funded salary supplements: the multi-million dollar Breakthrough Prizes. A sizable number of HEP theorists have now partaken in all or part of one of the \$3 million prizes. If you add in this money, on average HEP theorists may have been seeing significant increases in income, however with almost all of it going to a small number of people (at the same elite institutions that are doing well). What we’re seeing may just be the same trend as in the rest of the US economy: a move to a star system with ever larger increases in inequality.

Another problem for the field of HEP theory may be that funding is stagnating because the DOE and NSF are skeptical about its intellectual health. Hewett notes that “Formal theory resides solely in university environment and has undergone significant funding cuts.” Trying to make the positive case for this part of the field, she lists three areas of advances, but oddly, the first two are identical. The two areas of advances in formal theory she describes are:

Advances in strongly coupled quantum field theory (gravity/field theory duality, bootstrap program, amplitudes) has implications for particle physics, cosmology and beyond.

Geometric advances in particle physics constructions from String/F-theory has implications for the “swampland program”.

For the second of these, it’s quite possible that most physicists don’t see this as an advance at all.

Update: Physics World has more about the delay here. It is supposed to be announced on Tuesday. The cause evidently is a budget gap caused by some planned contributions from non-member countries now not happening. The story doesn’t explain which non-member countries are involved or why their planned contributions are now not expected.

Posted in Uncategorized | 26 Comments