Various and Sundry

A few recent items of interest:

  • Martin Greiter has put together a written version of Sidney Coleman’s mid-1990s lecture Quantum Mechanics in Your Face, based on a recording of one version of the lecture and copies of Coleman’s slides.

    It’s often claimed that leading physicists of Coleman’s generation were educated in and stuck throughout their careers to a “shut up and calculate” approach to the interpretation of quantum mechanics. Coleman’s lecture I believe gives a much better explanation of the way he and many others thought about the topic. In particular, Coleman makes the crucial point:

    The problem is not the interpretation of quantum mechanics. That’s getting things just backwards. The problem is the interpretation of classical mechanics.

    He ends with the following approach to the measurement problem:

    Now people say the reduction of the wave packet occurs because it looks like the reduction of the wave packet occurs, and that is indeed true. What I’m asking you in the second main part of this lecture is to consider seriously what it would look like if it were the other way around—if all that ever happened was causal evolution according to quantum mechanics. What I have tried to convince you is that what it looks like is ordinary everyday life.

    While some might take this and claim Coleman as an Everettian, note that there’s zero mention anywhere of many-worlds. Likely he found it an empty idea that explains nothing, and so not worth mentioning.

  • For the past year the CMSA has been hosting a Math-Science Literature Lecture Series. The talks have typically been excellent surveys of areas of mathematics, given by leading figures in each field. To coordinate with Tsinghua, the talks have often been at 8am here in NYC, and several times over the last couple of months I’ve made sure to get up early enough to have breakfast while watching one of the talks. All of the ones I’ve seen (given by Edward Witten, Andrei Okounkov, Alain Connes, Arthur Jaffe and Nigel Hitchin) were very much worth the time. Jaffe’s talk covered some of the ideas on Euclidean field theory that I’ve been spending a lot of time thinking about; for a more detailed version of his talk I highly recommend this article.
  • Peter Scholze has posted at the Xena blog a challenge to those interested in formalizing mathematics and automated theorem proving (or checking): formalize and check the proof of a foundational result in his work with Dustin Clausen on “condensed mathematics”. As part of the challenge, he provides an extensive discussion of the motivation and basic ideas of this subject, which attempts to provide a replacement (with better properties) for the conventional definition of a topological space.

    Spending a little time reading this and some of the other expositions of the subject convinced me this is a really thorny business. Scholze explains in detail his motivation for making the challenge in part 6 of the posting. My suspicion has always been that most of the value of computer theorem checking lies in forcing a human to lay out clearly and unambiguously the details of an argument, with that effort likely to make clear if there’s a problem with the theorem. It will be fascinating to see what comes of this project.

    I see there’s also a blog posting about this at the n-Category Café.

  • On the topic of theorems that definitely don’t have a clear and unambiguous proof, supposedly PRIMS will be publishing Mochizuki’s IUT papers in 2021. Mochizuki and collaborators have a new paper claiming stronger versions of Mochizuki’s results.
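Returning to Scholze’s formalization challenge: to make concrete what “formalize and check” means, here is a toy example of a machine-checked proof, in Lean 4 syntax. The statement and names are purely illustrative and have nothing to do with the actual challenge, which operates at vastly greater scale:

```lean
-- Toy illustration only: a proof assistant accepts a theorem just when
-- every step is justified, here by appeal to an existing library lemma.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- Replacing the proof term with `sorry` would leave a flagged gap;
-- forcing every such gap to be filled is exactly the discipline that
-- makes a checked proof trustworthy.
```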
Posted in Uncategorized | 43 Comments

Contemplating the End of Physics

In a remarkable article entitled Contemplating the End of Physics, posted today at Quanta Magazine, Robbert Dijkgraaf (the director of the IAS) more or less announces the arrival of the scenario that John Horgan predicted for physics back in 1996. Horgan argued that physics was reaching the end of its ability to progress by finding new fundamental laws. Research trying to find new fundamental constituents of the universe and new laws governing them was destined to reach an endpoint where no more progress was possible. This is pretty much how Dijkgraaf now sees the field going forward:

Confronted with the endless number of physical systems we could fabricate out of the currently known fundamental pieces of the universe, I begin to imagine an upside-down view of physics. Instead of studying a natural phenomenon, and subsequently discovering a law of nature, one could first design a new law and then reverse engineer a system that actually displays the phenomena described by the law. For example, physics has moved far beyond the simple phases of matter of high school courses — solid, liquid, gas. Many potential “exotic” phases, made possible by the bizarre consequences of quantum mechanics, have been cataloged in theoretical explorations, and we can now start realizing these possibilities in the lab with specially designed materials.

All of this is part of a much larger shift in the very scope of science, from studying what is to what could be. In the 20th century, scientists sought out the building blocks of reality: the molecules, atoms and elementary particles out of which all matter is made; the cells, proteins and genes that make life possible; the bits, algorithms and networks that form the foundation of information and intelligence, both human and artificial. This century, instead, we will begin to explore all there is to be made with these building blocks.

In brief, as far as physics goes, elementary particle physics is over; from now on it will pretty much just be condensed matter physics, where there is at least an infinity of potential effective field theory models to play with.

Dijkgraaf ends with an argument indicating that human intelligence is outmoded and artificial intelligence is our future:

Science concerns all phenomena, including the ones created in our laboratories and in our heads. Once we are fully aware of this grander scope, a different image of the research enterprise emerges. Now, finally, the ship of science is leaving the safe inland waterways carved by nature, and is heading for the open ocean, exploring a brave new world with “artificial” materials, organisms, brains and perhaps even a better version of ourselves.

Along the same lines, today also brings an article in the New York Times by Dennis Overbye, Can a Computer Devise a Theory of Everything? The article discusses the new MIT Institute for Artificial Intelligence and Fundamental Interactions and Max Tegmark’s hopes that AI will “discover all kinds of new laws of physics”. My guess is that this will work just fine if you give up on the 20th century understanding of what a “law of physics” is and follow Dijkgraaf’s lead. The problem then may be not so much “will we understand the new laws of physics found by AI?”, but rather that they won’t be interesting enough to be worth understanding…

Update: To clarify the point I was trying to make about the Dijkgraaf piece and the “end of physics”, compare it to the similar 1996 piece Gross and Witten published in the Wall Street Journal (a summary is here, an extract here). Gross and Witten were strongly disagreeing with Horgan, whereas it seems to me that Dijkgraaf implicitly agrees with Horgan that fundamental physics has hit a wall and theorists are moving on to do something else.

Posted in Uncategorized | 31 Comments

Various Links, String Theory now Untethered

I’ve been spending most of my time recently trying to get unconfused about Euclidean spinor fields, and will likely write something here about that in the not too distant future. Some other things that may be of interest:

  • I did an interview a couple of days ago with Fraser Cain, who runs the Universe Today website. He had some excellent and well-informed questions about the state of HEP. I regret a little bit that I focused on giving an even-handed explanation of the arguments over a next generation collider, and didn’t emphasize that personally I think building such a thing is a good idea (if the money can somehow be found), since the alternative would be giving up and abandoning this kind of fundamental science.
  • On Monday, the Simons Center celebrated its 10th birthday; the talks are here, giving a good overview of the kinds of math and physics that have been going on there during its first decade.
  • For the latest on the formulation of the local Langlands correspondence in terms of the geometry of the Fargues-Fontaine curve, Peter Scholze is teaching a course now in Bonn, website here.
  • Kirill Krasnov has a book out from Cambridge, Formulations of General Relativity. If you share my current interest in chiral formulations of GR and twistors, there’s a lot about these in the book. For a more general interest survey of what’s in the book, see Krasnov’s lectures last year at Perimeter (links and slides are on his website).
  • A couple weeks ago, a very well-done explanation of what’s been going on around the black hole information paradox written by George Musser appeared at Quanta Magazine. Periodically in recent years I’ve tried to follow what’s up with this subject, generally giving up after a while, frustrated especially at not being able to figure out what underlying theory of quantum gravity was being studied. All that ever was clear was that this was about low-dimensional toy model calculations involving some assumptions that had ingredients coming from holography and AdS/CFT.

    Musser’s article makes quite a few things clearer, with one striking aspect the news that:

    researchers cut the tether to string theory altogether.

    which I gather means that any foundation in AdS/CFT is gone, with what is being discussed now purely semi-classical. I don’t understand what these new semi-classical calculations are, or whether optimistic claims that the information paradox is on its way to a solution are justified (history hasn’t been kind to previous such claims). In recent years the argument for string theory research has often been that, while there were no longer any prospects of it telling us about particle physics, it was the best route to solving the problem of quantum gravity. It will be interesting to see the effect of that cord getting cut by leading researchers.

    If you think it’s a good idea to follow discussions of this kind of thing on Twitter, you might enjoy threads from Sabine Hossenfelder and Ahmed Almheiri.

Posted in Uncategorized | 27 Comments

Do Particle Physicists Continue to Make Empty Promises?

Blogging has been light here, since little worthy of note in math/physics has been happening, and I’ve been busy with teaching, freaking out about the election, and trying to better understand Euclidean spinors. I’ll write soon about the Euclidean spinors, but couldn’t resist today making some comments about two things I’ve seen this week.

Sabine Hossenfelder yesterday had a blog entry/Youtube video entitled Particle Physicists Continue to Make Empty Promises, which properly takes exception to this quote:

A good example of a guaranteed result is dark matter. A proton collider operating at energies around 100 TeV will conclusively probe the existence of weakly interacting dark-matter particles of thermal origin. This will lead either to a sensational discovery or to an experimental exclusion that will profoundly influence both particle physics and astrophysics.

from a recent article by Fabiola Gianotti and Gian Francesco Giudice in Nature Physics. She correctly notes that

They guarantee to rule out some very specific hypotheses for dark matter that we have no reason to think are correct in the first place.

A 100 TeV collider can rule out certain kinds of higher-mass WIMPs, but it’s simply untrue that such an exclusion will “profoundly influence both particle physics and astrophysics.” Very few people think such a thing is likely since there’s no evidence for it and no well-motivated theory that predicts it.

Where I part company with Hossenfelder, though, is that I don’t see much wrong with the rest of the Gianotti/Giudice piece, and don’t agree with her point of view that the big problem here is empty promises like this and plans for a new collider. Twenty years ago, when I began writing Not Even Wrong, I started out by writing a chapter about the inherent physical limits that colliders were starting to hit, and the significance of this for the field. It was already clear that getting to higher proton energies than the LHC, or higher lepton energies than LEP, was going to be very difficult and expensive. HEP experimentalists are now facing painful and hard choices about the future, which I wrote about in detail here under the title Should the Europeans Give Up?. The worldwide experimental HEP community is addressing the problem in a serious way, with the European Strategy Update one aspect, and the US now engaged in a similar Snowmass 2021 effort.

Many find it tempting to believe that the answer is simple: just redirect funds from collider physics to non-collider experiments. The problem is that there’s little evidence of promising but unfunded ideas for non-collider experiments. For the last decade there has been no new construction of high energy colliders, with as much money as ever available worldwide for HEP experiments. This should have been a golden age for those with non-collider ideas to propose. This continues to be the case: if you look at the European Strategy Update and Snowmass 2021 efforts, they have seriously focused on finding non-collider ideas to pursue. This should continue to be true, since I see no evidence anyone is going to decide to go ahead with a next generation collider and start spending money building it during the next few years. The bottom line result from the European process was not a decision to build a new collider, but a decision to keep studying the problem, then evaluate what to do in 2026. For the ongoing American process, as far as I know a new US collider is not even a possibility being discussed.

While HEP experiment is facing difficult times because of fundamental physical, engineering and economic limits, the problems of HEP theory are mostly self-inflicted. The decision nearly 40 years ago by a large fraction of the field to orient their research programs around bad ideas that don’t work (SUSY extensions of the Standard Model and string theory unification), then spend decades refusing to acknowledge failure is at the core of the sad state of the subject these days.

About the canniest and most influential HEP theorist around is Nima Arkani-Hamed, and a few days ago I watched an interview of him by Janna Levin. On the question of the justification for a new collider, he’s careful to state that the justification is mainly the study of the Higgs. He’s well aware that the failure of the “naturalness” arguments for weak-scale SUSY needs to be acknowledged and does so. He also is well aware that any attempt to argue this failure away by saying “we just need a higher energy collider” won’t pass the laugh test (and would bring Hossenfelder and others down on him like a ton of bricks…).

The most disturbing aspect of the interview is that Levin devotes a lot of time (and computer graphics) to getting Arkani-Hamed to explain his 1998 ideas about “large extra dimensions”, repeatedly telling the audience that he has been given a \$3 million prize for them. This paper has by now been cited over 6,300 times, and the claim about the multi-million dollar prize is accurate, with the prize citation explaining:

Nima Arkani-Hamed has proposed a number of possible approaches to this [hierarchy problem] paradox, from the theory of large extra dimensions, where the strength of gravity is diluted by leaking into higher dimensions, to “split supersymmetry,” motivated by the possibility of an enormous diversity of long-distance solutions to string theory.

At the time it was pretty strange that a \$3 million prize was being given for ideas that weren’t working out. It’s truly bizarre, though, that Levin would now want to make such failed ideas the centerpiece of a presentation to the public, misleading people about their status. The website for the interview also promotes Arkani-Hamed purely in terms of his failures, presented as successes:

Nima Arkani-Hamed is one of the leading particle physics phenomenologists of the generation. He is concerned with the relation between theory and experiment. His research has shown how the extreme weakness of gravity, relative to other forces of nature, might be explained by the existence of extra dimensions of space, and how the structure of comparatively low-energy physics is constrained within the context of string theory. He has taken a lead in proposing new physical theories that can be tested at the Large Hadron Collider at CERN in Switzerland,

This is part of the overall weird situation of the failed ideas (SUSY/strings) of 40 years ago: they still live on in a dominant position when the subject is presented to the public.

At the same time, the topics Arkani-Hamed is working on now are ones I think are more promising than most of the rest of what is going on in HEP theory. The interview began with a discussion of Penrose’s recent Nobel Prize, with Arkani-Hamed explaining Penrose’s fantastic insights about twistor geometry and noting that his own current work involves a fundamental role for twistor space (personally I see some other promising directions for using twistor geometry, more to come about this here in the future).

In contrast to Hossenfelder, what I’m seeing these days in HEP physics is not a lot of empty promises (which were a dominant aspect of HEP theory for several decades). Instead, on the experimental side, there’s an honest struggle with implacable difficulties. On the theory side increasingly people have just given up, deciding that it’s better to let the subject die chained to a host of \$3 million prizes for dead ideas than to honestly face up to what has happened.

In case anyone needs any reminder of how bad the propaganda problem is: it appears this kind of thing is driven by what one finds on Wikipedia.

Posted in Uncategorized | 32 Comments

2020 Physics Nobel Prize

The 2020 Physics Nobel Prize was announced this morning, with half going to Roger Penrose for his work on black holes, half to two astronomers (Reinhard Genzel and Andrea Ghez) for their work mapping what is going on at the center of our galaxy. I know just about nothing about the astronomy side of this, but am somewhat familiar with Penrose’s work, which very much deserves the prize.

Penrose is a rather unusual choice for a Physics Nobel Prize, in that he’s very much a mathematical physicist, with a Ph.D. in mathematics (are there other physics winners with math Ph.D.s?). In addition, the award is not for a new physical theory, or for anything experimentally testable, but for the rigorous understanding of the implications of Einstein’s general relativity. While I’m a great fan of the importance of this kind of work, I can’t think of many examples of it getting rewarded with the Nobel Prize. I had always thought that Penrose was likely to get a Breakthrough Prize rather than a Nobel Prize, and still don’t understand why that hasn’t happened already.

Besides the early work on black holes that Penrose is being recognized for, he has worked on many other things which I think are likely to ultimately be of even greater significance. In particular, he’s far and away the person most responsible for twistor theory, a subject which I believe has a great future ahead of it at the core of fundamental physical theory.

In all his work, Penrose has shown a remarkable degree of originality and creativity. He’s not someone who works to make an advance on ideas pioneered by others, but sets out to do something new and different. His book “The Road to Reality” is a masterpiece, an inspiring, original and deep vision of the unity of geometry and physics that outshines the mainstream ways of looking at these questions.

Congratulations to Sir Roger, and compliments to the Nobel prize committee for a wonderful choice!

Posted in Uncategorized | 51 Comments

Quick Links

A few quick links:

  • I was sorry to hear of the recent death of Vaughan Jones. A few things about his life and work have started to appear, see here, here and here.
  • For a wonderful in-depth article about the life of Michael Atiyah written by Nigel Hitchin, see here.
  • There are now many new places where you can find talks about math and physics to listen to. For instance, just for math and just at Harvard, there is a series of Harvard Math Literature talks and Dennis Gaitsgory’s geometric Langlands office hours.
  • Breakthrough Prizes were announced today. There’s an argument to be made that the best policy is to ignore them. Weinberg now has another \$3 million.
  • For an interview with Avi Loeb about why physics is stuck, see here.
  • For an explanation from John Preskill of why quantum computing is hard (which I’d claim has to do with why the measurement problem is hard), see here.

Update: Last night I watched The Social Dilemma on Netflix, which included some segments with my friend Cathy O’Neil (AKA Mathbabe). Highly recommended; it’s the best of the things I’ve read or watched that try to come to grips with the nature of the horror irresponsibly unleashed by Mark Zuckerberg and Facebook in the form of the AI-driven News Feed. Comparing it to a documentary about Oxycontin from a while back, I’d say the effects of the News Feed are arguably more damaging. I’m wondering why the Oxycontin-funded Sackler family donations to cultural organizations and universities have been heavily criticized, unlike the News Feed-funded Zuckerberg/Milner donations to scientists.

Update: Alain Connes has written a short appreciation of Vaughan Jones and his work here.

Update: For another article about Vaughan Jones well worth reading, see Davide Castelvecchi at Nature.

Posted in Uncategorized | 17 Comments

Fall Quantum Mechanics Class

I’ll be teaching a course on quantum mechanics this year here at Columbia, from a point of view aimed somewhat at mathematicians, emphasizing the role of Lie groups and their representations. For more details, the course webpage is here.

The course is being taught online using Zoom, with 37 students now enrolled. I’ve set things up in my office to try and teach using the blackboard there, and will be interacting with the students mostly via Zoom. As an experiment, I’ve also set up a Youtube channel. If all goes well you should be able to find a livestream of the class there while it’s happening; it’s scheduled for 4:10-5:25 Tuesdays and Thursdays, starting tomorrow, September 8. I’ll also try and make sure the recorded livestreams get uploaded and saved at this playlist. Unfortunately I won’t be able to interact with people watching on Youtube, since I should have my hands full trying to get to know the students enrolled here in the course, with only this virtual connection.

Posted in Quantum Mechanics | 19 Comments

AMS Open Math Notes

The AMS for the last few years has had a valuable project called AMS Open Math Notes, a site to gather and make available course notes for math classes, documents of the sort that people sometimes make available on their websites. This provides a great place to go to look for worthwhile notes of this kind (many of them are of very high quality), as well as ensuring their availability for the future. They have an advisory board that evaluates whether submitted notes are suitable.

A couple months ago I submitted the course notes I wrote up this past semester for my Fourier Analysis class, and I’m pleased that they were accepted and are now available here at the AMS site (and will remain also available from my website).

Posted in Uncategorized | 3 Comments

Quantum Reality

Jim Baggott’s new book, Quantum Reality, is now out here in the US, and I highly recommend it to anyone interested in the issues surrounding the interpretation of quantum mechanics. Starting next week I’ll be teaching a course on quantum mechanics for mathematicians (more about this in a few days, when I have a better idea how it’s going to work). I’ll be lecturing about the formalism, and for the topic of how this connects to physical reality I’ll be referring the students to this new book (as well as Philip Ball’s Beyond Weird).

When I was first studying quantum mechanics in the early-mid 1970s, the main popular sources discussing interpretational issues were uniform triumphalist accounts of how physicists had struggled with these issues and finally ended up with the “Copenhagen interpretation” (which no one was sure exactly how to state, due to diversity of opinion among theorists and Bohr’s obscurity of expression). Everyone now says that the reigning ideology of the time was “shut up and calculate”, but that’s not exactly what I remember. The Standard Model had just appeared, offering up a huge advance and a long list of new questions with powerful methods to attack them. In this context it was hard to justify spending time worrying about the subtleties of what Copenhagen might have gotten wrong.

In recent decades things have changed completely, with the question of what’s wrong with Copenhagen and how to do better getting a lot of attention. By now a huge and baffling literature about alternatives has accumulated, forming somewhat of a tower of Babel confronting anyone trying to learn more about the subject. Some popular accounts have dealt with this complexity by turning the subject into a morality play, with alternative interpretations portrayed as the Rebel Alliance fighting righteous battles against the Copenhagen Empire. Other accounts are pretty much propaganda for a particular alternative, be it Bohmian mechanics or a many-worlds interpretation.

Instead of something like this, Baggott provides a refreshingly sane and sensible survey of the subject, trying to get at the core of what is unsatisfying about the Copenhagen account, while explaining the high points of the many different alternatives that have been pursued. He doesn’t have an ax to grind, and sees the subject more as a “Game of Theories” in which one must navigate carefully, avoiding Scylla, Charybdis, and various calls from the Sirens. One thing driving this whole subject is the advent of new technologies that allow the experimental study of quantum coherence and decoherence, with great attention being paid now that quantum computing has become the hottest and best-funded topic around. Whatever you think about Copenhagen, what Bohr and others characterized as inaccessible to experiment is now anything but that.

While one of my least favorite aspects of discussions of this subject is the various ways the terms “real” and “reality” get used, I have realized that one has to get over that when trying to follow people’s arguments, since the terms have become standard sign-posts. What’s at issue here are fundamental questions about physical science and reality, including the question of what the words “real” and “reality” might mean. In Quantum Reality, Baggott provides a well-informed, reliable and enlightening tour of the increasingly complex and contentious terrain of arguments over what our best fundamental theory is telling us about what is physically “real”.

Update: For a much better and more detailed review of the book, Sabine Hossenfelder’s is here.

Posted in Book Reviews | 37 Comments

Funding Priorities

The research that gets done in any field of science is heavily influenced by the priorities set by those who fund the research. For science in the US in general, and the field of theoretical physics in particular, recent years have seen a reordering of priorities that is becoming ever more pronounced. As a prominent example, recently the NSF announced that their graduate student fellowships (a program that funds a large number of graduate students in all areas of science and mathematics) will now be governed by the following language:

Although NSF will continue to fund outstanding Graduate Research Fellowships in all areas of science and engineering supported by NSF, in FY2021, GRFP will emphasize three high priority research areas in alignment with NSF goals. These areas are Artificial Intelligence, Quantum Information Science, and Computationally Intensive Research. Applications are encouraged in all disciplines supported by NSF that incorporate these high priority research areas.

No one seems to know exactly what this means in practice, but it clearly means that if you want the best chance of getting a good start on a career in science, you really should be going into one of

  • Artificial Intelligence
  • Quantum Information Science
  • Computationally Intensive Research

or, even better, trying to work on some intersection of these topics.

Emphasis on these areas is not new; it has been growing significantly in recent years, and this policy change by the NSF should accelerate ongoing changes. As far as fundamental theoretical physics goes, we’ve already seen that the move to quantum information science has had a significant effect. For example, the IAS PiTP summer program, which trains students in the latest hot topics, was in 2018 devoted to From Qubits to Spacetime. The impact of this change in funding priorities is increased by the fact that the largest source of private funding for theoretical physics research, the Simons Foundation, shares much the same emphasis. The new Simons-funded Flatiron Institute here in New York has as its mission statement:

The mission of the Flatiron Institute is to advance scientific research through computational methods, including data analysis, theory, modeling and simulation.

In the latest development on this front, the White House announced today \$1 billion in funding for artificial intelligence and quantum information science research institutes:

“Thanks to the leadership of President Trump, the United States is accomplishing yet another milestone in our efforts to strengthen research in AI and quantum. We are proud to announce that over $1 billion in funding will be geared towards that research, a defining achievement as we continue to shape and prepare this great Nation for excellence in the industries of the future,” said Advisor to the President Ivanka Trump.

This includes an NSF component of \$100 million in new funding for five Artificial Intelligence research institutes. One of these will largely be a fundamental theoretical physics institute, to be called the NSF AI Institute for Artificial Intelligence and Fundamental Interactions (IAIFI). The theory topics the institute will concentrate on will be:

  • Accelerating Lattice Field Theory with AI
  • Exploring the Multiverse with AI
  • Classifying Knots with AI
  • Astrophysical Simulations with AI
  • Towards an AI Physicist
  • String Theory Conjectures via AI

As far as trying to get beyond the Standard Model, the IAIFI plan is to

work to understand physics beyond the SM in the frameworks of string and knot theory.

I’m rather mystified by how knot theory is going to give us beyond-the-SM physics; perhaps the plan is to revive Lord Kelvin’s vortex theory.

Update: Some more here about the knots. No question that you can study knots with a computer, but I’m still mystified by their supposed connection to beyond SM physics.

Posted in Uncategorized | 36 Comments