If you’re a mathematician, you don’t need to go work for Dominic Cummings in order to have dramatically improved career opportunities in the UK. The British government has just announced a huge increase in funding for mathematical research: 60 million pounds/year (about \$80 million) for the next five years (see here and here). To get some idea of the scale of this, note that the US GDP is about 8 times the UK’s and the NSF DMS budget is about \$240 million/year. So the comparable scale of this funding in the US would be about two and a half times the NSF budget for mathematics.
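As a sanity check on the scaling claim, here's the arithmetic, using the figures quoted above (the GDP ratio of roughly 8 is the stated approximation):

```python
# Scale the UK's new math funding to a US-equivalent figure by GDP ratio.
uk_math_funding = 80e6   # ~60M GBP/year, about $80M
gdp_ratio = 8            # US GDP is roughly 8x the UK's
nsf_dms_budget = 240e6   # NSF math (DMS) budget, ~$240M/year

us_equivalent = uk_math_funding * gdp_ratio      # $640M/year
multiple_of_nsf = us_equivalent / nsf_dms_budget
print(f"{multiple_of_nsf:.1f}x the NSF math budget")  # -> 2.7x the NSF math budget
```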
Many of my mathematician colleagues have sometimes seemed to me to be of the opinion that a huge increase in funding for math research is the best way to improve a society. We’ll see if this works for Britain.
While the new UK government ran on a nativist platform of restricting immigration, with the goal of keeping outsiders from taking bread out of the mouths of UK citizens, this doesn’t apply to mathematicians: all limits are off and we’re encouraged to flood the country. The law will be changed on Friday, with the changes going into effect Feb. 20. These will include an “accelerated path to settlement”, no need to even have a job offer, and all your “dependents [will] have full access to the labour market”, so no worries there about the taking-bread-out-of-the-mouths-of-the-locals thing.
Update: More here (except it’s mostly behind a paywall, but evidently Ivan Fesenko is quoted).
I was sorry to hear of the death a few months ago of Tony Smith, who had been a frequent commenter on this blog and others. Unfortunately my interactions with him mainly involved trying to discourage him from hijacking the discussion in some other (often unusual) direction. Geoffrey Dixon did get to know him well, and has written a wonderful long blog entry about Tony, which I highly recommend (Dixon’s newish blog also has other things you might find interesting).
On the Jim Simons front, the Simons Foundation has put together something to celebrate their 25th anniversary. It explains a lot about their history and what they are doing now, as well as giving some indication of plans for the future. On these topics, read pieces written by Jim Simons and Marilyn Simons. The Foundation has been in a high growth mode, having an increasingly large impact on math and physics research. Their main statement about the future is that the plan is for this to go on for a very long time:
According to its bylaws, the Simons Foundation is intended to focus almost entirely on research in mathematics and science and to exist in perpetuity. If future leadership abides by these guiding principles, Marilyn and I believe the foundation will forever be a force for good in our society.
My impression is that the Simons children have their own interests, and foundations with other goals to run.
News today from the \$75 billion source of the money (RenTech) is that Simons is increasingly turning over control of the firm to his son Nathaniel, who has been named co-chairman. He has also added five new directors to the board: four senior Renaissance executives and his son-in-law Mark Heising.
There are various IAS-related videos you might want to take a look at:
Sabine Hossenfelder has a new piece out, making many of the same arguments she has been making for a while about the state of fundamental theory in physics. These have a lot in common with arguments that Lee Smolin and I were making in our books published back in 2006. The underlying problem is that the way theorists successfully worked up until the seventies is no longer viable, with the Standard Model working too well, up to the highest energies probed:
The major cause of this stagnation is that physics has changed, but physicists have not changed their methods. As physics has progressed, the foundations have become increasingly harder to probe by experiment. Technological advances have not kept size and expenses manageable. This is why, in physics today, we have collaborations of thousands of people operating machines that cost billions of dollars.
With fewer experiments, serendipitous discoveries become increasingly unlikely. And lacking those discoveries, the technological progress that would be needed to keep experiments economically viable never materializes. It’s a vicious cycle: Costly experiments result in lack of progress. Lack of progress increases the costs of further experiment. This cycle must eventually lead into a dead end when experiments become simply too expensive to remain affordable. A \$40 billion particle collider is such a dead end.
I have a somewhat different view about a potential next collider (see here), but agree that the basic question is whether it will be “too expensive to remain affordable.”
What has happened over the last forty years is that the way HEP theory is done has become dysfunctional, in a way that Hossenfelder characterizes as follows:
Instead of examining the way that they propose hypotheses and revising their methods, theoretical physicists have developed a habit of putting forward entirely baseless speculations. Over and over again I have heard them justifying their mindless production of mathematical fiction as “healthy speculation” – entirely ignoring that this type of speculation has demonstrably not worked for decades and continues to not work. There is nothing healthy about this. It’s sick science. And, embarrassingly enough, that’s plain to see for everyone who does not work in the field.
This behavior is based on the hopelessly naïve, not to mention ill-informed, belief that science always progresses somehow, and that sooner or later certainly someone will stumble over something interesting. But even if that happened – even if someone found a piece of the puzzle – at this point we wouldn’t notice, because today any drop of genuine theoretical progress would drown in an ocean of “healthy speculation”…
Why don’t physicists have a hard look at their history and learn from their failure? Because the existing scientific system does not encourage learning. Physicists today can happily make career by writing papers about things no one has ever observed, and never will observe. This continues to go on because there is nothing and no one that can stop it.
This story brings up a lot of complex issues in the philosophy and sociology of science, but to me there’s one aspect of the problem that is relatively simple and deserves a lot more attention than it gets: how do you get theorists to abandon failed ideas and move on to try something else?
The negative LHC results about SUSY have had some effect, but even in this case it’s remarkable how many theorists won’t abandon the failed idea of a SUSY extension of the Standard Model. This was always a highly dubious idea, explaining nothing about the Standard Model and adding a huge number of new degrees of freedom and more than a hundred new undetermined parameters. Not seeing anything at the LHC should have put the final nail in the coffin of that idea. Instead, I see that this past fall MIT was still training its graduate students with a course on Supersymmetric Quantum Field Theories. You can try and argue that SUSY and supergravity theories are worth studying even if they have nothing to do with physics at observable energies, but it is a fact that these are extremely complicated QFTs to work with and have explained nothing. Why encourage grad students to devote the many, many hours it takes to understand the details of this subject, instead of encouraging them to learn about something that hasn’t been a huge failure?
The techniques one gets trained in as a graduate student tend to form the basis of one’s understanding of a subject and have a huge influence on one’s future career and the questions one has the expertise needed to work on. Besides SUSY, string theory has been the other major course topic at many institutions, with the best US grad students often spending large amounts of time trying to absorb the material in Polchinski’s two-volume textbook, even though the motivations for this have turned out to also be a huge failure, arguably the largest one in the history of theoretical physics.
To get some idea of what is going on, I took a look at the current and recent course offerings (on BSM theory, not including cosmology) at the five leading (if you believe US News) US HEP theory departments. I may very well be missing some offered courses, but the following gives some insight into what leading US departments are teaching their theory students. Comparing to past years might be interesting, possibly there’s a trend towards abandoning the whole area in favor of other topics (e.g. cosmology, quantum information, condensed matter).
Update: There’s some serious discussion of this on Twitter. For those who can stand that format, try looking here and here.
Update: Mark Goodsell has a blog posting about all this here, including a defense of teaching the usual SUSY story to graduate students.
Update: A correspondent pointed me to this recent CERN Courier interview with John Ellis. Ellis maintains his increasingly implausible defense of SUSY, but he’s well aware that times have now changed:
People are certainly exploring new theoretical avenues, which is very healthy and, in a way, there is much more freedom for young theorists today than there might have been in the past. Personally, I would be rather reluctant at this time to propose to a PhD student a thesis that was based solely on SUSY – the people who are hiring are quite likely to want them to be not just working on SUSY and maybe even not working on SUSY at all. I would regard that as a bit unfair, but there are always fashions in theoretical physics.
To start the new decade there’s an article very much worth reading by Misha Shifman, entitled Musings on the Current State of HEP. It’s somewhat of an update of something he wrote back in 2012, which I wrote about here. He starts off with:
Now, seven years later, I will risk to offer my musings on the same subject. The seven years that have elapsed since then brought new perspectives: the tendencies which were rather foggy at that time became pronounced. My humble musings do not pretend to be more than they are: just a personal opinion of a theoretical physicist… For obvious reasons I will focus mostly on HEP, making a few marginal remarks on related areas. I would say that the most important message we have received is the absence of dramatic or surprising new results. In HEP no significant experimental findings were reported, old ideas concerning Beyond the Standard Model (BSM) physics hit dead-ends one after another and were not replaced by novel ideas. Hopes for key discoveries at the LHC (such as superpartners) which I mentioned in 2012 are fading away. Some may even say that these hopes are already dead. Low energy supersymmetry is ruled out, and gone with it is the concept of naturalness, a basic principle which theorists cherished and followed for decades. Nothing has replaced it so far…
HEP, “my” branch of theoretical physics since the beginning of my career, seems to be shrinking. A change of priorities in HEP in the near future is likely as business as usual is not sustainable. The current time is formative.
I encourage you to take a look at the rest, there’s a lot more detailed discussion of the state of HEP and allied fields, especially about the central role of quantum field theory.
Shifman also includes a section very critical of Richard Dawid, the “non-empirical confirmation” business and talks given at the “Why Trust a Theory?” conference (discussed here):
With all due respect I strongly disagree with Richard Dawid and all supporting speakers at the conference and beyond… I object against applying the term “non-empirically confirmed” to science (the more so, the term “postempiric science”). Of course, we live in liberal times and everybody is entitled to study and discuss whatever he or she wants. But the word science is already taken. Sorry, colleagues. For “postempiric science,” please, use another word, for instance, iScience, xScience, or something else.
As for David Gross’s attempt to claim that string theory is, like quantum mechanics and quantum field theory, not testable just because it is a framework, not a theory, Shifman is having none of it:
David Gross is a great theoretical physicist, whose discovery of asymptotic freedom made him immortal, but I respectfully disagree with him. Framework or not, both QM and QFT have absolutely solid confirmations in all their aspects in thousands of experiments.
As for the once popular idea that string theory could provide a “theory of everything”, he writes:
Well… it never happened and – I will risk to say – never will.
At some point within the past couple years I noticed that one blog that had Not Even Wrong on its blogroll was the blog of Dominic Cummings, who was often getting credited with masterminding the political campaign that got the British to vote (narrowly) for Brexit in 2016. Cummings has had further success recently as Chief Special Adviser to British Prime Minister Boris Johnson, with a blow-out election victory three weeks ago putting him securely in control of the British state.
the selection, education and training of people for high performance
the frontiers of the science of prediction
data science, AI and cognitive technologies (e.g Seeing Rooms, ‘authoring tools designed for arguing from evidence’, Tetlock/IARPA prediction tournaments that could easily be extended to consider ‘clusters’ of issues around themes like Brexit to improve policy and project management)
communication (e.g Cialdini)
decision-making institutions at the apex of government.
For some other descriptions of who Cummings would like to hire, on the economics side there’s:
The ideal candidate might, for example, have a degree in maths and economics, worked at the LHC in one summer, worked with a quant fund another summer, and written software for a YC startup in a third summer!
We’ve found one of these but want at least one more.
He also wants “Super-talented weirdos”, with examples given from William Gibson novels, such as “that Chinese-Cuban free runner from a crime family hired by the KGB.”
The remarkable thing to me about this long document is what it doesn’t contain. In particular I see nothing at all about any specific policy goals. Usually a new government would recruit people by appealing to their desire to make the world a better place in some specific way, but there’s nothing about that here. The goal is to control the government and what the British population believes, but to what end?
In addition, a more conventional hiring process would be asking for candidates of high ethical values, with some devotion to telling the truth. Cummings seems to be asking for exactly the opposite: best if your background is “from a crime family hired by the KGB.”
Best wishes to my British readers, now joining the US and other nations in a new dystopic post-truth era. It’s massively depressing to me to see how this has worked out here; I hope you do better. Maybe you should be sending in your applications to Cummings and hoping to sign up for a role in the new power structure. If so, tell him “Not Even Wrong” sent you…
The last couple days have seen various discussions online generated by a piece at Quanta Magazine with the dubious headline Why the Laws of Physics Are Inevitable and an even worse sub-headline claiming “physicists working on the ‘bootstrap’ have rederived the four known forces” (this is utter nonsense). For some of this discussion, see Sabine Hossenfelder, John Baez and Will Kinney.
One reason this is getting a lot of attention is that the overall quality of reporting on math and physics at the relatively new Quanta Magazine has been very high, a welcome relief from the often highly dubious reporting at many mainstream science media outlets. The lessons of what happens when the information sources society relies on are polluted with ideologically driven nonsense are all around us, so seeing this happen at a place like Quanta is disturbing. If you want to understand where this current piece of nonsense comes from, there is an ideology-driven source you need to be aware of.
A major line of defense of their subject by string theorists has essentially been the claim that, while it may lack any experimental support, string theory is “the only consistent way to combine quantum theory and general relativity”. I’ve often explained what the problem with this is, and won’t go on about it again here. Nima Arkani-Hamed is at this point likely the most influential theorist around, for some good reasons. The roots of the problem with the Quanta article lie in taking too seriously the kind of arguments he tends to make in the many talks he gives. He’s trying to make as strong as possible a case for the research program he is pursuing, so unfortunately gives all-too-convincingly a very tendentious take on the scientific issues involved. For more about this, see a posting here about the problems with the recent Quanta article that motivated the latest one.
Debates over generalities about whether the “laws of physics are inevitable” are sterile and I don’t want to engage in them here, but I thought it would be a good idea to explain what the serious ideas are that Arkani-Hamed and others are trying to refer to when they make dubious statements like “there’s just no freedom in the laws of physics”. Here’s an attempt at outlining this story:
Quantum mechanics and special relativity:
A mathematically precise implication of putting together fundamental ideas about quantum mechanics and special relativity is that the state space of the theory should carry a unitary linear representation (this is the QM part) of the Poincaré group (this is the special relativity part). You also generally assume that the time translation part of the Poincaré group action satisfies a “positive energy” condition. To the extent you can identify “elementary particles”, these should correspond to irreducible representations. The irreducible unitary representations of the Poincaré group were first understood and classified by Wigner in the late 1930s. My QM textbook has a discussion in chapter 42. If you impose the condition of positive energy and for simplicity consider the case of non-zero mass, you find that the irreducible representations are classified by the mass and spin (which is 0, 1/2, 1, 3/2, etc.). Non-interacting theories are completely determined by the representation theory and exist for all values of the mass and spin.
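A minimal sketch of the Wigner classification in formulas: irreducible representations are labeled by the eigenvalues of the two Casimir operators of the Poincaré group,

```latex
P^2 = P_\mu P^\mu = m^2, \qquad
W^2 = W_\mu W^\mu = -m^2\, s(s+1) \quad (m > 0,\ s = 0, \tfrac{1}{2}, 1, \tfrac{3}{2}, \dots)
```

where $W^\mu = \tfrac{1}{2}\epsilon^{\mu\nu\rho\sigma} P_\nu M_{\rho\sigma}$ is the Pauli-Lubanski vector built from the translation and Lorentz generators. (Signature conventions vary; this uses $(+,-,-,-)$.)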
Extensions of Poincaré and the no-go theorem of Coleman-Mandula
To get further constraints on a fundamental theory, one obvious idea is to extend the Poincaré group to something larger. States then should transform according to unitary representations of this larger group, carrying extra structure. Restricting to the Poincaré subgroup, one hopes to get additional constraints on which Poincaré representations can occur (they’ll be those that are restrictions of the representations of the larger group). The problem with this is the Coleman-Mandula theorem (1967) which implies that for interacting theories the larger group can only be a product of Poincaré times an internal symmetry group. Representations will just be products of the Poincaré group representations and representations of the internal group, with space-time symmetries and internal symmetries having nothing to do with each other. This is why the Quanta headline about “rederiving the four known forces” is nonsense: the three non-gravitational forces are determined by internal symmetries and have nothing to do with what the Quanta article is describing, which is work on space-time symmetries.
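Stated compactly, the Coleman-Mandula conclusion is that, under its hypotheses (an interacting relativistic QFT with a sensible S-matrix and discrete particle spectrum), the full symmetry Lie algebra can only be a direct sum

```latex
\mathfrak{g} \;\cong\; \mathfrak{iso}(3,1) \,\oplus\, \mathfrak{g}_{\mathrm{internal}},
```

so every unitary representation of the symmetry decomposes into tensor products of Poincaré representations with internal-symmetry representations, with no mixing between the two factors.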
One way to avoid the Coleman-Mandula theorem is to work with not Lie algebras but Lie superalgebras. Here you do get a non-trivial extension of the Poincaré group and a prediction that Poincaré representations should occur in specific supermultiplets. The problem is that there is no evidence for such supermultiplets.
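The loophole is that Coleman-Mandula assumes a Lie algebra of symmetries (commutators only); the super-Poincaré algebra instead adds spinorial generators $Q_\alpha$ obeying anticommutation relations, schematically (in the four-dimensional $\mathcal{N}=1$ case):

```latex
\{ Q_\alpha, \bar{Q}_{\dot{\beta}} \} = 2\,\sigma^\mu_{\alpha\dot{\beta}}\, P_\mu,
\qquad \{ Q_\alpha, Q_\beta \} = 0,
\qquad [ P_\mu, Q_\alpha ] = 0.
```

Since $Q_\alpha$ changes spin by $1/2$, irreducible representations (supermultiplets) pair bosonic and fermionic Poincaré representations of equal mass, and it is exactly this pairing that has not been observed.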
Another possible extension of the Poincaré group is the conformal group. Here the problem is that the new symmetry implications are too strong: they rule out the massive Poincaré group representations that we know exist. One can work with the conformal group if one sticks to massless particles, and this is what the methods advertised in the Quanta article do.
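Why conformal symmetry is “too strong” can be seen from a single commutator: the dilatation generator $D$ rescales the momenta,

```latex
[D, P_\mu] = i\,P_\mu
\;\Longrightarrow\;
e^{-i\lambda D}\, P^2\, e^{i\lambda D} = e^{2\lambda}\, P^2,
```

so any nonzero mass eigenvalue drags along a continuum of rescaled masses. A conformal theory therefore cannot contain isolated massive particles; only $m = 0$ is consistent with a discrete spectrum.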
The idea that our fundamental space-time symmetry group is the conformal group is mathematically an extremely attractive one, with the twistor picture of space-time playing a natural role in this context. I strongly suspect that any future truly unified theory will somehow exploit this. Unfortunately, as far as I know, no one has yet come up with a way of exploiting this symmetry consistent with what we know about elementary particles. Likely a really good new deep idea is missing.
Quantum field theory
To get stronger constraints than the ones coming from Poincaré symmetry, one needs to decide how one is going to introduce interactions. One way to go is quantum field theory, with a principle of locality of interactions. This gets encoded in a condition of (anti)commutativity of the fields at space-like separations, which then implies various analyticity properties of correlation functions and scattering amplitudes. The analyticity properties can then be used to prove things like the CPT theorem and the spin-statistics theorem, which provide some new constraints.
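The locality condition referred to above (microcausality) can be written in one line; for a bosonic field it is the vanishing of the commutator at space-like separation (signature $(+,-,-,-)$; fermionic fields use the anticommutator instead):

```latex
[\,\phi(x), \phi(y)\,] = 0 \quad \text{whenever } (x - y)^2 < 0,
```

and it is this condition, combined with Poincaré covariance and positivity of the energy, that yields the analyticity properties behind the CPT and spin-statistics theorems.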
Given a method of constructing a Poincaré invariant quantum field theory, typically done by choosing a set of classical fields and a Lagrangian, one can try and realize the various possible Poincaré group representations as interacting theories. What one finds is that, for spins greater than two one runs into various seemingly intractable problems with the construction. One also finds exceptionally beautiful theories in the spin 1/2 and spin 1 cases that exhibit an infinite dimensional group of gauge symmetries. An example of these is the Standard Model. Unfortunately, we know of no principle or symmetry that would provide a constraint that picks out the Standard Model. If we did, we might be tempted to announce that the principle or symmetry is “inevitable” and thus the “laws of physics are inevitable”. We’re not there yet…
Amplitudes and the S-matrix philosophy
In the S-matrix philosophy one takes the analyticity properties as fundamental, working with amplitudes, not local quantum fields. The 1960s version of this program (also often called the “bootstrap” program) was based on the hope that certain physically plausible analyticity assumptions would so tightly constrain the theory of strong interactions that it was essentially uniquely determined. This didn’t work out. In his recent introductory lecture for his course at Harvard, Arkani-Hamed explains why. The research program he and others are currently pursuing is in some sense a modernized version of the failed 60s program. The hope is that new structures in amplitudes can be found that will replace the structures one gets from local quantum fields.
Amplitudes based arguments about, for instance, why you don’t see fundamental higher-spin states, and why spin 1/2 particles have forces of the kind given by gauge theory have a long history, see for instance work on massless particles by Weinberg in the mid-sixties and Weinberg-Witten in 1980.
As far as I can tell, the work referred to in the Quanta article gives new amplitudes-based arguments of this kind for massless particles, exploiting conformal symmetry. It’s not clear to me exactly what’s new here as opposed to earlier such arguments, or how strong an argument about real world physics one can make using these new ideas. One thing that is clear though is that the Quanta quote that what has been discovered implies that “There’s just no freedom in the laws of physics” is as much nonsense as the “we rederived the four known forces” business.
Update: For some discussion with the author of the Quanta piece, Natalie Wolchover, see the comments starting here.
Update: The Quanta article has been revised, see comments in the comment section here. There Daniel Baumann provides a link to a popular summary of the facts about massless particle interactions that his quotes were about.
Mark Alpert has a new novel out, Saint Joan of New York, a thriller subtitled “A Novel About God and String Theory”, which is an accurate description. It’s published by Springer, so you may be able to get access to it like I did through an institutional license here.
The plot revolves around Joan, a talented high school student here in New York, who has been learning more advanced material through a mentor at City College, and in particular has learned about string theory and Calabi-Yaus. This Joan plays the role of a modern-day analog of Joan of Arc, using divine help to do battle not with the English, but with more modern dark forces. This divine help includes a revelation about Calabi-Yaus and the theory of everything. It’s a thriller, so I’ll avoid telling more about the plot so as not to spoil it.
I quite enjoyed reading the book even though I’m not much of a fan of thrillers, although a lot of enjoyment was due to the fact that much of the action takes place here in New York on the Upper West Side, and that the main plot revolves around the question of string theory and existence of a TOE. Edward Witten plays a role in the story.
If you like this one, you might also want to read some of Alpert’s other novels, a couple of which also involve themes of a TOE.
Most theorists have abandoned the search for a TOE, or the idea of explaining anything about the Standard Model, in favor of concentrating on hopes to find some sort of emergent theory of quantum gravity. For the latest on this, talks from the recent misleadingly titled Quantum Gravity in the Lab conference at Google might at some point be available. John Preskill’s slides are here. He indicates that the general idea is that quantum gravity will emerge from “Massive Entanglement, Quantum Chaos and Complexity.” This week the IAS will host a similar event, a workshop on Qubits and Spacetime. Wednesday evening many of the participants will be put on a bus to Manhattan, where they’ll continue with the 2019 meeting of the Simons Foundation-funded “It From Qubit” collaboration.
Also here in New York this week, Roger Penrose will be at Pioneer Works Friday night for a public program involving a conversation with Janna Levin. I have no idea whether his presence in New York at the same time as “It From Qubit” is a coincidence or not. If not, maybe the “It from Qubit” people will get back on the bus and head out to Red Hook Friday night.
Finally, for some multiverse-related book reviews that have the unusual feature of showing some skepticism, see John Horgan here, Matt Leifer here, also Chris Fuchs here. Fuchs explains the problem with multiple worlds as a solution to the measurement problem:
Its main shortcoming is simply this: The interpretation is completely contentless. I am not exaggerating or trying to be rhetorical. It is not that the interpretation is too hard to believe or too nonintuitive or too outlandish for physicists to handle the truth (remember the movie A Few Good Men?). It is just that the interpretation actually does not say anything whatsoever about reality. I say this despite all the fluff of the science-writing press and a few otherwise reputable physicists, like Sean Carroll, who seem to believe this vision of the world religiously.
Update: In case you haven’t been getting enough hype about the multiverse recently, Scientific American has Long Live the Multiverse! for you, from Tom Siegfried. Siegfried assures us that “multiverse advocates have been right historically”. He also assures SciAm readers that multiverse theories are testable, in a way similar to the way Einstein demonstrated the existence of atoms in 1905 using Brownian motion:
For that matter, it’s not necessarily true that other universes are in principle not observable. If another bubble collided with ours, telltale marks might appear in the cosmic background radiation left over from the big bang. Even without such direct evidence, their presence might be inferred by indirect means, just as Einstein demonstrated the existence of atoms in 1905 by analyzing the random motion of particles suspended in liquid.
He doesn’t mention that his analog of the Brownian motion experiment has been done: people have looked for the predicted indirect effects of other bubble universes on ours, and found nothing. To the extent that the multiverse is testable, it has been tested and found to not be there.
For presentations a couple days ago at the latest HEPAP meeting, see here. One piece of news, from this presentation, is that there likely will be a delay in the scheduled startup of the HL-LHC, with the next LHC run (Run 3) extended for an additional year (through 2024), and the next shutdown (LS3) extended by a half year. The HL-LHC would then start physics in 2028.
Most of the HEPAP discussions have to do with funding. The pattern of recent years has been one of huge decreases in funding proposed by the Trump administration. These are completely ignored by both Democrats and Republicans in Congress, which passes large increases in funding (then signed into law by Trump). For FY2020 this continues: at DOE the HEP budget for FY2019 was \$980 million, while the FY2020 White House budget request was a massive cut to \$768 million. This was taken no more seriously by anyone than the last few of these, with the FY2020 House Mark at \$1,045 million and the Senate Mark at \$1,065 million. The FY2020 budget remains to be finished and passed; in the meantime the federal government has been operating under a sequence of continuing resolutions.
Specifically on theory funding, JoAnne Hewett has a presentation on The State of Theory. It has no numbers in it, but the DOE numbers given here show an increase from \$60 million in FY2017 to \$90 million in FY2019 for Theoretical, Computational and Interdisciplinary Physics. But within this category, pure theoretical HEP is pretty flat, with big increases for Computational HEP and a huge new investment in Quantum Information Science (\$27.5 million in FY2019). There does seem to have been some sort of decision to de-prioritize conventional theoretical HEP in favor of newer trendy areas.
Hewett describes the general consensus on current problems with theory funding as
Universal concern on ever decreasing levels of funding for university groups: concern that university programs are dying.
Private institutions attempt to offset cuts with non-federal funding sources.
Cuts to program further accumulated in 2019. Many postdocs learned in May 2019 that their contracts would not be renewed for the fall. It was then too late to apply for new positions.
Lab theory programs are also losing researchers.
Even distribution of cuts across the U.S. theory program has a disproportionate effect on small programs.
Large fluctuations cycle-to-cycle are making groups less cohesive and more inclined to opt for “safer” research projects.
There is the perception that the recent emphasis on QIS comes at a cost to more traditional HEP theory research.
Summer salary has been capped or reduced to 1 month in many cases. Removal of summer salary across the board is demoralizing.
and ends with
The situation is becoming increasingly unstable.
University-based theory is suffering its most serious crisis in decades.
Its future is in jeopardy.
It would be interesting to see some numbers on the size of new private research funding going to HEP theory (for instance funding from the Simons Foundation or the private funding of the CMSA at Harvard). I don’t know of such numbers but I’m curious whether what is happening is that the total funding level has seen reasonable growth, but increases in funding are going to a small number of elite institutions, with the rest of the field in decline.
On the question of caps or reductions in summer salary, I doubt that any significant number of researchers respond to getting only one month of summer salary by taking on another job (e.g. teaching summer school) and abandoning research for the other two months of the summer. There has been another huge influx of money to the field that in some sense replaces grant-funded salary supplements: the multi-million dollar Breakthrough Prizes. A sizable number of HEP theorists have now shared in all or part of one of the \$3 million prizes. If you add in this money, HEP theorists on average may have seen significant increases in income, though almost all of it has gone to a small number of people (at the same elite institutions that are doing well). What we're seeing may just be the same trend as in the rest of the US economy: a move to a star system, with ever larger increases in inequality.
Another problem for the field of HEP theory may be that funding is stagnating because the DOE and NSF are skeptical about its intellectual health. Hewett notes that “Formal theory resides solely in university environment and has undergone significant funding cuts.” Trying to make the positive case for this part of the field, she lists three areas of advances, but oddly, the first two are identical. The two areas of advances in formal theory she describes are:
- Advances in strongly coupled quantum field theory (gravity/field theory duality, bootstrap program, amplitudes) have implications for particle physics, cosmology and beyond.
- Geometric advances in particle physics constructions from String/F-theory have implications for the "swampland program".
For the second of these, it’s quite possible that most physicists don’t see this as an advance at all.
Update: Physics World has more about the delay here. It is supposed to be announced on Tuesday. The cause is evidently a budget gap created by planned contributions from non-member countries that are no longer expected to materialize. The story doesn't explain which non-member countries are involved or why their contributions fell through.
The great German artist Anselm Kiefer now has a show up in London at the White Cube Bermondsey gallery, with a review in the Guardian entitled Terrifying Odyssey Through a Cursed World. The review describes some of the works as follows:
Another room is given over to panoramic blasts of brown and black that map sweeping vistas of desolate fields. A road twines through a morass of mud and collaged sticks. Lines of fence poles vanish in the distance. These scenes are drawn in black on a vertiginous scale. Kiefer uses perspective, the Renaissance technique of showing the real world shrinking towards a single vanishing point, to define his landscapes – but the perspective view is a transparency on top of a muddy tumult of colour and texture, with real, 3D stuff stuck over that in turn. From the right distance, the picture of a landscape can be read clearly, like a painting by Van Gogh. Go closer and the picture dissolves in a mess of bulges and muck.
What’s the inspiration for these works (besides the Holocaust)?
These landscapes are entitled Superstrings, a reference to string theory, an influential idea in contemporary physics that seeks to unify quantum mechanics with Einstein’s relativity.
The gallery's own press release explains:
White Cube is pleased to present an exhibition of new work by Anselm Kiefer. The exhibition brings together many of the interests that have characterised Kiefer’s work for decades, including mythology, astronomy and history. Located across the entire Bermondsey space, it features a large-scale installation and paintings that draw on the scientific concept known as string theory.
The Guardian review continues:
The main gallery at White Cube Bermondsey is already pretty bleak in its featureless emptiness. Kiefer makes it work for him by heightening the chill, turning the White Cube into a morgue for Europe. Snow-covered landscapes with none of the cheer of Bruegel stretch away to infinity. They are marked with sticks as black as gravestones and nets that catch at nothing. Kiefer’s science reading clearly hasn’t cheered him up. The curvy grids of space-time become horrible wire traps in a devastated nowhere. We might be on the no-man’s land of the Ukraine border. Anyway, this place has got death in its hard black furrows.
This is where we come to string theory – the monolith of Kiefer’s new show. Though he admits that he doesn’t quite understand what string theory is, Kiefer professes complete fascination with the idea that there is a scientific equivalent to the allegorical Gordian Knot – an idea that he picked up after thirty years of subscribing to Spektrum, a German monthly science magazine…
String theory cannot be verified empirically. Rather, it is an attempt to provide an all-encompassing description of the universe. And that, says Kiefer, is just why it is beautiful. “I suppose it’s like painting,” he says. “You cannot prove if a painting is good or bad. That is the point of it – it is descriptive… and there’s something sublime in that.”
From the images available, the work does look quite amazing. Kiefer quite possibly has gotten to the very heart of superstring theory, seeing in it a dark, desolate and blasted mythology which “cannot be verified empirically.”
There’s an excellent new book out about Jim Simons and Renaissance Technologies, The Man Who Solved the Market, by Gregory Zuckerman. I recommend it enthusiastically to anyone interested in the story of how a geometer ended up being worth \$23 billion. Lots of other mathematicians and physicists have also been involved in this over the years.
I first heard about Simons and his investment operation when I was a postdoc at Stony Brook in the mid-eighties, and have heard bits and pieces of this story from various sources over the years, sometimes clearly distorted in the retelling. It’s very satisfying to finally get a reliable explanation of what Simons and those working with him have been up to all this time. For those with more interest than mine in the details of quant strategies, the book provides far and away the most information available about how Simons and RenTech have been making so much money so successfully. The author managed to get some degree of cooperation from Simons, and was thus able to get a lot of those involved with him to talk. As a result, while this isn’t an “authorized” biography, it’s written from a point of view rather sympathetic to Simons.
One question that keeps coming up in the book is that of motivation. Why did Simons abandon a highly successful career doing research mathematics in order to focus on making as much money as possible? Part of the answer is that, from the beginning, Simons always had one foot out of the research math world, playing poker and trading commodities even when he was a graduate student working with Chern at Berkeley. Later, while employed at the IDA in Princeton, he spent time working not just on government projects but on the mathematical analysis of stock market trading strategies. While I’ve often heard the story of how he was fired from IDA after publicly criticizing the Vietnam War, less well known is that a big problem was that he was quoted in Newsweek saying he planned to work on his own projects, not government ones, until the war was over.
Unfortunately, the book has very little to say about a question I’m fascinated by: what does Simons intend to do with the \$23 billion (and counting; the RenTech Medallion Fund, in which he has a large stake, continues to be an incredible money-making machine)? There’s very little in the book about his philanthropic activities, the most visible of which are at the Simons Foundation, which now has assets of nearly \$3 billion, with amounts on the order of \$300 million/year coming in as income and going out as research funding. I think that on the whole Simons has made excellent choices with the math and physics he has decided to fund, from the Simons Center for Geometry and Physics at Stony Brook to a wide array of programs funded by his foundation.
A question that keeps reappearing throughout the book is that of the social significance of RenTech. It’s a rather pure test case for the moral question about quant investing: would the world be better off without it? In the case of the main money-maker, their Medallion fund, it’s hard to argue that the short-term investment strategies they use provide important market liquidity. The fund is closed to outside investors, and makes money purely personally for those involved with RenTech, not for institutions like pension funds. So, the social impact of RenTech will come down to what Simons and a small number of other mathematicians, physicists and computer scientists decide to do with the trading profits (calculated by Zuckerman at over \$100 billion so far).
Simons himself has engaged in some impressive philanthropy, but one perhaps should weigh that against the effects of the money spent by Robert Mercer, the co-CEO he left the company to (Zuckerman discusses Mercer in detail). Mercer and his daughter have a lot of responsibility for some of the most destructive recent attacks on US democracy (e.g. Breitbart and the Cambridge Analytica 2016 election story). In the historical evaluation of whether the world would have been better off with or without RenTech, the fact that RenTech money may have been a determining factor in bringing Trump and those around him to power is going to weigh heavily on one side.