Penrose at The Portal

Since last summer Eric Weinstein has been running a podcast entitled The Portal, featuring a wide range of unusual and provocative discussions. A couple have had a physics theme, including one with Garrett Lisi back in December.

One that I found completely fascinating was a recent interview with Roger Penrose. Penrose of course is one of the great figures of theoretical physics, and someone whose work has not followed fashion but exhibited a striking degree of originality. He and his work have often been a topic of interest on this blog: for one example, see a review of his book Fashion, Faith and Fantasy.

Over the years I’ve spent a lot of time thinking about Penrose’s twistors, becoming more and more convinced that these provide just the radical new perspective on space-time geometry and quantization that is needed for further progress on fundamental theory. For a long time now, string theorists have been claiming that “space-time is doomed”, and the recent “it from qubit” bandwagon is also based on the idea that space-time needs to be replaced by something else, something deeply quantum mechanical. Twistors have played an important role in recent work on amplitudes; for more about this, a good source is a 2011 Arkani-Hamed talk at Penrose’s 80th birthday conference.

One of my own motivations for the conviction that twistors are part of what is needed is the “this math is just too beautiful not to be true” kind of argument that these days many disapprove of. There are many places one can read about twistors and the mathematics that underlies them. One that I can especially recommend is the book Twistor Geometry and Field Theory, by Ward and Wells. A one sentence summary of the fundamental idea would be

A point in space-time is a complex two-plane in complex four-dimensional (twistor) space, and this complex two-plane is the fiber of the spinor bundle at the point.

In more detail, the Grassmannian G(2,4) of complex two-planes in $\mathbf{C}^4$ is compactified and complexified Minkowski space, with the spinor bundle the tautological bundle. So, more fundamental than space-time is the twistor space T=$\mathbf{C}^4$. Choosing a Hermitian form $\Omega$ of signature (2,2) on this space, real compactified Minkowski space is the set of two-planes in T on which the form vanishes. The conformal group is then the group SU(2,2) of transformations of T preserving $\Omega$, and this setup is ideal for handling conformally-invariant theories. Instead of working directly with T, it is often convenient to mod out by the action of the complex scalars and work with $PT=\mathbf{CP}^3$. A point in complexified, compactified space-time is then a $\mathbf{CP}^1 \subset \mathbf{CP}^3$, with the real (compactified) Minkowski points corresponding to $\mathbf{CP}^1$s that lie in a five-dimensional hypersurface $PN \subset PT$ where $\Omega=0$.
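Concretely, in two-spinor notation (roughly following the conventions of Ward and Wells; signs and factors of $i$ vary between references), a twistor is a pair $Z=(\omega^A,\pi_{A'})$, and the two-plane in T corresponding to a point $x$ of complexified Minkowski space is cut out by the incidence relation

$$\omega^A = i\, x^{AA'}\pi_{A'}.$$

The Hermitian form can be taken to be

$$\Omega(Z) = \omega^A \bar{\pi}_A + \bar{\omega}^{A'}\pi_{A'},$$

which has signature (2,2) on $\mathbf{C}^4$ and vanishes on the two-plane of $x$ exactly when the matrix $x^{AA'}$ is Hermitian, i.e. when $x$ is a real Minkowski point.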

On the podcast, Penrose describes the motivation behind his discovery of twistors, and the history of exactly how this discovery came about. He was a visitor in 1963 at the University of Texas in Austin, with an office next door to Engelbert Schucking, who among other things had explained to him the importance in quantum theory of the positive/negative energy decomposition of the space of solutions to field equations. After the Kennedy assassination, he and others made a plan to get together with colleagues from Dallas, taking a trip to San Antonio and the coast. Penrose was being driven back from San Antonio to Austin by Istvan Ozsvath (father of Peter Ozsvath, ex-colleague here at Columbia), and it turned out that Istvan was not at all talkative. This gave Penrose time alone to think, and it was during this trip he had the crucial idea. For details of this, listen to what Penrose has to say starting at about 47 minutes before the end of the podcast. For a written version of the same story, see Penrose’s article Some Remarks on Twistor Theory, which was a contribution to a volume of essays in honor of Schucking.

Posted in Uncategorized | 16 Comments

This Week’s Hype

Sabine Hossenfelder already has this covered, but I wanted to add a few comments about this week’s hype, a new article in Quanta magazine by Philip Ball entitled Wormholes Reveal a Way to Manipulate Black Hole Information in the Lab (based on this paper). It’s the latest in a long tradition of bogus claims that studying relatively simple quantum systems is equivalent to studying string theory/quantum gravity. For an example from ten years ago, see here. The nonsensical idea back then (which got a lot of attention) was that somehow studying four qubits would “test string theory”.

A first comment would be that this is just profoundly depressing, because Ball is one of the best and most sensible science writers around (see my review of his excellent recent book on quantum mechanics) and Quanta magazine is about the best semi-popular science publication there is. If this article were appearing in any one of the well-known examples of publications that traffic in misleading sensationalism, it wouldn’t be surprising and would best be just ignored.

Hossenfelder has pointed out one problem with the whole idea (we don’t live in AdS space), but a more basic problem is the obvious one pointed out by one of the first commenters at Quanta:

In the end, if an experiment is performed based on standard quantum mechanics, and verifies standard quantum mechanics as expected, then it is irrelevant that this aspect of standard quantum mechanics might be analogous to a vaguely-formulated and incomplete speculative idea about spacetime emergence — nor can it provide any experimental support whatsoever for that idea.

I understand that, for science journalists hearing that a large group of well-known physicists from Google, Stanford, Caltech, Princeton, Maryland and Amsterdam has figured out how to study quantum gravity in the lab (by teleporting things from one place to another via traversable wormholes!!), it’s almost impossible to resist the idea that this is something worth writing about. Please try.

Update: Philip Ball responds here.

Update: More from Philip Ball (and, if it appears, a response from me) in the comment section of the Quanta article; one of the paper’s authors also comments here.

Update: Commenter Anonyrat points out that the Atlantic is republishing this piece, as A Tiny, Lab-Size Wormhole Could Shatter Our Sense of Reality: How scientists plan to set up two black holes and a wormhole on an ordinary tabletop.

Update: In the future, I hope to as much as possible outsource coverage of this kind of thing to the Quantum Bullshit Detector. Today, see for instance this.

Posted in This Week's Hype | 32 Comments

Why String Theory Is Both A Dream And A Nightmare (as well as a swamp…)

Ethan Siegel today has a new article at Starts With a Bang, entitled Why String Theory is Both a Dream and a Nightmare. For the nightmare part, he writes:

its predictions are all over the map, untestable in practice, and require an enormous set of assumptions that are unsupported by an iota of scientific evidence.

which I think just confuses the situation, one that could be much more accurately and simply described as “there are no predictions”. The fundamental reason for this is also rather simply stated: the supposed unified theory is a theory in ten space-time dimensions, and no one has figured out a way to use it to get a consistent, predictive model with four space-time dimensions. If you don’t believe this, try watching the talks going on in Santa Barbara this week, which feature, after 17 years of intense effort, complete confusion about whether it is possible to construct such models with the right sign of the cosmological constant.

Siegel gets a couple of things completely wrong, although this is not really his fault, given the high degree of complexity and mystification that surrounds the 35 years of failed efforts in this area. About SUSY he writes

For one, string theory doesn’t simply contain the Standard Model as its low-energy limit, but a gauge theory known as N=4 supersymmetric Yang-Mills theory. Typically, the supersymmetry you hear about involves superpartner particles for every particle in existence in the Standard Model, which is an example of an N=1 supersymmetry. String theory, even in the low-energy limit, demands a much greater degree of symmetry than even this, which means that a low-energy prediction of superpartners should arise. The fact that we have discovered exactly 0 supersymmetric particles, even at LHC energies, is an enormous disappointment for string theory.

As with everything else here, there’s no prediction from string theory about how many supersymmetries will exist. The special role of N=4 supersymmetric Yang-Mills theory has nothing to do with the problem of low energy SUSY; instead it occurs as the supposed dual to a very special 10d superstring background (AdS5 x S5). This is of interest for completely different reasons, one of which was the hope that it would provide a string theory dual to QCD, allowing the use of string theory not to do quantum gravity, but to do QCD computations. This has never worked, one main reason being that it can’t reproduce the asymptotic freedom property of QCD. Siegel tries to refer to this with

And when you look at the explicit predictions that have come out for the masses of the mesons that have been already discovered, by using lattice techniques, they differ from observations by amounts that would be a dealbreaker for any other theory.

including a table with the caption

The actual masses of a number of observed mesons and quantum states, at left, compared with a variety of predictions for those masses using lattice techniques in the context of string theory. The mismatch between observations and calculations is an enormous challenge for string theorists to account for.

He’s getting this from slide 31 of a talk by Jeff Harvey, but mixing various things up. The table has nothing to do with lattice calculations; those are relevant to the other part of the slide, which is about string theory predictions for pure (no fermions) QCD glueballs. These are not physical objects, hence the comparison to lattice computer simulations rather than to experiment. The table he gives is from here and is about real particles. The “predictions” are not made, as he claims, “using lattice techniques in the context of string theory.” There are no lattice techniques involved.

Normally Siegel does a good job of navigating complex technical subjects. The subject of string theory is now buried in a huge literature of tens of thousands of papers over forty years with all sorts of claims, many designed to obscure the fact that ideas haven’t worked out. It’s fitting that the name chosen for the kind of discussions going on at Santa Barbara this week is “The String Swampland”. String theory verily is now deep in a trackless swamp…

Posted in Swampland, Uncategorized | 6 Comments

Robert Hermann 1931-2020

I was sorry to hear today of the recent death of Robert Hermann, at the age of 88. While I unfortunately never got to meet him, his writing had a lot of influence on me, as it likely did for many others with an overlapping interest in mathematics and fundamental physics. Early in my undergraduate years during the mid-1970s I first ran across some of Hermann’s books in the library, and found them full of fascinating and deep insights into the relations between geometry and physics. Over the years I’ve often come back to them and learned something new about one or another topic. The main problem with his writings is just that there is so much there that it is hard to know where to start.

While the relations between Riemannian geometry and general relativity were well-understood from Einstein’s work at the beginning of the subject, the relations between geometry and Yang-Mills theory were not known to Yang, Mills or other physicists working on the subject during the 1950s and 1960s. The understanding of these relations is conventionally described as starting in 1975, with the BPST instanton solutions and Simons explaining to Yang at Stony Brook about fiber bundles (leading to the “Wu-Yang dictionary” paper). But if you look at Hermann’s 1970 volume Vector Bundles in Mathematical Physics, you’ll find that it contains an extensive treatment of Yang-Mills theory in terms of connections and curvature in a vector bundle. While I don’t know whether Hermann had written about the sort of topologically non-trivial gauge field configurations that got attention starting in 1975, he had by that point been writing in depth for a decade about the details of the relations between geometry and physics that were news to physicists in 1975.

Being ahead of your time and mainly writing expository books is unfortunately not necessarily good for a successful academic career. Looking through his writings this afternoon, I ran across a long section of this book from 1980, entitled “Reflections” (pages 1-82). I strongly recommend reading this for Hermann’s own take on his career and the problems faced by anyone trying to do what he was doing (the situation has not improved since then).

A general outline of his early career, drawn from that source, is:

1948-50: undergraduate in physics, University of Wisconsin.
1950-52: undergraduate in math, Brown University.
1952-53: Fulbright scholar in Amsterdam.
1953-56: graduate student in math, Princeton. Thesis advisor Don Spencer.
1956-59: instructor at Harvard (“Harvard hired me as an instructor in the mistaken belief that I must be a topologist since I came from Princeton”).
1953-59: “My real work from 1953-59 was studying Elie Cartan!”
1959-61: position at MIT Lincoln Lab, taught course at Berkeley in 1961.

Hermann ultimately ended up at Rutgers, which he left in 1973, because he was not able to teach courses there in his specialty, and felt he had too little time to conduct the research he wanted to work on. It appears he expected to get by with some mix of grant money and profits from running a small publishing operation (Math Sci Press, which mainly published his own books). The “Reflections” section of the book mentioned above also contains some of his correspondence with the NSF, expressing his frustration at his grant proposals being turned down. At the end of a letter from late 1977 (which was at the height of excitement in the physics community over applying ideas from geometry and topology to high energy physics) he writes in frustration:

However, when I look in the Physical Review today, all the subjects which people in your position so enthusiastically supported ten years ago are now dead as the Phlogiston theory – and good riddance – while the topics I was working on then are now everywhere dense. Does one get support from the NSF by being right or by being popular?

John Baez has written something here, and there’s an obituary notice here.

Update: I’ve been reading some more of the essays Hermann published in the “Reflections” section of this book. Especially recommended is the section on Mathematical Physics of this 1979 essay (pages 30-38). His evaluation of the situation of the time I think was extremely perceptive.

Update: For more about Hermann, see some of the comments at this old blog posting. Also, on the topic of his book reviews, see this enthusiastic review of the Flanders book Differential forms with applications to the physical sciences.

Update: For an interesting review covering many of Hermann’s books, at the Bulletin of the AMS in 1973, see here.

Posted in Obituaries | 11 Comments

Various

  • A few months ago I ended up doing a little history of science research, trying to track down the details of the story of the Physical Review’s 1973 policy discouraging articles on “Foundations”. The results of that research are in this posting, where I found this explanation from the Physical Review editor (Goudsmit) of the problem they were trying to deal with:

    The event [referring to a difficult refereeing problem] shows again clearly the necessity of rapid rejections of questionable papers in vague borderline areas. There is a class of long theoretical papers which deal with problems of interpretation of quantum and relativistic phenomena. Most of them are terribly boring and belong to the category of which Pauli said, “It is not even wrong”. Many of them are wrong. A few of the wrong ones turn out to be valuable and interesting because they throw a brighter light on the correct understanding of the problem. I have earlier expressed my strong opinion that most of these papers don’t belong in the Physical Review but in journals specializing in the philosophy and fundamental concepts of physics.

    I had heard that people studying foundations of quantum mechanics, frustrated by this policy, had started up during the 1970s their own samizdat publication, called “Epistemological Letters”. I tried to see if there was any way to read the articles that appeared in that form, but it looked like the only way to do this would be to go visit one or two archives that might have some copies. Unbeknownst to me, around the same time Notre Dame University had just finished a project of scanning all issues of Epistemological Letters and putting them online. They are now available here, with an article about them here and an introductory essay here.

  • There’s an interesting essay on the arXiv about the current state of BSM physics, by HEP theorist Goran Senjanović, entitled Natural Philosophy versus Philosophy of Naturalness.
  • Here’s an article about problems string theorist Amer Iqbal has been having in Pakistan.
  • The New York Times has an article about Cedric Villani and his campaign for mayor of Paris. The election is next month, and I’m having a hard time figuring out why Villani is running. There doesn’t seem to be a lot of difference in policy views between the current mayor (Hidalgo) and the Macronistas (Griveaux and Villani), with the main effect of Villani entering the race being a split of the Macron party vote.
  • I was sorry to hear recently about the death of mathematician Louis Nirenberg. Kenneth Chang at the New York Times has written an excellent obituary. Terry Tao has some comments here.

Update: Excellent rant on Twitter from Philip Ball about misrepresentations of the Copenhagen interpretation. For your own rants, please engage in them on Twitter rather than here.

Posted in Uncategorized | 20 Comments

London Calling with Career Opportunities II

If you’re a mathematician, you don’t need to go work for Dominic Cummings in order to have dramatically improved career opportunities in the UK. The British government has just announced a huge increase in funding for mathematical research: 60 million pounds/year (about \$80 million) for the next five years (see here and here). To get some idea of the scale of this, note that the US GDP is about 8 times the UK’s and the NSF DMS budget is about \$240 million/year. So the comparable scale of this funding in the US would be about two and a half times the NSF budget for mathematics.
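The scaling above is just back-of-envelope arithmetic; a sketch using the round figures quoted here (the exchange rate is an assumption, roughly 1.33 dollars/pound at the time) goes:

```python
# Back-of-envelope scaling of the new UK math funding to a US-equivalent figure.
# All inputs are the round numbers quoted in the post; the exchange rate is assumed.
uk_funding_gbp = 60e6        # new UK funding, pounds/year
usd_per_gbp = 1.33           # assumed exchange rate, dollars per pound
uk_funding_usd = uk_funding_gbp * usd_per_gbp   # ~ $80 million/year

gdp_ratio = 8                # US GDP is roughly 8 times the UK's
nsf_dms_usd = 240e6          # NSF DMS budget, dollars/year

# Scale the UK figure by relative GDP, then compare to the NSF math budget
us_equivalent = uk_funding_usd * gdp_ratio      # ~ $640 million/year
multiple_of_nsf = us_equivalent / nsf_dms_usd
print(round(multiple_of_nsf, 2))                # ~ 2.66, i.e. "two and a half times"
```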

Many of my mathematician colleagues have sometimes seemed to me to be of the opinion that a huge increase in funding for math research is the best way to improve a society. We’ll see if this works for Britain.

While the new UK government ran on a nativist platform of restricting immigration, with the goal of keeping outsiders from taking bread out of the mouths of UK citizens, this doesn’t apply to mathematicians: all limits are off and we’re encouraged to flood the country. The law will be changed on Friday, with the changes going into effect Feb. 20. These will include an “accelerated path to settlement”, no need to even have a job offer, and all your “dependents [will] have full access to the labour market”; apparently no worries there about taking bread out of the mouths of the locals.

Update: More here (except it’s mostly behind a paywall, but evidently Ivan Fesenko is quoted).

Posted in Uncategorized | 25 Comments

This and That

  • I was sorry to hear of the death a few months ago of Tony Smith, who had been a frequent commenter on this blog and others. Unfortunately my interactions with him mainly involved trying to discourage him from hijacking the discussion in some other (often unusual) direction. Geoffrey Dixon did get to know him well, and has written a wonderful long blog entry about Tony, which I highly recommend (Dixon’s newish blog also has other things you might find interesting).
  • On the Jim Simons front, the Simons Foundation has put together something to celebrate their 25th anniversary. It explains a lot about their history and what they are doing now, as well as giving some indication of plans for the future. On these topics, read pieces written by Jim Simons and Marilyn Simons. The Foundation has been in a high growth mode, having an increasingly large impact on math and physics research. Their main statement about the future is that the plan is for this to go on for a very long time:

    According to its bylaws, the Simons Foundation is intended to focus almost entirely on research in mathematics and science and to exist in perpetuity. If future leadership abides by these guiding principles, Marilyn and I believe the foundation will forever be a force for good in our society.

    My impression is that the Simons children have their own interests, and foundations with other goals to run.

    News today from RenTech, the \$75 billion source of the money, is that Simons is increasingly turning over control of the firm to his son Nathaniel, who has been named co-chairman. New directors have also been added to the board: four senior Renaissance executives and Simons’s son-in-law Mark Heising.

  • There are various IAS-related videos you might want to take a look at:

    Pierre Deligne explaining motives last night.

    Michael Douglas on the use of computers in mathematics.

    A Dutch documentary (not all of it is in Dutch…).

  • If you aren’t regularly reading Scott Aaronson’s blog, you really should be. Latest entries are a detailed report from Davos and a guest post with a compelling argument about a major factor behind the problem of why women leave STEM careers more than men.
  • For the latest on the “It from Qubit” business, see talks at a KITP conference. John Preskill notes “lingering confusion over what it all means”, which makes me glad to hear that I’m not the only one…
Posted in Uncategorized | 7 Comments

Why the foundations of physics have not progressed for 40 years

Sabine Hossenfelder has a new piece out, making many of the same arguments she has been making for a while about the state of fundamental theory in physics. These have a lot in common with arguments that Lee Smolin and I were making in our books published back in 2006. The underlying problem is that the way theorists successfully worked up until the seventies is no longer viable, with the Standard Model working too well, up to the highest energies probed:

The major cause of this stagnation is that physics has changed, but physicists have not changed their methods. As physics has progressed, the foundations have become increasingly harder to probe by experiment. Technological advances have not kept size and expenses manageable. This is why, in physics today, we have collaborations of thousands of people operating machines that cost billions of dollars.

With fewer experiments, serendipitous discoveries become increasingly unlikely. And lacking those discoveries, the technological progress that would be needed to keep experiments economically viable never materializes. It’s a vicious cycle: Costly experiments result in lack of progress. Lack of progress increases the costs of further experiment. This cycle must eventually lead into a dead end when experiments become simply too expensive to remain affordable. A $40 billion particle collider is such a dead end.

I have a somewhat different view about a potential next collider (see here), but agree that the basic question is whether it will be “too expensive to remain affordable.”

What has happened over the last forty years is that the way HEP theory is done has become dysfunctional, in a way that Hossenfelder characterizes as follows:

Instead of examining the way that they propose hypotheses and revising their methods, theoretical physicists have developed a habit of putting forward entirely baseless speculations. Over and over again I have heard them justifying their mindless production of mathematical fiction as “healthy speculation” – entirely ignoring that this type of speculation has demonstrably not worked for decades and continues to not work. There is nothing healthy about this. It’s sick science. And, embarrassingly enough, that’s plain to see for everyone who does not work in the field.

This behavior is based on the hopelessly naïve, not to mention ill-informed, belief that science always progresses somehow, and that sooner or later certainly someone will stumble over something interesting. But even if that happened – even if someone found a piece of the puzzle – at this point we wouldn’t notice, because today any drop of genuine theoretical progress would drown in an ocean of “healthy speculation”…

Why don’t physicists have a hard look at their history and learn from their failure? Because the existing scientific system does not encourage learning. Physicists today can happily make career by writing papers about things no one has ever observed, and never will observe. This continues to go on because there is nothing and no one that can stop it.

This story brings up a lot of complex issues in the philosophy and sociology of science, but to me there’s one aspect of the problem that is relatively simple and deserves a lot more attention than it gets: how do you get theorists to abandon failed ideas and move on to try something else?

The negative LHC results about SUSY have had some effect, but even in this case it’s remarkable how many theorists won’t abandon the failed idea of a SUSY extension of the Standard Model. This was always a highly dubious idea, explaining nothing about the Standard Model and adding a huge number of new degrees of freedom and more than a hundred new undetermined parameters. Not seeing anything at the LHC should have put the final nail in the coffin of that idea. Instead, I see that this past fall MIT was still training its graduate students with a course on Supersymmetric Quantum Field Theories. You can try and argue that SUSY and supergravity theories are worth studying even if they have nothing to do with physics at observable energies, but it is a fact that these are extremely complicated QFTs to work with and have explained nothing. Why encourage grad students to devote the many, many hours it takes to understand the details of this subject, instead of encouraging them to learn about something that hasn’t been a huge failure?

The techniques one gets trained in as a graduate student tend to form the basis of one’s understanding of a subject and have a huge influence on one’s future career and the questions one has the expertise needed to work on. Besides SUSY, string theory has been the other major course topic at many institutions, with the best US grad students often spending large amounts of time trying to absorb the material in Polchinski’s two-volume textbook, even though the motivations for this have turned out to also be a huge failure, arguably the largest one in the history of theoretical physics.

To get some idea of what is going on, I took a look at the current and recent course offerings (on BSM theory, not including cosmology) at the five leading (if you believe US News) US HEP theory departments. I may very well be missing some offered courses, but the following gives some insight into what leading US departments are teaching their theory students. Comparing to past years might be interesting; possibly there’s a trend towards abandoning the whole area in favor of other topics (e.g. cosmology, quantum information, condensed matter).

The places not offering string theory courses this year seem to have had them last year.

Update: Something relevant and worth reading that I think I missed when it came out: Jeremy Butterfield’s detailed review of Lost in Math, which has a lot about the question of why theorists are “stuck”.

Update: There’s some serious discussion of this on Twitter. For those who can stand that format, try looking here and here.

Update: Mark Goodsell has a blog posting about all this here, including a defense of teaching the usual SUSY story to graduate students.

Update: A correspondent pointed me to this recent CERN Courier interview with John Ellis. Ellis maintains his increasingly implausible defense of SUSY, but he’s well aware that times have now changed:

People are certainly exploring new theoretical avenues, which is very healthy and, in a way, there is much more freedom for young theorists today than there might have been in the past. Personally, I would be rather reluctant at this time to propose to a PhD student a thesis that was based solely on SUSY – the people who are hiring are quite likely to want them to be not just working on SUSY and maybe even not working on SUSY at all. I would regard that as a bit unfair, but there are always fashions in theoretical physics.

Posted in Uncategorized | 48 Comments

Musings on the Current Status of HEP

To start the new decade there’s an article very much worth reading by Misha Shifman, entitled Musings on the Current State of HEP. It’s somewhat of an update of something he wrote back in 2012, which I wrote about here. He starts off with:

Now, seven years later, I will risk to offer my musings on the same subject. The seven years that have elapsed since [1] brought new perspectives: the tendencies which were rather foggy at that time became pronounced. My humble musings do not pretend to be more than they are: just a personal opinion of a theoretical physicist… For obvious reasons I will focus mostly on HEP, making a few marginal remarks on related areas. I would say that the most important message we have received is the absence of dramatic or surprising new results. In HEP no significant experimental findings were reported, old ideas concerning Beyond the Standard Model (BSM) physics hit dead-ends one after another and were not replaced by novel ideas. Hopes for key discoveries at the LHC (such as superpartners) which I mentioned in 2012 are fading away. Some may even say that these hopes are already dead. Low energy supersymmetry is ruled out, and gone with it is the concept of naturalness, a basic principle which theorists cherished and followed for decades. Nothing has replaced it so far…

HEP, “my” branch of theoretical physics since the beginning of my career, seems to be shrinking. A change of priorities in HEP in the near future is likely as business as usual is not sustainable. The current time is formative.

I encourage you to take a look at the rest; there’s a lot more detailed discussion of the state of HEP and allied fields, especially about the central role of quantum field theory.

Shifman also includes a section very critical of Richard Dawid, the “non-empirical confirmation” business and talks given at the “Why Trust a Theory?” conference (discussed here):

With all due respect I strongly disagree with Richard Dawid and all supporting speakers at the conference and beyond… I object against applying the term “non-empirically confirmed” to science (the more so, the term “postempiric science”). Of course, we live in liberal times and everybody is entitled to study and discuss whatever he or she wants. But the word science is already taken. Sorry, colleagues. For “postempiric science,” please, use another word, for instance, iScience, xScience, or something else.

As for David Gross’s attempt to claim that string theory is, like quantum mechanics and quantum field theory, not testable just because it is a framework, not a theory, Shifman is having none of it:

David Gross is a great theoretical physicist, whose discovery of asymptotic freedom made him immortal, but I respectfully disagree with him. Framework or not, both QM and QFT have absolutely solid confirmations in all their aspects in thousands of experiments.

As for the once popular idea that string theory could provide a “theory of everything”, he writes:

Well… it never happened and – I will risk to say – never will.

Posted in Uncategorized | 13 Comments

London Calling with Career Opportunities

At some point within the past couple years I noticed that one blog that had Not Even Wrong on its blogroll was the blog of Dominic Cummings, who was often getting credited with masterminding the political campaign that got the British to vote (narrowly) for Brexit in 2016. Cummings has had further success recently as Chief Special Adviser to British Prime Minister Boris Johnson, with a blow-out election victory three weeks ago putting him securely in control of the British state.

Today on his blog Cummings has, invoking Grothendieck, posted a job advertisement: ‘Two hands are a lot’ — we’re hiring data scientists, project managers, policy experts, assorted weirdos…. He’s looking for mathematicians, physicists and others to join him to change British society, working

in the intersection of:

  • the selection, education and training of people for high performance
  • the frontiers of the science of prediction
  • data science, AI and cognitive technologies (e.g Seeing Rooms, ‘authoring tools designed for arguing from evidence’, Tetlock/IARPA prediction tournaments that could easily be extended to consider ‘clusters’ of issues around themes like Brexit to improve policy and project management)
  • communication (e.g Cialdini)
  • decision-making institutions at the apex of government.

For some other descriptions of who Cummings would like to hire, on the economics side there’s:

The ideal candidate might, for example, have a degree in maths and economics, worked at the LHC in one summer, worked with a quant fund another summer, and written software for a YC startup in a third summer!

We’ve found one of these but want at least one more.

He also wants “Super-talented weirdos”, with examples given from William Gibson novels, such as “that Chinese-Cuban free runner from a crime family hired by the KGB.”

The remarkable thing to me about this long document is what it doesn’t contain. In particular I see nothing at all about any specific policy goals. Usually a new government would recruit people by appealing to their desire to make the world a better place in some specific way, but there’s nothing about that here. The goal is to control the government and what the British population believes, but to what end?

In addition, a more conventional hiring process would be asking for candidates of high ethical values, with some devotion to telling the truth. Cummings seems to be asking for exactly the opposite: best if your background is “from a crime family hired by the KGB.”

Best wishes to my British readers, now joining the US and other nations in a new dystopic post-truth era. It’s massively depressing to me to see how this has worked out here; I hope you do better. Maybe you should be sending in your applications to Cummings and hoping to sign up for a role in the new power structure. If so, tell him “Not Even Wrong” sent you…

Update: For more on Cummings, there’s a good Financial Times article.

Posted in Uncategorized | 30 Comments