Some short items on a wide variety of topics:
- The Hawking/Perry/Strominger paper on a new idea about the black hole information paradox (see here for an early discussion) based on BMS supertranslation symmetries has now appeared on the arXiv. I’m no expert on the intricate arguments about this paradox, so I have no idea what implications this paper really has for it. However, it does seem to be a very interesting approach to quantum gravity questions (although the paper mostly deals with simpler gauge theory calculations). The ideas are squarely in the mainstream of what has been the most successful way of making progress in fundamental theory: identifying new implications of the symmetries at the center of our core theories (the standard model and GR). Such a new understanding looks like a far more promising way forward than much of what is currently popular in the subject.
- For an example of what is currently popular, the KITP is hosting a workshop this week of the It from Qubit Simons Collaboration, on Quantum Error Correction and Tensor Networks. I gather this is supposed to somehow explain AdS/CFT, but I’ve never understood how that is supposed to come about. Evidently I’m not the only one wondering about this. John Preskill reports that, in his talk leading off a series of lectures on the subject, Patrick Hayden commented that
I’m unsure what we are trying to learn from these tensor network models of holography.
- Tonight PBS will be showing the film Particle Fever, which I wrote about here. It’s a great film, highly recommended, despite the larding with comical nonsense about the multiverse (if you believe the theorists in the film, the multiverse is supposed to be tested by its prediction of a mass of 140 GeV for the Higgs). The capsule summary in the New York Times TV listing this morning for the film is “Scientists recreate conditions from the big-bang theory”. While the LHC has nothing to do with the big-bang theory, maybe this summary refers to the comedy of the theorists and another well-known TV show, in which case viewers may be a bit disappointed.
- In other LHC-related news, the AMVA4NewPhysics project now has a blog; the latest posting explains the basics of b-tagging.
- I’ve never been able to really make sense of many of the arguments about “Bayes’s Theorem”, and the recent attempts to justify string theory using this just seemed bizarre. John Horgan has a great explanation of what is going on here, including this take on the Bayes/string theory/multiverse business:
In many cases, estimating the prior is just guesswork, allowing subjective factors to creep into your calculations. You might be guessing the probability of something that–unlike cancer—does not even exist, such as strings, multiverses, inflation or God. You might then cite dubious evidence to support your dubious belief. In this way, Bayes’ theorem can promote pseudoscience and superstition as well as reason.
Embedded in Bayes’ theorem is a moral message: If you aren’t scrupulous in seeking alternative explanations for your evidence, the evidence will just confirm what you already believe. Scientists often fail to heed this dictum, which helps explain why so many scientific claims turn out to be erroneous. Bayesians claim that their methods can help scientists overcome confirmation bias and produce more reliable results, but I have my doubts.
And as I mentioned above, some string and multiverse enthusiasts are embracing Bayesian analysis. Why? Because the enthusiasts are tired of hearing that string and multiverse theories are unfalsifiable and hence unscientific, and Bayes’ theorem allows them to present the theories in a more favorable light. In this case, Bayes’ theorem, far from counteracting confirmation bias, enables it.
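Horgan’s point about guessed priors can be made concrete with a small numerical sketch (the probabilities below are made up purely for illustration, not drawn from any of the arguments discussed here): when the evidence barely discriminates between hypotheses, the posterior essentially reproduces whatever prior you put in.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' theorem: P(H|E) = P(E|H) P(H) / P(E),
    with P(E) expanded over H and not-H."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

# Weak evidence: E is almost as likely whether or not H is true,
# so the posterior tracks the prior almost exactly.
for prior in (0.01, 0.5, 0.99):
    print(f"prior={prior:.2f}  posterior={posterior(prior, 0.6, 0.5):.3f}")
```

With a likelihood ratio this close to 1, the three posteriors come out near 0.012, 0.545 and 0.992 respectively: the data have changed almost nothing, and the “conclusion” is just the prior echoed back.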
- The recent Munich conference trying to justify string theory by Bayesian methods wasn’t the only example of European funding for philosophers to weigh in on the latest in fundamental physics. Another just announced European LHC-related project is a 2.5 million Euro research unit aiming to investigate the LHC “from an integrated philosophical, historical and sociological perspective.”
- I just ran across a recent paper by Kristian Camilleri and Sophie Ritson on The role of heuristic appraisal in conflicting assessments of string theory. It is very good; unlike almost every other discussion of this topic, I think it gets right the central serious argument of the “string wars”: how does one evaluate the prospects for the string unification idea? There is no simple answer to this: you need to understand what the state of efforts to connect a hoped-for unified string theory to reality really is, how those efforts have evolved, and then make a sensible judgment about whether this is a failed idea or whether there is hope left. I highly recommend reading this for those who are not completely tired of the subject.
- In the same journal I noticed another quite good article, by Porter Williams on naturalness. He carefully explains the different incarnations of “naturalness” and I think comes to the right conclusion that it is best thought of as the idea that physical behavior at widely different distance scales should not be correlated. By the way, the name “naturalness” for this is a bit of marketing genius (how could “nature” not be “natural”?).
- In geometric representation theory news, the Simons Center is running a program on the topic this month, videos here. Here at Columbia, Roman Bezrukavnikov will be the Spring 2016 Eilenberg lecturer, with his topic “Geometric categorification in representation theory”. I believe talks will be on Thursdays at 2:40; watch the Columbia math department website for more news.
- Personally, I’m about to head out tomorrow night on vacation, so expect minimal blogging and possibly even a temporary shutoff of comments. When I get back, I’ll be teaching our spring semester graduate course on groups and representations, see here. I’m also trying to finish my book on quantum theory and representation theory. The current state (see here, comments always welcome) is that I’ve gone over and rewritten the first 34 chapters (except the introduction), and I plan on rewriting and adding material to the rest of the manuscript this semester. This better be done by this summer, partly because that’s when it is supposed to be delivered to Springer, partly because I’m already quite tired of this project and want to work on other things…
Update: Any mention of Bayesianism seems to attract a large number of people who want to discuss it, especially aspects that have nothing to do with the string theory/multiverse business. Please discuss this topic with John Horgan at his blog.
Update: Sabine Hossenfelder has more on the Hawking/Perry/Strominger paper here.
Update: Scientific American has an interesting interview with Strominger, who explains some of the ideas behind Hawking/Perry/Strominger. Jacques Distler has come out of retirement at Musings to object that this work violates two central ideological tenets: one should not pay attention to gauge invariance, and the answer to all questions should be string theory or AdS/CFT.
Could someone please educate a mathematician about how physicists get to say this in a paper – “Recently such an a priori reason for doubt has emerged from new discoveries about the infrared structure of quantum gravity in asymptotically flat spacetimes.” Infrared structure of quantum gravity? Quantum gravity? Did someone come up with a correct theory for quantum gravity while I was sleeping? Is the implication somehow that ANY theory of quantum gravity will have the exact same infrared structure in asymptotically flat spacetimes? Really?
The famous problems with quantum gravity are ultraviolet problems, so shouldn’t be relevant for the infrared structure. And the infrared structure should be largely determined by the fact that you want GR in the classical limit.
Those who want to argue about Bayesianism and Horgan’s views should do it at his site.
Any theory of quantum gravity should reproduce General Relativity in the far asymptotic region (“infrared”). The infrared structure you quote comes purely from studying the symmetries of asymptotically flat spacetimes in General Relativity.
So even if we don’t have a full quantum theory of gravity, _any_ such theory better reproduce this structure to be physically viable.
(original post: https://kartikprabhu.com/notes/re-quantum-gravity-infrared-structure )
Thanks for the answers, that makes sense. Now I’m only confused about “new discoveries” since they can’t mean new discoveries about GR, right? Or do they mean new implications about possible theories which have the right classical limit? Have there been limits found in that direction? Personally, I think Hawking was right in the first place about black hole information loss, and you just don’t actually have unitarity, but what do I know.
Is there a draft or something of the Porter Williams paper that is not behind the SD paywall? I looked it up on the arXiv, and failed. Yet from your description it seems a very interesting article to read, so…
Try the philosophy of science version of the arXiv
@JeffreyM for some context:
The infrared structure here refers to the symmetries of asymptotically flat spacetimes given by the BMS group. This has been known for a long time in classical GR. Some recent work by Strominger relates these symmetries to the so-called “soft theorems” in scattering calculations (involving gravitons) in perturbative QFT. In essence, what it says is that, since the BMS group is infinite-dimensional, there is an infinite-dimensional degeneracy in what one would like to call the “asymptotic ground state”.
How this shakes out for the information “paradox” remains to be seen.
(original post: https://kartikprabhu.com/notes/re-quantum-gravity-infrared-structure2 )
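For readers who want the schematic form (this is the standard Bondi-gauge presentation, not anything specific to this thread): a supertranslation shifts retarded time by an arbitrary function on the celestial sphere,

```latex
u \;\to\; u - f(z, \bar{z}), \qquad f \in C^{\infty}(S^2).
```

Ordinary spacetime translations correspond to taking f from the \(\ell = 0, 1\) spherical harmonics; allowing arbitrary f is what makes the group infinite-dimensional, which is the source of the degeneracy of asymptotic ground states mentioned above.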
I haven’t read it yet, but this is the first time that I’ve seen an editorial office include a boxed notice of this sort: “When citing this paper, please use the full journal title Studies in History and Philosophy of Modern Physics.”
Neither CASSI nor Crossref has an entry, and “Stud. Hist. Philos. Mod. Phys.” doesn’t appear to have much competition on the abbreviations front.
Pingback: Bayes’ theorem promotes "superstition"? | Uncommon Descent
Thanks for the link! 🙂
The Williams article is an excellent read, cutting straight through the naturalness ideology that has been dominating particle physics for too long, IMO. In a more general context, it is refreshing to see that QFT ideology is on the retreat — first a step back from fundamental QFT to EFT, then the serious criticism (even failure) of naturalness as a guiding principle. The next in line will be renormalization, once enough people gain enough awareness that gravity is nonrenormalizable (which puts a wrench into the gear of renormalizability as a guiding principle, so to say), and demote it to simply a useful technique of studying EFTs. Williams also hints in this direction in the article, but doesn’t want to stray too much from his main topic…
The era of QFT dominance is fading. With the latest results from the LHC (mainly the discovery of the Higgs and no evidence of SUSY), serious cracks are beginning to show in the QFT paradigm. This is important, because people really need to become disenchanted with QFT as a set of ideas that are universally valid. Once that happens, the younger generation of string theorists will gain the awareness and courage to step back and ask “wait, what are we really doing here when constructing ST?”, and start thinking outside the box.
The Williams article is one of the first, but extremely important steps in this direction.
… physical behavior at widely different distance scales should not be correlated…
If AdS/CFT relies on the asymptotic spacetime being AdS, but claims to describe string-scale physics, isn’t that a correlation of physical behavior at widely different distance scales?
Thanks in advance!
No, not really — AdS/CFT is a duality between two theories, rather than between two different scales of the same theory. For example, UV range of CFT maps into IR range of AdS, which by itself doesn’t tell you anything about the IR range of CFT. And vice versa.
An example that actually does violate naturalness would be the UV/IR mixing in noncommutative geometry models.
Porter Williams lists some work in progress:
including “The explanatory failures of the multiverse”.
Looking forward to the results of that research!
I am struggling to understand what a physical theory that is highly correlated at different scales would look like. Are we talking about a “fractal” theory here?
Any explanations that would inform intuitions on what these theories might look like from a big picture sense would be appreciated!
Simple examples would be free field theories or conformally invariant theories. In some sense what’s surprising is the “natural” behavior, that the behavior at long distances is so independent of how you define the theory at the cutoff scale.
In addition to what Peter said, it is interesting to imagine what nature would look like if the Standard Model were not as “natural” as it is. For example, imagine that all quark and lepton flavors/generations had substantially longer lifetimes. In such a scenario, despite the fact that the top quark is very heavy and contributes only on TeV scales, the low-energy behavior of the theory (say, chemistry) would be completely different (read: much richer) than it is.
So “naturalness” is a statement that for example chemistry has properties based on up/down quarks and the electron, regardless of what other particles may exist at higher energies. This behavior is, as Peter said, actually very surprising. In an “unnatural” SM chemistry would highly depend not only on these three particles, but on all possible particles that can exist in nature (and I mean *all*, up to Planck scale, not just the particles we have discovered so far). That’s what “mixing of the scales” would mean.
IMO, the word “naturalness” is a complete misnomer for the phenomenon.
Back in the late 1940s no “naturalness” was expected – by “common” opinion, QED should have died at 100 MeV or so.
One could argue that “naturalness” (or whatever else one might call it) is extremely important for the development of science. Given “mixing of the scales,” the progressive, staged development of physics and chemistry would be far more difficult, if not altogether impossible. (I’m reminded of the importance of “separation of concerns” in computer science and software engineering.)
Naturalness and symmetry are in some sense opposites. In quantum theory we expect parameters to renormalize to generic values, even if the bare parameters vanish, unless:
1) some symmetry principle forbids it.
2) the bare parameters are fine-tuned.
Thus symmetry and fine-tuning result in the same experimental signature: non-generic values for parameters. That the Higgs mass seems to sit at the only a priori non-generic value – the boundary of the stability region – is rather striking from this point of view.
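The fine-tuning in (2) can be made concrete with the textbook one-loop schematic for the Higgs mass (a standard estimate, not taken from this comment thread):

```latex
m_{H,\mathrm{phys}}^{2} \;=\; m_{H,\mathrm{bare}}^{2} \;+\; \frac{c\,\Lambda^{2}}{16\pi^{2}},
```

where \(\Lambda\) is the cutoff and c is an order-one combination of couplings (dominated by the top Yukawa contribution). If \(\Lambda\) is taken near the Planck scale, keeping \(m_H\) at 125 GeV requires the bare term to cancel the loop term to roughly 30 decimal places, absent a protecting symmetry – which is the sense in which a light Higgs sits at a non-generic parameter value.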
Schellekens posted an interesting paper (Big Numbers in String Theory, http://arxiv.org/pdf/1601.02462.pdf) with a funny joke about landscape scanning:
«We were hit directly by the Big Number problem because the computer program experienced some mysterious segmentation faults each time it was starting to explore a very large hidden sector. We finally discovered that this was due to a message stating something like “remaining time …. years; aborting”. The number of years, expressed as an unlimited size integer, was so large that it overflowed the text buffer allocated for it.»
lots of good bits in that paper
from the abstract:
“…I correct a huge but inconsequential error, …”
So Peter, have you heard anything more about the LIGO rumors that have been flying around like crazy over the last few days?
Another: “There are no negative results in science, just bad expectations.” 🙂
Loved that statement about the moral message in Bayes theorem. As an ex-particle physicist on Wall Street, I would say to my former colleagues that you have no idea how good you have it! Abuse of Bayes theorem and statistics is rampant in the business world to the point that correct usage is almost a cause for suspicion.
On the subject of statistical bias and LIGO there is a very nice book by Harry Collins called “Gravity’s Shadow”. He looks at statistical bias more from a philosophical point of view than a mathematical one. The book focuses on experimental bias in the search for gravitational waves. One very interesting story that comes out of this is how the LIGO collaboration eventually rejected a test signal injected into the data to simulate a gravitational wave.
“Jacques Distler has come out of retirement at Musings to object that this work violates two central ideological tenets: one should not pay attention to gauge invariance, and the answer to all questions should be string theory or AdS/CFT.”
You’re kidding, right? Just in case there are actually some smart young people reading this blog who can’t tell you’re kidding, you should clarify that you were kidding, that Distler acknowledged that gauge degrees of freedom become dynamical at boundaries, only it wasn’t clear why this is appropriate for horizons of decaying, non-eternal black holes. I mean, I get it- “don’t read that, Distler is a knee jerk ideologue” is your code for “read Distler’s blog, there’s a discussion thread of 25 or so posts by people who’ve actually published in this topic.” But not everyone’s hip to your satire, the way you criticize the internet’s rancorous politicization of physics by pretending to be so focused on the politics you’ve lost track of the physics- and I wouldn’t want the kids to miss out on a cool discussion.
When I wrote that, the substantive discussion thread over there hadn’t really started, I’m glad to see it did. I was being somewhat humorous, but certainly didn’t say “don’t read that”. If I want people to not read something I ignore it, don’t advertise and discuss it here.
As far as the youngsters being misled, I think the misleading thing would be to point them in Distler’s direction without warning them he’s a knee jerk ideologue and what the ideology is.