Philip Ball’s Beyond Weird is the best popular survey I’ve seen of the contemporary state of discussions about the “interpretation” of quantum mechanics. It appeared earlier this year in a British edition (which I’ve just read), with the US edition scheduled to come out next month. Since it’s already out in Britain, there are several reviews you can take a look at; an insightful one is Natalie Wolchover’s at Nature.
The topic of the “weirdness” of quantum mechanics is one receiving a lot of attention these days, with two other books also appearing this year: Adam Becker’s What is Real? (which I wrote about here), and Anil Ananthaswamy’s Through Two Doors at Once. Lack of time, as well as not having much of interest to say about it, has kept me from writing about Through Two Doors at Once. It’s much more focused than the other two, giving close attention to the two-slit experiment and surprising variants of it that have actually been performed in recent years.
Part of what I very much liked about Beyond Weird is the way Ball avoids the usual ruts that books on this topic often end up in (the Becker book is one example). He resists the temptation to follow a historical treatment, something that is almost irresistible given the great story of the history of quantum mechanics. The early history of quantum mechanics and the struggles of Bohr, Einstein and Heisenberg to understand what it was saying make for a fascinating story, perhaps the most compelling in the history of physics, but it is one that has been well told many times in many places. Books that cover the later history have found it hard to resist the temptation of revisionism, caricaturing Bohr, Heisenberg and the dominant “Copenhagen interpretation” while instead making heroes of David Bohm, John Bell and Hugh Everett.
Ball has little to say about the personalities involved, but instead seriously engages with the central troublesome issues of the quantum mechanical picture of the world. The Copenhagen interpretation is given a fair treatment, as a warning about the limits one runs up against trying to reconcile the quantum mechanical and classical pictures of reality.
Instead of spending a lot of time in the rut of Bohmian mechanics, Ball dismisses it quickly:
But it is hard to see where the gain lies… Even Einstein, who was certainly keen to win back objective reality from quantum theory’s apparent denial of it, found Bohm’s idea ‘too cheap’.
Dynamical collapse models like GRW also get short shrift:
It’s a bodge, really: the researchers just figured out what kind of mathematical function was needed to do this job, and grafted it on… What’s more of a problem is that there is absolutely no evidence that such an effect exists.
As for the “Many-Worlds Interpretation”, which in recent years has been promoted in many popular books, Ball devotes a full chapter to it, not because he thinks it solves any problem, but because he thinks it’s a misleading and empty idea:
My own view is that the problems with the MWI are overwhelming – not because they show it must be wrong, but because they render it incoherent. It simply cannot be articulated meaningfully… The MWI is an exuberant attempt to rescue the ‘yes/no’, albeit at the cost of admitting both of them at once. This results in an inchoate view of macroscopic reality that suggests we really can’t make our macroscopic instincts the arbiter of the situation…
Where Copenhagen seems to keep insisting ‘no,no and no’, the MWI says ‘yes, yes and yes’. And in the end, if you say everything is true, you have said nothing.
There’s a lot of material about serious efforts to go beyond Copenhagen, by understanding the role that decoherence and the environment play in the emergence of classical phenomena out of the underlying quantum world. This discussion includes a good explanation of the work of Zurek and collaborators on this topic, including the concept of “Quantum Darwinism”.
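As a rough numerical sketch of the mechanism being described (my own toy example, not taken from the book): when a system in superposition becomes entangled with an environment, tracing out the environment wipes out the off-diagonal “coherence” terms of the system’s density matrix, leaving something that looks like a classical probabilistic mixture.

```python
import numpy as np

# Toy illustration of decoherence: loss of off-diagonal density-matrix
# terms after entanglement with an "environment" qubit.

# System qubit starts in the superposition (|0> + |1>)/sqrt(2)
psi_sys = np.array([1.0, 1.0]) / np.sqrt(2)

# Pure superposition: density matrix has off-diagonal (coherence) terms
rho_pure = np.outer(psi_sys, psi_sys.conj())

# Entangle with an environment qubit: (|00> + |11>)/sqrt(2),
# modeling the environment perfectly "recording" the system's state
joint = np.zeros(4)
joint[0] = 1 / np.sqrt(2)   # |00>
joint[3] = 1 / np.sqrt(2)   # |11>
rho_joint = np.outer(joint, joint.conj())

# Partial trace over the environment gives the system's reduced state
rho_joint = rho_joint.reshape(2, 2, 2, 2)       # axes: (sys, env, sys', env')
rho_reduced = np.einsum('ikjk->ij', rho_joint)  # sum over matching env indices

print(rho_pure)     # off-diagonals = 0.5: interference still possible
print(rho_reduced)  # off-diagonals = 0: a classical-looking 50/50 mixture
```

The reduced state is diagonal, i.e. indistinguishable (by measurements on the system alone) from a classical coin flip; this is the sense in which the environment makes classicality “emerge.”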
The last part of the book is up to date on what seem to be some currently popular ideas about the foundations of quantum mechanics. One aspect of this goes under the name “Quantum Reconstruction”, the attempt to derive the supposedly hodge-podge axioms of quantum theory from some more compelling fundamental ideas, hopefully the kind your grandmother can understand. These ideas are conjectured to somehow have to do with “information” and limits on it. I’m not sympathetic to these, since the axioms seem to me not “hodge-podge”, but connected to the deepest unifying ideas of modern mathematics. At the same time, I remain confused about what “information” is supposed to be and how these new foundations are supposed to work. And, as far as I’ve ever been able to tell, these are not things your grandmother is likely to understand, unless your grandmother is Scott Aaronson…
I’m pleased he discusses Griffiths’s Consistent Quantum Theory, which to me is the most satisfying interpretation. I especially like the idea that experiments actually measure what’s there prior to the measurement, and that wave functions don’t actually collapse.
Griffiths’s book is available at Amazon or free to download here: http://quantum.phys.cmu.edu/CQT/
I agree with you (and Einstein!) that Bohmian mechanics is “too cheap.” But it should also be mentioned that making Bohmian mechanics consistent with special relativity involves kludges that are unbelievably artificial.
The interest of Bohmian mechanics lies simply in the fact that it shows that something von Neumann thought he had proved to be impossible is in fact possible, even if it is not plausible. In short, Bohm widened the possibilities of thought about QM, even if his own theory is almost certainly not true (in his later writings, Bohm basically admitted that Bohmian mechanics was more a toy model than a likely candidate for a true theory).
Bohm’s model does seem to have been one of the motivations behind Bell’s work, and Bell seems to have been one of the founts for the work based on entanglement (quantum computing, quantum encryption, etc.) of recent decades.
On MWI, I think we should also mention that the technical names for the two big problems are the “preferred-basis problem” and the “probability-measure problem”: in ordinary English, why does the multi-world universe split into separate classical worlds in just the way it does, and how do you get probabilities out of the theory when in fact the probability that anything (i.e., everything!) happens is always 1.000? The MWI proponents have been wrestling with these problems for decades and have never come to a resolution that satisfies everyone; personally, I doubt they ever will.
There are certainly various hints that QM somehow has something to do with “information,” but I have never seen anyone spell out the details.
From all of which I conclude that still no one truly understands quantum mechanics.
The “derivations” of QM from information-theoretic principles, for example those of Hardy and Chiribella et al., are beautiful and surprising and interesting. They underscore just how little mathematical freedom you have in trying to generalize probability theory the way that QM does. But their wider meaning remains unclear.
Even if I *were* your grandmother, I can attest that I wouldn’t find it particularly obvious a priori that the “pure states” of a theory (meaning, those not nontrivially expressible as probabilistic mixtures of other states) should all be continuously transformable into each other, or that it should be possible to purify any mixed state in an essentially unique way by passing to a larger system. Indeed, these don’t strike me as statements that anyone would’ve even thought to formulate, had they not had the example of QM! I would’ve treated it as even less obvious that any compound mixed state must be recoverable from the statistics of local measurements on its components. The latter axiom needs to be invoked, in almost all informational derivations, to rule out the restriction of QM to real amplitudes only, and to explain why amplitudes are complex numbers.
If a 19th-century mathematician had wanted to invent QM from first principles, with zero guidance from experiment, it still seems to me that their best bet would’ve been to ask themselves, “what’s the best way to generalize the rules of probability to involve negative and complex numbers?”
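One hedged way to make that concrete (a toy Mach-Zehnder-style sketch of my own, not from the comment): in the quantum rule, per-path amplitudes add before squaring and can cancel; in the classical rule, per-path probabilities add and never can.

```python
import numpy as np

# Quantum vs. classical composition of "two paths to the same outcome".
# A balanced beamsplitter, modeled as a 2x2 unitary (the Hadamard matrix):
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# Quantum: two beamsplitters in a row.  Amplitudes through the two
# internal paths interfere, and the photon always exits port 0.
amp_out = H @ H @ np.array([1.0, 0.0])
prob_quantum = np.abs(amp_out) ** 2        # -> [1, 0]

# Classical analogue: treat each beamsplitter as a fair coin flip.
# Path probabilities add with no cancellation: 50/50 at the output.
P = np.abs(H) ** 2                         # stochastic matrix [[.5,.5],[.5,.5]]
prob_classical = P @ P @ np.array([1.0, 0.0])  # -> [0.5, 0.5]

print(prob_quantum, prob_classical)
```

The two rules differ only in whether you add amplitudes (which may be negative or complex) and then square, or add nonnegative probabilities directly; the destructive interference in the first case is exactly what the hypothetical 19th-century mathematician's "generalized probability" would have to accommodate.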
In any case, I see the informational derivations as trying to answer a question that’s almost orthogonal to the measurement problem that one cares about in interpretation of QM. Instead of asking what kind of “reality” (if any) the mathematical formalism is describing, we’re simply asking how one could’ve guessed, from general principles, that this particular mathematical formalism would be the right one. (I.e., we’re asking whether QM has a ‘derivation’ as compelling as Einstein’s 1905 derivation of the Lorentz transformations.)
There is a kind of derivation of principles of quantum physics by following the tao of stable homotopy theory, I discuss that in arXiv:1402.7041. This exhibits quantization as closely akin to the passage to “motives”, hence comes with some evidence that it is possible to arrive at this solely from thinking about first principles.
A key idea here is that, in view of the foundational role of homotopy theory, it is naive to assume that rings (as in: numbers) in fundamental physics are necessarily as taught in high-school, such as the rings of real or complex numbers; instead they may be “rings up to coherent homotopy”, technically known as “E-infinity ring spectra”. The above note considers a god-given concept of path-integral with phases not in the ring C, but in the E-infinity ring KU, and proves that this reproduces geometric quantization at least of compact Poisson manifolds.
One of your Becker book review updates had a quantum superposition effect on the text.
“collapse of the state vBeyond Weirdector with probabilities given by the Born rule”
From the description in the first link:
“We now realise that quantum mechanics is less a theory about particles and waves, uncertainty and fuzziness, than a theory about information: about what can be known and how.”
What is that supposed to mean? I’m asking in earnest. Have duality and uncertainty been abandoned recently? I don’t know if it refers only to the “last part of the book” but it sounds more generic.
> I’m not sympathetic to these, since the axioms seem to me not “hodge-podge”, but connected to the deepest unifying ideas of modern mathematics.
Where might one read more about this connection? (I’m a math student, so not too good with the physics)
Just a quick elaboration of the comment by Urs: the passage to “motives” mentioned there seems to be related to the fact that the category of spans (or correspondences) is a dagger-compact category. It turns out that quantum computation (or the finite parts of quantum mechanics anyway) can be formulated in terms of dagger-compact categories (à la Abramsky-Coecke).
Relevant nLab pages:
“Have duality and uncertainty been abandoned recently? I don’t know if it refers only to the ‘last part of the book’ but it sounds more generic.”
Uncertainty hasn’t been abandoned, but the book argues that the idea of particle-wave duality may not be as helpful as thinking about quantum mechanics as a theory of information.
I think a 19th-century mathematician’s best bet would’ve been to “view the algebra of random variables and their expectation values as the foundational concept” in a generalisation of probability theory – as Segal eventually did – so that it could accommodate their invention of QM.
Thanks for the comments! I don’t see any particular reason you should be able to “guess” QM from some general principles about how the world is supposed to work, but maybe it is useful to categorize the constraints imposed by various assumptions.
To explain that to your grandmother, she would have to be Grothendieck…
See my book
Please resist the temptation to use this comment section to discuss your favorite ideas about QM, and stick to comments of relevance to the Ball book.
I’m resisting temptation to go on about such ideas, but at least I can put them in another posting, which I may do soon…
Peter, I try never to forget your injunction to “remember that this is not a general physics discussion board, or a place for people to promote their favorite ideas about fundamental physics”, but it’s all too easy to forget during the chase that what seems to me like a reasonable response to Scott’s comment that ‘If a 19th-century mathematician had wanted to invent QM from first principles, with zero guidance from experiment, it still seems to me that their best bet would’ve been to ask themselves, “what’s the best way to generalize the rules of probability to involve negative and complex numbers?”’ might not seem so reasonable to you.
I’ll repeat the first paragraph of my bad comment: “I went to the trouble of obtaining the English edition because Philip Ball is visiting Yale’s YQI this Fall. I also enjoyed it and very much recommend it. That is, I found its sensibilities much more like my own than I’ve found of most recent popular books.”
A small remark related to the last paragraph about the so-called ‘quantum reconstruction’. It’s not about my favorite approach, but just a historical observation. When I read about these extensions of probability, the first thing I thought was that this had already been done 50 years ago, and pretty convincingly so. In the so-called ‘quantum logic’ approach, von Neumann, already in the 30s, introduced the non-Boolean space of projectors in a Hilbert space as the ‘event space’ for quantum probability, which replaces the standard Boolean one of classical probability (realized as the Borel algebra of subsets of phase space). Then in the 50s Mackey conjectured that the only possible quantum probability measures on such a lattice were the ones given by the standard Born rule via the usual trace and density matrix formula. This conjecture was then proved to be true by Gleason, in a landmark theorem.

Mackey also conjectured, building on some of von Neumann’s old ideas, that the necessity of the lattice of projectors in quantum physics could perhaps be derived as a consequence of imposing some restrictions on an abstract lattice, in particular orthomodularity. In the 60s, Piron proved that these restrictions imply that such an abstract lattice is necessarily isomorphic to the lattice of closed subspaces of some generalized Hilbert space. Then, finally, in the 90s, Solèr proved that this generalized Hilbert space can only be real, complex, or quaternionic. Recently, Moretti proved that Poincaré symmetry makes the real case collapse to the complex one. Thus, in a very venerable line of research which goes back to von Neumann himself, the standard textbook axioms of quantum mechanics have already been reconstructed from a generalization of probability (in particular, the lattice of events is allowed to be non-Boolean, which is why it’s called generalized). This is the standard approach and is the one discussed in most of the mathematical physics literature.
It also leads directly to the Topos reformulation of those axioms. Another reconstruction is the GNS construction in the C*-algebraic approach. So, having all those long-established results, which are mathematically rigorous and crystal clear in their assumptions, why we need another one based on the loose notion of ‘information’ is something I don’t see.
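As a minimal numerical illustration of the structure described above (my own sketch, not from the comment): the quantum probability measure on projectors that Gleason’s theorem characterizes is given by the trace formula p(P) = tr(ρP), which is additive over orthogonal projectors and normalized.

```python
import numpy as np

# Born-rule probability measure on projectors: p(P) = tr(rho P).

# A density matrix: an equal mixture of |0> and (|0> + |1>)/sqrt(2)
v0 = np.array([1.0, 0.0])
vp = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.5 * np.outer(v0, v0) + 0.5 * np.outer(vp, vp)

# An orthonormal basis defines mutually orthogonal projectors;
# here, the eigenvectors of the Pauli matrix sigma_x.
sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])
basis = np.linalg.eigh(sigma_x)[1]
projs = [np.outer(basis[:, i], basis[:, i]) for i in range(2)]

# The measure assigned by the trace formula
probs = [np.trace(rho @ P).real for P in projs]

print(probs, sum(probs))  # the two probabilities are nonnegative and sum to 1
```

Gleason’s theorem says this trace formula is the *only* way (in dimension at least 3) to assign probabilities to projectors additively over orthogonal families, which is why it pins down the Born rule.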
Absolutely; the relation to the dagger-structure is discussed in section 4.5.
Thank you for publicizing Philip Ball’s book and linking to Natalie Wolchover’s review of it. I look forward to receiving my copy from Amazon and learning why none of my pets have ended up in a superposition of dead and alive states 😉.