Philip Anderson has a piece in the latest Nature entitled Thinking Big. It’s about the interpretation of quantum mechanics, and in it he claims that Fritz London was the first one to really have the right idea about the problem. Commenting on the Bohr-Einstein debates on the subject, Anderson says “In reading about these debates I have the sensation of being a small boy who spots not one, but two undressed emperors.” Instead of the Bohr or Einstein positions, Anderson promotes a point of view he attributes to London, who wrote a paper about it in 1939 with Edmond Bauer. He says “Taking London’s point of view, one immediately begins to realize that the real problem of quantum measurement is not in understanding the simple electron being measured, but the large and complicated apparatus used to measure it” and that “The message is that what is needed is an understanding of the macroscopic world in terms of quantum mechanics.”
I take Anderson’s point to be that the classical physics of a measuring apparatus is an “emergent phenomenon”, and understanding this is the real problem of interpreting quantum mechanics. He ends with his favorite slogan: “more is different!”.
Interesting that your paraphrase of Anderson’s comments is exactly the position of Einstein and exactly the opposite of Bohr’s position. Specifically, Einstein demanded that the previously known physics on the large scale emerge deductively (at some appropriate level of rigor) from the new microscopic theory. Bohr wanted to retain classical objects (measurement devices) as irreducible elements of the new theory. As I understand it, this, and not any question of indeterminism, was at the heart of Einstein’s criticism of Copenhagen QM.
Not on topic and may fairly be erased: thought NEW readers might like to look at the Freeman Dyson article on Feynman
occasioned by the publication of a collection of Feynman’s letters.
Dyson sketches his own ideas of Feynman’s style of thought, approach to physics, accomplishments, character.
Free download—not restricted to paying subscribers of the NYRB.
Just added that Dyson article to a new posting containing a list of assorted interesting links.
Anderson (like Robert Laughlin) seems to be following in Chomsky’s footsteps. For thirty years he has been peddling emergence (like Chomsky’s deep structure), yet I still have no concrete idea what he is actually talking about. My understanding is that what he means is something like saying “a computer can look at the waveform of speech, but it does not *understand* speech, i.e. there is something more to speech than just the waveform”, to which one can reply
* the theological position: yes there is some sort of magic fairy dust that animates speech and macro-physics
* the Dawkins position: there is no fairy dust, the waveform is all there is, you just lack the appropriate way (so far) to extract info from the waveform
Anderson and his followers irritate me because they seem to speak in the theological mode, but when pushed revert back to the Dawkins mode. If you accept Dawkins, then WTF is the big deal? Yes, we don’t have all the appropriate tools yet, but no one, not even Weinberg or any of the other Anderson demons, claims otherwise, so what is their beef? Yes, they want more money for solid state research, I get that, but this mystical mumbo-jumbo is not the way to get there.
Maynard — I agree with you about the slipperiness of this “emergence” concept as used by Anderson and Laughlin. The latter’s popular book does a very poor job of spelling out the ideology, and his papers aren’t much better.
It’s a shame because the emergent quantum gravity idea (the geometry of space-time as a mathematical convenience for describing the physical effects of a condensed matter ether) is intriguing and fully consistent with your “Dawkins” position.
The “appropriate way” to extract information from the speech waveform is the emergent property.
To expand on what Arun said, the problem of interpreting (extracting information from) a speech waveform is very much like interpreting a data stream coming over a wire. If one knows nothing about network protocols and encoding methods one will have a hell of a hard time getting beyond a crude physical and statistical description of the signal. (Ditto for interpreting signaling in the nervous system.)
What is emergent, in the sense that biological complexity is emergent, is the scheme for speech generation and processing that our biological and cultural evolution has worked out over several million years. At the physical level what is being produced is still just some aperiodic acoustic signal (often supplemented by hand gestures and facial expressions). We Homo sapiens have a grasp of this scheme the way a batter has grasped a solution to the problem of connecting his bat with a baseball coming at him at 80+ mph. Objectifying this grasp of the problem, and implementing it in a computing device, is inherently difficult; nobody has supplied, or will supply, us with the relevant technical specs or theoretical background.
Reinforcing the first comment (posted by Eric Dennis): John Stachel asserted 15 years ago, in his contribution to Conceptual Problems of Quantum Gravity (Ashtekar and Stachel, editors), that Einstein’s position on the indeterminism of quantum theory has been widely misunderstood. He said that Einstein objected to the fact that the theory offers no explanation for the extent of its statistical description. If the physical world is not (in effect) a deterministic machine, then why shouldn’t it be so totally chaotic as to utterly defeat any attempt to rationally investigate it?
Fortunately for us (and for that matter, all living things) such radical chaos does not reign, but Einstein felt that we still don’t understand this as well as we can and should; quantum theory is not the last word on the subject.
(By the way, in the last few months Stachel has produced some interesting papers on foundational questions in quantum gravity.)