The Ottawa Citizen today has an Op-Ed by string cosmologist Jim Cline, headlined The Big Idea That Won’t Die, with the subtitle “The fact that string theory is suddenly under attack only underscores its success as a path to a unified description of nature.” There’s a lot that is outrageous about this piece, beginning with the subtitle. Normally scientists don’t start going on about the success of their theories until they have some experimental evidence for them.

Most outrageous are Cline’s claims that Smolin’s book and mine are written in a “defamatory style”, and are “slandering” string theory. Since he gives no evidence for either of these claims, there’s not much to say about them except that they’re defamatory and slanderous.

Cline makes the standard claim that string theory should be accepted since it has legitimately triumphed in the marketplace of ideas, while clearly being rather upset about the success that critics of string theory have recently been having in this same marketplace. Somehow overhyping string theory is a legitimate marketplace activity, while pointing out its problems is not.

He makes many of the by now standard bogus claims about supposed predictions and tests of string theory. At some point I suppose I should write a FAQ about these, since the string theory hype machine keeps promoting these things in a less than honest way to a public that is not well-equipped to see through the hype. Here’s a pretty complete list of the bogus “predictions”:

**String theory predicts supersymmetry and extra dimensions. The LHC will test these predictions.**

The problem is that there is no prediction of either the scale of supersymmetry breaking or the size of the extra dimensions; in string theory these could be anything. All we know is that the energy scales involved are at least a TeV or so, since otherwise we’d have seen these phenomena already. There’s no reason at all to expect the extra-dimension scale to be observable at the LHC; even most string theorists think this is highly unlikely. There is a standard argument that the hierarchy problem could be explained by a low supersymmetry breaking scale, but this is already starting to be in conflict with the lack of any observed effects of supersymmetry in precision electroweak measurements, and string theorists now seem very willing to say that supersymmetry may be broken at an unobservably high scale.

**String theory predicts observable effects in the CMB or gravitational waves.**

If you look into this, you find it is based on very specific cosmological scenarios such as brane inflation, and again string theory doesn’t even tell you what the energy scale of the supposed predictions is. Undoubtedly you can get “predictions” from specific models, once one chooses various parameters, but not observing these “predicted” effects would not show that string theory is wrong, just that a specific scenario is wrong, with many other possible ones still viable. There’s a new review article by Henry Tye where he claims that “string theory is confronting data and making predictions”, which isn’t true. He has only certain specific scenarios in mind, and he admits that other, equally plausible, scenarios (such as using not branes but moduli fields as the inflaton) make no predictions at all. For more about this, one can watch recent talks by Tye and Polchinski at the KITP.

**The anthropic landscape predicts the value of the cosmological constant and will make other predictions.**

The latest contribution to the anthropic landscape hype is from Raphael Bousso and is entitled Precision Cosmology and the Landscape. I’ve written many times about the problems with the cosmological constant “prediction”. Bousso claims that “there is every reason to hope that a set of 10^500 vacua will yield to statistical reasoning, allowing us to extract predictions”. He doesn’t give any justification at all for this, neglecting to mention arguments about the inherent computational intractability of this question, and the failure of the program to predict the answer to the one question that seemed most likely to be approachable: is the supersymmetry breaking scale low or high?

**String theory makes predictions testable at RHIC.**

There are lots of problems with this, but the main one is that the “string theory” involved is a different one than the one that is supposed to unify particle physics and quantum gravity.

**Update**: For more promotional material about string theory, you can buy a set of lectures by Jim Gates entitled Superstring Theory: The DNA of Reality. I haven’t seen the videos, but Gates is probably not indulging in the kind of claims about “predictions” of string theory being made by many others.

**Update**: A couple of people have pointed out that a new paper has appeared showing that the one “prediction” of the landscape claimed by Susskind, that of the sign of the spatial curvature, isn’t sustainable. This issue was discussed here with Steve Hsu, who was blogging from a conference where Susskind made that claim, and wrote about it in more detail here. Hsu is one of the co-authors of the new paper.

**Update**: There’s a rather critical review of Lee Smolin’s book in this week’s Science magazine by Aaron Pierce entitled Teach the Controversy! Somehow I suspect Pierce did not write the headline, since he doesn’t seem to think much of opposition to string theory or that it is a good idea to encourage any dissent about it. In his review, he pretty much completely ignores the fact that string theory is supposed to be a unified theory, explaining the standard model as well as quantum gravity, discussing just the question of string theory as a theory of quantum gravity. This is rather odd since quantum gravity isn’t even Pierce’s specialty. I’m somewhat curious what he might think of my book, which is pretty much all about string theory’s failure as an idea about particle theory.

Cline correctly writes that (tenured) theorists can choose to work on the ideas that look more appealing. Maybe he could also explain why some theorists work on brane collisions: the idea of putting two perfectly parallel branes looks to me so childish (sorry, I cannot find a more polite word) that I never got interested in its problems in getting inflation and in getting scale-invariant perturbations.

A FAQ is an excellent idea.

Uh oh. Looks like another string theory “prediction” bites the dust.

http://arxiv.org/abs/hep-th/0610231

hack: interesting work, but you are too drastic; it just means that strings can now make statistical predictions for the sign of the curvature. (Just to be sure: this is a joke).

There is another interesting paper today: hep-th/0610231.

Yeah, I’d second the call for an FAQ. It’s a bit frustrating trying to bring this issue up with some people unfamiliar with the subject, and when they ask for more information or clarification, the best I can do is point them to this blog or tell them to read the book. So yeah, having it all in one place would certainly make it easier to introduce people to the current points of contention than expecting them to wade through hundreds of articles over the past year. (Not to mention it might force some other people to try and think up some fresh arguments for a change.)

Great idea.

M,

that is, indeed, witty.

Peter,

FAQ is a very good idea. Maybe it is helpful to invoke simple parables to get the points across.

I agree with the FAQ idea.

I only wonder why Peter should have to waste his time doing this, when the string theorists themselves should be exercising the required level of scientific scrutiny and mentioning both the pros and cons of their ideas.

If the FAQ could get permission to reprint Feynman’s “Cargo Cult Science” essay (the antithesis of hype, in my opinion), the layperson could have a first-rate understanding of what it means to be honest, as a scientist, in one’s scientific claims.

Jim Cline’s writeup is a reasonable defense of ST. The good part is he didn’t propose to redefine science in order to accommodate ST. He suggested a number of predictions to be tested by experiments. But he should know that suggestions are very different from writing them up in a scientific paper, in collaboration with experimentalists, to specify meaningful prediction parameters. No such papers have appeared with regard to any experiments confirming or falsifying any claim of ST. The way ST has developed, no such papers will likely ever appear. Is Jim prepared to do another writeup proclaiming ST to have entered the realm of metaphysical religion?

Lubos Motl is back in the amazon.com reviews, thinly disguised as Non-String Physics Grad Student, complete with a series of fake endorsements of his review of Lee’s book. He certainly has time on his hands. Interesting that he classifies himself as a graduate student.

relativist,

It doesn’t sound to me like Lubos, who is not the only person with a very hostile attitude toward Smolin’s book (and mine). The review reads to me exactly like what it purports to be, the work of a student who knows little or nothing about string theory.

I may be mistaken, but I don’t recall Lubos showing knowledge of decoherence in QM…but perhaps it was off topic so just never came up.

It is quite true, as the grad student alludes to, that there is a body of research and experiment indicating that QM is now understood (contrary to the famous Bohr dictum), using decoherence, and Smolin did not mention this in TTWP.

I concur that the grad student review seems legitimate.

One point in this proposed FAQ that could be helpful would be to clarify the relationship between theories and models. I sometimes have trouble tracking the claims here–is the theory non-testable or incoherent and hence not even wrong, or is it falsified? It seems that some of the confusion here comes from mixing together claims about theories versus specific models within those theories.

Is it common in theoretical physics to insist on the testability of a theory, or is it OK, if less desirable, to be able to test specific models (or classes of models) within that theory? I bring this up because a theoretical framework or approach may be a fecund source of inspiration (“the context of discovery,” as the philosophers say) for specific testable models, without itself being testable in toto. So is the claim: a) Individual models made of stringy components may be testable (and may have been falsified), but because there are zillions of such models you can’t experimentally rule all of them out, b) No individual stringy model has ever been produced which is testable, or c) No testable individual stringy model CAN ever be produced? Each of these claims is distinct and has distinct implications.

In particular, if a) is the claim, then it might be reasonable for a theorist to say “String-type theory is the best idea I have for how to construct testable models that might explain quantum gravity (or whatever). I can’t prove that some other starting point might not be a better place to start looking, but this is where I want to spend my time.” If enough theorists agree with this point of view, it will look like string hegemony, but in this hypothetical case it would just be convergent individual judgments about research prospects.

On the other hand, if c) were the claim, sticking with strings would suggest serious problems with the field. And of course an unwillingness to consder alternatives seriously would also hinder progress.

I think of science as working like the stock market–in the short run it is a voting machine, but in the long run it is a weighing machine. My sense is that the critics of string theory believe that the voting period is extending well beyond the point where weighing should have determined the outcome. The string theorists don’t agree, and the discussion is really about when to give up and why. I would find this discussion clearer if the theory/model distinction were more apparent.

To LDM: Decoherence is of course a real phenomena, and is mentioned briefly on p 252 of TTWP. While there are claims in the literature that it by itself solves the measurement problem this has, in the view of many experts, been shown wrong, indeed quite a long time ago, for example in papers of Abner Shimony. Decoherence does play a role in some formulations of quantum theory that address the measurement problem, for example the consistent histories approach. But I believe that I correctly characterized the situation in Chapter 1, which is that there is no consensus among experts as to whether any of the proposed interpretations or reformulations of quantum mechanics so far proposed are completely satisfactory. You can find arguments in the literature pro and con many different viewpoints and approaches. Had I represented one of them as a consensus view I would not have been accurately representing the field.

Thanks,

Lee

For whatever it’s worth, I agree with Lee on this point. Decoherence makes it a pain in the neck to figure out what on earth is going on with the quantum to classical transition, but it doesn’t actually solve anything.

Steve,

a) is true, and it is the big problem: it means that whatever we observe, among the 10^500 string models, 10^300 or so should be able to “explain” it.

Presumably string theorists will try for a few years to study whether the situation is really so bad. Unless some good surprise is found, anybody who wants to spend his time on this approach should put his 10^300 models in a big bag and move to a philosophy department.

Lee,

Yes, decoherence is even in the index in TTWP — I should have checked instead of trusting my memory. Sorry.

Also, thank you for the Abner Shimony reference. The book I am reading, “Decoherence and the Appearance of a Classical World in Quantum Theory”, has 51 pages of references at the back, but I am unable to find Shimony mentioned at all. However, if you have a reference to the particular paper of Shimony that you are referring to, would you mind posting it? Thanks.

a): If in fact a) is correct, then it’s hard to argue that string theory is some sort of plague. It’s just an urn into which some people are reaching hoping to draw out a black ball (testable and correct model) instead of a red ball (untestable or falsified model). Other people get to draw from other urns. My concern would be if the community somehow had different standards for what is black or red depending on which urn the ball came out of.

The absurd argument that the fact that idea X is criticized underscores its success is a bizarre form of scientific Stalinism. In its original form, J. V. Stalin emphasized that the resistance of the capitalist class would increase as the march toward the communist utopia came closer and closer to its perceived goal, and it was the major pretext for all the purges of the 1930s and 40s, with the tragic loss of millions of lives. Now we hear the same moronic nonsense invoked in defense of string theory!? I suppose the author of that nonsensical piece of propaganda also “predicted” that the criticism of stringy nonsense will mount and increase in ferocity as time goes on, offering even more “confirmation” of its successes, and so on, ad infinitum and ad nauseam.

Dear Peter,

I have been reading your blog for a bit and like it a lot. Regarding RHIC (which is my field), it is certainly true that the connection between the AdS/CFT correspondence and the actual experiment is very vague and wildly exaggerated. However, the qualitative features of the kinetics of strongly coupled plasmas predicted by the correspondence are so wildly different from their perturbative counterparts that if the QCD plasma were to behave anything like the AdS super-plasma, there surely must be observable consequences at RHIC or the LHC. Hopefully someday we’ll find a nice crisp observable which can probe these qualitative differences.

Interesting point, Derek. Do you have any handy references for AdS/CFT plasmas? I am actually working on this myself now.

I would agree that the graduate student review is legitimate. I’ve had discussions with Lubos Motl on quantum mechanics which make me think that his views on decoherence and his objections to the book would be quite different than this reviewer. In fact, I’m quite sympathetic to the graduate student on this. While I don’t think the quantum-to-classical transition is a solved problem yet, I expect that it will be solved through a more complete analysis of decoherence, unlike Smolin’s other four problems, which are likely to need new physics for their solution.

On the other hand, since Penrose and ‘t Hooft side with Smolin on this one, I am nowhere near presumptuous enough to claim that any of them show a deep misunderstanding of physics.

Peter Shor

I don’t think that decoherence can explain the process of factualization of potentiality, I think this remains outside present QM. What it can do is find an excuse which is sufficient for all practical purposes.

Hi Peter, just wondering if you’d seen this from a week ago.

http://online.itp.ucsb.edu/online/resident/johnson2/

Mike,

There’s a long discussion of it two posts back, the one on “The String Wars”.

Of all the criticism of ST I find that of Phil Anderson the most penetrating.

Of course I have tremendous admiration for Peter’s courage, and I did not trust my eyes when several years ago I saw his critique on the lanl server. Lee’s criticism had this important sociological component from the very beginning; the risk in his case was more that of becoming an enemy of his former colleagues without desiring that role (as the director of an institute he did not suffer a direct career threat).

Not that there were no other critical minds; among my QFT colleagues I do not know anyone who did not think of ST as freakish. But they have interesting projects and do not want to be sucked into controversies (especially since most of them have ST colleagues and, like Glashow at Harvard, could not prevent the course of metaphoric physics). I myself belong to that class of cowards who only open their mouths after retirement.

But in the present situation in which we find ourselves, the criticism of Phil Anderson is the most fitting one. The state of hubris and prepotence resulting from the undeserved luck which led to the discovery of the standard model is certainly the powerful combustion force which has brought us into the present mess. It would be very difficult for anybody else to say this, because it involves some of his Nobel colleagues and a Fields Medal winner, but he can do it. It is a deep shame (and will certainly be an issue for historians of science) that the particle physics community was unable to keep its house in order.

Of course neither Phil Anderson nor all those other critical minds have any illusions; there is nothing which can stop that collective madness, it just has to play out. This is not different from similar phenomena in the political realm; in fact some proponents of ST are at the same time apologists (or at least were) for the destruction of Iraq in the name of regime change and a new order in the Near East (despite the many voices which predicted the present mess). Given human nature, in both cases it has to play out to the bitter end. But it testifies to the power of human self-reflection to have articles such as that by Phil Anderson.

The ‘Non-String Physics Grad Student’ review is gone from Amazon. It would be interesting to know if he has re-thought his views and so deleted the review himself, or whether Amazon removed it.

Re: factualization of potentiality. Of the people who believe that quantum amplitudes and wavefunctions represent information about the physical world (rather than representing physical states of affairs directly), nobody thinks there is a measurement problem, or anything unusual about collapse. As Heisenberg said, the world doesn’t change discontinuously, but our information about it does.

On the other hand, of those who believe the opposite, namely that the quantum amplitudes and wavefunctions represent the actual state of affairs, everybody agrees that there is a problem which needs to be solved. Decoherence was presented as a candidate solution but most agree that it doesn’t really help.

Over the course of the twentieth century, the number of physicists in the former camp has dwindled and the number in the latter camp has grown to include almost everyone. I asked Penrose, after a talk, about why he thought this had happened, and if he knew of any proof or argument implying that the view of the former camp was wrong (since he’s firmly in the second camp along with almost everyone else). He said that he didn’t know of any such proof but that the wavefunction “seems objective”.

(This is probably not the right place, but perhaps Peter permits me to come to a closure with my previous somewhat sketchy remark)

Decoherence solves the problem for all practical purposes, but being within QM it cannot explain the closure of this process, namely the irreversible factualization which happens at the moment of registration (which may be done by a machine).

Bert Schroer wrote:

(…) the irreversible factualization which happens at the moment of registration (which may be done by a machine).

Could you please elaborate? If Peter Woit thinks this is not the right place to continue the discussion, allow me to announce that I will be accepting guest posts to commemorate the 1 Year Anniversary of Christine’s Background Independence (refer to my last post over at my blog for instructions).

Best wishes,

Christine

Christine,

I think Peter will not mind a very short answer, since it comes with a citation of a lovely personal account by a giant of QFT, the creator of AQFT, Rudolf Haag (which, I am sure, Peter will also read with great interest)

http://br.arxiv.org/abs/hep-th/0001006

there is also a whole section on factualization, i.e. the conversion of potentialities (e.g. the process of decoherence) into events. Since irreversibility is outside QT, there is as yet no theory about it.

you will like it

cordially

Bert


Dear Bert Schroer,

Thanks very much!

Christine

After that short excursion on factualization and the problem of reality from the perspective of QT, let me return to the main topic of this section: hype.

All my colleagues from QFT who know the framework of particle physics (say beyond the Peskin–Schroeder level) agree with me that holography from d+1 to d dimensions is nothing else than changing the spatial encoding of a specified algebraic substrate, which was given in the natural spacetime labeling of a d+1-dimensional spacetime, to the labeling which associates naturally with its horizon (in the highly symmetric case of AdS one takes a brane at infinity). Of course physics depends not only on the substrate but also on its spacetime organization. It is a bit like stem cells, which by enzymes can be forced to organize in different ways (organs).

Localization in QFT, independent of whether it can be physically realized (as in the case of black holes) or just thought about (localization in Minkowski spacetime à la Unruh), leads to thermal manifestations which are caused by vacuum polarization near the causal (or event) horizon. The Hawking effect is fully accounted for by quantum matter in a Schwarzschild spacetime. Any state which extends from the outside into the black hole will lead to Hawking radiation at the Hawking temperature (in particular the state which is invariant under the Killing motion). Nowhere in the existing derivation do gravitons or QG enter.

Since the state is thermal, it also has an associated entropy. To compute that localization entropy, I have developed a formalism of holography on null-surfaces (which led to extended chiral QFT) which explains why the entropy follows an area law. This area behavior is totally generic and has nothing to do with Bekenstein. In fact one obtains a one-parameter family of entropies depending on the chosen thickness epsilon of the vacuum polarization “atmosphere”. This family corresponds to the family of boxed Gibbs systems which one introduces to define the thermodynamic limit.
Of course one can use Bekenstein’s classical formula and equate it with this microscopically computed entropy to determine epsilon (I have not done this, but there can be no doubt that at this point the Planck length enters). The calculations are in two papers (the first one is published in CQG and the second has been submitted there):

http://br.arxiv.org/abs/hep-th/0507038

http://br.arxiv.org/abs/hep-th/0511291

The reason I did not do the calculation in curved space time directly for the Schwarzschild model is very simple: by doing it for double cones in Minkowski spacetime I have a chance to slip through the prejudice of a possible string referee who would immediately reject a calculation of black hole entropy because there is no use of QG but who does not mind to do some (in his mind irrelevant) calculation on the thermal manifestation of double cone localization in Minkowski spacetime.

The important application of holography to the AdS/CFT correspondence and its connection with ST hype will be explained tomorrow in a separate comment.

Addition

I forgot to highlight the most important lesson: the localization entropy which I compute is not a counting entropy. Counting entropies (QM-like level counting in a global QFT) contradict the principle of local covariance (mentioned in earlier comments), and the same holds for the energy caused by localization through vacuum polarization. This is also the reason why no expert on QFT in CST (e.g. Hollands and Wald) accepts the Weinberg kind of counting estimates for the CC (those which led to those absurd values).

The mysterious connection of holography with QG came through ‘t Hooft, and it is interesting to note that Susskind in one of his first papers had the right intuition that holography has something to do with the lightfront. Since he was very impatient, he never got beyond the old lightcone quantization, which is contradictory and has to be replaced by the much more subtle lightfront holography. Had he been more patient and careful, he would not presently find himself stuck in that metaphoric landscape trap from which he will never get away.

Here is the continuation:

As the null-surface holography is a change in the spacetime encoding of an algebraic substrate which is given in bulk form (explained in my previous comment), so is the AdS/CFT correspondence. In that case the change of the spacetime ordering device is more gentle and has a unique inverse (without necessitating additional assumptions). This has been elaborated with great clarity by Rehren in a series of papers, including the demonstration that it agrees with Witten’s version of a prescribed conformal source in a functional integral representation (to the extent that the artistic functional integral approach can be used for proofs). I do not know any competent quantum field theorist who does not accept Rehren’s work as the correct formulation of AdS/CFT holography (Hollands, Wald, Brunetti, Fredenhagen, Verch, Buchholz, …).

On the other hand the Maldacena conjecture alleges to address the AdS/CFT correspondence, but it burdens the AdS side with that vague ‘t Hooft idea of holography as a consequence of QG (in Maldacena’s case the QG attributed to ST), which it is not capable of carrying.

As argued above, this is not what holography can deliver, but since the computation is so vague (involving not only perturbation theory but also additional limiting assumptions), and there is no clear structural expectation on the AdS side, the ST people think that their computational massaging has supported whatever their conjecture is. One should say in addition that even their assumption that their SUYM theory is conformal has never been shown (it is not even clear that the beta-function vanishes in all orders, and in addition the N→infinity limit is not really a QFT). We are seeing in front of us precisely the situation which Phil Anderson describes: the unmerited success of the SM has created an era of hubris and self-delusion fueled by two Nobel laureates and one Fields Medal winner (though, if I may add, even Nobel laureates would not be able to do this without the foot soldiers being ready for a new-age physics). I would bet a case of good wine with any reader of this blog that the Maldacena conjecture will never be proved; it will just fade away as all conjectures coming from ST do.

In a weird attempt to maintain the Maldacena construct, Distler and Motl have ignited a conceptual stink bomb (using the truism that in applying theorems one has to check the prerequisites), accusing Rehren of something which they, and not Rehren, committed: burdening holography with the requirement of producing a quantum gravity Fata Morgana on the AdS side.

The metaphoric approach to particle physics will certainly continue (on Friday there was a paper where the conjecture is used as the basis for meta-metaphoric excursions into the blue yonder). There is nothing which can stop it; as with similar things in the political realm, the hubris has to play out all the way.

Smolin’s post is full of misleading statements. For starters, he claims to mention decoherence in his book.

“Decoherence is of course a real phenomena, and is mentioned briefly on p 252 of TTWP.”

This is completely misleading. He mentions decoherence in a context completely unrelated to quantum mechanics and measurements, and without any explanation whatsoever. The term decoherence appears as an offhand remark about particles propagating through spin networks. Sorry, but merely mentioning a word in a book just so that you can point to it in your index doesn’t count.

Then, just to make clear to everyone why he doesn’t understand quantum mechanics, Smolin says:

“While there are claims in the literature that it by itself solves the measurement problem this has, in the view of many experts, been shown wrong, indeed quite a long time ago, for example in papers of Abner Shimony. Decoherence does play a role in some formulations of quantum theory that address the measurement problem, for example the consistent histories approach. But I believe that I correctly characterized the situation in Chapter 1, which is that there is no consensus among experts as to whether any of the proposed interpretations or reformulations of quantum mechanics so far proposed are completely satisfactory. You can find arguments in the literature pro and con many different viewpoints and approaches. Had I represented one of them as a consensus view I would not have been accurately representing the field.”

First of all, the papers he is referring to by Abner Shimony (from 1974) are deeply flawed and completely miss the point of decoherence. They prove a theorem that there will generally be cross terms in the final density operator for an experimental apparatus after a measurement takes place, and thus the experimental apparatus isn’t a mixture of definite eigenstates corresponding to definite results.

Well, DUH! That completely misses the point of decoherence. In decoherence, there will certainly be cross terms. The point is that all cross terms have coefficients that go like ~e^(-t/tau) for some small time constant tau, and thus go to zero exponentially fast. The cross terms aren’t literally zero, but in an extremely rapid time interval (for an object like a table sitting in a room filled with air molecules, at a time scale of order 10^-43 seconds, which is essentially instantaneous, since time scales that small don’t really make sense anyway) the cross terms become completely undetectable by any other observer, directly or indirectly, and thus are really gone. If no other observer could ever even in principle distinguish between a density operator with such infinitesimal cross terms and a density operator without them, then you are pretty crazy to demand that they exist in any physical sense any more than invisible pink elephants.
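The exponential suppression of the cross terms can be illustrated with a toy numerical sketch (this is not from the comment above; the qubit state, the pure-damping model, and the value of tau are all illustrative assumptions): off-diagonal density-matrix elements in a fixed pointer basis are multiplied by e^(-t/tau), so after a few multiples of tau they are negligible while the diagonal populations survive untouched.

```python
import numpy as np

def decohered_density_matrix(rho0, t, tau):
    """Toy decoherence model in a fixed (pointer) basis:
    off-diagonal elements are damped by exp(-t/tau),
    diagonal populations are left unchanged."""
    damp = np.exp(-t / tau)
    rho = rho0.copy()
    n = rho.shape[0]
    for i in range(n):
        for j in range(n):
            if i != j:
                rho[i, j] *= damp
    return rho

# Equal superposition |+><+| for a single qubit
rho0 = np.array([[0.5, 0.5],
                 [0.5, 0.5]], dtype=complex)

tau = 1e-3  # illustrative decoherence time (arbitrary units)
for t in [0.0, 1e-3, 1e-2]:
    rho = decohered_density_matrix(rho0, t, tau)
    # Purity Tr(rho^2) drops from 1 toward 1/2 as coherence is lost
    purity = np.real(np.trace(rho @ rho))
    print(f"t={t:g}  |off-diag|={abs(rho[0, 1]):.2e}  purity={purity:.3f}")
```

By t = 10 tau the off-diagonal terms have shrunk by e^(-10), which makes the point in the comment concrete: they are never exactly zero, but they are far below anything another observer could detect.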

Look, there are certainly issues when quantum mechanics is applied to the universe as a whole, to the big bang, etc. Quantum mechanics may need to be revised, extended, or replaced by a deeper, more general framework. But there are simply no remaining issues with quantum mechanics as applied to “every-day systems”. And it’s time physicists stop telling the public otherwise.

And no, I’m not Lubos.

anonymous

I cannot speak for Smolin, but I agree with you (see my contribution here before Christine’s): decoherence leads to a loss of phases for all practical purposes and hence solves the measurement problem FAPP. However, the event of observation (blackening of a photoplate, click in a counter) is irreversible (factualization), while QM (and decoherence is part of QM) is reversible.

I really wish people would take discussion of the foundations of QM elsewhere, it’s off-topic, and definitely not something I want to moderate a discussion of here.

Anonymous,

I don’t think you’re Lubos, but you share some really obnoxious behavior with him (and you are at Harvard, what the hell has gotten into the drinking water there?). Your claim that Smolin “makes clear to everyone why he doesn’t understand quantum mechanics” is just rude and idiotic. From my reading in this subject, the opinion that decoherence does not completely solve the interpretational problems of QM is a pretty conventional one among experts on this.

I don’t happen to share Smolin’s opinion that thinking about these foundational issues of QM is probably necessary to make progress on quantum gravity, but there are quite a few very prominent physicists who agree with him, and until the question of quantum gravity is settled it’s definitely a valid position.

If you want to argue with Smolin about QM, find somewhere else to do it. You might also consider whether anonymously insulting people is really an acceptable thing to be doing, no matter what the circumstances.

Assuming this doesn’t overtax Peter’s patience with the digression, it ought to be mentioned that one of the very prominent physicists who would presumably agree with Smolin’s opinion that foundational issues may be important for further progress in fundamental physics is the guy who first came up with the very idea of decoherence way back in 1952 (Phys Rev). That guy would be David Bohm. And he certainly did not think the mechanism of decoherence by itself could resolve the foundational problems in QM.