Michael Douglas has written up his talk at the recent Solvay conference, with the title Understanding the Landscape. For previous postings about this conference, see here, here, and here.

Douglas’s article is mainly a series of excuses for why string theory can’t predict anything. He begins with an historical analogy, comparing the present state of string theory to that of quantum mechanics in the period 1913-1926, and the Bohr model of the hydrogen atom to N=4 supersymmetric YM and AdS/CFT. This kind of analogy has some rather obvious problems: it took 13 years to get from the Bohr model to the complete theory of QM, whereas the idea of string unification has been around for about 30 years, with not the slightest sign of success. More importantly, the Bohr model gave a fairly accurate prediction of the spectrum of the hydrogen atom, so it was clear that there was something very right about it. On the other hand, N=4 supersymmetric YM doesn’t predict anything that corresponds to the real world. In a footnote, Douglas writes:

*A similar analogy was made by David Gross in talks given around 2000. However, to judge from his talk here, he now has serious reservations about it.*

but I’m not sure what reservations of Gross this refers to.
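To make the contrast concrete, the Bohr model’s success was quantitative: its energy levels reproduce the Rydberg formula, and with it the observed hydrogen lines. A minimal sketch of that standard textbook calculation (my illustration, not anything from Douglas’s paper):

```python
# Hydrogen spectral lines from the Bohr model, via the Rydberg formula:
# 1/lambda = R_H * (1/n1^2 - 1/n2^2)
R_H = 1.097373e7  # Rydberg constant for hydrogen, in 1/m

def bohr_wavelength_nm(n1: int, n2: int) -> float:
    """Wavelength (nm) of the photon emitted in the n2 -> n1 transition."""
    return 1e9 / (R_H * (1.0 / n1**2 - 1.0 / n2**2))

# Balmer series (n1 = 2): the visible hydrogen lines the Bohr model got right.
balmer = {n2: bohr_wavelength_nm(2, n2) for n2 in (3, 4, 5)}
# H-alpha ~ 656 nm, H-beta ~ 486 nm, H-gamma ~ 434 nm
```

These are within a fraction of a nanometer of the measured Balmer lines, which is why it was clear the model was onto something despite its ad hoc quantization rule.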

Douglas’s second analogy is a “chemical” analogy, basically pointing to the fact that deriving bulk properties of solids from the underlying many-particle Schrödinger equation is difficult if not impossible. Again, this isn’t a very good analogy. The problem isn’t that you have a simple theory whose ground state is hard to identify because of the number of degrees of freedom, like in condensed matter physics. The problem instead is that the ground state of the superstring doesn’t look like the real world. Douglas describes the current situation as:

*Perhaps all this is a nightmare from which we will awake, the history of Kekule’s dream being repeated as farce.*

He goes on to give the standard argument for the landscape, referring to it by the novel name of the *Weinberg-Banks-Abbott-Brown-Teitelboim-Bousso-Polchinski et al. solution to the cosmological constant problem*, but notes that:

*On further developing these analogies, one realizes that we do not know even the most basic organizing principles of the stringy landscape.*

and proceeds to briefly discuss two topics that he hopes might be related to this problem, though I don’t see any evidence for this. One topic is an abstract proposal for a metric on the “space of all CFTs”; the other is the classification of D-branes on a Calabi-Yau in terms of a derived category.

He goes on to discuss his recent work with Denef on computational complexity, which indicates that even if our universe corresponds to some local minimum in the landscape, there is no hope of ever identifying which one it is and actually computing anything about the real world. In his concluding section, he admits that there is no evidence of any simple structure in the landscape, and argues that maybe this is just the way the world is:

*Still, our role as physicists is not to hope that one or the other picture turns out to be more correct, but to find the evidence from experiment and theory which will show us which if any of our present ideas are correct.*

The problem with his invocation of the role of experimental evidence is that he has just finished making an excellent case that there is no hope at all of ever getting any such evidence for string theory. He seems to have completely abandoned without comment his project of the last few years of counting vacua in hopes of making statistical predictions, and is left with not a single idea about how one can ever hope to get a prediction of any kind out of this framework. In this kind of circumstance, standard scientific practice is to acknowledge that this is a failed project and go on to something else. Douglas not only refuses to acknowledge the failure of string theory, he doesn’t anywhere even mention the possibility that the underlying idea might be wrong.
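The computational-complexity point can be made concrete with a toy version of the Bousso-Polchinski setup: the cosmological constant is a bare term plus integer-flux contributions, and finding a vacuum with tiny |Λ| is a subset-sum-like search. The charges and flux ranges below are made up purely for illustration; this is a sketch of the combinatorial problem, not of Denef and Douglas’s actual analysis:

```python
# Toy Bousso-Polchinski landscape: Lambda = Lambda_bare + sum_i n_i^2 q_i^2
# over integer fluxes n_i. Tuning |Lambda| close to zero is a subset-sum-like
# problem, which is the kind of thing the complexity argument is about.
from itertools import product

LAMBDA_BARE = -1.0
CHARGES = [0.57, 0.33, 0.21, 0.13, 0.08]  # made-up flux charges q_i
FLUX_RANGE = range(-3, 4)                  # n_i in {-3, ..., 3}

def cosmological_constant(fluxes):
    """Lambda for a given tuple of integer fluxes."""
    return LAMBDA_BARE + sum(n * n * q * q for n, q in zip(fluxes, CHARGES))

# Brute-force search: cost grows exponentially with the number of fluxes,
# which is hopeless at the hundreds of fluxes of a realistic compactification.
best = min(product(FLUX_RANGE, repeat=len(CHARGES)),
           key=lambda f: abs(cosmological_constant(f)))
```

With five fluxes the exhaustive search is instant; the point is that nothing in this structure suggests a shortcut that scales.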

Yesterday’s hep-th preprints included another landscape one, Probabilities in the Landscape by Alexander Vilenkin. Vilenkin shows no more evidence of having a viable way of ever making a physical prediction than does Douglas, but like Douglas invokes experimental confirmation at the end of his paper, then closes with one prediction that is sure to be accurate:

*It seems safe to predict that we will hear more on this subject in the future.*

For some more entertaining reading, take a look at A Comment on Emergent Gravity by Waldyr A. Rodrigues Jr. Rodrigues says that he gave as an exercise to his students to find the errors in Jack Sarfatti’s recent arXiv posting Emergent Gravity. Sarfatti had been blocked for many years from posting on the arXiv, but the arXiv moderators recently relented, perhaps because this paper includes a section at the end about the landscape. Susskind’s recent book is one of only two references in the paper. Sarfatti’s paper has gone through six drafts, three of which are on the arXiv, and the latest of which incorporates some of the objections from Rodrigues. Rodrigues however concludes his paper thus:

*Sarfatti’s paper we regret to say, is unfortunately a potpourri of nonsense Mathematics. The fact that he found endorsers which permitted him to put his article in the arXiv is a preoccupying fact. Indeed, the incident shows that endorsers did not pay attention to what they read, or worse, that there are a lot of people with almost null mathematical knowledge publishing Physics papers replete of nonsense Mathematics…*

*A careful reading of [Sarfatti’s paper] shows that his hypotheses are completely ad hoc assumptions, since in our view no arguments from Physics or Mathematics are given for them. Summing up, we must say that Sarfatti’s claim to have deduced Einstein’s equations as an emergent phenomena is a typical example of self delusion and wishful thinking.*

Sarfatti will be giving the closing talk at the 22nd Pacific Coast Gravity Meeting, to be held this coming weekend at the KITP in Santa Barbara.

This sounds unduly harsh. Not that I’m disagreeing with the basic conclusion. To quote the Douglas paper:

*Are there fruitful analogies between these long-ago problems and our own? What is the key issue we should discuss in 2005? What are our hydrogen atom(s)?*

*If we have them, they are clearly the maximally supersymmetric theories …*

What can one say. Interesting that he goes into Cheeger’s work. I recently had the good fortune to meet Cheeger. Of course, the physicists present all wanted to know about the sort of thing that Douglas discusses, although not the stringy version. Cheeger said he hadn’t really thought about it, but he did mention a 1980s paper of his:

J. Cheeger, W. Mueller, R. Schrader, Commun. Math. Phys. 92, 3 (1984) 405-454

which I haven’t had a chance to look at, unfortunately. Maybe in another life.

Sarfatti’s paper needs another revision. It’s not displaying correctly on my computer!

To be genuinely honest, if Sarfatti’s paper got through with a lot of mathematical errors, it doesn’t surprise me. There are not a few academic papers which bandy about mathematical terms in a very terse and sparse fashion, referring to this or that paper or terminology to justify what they put down. To someone who is not very familiar with the field, i.e. 99.9% of undergraduates, a paper filled with nonsense is indistinguishable from one filled with correct but sparsely documented terms. For example, from the Sarfatti paper:

Is this right? Is it wrong? I know nothing about any of this. I personally do not know what the following terms really mean: “superfluid”, “Cartan closed inexact 1-form”.

This statement could of course be correct, but as someone new to the field, I’m basically taking most of this on faith. I don’t think I’m alone here. Papers do this. A Lot. Not just physics papers either.

I actually think this obfuscated and/or terse presentation style is a serious problem in scientific academia. Authors are not making the effort to make their discoveries accessible to a wider audience. Their papers can only be understood by a handful of other people. This doesn’t help science. It just creates a huge barrier to entry into any particular field. And of course, it also means it is easier for papers containing errors to slip through.

I follow Feynman’s philosophy on this: if something cannot be taught in a first-year undergraduate course, then it is not fully understood yet. Sorry about the rant, but I’ve had to put up with a lot of rubbish papers and books lately.

Kea,

Cheeger’s a very good geometer. I first met him when I was a postdoc at Stony Brook, where he was for many years. He’s now here in New York, at NYU. The theorem that Douglas refers to corresponds to the intuitive idea that if you have a manifold of a fixed size, to make it more topologically complex you need to make it more curved. If you bound the curvature, you bound the number of topological types.
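For reference, the finiteness theorem in question can be stated roughly as follows (my paraphrase of Cheeger’s 1970 result, not a quote from Douglas):

```latex
% Cheeger finiteness: bounded geometry permits only finitely many topologies.
\textbf{Theorem (Cheeger, 1970).}
\emph{For fixed $n$, $K$, $D$, $v > 0$, the class of closed Riemannian
$n$-manifolds $(M,g)$ satisfying}
\[
  |\mathrm{sec}(g)| \le K, \qquad \mathrm{diam}(M,g) \le D, \qquad
  \mathrm{vol}(M,g) \ge v
\]
\emph{contains only finitely many diffeomorphism types.}
```

All three bounds are needed: drop any one of them and infinitely many topological types can occur.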

But I really don’t see at all why he is bringing that up. While you want to bound the curvature to make sure the low energy supergravity approximation is valid, if you’re doing string theory, why should you insist on such a curvature bound? And I really don’t see at all what this has to do with his proposed metric on the space of CFTs.

OMF,

Unfortunately it really isn’t possible to write every paper in a way that any undergraduate can understand it. Mathematics is a language, and like any language takes a significant amount of effort to absorb. You really need to immerse yourself in this stuff, think about it a lot and take the time to become comfortable with it. This can take quite a while and is not easy, but once you do this, you really can quickly tell the difference between someone speaking gibberish, and someone who has something interesting to say.

Peter

Yes, it does look suspiciously like Douglas was rambling on about a dream for a metric that he hadn’t thought about too much. He refers to the 1970 Cheeger paper. At the conference I was attending, Cheeger spoke about things like this paper with Tian.

The paper of Douglas seems indeed very much an expression of a big crisis in the subject. It looks similar to the crisis when it was discovered that general relativity is not renormalizable.

Or even more so, since that was one of the reasons why one thought strings might help out.

*To someone who is not very familiar with the field, i.e. 99.9% of undergraduates, a paper filled with nonsense is indistinguishable from one filled with correct but sparsely documented terms.*

Such papers are not designed for laypersons or for undergraduates. They are designed for people who are indeed very familiar with the field.

*Is this right? Is it wrong? I know nothing about any of this. I personally do not know what the following terms really mean: “superfluid”, “Cartan closed inexact 1-form”.*

So what? If it’s beyond you personally, then it’s trash?

*Authors are not making the effort to make their discoveries accessible to a wider audience. Their papers can only be understood by a handful of other people. This doesn’t help science. It just creates a huge barrier to entry into any particular field.*

The arXiv and the refereed journals are not an introduction to anything; they are (in principle) a place for the high-level, cutting-edge work going on in these fields. They are not supposed to be a tutorial, so it should not surprise you that they do not make a good tutorial.

*If something cannot be taught in a first-year undergraduate course, then it is not fully understood yet.*

Even if this is true, it does not pertain to the situation at hand. These papers discuss areas of active and often controversial research. RESEARCH. If it were fully understood, it wouldn’t be a subject of research anymore; it would be standard textbook fare.

This idea has probably been considered and dismissed before, but I’ll throw it out anyway. Couldn’t the arXiv support a declaration of intended audience for submissions, which could be used as a qualifier in database searches? One could then search specifically for reviews, pedagogical articles, research reports, etc. Any comments or elaborations on this notion would be of interest.

Perhaps there are already some “aftermarket” compilations of preprint links that include such categorizations.

(Sorry if this is too far off-topic.)

Closed inexact Cartan forms are an elegant way of describing superfluids and emergent phenomena in general.

fyi

On Feb 28, 2006, at 7:00 PM, Jack Sarfatti wrote:

bcc

We live in interesting times! 😉 Waldyr explained to me that he was pressured by some people that his students might suffer; otherwise he would not have written anything had I not mentioned him in the comments. I was, of course, trying to be honest that it was his critique that gave me the idea of 8 Goldstone phases, with 6 corresponding to the extra degrees of freedom of Calabi-Yau, the connection to torsion and Shipov’s 10D manifold. I was very clear in my acknowledgment that Waldyr did not “endorse” the content of the paper. The “math” in the paper is not important. It’s the ideas – mainly both positive and negative vacuum energy at different scales as most of the stuff of the universe, with w less than -1/3. The math can be fixed where needed. What if I turn out to be right in the long run about the LHC and DM detectors being null, like the Michelson-Morley experiment?

On Feb 28, 2006, at 3:37 PM, Tony Smith wrote:

Jack, Peter Woit’s blog has a new entry at

http://www.math.columbia.edu/~woit/wordpress/?p=355#comments

that mentions your arXiv posting and Waldyr’s critical paper,

and Peter closes his blog comment with the following statement:

“… Sarfatti will be giving the closing talk at the 22nd Pacific Coast Gravity Meeting, to be held this coming weekend at the KITP in Santa Barbara. …”.

Due to the wide readership of Peter’s blog, and the severe criticism by Waldyr quoted by Peter, your KITP talk may get a lot of attention, and

it may be a very good opportunity for you to present your physical ideas.

Here are a couple of lawyer-type bits of advice for your 10 minute talk + 2 minute questioning, scheduled for 17:06 – 17:18 on Saturday afternoon:

You are the LAST speaker, so if you run over it doesn’t hurt anybody else’s schedule and there would be no problem if your question period were extended.

….

As to the format of your 10 minute talk, my suggestion is to begin with a short statement saying that your arXiv paper did contain some technical math problems, some of which were mentioned in a critical paper by Waldyr Rodrigues, but that your main ideas are physical, and you only have time here to give a rough outline of the basic physics ideas, which are, as you said in your recent e-mails:

1 – general relativistic spacetime emerges from the Goldstone phases of a LOCAL vacuum coherent order parameter, in a similar way to how the superfluid velocity field emerges from the Goldstone phase of the local helium ground-state coherence order parameter.

Good!

2 – point defect at center of Sun could explain the Pioneer anomaly as a hollow dark matter (exotic vacuum) mini-halo centered on the Sun …

Good!

3 – LHC & local DM detectors will not “click” on DM particles ever to explain OmegaDM = 0.23

Good!

Will do. Those are some of the key ideas. Another is taking the equivalence principle to the max, i.e. all universal space-time symmetry generators must be locally gauged to induce geometrodynamic fields. If, in addition, there is hidden spontaneously broken symmetry in the vacuum for some of these space-time symmetries, the “Meissner” mass gap would explain why curvature dominates over torsion, for example, at least in most situations.

After stating the 3 main points, then go into more detail, particularly about 1, showing the two forms of vacuum nonlocal coherence forming curved spacetime and incoherent well-known Casimir-type effects, and their relationship to Λ.

Good suggestion but I could not do that in 10 minutes.

At end, plug the possibility that if the Dark Energy can be controlled with Josephson Junction arrays, it would be a vast energy source, a prize even greater than nuclear energy tamed by the Manhattan Project.

That would be way out of their comfort zone. Remember most of these guys who are not senior people are running scared about funding and are afraid to stick their necks out. Most of the hall talk at UCLA DM 2006 was people scared about funding. Even Max Tegmark joked about it in his talk.

Don’t mention UFOs in your main talk.

Of course not! 🙂

If asked, say that control of Dark Energy could enable us to build things that do most of the things that UFOs are described as doing, and leave it at that.

If any of the above suggestions don’t feel right to you, then don’t use them. They are only suggestions, and the most important thing is that you should feel comfortable, and use your Cornell theatrical training to deal with …. et al as you would a heckler in a theater. For 10 minutes, the stage belongs to YOU, if you take it.

On the other hand, maybe the hall will be empty? 😉

I think that maybe Peter is being a bit hard on string theorists. 🙂

For example, Keith Dienes’ latest paper hints that there may be a lower bound on positive cosmological constants on the landscape, and that this lower bound may be way too large. If evidence like this keeps coming in, I think people really will begin to give up, and not make “excuses”.

Amanda,

Maybe you’re right. But I’m having a hard time figuring out what would possibly cause Douglas to give up on string theory. And he’s not telling.

Regarding Waldyr’s comments below, he told me that he was threatened that his students might have trouble getting funding if he did not respond negatively. Note that none of Waldyr’s remarks are relevant to the actual new physical ideas, and are not even relevant to the actual final version of the paper. The math problems that may still be there are really quite minor, and any real ones can be fixed. The bulk of Waldyr’s 21 pages of difficult reading does not apply to my new conjectures at all, but refers to standard background material I mentioned in a cursory way in the first version. See also further remarks below Waldyr’s.

“Sarfatti’s paper we regret to say, is unfortunately a potpourri of nonsense Mathematics. The fact that he found endorsers which permitted him to put his article in the arXiv is a preoccupying fact. Indeed, the incident shows that endorsers did not pay attention to what they read, or worse, that there are a lot of people with almost null mathematical knowledge publishing Physics papers replete of nonsense Mathematics…

A careful reading of [Sarfatti’s paper] shows that his hypotheses are completely ad hoc assumptions, since in our view no arguments from Physics or Mathematics are given for them. Summing up, we must say that Sarfatti’s claim to have deduced Einstein’s equations as an emergent phenomena is a typical example of self delusion and wishful thinking.”

Perhaps, however, Waldyr missed the beautiful simple analogy that the curved tetrad fields are like the emergent superfluid velocity field and that the world hologram with quantized area bits is essentially the Bohm-Aharonov effect for closed non-exact Cartan forms in the presence of topological defects in the emergent coherent order parameter. This is a wonderful idea that must first be grasped heuristically and the math will follow.

The historical Bohm-Aharonov effect is for quantized loop integrals of closed inexact 1-forms around line topological defects. What we need for geometrodynamics is the quantized closed surface integrals of closed inexact 2-forms around point defects. Waldyr’s saying this is ad hoc shows he missed the simple intuitive idea. It’s not ad hoc; it’s compelling and almost obvious, to my mind at least.
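Schematically, and in my own notation rather than Sarfatti’s, the two quantization statements being analogized here are:

```latex
% Bohm-Aharonov (line defect): quantized period of a closed inexact 1-form A
% around a loop \gamma encircling the defect:
\oint_{\gamma} A \;=\; 2\pi n, \qquad n \in \mathbb{Z}

% Proposed 2-form analogue (point defect): quantized period of a closed
% inexact 2-form F over a small sphere S^2 enclosing the defect:
\oint_{S^{2}} F \;=\; 2\pi m, \qquad m \in \mathbb{Z}
```

In both cases the form is closed but not globally exact, so the period detects the topological defect it surrounds (nontrivial first and second homotopy, respectively).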

On Feb 28, 2006, at 11:02 AM, Jack Sarfatti wrote:

Note that I received, from several sources who do not wish to be named, comments about Waldyr’s general Wolfgang Pauli “attack dog” style of reviewing papers, such as:

“But since he knows so much mathematics, he often rushes into criticism, without reading a paper carefully enough. So he thinks that the author has committed an error, whilst this was not the (our case).”

Of course I had some minor (formal at least) ambiguities in the formal expressions in the 1st version of http://arxiv.org/abs/gr-qc/0602022 that Waldyr usefully corrected, but I was also accused of misconceptions I did not have, again perhaps from my poor exposition, and noting that Waldyr is not a native English speaker. 🙂

http://www.physics.ucsb.edu/~relativity/22nd-PCGM.html

I am giving a 10-minute talk at the above meeting; it is at

http://qedcorp.com/APS/PCGM2006.pdf

or

http://qedcorp.com/APS/PCGM2006.ppt

On Feb 28, 2006, at 9:34 AM, Jack Sarfatti wrote:

On Feb 28, 2006, at 1:20 AM, Paul Zielinski wrote:

If Waldyr’s criticisms are technically correct and your mathematical errors are reparable, then a detailed response to Waldyr’s published commentary can only improve the strength of your arguments.

The version up there now corrects some very minor formal flaws in the presentation of STANDARD physics that was simply background material that Waldyr wanted more rigorous. Also he accused me of things that were not so, but that may be because I was not clear enough. For example, I certainly knew that there are 4 distinct tetrad first rank Diff(4) tensor 1-forms e^a and 6 zero torsion spin connection 1-forms S^a^b, i.e.

e^a = e^a_u dx^u

S^a^b = S^a^b_u dx^u

The great bulk of Waldyr’s 21 pages has little relevance to the latest version posted now

http://arxiv.org/abs/gr-qc/0602022

The vexing “Yilmaz” issue of the locality vs nonlocality of pure gravity energy, which Waldyr’s general remarks on energy-momentum conservation in GR in his 21 pages pertain to, is not at all relevant to my new conjectures, i.e.

I. The tetrad field emerges from the several Goldstone phases of a LOCAL vacuum coherent order parameter, in a similar way to how the superfluid velocity field emerges from the single Goldstone phase of the local helium ground-state coherence order parameter.

That’s the key idea. I now seem to recall that Hagen Kleinert has 4 phases associated with the tetrads, but not specifically in connection with vacuum coherence? I will have to look at his book – it’s a vague memory.

i.e. the PHYSICAL IDEA is that the smooth c-number tetrad field is a vacuum coherence effect – not a random ZPF effect.

II. LHC & local DM detectors will not “click” with “the Right Stuff” ever to explain OmegaDM ~ 0.23. Looking for real DM particles whizzing through space is like looking for Earth’s motion through the old mechanical aether (distinct from CMB Doppler shifts relative to FRW Hubble flow of course).

III. Bekenstein-Hawking and ’t Hooft-Susskind ideas point to a 2D quantized de Rham period integral from point defects in the vacuum coherence, i.e. area quantization and volume without volume from a closed non-exact “area density” 2-form emergent from the Goldstone phases.

Bohm-Aharonov effect is “area without area” i.e. S1 non-trivial first homotopy – quantized loop integrals of closed nonexact 1-forms around line defects

World Hologram is “volume without volume” i.e. S2 non-trivial second homotopy – quantized closed surface integrals of closed non-exact 2-forms around point defects, e.g. one at center of Sun in Pioneer anomaly? This would be a hollow dark matter (exotic vacuum) mini-halo of the Sun.

Also, you may not have expressed yourself as clearly as you might have in presenting some of your points. Some disambiguation in response to criticisms appearing in Waldyr’s paper might be a good thing.

Sure. These are all new ideas not found in the literature.

I agree with Tony that Waldyr’s charges of “ad hocness” are questionable. You evidently have a clear intuitive understanding of the basic physical model you are proposing. However, while I entirely agree that mathematical rigor is secondary to the physics, it does appear that several of Waldyr’s points are not merely a matter of “rigor mortis”, but are concerned with the proper definitions of important mathematical terms.

Yes, I think the current version is essentially OK in that regard. I basically eliminated all reference to what Waldyr found objectionable in the background stuff that is not essential to my thesis.

As for Waldyr’s intemperate language, perhaps it would have been better if you hadn’t cited him at all in later versions of your paper?

I thought I was being polite and honest and I made it clear that it should not be construed that Waldyr agreed with my new ideas. In fact Waldyr’s contribution is forcing me to confront the need for 8 Goldstone phases instead of only 2 and then I associated the extra 6 with Calabi-Yau and massive torsion fields, i.e. both locally gauge Lorentz group and then hide its symmetry in the vacuum – i.e. non-Abelian Meissner effect for the torsion fields whose mass gap prevents us from seeing it easily.

IV. Extended local equivalence principle, i.e. locally gauge ALL universal space-time symmetries of the physical action to induce geometrodynamic fields beyond the original 1915 curvature field.

Note that at the square-root tetrad level the curvature and torsion gauge potentials are SPIN 1, just like in Yang-Mills theory. When you go to the geometrodynamic level,

ds^2 = e^a eta_ab e^b, where eta_ab is the Minkowski metric,

then you get a COMPOSITE spin 2 from entangled pairs of “subspace” SPIN 1 tetrad-level fields.

Jack Sarfatti wrote:

…

My key physical idea is that dark matter is the same as dark energy, with the difference only in scale and the sign of the energy density, with w less than -1/3; from a distance it mimics w = 0 CDM.

It is obvious to me that the curved tetrad & Bekenstein-Hawking quantized area is like the superfluid circulation, and that the ’t Hooft-Susskind world hologram “volume without volume” is like the Bohm-Aharonov “magnetic field without magnetic field”, even if I did not get the formal math completely right on the first try, working alone outside of academia – Einstein needed Grossmann to get his tensors right, and he took ten years. I have only taken a few weeks so far and have not found my Grossmann, I guess? 🙂

Remember what Feynman said about “rigor mortis”. Are Feynman integrals rigorous? Is M theory? I hear some new math is coming from M theory however. I think Waldyr is making a general point about lack of mathematical rigor in many physics papers. Waldyr is actually a Professor of Mathematics I think and there is a cultural difference. To me the math is secondary to the heuristic ideas.

On Mar 1, 2006, at 12:01 AM, Jack Sarfatti wrote:

Instead of some of the alleged “Dirty Dozen” threatening Waldyr that his students might not get funding because it appears that he “endorsed” my paper when I explicitly wrote that he did not do so, why don’t these brave professors allegedly in search of the truth publish their own specific objections to the actual final version of the paper for all to see and for the historical record? What equation numbers in the actual version now up there are objectionable? Why do they choose to hide behind Waldyr? Remember what will be recorded IF in fact what I say about the LHC and the DM detectors comes to pass – that they forever remain silent except for false alarms. What I predict is easily falsified by an “ugly fact.”

Ode on the remains of a Dark Matter Detector! 😉

Ozymandias

by Percy Bysshe Shelley

I met a traveler from an antique land

Who said: “Two vast and trunkless legs of stone

Stand in the desert… Near them, on the sand,

Half sunk, a shattered visage lies, whose frown,

And wrinkled lip, and sneer of cold command,

Tell that its sculptor well those passions read

Which yet survive, stamped on these lifeless things,

The hand that mocked them and the heart that fed;

And on the pedestal these words appear:

My name is Ozymandias, King of Kings,

Look on my works, ye Mighty, and despair!

Nothing beside remains. Round the decay

Of that colossal wreck, boundless and bare

The lone and level sands stretch far away.

On Feb 28, 2006, at 10:54 PM, Tony Smith wrote:

Jack, did you get a copy of the message from Waldyr dated Tue, 28 Feb 2006 to Zielinski with copy to me in which Waldyr said in part:

On Feb 28, 2006, at 11:23 PM, Jack Sarfatti wrote:

No I did not. How the “Dirty Dozen” could construe from what I wrote that Waldyr endorsed my paper is beyond me. I was very clear to say specifically that he did not endorse it at all. Well no more Mr. Nice Guy eh? I bent over backwards to credit Waldyr’s effort. That’s what one gets by being “collegial” I suppose? 🙂

The “math” is the least important part of my paper. I was not writing a “math paper.” Math is not physics. So far no one has said anything at all about the PHYSICAL IDEAS in the paper. Waldyr has not. No one has. None of Waldyr’s math points are relevant to these physical ideas, as far as I can see.

Here is what is in my paper about Waldyr

“Acknowledgment

I would like to thank Professor Waldyr Rodrigues, Jr of UNICAMP, Brazil for pointing out mathematical deficiencies in the first version of this paper that is really a progress report. The reader should not infer that Professor Rodrigues endorses all of the ideas of this paper or that he thinks the mathematics is up to his rigorous standards. I remind the reader that important new physics has been created with bad mathematics that is usually patched up later. A case in point is the history of the Feynman diagrams and the path integrals, which are still non-rigorous from the point of view of the top mathematicians.”

What’s wrong with these “dozen” Paragons of Omniscience? Can’t these Victorian Station Masters read plain English? Look at this sentence:

“The reader should not infer that Professor Rodrigues endorses all of the ideas of this paper or that he thinks the mathematics is up to his rigorous standards.”

Shame on that “Dirty Dozen.”

Waldyr allegedly wrote:

“… It is true that I [Waldyr] said to him [Jack] sometime ago that I did not intend to post or publish the notes for the time being. I changed my mind (after almost two weeks) due to a comment that he inserted in the sixth draft of his paper and which to an eventual reader seems to suggest that I made corrections to that version of his paper, something which is not the case.

And indeed, yesterday before deciding to post my notes I received dozen of mails from all around the world asking: how could you endorse that paper?

Well, after that no other alternative remained but the one I followed, which was to post my notes in the arXiv.

…

Concerning arXiv, endorsers and blacklists I have to say the following.

I think that the arXiv must eventually accept any possible paper it receives, subject to the rules:

(a) Every article should receive a signed review, which is to be posted together with the article.[1]

(b) Any author could post a reply to the signed review. The reviewer can write a rejoinder, and so on.

(c) A blacklist must exist: dishonest people who are plagiarists or invent, e.g., false affiliations must be punished, i.e., they must not be allowed to post any paper. The penalty eventually may not hold forever, but for a time long enough for them to reflect.

(d) the blacklist must be public. The reason for someone to be in the blacklist must be given. …”.

—————————————————————

Tony

PS – It might be interesting to know who were the authors of the “dozen of mails from all around the world” and exactly what they said, particularly in light of your statement that Waldyr told you “that he was threatened that his students might have trouble getting funding if he did not respond negatively.”

I can guess who some of them are.

And this is a problem. Even if someone is interested in learning more about a subject, a terse presentation will be confusing and off-putting. Are references to pertinent papers or textbooks too much to ask?

No, but it creates a barrier to me if I want to learn more about the field. There weren't even any explanations of what these terms were. The author has assumed that only persons skilled in the field will read the paper, and it becomes a self-fulfilling assumption.

I don't expect a tutorial, but I am disappointed by how inaccessible a lot of scientific papers are. I dislike the Ivory Tower effect that this type of writing gives off, however unintentional it may be. I am an applied mathematics undergraduate with a reasonably detailed knowledge of physics and mathematical physics, but I am frequently thrown off by authors who write for a closed shop.

There was a post in this blog a while back about public engagement in science, and how people were dumbing things down with string theory violin shows and cookery classes. These are really trite and condescending presentations of science, which I think the public sees through.

My point here is that if scientists cannot even hope to engage and interest scientists in different fields, what hope is there of eliciting general public interest and engagement in our research? If papers are being published with mistakes, perhaps this is another symptom of the same underlying issue.

On the “difficulty” of technical papers: they're meant for experts in the field and so use the shorthand of jargon. Of course, math is not a special language; everything in math can be expressed in English, but once you're used to the shorthand you can read the math directly without translation (for example, since I seldom use vector calculus, I still have to translate del into partials). It is true that jargon is often used as a way to exclude outsiders, but without an appendix of definitions, etc., there's no way around it. Does Sarfatti need to explain Cartan forms?

‘…how people were dumbing things down with string theory violin shows and cookery classes. Really, well, trite and condescending presentations of science which I think the public sees through.’ – OMF

What matters is that most students see through it. It acts like a filter: emphasising the weirder parts of an untested, speculative mainstream “theory” draws only a few mathematically competent students into physics. In Britain, where string theory was loudly emphasised in best-selling books by Stephen Hawking, a 4% per year drop in A-level physics followed, which continues today.

The normal reaction of the teaching establishment to this is just to demand more propaganda, or better quality stuff, to attract students into physics. You don’t need the landscape controversy to see where physics is going wrong.

String theory is a tiny component of physics, but because it is so hard to test and yet so over-publicised, it casts a dark cloud over physics. If string theory looks right, it is only because it is the conquest of free thought by censorship of alternatives.

Compare general books written about the search for a unified theory in physics when it was a popular subject in Britain (before the 1985 ST revolution), to those today when it is unpopular.

Before 1985, the future of physics was open, there were many exciting possibilities. Today, there is widespread prejudice in favour of one speculative model, with no objective predictions let alone evidence. It is defended by obfuscation, which is unethical.

I feel there is a difference between a terse presentation and a readable one, even if the topic is highly advanced. A simple note at the beginning of any document telling the reader that familiarity with concepts X, Y and Z is assumed would in itself do a lot for ease of access. Then at least the reader knows where to look if they are lost.

In the humanities, if one makes reference to an opinion or analysis, they make a big deal about citing properly. Obviously I'm not suggesting that absolutely everything must have an explanatory reference. But yes, it would be helpful if, when mentioning Cartan forms, the author could provide a succinct reference explaining what they are. That is, if one wants to make the paper more accessible.

About evidence: well, what I have collected in the last three years is evidence that physicists are not interested in evidence. A recent example I have already told: I was the first one, after six months, to tell the PDG that their file http://pdg.lbl.gov/2005/mcdata/mass_width_2004.csv has a wrong value for the muon lifetime, among others; the file is still there (1/March/06), but at least you can now get the corrected file by privately emailing the PDG people. The point here is not a couple of errata, but that nobody noticed them, thus nobody had plotted the thing. At least physicists are not interested if it is not evidence for their theory; evidence is no longer a starting point for research, except if it is against the SM beyond four sigma… There is a strange feeling of completeness, of impossibility of getting anything new from the already measured data.
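A sanity check of the sort being described, comparing a data file against trusted reference values, is trivial to automate; the real complaint is that nobody bothered. A minimal sketch in Python (the CSV layout and the broken value below are hypothetical, not the actual contents of the PDG file):

```python
import csv
import io

# Hypothetical excerpt of a PDG-style CSV (the real file's columns may
# differ); the lifetime below is deliberately wrong by a factor of 1000.
sample = """name,value,unit
muon_mass,105.6583715,MeV
muon_lifetime,2.197e-3,s
"""

# Reference values we trust (approximate PDG world averages).
reference = {
    "muon_mass": 105.6584,        # MeV
    "muon_lifetime": 2.19698e-6,  # s
}

def flag_discrepancies(text, rel_tol=1e-3):
    """Return (name, file_value, reference_value) for entries that
    deviate from the reference by more than rel_tol (relative)."""
    bad = []
    for row in csv.DictReader(io.StringIO(text)):
        name = row["name"]
        if name in reference:
            value = float(row["value"])
            ref = reference[name]
            if abs(value - ref) / abs(ref) > rel_tol:
                bad.append((name, value, ref))
    return bad

# Flags the lifetime entry, which is off by three orders of magnitude.
print(flag_discrepancies(sample))
```

Run against the real file with a table of accepted values, any entry that drifts past the tolerance gets flagged immediately; no plotting required.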

I find an interesting counterexample, or just a different way of doing things, in the approach of Connes–Lott, who took pains to start with the standard model Lagrangian, and I remember Alain kept asking every practitioner about the differences between pole masses, running masses, etc., to be sure about how all the experimental material could fit inside the model.

Also, hep-ph people have a different approach than hep-th: the -ph people keep launching models (either for the love of play or as tickets to Helsinki, you choose), but they make an effort to check against some main parameters.

In the last week there were some papers where string theory makes strong predictions about the real world (check hep-th and you will find them). So things are moving; little by little string theory will overcome the landscape obstacle.

Theorist,

I’ve looked at the recent hep-th papers and haven’t seen anything like what you describe. It would be major news if there were recent strong predictions from string theory. What papers did you have in mind?

ObsessiveMathsFreak, I really feel your pain. I clearly remember thinking the same things when I was an undergraduate.

But not to worry, there is light at the end of the tunnel. A Ph.D means that the government, or some other funding agency, has spent hundreds of thousands of dollars teaching you the precise definitions of those obscure terms. By the time you're done you will be able to read the vast majority of the literature like you read the newspaper.

Also, after completing a Ph.D you will understand that most of the things you found obscure as an undergraduate are actually trivial, and writing (reading) those things would be a waste of your (the reader's) time. Just think about it. To understand the term superfluid, one should have taken a course on quantum fluids, which requires courses on critical phenomena and field theory, which require courses on Statistical Mechanics, Classical Field Theory, Mathematical Methods, and Quantum Mechanics, which require courses on classical mechanics and thermodynamics, which require classes on calculus and algebra… Or, put another way, go get Dr. in front of your name.

But the main thing you'll come to understand is that everyone thinks about things differently. This means that while an intuitive, wishy-washy explanation may make perfect sense to you, it may not make sense to others (especially if those others read the paper 100 years from now). For this reason it's necessary to have a set of well-defined ideas, procedures, and their associated vocabulary. This way when Jack Sarfatti says

we know precisely what he means; there is no ambiguity. Without this, people like Rodrigues would never be able to “pooh-pooh” all over the paper.

ObsessiveMathsFreak, I just thought of a better way to think about things. Imagine you're in freshman calculus and your instructor decides to teach classes so that elementary school children can understand (college freshman is to elementary school as Ph.D is to college freshman).

This instructor starts off by saying a derivative is the time rate of change of a function. A function is a mapping from one variable X to another variable Y, a variable is an abstract representation of some number, a number is an enumeration device, a mapping is…

Every freshman who already understands high school algebra just went to sleep.

ksh95 has already responded very well to ObsessiveMathsFreak, so I will not embellish too much. The point is that yes, it is too much to ask for preprints and journal articles to cater in any way to the novice. They are not intended and should not be intended for newcomers to the field. They should not provide introductory material known to every expert, and they should not serve as a guide to purchasing introductory textbooks. If you are “disappointed by how inaccessible a lot of scientific papers are,” then get a PhD, because that's who the papers are intended for. They're not meant to be toilet reading for the layperson, and the suggestion that they ought to be is ludicrous.

Douglas's “analogies” present an accurate demonstration of the analytical skill of the author.

1) In the period 1913-1930 one saw a continual improvement – by huge leaps – in the performance and sophistication of laboratories. It was hard to say which was advancing faster, experimental technique or theory.

2) The Bohr-Sommerfeld theory was well-founded mathematically and made definite predictions. Also, it provided a proving ground for applying relativity to quantum problems.

Peter did away with the chemical analogy, but this point is so vacuous that it’s not even worth discrediting. For a theory with no vacuum, string theory makes a lot of empty claims.

On a happy note, it’s good to see Sarfatti’s name in the news 🙂

-drl

An alternative for OMF: Instead of getting a PhD, read older papers. Many modern papers are inaccessible because they are part of a continuing conversation about some topic of mathematics. You can’t pick them up and read them any more than you can expect two guys who’ve been talking for hours in a bar about some mutual passion to stop and explain themselves to you. But you can go back, and look at how a field got started. Usually, if you have trouble reading a bunch of different papers in a field, you’ll find that they all reference some earlier foundational paper, where the basic vocabulary is introduced and explained.

To take two examples from one of my favorite subfields of mathematical physics: 1) Atiyah & Bott’s paper “The Yang-Mills Equations on Riemann Surfaces” is more or less required reading if you intend to think about the relations between Yang-Mills theory and the moduli space of principal G-bundles. And you can find in there references to earlier works in the subject, reaching back to Narasimhan & Seshadri, and Mumford. 2) When Witten was writing in the late 70s and early 1980s, he couldn’t assume that his colleagues knew very much of the basic vocabulary of differential geometry. So he spent quite a lot of time giving simple explanations of mathematical ideas.

Most mathematical papers do tell you where to look for an introduction to their language. If they don’t tell you where to find a concept, you can always just try searching for it on wikipedia. There are some quite talented people (e.g. Borcherds) who’ve spent time writing expositions there.

Finally: Mathematical jargon was invented to make our lives easier, not to exclude laymen. If we didn't use it, we'd talk like treants: it would take days just to get through the first sentence.

ksh95 wrote:

“By the time you're done [with your phd] you will be able to read the vast majority of the literature like you read the newspaper.”

I'm pretty sure that this is simply not true. Certainly not for the average phd student. Most certainly not in mathematics.

“Also, after completing a Ph.D you will understand that most of the things you found obscure as an undergraduate are actually trivial”

While I agree on this…

“and writing (reading) those things would be a waste of your (the readers) time.”

…I'm not that sure about this one.

Also, while completing your phd, (hopefully) you will understand that there is at least 10 times as much new obscure stuff you wouldn’t even dream about as an undergrad.

Basically I must agree with OMF that papers should be more accessible in general (of course, I don't think they should be at undergraduate level; that's impossible), and written with much more care.

Just compare an average preprint/textbook/conference talk with one by (insert names like Witten, Atiyah, Milnor, etc. here). The difference in readability is like heaven and earth.

(sorry if this is very offtopic.)

Hello h,

I have no problem reading most of the physics literature. Is the situation different in math?

Anyway, I strongly agree that papers should be written with more care, but that issue is orthogonal to the conversation.

Back to the topic at hand. I was always taught that papers should be aimed at the post-doc level. Any lower and you bore people; any higher and you confuse people. After all, the whole point is to advance the field. Advancement is optimal when we streamline communication between those doing the “advancing”. Of course communication to novices and laypersons affects replacement rates and funding, but these are secondary effects.

Perhaps Theorist is referring to hep-th/0602286, where Dienes claims that a small CC naturally leads to the standard model gauge group.

Ironically, this is actually how first-year calculus classes are taught. At least to mathematics students.

There’s probably a difference here between physics and mathematics papers, the latter of which I’m more familiar with. Though I understand that a lot of theoretical physics papers are quite mathematical.

My basic point was that many papers are in fact quite obfuscated by terminology and terseness. Obviously you cannot expect everyone to be able to read your paper, but at the same time it's a little unreasonable to expect everyone who reads it to be an expert in the field with photographic recall of terms and definitions. A paper should, insofar as it is feasible to do so, be as modular and self-contained as possible.

I think what people are trying to tell you, OMF, is that it just isn’t possible. What you get in papers IS the lingua franca that has been established for a given field. Human knowledge in science at least has become far vaster than one person can ever hope to grasp.

A brief tip that works in my field might also apply to maths – read important dissertations (or parts thereof) in your subject. You can get many via UMI if your library has access. I did this a lot in my early years to get a leg up and be able to comprehend papers. Also, you also MUST talk to people about specific papers, preferably ones that you’re both interested in. When starting out it’s common to have a lot of misconceptions that can hamper understanding, and talking to someone else can often cut through these things quickly.

Thomas,

I’m aware of the Dienes paper. He’s looking at the statistical properties of 10^5 models, ones that are unstable beyond tree-level, so they are not even minima of the landscape. In addition, saying anything about the properties of 10^500 models based on looking at 10^5 very special ones of them doesn’t make any sense.

On top of this, what he finds is a preference for lots of low-rank factors in the gauge group, which favors things like the standard model group over GUT groups, but he gets lots of these factors. To the extent there’s any kind of “prediction” (which there isn’t…), it would be for something like 8 copies of U(1) and SU(2), not one of each.

Claiming that any of this is a “strong prediction about the real world” is just absurd.

I was referring to some papers which appeared last Friday. They might be harder to read and understand, but they contain very solid results, not assumptions and guesses… Solid mathematics and solid physics are used in at least one of them, and the pheno predictions based on string theory are testable at the LHC.

Theorist,

If you’ll be so good as to tell us which papers you are referring to we can discuss this further. I took a quick glance at last Friday’s papers, and none of them seem to contain anything like a solid prediction of string theory testable at the LHC (and if they did, this would be big news). I’m not going to waste any more time trying to guess what it is you are talking about.

Obviously, Douglas's talk is wrong in several historical, epistemological, and technical details. Some points were already made here, therefore I will not repeat them.

Douglas's chemical analogy is rather outdated if one interprets it in a traditional reductionist way, especially when he uses the word “deriving”… This perception is a beautiful example of why string theory is on the wrong path. The large amount of funding spent on obtaining a TOE from string theory is just lost money…

As stated by Jean-Marie Lehn (Nobel Prize for macromolecular chemistry), one cannot derive chemical phenomena in bulk from a knowledge of constituents; Lehn clearly states that there is no reduction of one hierarchical level to another, just integration between the different levels [1].

Douglas cites condensed matter physicists; well, P.W. Anderson [Nobel prize for condensed matter physics] said that condensed matter physics, for example, is NOT applied particle physics [2]. A full understanding of atoms (atomic physics) is not sufficient to understand solids or to make progress in the field. What is more, Anderson cites examples of mainstream atomic knowledge leading to wrong conclusions in condensed matter.

Anderson's “More is different” has become a common “mantra” in condensed matter physics but is often ignored by particle physicists, stringers, and other reductionists. It is natural to read Anderson's criticism as a criticism of belief in string theory as The Final Theory (the main “mantra” of stringers). Anderson called string theory “a futile exercise as physics” just last year.

Even the “reductionist” Murray Gell-Mann (who is often cited by stringers as supporting the “beauty” of string theory) said that string theory, even if finally correct, will NOT be a theory of everything [3].

The analogy with chemistry is rather surprising. Douglas says: “On further developing these analogies, one realizes that we do not know even the most basic organizing principles of the stringy landscape. For the landscape of chemistry, these are the existence of atoms, the maximal atomic number, and the facts that each atom (independent of its type) takes up a roughly equal volume in three-dimensional space and that binding interactions are local.”

What really is the link of the above with string theory, except as prose for more funding of string theory during another ineffective 40 years?

Chemistry is a science; string theory is not. Moreover, knowing Douglas's CV, I seriously doubt that Douglas knows what an atom is; maybe the union of an attractor and its basin, Michael? 🙂

About the Landscape, very little can be said these days. The Landscape just reflects the desperation in the field. People took wrong paths in the past and are now at a dead end. There is no possibility of computing the CC from the Landscape, and even if it were done (via a new “revolution”), the price would be too high for science.

I personally would prefer computing billions of things from standard science and leaving the CC as a mystery, rather than computing the CC via Anthropic/Landscape arguments and then predicting nothing about the rest of the world.

Douglas states in his conclusions that

“We believe string theory has a set of solutions, some of which might describe our world.”

That is all the science behind the stringers' work: “We believe… might…”.

1) These kinds of arguments have been repeated for 40 years. We are a bit saturated, and it is time for research in other directions.

2) String theory does not work; it never worked, period. M-theory is not even formulated (if its nonperturbative regime really exists).

3) Mass media, the public, undergraduate students, and funding agencies are misled about the real successes of string theory (i.e. none).

In short, we obtain a “theory” (string theory) that is not a theory (M-theory is not even defined), which explains/predicts nothing that cannot be explained using GR+SM, offers wrong experimental bounds for many phenomena (e.g. wrong causality structure for GR, wrong basic symmetries for the SM at low energies, wrong non-zero-T bosonic behavior, etc.) and, for obtaining certain bounds on the CC, needs to completely break that satisfactory thing we call science.

References:

[1] See Lehn introductory chapter to “Chemistry for the 21st century” (Ehud Keinan; Israel Schechter; Eds.) Wiley-VCH, 2001.

[2] Anderson, P.W. Science 1972, 177, 393-396

[3] Gell-Mann, Murray. The Quark and the Jaguar: Adventures in the Simple and Complex. W. H. Freeman; 1995.

—

Juan R.

Center for CANONICAL |SCIENCE)

Understanding the landscape that created the landscape…

http://www.opinionjournal.com/extra/?id=85000550

-drl

This paper should be widely read, and I expect it will be:

Lorentz Invariance Violation and its Role in Quantum Gravity Phenomenology, by John Collins, Alejandro Perez and Daniel Sudarsky [hep-th/0603002]

The authors also make some important general observations about Lorentz invariance violation (LIV), or rather the apparent lack thereof, and the foundations of quantum field theory.

Thanks to Christine Dantas for posting about this. See the excerpts in the comments on her post.

[I know this is off-topic, but the Landscape is so depressing I felt compelled to change the subject (again).]

There is another paper (http://arxiv.org/abs/hep0603010) today about Lorentz invariance violation. It was not so long ago that anyone who said SR was wrong was considered a crackpot. But then again, it was not so long ago that one would have been considered a crackpot for suggesting that we live in one of the few universes out of 10^500 that support life.

Douglas has at least created a great new metaphor – a nightmare.

This talk about Lorentz violation is silly in this context. In the regime where we abandon Galilean invariance, the reason is not symmetry breaking, but “decontraction”, which is very different. So if Lorentz symmetry doesn’t hold up, the reason will not be another shell game of symmetry breaking, rather, recognition of another regime for matter and spacetime considered jointly.

-drl

I notice no one here is sharp enough to even offer an argument with what I stated, that is, you don’t even understand the problem. The problem with the world is, it’s filled with mediocrities.

-drl

Hi Danny,

I may well not be sharp enough to offer an argument: I did not really understand what you were saying. Can I repeat my suggestion that you collect your best thoughts on physics and put them on a web site, like I did?

Chris, you were not in mind, and if anyone could understand it, you could – it goes like this…

One can violate a given symmetry by making more symmetry in a larger context, and then assigning the several old symmetries to their own rooms in the new one. If we make a door between rooms, one room from two, we “decontract” the original symmetries. See here:

http://arxiv.org/abs/hep-th/0106273

The key point is that contractions are associated with dimensional parameters assigned to groups (Lorentz->Galilei = speed of light; Poincare->Lorentz = Planck length, etc.), while decontractions are associated with assigning a group to a dimensional parameter, not a one-to-one process and so inherently observational. Superficially this is like symmetry breaking; the difference is that instead of adding fields (Higgs mechanism) we “push back” the idea of vacuum itself by introducing a new fundamental symmetry, incorporating the originals by allowing a contraction to recover them. The decontraction looks like the original in some regimes (again like symmetry breaking) because the enlarged vacuum is “stiff” in the dimensional parameter (the speed of light is very large; the Planck length is tiny, etc.)
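For concreteness, the standard Inönü–Wigner contraction that takes the Lorentz algebra to the Galilei algebra as the speed of light grows large can be sketched as follows (a textbook construction in one common sign convention, added here only to illustrate the contraction side of the story):

```latex
% Lorentz algebra: rotations J_i, boosts K_i
[J_i, J_j] = i\,\epsilon_{ijk} J_k, \qquad
[J_i, K_j] = i\,\epsilon_{ijk} K_k, \qquad
[K_i, K_j] = -\,i\,\epsilon_{ijk} J_k .

% Rescale the boosts by the dimensional parameter c:
B_i = \frac{K_i}{c} \quad\Longrightarrow\quad
[B_i, B_j] = -\,\frac{i}{c^{2}}\,\epsilon_{ijk} J_k
\;\longrightarrow\; 0 \quad (c \to \infty).
```

In the limit the boosts commute and the Galilei algebra is recovered; “decontraction” runs this construction in reverse, promoting a dimensional parameter back into part of a larger symmetry.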

This is really why string theory is so lame – instead of pushing back the vacuum to incorporate matter as we know it from experiments, it makes matter even more obstreperous by removing its locality.

-drl
