More Swampiness

Jacques Distler has a new posting about the Swampland, based on hearing a talk by Cumrun Vafa and discussions with him in Eugene, Oregon. Vafa seems to have made clear to Jacques that what he had in mind was just what he wrote about in his paper, investigating qualitative issues such as what gauge groups could arise in string theory. Jacques notes correctly that if string theory is ever to make contact with experiment, it has to have detailed, quantitative things to say. Vafa didn’t think that such things were currently addressable, an attitude Jacques found perhaps overly cautious, although to me it just sounds realistic.

Jacques enlarges Vafa’s swampland question to make it include the obvious crucial problem for string theory: given some arbitrary choice of the 120 or so parameters of the MSSM, can you get this out of the string theory framework? He makes much of the fact that current constructions of flux vacua are parametrized by discrete rather than continuous sets, although their number is so huge, if not infinite, that it is unclear whether the distinction has any practical significance.

Jacques and many others seem to be of the opinion that the thing theorists should now be doing is studying the details and physical implications of these huge numbers of flux vacua. Besides the fact that this is a horribly complex and ugly business, without the slightest indication from physics that it is a promising thing to do, it seems to me to be something inherently doomed to failure. Without knowing what the non-perturbative formulation of the theory actually is, the reliability of the perturbative string approximation one is using is unclear, with wishful thinking the only reason to believe that the real world will correspond to a region where the approximation is sufficiently reliable. Furthermore, these flux vacua constructions have been accurately described by Susskind as “Rube Goldberg mechanisms”, and it seems to me likely that one can get just about whatever one wants by further complicating the mechanism. This is the completely conventional way wrong scientific ideas often fail: the simplest version of the idea doesn’t work, so people keep trying to fix it by adding more and more ugliness and complexity. Sooner or later the whole thing collapses or fades into deserved obscurity when people finally give up hope of getting anything out of it.

To make the whole question of calculating anything in this framework even worse (something hard to imagine), it seems that there is an inherent theoretical problem with the computational complexity of the question of figuring out which flux vacua correspond to specified observable quantities. Frederik Denef mentioned this in some of his recent talks and Michael Douglas will be giving a talk about this on Wednesday at the KITP, entitled “Computational Complexity of the Landscape”. I guess the new line about all this will be that string theory is the TOE, but that it can be rigorously shown one can’t ever actually calculate anything with the theory.

Update: The Douglas talk is now on-line. As far as I can tell he has now given up on the idea of doing statistics of vacua, and is instead concentrating on the problem of whether, given one of the known flux vacua constructions, you can show that some flux vacua give you what you want, e.g. a cosmological constant of the right magnitude. Given how poorly string theory on these flux vacua is actually understood, I don’t see that he can even formulate a calculation that makes any sense. But he doesn’t actually calculate anything, engaging instead in a long meta-discussion about computability. Kind of a weird performance. Gross seems to have been in the audience, but not to have spoken up. I hope he hasn’t given up.

This entry was posted in Swampland. Bookmark the permalink.

6 Responses to More Swampiness

  1. MathPhys says:

    If they run into computational complexity issues, then that’s probably the end of the story. They definitely need a new brilliant idea.

  2. Who says:

    MathPhys Says:
    November 15th, 2005 at 7:46 am
    If they run into computational complexity issues, then that’s probably the end of the story. They definitely need a new brilliant idea.

    Here’s a new idea, Ma.Ph.; I don’t know whether it is brilliant.
    It approaches a TOE (joining gravity and particle physics) by deriving features of the Standard Model from beefed-up spin networks (the LQG states of geometry), so that a quantum description of the geometry includes a description of matter (various numbers of different kinds of fermions).

    I don’t know any source for this except a set of slides from the recent Loops ’05 conference. If you want to check it out, start reading at around slide #35 in this bunch of transparencies for Smolin’s talk

    http://loops05.aei.mpg.de/index_files/PDF_Files/smolin.ppt

    The abstract has links to both the slides and the recorded talk (video).
    http://loops05.aei.mpg.de/index_files/abstract_smolin.html

    If you want to watch the video allow a quarter of an hour or so for it to download:
    http://loops05.aei.mpg.de/index_files/Video/smolin.wmv

    the (so far speculative) model being described here draws on work by Bilson-Thompson
    http://arxiv.org/abs/hep-ph/0503213
    A topological model of composite preons
    ===============
    So whether or not the specifically stringy approaches are bogged down and need a new idea (as you suggest), string IS NOT THE ONLY PROMISING APPROACH TO JOINING GRAVITY WITH PARTICLE PHYSICS, and this seems to be a new idea in the Loop approach, in case you or anyone might be interested.

  3. Chris W. says:

    Who,

    For a different take on this, still involving spin foams, see hep-th/0403137 and other papers by the same author.

  4. Quantoken says:

    “Computational Complexity of the Landscape” is an utter understatement of the problem, which makes it sound like a technical issue of computer science. It is NOT. 10^500 vacua is a HUGE HUGE HUGE number that few superstring theorists even have enough brain cells to comprehend. To put it in perspective, the universe has roughly 10^80 particles and its Hawking entropy is roughly 10^120, far shy of 10^500. You could build a quantum computer using the whole energy of a googolplex of universes, and still not be able to start enumerating even a small fraction of the 10^500 vacua. For all practical intents and purposes, it’s an intractable math problem. Computer science is not the limit, physics is.

    Putting the computability issue aside, what makes anyone believe there is ONE, and ONLY ONE, vacuum out of that total of 10^500 that happens to match up to the Standard Model? There may be more than one “correct answer”. There may be a dozen. Actually there could be many more.

    An arbitrary vacuum, or an arbitrary theory of any sort, has roughly a 1 in 10^12 chance of matching a particular numerical result to within one part in 10^12, purely by chance. If we take one part in 10^12 as a reasonable precision requirement before counting an experimental result as a match, then on average one vacuum in every 10^12 is “correct”.

    Out of all 10^500, a vast number of them, about 10^488, will be deemed “correct” in providing the right parameters for the Standard Model. When a theory gives 10^488 equally good and “accurate” descriptions of nature, it is as good as useless.

    Quantoken
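The counting argument in the comment above is easy to sanity-check. A minimal sketch, working in log10 so numbers like 10^500 stay representable, and assuming only the figures the comment itself uses (10^500 vacua, effectively random parameter values, a match tolerance of one part in 10^12); the second case extends the same arithmetic to the roughly 120 MSSM parameters mentioned in the post:

```python
# Sanity check of the comment's counting argument, in log10 arithmetic.
# Assumptions (taken from the comment, not established physics): 10^500
# vacua, parameter values spread effectively at random, and a "match"
# meaning agreement to one part in 10^12.

def log10_expected_matches(log10_vacua, log10_precision, n_params):
    """log10 of the expected number of vacua matching n_params
    independent observables, each to 1 part in 10^log10_precision."""
    return log10_vacua - n_params * log10_precision

# One observable: 10^500 / 10^12 = 10^488 "correct" vacua, as stated.
print(log10_expected_matches(500, 12, 1))    # -> 488

# All ~120 MSSM parameters at the same precision: 10^(500 - 1440),
# i.e. under these assumptions essentially no vacuum survives at all.
print(log10_expected_matches(500, 12, 120))  # -> -940
```

Both directions of the estimate are, of course, only as good as the independence and uniformity assumptions behind them.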

  5. Quantoken: “You could build a quantum computer using the whole energy of a googolplex of universes, and still not be able to start enumerating even a small fraction of the 10^500 vacua. For all practical intents and purposes, it’s an intractable math problem. Computer science is not the limit, physics is.”

    No, that’s not really true. Supposing there were 10^500 vacua, it wouldn’t follow that you’d have to enumerate all of them to find out whether there exists a vacuum with desired properties. Maybe there’s a much faster method. Granted, we believe there’s no general way to speed up combinatorial search by more than a small amount; even quantum computers are known to yield exponential speedups only for a few highly structured problems (such as factoring and discrete logarithm). Right now, there doesn’t seem to be a good reason to think that the problem of (say) finding a minimum-energy vacuum given a collection of scalar fields has the requisite structure. But you need computer science to tell you that. 🙂
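The "small amount" of generic speedup mentioned here can be made concrete: for unstructured search, Grover's algorithm gives a quadratic speedup, which in log10 terms just halves the exponent. A sketch, assuming the thread's 10^500 figure:

```python
# Grover's algorithm speeds up unstructured search only quadratically:
# ~sqrt(N) queries instead of ~N. In log10 terms, the exponent halves.
log10_vacua = 500

log10_classical_queries = log10_vacua       # brute force: ~10^500 queries
log10_grover_queries = log10_vacua / 2      # Grover: ~10^250 queries

print(log10_grover_queries)  # -> 250.0, still astronomically infeasible
```

So any practical hope of searching the landscape would have to come from exploitable structure in the problem, which is exactly the comment's point.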

  6. Does anyone see the corruption and decline of Wall Street pension plans, Hollywood, and culture as tied directly to the decline of Physics?

    Elite groups of snarky insiders have replaced truth and beauty with trash and bureaucracy, all to benefit themselves at the expense of the public.

    But fewer people trust Wall Street. Fewer people are going to the movies. And fewer people believe the string theory hype.

    A renaissance is around the corner.

Comments are closed.