SUSY 2015, this year’s version of the big annual conference on supersymmetry, has been going on for the past week at Lake Tahoe. Joe Lykken began his summary talk by explaining how as a kid he was a big fan of the Bonanza TV show, about a ranch along the shores of Lake Tahoe. He always wanted to travel to Lake Tahoe and visit the Ponderosa ranch, and was bitterly disappointed when he finally got to Lake Tahoe and found that the Ponderosa did not exist. The relevance of the story to his talk is “left as an exercise for the audience”, with a hint in the next slide, which gives the executive summary of the search for SUSY at the LHC: “We haven’t found it.” He ended the summary talk with this now well-known prediction for the SUSY 2215 conference.

In his discussion of “naturalness” (see slide 25), he makes what to me has always been an obvious point, but one that I’ve never seen made at a conference like this:

Is the Standard Model (almost) all there is?

Maybe the naturalness argument applied to the Higgs is just wrong (well, it was apparently wrong for the vacuum energy too)

The Standard Model plus some TeV scale renormalizable additions (like dark matter) might be all there is

The Standard Model itself, or with such modest additions, is completely natural (EW scale is not a prediction)

Usual counterargument involves the putative Planck scale and unification thresholds, but this is speculative

An unsatisfying scenario, leaving many questions unanswered, but has a certain minimalist appeal

Some of the other talks included:

- Nima Arkani-Hamed’s slides and title aren’t available, but the Twitter coverage of the conference included a picture of one of his slides. This tweet reports that he announced that he was “rather annoyed” and “sick of thinking” about naturalness. I guess if you’ve spent most of your career arguing that the LHC would “resolve the naturalness problem” (because it would move fine-tuning from the Tevatron 10% to 1% if it saw nothing), gearing up now for an argument that a 100 TeV collider is needed to “resolve the naturalness problem” (by going from 1% to 0.1%) would be kind of annoying… In the slide he makes the same argument as in Particle Fever: no SUSY means particle theory should commit suicide and embrace the multiverse.
- In his talk, Gordon Kane gave his latest string theory-based predictions for the masses of superpartners. For the last twenty years or so, his gluino mass prediction has always been the same: just a little bit higher than the current bounds. These bounds, however, keep moving up, so his prediction has moved from 250 GeV in 1997 to 1.5 TeV today (“Detectable soon”!). I think one very solid prediction one could make is that Kane’s SUSY 2016 gluino mass prediction will be 2 TeV.
- Joe Polchinski’s talk defended moduli stabilization schemes such as the KKLT one that came out of his original work with Bousso, ending with the claim that

The KKLT construction has been thoroughly vetted, and it seems to me has survived robustly.

The de Sitter vacua are still there, as is the landscape.

Besides his pre-KKLT role in moduli stabilization, Polchinski is one of the most prominent exponents of the idea that particle theorists should just give up, using the KKLT moduli-stabilized string vacua as a reason for why string theory can never be tested, but we should believe it anyway. He’s been promoting this heavily since 2004, when he got Scientific American to publish this article with Raphael Bousso. My take on that article seems to have upset him greatly, and had a lot to do with the arXiv policy of not allowing trackbacks to this blog (which continues to this day).

This blog does seem to have played an odd role in the topic of Polchinski’s talk. Back in early 2014 someone wrote to me to suggest that I might want to discuss a series of papers by Bena, Grana et al. which pointed out problems with KKLT. I responded that I didn’t think this was a good idea: while I was no expert, it seemed to me that the KKLT construction was an absurdly complex one involving poorly controlled approximations (thus hard to conclusively decide if it was “right” or “wrong”), and it had entered the realm of ideology, as a bedrock for explaining why string theory could never predict anything. What I didn’t say was that I’m a fan of arguments showing string theory can never predict anything, so why should I publicize work challenging them?

Later in 2014, the same person wrote to me again to suggest that I change my mind, that there was a new preprint about this. I weakened and mentioned it here. A couple months later Polchinski published this, which mounts a defense of KKLT against criticisms like those of Bena, Grana, et al. I didn’t know about this, but did write a long posting about another arXiv preprint he had posted the same day, which was about “dualities” in general, and which I found quite interesting. Only after he and Bousso appeared in the comment section to criticize me about KKLT did I realize that they saw me as responsible for promoting anti-KKLT views, not realizing that I’m actually a KKLT fan. Some strange things have happened over the years at this blog, and that Christmas Eve discussion was right up there.

In any case, just to make my views clear: I’m a big fan of KKLT, glad to hear that it now has been “thoroughly vetted”. Back when I first heard about it in 2003, I thought “this is great! now that string theorists have proved to themselves that their theory isn’t predictive, they’ll move on to something else”. I’ve been surprised though about how long that is taking to happen…

Very droll!

Ah, Bonanza. Or the Cartwright brothers, as the show was called on Swedish television. The intro music was very catchy.

It is nice to see that Lykken mentions what I see as the single most striking result from LHC Run I – that the SM seems to be sitting precisely at the edge of stability. If we elevate this observation to a principle – the principle of living dangerously – we can immediately make a prediction, both for LHC Run II and future accelerators: there will be no new physics which moves the SM away from the stability boundary. New physics which stays on the stability boundary is not affected, of course.

Peter,

one reason that people have difficulty believing that the standard model is final might be the following. When the electromagnetic and the weak interaction were made consistent by Salam, Weinberg and Glashow, physicists did not call it electroweak mixing, but called it electroweak unification. Voices of dissent (Feynman above all, who in a fit of desperation shouted that the standard model is made up of “three independent theories!”) were ignored.

This misuse of the term `unification’ put almost everybody on the wrong track for several decades. Everybody started looking for similar kinds of `unification’. And of course, they had no success, because GSW did not unify anything in the first place.

Even today, this is a minority view, but maybe it can explain part of the story.

CIP,

Thanks. I think the comic aspects of some of what is going on here are greatly underappreciated…

Gianleo,

“unification” is just a word, one that I think can justifiably be used for the electroweak theory. I don’t think, though, that the SM is a “final theory”; it leaves unanswered questions that should have an answer, including that of the relation of the three different gauge theories in it. One reason I’ve always been skeptical about SUSY extensions of the SM is that they don’t answer such questions, just introduce more.

Peter,

you are right, `final’ is not correct for the standard model (I meant `almost all there is’). But electroweak `unification’ did not reduce the number of parameters. So in what sense is it a unification? In what sense is it correct to speak of the electroweak interaction as just a single interaction? The weak interaction remains separate from electromagnetism, doesn’t it?

Peggy Lee had a good song for this https://www.youtube.com/watch?v=LCRZZC-DH7M

Gianleo,

I think that “unification” is an appropriate word because the theory describes the weak and EM forces in terms of the same sort of QFT (gauge theory), so a “unification” of framework. Then there’s the Weinberg angle: this is not just a simple product of two gauge theories. But “unification” is just a word, and there can be more and less of it. The questions of why the two different couplings, why the different charges, why the Higgs, are still awaiting a good idea about further unification.

Takes a Lykken. He has a dry sense of humor, as befits his Scandinavian ancestors. I was a fan of Bonanza when I was a kid as well, and I too was disappointed when I finally made it to Lake Tahoe and saw no Cartwright brothers.

Particle physics will commit suicide by embracing the landscape. The Planck scale isn’t speculative, but the unification scale of the other three forces is. There are surely alternative “naturalness” problems that arise from other unification schemes, and alternative ways to solve those “naturalness” problems. The landscape is surely and thoroughly unnatural — illegal in 38 states, or it should be. There’s likely to be something to SUSY, but she’s probably not dressed in a convenient, easy-to-spot, quasi-perturbative outfit. Particle physicists have wanted to believe in this implementation because it looks like, and lies close to, the Standard Model they know already.

Someone needs more imagination. Too much particle theory of the last 25 years has been virtuoso performances on a few, already established, strings whose tunes might have already been exhausted in, say, 1990. Young folks have had to dance to these worn-out tunes if they wanted to get jobs. The field has been dominated by a small number of established senior people pushing strings for too long.

Gianleo, do you have a reference to the Feynman quote about the standard model?

This is the first time I am hearing something like this.

“Genius: The Life and Science of Richard Feynman” – by James Gleick

https://books.google.com/books?id=LYEFPnMxXQUC&lpg=PA433&ots=UOox-G1Qw3&dq=%22the%20standard%20model%20feynman%20repeated%20dubiously%22&pg=PA433#v=onepage&q=%22the%20standard%20model%20feynman%20repeated%20dubiously%22&f=false

Interesting. The same interview is described in The Second Creation by Crease and Mann. I guess they were the recipients of Feynman’s ire.

I think Feynman was being precise… that nature may not have unification like in grand unified theories, where the SU(3), SU(2), U(1) of the standard model are unified into one group such as SU(5) or SO(10).

I hope this isn’t too OT, but I myself have wondered for a long time what Feynman really meant by that. My limited understanding aside, it seems to me quite legitimate to say there is an electroweak force with a broken symmetry, with some fairly compelling experimental evidence even by the time of that interview. Certainly the case for a Grand Unified Force is much more tenuous to this day, experimentally. Feynman seems to be rejecting even the case for there being a convincing unification of the electromagnetic and weak forces, which makes him sound more like a contrarian than a skeptic on that particular subject. Am I in error?

That said, I’ve admired the unfashionable integrity of this view overall. Like his many comments on a host of other subjects, he was often remarkably prescient.

Matt Grayson: Nice! Only connect!

LMMI: I once heard Feynman comment after a colloquium: “You have to explain why the weak force is so much weaker than EM”, with emphasis on “why”. Or words to that effect. Or so my memory insists…

I’m probably confused, but I never had the impression that electroweak unification aspired to answer the “why” kinds of questions that people find so unsatisfactory about the Standard Model. But is this limitation a good reason to suspect that unification may somehow be illusory? I just can’t tell if Feynman is being overly scrupulous here, because I can’t figure out exactly what he’s being scrupulous about.

If we drop the U(1)_Y group from the Standard Model, there will only be the weak and strong interactions and no photon. The photon owes its existence entirely to the U(1)_Y group. In the Standard Model there are 3 forces and 3 gauge groups. There is electroweak physics, but the words “electroweak unification” probably just mean that it is the unified framework of QFT, rather than a unification of forces as in grand unified or even Pati-Salam (partially unified) models.

And of course there is no evidence so far for GUTs or Pati-Salam, and a lot of parameter space is being ruled out. If there is no such unification of forces in nature (which appears to be more and more the way experiments are pointing), then we will just have several forces and several gauge groups.

Yes, the story is from “Genius” by Gleick. Search for “three theories”.

(I was wrong to add “different”, but if you read the text, you will see that Feynman meant it.)

Anon: Your ideas are very close to those of Veltman, who expresses them in a colourful and forthright way in his 1999 Nobel prize interview. Veltman stresses that nature could just as well be non-unified, so discussions in the popular literature of unification are a form of (annoying and possibly stupid) propaganda for that area of research. I recommend the whole 30 min interview for those who have not already seen it – it is informative as well as entertaining. In the interview he also gives a ‘Veltman centric’ (and on the face of it correct) explanation for why the LHC was built but the SSC was not!

^^^ 6:39 of Veltman’s interview here: http://www.nobelprize.org/nobel_prizes/physics/laureates/1999/veltman-interview.html

Veltman’s Nobel lecture also makes the point, http://www.nobelprize.org/nobel_prizes/physics/laureates/1999/veltman-lecture.pdf (PDF file)

“Usual counterargument involves the putative Planck scale and unification thresholds, but this is speculative”

Is there a hierarchy problem if there is no Planck scale or unification threshold?

anti,

No. The hierarchy problem is that of why the electroweak mass scale is so much smaller (and thus “fine-tuned”) than the GUT or Planck scale. If you give up the assumption that the SM is just some effective theory for some other theory with those mass scales, then the “fine-tuning” problem disappears.
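The scale of this fine-tuning is easy to put in numbers. A rough sketch (using the conventional one-loop top-quark estimate with a hard cutoff Λ; the inputs and the tuning measure are illustrative textbook choices, not anything from the talks discussed here):

```python
import math

# Rough inputs in GeV (conventional illustrative values)
m_h = 125.0   # observed Higgs mass
y_t = 0.94    # top Yukawa coupling near the weak scale

def tuning(cutoff):
    """Naive one-loop top-quark shift to m_h^2 with a hard cutoff,
    delta m_h^2 ~ 3 y_t^2 Lambda^2 / (8 pi^2), compared to the observed m_h^2."""
    delta = 3 * y_t**2 * cutoff**2 / (8 * math.pi**2)
    return delta / m_h**2

for cutoff in (1e3, 1e16, 1.2e19):  # TeV scale, "GUT" scale, "Planck" scale
    print(f"Lambda = {cutoff:.1e} GeV  ->  tuning ~ 1 part in {tuning(cutoff):.1e}")
```

With a TeV-scale cutoff the correction is of the same order as the observed mass; with a GUT- or Planck-scale cutoff the cancellation required is one part in ~10^26 or ~10^32, which is the usual statement of the problem — and which evaporates if no such cutoff scale is assumed.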

peter

What are the physical consequences and predictions if you give up the assumption that the SM is just some effective theory for some other theory with those mass scales?

anti,

Giving up such assumptions about unification leaves you with just the SM, it doesn’t make the SM itself more predictive, and the origin of the electroweak scale remains a mystery. It does get rid of certain “problems” that appear generically when you make those unification assumptions (e.g. the hierarchy problem).

Is that a serious proposal among particle physicists? What is your own personal view?

anti,

It’s not a “serious proposal”, it’s just a reminder that the “hierarchy problem” is based on certain assumptions that we do not know are correct. I’ve often made this point here, and I think it’s understood by most people in the field. I was just interested to see that Lykken was making the same reminder in his survey talk, since it is not often brought up explicitly in such talks.

sm, Anonyrat — thanks for the pointer to Veltman – that is interesting.

If we do not have unification, we can of course still have new physics, such as from a higher theory based on parity (P) symmetry; another way to proceed could be a horizontal gauge symmetry between generations. We can then see what regions of their parameter space can be tested. Non-unification would motivate more work in such directions, I think.

Peter — Normally I agree with you on most things you write and am a regular visitor to your blog. However, on the hierarchy problem, I think it does not matter whether there is or isn’t new physics at an intermediate scale such as the GUT scale — we have the hierarchy problem even if it is the standard model all the way up to the Planck scale — or, if we argue that it is a non-problem in this case, then I think an argument can also be made that it is a non-problem even if there are intermediate GUT scales.

This is because the Planck scale still exists even if there are intermediate scales — and the divergence due to the Planck scale can be fine-tuned to give any mass, just as in the pure standard model with a Planck-scale cutoff.

Anyway there are more direct ways to see that not having intermediate scales is not a solution to the hierarchy problem.

I do think that experiments are showing that the hierarchy problem is not solved in nature, and I feel it may be that it’s a non-problem — whether or not there are intermediate scales.

Anon,

The point is the same for the Planck scale. You’re making an assumption that there is some different physics that acts as a cutoff for the SM, and that this different physics takes place at the Planck scale. But we actually know nothing at all about what is happening at the Planck scale (this is why Lykken refers to it as the “putative Planck scale”).

The standard argument for quantum gravity coming into play at the Planck scale is dimensional analysis. We know Newton’s constant, it’s a dimensional number, and if we assume it can be calculated as some number of order one times the scales in the problem, we get the Planck scale. But (whatever string theorists might tell you), we don’t have a viable theory in which to do such a calculation. How do you know that if we could do this calculation, we won’t get some exponentially small number, not a number of order one (for example, some “induced gravity” theory, where the Einstein-Hilbert action is an effective action, caused by some exponentially small effect)?
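For reference, the dimensional-analysis estimate itself is a one-liner (a sketch with approximate CODATA values; the point above is precisely that nothing guarantees the true quantum-gravity scale actually sits there):

```python
import math

# Approximate SI values
hbar = 1.0546e-34   # J s
c    = 2.9979e8     # m / s
G    = 6.674e-11    # m^3 / (kg s^2)
GeV  = 1.6022e-10   # J per GeV

m_planck = math.sqrt(hbar * c / G)   # Planck mass in kg
e_planck = m_planck * c**2 / GeV     # Planck energy in GeV

print(f"Planck mass  ~ {m_planck:.2e} kg")
print(f"Planck scale ~ {e_planck:.2e} GeV")  # ~1.2e19 GeV
```

This is the entire content of the standard argument: no dynamics, just the assumption that the dimensionless coefficient is of order one.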

Of course I don’t have a viable quantum gravity theory of this kind. But I think it’s important to be reminded of what assumptions one is making, and to remember that we don’t have any evidence for assumptions about what is happening at the Planck scale. All we do know is that the cut-off SM is quadratically sensitive to what is happening at the cut-off. This is undoubtedly a clue, but only a “problem for the theory” if you make some assumptions such as a high quantum gravitational scale acting as a cutoff.

Regarding the electroweak unification — I often like to present it as “half-unification”. Namely, the only thing that makes it more unified than taking the simple product of weak and electromagnetic forces is the fact that SU(2)xU(1) is broken down to the electromagnetic U(1) subgroup which is *not* the second factor in the product (i.e. charge and hypercharge are not the same thing). Other than that, there is really nothing that unifies electromagnetic and weak forces in the way that doesn’t also unify the strong interactions in SU(3)xSU(2)xU(1). Crucially, the number of free parameters is not reduced, which should be the main feature of any proper unification. So while there is something nontrivial in the electroweak unification, it is still not the proper real thing. Therefore half-unification. 🙂

Regarding the hierarchy problem — in the appropriate classical limit, we observe three scales in nature: the Planck scale (i.e. Newton’s gravitational constant), the electroweak scale (i.e. the Higgs mass) and the cosmological scale (i.e. the cosmological constant). The huge discrepancies between these three scales are called the hierarchy problem (the ratio of the squared Higgs mass to the squared Planck mass is around 10^{-34}) and the cosmological constant problem (the ratio of the CC to the fourth power of the Planck scale is around 10^{-122}).
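Both orders of magnitude are easy to reproduce. A quick check in natural units (with round illustrative values for the Higgs mass, the Planck scale, and a ~2.3 meV dark-energy scale):

```python
import math

m_higgs  = 125.0         # GeV
m_planck = 1.22e19       # GeV
rho_cc   = (2.3e-12)**4  # GeV^4, energy density from a ~2.3 meV dark-energy scale

ratio_higgs = (m_higgs / m_planck)**2  # dimensionless mass-squared ratio
ratio_cc    = rho_cc / m_planck**4     # dimensionless energy-density ratio

# Prints ~10^-34 and ~10^-123; the latter is often rounded to 10^-122,
# and shifts with the assumed dark-energy scale.
print(f"Higgs^2 / Planck^2 ~ 10^{math.log10(ratio_higgs):.0f}")
print(f"CC / Planck^4      ~ 10^{math.log10(ratio_cc):.0f}")
```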

At the fundamental quantum level, it may happen that the ratios of these scales are of order one (case A). If that is the case, the hierarchy and CC problems become the problems of explaining the differences in the scales when taking the semiclassical limit of the theory. This has to be quite nontrivial, since we need to obtain some extremely small numbers by solving equations which feature only numbers of order one. It is likely a nonperturbative effect, and given some putative theory of everything, it is ultimately a mathematical problem.

OTOH, it may also happen that at the fundamental quantum level the ratios of the three scales remain basically the same as in the classical regime (with only small quantum corrections). In that case (B), the hierarchy and CC problems become the unanswered “why” questions for the fundamental theory. Then it is a matter of taste whether or not they are considered problems (a failure in our understanding of nature) or just facts of the real world (either by God-did-it design or by multiverse-did-it randomness). While the design/randomness resolutions are metaphysical, I prefer to think of the hierarchy and CC problems as our failure to understand something, which maintains them as physical problems, and means that our putative theory of everything is still not fundamental enough.

To sum up, case (A) requires a proper fundamental theory and a proper mathematical resolution of both problems. Case (B) either reduces to case (A), or denies the hierarchy and CC problems at the metaphysical level.

HTH, 🙂

Marko

Very interesting. Thanks all, learned a lot (with the awareness, of course, that a deeper understanding of most of the relevant facts is beyond me). There just isn’t much discussion in the popsci literature about what nature might look like in the absence of a “unified” TOE of the sort you are all familiar with. These subtleties are glossed over to the point that it becomes difficult for the interested layperson to even imagine how or why it might not be so, or what HEP might focus on profitably in place of the putative new broken symmetry at the TeV scale… or any scale.

Peter and Marko,

Thanks for your comments. What I feel is that there is a procedure for doing calculations that gives finite and meaningful answers — regularization and renormalization — that works for the Standard Model as well as for GUTs and non-GUT extensions of the standard model (that have more mass scales than the standard model).

I think therefore that if the hierarchy problem is a non-problem in the Standard Model, it is also a non-problem in any renormalizable extension of the standard model that may have additional mass scales.

With neutrino masses and mixing we are already going beyond the standard model. Any theory that takes them into account will be consistent and will only lead to a different set of “why” questions.

— if we just add 3 neutrinos and give them small Dirac masses so as not to introduce any new scale, then the why questions would be: why are the Dirac Yukawa couplings so small, ~10^-12 (especially nagging for the third generation)? Why is there an ungauged global B-L symmetry? Why is electric charge quantized if there are no Majorana mass terms? Why is the Majorana mass term absent?

— if we add 3 neutrinos to the Standard Model and add a Majorana mass term, this will lead to a new mass scale (let’s say much greater than the SM scale, for example 10^14 GeV), and all the above why questions disappear. A new why question would be: why is the seesaw scale 10^14 GeV? The hierarchy problem (or hierarchy non-problem), which was anyway there in the SM, remains a hierarchy problem (or non-problem) in this case.

In both the above cases we can do QFT, renormalization and get finite answers for measurable quantities.
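The two orders of magnitude quoted above follow from a one-line estimate (a sketch with illustrative inputs: a 0.05 eV neutrino mass and the Higgs vev v ~ 174 GeV):

```python
# Quick check of the two numbers above (rough, illustrative inputs)
m_nu = 0.05e-9   # GeV, heaviest light-neutrino mass scale ~0.05 eV
v = 174.0        # GeV, Higgs vacuum expectation value

y_dirac = m_nu / v       # Yukawa coupling needed for a pure Dirac mass
m_seesaw = v**2 / m_nu   # Majorana (seesaw) scale for an O(1) Dirac Yukawa

print(f"Dirac Yukawa ~ {y_dirac:.1e}")       # ~3e-13, i.e. of order 10^-12
print(f"Seesaw scale ~ {m_seesaw:.1e} GeV")  # ~6e14 GeV, i.e. of order 10^14
```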

Anon,

This has now wandered far away from the topic of the posting. Questions such as the ones you bring up about neutrino masses are important, but a very different topic, and I have to remind everyone that I’m not able/willing to run this blog as a general discussion forum.

Hi Peter,

I don’t understand what you mean by “the assumption that the SM is just some effective theory”. The U(1) and scalar sectors of the standard model are renormalizable only within perturbation theory; non-perturbatively they suffer from the triviality problem. As such, the standard model as it stands strictly only makes sense if defined with a cutoff.

Oliver,

Yes, all indications are that the SM is not a complete theory, and that something new has to come into play at shorter distance scales. My comment was very over-simplified. One point it was meant to convey is that we know that appropriate gauge field + fermion theories are asymptotically free and make perfectly good sense at all distance scales, not just as an effective theory. This doesn’t work for the U(1) factor in the SM, but one can assume for instance some GUT scenario where at short distances there is no U(1) factor (it is part of a larger gauge symmetry).

The real problem is the scalars, with that part of the theory only seeming to make sense with a cutoff. One should keep in mind though that at this point we have limited experimental information about the scalar sector of the theory. While we know much more since the observation of the Higgs and many of its decays, still nothing is known experimentally about scalar field self-interactions. It is the scalar sector and its Yukawas that are responsible for most of the parameters in the SM that we don’t know how to calculate. It also has the odd property that to a good approximation the self-coupling is running to zero.

So, my argument is more that most of the ingredients of the SM make sense at arbitrarily short distances, with the main exception the scalar sector, which is the least understood and most problematic. One can imagine a deeper insight into the scalar sector which would change its high-energy behavior to make the triviality problem disappear (not that I have the vaguest idea how to do that).

The Ponderosa Ranch location was near Lake Tahoe. It was open as a tourist attraction until 2004. Those of us too young for the show (Lorne Greene was on a TV show before Battlestar Galactica? Who’s Lorne Greene?) still enjoyed visiting the ranch.

LMMI… thanks so much for the link to the story of Feynman and SU(3)xSU(2)xU(1). Made my day. But there is something to the concept of the Conserved Vector Current. Perhaps Veltman nailed it. Seems to me there was once a mini-industry of testing from the ground up whether the electroweak Lagrangian was really right, but everything was so consistent that eventually everyone caved. Well, there are still some STUck on similar concepts. Good.