The May issue of Scientific American has a very good cover story by Joe Lykken and Maria Spiropulu, entitled Supersymmetry and the Crisis in Physics (the article is now behind their subscriber paywall, but for those with access to Nature, it will soon be here).

Here are some excerpts:

It is not an exaggeration to say that most of the world’s particle physicists believe that supersymmetry *must* be true—the theory is that compelling. These physicists’ long-term hope has been that the LHC would finally discover these superpartners, providing hard evidence that supersymmetry is a real description of the universe…Indeed, results from the first run of the LHC have ruled out almost all the best-studied versions of supersymmetry. The negative results are beginning to produce if not a full-blown crisis in particle physics, then at least a widespread panic. The LHC will be starting its next run in early 2015, at the highest energies it was designed for, allowing researchers at the ATLAS and CMS experiments to uncover (or rule out) even more massive superpartners. If at the end of that run nothing new shows up, fundamental physics will face a crossroads: either abandon the work of a generation for want of evidence that nature plays by our rules, or press on and hope that an even larger collider will someday, somewhere, find evidence that we were right all along…

During a talk at the Kavli Institute for Theoretical Physics at the University of California, Santa Barbara, Nima Arkani-Hamed, a physicist at the Institute for Advanced Study in Princeton, N.J., paced to and fro in front of the blackboard, addressing a packed room about the future of supersymmetry. What if supersymmetry is not found at the LHC, he asked, before answering his own question: then we will make new supersymmetry models that put the superpartners just beyond the reach of the experiments. But wouldn’t that mean that we would be changing our story? That’s okay; theorists don’t need to be consistent—only their theories do.

This unshakable fidelity to supersymmetry is widely shared. Particle theorists do admit, however, that the idea of natural supersymmetry is already in trouble and is headed for the dustbin of history unless superpartners are discovered soon…

The authors go on to describe possible responses to this crisis. One is the multiverse, which they contrast to supersymmetry as not providing an answer to why the SM parameters are what they are, although this isn’t something that supersymmetry ever was able to do. Another is large extra dimensions as in Randall-Sundrum, but that’s also something the LHC is not finding, with few ever thinking it would. Finally there’s the “dimensional transmutation” idea about the Higgs, which I wrote about here last year. About this, the authors write:

If this approach is to keep the useful virtual particle effects while avoiding the disastrous ones—a role otherwise played by supersymmetry—we will have to abandon popular speculations about how the laws of physics may become unified at superhigh energies. It also makes the long-sought connection between quantum mechanics and general relativity even more mysterious. Yet the approach has other advantages. Such models can generate mass for dark matter particles. They also predict that dark matter interacts with ordinary matter via a force mediated by the Higgs boson. This dramatic prediction will be tested over the next few years both at the LHC and in underground dark matter detection experiments.

It’s great to see such a high-profile public discussion of the implications of the collapse of the paradigm long-dominant in some circles which sees SUSY extensions of the Standard Model as the way forward for the field. One place where I disagree with Lykken and Spiropulu is their claim that “It is not an exaggeration to say that most of the world’s particle physicists believe that supersymmetry must be true.” Actually I think that *is* an exaggeration, with a large group of theorists always skeptical about SUSY models. For some evidence of this, take a look at this document from 2000, which shows a majority skeptical about SUSY at the LHC. By the way, I hear those on the right side of that bet haven’t yet gotten their cognac, with the bet renegotiated to wait for results from the next LHC run.

**Update**: I hear that the 2000 bet was revised in 2011, with a copy displayed publicly at the Niels Bohr Institute. The new bet is about whether a superpartner will be found by June 16, 2016, and the losers must come up with a bottle of good cognac. There are 22 on the yes side (including Arkani-Hamed and Quigg), and 22 on the no side (including ‘t Hooft, Komargodski, Bern). Also, 3 abstentions. It explicitly is an addendum to the 2000 wager, with those who lost the last one given the option of signing again, forfeiting two bottles of cognac, or accepting that “they have suffered ignominious defeat.”

**Update:** This report from the APS spring meeting includes the following about Spiropulu’s talk there:

Supersymmetry and dark matter have become so important to particle physicists that “we have cornered ourselves experimentally,” said Spiropulu. If neither is detected in the next few years, radical new ideas will be required. Spiropulu compared the situation to the era before 1905, when the concept of ether as the medium for all electromagnetic waves could not be verified.

You can watch the talk and see for yourself here.

“renegotiating a bet” ??? where did they grow up ? i hope they have to pay with more cognac.

The same issue of Scientific American also has a nice article on Ken Ono (et al) and Ramanujan.

“Actually I think that is an exaggeration, with a large group of theorists always skeptical about SUSY models.”

There’s a name for it: False-consensus effect (look it up on wikipedia). Really, once you read through a list of common cognitive biases, it’s hard not to notice just how widespread they are in science, and the present susy discussion is a particularly extreme example with people trying to hang onto their hopes.

Surely models such as the ones Lykken is describing would be at least as malleable as susy when it comes to pushing predictions out of range of existing experiments. But it would certainly be interesting to see the second of the 1990-era naturalness dogmas overturned just as lambda=0 has been…

Peter, does this article mention the Shaposhnikov-Wetterich model on asymptotic safety? I bet it does not.

Shantanu,

The article is mostly about not finding SUSY, no details on alternatives. I quoted most of the substantive part about “dimensional transmutation”, the only specific reference was “Building on seminal work by William A. Bardeen…”. There was also reference to the fact that the Higgs mass is at the edge of the metastability region, but about the significance of that, just “Nature is trying to tell us something, but we don’t know what.”

” . . . no details on alternatives.”

Peter, what alternatives do you think deserve more attention? Thanks.

Mike,

The problem with SUSY models has always been that they don’t really solve any of the problems of the SM. So, there’s no need to look for alternatives to SUSY in that sense. In the more general sense of ideas about how to improve the SM, given the huge amount of attention paid to SUSY, and the experimental disconfirmation, there’s a good argument that attention would better be paid to almost anything else. One thing I don’t want to do though is host and moderate here a discussion of everyone’s favorite ideas on this.

Nature might be trying to tell us this message: I HAVE NO MASS SCALE.

In such a case, power divergences must vanish and the naturalness problem changes. The Planck scale and the weak scale and the Dark Matter scale can all be generated dynamically.

AS,

“Nature might be trying to tell us this message: I HAVE NO MASS SCALE.”

While this is slightly off-topic, I’d have to disagree. Nature does not appear to be conformally invariant, and you cannot get nonzero mass scales out of nothing. Dimensional transmutation from conformal anomaly is a mirage, it doesn’t really generate a dimensionful quantity out of dimensionless ones (when you think about it, it is mathematically impossible to do such a thing). So at least one mass scale has to exist in nature, say Planck scale, and the ratio of Higgs mass to Planck mass then turns out to be extremely small. The “hierarchy problem” is the fact that physicists find the smallness of that ratio psychologically unsettling, and try to look for an explanation.

HTH, 🙂

Marko

“renegotiating a bet” ??? where did they grow up ?

Perhaps the no-SUSY side decided that they wanted to win on the merits rather than on a technicality?

vmarko,

Pure QCD is an example where you do get non-trivial dimensionful quantities out of dimensionless ones, because of the way renormalization works. As for the significance of the Planck scale, I don’t think one should assume anything about the relation of quantum gravity to the Higgs.

Another way to say this might be that the physics community has been obsessing about supersymmetry, while knowing that SUSY breaking is required to make it work and we don’t have a good idea about SUSY breaking. Instead of this, it might be a good idea to spend time obsessing about conformal symmetry, while knowing that conformal symmetry breaking is required to make it work, and we don’t (yet) have a good enough idea about conformal symmetry breaking.

Peter,

You once wrote an eloquent answer to an Edge question about eloquent explanations, explaining why you think symmetry is an eloquent and beautiful explanation. I am assuming what you said there is one reason why people think SUSY is compelling regardless of the results so far at LHC.

My question. Do you have a reason to believe that SUSY is less eloquent and compelling than other symmetries in physics, or is your concern solely on the lack of evidence?

Neil,

The issue of supersymmetry in general is a complicated one. As a general idea, supersymmetry does have some very compelling features. One of the simplest examples of the general phenomenon is that the Klein-Gordon operator has a square root, the Dirac operator, and in some sense the Dirac operator is a generator of a supersymmetry. This is quite fundamental and one of the most beautiful and compelling ideas in the subject. The details of this are in the notes of my quantum mechanics course.

What is being tested though at the LHC is a much more complicated example of a supersymmetry, the idea that you can start with the SM, and extend it to get a SUSY theory with a super-Poincare group invariance. This is what doesn’t work so well: you get theories that are much more complicated than the SM, with a new symmetry that acts trivially on the SM, telling you nothing new about it. It leads to a complicated ugly theory since you have to break the supersymmetry, and the LHC results bear out the expectation that this isn’t the way to go.

I wouldn’t be at all surprised if there is some deeper insight into the SM that will come from identifying some sort of “supersymmetry” of it, one that is more like the supersymmetry generated by the Dirac operator, not necessarily coming from the super-Poincare group used in SUSY extensions of the SM.

So, I think it’s a promising idea that there may be an eloquent “SUSY” explanation of fundamental physics, I just don’t think the version we have now qualifies, and the LHC is confirming that.

Peter,

“Pure QCD is an example where you do get non-trivial dimensionful quantities out of dimensionless ones”

No, it isn’t. When you solve the renormalization group equation for the running of the strong coupling, it becomes equal to one at a certain energy, called the “QCD scale”. What people often fail to notice here is that this equation also features an arbitrary dimensionful integration constant, which needs to be experimentally determined. IOW, the whole thing is a cheap trick, since it expresses the QCD scale in terms of the “renormalization scale” at which you measure the initial value of the strong coupling constant (before you let it run up to unity). The key words here are “experimentally determined”, because in experiment you cannot shut off quark masses. They invariably enter the process of measurement through the back door, and ultimately participate in fixing the QCD scale.

It is also easy to see this without any physics. Can you give me an example of an equation which has “1 kg” on the left-hand side, and the right-hand side built purely out of dimensionless quantities? No? Then it cannot be done in QCD either, nor in any other theory. Such an equation is not mathematically self-consistent, on “dimensional grounds”, regardless of any physics.
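To make the integration-constant point explicit, the standard one-loop running can be written schematically (with \(b_0\) the leading beta-function coefficient):

```latex
\mu \frac{d\alpha_s}{d\mu} \;=\; -\frac{b_0}{2\pi}\,\alpha_s^2
\qquad\Longrightarrow\qquad
\alpha_s(\mu) \;=\; \frac{2\pi}{b_0 \,\ln\!\left(\mu/\Lambda_{\rm QCD}\right)}
```

Here \(\Lambda_{\rm QCD}\) is precisely the arbitrary dimensionful integration constant: the solution relates one scale to another scale, and \(\Lambda_{\rm QCD}\) itself has to be fixed by measurement.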

I am actually surprised by the number of people (including some Standard Model experts!) who don’t get this stuff right. It is usually explained in standard graduate courses on QCD and nonabelian gauge theories.

Best, 🙂

Marko

vmarko,

I’m talking about pure QCD, no fermions, i.e. SU(3) pure-gauge theory. All parameters in the Lagrangian are dimensionless, but it has a very non-trivial spectrum of dimensionful numbers, with the scale set by what you call a “cheap trick”, and I call a deep insight into nature…

Peter,

I understood what you meant by pure QCD. But the RG equation is a first-order differential equation, and doesn’t specify a single curve for the running of the coupling constant, but a family of curves. The initial condition which singles out one curve out of this family, i.e. the value of the coupling at a certain renormalization scale, cannot be predicted from the theory (not even in the elusive ab initio calculation). So it is a free parameter in the theory, and needs to be fixed by measurement.

Alas, in the real world QCD is not “pure”; it is always coupled to massive fermions. So any measurement of the coupling at some renormalization scale, and subsequent running up to the QCD scale, does not really apply to pure QCD. From a theory POV, pure QCD therefore contains the renormalization scale as a free parameter, and given no way to fix it, there is no way to say that the theory really “dynamically generates” any mass scale from purely dimensionless coupling constants.

Whether this is a cheap trick or a deep insight is a matter of perspective, but the bottomline is that there is no way to express a scale in terms of scale-free quantities. The only possible thing is to express a scale in terms of another scale, renormalization or otherwise.

Best, 🙂

Marko

@Peter,

Are there any particular searches in LHC data that you think constitute severe tests of BSM physics, SUSY or otherwise? I don’t expect there to be any silver bullet but studying some processes must probe the relevant physics better than others.

I agree with you that the “the end is just over THIS next hill, I promise” runaround from many theorists is getting quite old, but this is from someone outside of HEP. From a Bayesian perspective this near infinite flexibility when constructing models is far more a vice than a virtue. The willingness to dismiss this concern, on a scientific level, is baffling.

West,

I’m not an expert on the LHC searches, and this is a really complicated subject. For each different sort of BSM model, a lot of effort has gone into finding possible signatures, and by now many have been looked for by the LHC experiments. SUSY extensions to the SM have many extra parameters, but it is these models that have been most intensively studied.

One of the simplest things to look for is the gluino, the superpartner for the gluon, so it is strongly interacting and thus should in general be produced copiously. It’s also something that can’t be pushed up to arbitrarily high masses without ruining the standard arguments for SUSY (naturalness, coupling constant unification). From what I’ve seen, it now looks like gluinos pretty much have to have masses significantly over 1 TeV to have escaped notice at the LHC, and this is in strong disagreement with the expectation that superpartners should be around the electroweak scale (100 GeV).

“it might be a good idea to spend time obsessing about conformal symmetry, while knowing that conformal symmetry breaking is required to make it work, and we don’t (yet) have a good enough idea about conformal symmetry breaking.”

– right on. This is a very promising direction.

Dear vmarko, let me explain better.

Following Bardeen, some people impose scale or conformal symmetry in order to argue that one must select a regulator that respects them, so that power divergences must vanish and the naturalness problem is circumvented. This has the problems that you mention.

Instead, I am proposing something different: that at fundamental level only adimensional couplings exist. Then, power divergences must vanish because they have mass dimension but there are no masses. Scale invariance would just be like baryon number in the SM: an ACCIDENTAL symmetry broken by quantum corrections. In such a context, scales can be generated by quantum corrections, and this paper proposes how the weak and the Planck scale can be generated http://arxiv.org/abs/1403.4226

The greatest physical and social scientists are the ones who can adjust their thinking as new facts and data come in. The thing is, rejecting existing beliefs is hard. At least the supersymmetry debate is generally peaceful; remember what happened to Galileo and Copernicus, or Darwin and Huxley!

At least the debates over heliocentrism and evolution were essentially philosophical and theological; the debate over fancy physics has become a war of careers and face. It is hard to accept new ideas when your job depends on not knowing them.

AS,

“http://arxiv.org/abs/1403.4226”

I assume you are one of the authors of this paper? It is actually an excellent example of what I’ve been trying to say regarding dimensional transmutation.

The key is the third requirement in eq. (44), which says that the Planck mass is proportional to the vev of the scalar field S. You don’t seem to explicitly calculate that vev anywhere in the paper (but only note that it is nonzero due to Coleman-Weinberg mechanism). But if you actually try to do it, you’ll find that the vev of S is proportional to the renormalization scale \mu. This is the only possible choice, since there are no other fundamental scales in the theory. Consequently, M_pl ~ \mu as well.

But this is in stark contradiction with the classical (Newtonian) limit, where one measures M_pl ~ const as \mu->0.

Therefore, the model described in the paper is indeed scale-free, but the induced Planck scale runs along with renormalization scale, and goes to zero in the IR sector. The same holds for the Higgs scale, since all induced scales in the model are proportional to \mu. Therefore, all SM masses (fermions, Higgs, Z/W bosons, etc.) also go to zero when \mu->0.

In experiment, however, we observe that all these masses remain nonzero in the IR limit, which indicates that the fundamental theory must feature at least one mass scale other than \mu, and is therefore not scale-invariant.

I’m afraid we are getting off-topic here. Since I believe I’ve made my point already, I wouldn’t like to pollute Peter’s blog with this any more. 🙂 We can discuss this further privately if you wish.

Best, 🙂

Marko

vmarko,

No objection to your discussion with AS, it actually is on-topic, the kind of thing the SciAm article was discussing as an alternative to SUSY.

It is good to know that Nima and several others are still betting/predicting that the LHC will find SUSY by 2016. Not finding it would then mean that the next run of the LHC will provide a further blow to SUSY, as it can’t be argued that it was not expected to be found anyway.

Also, if the fine-tuning arguments of the hierarchy problem are to be believed, and if SUSY is not already ruled out for the next run, then finding it in the next run always has a higher probability than not finding it there but finding it at the next collider at an even higher energy scale. If you think of each energy doubling of a collider as a run or as a new collider, then the probability of finding SUSY particles goes as “p, p/4, p/16, etc.” for “run 1, run 2 (but not at run 1), run 3 (but not at run 2 or 1), etc.”. p/4 + p/16 + p/64 + …. = p/3. So the next run of the LHC (i.e. the LHC at 13 or 14 TeV) has roughly 3 times the chance of finding SUSY of all the future machines combined.

Ravi,

The latest bet was from September 2011. Now that pretty full results from Run 1 are in, it would be interesting to know whether Arkani-Hamed and other signers of that bet are still willing to bet on SUSY in the first year of Run 2.

There is a slight correction to the above — the probability will go down by factor of 2 with each doubling not factor of 4…… so the sequence we need to add is p/2 + p/4 + p/8 + …. = p. So LHC at 13/14 TeV has the same chance (p) of finding SUSY, as all future colliders would have together (if LHC doesn’t find it).
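As a quick numerical sanity check of the corrected series (a sketch; the overall normalization `p` is arbitrary):

```python
# With the probability of a SUSY discovery halving for each doubling of
# collider energy, the chance of finding it in "all future machines"
# (given the next run fails) is p/2 + p/4 + p/8 + ... = p,
# while the earlier (incorrect) quartering gives p/4 + p/16 + ... = p/3.
p = 1.0  # arbitrary normalization

halving = sum(p / 2**k for k in range(1, 60))     # p/2 + p/4 + ...
quartering = sum(p / 4**k for k in range(1, 60))  # p/4 + p/16 + ...

print(round(halving, 10))     # -> 1.0 (equals p)
print(round(quartering, 10))  # -> 0.3333333333 (equals p/3)
```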

Yes Peter, it would be very interesting to find out their bets given that the LHC’s 7/8 TeV results are fully out. However, by Sept 2011 results from 1 inverse fb of data were already available, so I don’t think their bets on SUSY particles would change. Anyway, it will be good to know what their prediction/bet would now be for finding SUSY at the next run of the LHC.

“Perhaps the no-SUSY side decided that they wanted to win on the merits rather than on a technicality?”

i’ll try that with my bookie next week

yes, I am the author AS. I insist that dimensional transmutation works in the same sense in which it works in QCD.

In the usual Coleman-Weinberg mechanism the effective potential develops a minimum, which roughly is at the special RGE scale at which a running quartic coupling vanishes. Using the running coupling is just a trick to simplify the computation. The effective potential does not depend on the RGE scale mu.

In the gravitational case the situation is somehow different, and again the simplified argument based on the RGE running is just a trick that allows to establish that a flat-space minimum can arise, without computing the effective potential away from flat space.
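Schematically, the RG-improved one-loop effective potential for a classically scale-invariant quartic takes the standard textbook form (a sketch, with \(c\) a numerical loop coefficient; this is not a formula from the paper):

```latex
V_{\rm eff}(\phi) \;\approx\; \frac{\lambda(\mu)}{4}\,\phi^4
\;+\; c\,\frac{\lambda(\mu)^2}{16\pi^2}\,\phi^4
\left[\ln\frac{\phi^2}{\mu^2} + \text{const}\right]
\;\approx\; \frac{\lambda(\mu=\phi)}{4}\,\phi^4
```

The explicit \(\mu\)-dependence cancels against the running of \(\lambda\), and the minimum sits near the scale where the running quartic crosses zero — the sense in which the choice of \(\mu\) is only a computational convenience.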

Some 50 years ago particle physicists hoped to derive masses of “elementary” particles from a deep underlying principle. Are we getting there? Is it still a goal?

Were R. Feynman involved in this betting, he might have insisted on a sometimes-used Blackjack rule: early/late surrender. This would allow the 2011 SUSYphiles to pull out of their bet now (if they so wish) at the cost of only 0.375 liters of brandy.

Perhaps a comparison with the AS + AS proposal, discussed above, could help to illustrate why it might still be worthwhile to study models based on compactifications of M-theory, notwithstanding the lack of encouragement from the LHC so far. If I understand the AS + AS paper correctly, they envision that the fundamental theory of physics, including gravity, is a power-counting renormalizable quantum field theory in 4 dimensions, like the Standard Model, that is defined by some discrete choices of gauge groups and matter representations, plus around 20 real number parameters, whose values have to be determined by experiment – and once those real number parameters have been measured, no further understanding of their values can ever be achieved. In contrast to this, M-theory compactifications attempt to explain both the discrete and the continuous data defining the Standard Model, starting from purely discrete data that specifies, for example, the topology of a 6 or 7 dimensional manifold, plus some quantized fluxes and stacks of branes wrapping some cycles of the manifold.

Attempting to realize the SM by an M-theory compactification can also suggest possibilities that might otherwise go unsuspected, and could eventually have technological applications. An example is provided by large volume compact extra dimensions, with a quantum gravity scale in the range from say 10 TeV to 100 TeV. Here the biggest challenge is no longer the lack of encouragement from the LHC, but BICEP2: the de Sitter radius during inflation would be unlikely to be much smaller than the curvature radius of the compact dimensions, so the tensor-to-scalar ratio r would be far too small for BICEP2 to detect. However about half the BICEP2 B-mode signal could arise from primordial magnetic fields rather than primordial gravitational waves, (Bonvin et al), so if the remainder could arise from under-estimated foreground contamination, (Armitage-Caplan et al), lvced could remain viable. Something must then keep the proton adequately stable, and one possibility is a gauged baryon or lepton number, that is broken down to a Z_2 gauged discrete symmetry by a gaugino condensate, (Fileviez Perez and Wise). Isolated protons are then stable, but dibaryon annihilations turn on above about 10 TeV. If something could then catalyse dibaryon annihilations, the prospects for practical interstellar travel could be improved, because it would be possible, in principle, for a rocket engine to convert a substantial fraction of the rest mass of ordinary matter to the kinetic energy of light decay products such as charged pions, and to utilize a significant fraction of that kinetic energy for propulsion. Proton decay can be catalysed by magnetic monopoles in grand unified theories, (Brihaye et al), and CMS and ATLAS are searching for baryon number violation and magnetic monopoles.

AS wrote:

“Using the running coupling is just a trick to simplify the computation. The effective potential does not depend on the RGE scale mu.”

In theory it doesn’t, but in practice it does. In theory, to remove the dependence on the renormalization scale you have to sum the whole perturbation series, but in practice you can do perturbation theory only to a certain order; this way some dependence on the renormalization scale is inevitable.

So I wouldn’t call it a trick.

I dislike all this talk of ‘crisis’ and ‘panic’ in science magazines and blogs, I suspect it has more to do with science journalism than science. I met several of the early pioneers of SUSY through my father and I never heard one of them give any indication that they felt it was a symmetry that ‘must’ be realised in nature.

Theoreticians like to explore different models, as you know, not to dictate to nature how to behave. Of course, a certain investment arises when apparatus is built to test theories, but that’s what they are – tests.

There always was, and always will be, the possibility of a desert. Crisis my arse!

AS,

“I insist that dimensional transmutation works in the same sense in which it works in QCD.”

I agree with that. The point I am trying to make is that dimensional transmutation in general — in your model, in QCD, and anywhere else — is just another way to introduce a mass scale into a theory by hand. There is no conceptual difference between postulating a mass scale term in the classical action and postulating a mass scale term in the first loop correction of the action. One way or another, you introduce the scale by hand.

“In the usual Coleman-Weinberg mechanism the effective potential develops a minimum, which roughly is at the special RGE scale at which a running quartic coupling vanishes.”

Sure, a nonzero minimum does exist for that RGE scale, and all scales below it. The problem is that the *position* of that minimum is not determined by the theory. Your equation (43) which determines this position is a first-order differential equation, and it determines where the minimum is up to an arbitrary multiplicative dimensionful constant. In order to fix the position of the minimum (i.e. the vev for S), you have three choices:

(1) choose it to be zero, in which case the theory still does not have any scale, and is in contradiction with experiment;

(2) choose it to be proportional to the RG scale \mu, in which case the theory contradicts experiment in the IR limit;

(3) choose it to be proportional to the Planck scale, in which case you are putting the Planck scale into the theory by hand.

The option (3) is actually in contradiction with your principle from the beginning — that nature does not have a mass scale. There is nothing being “dynamically generated” by the theory in any profound fundamental sense. The third requirement in (44) is actually a postulate that nature *does* have a scale, and this scale fixes the position of the vev for S.

That is why I characterized dimensional transmutation as a cheap trick — the presence of the scale in the theory has to be *postulated*, one way or another, so that the theory does not get excluded by experiment. This is necessary in your model, in QCD, in superconductivity models, etc. — everywhere where spontaneous symmetry breaking is generated by quantum corrections.

Hopefully now I have made myself clearer. 🙂

Best, 🙂

Marko

Curious George,

“Are we getting there?” No

“Is it still a goal?” Yes

I’ve always felt that supersymmetry is a compelling idea in general. But the perturbative “carbon copy” approach to implementing this symmetry seems unimaginative. Nature surely has something more clever. Peter makes an essential distinction here in his discussion of the Dirac operator.

When field theorists talk about “dimensional transmutation,” they mean a dimension comes into the theory upon solution or integration — something to do with boundary and/or initial conditions. It’s not part of the dynamics per se. There are plenty of such examples in physics of scale-free dynamics whose solution picks up a scale when the dynamics are integrated and you impose boundary/initial conditions.

Three issues are being elided here: first, that the dynamics are scale-free; second, that there *must be* such a scale in the general solution of the theory, without saying what value it takes; and third, the specific value itself.

The issue of initial/boundary conditions is always a headache when you’re thinking about a general theory of the universe. Initial/boundary conditions come from the “outside” in the canonical approach. But here there’s no “outside.” You have to go to metaphysical speculation or multiversality. Landau used to put it by saying that science is about differential equations, and religion is about boundary conditions.

Curious George, Peter,

As late as the 1950’s, Einstein was asked if he could explain the confusion of hadron particles which were being found in ever increasing numbers. He replied, “I would be happy just to know what an electron is!”

Anyone know an accurate source for that? I found only unreferenced versions of it in a somewhat quick internet search.

Thanks to vmarko, kavanna, and others for the illuminating discussion of how scales enter these theories.

Marko, I would say that a running quartic crosses zero at some scale, and that scale (whatever it is) can be called “Planck scale”.

To be concrete, let us consider a simpler toy a-dimensional model with no gravity: QCD with scalar quarks. Do you agree that the natural value of their quantum-generated masses is the QCD scale?

This is the problem that needs to be solved: finding if the Higgs mass hierarchy problem can have a similar interpretation. To achieve this, one needs a model of quantum gravity where (differently from the usual string models) there are no Planck-scale particles significantly coupled to the Higgs.

Once that one can compute physics above the Planck scale, one can worry about an issue somehow related to what you say: running couplings can indeed set bigger scales via Landau poles.

AS,

“Marko, I would say that a running quartic crosses zero at some scale, and that scale (whatever it is) can be called “Planck scale”.”

Sure, you can do that, but it doesn’t really help. The RGE for the quartic coupling is again a first-order differential equation, and determines the running of \lambda up to a dimensionful constant. Fixing the initial condition \lambda(\mu~M_pl)=0 is again a postulate, rather than a consequence of the theory.

In the paper you have explicitly calculated beta functions for all coupling constants in the model (a formidable task, I might add), but they all determine effective values of the couplings up to some arbitrary initial conditions, each involving a dimensionful constant, i.e. a scale. So regardless of how you set up the theory, you need to introduce the scale — or scales — by hand, they are never predicted by the theory.
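To make the point about integration constants explicit, here is the standard one-loop sketch for a generic coupling g with beta function \beta(g) = b g^3 (the symbols here are illustrative, not taken from the paper under discussion):

```latex
% One-loop RGE for a generic coupling g:
%   dg/d(ln mu) = b g^3.
% Integrating from a reference scale mu_0 gives the running below.
% The *form* of the running is fixed by the theory, but the constant
% of integration g(mu_0) -- equivalently, one dimensionful scale --
% must be supplied by hand as an initial condition.
\frac{dg}{d\ln\mu} = b\,g^{3}
\qquad\Longrightarrow\qquad
\frac{1}{g^{2}(\mu)} \;=\; \frac{1}{g^{2}(\mu_{0})} \;-\; 2b\,\ln\frac{\mu}{\mu_{0}}
```

The first-order equation fixes the trajectory but not the point on it; that one free constant is the scale that has to be put in by hand.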

“To be concrete, let us consider a simpler toy a-dimensional model with no gravity: QCD with scalar quarks. Do you agree that the natural value of their quantum-generated masses is the QCD scale?”

Sure. And do you agree that this QCD scale is completely arbitrary and not fixed by the model itself? Rather, it needs to be specified through an initial condition, external to the theory?
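For reference, the dimensional-transmutation relation being alluded to can be sketched at one loop (this is the standard textbook QCD formula, not specific to the scalar-quark toy model; b_0 is the leading beta-function coefficient with n_f quark flavors):

```latex
% One-loop dimensional transmutation in QCD: with beta(g) = -b_0 g^3,
% the dimensionless coupling g(mu) trades for the scale Lambda_QCD.
\Lambda_{\mathrm{QCD}} \;=\; \mu\,\exp\!\left(-\frac{1}{2\,b_{0}\,g^{2}(\mu)}\right),
\qquad
b_{0} \;=\; \frac{1}{(4\pi)^{2}}\left(11 - \tfrac{2}{3}\,n_{f}\right)
```

Note that \Lambda_{\mathrm{QCD}} is fixed only once g(\mu) is measured at some reference scale \mu, which is exactly the external input at issue in this exchange.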

“This is the problem that needs to be solved: finding if the Higgs mass hierarchy problem can have a similar interpretation. To achieve this, one needs a model of quantum gravity where (differently from the usual string models) there are no Planck-scale particles significantly coupled to the Higgs.”

I fully agree that this is a legitimate research direction for discussing the hierarchy problem. Please don’t get me wrong: I am not criticising your work in general; it’s certainly an interesting idea. I am only criticising the postulate “nature has no scale” that you have stated both in the paper and in a comment on this blog. My point is that this postulate cannot be maintained consistently, since eventually you need to introduce scales into the model, and the only way to do so is by hand, contradicting the postulate. Dimensional transmutation doesn’t really help with that in any way.

Aside from the obvious deficiencies of the model (the presence of ghosts, as noted in the paper), the material presented is actually quite good and interesting. But my advice is to drop the postulate and present the model without any reference to it; the presentation will be conceptually cleaner and less confusing about how scales arise in the theory.

Best, 🙂

Marko

Is it fair to say, at this point, that “naturalness”, at least as far as SUSY is concerned, is all but disproven? Even ignoring previous objections to the idea, it would appear that the combination of the current LHC run and some recent precision measurements should lead us to conclude, with a fair degree of confidence, that sparticle masses (if sparticles exist) are beyond the reach of the LHC. That is, there’s scant reason to expect that doubling the energy of the LHC and running it for another 20 years will tell us much about SUSY that we don’t already know. Fanatics aside, this is a line of reasoning even the most ardent believer in SUSY would find difficult to refute. Correct, or incorrect?

If correct, I fail to see why it’s a crisis rather than a momentous opportunity to shift course. This ought to herald a paradigm shift worthy of the title. Of course, the peril of a shift to anthropic reasoning exists, but I can’t imagine a preponderance of young physicists could stomach such a travesty.

Dear vmarko,

Nature doesn’t care about what we consider to be natural; she has her own plot. The upper bound on the cosmological constant was so small that a number of distinguished physicists believed its only “natural” value must be zero, and they constructed models in accordance with their wishes. Well, the cosmological constant turned out to be unnaturally small, but not zero. In the 1950s, four Nobel laureates proposed the two-component neutrino theory, whereby the neutrino masses were naturally zero. This was taken over by the inventors of the Standard Model, who excluded right-handed neutrinos. They had no reason to do so, except the desire for simplicity.

As the old cliché says: history keeps repeating itself.

So, as one of the instigators of “the bet”, please note that the question was not whether you believe in supersymmetry. The question was whether it would be found by the LHC by a certain date. (The renegotiation was to take into account the LHC accident, the subsequent further two-year delay, and to allow for the next run.) I suspect many people on the list who said that SUSY would not be found still believe in supersymmetry at higher energy scales.

If they believe in SUSY at higher scales, they also believe that the Standard Model holds up to those higher scales. They are putting their eggs in both baskets.

I have no idea what fraction of theorists would say they “believe in supersymmetry at higher energy scales”, but there’s an obvious problem with taking that position. Post-LHC it’s likely to be a very long time before we see data from much higher energy scales, and, unlike the naturalness argument for SUSY by the TeV scale, there’s no serious argument for why SUSY should show up at something like a 10 TeV scale. So, “belief” is very much the operative word, with no prospect of experiment telling believers if they’re wrong. That’s an unfortunately plausible scenario for the future, but not one the physics community should be comfortable with.

Dear Peter,

As you are not an advocate of string theory nor one of supersymmetry, do you have an opinion on what are likely candidates (if there are any) for a viable fundamental theory of nature? (Apologies if this has been addressed elsewhere.)

Tim,

That is off-topic, but a FAQ, so I’ve added something here:

http://www.math.columbia.edu/~woit/wordpress/?wp_super_faq=if-not-string-theory-what-is-your-unified-theory-of-everything

This is why I only study the mathematical structures of nonlinear Bethe–Salpeter-type equations.