Last week Steven Weinberg gave a Lee Historical Lecture at Harvard, entitled Glimpses of a World Within. There’s a report on the talk at the Harvard Gazette.

In essence, Weinberg argues in the talk for an idea that first started to dominate thinking among HEP theorists nearly forty years ago, one sometimes called the “Desert Hypothesis”. The idea is that, looking at what we know of the SM and gravity, one can find indications that the next level of unification takes place around the Planck scale, with no new physics over the many orders of magnitude between the scales we can observe and that scale, at least none that would affect, for instance, the running of coupling constants. The evidence Weinberg gives for this is three-fold (and very old by now):

- He describes listening to Politzer’s first talk on asymptotic freedom in 1973, and quickly realizing that if the strong coupling decreases at short distances, at some scale it would become similar to the coupling for the other fundamental forces. In a 1974 paper with Georgi and Quinn this was made explicit, and he argues this is evidence for a GUT scale a bit below or around the Planck scale.
- He describes the Planck scale, where gravity should be of strength similar to the other interactions. This idea is even older; it was well known by the fifties, I would guess.
- He refers to arguments (which he attributes to himself, Wilczek and Zee in 1977) for a Majorana neutrino mass that invoke a non-renormalizable term in the Lagrangian that would come from the GUT scale.
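For reference, the mechanism behind that third hint fits in one line: a dimension-5 operator suppressed by a heavy scale M generates a tiny Majorana mass once the Higgs field gets its vacuum expectation value v. The numbers below are my rough illustration, not Weinberg’s:

```latex
\mathcal{L}_5 \;=\; \frac{c}{M}\,(LH)(LH) \;+\; \mathrm{h.c.}
\quad\Longrightarrow\quad
m_\nu \;\sim\; \frac{v^2}{M} \;\approx\; \frac{(246\ \mathrm{GeV})^2}{10^{15}\ \mathrm{GeV}} \;\approx\; 0.06\ \mathrm{eV},
```

which is in the right ballpark for the observed neutrino mass splittings if M sits near a GUT scale.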

Weinberg sees these three hints as “strongly suggesting” that there is a fundamental GUT/Planck scale, and that’s what will explain unification. Personally though, I don’t see how three weak arguments add up to anything other than a weak argument. GUTs are now a forty-year-old idea that never explained very much to start with, their best feature being that they were testable, since they generally predicted observable proton decay (which we haven’t seen). We know nothing at all about the source of particle masses and mixing angles, or the reason for their very different scales, and there seems to be zero evidence for the mechanism Weinberg likes for getting small neutrino masses (including zero evidence that the masses are even Majorana). As for quantum gravity and the Planck scale, again, we really have no evidence at all. I just don’t think he has any significant evidence for a desert up to a Planck unification scale; this is by now a very old idea, and one that has been unfruitful in the extreme.

Weinberg ended his talk with another very old idea, that cosmology will somehow give us evidence about unification and GUT-scale physics. That also hasn’t worked out, but Weinberg quotes the BICEP2 value of r as providing yet more evidence for the GUT scale (he gives it a 50/50 chance of being correct). Again though, one more weak piece of evidence, even if it holds up (which I’d give less than 50/50 odds for at this point…), is still weak evidence.

For a much more encouraging vision talk, I recommend listening to Nati Seiberg at the recent Breakthrough Prize symposium. Seiberg’s talk was entitled What is QFT?, and to the claim that QFT is something understood, he responds “I really, really disagree”. His point of view is that we are missing some fundamental insights into the subject, that QFT likely needs to be reformulated, that there exists some better and more insightful way of thinking about it than our current conventional wisdom. In particular, there seems to be more to QFT than just picking a Lagrangian and applying standard techniques (for one thing, there are QFTs with no known Lagrangian). Seiberg takes the fact that mathematicians (whom he describes as “much smarter than most quantum field theorists”…) have not been able to come up with a satisfactory rigorous version of QFT to indicate not that this is a boring technical problem, but that we don’t have the right definition to work with.

To make things more specific, he describes recent joint work (for another version of this see here) on “Generalized Global Symmetries”, which deals with global symmetries associated to spaces of higher codimension than the usual codimension-one case of Noether symmetries and Lagrangian field theory. Evidently there’s a forthcoming paper with more details. I’m in complete agreement with him that there must be better ways of thinking about QFT, and I think these will involve some deeper insights into the role of symmetries in the subject.

**Update**: The paper Seiberg mentions is now available here.

I continue to be amazed at the comments of those who hope-beyond-hope that the BICEP2 claim of r=0.2 will somehow survive the full results from Planck. I’d gladly take those 1:1 betting odds. It really requires some serious prior confidence in “large r” being true to arrive at those odds in the face of a serious galactic dust problem.

Or maybe 50/50 is just a “pull a ratio out of the air to still seem confident/hopeful but not overly so.” Maybe someone should ask where those conjectured odds come from next time.

“Maybe someone should ask where those conjectured odds come from next time.”

50/50 means the same thing as “I haven’t got a clue”.

I don’t see any discussion of regularization/renormalization in Seiberg’s talk, in the Strings14 powerpoint of his that you link to, or in the one paper I’ve looked at, arXiv:1401.0740v2 [hep-th] (cited on the first page of the powerpoint). Is there some magic way to use his methods to construct a mathematically well-defined interacting QFT that is so obvious that renormalization doesn’t need to be discussed?

Peter Morgan,

Seiberg isn’t claiming to solve the problem of rigorously constructing interacting QFTs. My take is he’s making the claim that physicists understand less about QFTs than they think. One aspect of this is his suggestion that the reason for not having a rigorous construction of 4d interacting QFTs is that we’re missing some important understanding of the problem. But you should take his general claims about QFT as an argument about our ignorance, that we don’t know what QFT is, not as a claim that he knows what it is while others don’t. These general arguments are given just as a motivation for the very specific research program he then describes, investigating new ideas about symmetries.

Nobody,

At least in probability theory, one does not quantify the large uncertainty in a parameter r ∈ [0, ∞) by invoking a coin flip. That’s what posterior distributions and credible intervals are for. And if we want to compare nested models with an odds ratio, one still needs to specify a prior odds ratio and compute a ratio of marginal likelihoods.

So if a physicist has no clue what a reasonable estimate of the true value for a parameter is, he or she should just say so in colloquial language. On the other hand, if that person wishes to invoke a probability or an odds ratio as a meaningful probabilistic statement, I think demanding the reasoning as to how that value was reached is perfectly acceptable.

“What are the chances that the BICEP2 value of r is correct?”

“50/50”

You can’t get any more uncertain than that.

It is colloquial language. It just sounds slightly better than “I don’t know”, invoking a sense of calculation “to still seem confident/hopeful but not overly so” (as you put it).

You don’t think he calculated it do you? Was a range specified? If not, it’s meaningless, except as a big “I don’t know”.

Enough about Weinberg’s “50/50”. Listened to in context, he wasn’t saying “no idea”, but indicating a reasonable chance that BICEP2 was seeing something other than just dust, with r some sizable fraction of the 0.2 initially claimed. He would like to claim this as yet more evidence for the GUT scale. We’ll see what Planck says.

One thing I do know though: if BICEP2 is all dust, Weinberg won’t count that as evidence against a GUT scale…

Peter,

in your post you seem to make several different, but related points:

1) There is no evidence for a desert up to a high (GUT/Planck) scale.

2) There is no evidence that unification takes place at the GUT scale, and such a scale probably does not exist, because GUTs disagree with experiments.

3) There is no evidence that unification takes place at the Planck scale.

4) There is no evidence that there is a Planck scale at all.

Did I summarize your view correctly? If not, which points do you disagree with?

Flavio,

“4) There is no evidence that there is a Planck scale at all.”

I’d say this is a misnomer. The Planck scale exists by definition (Planck length, time, mass, etc.). What you probably wanted to say is something like “there is no evidence for any scale where gravity has strength comparable to the other forces”, a hypothetical scale which may or may not coincide with the Planck scale.

HTH, 🙂

Marko

One thing I’ve never quite understood: why should the fact that the coupling constants are of similar strength in a certain regime be anything more than coincidence? Particularly as, as I understand it, with the unadorned SM the constants don’t all cross over at the same point.

Ru,

If there is an underlying symmetry group that is broken at a certain scale, then the remnants start out with the same coupling at that scale.

And the crossing becomes more precise if you have TeV-scale superpartners (not that I am advocating them; just stating the standard rationale).
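The standard rationale can be made concrete with a one-loop numerical sketch. This is a rough illustration only (the beta coefficients and Z-mass inverse couplings are standard approximate values; the helper functions are mine): without superpartners, the three SM couplings meet pairwise at noticeably different scales.

```python
import math

# One-loop SM beta coefficients (hypercharge in SU(5) normalization) and
# approximate inverse couplings at the Z mass; standard textbook values.
M_Z = 91.19  # GeV
b = {"g1": 41 / 10, "g2": -19 / 6, "g3": -7.0}
alpha_inv_MZ = {"g1": 59.0, "g2": 29.6, "g3": 8.5}

def alpha_inv(name, mu):
    """alpha_i^{-1}(mu) = alpha_i^{-1}(M_Z) - (b_i / 2pi) * ln(mu / M_Z)."""
    return alpha_inv_MZ[name] - b[name] / (2 * math.pi) * math.log(mu / M_Z)

def crossing_scale(a, c):
    """Scale (in GeV) at which the one-loop trajectories of couplings a and c meet."""
    t = 2 * math.pi * (alpha_inv_MZ[a] - alpha_inv_MZ[c]) / (b[a] - b[c])
    return M_Z * math.exp(t)

mu12 = crossing_scale("g1", "g2")  # roughly 1e13 GeV
mu23 = crossing_scale("g2", "g3")  # roughly 1e17 GeV
print(f"g1/g2 cross near {mu12:.1e} GeV, g2/g3 near {mu23:.1e} GeV")
```

The pairwise crossing scales differ by several orders of magnitude, which is the non-crossing Ru refers to; adding superpartner contributions to the b coefficients is what brings the three lines much closer to a single point.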

Peter, did someone in the audience ask the obvious questions you alluded to such as how come no proton decay has been seen and so on?

Or are people so brainwashed that they cannot see the emperor is not wearing clothes?

Peter,

If one takes the ill-definedness as a starting point, as Seiberg seems to, then the question to ask, IMO, is what about the Wightman axioms and/or the Haag-Kastler axioms might reasonably and usefully be changed, not a deep dive into the more-exotic-but-apparently-just-as-ill-defined. It seems as if Seiberg is not really starting from where he thinks he is. Hey ho.

Flavio,

Change all the “There is no evidence” to “There is only very weak to weak evidence” and I’d agree with all those statements. The problem here isn’t that Weinberg has no argument or evidence, it’s that the arguments are quite tenuous and weak. Given the lack of good ideas to pursue, it’s reasonable to pursue weak arguments and see where they lead. The problem is that people have been doing this for forty years now with no success, which is a good reason to be suspicious of the arguments and well aware of why they are weak, instead of trying to push them the way Weinberg is doing, implying that they are stronger than they really are.

One problem with tenuous arguments is that it’s then easy to explain away any failure of their predictions. Weinberg claims vindication from BICEP2, but if its result disappears, that’s no problem. Another tenuous argument that Weinberg might have brought up five years ago, had he been giving the same talk then, would have been that GUTs + no proton decay are evidence for SUSY (this is how Weinberg could have answered Shantanu’s question if it had been asked), and that plus the “naturalness argument” implies LHC-scale SUSY breaking. Now that this hasn’t worked out, it isn’t considered a serious problem for the whole scenario, because the argument was always tenuous.

Peter Morgan,

We’ve known for a long time that axiom sets like Wightman or Haag/Kastler aren’t good enough (gauge symmetry, gravity, etc, etc). The problem is what is the right thing to replace them, and I think it’s highly likely new ideas are needed.

Peter,

If, at a certain energy scale, the forces (except gravity) all have the same coupling strength, does it necessarily follow that they are all “unified” under one mathematical theory, as in the electroweak theory? Is there some kind of proof of this idea? Or is it possible that they can all have the same strength, yet still exist separately as the strong and electroweak forces? Is this discussed in QFT textbooks?

Thanks!

David,

The fact that running coupling constants have the same strength at some energy in and of itself says nothing about unification. The argument is that it’s evidence for unification, because in some GUT unification schemes you break the symmetry with Higgs fields, and the way this is done, there is an energy at which this breaking occurs and at which the coupling constants need to be the same. This actually depends on a mass of details, including how you define the coupling, how the Higgs breaking takes place, whether you choose a supersymmetric theory, what the scale of SUSY breaking is, other threshold effects, etc. etc. I wrote a little bit about this question in my book “Not Even Wrong”. If you want to know exactly what is going on, you have to get into the details of how SUSY GUTs work.

This is one of the two main pieces of evidence for SUSY GUTs (the other is that the particles in a generation fit well into the spinor rep of SO(10)). Again, the problem is just that it’s a weak piece of evidence.

Peter,

thank you for your clarifications, which make your point of view very interesting and different from many others. What then is your present opinion of the following statements:

5) Unification surely occurs between LHC scales and at the latest some distance scale larger / some energy scale lower than the Planck scale.

6) Something about our present understanding of QFT has to change between LHC scales and some distance scale larger / some energy scale lower than the Planck scale.

7) Speaking about distance scales smaller than the Planck scale (or about energy scales for single particles higher than the Planck scale) makes no sense.

Do you tend to agree or disagree?

I agree that we need some new idea(s), but there are many possible weakenings either of Wightman or of Haag/Kastler. We can consider relinquishing microcausality or Lorentz invariance (both often relinquished at small scales, including for string theory and noncommutative geometry approaches, but fairly constrained), introduce non-associativity (limited C*-algebraic/Hilbert space structure, problematic for interpretation, I can’t see how it can work but maybe it’s not impossible if there’s non-associativity only at time-like separation), or introduce nonlinear structure instead of using distributions (for Haag/Kastler, no weak additivity, which is fairly often relinquished, but it’s new insofar as it’s not been concretely considered in the Wightman context, AFAIK, no definite constraints yet), or some other new modification. Do you just mean not a field theory and not an algebra of observables, which more-or-less cuts Wightman and Haag/Kastler off at their roots?

Flavio,

I don’t agree with any of those statements, because I don’t agree with “surely occurs”, “has to change”, “makes no sense”. All one can solidly say about these issues is that we don’t actually have any solid arguments or evidence, so should keep an open mind and not convince ourselves that our weak arguments imply that we must follow their logic.

Peter Morgan,

I don’t have specific alternate axioms in mind and I doubt Seiberg does either. I wish I did. If you want some vague personal speculation, a major motivation for the book I’ve been writing about QM (and some free QFT) is to see how far I can get recasting QM and QFT in terms of representation theory (for instance taking the enveloping algebra of a Lie algebra as one’s algebra of operators). It’s very clear though that this is insufficient, not capturing a lot of the mathematical structure one wants. It’s very unclear what the right general framework is to capture the structures one finds in the Standard Model. So, to me at least, better axioms are a long range goal, not something for the near future. Before worrying about axioms I think you first need to find a deeper understanding of the mathematics behind the Standard Model, and that question still holds many mysteries.

Peter,

thank you for your answer. I am intrigued that you do not agree with statements 5, 6, and 7, as I have rarely heard of people not agreeing with them. Disagreeing with them means that you are open to the existence of scales smaller than the Planck length. Maybe that is indeed the way to go, advancing against the flow of the more usual arguments. All the best for your endeavours, and please let us know about your progress!

“there must be better ways of thinking about QFT” I agree. Someone compared QFT to a Rube Goldberg machine. One promising approach is the Amplituhedron. Scattering amplitudes are volumes of a geometric figure.

What exactly is a “QFT without a Lagrangian”?

It is true that, with certain symmetries (like conformal symmetry) and boundary conditions, you can calculate some or even all correlators without any Lagrangian being necessary.

That said, if it also turns out that, whatever Lagrangian you try at the classical level with this symmetry, the symmetry is broken by quantum corrections (CFT again!), it is not clear whether you are talking about a “theory without a Lagrangian” or about “a symmetry impossible to realize in a quantum theory”.

I guess one has to wait for the paper, but some of the discussion around QFT seems a little bit vacuous.

Minimal action,

The 6d N=(2,0) superconformal theory that is a hot topic these days is often given as an example of a theory with no Lagrangian.

Minimal action,

Any “isolated” quantum field theory (that is, one with no continuous parameters) cannot admit a Lagrangian description. To see why, consider the converse statement: any time you have a QFT with Lagrangian L, meaning that the correlation functions may be computed by the path integral of exp(i L), nothing stops you from considering the correlation functions computed by exp(2 i L), or more generally exp(i g L). Doing so gives you a family of correlation functions depending on the parameter g (except in the trivial case when L is free; then g can be absorbed into the path integral measure and has no relevance). The (2,0) theory, and many other QFTs, have no continuous parameters, so their correlation functions cannot possibly be computed by the path integral of any Lagrangian.
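In symbols, the observation is that a Lagrangian always comes with at least one dial:

```latex
Z_g \;=\; \int \mathcal{D}\phi \;\exp\!\Big( i\,g \int d^dx\;\mathcal{L}[\phi] \Big), \qquad g > 0,
```

and unless L is free (so that g can be absorbed into the measure), varying g sweeps out a one-parameter family of correlation functions, which an isolated theory by definition does not have.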

Physicists like theories that work. That is hard enough. Mathematical rigor comes second, and is often useless anyway. E.g. sloppy use of Dirac delta functions often works well enough. You only improve the math if the theory does not work well enough, which is why AQFT is of no interest to most physicists. That is just pragmatism. Data are needed to show that current QFT theories are not good enough; only then can progress be made. Speculating about the Planck scale without having relevant data is for retired physicists.

What is the state of understanding of a far less advanced theory, QED? It was a long time ago that I took a class on it, and it may not have been very up-to-date even then, so my question may be obsolete. But isn’t renormalization in QED a bit of a crutch to compensate for the lack of a consistent theory?

A String Theorist:

Is this correct? To me, a quantum theory (not necessarily a field theory) is fully defined by its set of all correlation functions, but couldn’t there be a Lagrangian formulation anyway? E.g., the Ising model, i.e. the CFT with c=1/2, is an isolated theory, since the closest theories have c=0 or c=7/10. Nevertheless, it has a lattice formulation with a Hamiltonian (the real Ising model that Ising thought about), and can also be reached by the epsilon expansion from phi^4 theory in 4D.

Thomas,

The Ising model is isolated only as a CFT, not as a QFT. You can perturb the theory by adding an operator to the Lagrangian. It will no longer be a CFT, but it’s a fine QFT. And voilà, the coefficient of the operator you’ve added is the continuous parameter.

More on T. Larsson’s comment: if you know the topology and all correlation functions, you should be able to reconstruct ln Z, and hence the Lagrangian.

The issue is, not all symmetries “imposed” are stable against quantum corrections.

Naively I would say the same applies to the remark by String Theorist about “isolated” theories: if any Lagrangian changes the parameters continuously, why talk about a “theory without a Lagrangian” rather than just an inconsistent theory?

String Theorist,

Now I feel stupid. The Ising model at a non-critical temperature is of course related to phi^4 theory without being conformal.

You can find a strong reason why the 6d (2,0) SCFT cannot have an action in the first two paragraphs of page 10 of the following paper by Witten:

http://arxiv.org/abs/0712.0157

There are also no-go theorems saying that you can’t have a deformation of a free Abelian 2-form gauge theory to a non-Abelian one, but the assumption there is that you have a local action.

See e.g.

http://arxiv.org/abs/hep-th/9909094

“It’s very unclear what the right general framework is to capture the structures one finds in the Standard Model.”

Doesn’t Connes’ noncommutative geometry capture the structures satisfactorily?

@Martin

“Physicists like theories that work. That is hard enough. Mathematical rigor comes second, and is often useless anyway.”

I agree that rigor comes second, but calling it useless is an inflammatory statement in the class of “I don’t know what’s going on but I can fib my way through, and I don’t really care whether any of this makes sense.”

Working on a mathematical object that is shown to imply ⊥ (a contradiction) after a decade or so, in a footnote that no one reads? I’m against that. That is not a theory that “works” in any sense, shape or form.

GBrown,

No. Noncommutative geometry ideas have been used to give a different formulation of QFT, but I don’t think the results at this point convincingly explain any of the things we don’t understand about the standard model.

See a later comment for more about this.

@Yatima: OK, I withdraw “useless”. Sorry.

I meant: if a theory is approximate (effective) then mathematical rigor does not make it any better. See e.g. algebraic QFT – did this yield anything of use for actual particle physics? I have nothing against math, honest.

Martin,

Seiberg’s point was not that rigor matters, but that the failure of mathematicians to develop rigorous versions is a hint that something is missing, something that could lead to important new insights if you found it.

One thing you learn when doing mathematics is that sometimes when you can’t find a proof of a statement that you think you understand well and has to be true, it’s not because you’re not clever enough to find the proof, it’s because you were misunderstanding the situation, and the statement was actually not true.

Yes, Peter, I see that.

On the other hand, mathematical rigor does not imply physical correctness. I think that in some areas of theoretical physics (not in all!) math has overtaken physics, which is a sign of a crisis, at least from the physics point of view.

@Peter: I seem to have a different experience (being a mathematician myself), at least when dealing with structures about which a fair amount is already known. When in such cases I encounter a situation of not being able to find a proof, I usually have to dig more deeply into those structures. One then uncovers more precise properties of the studied structures, usually along with the sought proof (and with the reason why the proof could not have been found without taking those more precise properties into account).

@Martin: Physics is, for sure, ultimately ruled by experiments. Still (if mathematics does exist underneath, as I strongly expect), you want some theory that either directly describes the physical stuff, or at least approximates it. And you want to understand that math too, so as not to get lost in translation (as apparently happened to those overtaken by the parts of mathematics that are irrelevant to physics).

To expand a bit on my perhaps overly dismissive comment about NCG approaches to QFT and the Standard Model, a better way to describe my impression of the subject is that I’m intrigued, but not convinced. I should make clear that this is an “impression”, based upon periodic attempts to learn a bit about the subject, which have not at all gone far enough to really understand what is going on. Such efforts have taken me just far enough to see what look like some interesting ideas, while not seeing something that, to me, convincingly explains what seem to me the big mysteries of unification.

Among the many things I haven’t had a chance to look at, it has been pointed out to me that there are new papers out by Connes and collaborators, see

http://arxiv.org/abs/1409.2471

and

http://arxiv.org/abs/1411.0977

as well as a blog posting about this by Connes here

http://noncommutativegeometry.blogspot.com/2014/11/particles-in-quantum-gravity.html

which anyone interested in the subject should take a look at.

I should also point out that while I’m not convinced by this, I’m quite convinced that standard GUT approaches don’t work and better alternatives are few and far between, so this is one approach that deserves to be taken seriously.

Dear Peter,

You might look at a recent reformulation of the NCG formulation of the standard model by Boyle and Farnsworth: http://arxiv.org/abs/1408.5367.

Thanks,

Lee

You say

“zero evidence that the [neutrino] masses are even Majorana.”

There’s also zero evidence that the masses are Dirac. The only real evidence we have that the neutrino masses are Dirac is that the other fermions we’ve seen are Dirac. And (since I don’t think you can count the different generations as separate pieces of evidence) we’ve seen exactly three elementary Dirac fermions: the electron, the up quark, and the down quark. I don’t think this is strong evidence that the neutrino isn’t Majorana.

“Noncommutative geometry ideas have been used to give a different formulation of QFT, but I don’t think the results at this point convincingly explain any of the things we don’t understand about the standard model.”

I would have rather permuted “QFT” and “Standard Model” in the above sentence:

NCG (in Connes’ sense) does not explain much about QFT, in the sense that once one has obtained the Lagrangian of the Standard Model, to do physics one uses the usual tools of QFT (in particular the renormalization group equations, coming from the usual approach to the SM). On the other hand, it gives explanations of several aspects of the SM that are otherwise rather ad hoc:

– the Lagrangian, with the correct representations of particles;

– the need for a scalar field (which comes out on the same footing as the other gauge fields, as the “noncommutative” part of the connection 1-form);

- a constraint on the number of particles per generation (which should be 2 × (2n)² for integer n; the SM is n = 2, that is (6 colored quarks + electron + neutrino) × 2 (left, right) × 2 (antiparticles) = 32);

– a relation between masses.
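As a quick sanity check on the counting in the third item (a hypothetical sketch; the variable names are my own):

```python
# NCG constraint: particles per generation should number 2 * (2n)^2 for integer n.
def ncg_count(n):
    return 2 * (2 * n) ** 2

# Standard Model generation content, counted as in the comment above:
quarks = 2 * 3                                # up and down, in 3 colors each
leptons = 2                                   # electron and neutrino
per_generation = (quarks + leptons) * 2 * 2   # x 2 chiralities x 2 (antiparticles)

assert ncg_count(2) == per_generation == 32
print(per_generation)  # 32
```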

The gauge group itself has been “almost” derived from geometrical principles, up to an ad hoc symplectic hypothesis. And it seems that in the latest work of Connes et al. that you mention, this ad hoc hypothesis can be removed.

One may think this does not explain anything, and that it is just a way to rewrite in complicated mathematical language things one already knows. But one could say the same about GR (Weinberg does it in his book, at least in an older version of it): “all the tensors with their funny way of transforming under Lorentz transformations can be given a geometrical interpretation, like curvature, but this is not relevant for physics”. [The quote is not exactly Weinberg’s, but there is a chapter in his book which says something like that.]

In my opinion all the interest of GR lies precisely in its geometrical interpretation, which makes it much more than just a collection of rules for tensor transformations. If NCG manages to turn the SM into a geometrical theory, I would say that is quite relevant for physics.

Of course, there are other aspects of the SM one should explain which are not addressed in NCG, or not in a sufficiently convincing way. Peter, could you be more explicit on that point: in your opinion, what are the most important things we do not understand about the standard model?

martibal,

I wrote a chapter in my book (chapter 8) specifically about this, don’t want to reproduce it here, in any case it’s pretty obvious to anyone who spends much time studying QFT and the Standard Model what the list is of things it doesn’t explain.

As for why I don’t find the list you give of answers provided by NCG convincing, part of the problem is that these are post-dictions, so that raises the bar for how compelling they need to be.

“… I think you first need to find a deeper understanding of the mathematics behind the Standard Model… ”

“As for why I donâ€™t find the list you give of answers provided by NCG convincing, part of the problem is that these are post-dictions … ”

Well, if you find a deep mathematical understanding of the standard model it will by default be in the form of post-dictions.

“Well, if you find a deep mathematical understanding of the standard model it will by default be in the form of post-dictions.”

That’s like vacuously arguing that Schwinger’s prediction of the electron’s magnetic moment was a post-diction, because it had already been measured. A better mathematical understanding of the SM might allow a precise prediction of neutrino masses, the strong coupling, the Weinberg mixing angle, or the CKM mixing matrix. Thus it could be a prediction fully testable by future improvements in SM parameter measurements, not just an ad hoc post-diction.

Jesper,

Sure. Nothing wrong with a good postdiction, they can be very convincing. But the bar is higher for such claims.