There’s a well-known list of high-profile problems in fundamental theoretical physics that have gotten most of the attention of the field during the past few decades (examples would be the problems of quantizing gravity, solving QCD, explaining dark energy, finding a model of dark matter, breaking supersymmetry and connecting it to experiment, etc.). Progress on these problems has been rather minimal, and in reaction one recent trend has organizations such as FQXI promoting research into questions that are much more “philosophical” (for instance, they are now asking for grant proposals to study “The Nature of Time”). In this posting I’d like to discuss a different class of problems, ones which I believe haven’t gotten anywhere near the attention they deserve, for an interesting reason.

The three problems discussed below share the characteristic of appearing to be purely technical. The argument against paying much attention to them is that, in each case, even if one were to find a satisfactory solution, it might not be very interesting: all one might discover is that the conventional wisdom about these problems, that they're just "technical" and thus not of much significance, is correct. The argument for paying more attention is that each technical problem may be an indication that we're doing something wrong, that there is something of significance about the Standard Model that we haven't yet understood. Achieving this understanding may lead us to the insight needed to successfully get beyond the Standard Model. At the moment all eyes are on the LHC, with the hope that experiment will lead to new insight. Whether this will work out remains to be seen, but in any case it looks like it's going to take a few years. Perhaps theorists with nothing better to do but wait will want to consider thinking about these problems.

**Non-Perturbative BRST**

The BRST method used to deal with the gauge symmetry of perturbative Yang-Mills theory does not appear to generalize to the full non-perturbative theory, for a rather fundamental reason. This was first pointed out by Neuberger back in 1986 (Phys. Lett. B 183 (1987) 337-340), who argued that, non-perturbatively, the phenomenon of Gribov copies implies that gauge-fixed expectation values of gauge-invariant observables will vanish. I've written elsewhere about a different approach to BRST that I'm working on (see here), which is still at a stage where I only fully understand what is going on in some toy quantum-mechanical models. My own point of view is that there are still a lot of very non-trivial things to be understood about gauge symmetry in QFT, and that homological techniques of the BRST sort for dealing with it are of deep significance. Others will disagree, arguing that gauge symmetry is just an unphysical redundancy in our description of nature, and that how one treats it is a technical problem of no physical significance.
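Schematically, and in my paraphrase rather than a quotation from the paper, the argument runs as follows: a lattice BRST gauge-fixing insertion, evaluated on a gauge orbit, becomes a sum over all Gribov copies weighted by the sign of the Faddeev-Popov determinant, and these signs cancel in pairs:

```latex
% Signed sum over Gribov copies on a gauge orbit (schematic):
Z_{\mathrm{GF}} \;=\; \sum_{\text{copies } i}
   \operatorname{sign}\,\det M_{\mathrm{FP}}(U_i) \;=\; 0 ,
% so gauge-fixed expectation values of gauge-invariant observables
% take the indeterminate form
\langle \mathcal{O} \rangle \;=\; \frac{0}{0} \, .
```

This is why the problem is often referred to as Neuberger's 0/0 problem.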

One reaction to this question is to just give up on BRST outside of perturbation theory as something unnecessary. In lattice gauge theory computations, one doesn’t fix a gauge or need to invoke BRST. However, one can only get away with this in vector-like theories, not chiral gauge theories like the Standard Model. Non-perturbative chiral gauge theories have their own problems…
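To illustrate how lattice computations get away without gauge fixing, here is a minimal sketch (my own toy example, not taken from any of the papers under discussion) of a Metropolis simulation of two-dimensional compact U(1) lattice gauge theory. The observable, the average plaquette, is gauge invariant, and no gauge condition, ghost term, or BRST machinery appears anywhere:

```python
import numpy as np

# Toy sketch: 2d compact U(1) gauge theory with Wilson action
# S = -beta * sum_P cos(theta_P), updated link by link via Metropolis.
rng = np.random.default_rng(0)
L, beta = 8, 1.0
theta = np.zeros((2, L, L))  # link angles theta[mu, x, y]; mu = 0: x-dir, 1: y-dir

def plaquette(x, y):
    # oriented sum of link angles around the plaquette based at (x, y)
    return (theta[0, x, y] + theta[1, (x + 1) % L, y]
            - theta[0, x, (y + 1) % L] - theta[1, x, y])

def local_cos(mu, x, y):
    # sum of cos(theta_P) over the two plaquettes containing link (mu, x, y)
    if mu == 0:
        return np.cos(plaquette(x, y)) + np.cos(plaquette(x, (y - 1) % L))
    return np.cos(plaquette(x, y)) + np.cos(plaquette((x - 1) % L, y))

def sweep():
    for mu in range(2):
        for x in range(L):
            for y in range(L):
                old, s_old = theta[mu, x, y], local_cos(mu, x, y)
                theta[mu, x, y] = old + rng.uniform(-1.0, 1.0)
                # Metropolis accept/reject with probability min(1, exp(-Delta S))
                if rng.random() >= np.exp(beta * (local_cos(mu, x, y) - s_old)):
                    theta[mu, x, y] = old  # reject

for _ in range(200):  # thermalization
    sweep()
data = []
for _ in range(300):  # measurements
    sweep()
    data.append(np.mean([np.cos(plaquette(x, y))
                         for x in range(L) for y in range(L)]))
avg_plaq = float(np.mean(data))
print(avg_plaq)  # exact infinite-volume answer is I_1(beta)/I_0(beta), about 0.446 here
```

In two dimensions the average plaquette is known exactly (a ratio of Bessel functions), so the sketch can be checked; the same gauge-fixing-free strategy is what large-scale QCD simulations use, which is exactly the point made above.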

**Non-perturbative Chiral Gauge Theory**

Since the early days of lattice gauge theory, it has been apparent that chiral symmetry is problematic on the lattice. One way of seeing this is that naively there should be no chiral anomaly on the lattice. The problem was made precise by a well-known argument of Nielsen and Ninomiya. More recently, it has become clear that one can consistently introduce chiral symmetry on the lattice, at the cost of using fermion fields that take values in an infinite-dimensional space. One such construction is known as "overlap fermions", which have the crucial property of satisfying relations first written down by Ginsparg and Wilson. This kind of construction solves the problem of dealing with the global chiral symmetry in theories like QCD, but it still leaves unsolved the problem of how to deal with a gauged chiral symmetry, such as the gauge symmetry of the Standard Model.
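For concreteness, the Ginsparg-Wilson relation, γ5 D + D γ5 = a D γ5 D, can be checked numerically for the free overlap operator at a single momentum. This is only an illustrative sketch: the chiral basis of Euclidean gamma matrices, the lattice spacing a = 1, and the Wilson mass parameter m = -1 are my own choices, not canonical ones.

```python
import numpy as np

# Pauli matrices and a chiral basis of Euclidean gamma matrices (all Hermitian)
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]], dtype=complex),
         np.array([[1, 0], [0, -1]], dtype=complex)]
zero2, id2 = np.zeros((2, 2), dtype=complex), np.eye(2, dtype=complex)
gamma = [np.block([[zero2, -1j * sigma[k]], [1j * sigma[k], zero2]])
         for k in range(3)]
gamma.append(np.block([[zero2, id2], [id2, zero2]]))
gamma5 = gamma[0] @ gamma[1] @ gamma[2] @ gamma[3]   # diag(1, 1, -1, -1)

def wilson(p, r=1.0, m=-1.0):
    # Free Wilson-Dirac operator in momentum space; m must lie in the
    # window -2r < m < 0 for the overlap construction to work.
    D = (m + r * sum(1.0 - np.cos(pm) for pm in p)) * np.eye(4, dtype=complex)
    for mu in range(4):
        D = D + 1j * np.sin(p[mu]) * gamma[mu]
    return D

def overlap(p):
    # Neuberger's overlap operator D = 1 + gamma5 * sign(gamma5 D_W),
    # with the matrix sign function computed by eigendecomposition.
    H = gamma5 @ wilson(p)                 # Hermitian kernel
    w, V = np.linalg.eigh(H)
    sign_H = V @ np.diag(np.sign(w)) @ V.conj().T
    return np.eye(4, dtype=complex) + gamma5 @ sign_H

p = np.array([0.3, 1.1, -0.7, 0.5])        # an arbitrary lattice momentum
D = overlap(p)
lhs = gamma5 @ D + D @ gamma5              # left side of Ginsparg-Wilson
rhs = D @ gamma5 @ D                       # right side (lattice spacing a = 1)
print(np.linalg.norm(lhs - rhs))           # vanishes up to rounding error
```

The relation holds identically for any operator of the form 1 + γ5 ε with ε Hermitian and ε² = 1, which is what the sign function guarantees; that algebraic fact is the content of the overlap construction.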

Poppitz and Shang have recently written a nice review of the problem, entitled "Chiral Lattice Gauge Theories Via Mirror-Fermion Decoupling: A Mission (im)Possible?". They comment on the significance of the problem as follows:

Apart from interest in physics of the Standard Model — which, at low energies, is a weakly-coupled spontaneously broken chiral gauge theory that does not obviously call for a lattice study — interest in strong chiral gauge dynamics has both intensified and abated during the past few decades. From the overview in the next Section, it should be clear that while there exist potential applications of strong chiral gauge dynamics to particle physics, at the moment it appears difficult to identify “the” chiral theory most relevant to particle physics model-building (apart from the weakly-coupled Standard Model, of course). Thus, the problem of a lattice formulation of chiral gauge theories is currently largely of theoretical interest. This may or may not change after the LHC data is understood. Regardless, we find the problem sufficiently intriguing to devote some effort to its study.

In a footnote they compare two points of view on this: Creutz who argues that the question is important since otherwise we don’t know if the Standard Model makes sense, and Kaplan who points out that if there is some complicated and un-enlightening solution to the problem, it won’t be worth the effort to implement.

You can read more about the problem in the references given in the Poppitz-Shang article.

**Euclideanized Fermions**

Another peculiarity of chiral theories arises when one tries to understand how they behave under Wick rotation. Non-perturbative QFT calculations are well-defined not in Minkowski space but in Euclidean space, with physical observables recovered by analytic continuation. But the behavior of spinors in Minkowski and Euclidean space is quite different, leading to a very confusing situation. Despite several attempts over the years to sort this out for myself, I remain confused, and can't help suspecting that there is more to this than a purely technical problem. One natural mathematical setting for thinking about this is the twistor formalism, where complexified, compactified Minkowski space is the Grassmannian of complex 2-planes in complex 4-space. The problem, though, is that thinking this way requires taking holomorphic quantities as basic variables, and how this fits into the standard QFT formalism is unclear. Perhaps the current vogue for using twistor methods to study gauge-theory amplitudes will shed some light on this.
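To fix notation, the uncontroversial part of the story looks like this (schematic):

```latex
% Wick rotation t \to -i\tau turns oscillatory Minkowski weights into
% damped Euclidean ones:
\langle \mathcal{O} \rangle
  = \frac{1}{Z}\int \mathcal{D}\phi\; \mathcal{O}\, e^{\,iS_M[\phi]}
  \;\longrightarrow\;
  \frac{1}{Z}\int \mathcal{D}\phi\; \mathcal{O}\, e^{-S_E[\phi]} \, .
% The trouble starts with spinors: the two Clifford algebras
\{\gamma^\mu, \gamma^\nu\} = 2\,\eta^{\mu\nu}
\qquad \text{vs.} \qquad
\{\gamma_\mu, \gamma_\nu\} = 2\,\delta_{\mu\nu}
% have representations with different reality properties.
```

In Minkowski signature the two Weyl representations of Spin(3,1) are complex conjugates of each other, while for Spin(4) = SU(2) × SU(2) they are independent, so conditions relating a spinor to its conjugate (Majorana conditions, chirality constraints) do not continue straightforwardly; this is the source of much of the confusion described above.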

On the general problem of Wick rotation, some of the deepest thinking I've seen is that of Graeme Segal, who deals with the issue in the 2d context in his famous manuscript "The Definition of Conformal Field Theory". I saw recently that he has given some talks in Europe on "Wick Rotation in Quantum Field Theory", which makes me quite curious about what he had to say on the topic.

For some indication of why this confusion over Minkowski versus Euclidean spinors remains and doesn't get cleared up, you can take a look at what happened recently when Jacques Distler raised it in related form on his blog here (he was asking about it in the context of the pure spinor formulation of the superstring). I'm not convinced by his claim that the thing to do is to go to Euclidean space-time variables while keeping Minkowski spinors. Neither is Lubos, and he and Jacques manage to have an argument about this that sheds more heat than light. It ends up with Lubos accusing Jacques of behaving like Peter Woit, which led to him being banned from commenting on the blog. While all this is, as Jacques describes it, "teh funny", it would be interesting to see a serious discussion of the issue. Since in some sense it is all about how one treats time, perhaps one could get FQXI funding to study this subject.

**Update**: Lubos Motl has immediately come up with a long posting explaining why these are all non-problems, of concern only to those like myself who are "hopeless students", "confused by many rudimentary technicalities that prevented him from thinking about serious, genuinely physical topics." If I would just understand AdS/CFT and Matrix theory I would realize that gauge symmetry is an irrelevance. Few in the theoretical physics community are as far gone as Lubos, but unfortunately he's not the only one who thinks that concern with these "technicalities" is evidence that someone just doesn't understand the basics of the subject.


Sorry to have prolonged this discussion Peter.

“Witten’s argument for the eta-prime mass.”

This is probably a stupid question, but wasn’t that ‘t Hooft’s argument?

“Sorry to have prolonged this discussion Peter.”

I’m sure I’m not the only one who found it interesting.

Rondeau,

‘t Hooft showed that the argument that the eta-prime should have low mass is ruined by the anomaly. Witten actually gave a formula for the mass, arguing from the 1/N approximation. I guess Veneziano did this independently; it is known as the Witten-Veneziano mass formula.

Thomas,

That is right. The non-existence of a strong CP-violating term is confirmed by the non-observation of a neutron electric dipole moment. Axions are also non-existent.

Anon,

The Kugo-Ojima formalism provides us with a manifestly covariant formulation of the Standard Theory. It is important to recognize that local gauge transformation is a classical concept, because it contains arbitrary functions, i.e., functions chosen by human beings. Quantum field theory is the fundamental theory; it must be independent of human will. The quantum version of local gauge invariance is nothing but BRS invariance. Local gauge invariance is recovered in the sense of gauge independence of the physical S-matrix.
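[For readers unfamiliar with the terminology: in standard notation (sign and normalization conventions vary by author), the BRS transformation being referred to acts on the gauge field, the Faddeev-Popov ghost c, the antighost c-bar, and the Nakanishi-Lautrup auxiliary field B as]

```latex
% BRS transformations (component form, structure constants f^{abc}):
\delta_B A_\mu^a = (D_\mu c)^a
               = \partial_\mu c^a + g f^{abc} A_\mu^b c^c, \qquad
\delta_B c^a = -\tfrac{g}{2}\, f^{abc}\, c^b c^c, \qquad
\delta_B \bar c^{\,a} = i B^a, \qquad
\delta_B B^a = 0 .
% The generator Q_B is nilpotent, Q_B^2 = 0, and physical states satisfy
Q_B\,|\mathrm{phys}\rangle = 0 ,
```

with unitarity of the physical S-matrix following from this subsidiary condition in the Kugo-Ojima framework.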

The AB effect establishes experimentally that the gauge potential is the fundamental quantity. Monopoles and theta vacua are merely hypothetical things.

Tunneling exists in quantum mechanics. But this does not imply that the instanton is physically sensible in quantum field theory. The reasoning used here is no more than analogy. Can you prove the conservation of total probability in a process in which a transition via instanton really occurs?

"Natural" is merely your feeling; you must show logically why it is so. The path integral is no more than the generating function of Green functions. Anomalies, the renormalization group, lattice theory, etc. can also be considered in the framework of the operator formalism, because Green functions can be defined in it.

What I am emphasizing is that it is quite dangerous to make physical considerations based on the path integral beyond the extent justifiable by the operator formalism.

Peter Woit,

Thanks for buying a copy of my book; please read it. The Kugo-Ojima formalism is nothing but a natural extension of the manifestly covariant formalism of abelian gauge theory (QED and Higgs model) to the non-abelian gauge theory. Nothing is conceptually difficult.

I think that the lattice approach is an important tool for QCD. What I stress is that lattice gauge theory cannot be a fundamental theory, because angular momentum conservation cannot be proved within its own framework (remember the importance of partial-wave analysis).

I don't know much about the problem of the eta-prime's mass, but it is a matter crucially dependent on the approximation method. What I assert concerns the fundamental principles of the theory. Field equations and canonical commutation relations are independent of a constant term in the action. This is a completely non-perturbative statement.

Path integral and operator formalism have some delicate differences. If you are interested in this problem, please see the following paper:

M. Abe and N. Nakanishi, Perturbative or Path-Integral Approach versus Operator-Formalism Approach, Prog. Theor. Phys. 102 (1999), 1187.

Peter Orland,

There is no strong CP violation; there are no axions.

To repeat, the Kugo-Ojima formalism is non-perturbative, manifestly covariant, unitary, and totally irrelevant to the Gribov problem.

Tunneling is possible in quantum mechanics, because the representation is unique. In quantum field theory, there are infinitely many inequivalent representations. It is impossible to tunnel between different representations.

I never said there were axions, and I did not comment on strong CP violation. But it is possible to add a theta-term to the QCD action.

As I said, if the Kugo-Ojima formulation has the properties you say, then it is wrong (but I don’t believe your statement that it does have such properties).

Tunneling is present in field theory (and not just gauge theories), provided the WKB factor is non-zero.

thanks for your answer

Peter Orland,

If you claim that my assertion is wrong, you must point out where and why it is so. It is non-scientific to claim "wrong" based merely on your feeling.

Nakanishi,

It is difficult to prove things in 4d lattice gauge theory, but you can do computer calculations and check whether what you get satisfies properties you expect. In particular, as you take the coupling to zero, you can check whether the breaking of rotational symmetry introduced by the lattice disappears as expected. My understanding is that this is what happens.

The reference you give to a discussion relating the covariant operator formalism and path integral formalism is for a 2d theory. The divergence structure of 2d QFT is quite special, and, especially in the context of conformal field theories, there is quite a bit known about the operator formalism and its relation to the path integral (this is a beautiful subject mathematically). However, the 4d divergence structure is quite different and much worse. I don’t see evidence that you are able to explicitly construct operator algebras and representations with the properties that you expect to have. The advantage of the lattice version of the path integral is that it is an explicit construction, and you can check that it has the properties you want, analytically in a few cases, numerically in others.

Actually I have no obligation to point that out. It is you who is making claims that the conventional view on gauge theories is wrong, so it is you who has an obligation to prove it.

The very loaded term "non-scientific", in its simplest form, means making assertions without evidence. I have evidence that what I am saying is true. You claim to have other evidence. Only doing the mathematics can say who is right.

My research is in field theory, and I have some confidence in my assertions. I do my best to be careful in what I do. I make mistakes, but after a lot of checking and reworking, I try to fix them. Though I am not perfect, I work hard to understand the issues. If I am sure something is true after reading well-written and convincing papers and my own hard calculations, why should I believe someone who says it is all wrong?

Perhaps I can best express my reaction to your assertions through quoting the following:

http://xkcd.com/675/

Peter Woit,

I am not concerned with expectations or hopes based on a finite number of numerical calculations. What I am saying is that angular momentum conservation must hold exactly if the theory is fundamental.

Thanks for the quick reading of my paper. The subject of the paper is not the divergence problem, but a concrete clarification of the pathological aspects of the perturbative or path-integral approach in 2d quantum gravity.

The original topic which you proposed is non-perturbative BRS. I don't understand why you adhere to the lattice approach so much. There is no BRS-formulated lattice theory!

Peter Orland,

I have never said that your work is wrong, because I don’t know your work at all. Probably, it is better to stop the discussion with you.

I was not talking about my own publications. But OK, you are right, this has gone on too long.

This topic has been one of the best in the history of this blog. I wish there were more sane discussion on the net.

-drl

Dear Nakanishi san and Peter Orland,

Please continue the discussion. I personally found it instructive as I didn’t realize there were any controversies in these matters. There must be a way to talk about these things without getting all worked up.


mathphys,

If you are interested in my point-of-view, please read the following paper:

N. Nakanishi, Method for solving quantum field theory in the Heisenberg picture, Prog. Theor. Phys. 111 (2004), 301-337 (Invited paper).

As for Orland’s opinion, please ask him.

D.R. and mathphys

I will not succeed in doing these topics justice, but I will try to give a summary below.

The first person to compute the effects of tunneling in gauge theories was Polyakov, who studied the problem for QED on a Euclidean lattice in 1975 (Physics Letters B), and in the O(3) Yang-Mills theory broken to a U(1) symmetry in 1977 (Nuclear Physics B). The lattice case was studied in the Hamiltonian formulation in

S.D. Drell, H.R. Quinn, B. Svetitsky and M. Weinstein, "QED on a Lattice: A Hamiltonian Variational Approach to the Physics of the Weak Coupling Region," SLAC-PUB-2122 (1978), published in Phys. Rev. D 19 (1979) 619.

So the path integral is absolutely and most definitely not essential.

Attempts to do full-fledged calculations of tunneling between n-vacua in four-dimensional Yang-Mills (instantons) are highly suggestive, but inconclusive, mainly because classical conformal invariance leads to infrared divergences. Some progress was made in the 2-dimensional O(3) sigma model, but even there the result is problematic (I am actually working on this problem some of the time).
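[Concretely, the weight of such a calculation and the source of the infrared problem can be written schematically (standard semiclassical formulas):]

```latex
% A single instanton of size \rho carries the classical action
S_{\mathrm{inst}} = \frac{8\pi^2}{g^2},
\qquad \text{amplitude} \;\sim\; e^{-8\pi^2/g^2},
% independent of \rho, by classical conformal invariance.  The
% semiclassical calculation then requires an integral over sizes,
\int_0^\infty \frac{d\rho}{\rho^{5}}\,(\rho\Lambda)^{b_0}\cdots,
% which diverges at large \rho, where the running coupling grows strong.
```

This large-ρ divergence is the infrared problem referred to above.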

Gribov worked in the continuum, but the Gribov problem is present in the Hamiltonian as well as the Euclidean lattice formulations. The gauge-fixing problem can be thought of as a spin glass. If you minimize Tr A^2 with respect to gauge transformations, the result is Lorentz gauge (on a 4D lattice) or Coulomb gauge (on a 3D lattice). This is the problem of finding the minimum (or classical-ground-state) energy of a spin glass with a non-Abelian spin space; the gauge field plays the role of the spin-glass frustration. Such problems are well known not to have a unique minimum.
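[In formulas, the minimization just described looks like this (schematic continuum version; on the lattice the gauge field is replaced by link variables):]

```latex
% For a fixed configuration A, minimize over gauge transformations g:
F_A[g] = \int d^4x \sum_{\mu,a} \big(A^{g,a}_\mu(x)\big)^2,
\qquad
A^g_\mu = g^{-1} A_\mu\, g + g^{-1}\partial_\mu g .
% Stationary points satisfy the gauge condition
\partial_\mu A^g_\mu = 0 ,
% but F_A[g] generically has many local minima -- the Gribov copies --
% just as a spin glass has many near-degenerate ground states.
```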

There is a lot more to say, but I hope this will give you some idea of what the problems are about.

Hi Peter W,

Thanks for the reply on Peter H’s paper.

What specifically were you thinking of, halfway down the Older Comments page, where you said, “The lesson of many years of work on understanding the problem of chirality on the lattice is that it’s not a lattice artifact, but something inherent in the use of a cut-off, some version of which is needed to make sense of the QFT”? Is there some other non-lattice kind of short-distance cut-off that causes fermion doubling, quadrupling, etc., in simple treatments?

Best,

Chris

Peter O,

Thank you for recalling these classic papers. Great works.

Chris,

Besides the lattice, I don’t know of any simple ways to write down a cut-off gauge theory non-perturbatively.

What I had in mind with that comment is that the anomaly is really a topological problem, one that can be reduced to a problem about how you count states in the spectrum of the Dirac operator (take a look at the Neuberger paper about the chiral fermion problem). Any kind of naive discretization of the Dirac operator will somehow run into trouble with the anomaly, with doubling one possibility.

Exactly how species multiply to cancel the anomaly depends on what one does. If you use something like Kogut-Susskind fermions, another way of thinking about what you are doing is that you are discretizing differential forms, not spinors; for forms there is no anomaly, and you expect to get spinors with multiplicity given by the dimension of the spinor space. It always seemed to me that what one should look for is some formulation that inherently involves the geometry of spinors, not that of forms, but I've never had a good idea along those lines…
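[The doubling itself is easy to exhibit for the free naive discretization. A small illustrative sketch that counts the zeros of the naive momentum-space Dirac operator over the Brillouin zone:]

```python
import itertools
import numpy as np

# Free naive lattice Dirac operator (lattice spacing a = 1):
#   D(p) = i * sum_mu gamma_mu sin(p_mu).
# The Clifford algebra gives D(p)^dagger D(p) = sum_mu sin^2(p_mu) * 1,
# so D(p) has zero modes exactly where every sin(p_mu) vanishes,
# i.e. at each p_mu in {0, pi}.
def dispersion(p):
    return sum(np.sin(pm) ** 2 for pm in p)

# scan a grid covering the Brillouin zone [0, 2*pi)^4
grid = np.arange(8) * (2 * np.pi / 8)
zeros = [p for p in itertools.product(grid, repeat=4)
         if dispersion(p) < 1e-12]
print(len(zeros))  # 16 zero modes: the physical pole at p = 0 plus 15 doublers
```

The 2^4 = 16 species are exactly the multiplicity Peter mentions: the dimension of the space of forms matches 2^d copies of the spinor content.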

Thanks Peter

Hi Peter,

Your book "Not Even Wrong" and your blog have made me think about the future of physics. I have read your book several times. As you point out, the Standard Model is a successful theory, but one that needs further improvement, and I think no one knows how to improve it. String theory has been suggested as the alternative to the Standard Model, with its well-known difficult problems. I gather you believe string theory has failed: you and your readers are confident that it cannot be the alternative to the SM and that it is not true science. I agree with you.

But what is the alternative to the SM? Is there any good idea for improving it? Will the insight gained from research on an improved gauge theory really make the situation better? I want to ask these questions of you and of the many visitors to this blog.

My own thought is that physics cannot go forward as long as it is confined to the concept of a "field". The field was a good concept for explaining electric and magnetic phenomena, and Dirac and the high-energy physicists developed it into an abstract concept for describing particle creation and annihilation in the subatomic world. In fact, several problems in the SM are tied to the field concept: the infinities that appear in calculations are not unrelated to the use of fields, and lattice gauge theory was itself constructed to escape those infinities. Is there any concept, as an alternative to the field, for describing high-energy physics? I feel that theorists must begin to think about what such an alternative might be. What do you think of this?

Younghun park,

The concept of quantum fields is very successful, so it is too hasty to give it up. Learn from history! Newtonian mechanics was very successful, but to formulate quantum mechanics it was essential to rewrite it into analytical dynamics, which was achieved by treating spatial coordinates and conjugate momenta on an equal footing. Likewise, to formulate general relativity it was essential to rewrite special relativity into the four-dimensional tensor analysis, which was achieved by treating space and time on an equal footing.

Now, quantum field theory unified the concept of particles and that of forces. Next, we should proceed to unifying the concept of fields and that of spacetime. That is, I suggest that we should rewrite quantum field theory by treating quantum fields and spacetime on an equal footing, before a possible breakthrough is found.

The symmetry between quantum fields and spacetime was found to hold in the BRS-formulated manifestly covariant formalism of quantum gravity in the de Donder gauge. There is a 16-dimensional supersymmetry IOSp(8;8) on the basis of the supercoordinates consisting of the spacetime coordinates, the gravitational B field, the FP ghost and the FP antighost, though, of course, most of the generators of this supersymmetry are spontaneously broken, and the graviton is one of the Nambu-Goldstone bosons.

Younghun park,

This discussion really has nothing to do with the topic of this posting, and I don’t want to run a general physics discussion board here.

Hi,

Firstly, thanks to Peter Woit for writing this post. All of these problems are seriously under-investigated by the field. Sometimes making serious research public does help, although many times it doesn't! Anyway, I am not writing this to discuss those sociological issues. I just wanted to point out some efforts to obtain a BRST-formulated lattice theory, mainly by evading Neuberger's 0/0 problem. First there was Testa's paper http://arxiv.org/pdf/hep-lat/9803025, which tried to restore BRST symmetry on the lattice. Meanwhile, Schaden and Baulieu showed that gauge fixing is a Witten-type TQFT: http://arxiv.org/pdf/hep-th/9601039. Schaden then went on to use this important interpretation to restore BRST symmetry on the lattice using a coset-space construction: http://arxiv.org/pdf/hep-lat/9805020.

There were some other efforts too, e.g. http://arxiv.org/pdf/hep-lat/9709154, but Neuberger gave a counter-argument: http://arxiv.org/PS_cache/hep-lat/pdf/9801/9801029v2.pdf . I don't know the current status of this approach.

There was also an attempt to evade the Neuberger problem using the Curci-Ferrari gauge, in http://arxiv.org/pdf/0807.0480.

Very recently, the Neuberger problem seems to have been evaded using a redefinition of the gauge fixing on the lattice, in http://arxiv.org/pdf/0710.2410 and http://arxiv.org/pdf/0812.2992 (someone has already mentioned these papers above). The current research is to do simulations with this BRST-formulated lattice theory.

The Kugo-Ojima theory is indeed appreciated in many of these papers.

It should be noted that the relatively low number of citations to Neuberger's original papers should not be taken as a sign of low interest in this problem, but rather of how difficult a problem it is to address.

If you are interested in debates on issues relevant to these topics, there is a blog, http://marcofrasca.wordpress.com/, where the author sometimes discusses them, though the blog seems to have been inactive for the past few months. Note that I am not the author of that blog, nor actively involved in this particular research, nor here to defend/offend any of the above approaches. But I am watching closely and eager to see progress.

Hope this helps.

tbs

Dear tbs,

Thank you very much for citing my blog as a reference for people working in the field. The blog is inactive for the simple reason that Websense has been installed at my workplace, and a lot of sites that could be helpful for this activity are no longer accessible. E.g. I can read this blog but not Backreaction. Last but not least, I cannot see pictures, photos and formulas, which makes managing the blog really difficult and restricts it to a small window of time in the day that is generally not available for other reasons. I am able to put down these lines from my home computer.

I think that this is not the right place to cite my blog, for the simple reason that Peter and I had a dispute last year about my work. A couple of formulas of mine appeared in Wikipedia, and Peter intervened to get them removed. This produced some fuss here and in the blogosphere, concluded by an intervention from Terry Tao. Curiously enough this was beneficial to my work, and now I should say that Peter helped me to make things work.

Marco

tbs,

I think that Neuberger's 0/0 problem is simply caused by an erroneous assumption. There is no reason to assume that the BRS-invariant measure is trivial with respect to the FP-ghost and antighost variables; that is, it may contain c-bar c terms.

Of course, the existence of the BRS-invariant measure is merely an assumption. In exactly solvable models, we often encounter the BRS anomaly in the path-integral approach, but not in the operator-formalism approach.