For a first slogan (see here for slogan zero) I’ve chosen:

Quantum theory is representation theory.

One aspect of what I’m referring to is explained in detail in chapter 14 of these notes. Whenever you have a classical phase space (symplectic manifold to mathematicians), functions on the phase space give an infinite dimensional Lie algebra, with Poisson bracket the Lie bracket. Dirac’s basic insight about quantization (“Poisson bracket goes to commutators”) was just that a quantum theory is supposed to be a unitary representation of this Lie algebra.
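Dirac’s rule can be checked in the simplest case with a few lines of sympy. This is my own sketch (not from the notes), using the standard Schrödinger representation Q = multiplication by x, P = −iħ d/dx; the classical bracket {q, p} = 1 goes over to [Q, P] = iħ:

```python
# A quick sympy check (my own sketch) of "Poisson bracket goes to commutator"
# in the Schrodinger representation: Q = multiplication by x, P = -i*hbar d/dx.
import sympy as sp

x, hbar = sp.symbols('x hbar', real=True)
psi = sp.Function('psi')(x)

Q = lambda g: x * g                          # quantized position
P = lambda g: -sp.I * hbar * sp.diff(g, x)   # quantized momentum

# Classically {q, p} = 1, so Dirac's rule predicts [Q, P] = i*hbar
comm = sp.expand(Q(P(psi)) - P(Q(psi)))
assert sp.simplify(comm - sp.I * hbar * psi) == 0
```

The same check applied to quadratic observables still works; the interesting question, addressed by the no-go theorem below, is what happens beyond quadratics.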

For a general symplectic manifold, how to produce such a representation is a complicated story (see the theory of “geometric quantization”). For a finite-dimensional linear phase space, the story is given in detail in the notes: it turns out that there’s only one interesting irreducible representation (Stone-von Neumann theorem), it’s determined by how you quantize linear functions, and you can’t extend it to functions beyond quadratic ones (Groenewold-van Hove no-go theorem). This is the basic story of canonical quantization.
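The Groenewold-van Hove obstruction can even be verified by direct computation. The sketch below is my own illustration (assuming the standard Weyl-symmetrized ordering): classically q²p² = {q³, p³}/9 = {q²p, qp²}/3, but the two corresponding quantizations via “bracket goes to commutator” disagree by a multiple of ħ², so no consistent extension beyond quadratics exists.

```python
# A sympy check of Groenewold's obstruction (my own sketch).  Classically
#   q^2 p^2 = {q^3, p^3}/9 = {q^2 p, q p^2}/3,
# but the Weyl (symmetrized) quantizations of the two right-hand sides
# disagree, so the quantization map cannot be consistently extended
# beyond quadratic polynomials.
import sympy as sp

x, hbar = sp.symbols('x hbar')
f = sp.Function('f')(x)

def Q(g):   # position operator: multiplication by x
    return x * g

def P(g):   # momentum operator: -i*hbar d/dx
    return -sp.I * hbar * sp.diff(g, x)

# Weyl-symmetrized quantizations of q^3, p^3, q^2 p, q p^2
W_q3  = lambda g: Q(Q(Q(g)))
W_p3  = lambda g: P(P(P(g)))
W_q2p = lambda g: (Q(Q(P(g))) + Q(P(Q(g))) + P(Q(Q(g)))) / 3
W_qp2 = lambda g: (Q(P(P(g))) + P(Q(P(g))) + P(P(Q(g)))) / 3

# Two candidate quantizations of q^2 p^2, via "bracket -> commutator/(i hbar)"
A = sp.expand((W_q3(W_p3(f)) - W_p3(W_q3(f))) / (9 * sp.I * hbar))
B = sp.expand((W_q2p(W_qp2(f)) - W_qp2(W_q2p(f))) / (3 * sp.I * hbar))

# They differ by the constant -hbar^2/3, not by zero
assert sp.simplify(A - B + hbar**2 * f / 3) == 0
```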

For the infinite-dimensional linear phase spaces of quantum field theory, Stone-von Neumann is no longer true, and the fact that knowing the operator commutation relations no longer determines the state space is one source of the much greater complexity of QFT.

Something that isn’t covered in the notes is how to go the other way: given a unitary representation, how do you get a symplectic manifold? This is part of the still somewhat mysterious “orbit method” story, which associates co-adjoint orbits to representations. The center of the universal enveloping algebra (the Casimir operators) acts as specific scalars on an irreducible representation. Going from the universal enveloping algebra to the polynomial algebra on the Lie algebra, fixing these scalars fixes the orbit.

Note that slogan one is somewhat in tension with slogan zero, since it claims that classical physics is basically about a Lie algebra (with quantum physics a representation of the Lie algebra). From the slogan zero point of view of classical physics as hard to understand emergent behavior from quantum physics, there seems no reason for the tight link between classical phase spaces and representations given by the orbit method.

For me, one aspect of the significance of this slogan is that it makes me suspicious of all attempts to derive quantum mechanics from some other supposedly more fundamental theory (see for instance here). In our modern understanding of mathematics, Lie groups and their representations are unifying, fundamental objects that occur throughout different parts of the subject. Absent some dramatic experimental evidence, the claim that quantum mechanics needs to be understood in terms of some very different objects and concepts seems to me plausible only if such concepts are as mathematically deep and powerful as Lie groups and their representations.

For more about this, wait for the next slogan, which I’ll try and write about next week, when I’ll be visiting the Bay area, partly on vacation, but partly to learn some more mathematics.

Peter,

In your opinion, should physicists worry about the fact that the Stone-von Neumann theorem fails for infinitely many degrees of freedom, or is it of purely mathematical interest? I mean that one can always assume an arbitrarily high, finite number of degrees of freedom and get back a unique state space given the commutation relations.

“In our modern understanding of mathematics, Lie groups and their representations are unifying, fundamental objects that occur throughout different parts of the subject. Absent some dramatic experimental evidence, the claim that quantum mechanics needs to be understood in terms of some very different objects and concepts seems to me plausible only if such concepts are as mathematically deep and powerful as Lie groups and their representations.”

OK. but then why complex unitary representations? There are two possible answers I can think of to this.

1. If you imagine that the basic structures of quantum theory are fixed in advance of considering symmetry transformations, then they have to correspond to unitary representations.

By the “basic structures”, I mean that the observables are a von Neumann algebra, the “properties” are projection operators on a Hilbert space, or some other statement like that. Then, you can derive the state space structure from Gleason’s theorem, and the structure of symmetry transformations from Wigner’s theorem. Specifically, once you have derived the probability rule from Gleason’s theorem, you can show that the unitary transformations are the only automorphisms of the state space that preserve probabilities and are continuously connected to the identity.

Presumably we want symmetry transformations to preserve probability, otherwise there would be observable violations of the symmetry, and we want continuous symmetries to be continuously connected to the identity. Discrete symmetries do correspond to antiunitary transformations of course, which is the other thing allowed by Wigner’s theorem when you drop the continuity requirement.

2. If you want to say that the basic structures of quantum theory themselves come from symmetry representations, then you haven’t got a state space or probability rule fixed in advance. Presumably then you could investigate whether viable physical theories can be generated by varying the kind of group representation. You would then have to investigate whether a reasonable state space and probability rule could be defined for such structures.

Now, if you go this route, then we already know that ordinary Hilbert space QM is not the end of the story. We can construct viable theories on Jordan algebras for example, but these seem not to be instantiated in nature. It seems to me that if you want to go the “everything is representation theory” route then you have quite a job to do explaining why such things are ruled out.

Now, personally, I take the first point of view. This means that, for me, the ubiquity of unitary representations in quantum theory is not something particularly deep. That’s just the only way it could possibly be given the basic structures of the theory. It’s really the only possible way that symmetry transformations could be implemented.

This means that, for me, “quantum theory is representation theory” is wrong. I might agree that “quantum physics is representation theory” because the fact that modern physics is based on symmetry is something that I regard as deep, so representation theory is the way to embed the physics, i.e. the physical interpretation of the operators, into an otherwise abstract probabilistic theory. But, the abstract probabilistic theory comes first, and needs its own justification independent of symmetry considerations.

Alex,

Yes, you can approach any QFT problem by cutting off the number of degrees of freedom, and then studying what happens as you make that number larger and larger. What makes things tricky (and interesting…) is that the limit is generally going to be singular, with the limiting theory having different properties. This is why renormalization theory is very non-trivial. Also, in condensed matter systems you get phenomena that only occur in the limit. I guess this is one example of “emergence”.

If you can bypass this, and directly mathematically formulate the limiting theory, that could be very powerful. The failure of Stone-von Neumann tells you that certain things that make life much simpler are no longer going to be true in the limiting theory.

Matt,

In representation theory in mathematics in general, one mostly works with complex, unitary representations. One could instead work with quaternionic or real representations, but this is most easily done by starting with a complex representation and asking for some extra structure. For many groups (e.g. finite or compact Lie groups), all representations are unitary (or, more precisely, unitarizable). Even for non-compact groups and infinite-dimensional irreducible representations, it’s typically the unitary representations that are interesting.

But non-unitary reps do occur in physics, for example the spinor rep of SL(2,C), and the state space of “ghosts” used in dealing with gauge symmetry. So, I’m just not convinced of the primary role of the probabilistic interpretation. Of course, sure, when you look at those parts of the theory where a probabilistic interpretation is going to connect the theory to experimental predictions, in those parts of the theory you’ll have to have unitarity.

In some sense the point of the slogan is to argue that there’s more structure in fundamental quantum systems than just a von Neumann algebra. One is typically dealing with a universal enveloping algebra of a Lie algebra (not e.g. a Jordan algebra). The operators that generate the algebra of observables are given by the representation operators for a basis of a Lie algebra. From the point of view of the conventional formalism, it’s very surprising that the Q,P operators that generate the operator algebra come from a Lie algebra and a group representation.

Of course, this slogan has nothing to say about why certain Lie algebras occur, that remains a mystery.

typo: univeral enveloping algebra

Hi, Peter. Have you ever taken a look at these papers, which were seminal in launching the idea of quantization as deformation, with representation theory playing a secondary role? The thing is that a classical Poisson algebra is more than just a Lie algebra: it also has a commutative product and a Leibniz rule for the bracket with respect to the product. Quantizing by simply representing the Lie algebra part of the structure can easily “break” the product structure and cause lots of problems, like Groenewold-type no-go results, the need to choose polarizations on the phase space, the need to choose the appropriate Lie subalgebra to represent, deciding which of these choices lead to equivalent or inequivalent quantizations, deciding what constitutes a classical limit, etc.

All of these issues are dealt with head on in deformation quantization. Moreover, once a classical Poisson algebra is deformed to a quantum non-commutative algebra, the non-commutative product defines the commutator bracket, allowing Lie algebras of symmetries to be represented in it. GNS type constructions give you representations of the quantum algebra on Hilbert spaces. Thus, unitary representations of Lie algebras of symmetries still appear.
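For a flat phase space the deformed product can be written down explicitly as the Moyal star product, which for polynomial observables is a finite sum. A small sympy sketch of my own (illustrating the standard formula, not anything specific from the comment): the star-commutator reproduces iħ times the Poisson bracket, exactly for observables of degree at most two, with ħ² corrections beyond that.

```python
# A sympy sketch of the Moyal star product on R^2 (my own illustration of
# the standard formula); for polynomial observables the series terminates.
import sympy as sp

q, p, hbar = sp.symbols('q p hbar')

def star(f, g, order=6):
    """Moyal product: sum over n of (i*hbar/2)^n / n! times the n-th
    bidifferential term; exact when order >= total polynomial degree."""
    total = 0
    for n in range(order + 1):
        term = sum((-1)**k * sp.binomial(n, k)
                   * sp.diff(f, q, n - k, p, k)
                   * sp.diff(g, q, k, p, n - k)
                   for k in range(n + 1))
        total += (sp.I * hbar / 2)**n / sp.factorial(n) * term
    return sp.expand(total)

def moyal_bracket(f, g):
    return sp.expand((star(f, g) - star(g, f)) / (sp.I * hbar))

# The star-commutator reproduces the Poisson bracket: exactly for degree <= 2 ...
assert moyal_bracket(q, p) == 1                       # {q, p} = 1
assert sp.expand(moyal_bracket(q**2, p**2) - 4*q*p) == 0   # {q^2, p^2} = 4qp
# ... but with hbar^2 corrections beyond quadratics:
assert sp.simplify(moyal_bracket(q**3, p**3) - 9*q**2*p**2) != 0
```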

Of course, deformation quantization is not a construction or a prescription, rather it’s a definition. On the other hand, the same can be said about quantization as Lie algebra representation. However, once given a definition, one can go and find constructive methods that satisfy it. This has actually been done with great success for deformation quantization.

verissimo,

Thanks, fixed.

Igor,

Thanks for the explanation, I’ve never completely understood how deformation quantization works. I gather the idea is to get a notion of quantization that gives something for any symplectic manifold. That’s very interesting mathematically, but kind of an opposite direction to what I’m most interested in, understanding how quantization works in the very specific cases that occur in fundamental physics.

Experience with CFT makes one expect that the infinite-dimensional groups of gauge transformations and diffeomorphisms should acquire an extension after quantization.

The group of gauge transformations in 3D has two types of extensions: the central extension [Etingof-Frenkel 1994] and the Mickelsson-Faddeev extension [Pressley-Segal section 4.10]. Only the latter corresponds to gauge anomalies in QFT.

The MF group is a “bad” extension in the sense that it lacks natural unitary representations [Pickrell 1989]. IMO this explains why gauge anomalies must cancel in the SM: nature abhors groups without good unitary representations.

The central extension can hence not arise in QFT except in 1D. To realize it, a more general theory is necessary.

The situation is analogous for diffeomorphisms. There are no diff anomalies at all in QFT in 3+1D, but the diffeomorphism algebra has Virasoro-like extensions in every dimension. Hence these can only be realized in a more general theory than QFT.

When the diffeomorphism algebra acquires an extension, it ceases to be a gauge symmetry, and local observables become possible in QG.

And look, I managed to say that without any self-reference at all.

Peter,

Those original papers are actually quite good in providing motivation for the deformation quantization (DQ) point of view even for the specific cases I think you have in mind (free particle, oscillator, hydrogen atom). Personally, I like to view specific cases as instances of something more general, and DQ does just that.

Here’s an example of how it helps. In your own notes (Ch. 14), you say that going beyond quadratic polynomials, the assignment of operators to classical observables is highly ambiguous/non-unique, referring to the well-known operator ordering ambiguity. So, just how much of this ambiguity is there, and is any of it redundant (giving physically equivalent quantizations)? It turns out that this question is well posed in DQ and has even been answered: all inequivalent classes of quantizations are parametrized by the 2nd de Rham cohomology of the phase space. In particular, standard R^2n has a unique quantization. Note that this result is independent of the Stone-von Neumann theorem (no need to exponentiate observables), and even says something stronger, because it looks not only at the quantum algebra itself but also at the specific choice of assignment of quantum operators to classical observables. I have not seen this question well formulated, let alone answered, in other points of view on quantization. So, to answer the question of how DQ works, one could say that its main benefit is to provide a good set of physically motivated definitions, sufficiently precise yet flexible enough to meaningfully answer questions such as the one above.

Peter, sorry to be off topic, but have you seen Gordon Kane’s letter to SciAm? Perhaps you might be willing to bet him that superpartners won’t be found in the next CERN run? He seems to be having trouble finding people willing to bet against it.

Peter,

I know this is a bit of a philosophical, unfalsifiable discussion, but I think I’m on Matt Leifer’s side here.

It’s not enough to say simply that quantum theory is representation theory. You also have to say that information is encoded in density matrices, or, at the very least, that you’ve got a theory of probabilities here, and then appeal to Gleason’s theorem to argue that the probability measure must be encoded in density matrices.

The postulate of probabilities is a nontrivial additional postulate you have to make, and once you’ve made it, and have density matrices in hand to encode it, the necessity of unitary (or anti-unitary) transformations becomes inevitable from the Wigner representation theorem, or, more intuitively, to preserve information under symmetry transformations.

You say that “non-unitary reps do occur in physics, for example the spinor rep of SL(2,C), and the state space of “ghosts” used in dealing with gauge symmetry.” I don’t think I follow. On the physical Hilbert space, all representations must be unitary or anti-unitary. Ghosts are not physical states—they’re vector states put in to make the given system’s Hilbert space more symmetric-looking and easier to work with. And non-unitary representations directly on fields are not fundamental features of quantum theory, but again are mere conveniences.

Reader297,

After spending more time with Schlosshauer’s book on decoherence while on a long plane trip, I’m more than ever suspicious that probability is not fundamental to QM, but just what you need to do when you break a quantum system up into two subsystems (e.g. a system under study and the apparatus/environment). But also spending time with his book “Elegance and Enigma” of interviews shows that opinions differ. Of the people in that book, I find myself most sympathetic with what Zurek has to say.

Given any structure, you can start with one end of it, and if it’s rigid enough, get to the others. Besides not seeing the necessity or desirability of starting with probability, I also don’t see how it gets you to any kind of explanation of the basic structures of QM: why are observables self-adjoint operators on a complex vector space? Why the Heisenberg commutation relations? Why Poisson bracket goes to commutator? Starting by taking as fundamental the mathematics that explains those structures seems to me more promising than starting with probability. I also suspect it is a more promising approach to QFT and the things we don’t understand about unification than starting with probability and trying to add other axioms with “physical” motivation.

Of course you get unitary representations in those cases where your state space is going to have an interpretation that will require unitarity. The point of the comment was just that the fundamental mathematical structure involves lots of use of representation theory, including non-unitary representations. One person’s “convenience” is another’s deep and powerful insight…

Jeff M,

Just saw that. Seems implausible that anyone would have trouble finding physicists who think the odds now favor no SUSY at the LHC. In any case, it should get interesting starting a year or so from now when results at higher energy start coming in. But, yet more discussion of this should wait for a more appropriate topic…

SL(2,C) is another possible number system for QM besides the real, complex, and quaternionic numbers. SL(2,C) arises out of a subalgebra of SO(2,4) ~ SU(2,2) by eliminating the “ghosts” and it provides another decomposition of the d’Alembertian. In fact QM over SL(2,C) is mathematically equivalent to spinors and Dirac’s equation. This gives rise not to a C* algebra but to a C*-Hilbert module. There is no QM over the octonions because it can be proven that QM obeys a “dynamic correspondence” between observables and generators, which rules out the exceptional Jordan algebra. This “dynamic correspondence” is related to Noether’s theorem.

Deformation quantization is a well established, mature domain which builds up, recursively in powers of hbar, an associative star product. There are inequivalent ways of building the star product (related to 2nd de Rham cohomology) and this makes the area hard (the symmetric part of the star product forms a Jordan algebra, while the skew-symmetric part generates a Lie algebra). The typical examples are Weyl quantization and Berezin quantization. Weyl quantization preserves symmetries but has convergence problems outside of flat R^{2n}, while Berezin preserves positivity. Berezin quantization is best understood in terms of creation and annihilation operators, which correspond to z and z-bar in complex coordinates. Deformation quantization maps QM in phase space to QM in Hilbert space and is, in a larger sense, a transition from commutative to non-commutative geometry. All Kähler manifolds admit a Berezin quantization if positivity is present, but if one starts from a pure classical system, one uses geometric quantization, which requires the Bohr-Sommerfeld quantization condition as a prerequisite.

The topic of Poisson manifolds is an active research area due to symplectic reduction. Some classical systems admit both an odd-dimensional Poisson manifold description and an even-dimensional symplectic manifold description. Even better, some systems are bi-Hamiltonian (solitonic systems).

Well, in my view the structure of quantum theory is a low energy phenomenon (a little bit jokingly: perhaps only a small deformation of classical mechanics), whereas various modern frameworks seem to assume that it holds unaltered down to the Planck scale. But that is against our scientific experience. I would bet that quantum theory is emergent (perhaps even an epiphenomenon). There is a lot of deep mathematics around which may live on a microscopic scale, with quantum theory emerging as some coarse-grained effective theory.

Dear Peter,

Three comments:

1) On the Stone-von Neumann theorem.

This concerns only integrable representations, not any representation of the Heisenberg Lie algebra. This is physically justified as one can only perform finite measurements, not infinitesimal ones.

2) On the Groenewold-van Hove theorem.

This also requires some integrability condition for some basic set of observables, see http://xxx.lanl.gov/abs/math-ph/9809015. Also, it has been generalized to other symplectic manifolds, see http://xxx.lanl.gov/abs/math-ph/9809011 or http://xxx.lanl.gov/abs/dg-ga/9605001

3) On the metaplectic algebra being a maximally quantizable Poisson subalgebra of the full Poisson algebra on R^{2n}. This is true, but it is not the only maximally quantizable Poisson algebra containing the q_i and p_i, see Sect. 5 of the first reference above.

Hi Peter,

Let me address your points in turn.

You write:

“After spending more time with Schlosshauer’s book on decoherence while on a long plane trip, I’m more than ever suspicious that probability is not fundamental to QM, but just what you need to do when you break a quantum system up into two subsystems (e.g. a system under study and the apparatus/enviro[n]ment).”

Suspicion is a wonderful and insufficiently appreciated sensibility. However, after decades of work on this idea, all we have are still only suspicions.

There have long been hopes of “deriving” probability ab initio from the bare mathematical formalism of quantum theory. Any such derivation, in particular, would represent a fundamental solution to the measurement problem without any a priori assumption of, say, the Born rule.

Advocates of the many-worlds interpretation have long argued that this goal is achievable, and much of their conviction derives from Zurek’s envariance and decoherence arguments. (These arguments have led to the idea that “decoherence alone solves the measurement problem.”)

However, no rigorous proof exists, and for good reason: probability is more than just mathematics. The problem of understanding what probability actually means is a deep one in philosophy. Getting to probability from non-probability is a hard problem.

On the other hand, once you assume a priori something probability-like — namely, that we should be talking about a probability measure for vectors in a Hilbert space — then you can follow Zurek or Gleason (or even Hugh Everett, if you read his original papers) to show that the probability formula has to be the Born rule.

But simply starting from abstract vectors and somehow deriving the existence of probability from scratch has long been regarded as an impossible problem. On the other hand, if you’ve found a way to solve it, I’m sure we’d all love to see it!

You write: “Given any structure, you can start with one end of it, and if it’s rigid enough, get to the others.”

That’s quite true. But not all ends are necessarily on the same footing. If one end is more general than the other ends, then the more general end is the safer starting place.

As I’ve said, starting without probability and getting to probability runs into some long-known and deep metaphysical obstructions. But if we do start with probability, then going the other way is easier and more general. If we start with the axiom that probability is encoded in density matrices through the Born rule Tr[(density matrix)(...)], then the necessity of unitary (or anti-unitary) representations for symmetries is a mathematical consequence. When decoherence diagonalizes the density matrix in a basis approximating coherent states, we get a structure that closely resembles classical phase space, and the commutator behaves algebraically similarly to Poisson brackets without having to assume any classical system a priori. (And that’s important, because quantum theory appears to be more fundamental than classical physics. We shouldn’t be starting with classical systems in the first place.)
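The subsystem/decoherence step mentioned here can be made concrete in the smallest possible example. In this sketch of my own (the amplitudes 0.7 and 0.3 are made up for illustration), a pure entangled state of a qubit and a two-state “environment” already yields a diagonal, mixed reduced density matrix once the environment is traced out:

```python
# A minimal numpy sketch (my own example, made-up amplitudes) of how tracing
# out an "environment" turns a pure entangled state into a diagonal mixture.
import numpy as np

# |psi> = a |0>|E0> + b |1>|E1>, with orthogonal environment states
a, b = np.sqrt(0.7), np.sqrt(0.3)
psi = a * np.kron([1, 0], [1, 0]) + b * np.kron([0, 1], [0, 1])

rho_full = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
rho_sys = np.trace(rho_full, axis1=1, axis2=3)   # partial trace over environment

# The reduced state is diagonal: probabilities 0.7 and 0.3, no coherences
assert np.allclose(rho_sys, np.diag([0.7, 0.3]))
```

The full state is pure, but every prediction about the subsystem alone comes from a density matrix that looks exactly like a classical probability distribution over two outcomes.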

You write: “Besides not seeing the necessity or desirability of starting with probability, I also don’t see how it gets you to any kind of explanation of the basic structures of QM: why are observables self-adjoint operators on a complex vector space?”

Once you say that probabilities are encoded in density matrices through the Born rule, then reality of the probabilities implies that all observables must be self-adjoint.

That obviously still leaves the deeper question of why we have to work with complex numbers. Of course, that’s a question either way, whether one starts with probabilities-in-density-matrices or with quantum-theory-is-representation-theory. We’re not going to be able to “derive” quantum theory starting from just one axiom!

But there are definitely some simple cases that motivate the idea of working with complex numbers. If we consider, say, a two-state system (which, by the way, doesn’t have Heisenberg commutation relations [q,p] = iħ), then even if we write down a real density matrix, there are genuinely complex observables that still always give real probabilities. That is, if we consider the most general matrices that combine with density matrices in the Born rule to give real probabilities, those are the self-adjoint complex matrices.
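A small numpy illustration of that last point (my own example, with matrices chosen arbitrarily): against a purely real density matrix, a genuinely complex matrix still gives a real expectation value via the Born rule, provided it is self-adjoint, and generically fails to when it isn’t.

```python
# A numpy illustration (my own example): against a real density matrix,
# self-adjoint complex matrices give real expectation values via the
# Born rule Tr[rho A]; non-self-adjoint ones generically do not.
import numpy as np

rho = np.array([[0.6, 0.2],
                [0.2, 0.4]])           # real density matrix: positive, trace 1

A = np.array([[1.0, 2j],
              [-2j, -1.0]])           # genuinely complex but Hermitian
assert abs(np.trace(rho @ A).imag) < 1e-12   # expectation value is real

B = np.array([[1.0, 2j],
              [2j, -1.0]])            # not Hermitian
assert abs(np.trace(rho @ B).imag) > 1e-12   # expectation value is not real
```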

Moreover, as you know, there are certain symmetry transformations you can do that are generated by self-adjoint operators.

So the motivation is that it’s a matter of generality. One could self-consistently consider just real-valued quantum theory (whether in the probabilities-in-density-matrices or the quantum-theory-as-representation-theory approaches), but it’s not the most general structure that is consistent with real probabilities. At the end of the day, however, someone has to postulate that complex numbers are a part of the story, whichever point of view one takes.

After all, even if we declare that quantum theory is just representation theory, we could always imagine a finite-dimensional real vector space acted on by real orthogonal matrices. That’s certainly not the most general thing we can do, but there’s no logical reason why we have to consider complex numbers unless we postulate that we must do so.

Finally, you write: “Of course you get unitary representations in those cases where your state space is going to have an interpretation that will require unitarity. The point of the comment was just that the fundamental mathematical structure involves lots of use of representation theory, including non-unitary representations. One person’s “convenience” is another’s deep and powerful insight…”

That’s true. But I suppose the issue here is that some things can be regarded as conveniences, whereas other things cannot be. And to a lot of folks, anything that cannot be regarded as a convenience is the deeper and more powerful insight.

PW: “After spending more time with Schlosshauer’s book on decoherence while on a long plane trip, I’m more than ever suspicious that probability is not fundamental to QM, but just what you need to do when you break a quantum system up into two subsystems (e.g. a system under study and the apparatus/environment).”

I don’t get this distinction at all. Any observation requires (is?) a breakup of the quantum system. Discussing quantum mechanics without this means you are not doing anything empirical. I thought we were against that here.

Hendrik,

Thanks. In comments here I’m often not even trying to be precise, but am trying to do so in the notes I am writing, so that’s helpful.

Reader297,

I wouldn’t claim to get probability from nothing, but it seems to me to come into play only at the point when you try to deal with the difficult issue of how to extract predictions about experimental results, formulated in terms of idealized descriptions in classical terms of your experimental set-up. This just seems to me at the other end of things from the question of what the fundamental theory is, but to some extent that’s a matter of perspective.

Experts I have been reading seem to agree that you need something beyond just decoherence to get the Born rule, but it’s unclear to me what it is. Zurek claims to derive the Born rule, presumably making some kind of assumption about “interpretation”, but also at other points claims to be agnostic about choice of interpretation. I’m having trouble putting my finger on where “probability” enters, and your comment referring to a deep problem in philosophy of course tempts me to see the question as one that might be on the other side of the nebulous philosophy/science divide from what I feel I have to be concerned about.

By the way, for decoherence experts, I’m curious whether when you “derive” the Born rule you are making some assumptions that could be in principle violated. Is there even a thought experiment in which you would see something that could be interpreted as a violation of the Born rule (or even thought experiments where there is ambiguity about what the Born rule means or says)?

I don’t think you answered my question about the Heisenberg commutation relations. Yes, for the two-state system, all you have is linear algebra of 2 by 2 matrices, so there’s likely only one way to do things. For canonical quantization though (which is the basis of our understanding not just of single-particle QM, but of the quantization of linear fields), how does one explain where the Heisenberg relations come from, other than by invoking a Lie algebra representation (and one that is not a “symmetry”) of the theory?

Zathras,

First of all, you’re making the mistake many people make of thinking my complaints about string theory/SUSY etc. are just based on lack of connection to experiment. That’s just not the case; if they were mathematically compelling I’d be a fan. In the case of QM my main interest is not in the hard problems of its connection to experiment (everyone seems to agree there is no such thing as an experiment in contradiction with the theory, and such a thing seems unlikely). Physical theories are also mathematical objects, with deep connections to mathematical issues, and those are what I’m often most interested in (with one justification being the lack of relevant experimental clues about how to improve the theory).

It has always struck me that people seem to have much less trouble accepting how probability enters in classical mechanics. If you forget about phase space, then all you have is the Poisson algebra of observables. Probability distributions, aka states, are then (positive, normalized) linear functionals on this algebra, physically interpreted as “expectation values”. A measure that is concentrated on a single point of the would-be phase space describes a given deterministic configuration of the system. Purely algebraically, such a state is an extremal point (a boundary point) of the convex space of all allowed states and is called pure. Thus, a statistical classical theory completely supersedes deterministic classical theory. For practical purposes, all our models are really statistical, and pure states are only idealizations that could be achieved on special subsystems of the universe and only under very special circumstances (these are the preparation protocols for controlled experiments).

Now, turning to quantum mechanics, the whole preceding paragraph can be repeated word for word with the same exact meaning, except that the algebra of observables is no longer Poisson, but is rather non-commutative. This also changes the condition of positivity. Pure states are still extremal points of the convex set of all allowed states, and are still only idealizations. Note that wave functions, Hilbert spaces, and the Born rule need not be brought in at this fundamental level of description of quantum mechanics, yet one has enough machinery to make some calculations and associate them with experimental outcomes.

At this point, pure mathematics tells us that a positive, normalized state on a possibly non-commutative algebra with conjugation (such as the quantum algebra of observables) always gives rise to a unitary representation of that algebra on a Hilbert space (this is the GNS theorem), such that applying the state equals taking the (Hilbert space) trace of this representation against some density matrix. Further, if the state is pure, then the representation is irreducible and the corresponding density matrix has rank one. It is in this way that pure states are associated with vectors in a Hilbert space (by factoring the corresponding rank-one density matrix), and it is in this way that the Born rule arises. Namely, if one associates pure states with vectors in a Hilbert space, as described above, then we already know that we need to ‘square’ that vector to get a density matrix, which is then paired with observables inside a trace to get expectation values: the essence of the Born rule.
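The trace formula here is easy to verify in the simplest finite-dimensional case. A minimal sketch (pure Python, 2x2 matrices as nested lists; this illustrates only the resulting trace formula, not the GNS construction itself, and the state and observable are arbitrary choices of mine): a pure state vector, its rank-one density matrix, and the equality of Tr(rho A) with the usual <psi|A|psi>.

```python
# Pure state vector for a two-level system (normalized: 0.6^2 + 0.8^2 = 1)
psi = [0.6, 0.8j]

def outer(u, v):
    # rank-one matrix |u><v|
    return [[a * b.conjugate() for b in v] for a in u]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

# Observable: sigma_z, a Hermitian 2x2 matrix
sigma_z = [[1, 0], [0, -1]]

# Density matrix of the pure state: rho = |psi><psi| (rank one)
rho = outer(psi, psi)

# Expectation value as a trace against the density matrix...
ev_trace = trace(matmul(rho, sigma_z))

# ...equals the usual <psi| A |psi> ("squaring" the vector: the Born rule)
Apsi = [sum(sigma_z[i][j] * psi[j] for j in range(2)) for i in range(2)]
ev_direct = sum(psi[i].conjugate() * Apsi[i] for i in range(2))

print(ev_trace, ev_direct)  # both -0.28, up to float rounding
```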

Granted, there was a notion of probability put into this discussion at the very beginning. So it does not address the desires of those who would like to see probabilities emerge from the formalism itself. However, given that probabilities are introduced in the same way as in classical theory, which I see very few people complain about, the Born rule follows automatically. To all those who are concerned by these issues, is this argument not already satisfactory to justify the Born rule?

Hi Peter,

I like your response a lot. Let me address your comments in turn.

You say: “I wouldn’t claim to get probability from nothing, but it seems to me to come into play only at the point when you try to deal with the difficult issue of how to extract predictions about experimental results, formulated in terms of idealized descriptions in classical terms of your experimental set-up. This just seems to me at the other end of things from the question of what the fundamental theory is, but to some extent that’s a matter of perspective.”

I happen to agree with you. I think a lot hinges on one’s perspective.

That is, I think it all depends on whether one regards quantum theory from a purely instrumentalist standpoint, namely, as nothing more than a tool for extracting the probabilities for measurement outcomes obtained by abstract observers “outside” of quantum theory. In that case, I agree with you that we can put those experimental questions aside for the moment and just talk about the internal, bare (and sublime) mathematical theory absent any questions about observers, measurements, or probabilities. (However, then we don’t have the complete quantum theory, because those probabilities ultimately need to get in there somehow!)

But there are other perspectives in which one tries to put observers into quantum theory as physical quantum systems in their own right, and then one runs directly into what it means to be an observer who is a quantum system in the Hilbert space. What does it mean to say that such an observer experiences probabilities, and why? How do bare mathematical state vectors turn into probabilities? Presumably, a quantum observer living in the Hilbert space has some state of being or experience — or maybe not! And is there some probability for an observer to be in a certain state of being? What does that even mean?

If one takes this more realist/physicalist perspective as I have just described it, rather than instead taking the instrumentalist perspective I mentioned above, then one cannot put the probability questions aside, because they are somehow entirely contained inside the bare mathematical theory at a very deep level.

You also write: “I’m having trouble putting my finger on where “probability” enters…” Me too. So far nobody has offered more than a hand-wavy “insert magic here” argument for exactly where probability enters, which is why other people just bite the bullet and postulate probabilities as a fundamental axiom.

But, again, even if one takes an instrumentalist/bare-mathematical/representation-theory definition of quantum theory, one still has to deal with this step of getting probability into the theory somehow — axiomatically, I’d argue. And that means quantum theory is more than just representation theory. It’s also a theory of probability, and that’s something more than just representation theory per se. You need another axiom at least.

One can choose to ignore this additional probability axiom temporarily, of course, and focus on the mathematics of representation theory, but then one wouldn’t be doing actual quantum theory, but an (again sublime) mathematical shadow of quantum theory.

Later, you write: “I don’t think you answered my question about the Heisenberg commutation relations.”

Woops! I apologize! Let me try to do it in the context of your next several comments.

You write: “Yes, for the two-state system, all you have is linear algebra of 2 by 2 matrices, so there’s likely only one way to do things.”

Right. There’s no classical counterpart to this system, and that’s part of the reason why I’ve argued that one shouldn’t take classical systems as a fundamental starting point for quantum systems, nor regard canonical quantization as a physical thing, but merely a helpful mnemonic. Quantum systems appear to be more fundamental than classical descriptions, and so the arrow of reasoning goes from quantum to classical (in the appropriate decoherent/macroscopic/semiclassical limit).

Next, you write: “For canonical quantization though (which is the basis of our understanding not just of single-particle QM, but of the quantization of linear fields), how does one explain where the Heisenberg relations come from, other than by invoking a Lie algebra representation (and one that is not a “symmetry”) of the theory?”

Here I think I would cite Steven Weinberg, who addresses precisely this question in the introduction to his new textbook on quantum theory. (“Lectures on Quantum Mechanics.”) I think we can probably all agree that the guy who coined the term “Standard Model” knows a thing or two about quantum field theories (fallacy of appeal to authority—I’m guilty as charged!), and he has long believed (and vehemently argued) that the canonical-quantization procedure is nothing more than a convenient mnemonic that works at a practical level in certain situations, but is actually quite misleading especially in more general circumstances.

Instead, as I think you would enormously appreciate, Weinberg has argued that quantum systems should be defined by imposing symmetries on their Hilbert spaces. In particular, he argues in his QFT books that symmetries must be represented unitarily (or anti-unitarily for certain discrete symmetries), and when you consider the unitary action of the Poincare group on states of a Hilbert space, the generators can be shown to satisfy the Poincare algebra (after much annoying work fixing phases, as Weinberg goes through in gory detail), and the P^0 operator becomes what we identify as the system’s Hamiltonian H. For a system whose H is local, Weinberg argues that the subspace spanned by low-energy eigenstates can always be approximated by a local QFT, and, as you no doubt know, he calls that idea effective field theory.

However, because the effective field theory is only to be regarded as an approximate description of the low-energy part of the full Hilbert space, the field operators are not fundamental entities, and the Heisenberg commutation relations satisfied by the field operators (at least in the bosonic case in which we have them at all) are not fundamental, let alone axiomatic.

In the non-relativistic single-particle limit, one can show that this Hamiltonian reduces to p^2/2m + V(x), where [x,p]=ih. In fact, one can even write down what x is in terms of K (the boost generators) and P (the momentum/translation generators). (Exercise!)

At no point does Weinberg invoke canonical quantization anywhere. And, indeed, there are quantum field theories that don’t seem to have classical limits and/or Heisenberg-type commutation relations, such as fermionic field theories as well as the large classes of field theories that lack a Lagrangian description. But the Hilbert spaces don’t care whether there’s a nice classical limit.

I suppose one could argue that specifying the Poincare group is no better metaphysically speaking than specifying the Heisenberg commutation relations. We’re still effectively postulating something to constrain and define our quantum system of interest, and the end result is a set of operators that satisfy a certain algebra.

But I think Weinberg’s point is that this structure isn’t a universal postulate or axiom of quantum theory, but just a matter of providing one way (among many) to define a particular quantum theory of interest. That is, it’s just something we do out of convenience when we know a priori that a theory has to have certain symmetries. For other kinds of quantum systems, such as two-level systems, we don’t have to take that approach. We can just postulate a Hamiltonian and go from there.

Igor Khavkine– Please see my earlier comments about instrumentalism.

In particular, the situation you describe for classical systems is what one would call classical instrumentalism. That is, we just regard the formalism as a recipe for computing expectation values and probabilities, and nothing more. In that sense, there is definitely a parallel to quantum instrumentalism, with some “simple” replacements. However, I think Peter would argue that it’s not at all obvious why we replace classical observables with non-commuting quantum operators. And, again, I think most people would agree that once you say “probability,” there’s a unique prescription (the Born rule) for those probabilities.

But I’d like to focus on your prefatory statements: “It has always struck me that people seem to have much less trouble accepting how probability enters in classical mechanics. If you forget about phase space…”

I think your last clause is the key thing. In classical physics, you can certainly forget about phase space if you wish, but you don’t have to. And that’s important. You can always assume a phase space if you wish, in which case you’re not doing instrumentalism anymore, but are assuming that there is a reality under those probability distributions. That provides people with a metaphysical security blanket.

In particular, in classical physics, even when you have a mixed state, you can always assume (again, if you wish) that deterministic, definite states exist, even though they’re not known with certainty.

But the weirdness of quantum theory is that this trick doesn’t seem to be available anymore. What if we don’t want to “forget about phase space” in the quantum case? Do we have the liberty to not forget it in quantum theory? Is there something there that we could choose not to forget, something analogous to classical phase space that we can choose to regard the Born rule as a probability of, apart from just the probability of a measurement outcome? If a system has a mixed state that isn’t an extremal point of the convex set of allowed probability states, can we still assume that the system has a deterministic, definite state of being that’s just not known with certainty, as we could choose to do in the classical case of a mixed state?

So when you write “However, given that probabilities are introduced in the same way as in classical theory…”, I think that not everyone would agree that it’s the same story in quantum theory. At the level of an instrumentalist perspective, perhaps yes, with my aforementioned provisos, but we have a much richer picture available to us in the classical case for understanding probabilities, and the absence of an obvious such picture in the quantum case makes quantum theory and quantum probabilities so much more mysterious and troubling.

Reader297,

I really do disagree with Weinberg here. In general I’m no fan of the “your deep, beautiful mathematical structure is just an effective theory, a low energy approximation to some more fundamental theory, although I don’t know what this is” argument. And in this particular case, the mathematics is very deep, and we don’t have anything better. Part of the beauty of the story is that the fermionic case runs perfectly parallel, and a big part of what I wanted to write about in the QM notes is exactly this story. Among physicists there seems to be an attitude that these are just ad hoc rules of thumb, but I think an appreciation of the coherence of the mathematics gives strong evidence against this.

Dear Peter, well, there do exist quite a few approaches which try to derive for example quantum theory from a deeper level. See e.g. the work of ‘t Hooft or my recent arXiv:1205.1619 (using as a technical tool among other things the concept of cellular networks). I do not claim that these speculations are conclusive but we are convinced that something in this direction will ultimately work. After all, classical mechanics (symplectic geometry) is also beautiful from a mathematical point of view while quantum theory is a beautiful theory as well but is completely different.

Manfred,

From the point of representation theory, irreps are the fundamental objects, and irreps behave like quantum theory (although not as quantum *field* theory except in low spacetime dimensions). So if the fundamenta of math and physics are the same, quantum theory is fundamental.

It was a while since I looked at ‘t Hooft’s Planck-scale determinism, but iirc his motivation for looking at hidden variable theories was locality. Of this I approve, but it is possible to have locality in QG without abandoning quantum theory. Recall the standard argument why there are no local observables in QG:

1) An observable in quantum theory is an operator which commutes with all gauge transformations.

2) In GR all diffeomorphisms are gauge transformations.

3) Hence there can be no observables in QG that depend on spacetime coordinates.

The flaw in this argument is that it assumes that the gauge symmetries of classical and quantum gravity are the same. If the algebra of diffeomorphisms acquires an extension (to avoid self-referencing, see the seminal paper of Rao and Moody 1994), this is not true.

So does it ever happen that the classical and quantum theories have different gauge symmetries? Yes, one (or 25) such cases are described in [GSW 1988], section 2. The free string can be defined in any number of dimensions, and it has a conformal (or Weyl) symmetry that is a gauge symmetry. After quantization, we must distinguish between three cases:

1) The critical free string D = 26. There is no anomaly, the theory is consistent, and the conformal symmetry is gauge.

2) The supercritical free string. The theory is inconsistent, because there are ghosts (negative-norm states) in the physical spectrum, and we can forget about it.

3) The subcritical free string. The theory has a conformal gauge anomaly, but is nevertheless consistent because there are no ghosts in the physical spectrum. However, the conformal symmetry can no longer be gauge, due to the conformal anomaly.

So in the subcritical free string, the quantum and classical theories (both consistent) have different gauge symmetries. It does happen!

(And yes, I know that the subcritical interacting string is inconsistent, but that was not what I was talking about.)

Hi Peter,

Those are fair points!

First, I wanted to give one more concrete example of where the Heisenberg commutation relations [x,p]=ih can come from without canonical quantization or an a priori classical system.

If you are given a QM system with an infinite basis that we can label with one continuous real label x, then we can define a unitary translation operator that changes |x> to |x+a> for any real a. The infinitesimal generator of that unitary translation operator is then just p, and a simple calculation shows that [x,p]=ih. From a Noether standpoint, we call the generator of translations the momentum by definition. So this is a self-contained way to introduce and define p and obtain [x,p]=ih without reference to an a priori classical system. What’s lovely is that the eigenstates of z=x+ip then form a natural QM analogue to classical phase space, but have Gaussian wave functions with a size in x,p given by Planck’s constant h, so one also finds phase-space quantization. So in the limit h->0, we get back a classical phase space.
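That simple calculation can even be checked numerically in a crude discretization. A sketch (pure Python with hbar set to 1; the grid size and Gaussian test function are arbitrary choices of mine): represent p = -i hbar d/dx by a central difference and verify that [x, p] psi is approximately i hbar psi on a smooth test function.

```python
import math

hbar = 1.0
N, L = 2001, 20.0                  # grid points and box size (arbitrary)
dx = L / (N - 1)
xs = [-L / 2 + i * dx for i in range(N)]

# Smooth test wave function: a Gaussian
psi = [math.exp(-x * x / 2) for x in xs]

def ddx(f):
    # central-difference derivative, returned on the interior points
    return [(f[i + 1] - f[i - 1]) / (2 * dx) for i in range(1, len(f) - 1)]

# p = -i hbar d/dx acting on psi, and on (x * psi)
p_psi = [-1j * hbar * v for v in ddx(psi)]
p_xpsi = [-1j * hbar * v for v in ddx([x * f for x, f in zip(xs, psi)])]

# [x, p] psi = x (p psi) - p (x psi), on the interior points
comm = [xs[i + 1] * p_psi[i] - p_xpsi[i] for i in range(len(p_psi))]

# Compare with i * hbar * psi near the center of the grid
mid = len(comm) // 2
err = abs(comm[mid] - 1j * hbar * psi[mid + 1])
print(err)  # small discretization error
```

The residual error is O(dx^2), the accuracy of the central-difference stencil, so refining the grid drives the commutator toward i hbar exactly.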

Now to your more recent comments. You write: “I really do disagree with Weinberg here. In general I’m no fan of the “your deep, beautiful mathematical structure is just an effective theory, a low energy approximation to some more fundamental theory, although I don’t know what this is” argument.”

There’s definitely room for philosophical disagreement here. But I think there are a couple of things that need to be said about it.

One of Weinberg’s motivations here comes from condensed-matter theory. In many of the kinds of systems they study, there exists a well-defined long-distance regime that looks just like a field theory (sometimes even a CFT), regardless of what the exact microscopic model actually is. You could have atoms on a lattice, you could have graphene, you could have a fluid, etc. — really qualitatively different microscopic models — and yet the long-distance description looks like a field theory.

So in this case, there isn’t an “I don’t know what this is.” In lots of cases, we know what the microscopic model actually is. And the long-distance limit still looks like a field theory.

Now, what’s really interesting about this phenomenon is that the mapping of microscopic models to field theories is many-to-one — many qualitatively different microscopic models can share the same long-distance field theory — and each equivalence class (all microscopic models that share the same long-distance field theory) is called a universality class.

The existence of universality classes is a double-edged sword. On the one hand, they imply that by studying a single field theory, you can make statements and derive properties and calculate things (all in the long-distance regime) that hold for a huge class of microscopic models. That’s why it’s called effective field theory — because it’s really effective!

However, on the other hand, if you are simply handed the long-distance field theory, the many-to-one-ness of the mapping — that is, the large number of models in that universality class — implies that you’ll never be able to guess the correct underlying microscopic model, unless of course you can do some sort of high-resolution experiment to see what’s actually going on at short distances.

So we have a clear case study in which an elegant looking field theory really is just a long-distance limit of something that looks qualitatively different from a field theory — and, indeed, there’s a huge class of qualitatively different microscopic models that look like the same field theory at long distances. This really does happen in physics!

The upshot is that when people like Weinberg look at the field theories of particle physics, such as the Standard Model, there is no longer the belief that people had back in, say, the 1940s that the field theory is necessarily fundamental, nor that the underlying microscopic model is itself a field theory.

This hunch is backed up by the fact — which Weinberg spends his field theory books explaining — that local, weakly-interacting, Poincare-invariant quantum systems canonically look at low energies/long distances like field theories. (Even GR works this way — see the myriad papers by John Donoghue on the arXiv.)

So the fact that you have a field theory at low energy doesn’t tell you either way whether the microscopic model is a field theory or something truly qualitatively different.

One can certainly posit examples (like QCD or CFTs, or even the AdS/CFT correspondence) where the high-energy limit is still a field theory. But in a given practical case, that would be a huge assumption.

You write: “And in this particular case, the mathematics is very deep, and we don’t have anything better. Part of the beauty of the story is that the fermionic case runs perfectly parallel… Among physicists there seems to be an attitude that these are just ad hoc rules of thumb, but I think an appreciation of the coherence of the mathematics gives strong evidence against this.”

These lines seem suspiciously similar to what string theorists like to say!

And I think there’s a good reason why.

At the end of the day, you are proposing that because nature looks like a field theory at long distances/low energies, it ought to look like a field theory at short distances/high energies, because field theories are somehow mathematically elegant and coherent. That’s one proposal for what the microscopic models should be, and the string theorists propose a qualitatively different microscopic model that also looks like a field theory at long distances/low energies. The LQG people propose another. Etc.

What all these approaches have in common is (1) they are motivated in part by mathematical coherence and aesthetic appeal, (2) they look like field theories at long distances/low energies, as is inevitable (a la Weinbergian EFT reasoning) given that they are consistent with Poincare invariance, local interactions, and quantum theory, and (3) they are completely beyond current experimental confirmation.

Dare I say that — dum dum dum — they are ***not even wrong***?

Reader297 wrote: “In classical physics, you can certainly forget about phase space if you wish, but you don’t have to. And that’s important.”

Or is it? (Warning, rhetorical question!) This situation is, to me, a clear example of a case where there are two absolutely mathematically equivalent formulations of the same theory, though one is couched in terminology that leads to all sorts of philosophical difficulties (upon quantization, that is), while the other is not. What I will perhaps never understand is how some people, when faced with a choice between these mathematically equivalent formulations, consistently go for the one laden with philosophical difficulties. Unless they are simply not aware of the alternative (though, in my experience, many of them are).

Reader297,

I understand very well Weinberg’s argument, just am not convinced. He also famously argued in the intro to his GR book that another kind of mathematics, geometry, was not needed for fundamental physics (he did this right before it became clear that gauge theory, a geometrical theory, was the way forward in particle theory). Yes, there too you can argue that geometry is just a low energy effective epiphenomenon if you want, but I don’t think it’s a convincing argument.

The argument against the “it’s just a low energy approximation viewpoint” is the LHC data: qft and gauge theory work perfectly at the shortest distances we can test (which, by the way, is rather a different situation than string theory. I was trying to keep some of these postings free of tedious arguments about string theory, please help…)

For those not aware of Weinberg’s views on geometry, see

http://www.math.columbia.edu/~woit/wordpress/?p=529

I’ve deleted some comments trying to tediously argue about string theory. Please, unless you can seriously relate this to the topic of this posting and have something new to say, just stop.

Hi Peter,

I’m agreed about avoiding string-theory discussions here, and keeping to the current topic.

You say “I understand very well Weinberg’s argument, just am not convinced.”

I think that’s fair. And it represents a basic philosophical impasse: If one person says s/he simply isn’t convinced, then that’s that, at least until experimental data can tell us which point of view is the more accurate one.

But what everyone has to admit is that the opposite point of view is a legitimate one as well — namely, that field theory won’t turn out to be fundamental — again until experiment can weigh in on the issue. People aren’t crazy to assume QFT is fundamental, nor are they crazy to assume it isn’t, because nature would look like a QFT at low energies either way. Perhaps the proper attitude about the high-energy regime is just to be agnostic at this point in time. People should work on what they like, and be tolerant of people who have a different theology about the presently unknown and unconstrained high-energy regime.

You also write: “He also famously argued in the intro to his GR book that another kind of mathematics, geometry, was not needed for fundamental physics (he did this right before it became clear that gauge theory, a geometrical theory, was the way forward in particle theory). Yes, there too you can argue that geometry is just a low energy effective epiphenomenon if you want, but I don’t think it’s a convincing argument.”

Gravity is a great thing to bring up in this discussion, because it appears to be something qualitatively different from a local QFT, as it seems to involve a change (now properly acknowledged by Weinberg) in the structure of spacetime itself, which is not something local QFTs do.

We have a pretty good low-energy effective QFT for general relativity for long-wavelength perturbations of the metric around a fixed background (again, see, e.g., Donoghue’s work), and we can even use it to extract some quantum corrections to, say, Newton’s law of gravitation, but there are basic conceptual problems that arise when one allows the geometry of spacetime to fluctuate nonperturbatively and if one wants to probe short-distance physics.

So taking seriously the geometric interpretation of GR is precisely what makes it hard to describe it in terms of a local QFT.

When Weinberg wrote that textbook in 1972, there were justifiable hopes that GR might ultimately admit a reasonably straightforward fundamental QFT description based on spin-2 massless particles, rather than the spin-1 massless particles underlying QED/QCD/Standard Model. Indeed, Weinberg showed in some papers in the late 1960s that a massless spin-2 particle had to couple to mass-energy as its source, and had to satisfy the equivalence principle. (That’s pretty amazing, when you think about it.)

If the QG-as-QFT approach had worked out, and we had one big QFT that contained all of basic physics, then people could well have argued that field theory was in fact fundamental after all, as you say it is.

But it turned out that GR did not obviously admit a straightforward description in terms of a QFT. That’s precisely the reason why so many people looked elsewhere for the microscopic theory of quantum gravity, knowing that any such microscopic description would look like a local field theory at low energies anyway, so why not look for something different in the short-distance regime?

What’s remarkable, of course, is that we now seem to have nontrivial conjectural examples of quantum gravity with certain asymptotic boundary conditions (e.g., AdS) in which there does appear to be a highly non-obvious description in terms of a local QFT. (For AdS, famously, it appears to be a CFT.) What’s ironic is that it was precisely folks who were looking at a non-QFT microscopic description (string theory) who stumbled on this discovery!

In any event, it is precisely taking a non-Weinbergian viewpoint toward the geometric picture of GR that motivated lots of people to take the Weinbergian viewpoint about local QFTs not being fundamental. But with examples like AdS/CFT, we might in the end be able to be non-Weinbergian about both viewpoints.

What are your feelings about AdS/CFT, by the way? Do you feel it has partially vindicated your hopes that we can describe fundamental quantum gravity in certain cases as a field theory?

You also write: “The argument against the “it’s just a low energy approximation viewpoint” is the LHC data: qft and gauge theory work perfectly at the shortest distances we can test.”

Well, yes, except that the shortest distances we can test aren’t very short in the grand scheme of things — they aren’t very short compared to the scales that people are actually arguing about when they talk about the question of non-fundamentalness of local QFT — unless the Planck scale is for some reason exponentially closer to us than we think it is.

That is, most people think that QFT in some form or another will work quite well at far shorter scales than we have probed at the LHC. The place people have long thought things would break down is, as you know, some 10 or 11 orders of magnitude higher in energy/shorter in distance. So the LHC isn’t telling us one way or another about the regime people are actually worried about. The fact that the LHC is consistent with gauge theories isn’t relevant to those questions, unfortunately.

That is, we don’t have anything close to the experimental data that needs to weigh in on the question of who’s correct about the fundamentalness of local QFT. That’s why I referred to any present-day speculation about this question as being “not even wrong.”

Igor Khavkine–

Well, what you’re saying is that if we take an instrumentalist point of view toward probability theory, then quantum probability theory is no worse than classical probability theory. And that’s long been pretty much accepted as true. The orthodox or instrumentalist point of view is very old, and is the refuge most people turn to when they give up trying to make more sense of quantum theory.

But what troubles people is precisely the question of instrumentalism. Not everyone wants to be an instrumentalist, for basic philosophical reasons.

In the classical case, we have the option of being non-instrumentalist if we wish — some people really like to think they personally have a state of being, and that people and detectors and measurement apparatuses are just as much physical systems as the things they examine — whereas in the quantum case, we don’t seem to have such a straightforward non-instrumentalist option.

You write: “What I will perhaps never understand is how some people, when faced with a choice between these mathematically equivalent formulations, consistently go for the one laden with philosophical difficulties.”

What I take from this statement is that you have no personal philosophical problem with instrumentalism, and don’t appreciate why some people might not like instrumentalism.

So I think you should go out and ask them why! If you don’t know of any reasons to be troubled by instrumentalism, then you might find some answers by doing a little research on the question.

Reader297,

Sorry, no interest at all in yet another empty time-wasting debate about QG, see:

http://www.math.columbia.edu/~woit/wordpress/?page_id=4338

I’m making a serious effort here to discuss something different and more interesting, please stop trying to turn this discussion towards the same tedious topics that no one has now had anything new to say about for years.

Hi Peter,

Woops! My comment was a direct response to your own comments — it definitely isn’t my intention to start a discussion over QG per se, and I’m not favoring any particular direction in that regard.

I was just responding to your statement that you believe QFT is fundamental, my response being that anything satisfying some basic properties looks like a local QFT at the scales we can reach experimentally so far, and thus one can just as legitimately argue that the jury is out on whether QFT is fundamental. I was also conceding that AdS/CFT might favor your point of view.

My other point was that we’d need to wait for experiment to weigh in, but that the question about the fundamentality of QFT referred to energies far beyond the LHC, and so might not be settled empirically for a while.

These were intended to be directly related to your statements. I too am not interested in going into a deep discussion of QG here! The only reason I brought up QG at all was that when taking the geometric picture of gravity seriously, it seems to challenge the QFT-as-fundamental idea — but, again, there are possible ways around that issue.

By the way, thanks for the link to your FAQ. Which of the questions on your FAQ did you mean to point me to?

Dear Thomas, many thanks for your interesting remarks. As I understand it, it is argued that gravity and the presumed underlying quantum gravity have different diffeomorphism groups. It is a nice coincidence that recently (1206.0832; sorry for self-referencing) I argued that diffeomorphism invariance is spontaneously broken due to the emergence of classical spacetime, with the gravitons as Goldstone particles. This entails that observables need not be diff-invariant. More specifically, there exist two classes of observables, the Dirac ones and the broken ones.

I hope that your remarks are helpful in this context.

Reader297,

It’s a pleasure to read your comments, both their content and phrasing. Your discussions are very clear and well-reasoned, and I like the way you include ample context/background with your responses to explain why others may take positions different from Peter’s. It’s certainly a quality of communication to aspire to! I assume your explanations are as complete as they are because you want to make your arguments clear to readers whose background is more limited than Peter’s.

Peter,

I think this discussion is worthwhile to many of us in clarifying your apparent position that QFT is fundamental. I’m certainly finding it interesting at least. I hope you won’t shut it down because Reader297 brought up QG… My impression was that he wasn’t promoting an approach to QG — he was being responsive to your disagreement with some of Weinberg’s viewpoints and the question of how to assess whether QFT is indeed fundamental. (In fact, I was surprised you interpreted his response as an attempt to steer the discussion toward QG or other non-topical issues.)

Anyway, I hope this discussion can productively continue!

Hi Marty,

That’s very kind of you to say. Thanks so much.

Peter,

I also wanted to add that I, too, have found your recent blog posting and subsequent discussions very enlightening.

I’m always happy to see blog postings here in which you articulate your own proposals and explain your reasoning for them, as well as open them up for conversation. Thanks!

Marty/Reader297,

I’m afraid I have all too much experience with what happens here (and on many other blogs) once an argument over QG/AdS/CFT/string theory starts, so, trust me, if you want this comment section to be something worth reading (which I think it has been, thanks to Reader297 and others), it’s best to keep that sort of argument from getting started.

My sloganeering is intentionally challenging to what I think is some dubious received wisdom, and Reader297 has done a good job of laying out the standard point of view that I’d like to challenge. That’s all well and good, but getting into the usual arguments over string theory/QG is something different and not going to lead anywhere interesting.

For the specific FAQ entry

http://www.math.columbia.edu/~woit/wordpress/?wp_super_faq=what-do-you-think-about-question-x-in-quantum-gravity

I thought the comment about QG referred to me, because when I checked earlier today my last post seemed to be gone. However, before I had time to get home and check the references for my scathing reply, the post was there again. So either I did not look hard enough, or the post reappeared. I plead guilty to having rehashed old arguments about QG in the past, but this time I think my comment was quite on topic.

Anyway, since I did spend quite a bit of time on this, let me say something else that Peter will also dislike: representation theory cannot be quantum field theory except in low dimensions. Reps of finite-dimensional Lie groups give QM, and reps of infinite-dimensional Lie groups living on the circle (affine Kac-Moody and Virasoro algebras) give QFT in 2D, or really in one complex dimension. However, the corresponding representations in higher dimensions are not QFTs, although they are quantum theories.

For definiteness, consider Yang-Mills theory in 3+1D. As is well known, this theory has gauge anomalies proportional to the third Casimir; in particular, the anomaly vanishes for SU(2), whose third Casimir is zero, but not for SU(N) with N ≥ 3. However, as is clearly stated in [Pressley-Segal, Loop Groups, 1988, section 4.10], the corresponding current group admits two inequivalent extensions: the MF (Mickelsson-Faddeev) extension, which is also proportional to the third Casimir, and the central extension, which is proportional to the second Casimir. The second Casimir is non-zero for all gauge groups, including SU(N) for all N, and hence cannot be related to the gauge anomalies arising in QFT.
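The asymmetry between the two Casimirs is easy to see concretely: the cubic Casimir is controlled by the totally symmetric d-symbol, which vanishes identically for su(2) but not for su(3). A quick numerical sketch (my own illustration, not from Pressley-Segal; it uses one common normalization of the generators, Tr(T^a T^b) = δ^{ab}/2):

```python
import numpy as np

# d-symbol in the normalization d^{abc} = 2 Tr({T^a, T^b} T^c),
# with generators satisfying Tr(T^a T^b) = delta^{ab}/2.

# su(2) generators: Pauli matrices divided by 2
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
su2 = [s / 2 for s in (s1, s2, s3)]

# su(3) generators: Gell-Mann matrices divided by 2
l = [np.zeros((3, 3), dtype=complex) for _ in range(8)]
l[0][0, 1] = l[0][1, 0] = 1
l[1][0, 1], l[1][1, 0] = -1j, 1j
l[2][0, 0], l[2][1, 1] = 1, -1
l[3][0, 2] = l[3][2, 0] = 1
l[4][0, 2], l[4][2, 0] = -1j, 1j
l[5][1, 2] = l[5][2, 1] = 1
l[6][1, 2], l[6][2, 1] = -1j, 1j
l[7][0, 0] = l[7][1, 1] = 1 / np.sqrt(3)
l[7][2, 2] = -2 / np.sqrt(3)
su3 = [m / 2 for m in l]

def max_d(gens):
    """Largest |d^{abc}| over all triples of generators."""
    return max(abs(2 * np.trace((A @ B + B @ A) @ C))
               for A in gens for B in gens for C in gens)

print(max_d(su2))  # ~0: su(2) has no d-symbol, hence no cubic Casimir
print(max_d(su3))  # ~0.577 (= 1/sqrt(3)): su(3) does
```

The second Casimir, by contrast, comes from Tr(T^a T^b) = δ^{ab}/2, which is non-zero for both algebras.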

Let me let this fact sink in, because very few physicists seem to appreciate it, including those who have studied the Pressley-Segal book for decades.

Not all extensions of the algebra of gauge transformations in 3+1D correspond to gauge anomalies in QFT. The others arise in something else, definitely not in QFT. Worse, the MF algebra which does arise in QFT does not have any interesting quantum representations [Pickrell 1989]. Mickelsson tried to generalize the Pressley-Segal methods to higher dimensions, but after the Pickrell show-stopper he switched gears and looked for some kind of representations acting on a family of Hilbert spaces parametrized by a classical gauge field. This may be fine and dandy, but it is not really a true quantum theory. In a fully quantum theory, the gauge fields must be quantized as well, but then Pickrell’s no-go theorem returns with a vengeance. So the extension that does arise in QFT does not have any good representations. This is of course in agreement with the fact that gauge anomalies cancel in the standard model.

In contrast, the central extension does not arise in QFT, but it does have representations, some or maybe all of them described in Pressley-Segal section 9.1. One example is the rep induced from an affine algebra living on some circle embedded in the higher-dimensional ambient space. Pressley-Segal possibly show (I think I have seen such a claim but I cannot find it any more) that all irreps are obtained in this way. Since this result is somewhat boring from some people’s point of view, it may be what Greg Moore refers to when he talks about discouraging no-go theorems [here, section 4.7]. Another reason why this extension cannot arise in QFT is that there is no privileged one-dimensional curve in QFT.

To summarize, there are three cases:

1) No extension. Then there are no non-trivial reps at all, by the same argument that shows the centerless affine algebra lacks reps.

2) MF extension. Arises as a gauge anomaly in QFT, but has no good representations. Besides, the gauge anomalies cancel in the standard model.

3) Central extension. Has unitary reps, but does not arise in QFT. Both because it has the wrong structure, and because the reps depend on a privileged 1D curve.

Since the extensions that arise in QFT don’t have representations, and the extensions that have representations don’t arise in QFT, I propose a better slogan:

Representation theory is quantum theory, but not quantum field theory except in low dimensions.

Hi Peter,

Thank you for clarifying that you weren’t specifically objecting to Reader297’s mention of quantum gravity — you foresaw that bringing up QG would probably cause others to hijack the discussion with pointless, repetitive arguing. I agree.

Although it isn’t your primary topic, you indicated in the comments that you believe the Standard Model may be fundamental. It would be easier to understand this position if I knew specifically what it means (in your sense) to say that the Standard Model QFT is fundamental. Let me propose three alternatives; yours may be different.

1. The Standard Model is fundamental in the sense that its Lagrangian is inevitable. That is, whatever the ultraviolet completion of the SM might be, it will uniquely imply the SM as its low energy limit.

(Note: By an “ultraviolet completion” I include the possibility of explanatory microscopic models for which a Lagrangian can be specified.)

2. The SM is fundamental in the sense that it is the best description we can ever hope to empirically verify. Attempts at an ultraviolet completion may succeed in reproducing the SM at observable energies, but they will forever remain little more than “stories” we tell to reassure ourselves that we really can and do understand Nature “all the way down.”

3. The SM is fundamental in that it already describes physics at all energy scales, or at least up to the Planck scale. This view doesn’t preclude future extensions that explain the values of the various Yukawa couplings and gauge constants, provided those discoveries don’t modify the SM Lagrangian in any basic way.

I won’t argue with your definition since it stems from your own philosophical views.

My other question is about your statement,

Reader297 also asked about this. In what way do you disagree with the conventional view, i.e., that the LHC and future practical colliders can only probe “low” energies compared to the regime where the SM is expected to break down, so that collider data cannot help us determine whether the SM remains valid in the ultraviolet?

Marty/Reader297,

This is getting rather far afield from the question of the relation of QM and representation theory, but a few comments:

I just don’t think that describing the highest energy you can study as “low energy” and thus irrelevant to fundamental questions is sensible, nor is basing arguments on highly speculative assumptions about what is taking place at 10^12 or whatever times the highest energies you know anything about. And on this I don’t think my point of view is unconventional. Sure, at low energies, lots of other things may look like QFT, because they have to for consistency reasons, but if you don’t actually have a non-QFT theory at high energies that has evidence for it or explains anything in a compelling way, the conjecture that it’s QFT all the way down is a viable one.

I don’t think the SM is a fully satisfactory model, since it doesn’t explain a list of things one would expect a fundamental theory to explain. But I have no idea what does explain those things and what a better theory than the SM is. My argument is just that given how well the SM works, it’s a good idea to take very seriously the mathematical framework it uses and try and understand where that might come from and whether there is some aspect of it we are missing, instead of dismissing it as “just a low energy approximation”, and looking for something completely different. Especially since conjectures about something completely different have gone nowhere.

Hi Peter,

Again, fair remarks.

All I would say at this point is that we now have enough physical examples, e.g., from condensed matter, of systems that look like field theories at long distances but aren’t microscopically field theories, that any speculation about fundamental physics being QFT “all the way down,” while perhaps viable, is not singled out by experiment. There’s no way to say that it’s even wrong.

And, again, it doesn’t help the situation that, as you note, “at low energies, lots of other things may look like QFT, because they have to for consistency reasons…”

What’s unfortunate is the dearth of hard experimental data high-energy physics has been suffering from for the past several decades. We’re flying blind. And it means that, at this moment in time, hypotheses about the fundamentality (or not) of QFT are mostly a matter of personal preference. Not that there’s anything wrong with that, but it does mean that people are going to argue a lot without changing each other’s minds.

Gentlemen,

you seem to agree that “at low energies, lots of other things look like QFT.”

Can you give some examples or references?

Peter: what’s your feeling about taking a signal processing approach to QM, instead of a classical mechanics approach? Hilbert spaces emerge naturally in signal processing through Fourier transforms, and, through the elementary (and not very rigorous) mathematics of Leon Cohen’s Foundations of Physics 18, 983 (1988), complex structure and incompatible operators are natural too when we take Fourier transforms of probability distributions. Leon Cohen is at CUNY, so perhaps you know him?

Modern high energy physics seems to be more about feature detection in statistics of complex sets of signals than about particles.

Peter (Morgan),

Working in an instrumentalist framework, Cohen seems to have been motivated by a theorem of Khinchin’s that a complex-valued function M(u) is the characteristic function for some probability distribution P(x) if and only if M(u) can be expressed as the integral over x of a product of the form g*(x)g(x+u) for a complex-valued function g(x) for which |g(x)|^2 integrates to 1. Cohen notes that this construction looks suspiciously like we have a wave function g(x).
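Khinchin's construction is easy to check in a concrete case. A minimal numerical sketch (my own illustration, not code from Cohen's paper): take g to be a normalized real Gaussian, for which the autocorrelation integral can be done in closed form, giving M(u) = exp(-u^2/4), the characteristic function of a Gaussian distribution with variance 1/2.

```python
import numpy as np

# Khinchin: M(u) = integral of g*(x) g(x+u) dx is a characteristic
# function whenever |g(x)|^2 integrates to 1.  Illustrate with the
# normalized Gaussian g(x) = pi^(-1/4) exp(-x^2/2), for which the
# integral evaluates analytically to M(u) = exp(-u^2/4).
x = np.linspace(-12.0, 12.0, 4001)
dx = x[1] - x[0]

def g(x):
    return np.pi ** -0.25 * np.exp(-x**2 / 2)

def M(u):
    """Numerical autocorrelation: integral of g*(x) g(x+u) dx."""
    return np.sum(np.conj(g(x)) * g(x + u)) * dx

print(abs(M(0.0)))                  # ≈ 1: |g|^2 integrates to 1
print(abs(M(1.0) - np.exp(-0.25)))  # ≈ 0: matches exp(-u^2/4)
```

With a unitary Fourier convention, the P(x) whose characteristic function this is works out to |G(x)|^2 for G the Fourier transform of g, which is exactly where the wave-function-like structure appears.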

But there isn’t anything surprising here. We could have skipped M(u) altogether and just noticed that we can express P(x) as g*(x)g(x) for any complex function g(x) for which |g(x)|^2 integrates to 1. Taking the extra step of going through M(u) doesn’t actually buy us anything. One can derive all of Cohen’s results just from expressing P(x) in terms of g(x).

Cohen notices that characteristic functions can be used to define joint probability distributions that are positive but not bilinear in wave functions. But he doesn’t address how to pick a canonical such distribution, nor how to write down the time evolution for them, nor how to handle matching his joint distributions onto the probability distributions of a larger system in the presence of nontrivial entanglement.

Reader297, certainly Cohen’s paper is no more than indicative, and certainly there could be other approaches. That’s a large part of why the paper I mention is in Foundations of Physics. Cohen’s academic output is more usually found in IEEE and other signal processing journals.

My only claim would be that a signal processing approach is more abstract (even while being closer to experimental data) and carries less classical mechanics baggage, though I’m not sure whether that’s a good or bad thing (particularly for a new student of QM). To some extent, it’s taking Heisenberg without the Schrödinger.