Nature is Fundamentally Conformally Invariant

For a third slogan I’ve chosen:

Nature is fundamentally conformally invariant.

Note the weasel-word “fundamentally”. We know that nature is not conformally invariant, but the kind of thing I have in mind is pure QCD, where the underlying classical theory is conformally invariant, with quantization dynamically breaking conformal invariance in a specific way.
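
To spell out the kind of thing “in a specific way” refers to, here is the schematic one-loop story for pure SU(N) gauge theory (conventions vary from book to book). The classical action
$$S=-\frac{1}{4g^2}\int d^4x\,\mathrm{tr}\,F_{\mu\nu}F^{\mu\nu}$$
contains no dimensionful parameter in four dimensions, so the classical theory has no preferred scale. Quantization makes the coupling run,
$$\mu\frac{dg}{d\mu}=-\frac{b_0}{16\pi^2}\,g^3+\cdots,\qquad b_0=\frac{11}{3}N,$$
trading the dimensionless coupling for a dimensionful scale by dimensional transmutation:
$$\Lambda_{QCD}\sim\mu\,e^{-8\pi^2/(b_0\,g^2(\mu))}.$$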

The group of conformal symmetries, its representations, and what this has to do with physics are topics I haven’t written about in the notes I’ve been working on. This is because I suspect we still haven’t gotten to the bottom of these topics, and properly dealing with what is known would require a separate volume. Part of the story is the twistor geometry of four dimensions, whose study was pioneered by Roger Penrose and which has recently found important applications in the calculation of scattering amplitudes.

As a more advanced topic, this slogan would normally have been put off until later, but I wanted to point to a new article by Natalie Wolchover in Quanta magazine which deals with exactly this topic. It describes several different efforts by physicists to rethink the usual story about the hierarchy problem, taking a conformally invariant model as fundamental. For the latest example along these lines, see this arXiv preprint. The whole article is well worth reading, and it includes a quote from Michael Dine (whose work I’ve been critical of in the past) that I found heart-warming:

“We’re not in a position where we can afford to be particularly arrogant about our understanding of what the laws of nature must look like,” said Michael Dine, a professor of physics at the University of California, Santa Cruz, who has been following the new work on scale symmetry. “Things that I might have been skeptical about before, I’m willing to entertain.”

Perhaps particle theorists are beginning to realize that the landscape is just a dead-end, and what is needed is a re-examination of the conventional wisdom that led to it.

Posted in Favorite Old Posts, Quantum Mechanics | 33 Comments

Grand Unification of Mathematics and Physics

For a second slogan about quantum mechanics I’ve chosen:

Quantum mechanics is evidence of a grand unification of mathematics and physics.

I’m not sure whether this slogan is likely to annoy physicists or mathematicians more, but in any case Edward Frenkel deserves some of the blame for this, since he describes (see here) the Langlands program as a Grand Unified Theory of mathematics, one that is further unified with gauge field theories similar to the Standard Model.

This week I’m in Berkeley and have been attending some talks at an MSRI workshop on New Geometric Methods in Number Theory and Automorphic Forms. Number theory is normally thought of as a part of mathematics about as far away from physics as you can get, but I’m struck by the way the same mathematical structures appear in the representation theory point of view on quantum mechanics and in the modern point of view on number theory. For example, the lectures on Shimura varieties have taken as their fundamental example the so-called Siegel upper-half space, which is the space Sp(2n,R)/U(n). Exactly the same space occurs in the quantization of the harmonic oscillator (see chapters 21 and 22 of my notes), where it parametrizes possible ground states. Different aspects of the structure play central roles in the math and the physics. In the simplest physics examples one works at a fixed point in this space, with Bogoliubov transformations taking one to other points, something which becomes significant in condensed matter applications. In number theory, one is interested not just in this space, but in the action of certain arithmetic groups on it, with the quotient by the arithmetic group giving the object of fundamental interest in the theory.
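
For those who want formulas (in one common convention): the Siegel upper-half space is
$$\mathcal{H}_n=\{Z=X+iY:\ Z\ \text{an}\ n\times n\ \text{complex symmetric matrix},\ Y>0\}\cong Sp(2n,\mathbf{R})/U(n),$$
with the group acting by $Z\mapsto (AZ+B)(CZ+D)^{-1}$. On the physics side, each point $Z\in\mathcal{H}_n$ gives a candidate Gaussian ground state
$$\psi_Z(q)\propto e^{\frac{i}{2\hbar}q^TZq},$$
normalizable precisely because $\mathrm{Im}\,Z>0$, and moving from one point of $\mathcal{H}_n$ to another is exactly a Bogoliubov transformation.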

The workshop is the kick-off to a semester-long program on this topic. It will run simultaneously with another program with deep connections to physics, on the topic of Geometric Representation Theory. This second program will deal with a range of topics relating quantum field theory and representation theory, with the geometric Langlands program a major part of the story, one that provides connections to the number-theoretic Langlands program topics of this week’s workshop. I’ve got to be in New York teaching this semester, so I’m jealous of those who will get to participate in the two related MSRI programs here in Berkeley. Few physicists seem to be involved in the programs, but these are topics with deep relations to physics. I do think there is a grand unified theory of some kind going on here, although of course one needs to remember that grand unified theories in physics so far haven’t worked out very well. Maybe the problem is just that one hasn’t been ambitious enough, that one needs to unify not just the interactions of the standard model, but number theory as well…

Posted in Favorite Old Posts, Quantum Mechanics, Uncategorized | 7 Comments

Quantum Theory is Representation Theory

For a first slogan (see here for slogan zero) I’ve chosen:

Quantum theory is representation theory.

One aspect of what I’m referring to is explained in detail in chapter 14 of these notes. Whenever you have a classical phase space (symplectic manifold to mathematicians), functions on the phase space give an infinite-dimensional Lie algebra, with Poisson bracket the Lie bracket. Dirac’s basic insight about quantization (“Poisson bracket goes to commutators”) was just that a quantum theory is supposed to be a unitary representation of this Lie algebra.
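
Schematically, with $\hbar$ made explicit, Dirac’s rule is
$$\{f,g\}\longrightarrow \frac{1}{i\hbar}[\hat{f},\hat{g}],$$
so the coordinate functions $q_j,p_k$ with $\{q_j,p_k\}=\delta_{jk}$ should go to self-adjoint operators satisfying the Heisenberg commutation relations
$$[\hat{q}_j,\hat{p}_k]=i\hbar\,\delta_{jk}.$$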

For a general symplectic manifold, how to produce such a representation is a complicated story (see the theory of “geometric quantization”). For a finite-dimensional linear phase space, the story is given in detail in the notes: it turns out that there’s only one interesting irreducible representation (Stone-von Neumann theorem), it’s determined by how you quantize linear functions, and you can’t extend it to functions beyond quadratic ones (Groenewold-van Hove no-go theorem). This is the basic story of canonical quantization.
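
For one degree of freedom, the essentially unique representation is the familiar Schrodinger one on $L^2(\mathbf{R})$:
$$(\hat{q}\psi)(q)=q\,\psi(q),\qquad(\hat{p}\psi)(q)=-i\hbar\frac{d\psi}{dq},$$
and Stone-von Neumann says that any irreducible unitary representation of these relations (in exponentiated, Weyl form) is unitarily equivalent to this one.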

For the infinite-dimensional linear phase spaces of quantum field theory, Stone-von Neumann is no longer true, and the fact that knowing the operator commutation relations no longer determines the state space is one source of the much greater complexity of QFT.

Something that isn’t covered in the notes is how to go the other way: given a unitary representation, how do you get a symplectic manifold? This is part of the still somewhat mysterious “orbit method” story, which associates co-adjoint orbits to representations. The center of the universal enveloping algebra (the Casimir operators) acts as specific scalars on an irreducible representation. Going from the universal enveloping algebra to the polynomial algebra on the Lie algebra, fixing these scalars fixes the orbit.
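
The simplest example to keep in mind is $\mathfrak{su}(2)$, with the usual caveat that the identification comes with quantum shifts. The Casimir $C=\hat{X}_1^2+\hat{X}_2^2+\hat{X}_3^2$ acts as the scalar $j(j+1)$ on the spin-$j$ representation; replacing the operators $\hat{X}_i$ by coordinate functions $x_i$ on the dual of the Lie algebra, the condition
$$x_1^2+x_2^2+x_3^2=j(j+1)$$
cuts out a sphere, and spheres are exactly the co-adjoint orbits in this case (whether the right radius is $\sqrt{j(j+1)}$, $j$ or $j+\frac{1}{2}$ depends on how one handles the shift).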

Note that slogan one is somewhat in tension with slogan zero, since it claims that classical physics is basically about a Lie algebra (with quantum physics a representation of the Lie algebra). From the slogan zero point of view of classical physics as hard to understand emergent behavior from quantum physics, there seems no reason for the tight link between classical phase spaces and representations given by the orbit method.

For me, one aspect of the significance of this slogan is that it makes me suspicious of all attempts to derive quantum mechanics from some other supposedly more fundamental theory (see for instance here). In our modern understanding of mathematics, Lie groups and their representations are unifying, fundamental objects that occur throughout different parts of the subject. Absent some dramatic experimental evidence, the claim that quantum mechanics needs to be understood in terms of some very different objects and concepts seems to me plausible only if such concepts are as mathematically deep and powerful as Lie groups and their representations.

For more about this, wait for the next slogan, which I’ll try and write about next week, when I’ll be visiting the Bay area, partly on vacation, but partly to learn some more mathematics.

Posted in Favorite Old Posts, Quantum Mechanics | 43 Comments

Smashing Physics

I recently finally found a copy of Jon Butterworth’s Smashing Physics, which came out in the UK a few months ago, but still hasn’t made it to the US. As far as I know it’s the first book about the Higgs written by someone actually involved in the LHC experiments. While there are several books already out there that do a good job of explaining the Higgs story (see for instance here) this one is a great read, giving a lively picture of what it was like to be involved in one of the experiments that made the discovery.

The book is structured as a bit of a travelogue, traveling through space to CERN and various conferences, and through time as the ATLAS LHC experiment unfolded, with physics explanations interspersed. A reviewer at New Scientist didn’t like this, but I think the personal and idiosyncratic approach works well. We’re given here not the highly processed take of a professional science writer, but a picture of what this sort of professional life is actually like for one specific scientist, from what happens in collaboration meetings, to an overnight spree on the Reeperbahn.

The perspective is definitely British (a lot of drinking goes on, with a scornful observation of American couples at a French bistro “drinking water”), and includes a fair amount of material about recent science funding problems in Britain. Butterworth’s comments are often to the point, if sometimes impolitic. For instance, about the “God Particle” business, there’s a footnote:

Yes, I know Lederman claims he wanted to call it The Goddamn Particle and blames his publishers for the change. But my publishers wanted to call this book something really silly, and I managed to stop them.

For readers who know nothing about the physics involved, this book may not be the right place to start, with the well-known scientific story not getting a detailed treatment, and little in the way of graphics besides some Feynman diagrams. On the other hand, if you’ve read one of the other books about the Higgs, Butterworth takes you a lot deeper into the subject of LHC physics, including some extensive material on his work on boosted objects and jet substructure, which may lead to important results in future LHC analyses. If you like your science non-abstract and human, this is a great place to learn about the Higgs discovery story.

There’s a quite positive review in the Guardian by Graham Farmelo, which describes the book well. That review though contains (like another review and like his wonderful book on Dirac) some odd material about string theory, in this case a long paragraph defending the theory, and telling us that “he [Butterworth] and his fellow sceptics will be proved wrong in the long term.” Actually there’s very little about string theory in the book other than some sensible comments about being more interested in things experimentally testable. It seems that some science journalists, like Tom Siegfried, will always be unwilling to admit that they were sold goods that didn’t turn out to work as advertised, and uncomprehending that most physicists, like Butterworth, never were buyers.

I gather the book may appear here in the US early next year; I hope it gets some attention then.

Posted in Book Reviews | 5 Comments

2014 Fields Medals

I thought this wasn’t supposed to be announced until late this evening New York time, but the Fields Medal announcement is now online. The winners are:

  • Artur Avila
  • Manjul Bhargava
  • Martin Hairer
  • Maryam Mirzakhani

Mirzakhani is the first woman to win a Fields medal. Congratulations to all four.

I’m not at all knowledgeable about the work of this year’s medalists; for that, you can consult the press releases on the ICM page.

Update: Quanta magazine has profiles of the winners. Avila, Bhargava, Hairer, Mirzakhani.


Update: For ICM blogging, clearly the place to go is the blog of a Fields Medalist.

Update: According to Tim Gowers, the Fields Medal Committee was: Daubechies, Ambrosio, Eisenbud, Fukaya, Ghys, Dick Gross, Kirwan, Kollar, Kontsevich, Struwe, Zeitouni and Günter Ziegler.

Update: For two very different sorts of blog posts about the Fields Medal, see Terry Tao and Mathbabe.

Posted in Uncategorized | 53 Comments

What’s Hard to Understand is Classical Mechanics, Not Quantum Mechanics

For a zeroth slogan about quantum mechanics, I’ve chosen

What’s hard to understand is classical mechanics, not quantum mechanics.

The slogan is labeled zero because it’s preliminary to what I’ve been writing about here. It explains why I don’t intend to cover part of the standard story about quantum mechanics: it’s too hard, too poorly understood, and I’m not expert enough to do it justice.

While there’s a simple, beautiful and very deep mathematical structure for fundamental quantum mechanics, things get much more complicated when you try and use it to extract predictions for experiments involving macroscopic components. This is the subject of “measurement theory”, which gives probabilistic predictions about observables, with the basic statement the “Born rule”. This says that what one can observe are eigenvalues of certain operators, with the probability of observing a given eigenvalue proportional to the norm-squared of the component of the state vector along the corresponding eigenvector. How this behavior of a macroscopic experimental apparatus described in classical terms emerges from the fundamental QM formalism is what is hard to understand, not the fundamental formalism itself. This is what the slogan is trying to point to.
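
In symbols, for an observable $A$ with discrete, non-degenerate spectrum (the simplest case):
$$A\,v_i=\lambda_i v_i,\qquad \mathrm{Prob}(\lambda_i)=\frac{|\langle v_i,\psi\rangle|^2}{\langle\psi,\psi\rangle},$$
i.e. a measurement of $A$ in the state $\psi$ returns the eigenvalue $\lambda_i$ with probability given by the norm-squared of the component of $\psi$ along $v_i$.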

When I first started studying quantum mechanics, I spent a lot of time reading about the “philosophy” of QM and about interpretational issues (e.g., what happens to Schrodinger’s famous cat?). After many years of this I finally lost interest, because these discussions never seemed to go anywhere, getting lost in a haze of complex attempts to relate the QM formalism to natural language and our intuitions about everyday physics. To this day, this is an active field, but one that to a large extent has been left by the wayside as a whole new area of physics has emerged that grapples with the real issues in a more concrete way.

The problem though is that I’m just knowledgeable enough about this area of physics to know that I’ve far too little expertise to do it justice. Instead of attempting this, let me just provide a random list of things to read that give some idea of what I’m trying to refer to.

Other suggestions of where to learn more from those better informed than me are welcome.

I don’t think the point of view I take about this is at all unusual, maybe it’s even the mainstream view in physics. The state of a system is given by a vector in Hilbert space, evolving according to the Schrodinger equation. This remains true when you consider the system you are observing together with the experimental apparatus. But a typical macroscopic experimental apparatus is an absurdly complicated quantum system, making the analysis of what happens and how classical behavior emerges a very difficult problem. As our technology improves and we have better and better ways to create larger coherent quantum systems, I suspect that thinking about such systems will lead to better insight into the old “interpretational” issues.
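
In symbols, nothing non-standard is being claimed: the combined state $\Psi$ of system plus apparatus evolves by
$$i\hbar\frac{d}{dt}\Psi=(H_{\text{system}}+H_{\text{apparatus}}+H_{\text{interaction}})\Psi,$$
with unitary Schrodinger evolution everywhere; the difficulty is just that $H_{\text{apparatus}}$ involves something like $10^{23}$ coupled degrees of freedom.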

From what I can see of this though, the question of the fundamental mathematical formalism of QM decouples from these hard issues. I know others see things quite differently, but I personally just don’t see evidence that the problem of better understanding the fundamental formalism (how do you quantize the metric degrees of freedom? how do these unify with the degrees of freedom of the SM?) has anything to do with the difficult issues described above. So, for now I’m trying to understand the simple problem, and leave the hard one to others.

Update: There’s a relevant conference going on this week.

Update: I’ve been pointed to another article that addresses in detail the issues referred to here, the recent Physics Reports article Understanding quantum measurement from the solution of dynamical models, by Allahverdyan, Balian and Nieuwenhuizen.

Posted in Favorite Old Posts, Quantum Mechanics | 46 Comments

Fall QM Course

This year I’ll be teaching a new version of the same course on quantum mechanics aimed at mathematicians that I taught during the 2012-13 academic year (there’s a web-page here). During the last course I started writing up notes, and I have spent a lot of the last academic year working on these; the current version will always be here. At this point I have a few of the last chapters to finish writing, as well as a long list of improvements to be made in the earlier ones. I’ll be teaching the course based on these notes, and hope to improve them as I go along, partly based on seeing what topics students have trouble with, and what they would like to hear more about.

I’ve learned a lot while doing this, and it now seems like a good idea to write something on the blog discussing topics that don’t seem to me to be dealt with well in the standard textbook treatments. Johan de Jong’s Stacks Project has recently added a sloganerator (written by Pieter Belmans and Johan Commelin), which has inspired me to try to organize things as slogans. Slogan 0 to appear soon….

Posted in Quantum Mechanics | 8 Comments

Quantum Connection Could Revitalize Superstrings

Finally back from vacation, postings may appear somewhat more regularly…

Science journalist Tom Siegfried has been one of the most vociferous proponents of string theory for many, many years (see here), but even his faith seems like it might be failing as the decades roll on. His latest on the topic starts out:

Sometimes it’s nice to reflect nostalgically on the last couple of decades of the 20th century. You know, the era of Madonna and Duran Duran, Cheers and The X-Files, McGwire and Sosa, the Macarena, and superstring theory.

The article does try and mount an argument that string theory may not be moribund, with the hope for the future coming from a new paper by Bars and Rychkov entitled Is String Interaction the Origin of Quantum Mechanics?. The idea here seems to be that if you assume you somehow have a fully consistent string field theory, not based on quantum mechanics, then the occurrence in this theory of non-commutative phenomena would “explain” quantum mechanics. To me, this seems to deserve some sort of award for the most desperate attempt yet to justify string theory, but Siegfried is a fan, explaining:

For decades, explaining why nature observes the mysterious rules of quantum physics has perplexed physicists everywhere. Nobody could explain why those rules worked. The connection between string physics and the quantum math may now lead the way to an answer.

I’ll write more soon about those “mysterious rules of quantum physics”, but I just don’t see at all how string field theory (which supposedly is based on quantum mechanics…) makes anything about quantum mechanics less mysterious.

Siegfried of course is not just a fan of string theory, but also of the multiverse, so he ends with:

On top of all that, the string-quantum connection suggests an intriguing insight into the nature of reality. Quantum physics is notorious for implying the existence of multiple realities, as articulated in the “many worlds” interpretation of quantum mechanics. Superstring theory has also annoyed many physicists by forecasting the existence of a huge “landscape” of different vacuum states, essentially a multiverse comprising multiple universes with a wide range of physical properties (many not suitable for life, but at least one that is). If string interactions really are responsible for the rules of quantum physics, maybe there’s some connection between the multiple quantum realities and the superstring landscape. For fans of the late 20th century, it seems like an idea worth exploring.

One thing remarkable about this is that he has another piece that recently appeared, an interview with Katherine Freese, where he tries to convince her about the multiverse, but doesn’t get anywhere:

Theory predicts vastly more vacuum energy than the amount actually observed. Wouldn’t this huge disparity be explained if there are multiple universes, a multiverse, and each has a different density of vacuum energy? Then the reason we have a low amount in ours is because that’s the only way we could exist in it.

I don’t like that idea. A lot of people like it because of string theory. Originally people thought that string theory would give a unique solution to the vacuum-energy equations. But it turns out that in string theory there are maybe 10-to-the-500th different vacuum states. So the idea is that they’re all out there, but we have to live in one with a value of the cosmological constant close to the one we have. But I don’t like anthropic arguments. They rely on the fact that human life can only come to exist under certain conditions, so that of the many universes out there it’s not surprising we live in the one that supports our type of life. That’s not a good enough explanation for me. I feel there are physics problems that we have to answer, and we can answer them in this universe, in this piece of the universe we live in. I think it’s our job to try to do that, and it’s not good enough for me to give up on it and say, well, it has to have this value because otherwise we couldn’t exist. I think we can do better than that. I know, I’m old-fashioned.

Isn’t part of the question whether there is a multiverse or not? If you had really strong evidence that there is a multiverse, then the anthropic explanation becomes better motivated. Inflation, the rapid burst of expansion right after the Big Bang, supposedly can produce a multiverse by way of “eternal inflation.”

I do believe in inflation, so can inflation give you a multiverse or not? Because if it can, then I’m forced to consider this possibility. I recently wrote a paper with Will Kinney on this. We concluded that what we observe in the cosmic microwave background radiation is not giving rise to eternal inflation. So how do you know that ever happened?

Are the recent results on the cosmic microwave background from the BICEP2 experiment relevant to this issue?

If you take the BICEP data literally, which I’m not saying you should, you never have eternal inflation. So you don’t have to have eternal inflation, if you ask me. I was very happy about that.

Posted in This Week's Hype | 22 Comments

The NSA, NIST and the AMS

Among the many disturbing aspects of the behavior of the NSA revealed by the Snowden documents, the most controversial one directly relevant to mathematicians was the story of the NSA’s involvement in a flawed NIST cryptography standard (for more see here and here). The New York Times reported:

Classified N.S.A. memos appear to confirm that the fatal weakness, discovered by two Microsoft cryptographers in 2007, was engineered by the agency. The N.S.A. wrote the standard and aggressively pushed it on the international group, privately calling the effort “a challenge in finesse.”

The standard was based on the mathematics of elliptic curves, so this is a clearly identifiable case where mathematicians seem to have been involved in using their expertise to subvert the group tasked with producing high quality cryptography. A big question this raises is what NIST will do about it. In April they removed the dubious algorithm from their standards, and published the public comments (many of which were highly critical) on a draft statement about their development process.

At the same time a panel of experts was convened to examine what had gone wrong in this case, and this panel has (on a very short time-scale) just produced its report (associated news stories here, here and here). The rules of how such panels are set up evidently require that each panelist provide an individual report, rather than attempt to have a consensus version. The new NIST document gives these reports together with minutes of the meetings where the panelists were provided with information. It seems that the NSA provided no information at all as part of this process, and they remain unwilling to answer any questions about their actions.

Appendix E contains the individual reports. These include, from Edward Felten:

The bottom line is that NIST failed to exercise independent judgment but instead deferred extensively to NSA with regard to DUAL_EC. After DUAL_EC was proposed, two major red-flags emerged. Either one should have caused NIST to remove DUAL_EC from the standard, but in both cases NIST deferred to NSA requests to keep DUAL_EC… at the time NIST had nobody on staff with expertise in elliptic curves. NSA’s vastly superior expertise on elliptic curves led NIST to defer to NSA regarding DUAL_EC, while NIST people spent more of their limited time on other parts of the standard that were closer to their expertise.

From Bart Preneel:

There is no doubt that the inclusion of Dual EC DRBG in SP 800-90A was a serious mistake…
The explanations provided by NIST are plausible, but it seems that not all decisions in the standardization process of SP 800-90A are properly documented; moreover, we did not have access to the source documents. This means that it is impossible to decide whether this mistake involved, in addition to clever manipulation of the standards processes by NSA, also some form of pressure on the technical and/or management staff of NIST. It is also not clear whether there would be any traces of such pressure in documents. Without access to the documents, it is also difficult to decide whether or not NIST has deliberately weakened Dual EC DRBG…

However, it seems that NSA (with its dual role) seems to be prepared to weaken US government standards in order to facilitate its SIGINT role. This undermines the credibility of NIST and prevents NIST reaching its full potential in the area of cryptographic standards. In view of this, the interface between NSA and NIST and the role of the NSA should be made much more precise, requiring an update to the Memorandum of Understanding. At the very least, the terms “consult”, “coordination” and “work closely” should be clarified. Ideally, NIST should no longer be required to coordinate with NSA. There should be a public record of each input or comment by NSA on standards or guidelines under development by NIST.

From Ronald Rivest (the “R” in “RSA”):

Recent revelations and technical review support the hypothesis that, nonetheless, the NSA has been caught with “its hands in the cookie jar” with respect to the development of the Dual-EC-DRBG standard. It seems highly likely that this standard was designed by the NSA to explicitly leak users’ key information to the NSA (and to no one else). The Dual-EC-DRBG standard apparently (and I would suggest, almost certainly) contains a “back-door” enabling the NSA to have surreptitious access. The back-door is somewhat clever in that the standard is not designed to be “weak” (enabling other foreign adversaries to perhaps exploit the weakness as well) but “custom” (only the creator (NSA) of the magical P,Q parameters in the standard will have such access).

[Recommendation]
NIST (and the public) should know whether there are any other current NIST cryptographic standards that would not be acceptable as standards if everyone knew what the NSA knows about them. These standards should be identified and scheduled for early replacement. If NSA refuses to answer such an inquiry, then any standard developed with significant NSA input should be assumed to be “tainted,” unless it possesses a verifiable proof of security acceptable to the larger cryptographic community. Such tainted standards should be scheduled for early replacement.
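
To see concretely the kind of “custom” weakness Rivest is describing, here is a minimal toy sketch of the trapdoor structure. To keep it short it works in the multiplicative group of integers mod p rather than over the standard’s elliptic curves, and it skips the output truncation, but the algebra of the back-door is the same; all the numbers here are of course made up for illustration:

    # Toy analogue of the Dual-EC-DRBG trapdoor (illustration only: the real
    # standard works over NIST elliptic curves and truncates its output).
    p = 2**127 - 1       # a convenient large prime
    Q = 3                # public constant
    d = 0xC0FFEE         # the designer's secret: the discrete log of P base Q
    P = pow(Q, d, p)     # second public constant, chosen so that P = Q^d mod p

    def drbg_step(s):
        # One round: update the secret state using P, produce output using Q.
        s_new = pow(P, s, p)    # next internal state
        r = pow(Q, s_new, p)    # this is what gets published
        return s_new, r

    def backdoor_next_state(r):
        # If r = Q^s mod p for current state s, then r^d = (Q^d)^s = P^s,
        # which is precisely the generator's next internal state.
        return pow(r, d, p)

    s0 = 123456789                  # secret seed
    s1, r1 = drbg_step(s0)          # an attacker sees only r1
    s2, r2 = drbg_step(s1)
    assert backdoor_next_state(r1) == s2
    # Knowing s2, the trapdoor owner predicts all future output; for anyone
    # else, recovering the state means solving a discrete logarithm problem.

The point of the “custom, not weak” distinction is visible here: without d the generator looks fine, and no one who didn’t generate P and Q themselves can tell whether such a d exists.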

One way this goes beyond the now-withdrawn NIST standard is that the committee also looked at other current NIST standards now in wide use, at least one of which depends upon a specific choice of elliptic curves made by the NSA, with no explanation provided of how the choice was made. In particular, Rivest recommends changing the ECDSA standard in FIPS186 because of this problem.

For a detailed outline of the history of the Dual-EC-DRBG standard, see here. Note in particular that this states that in 2004 when the author asked where the Dual-EC-DRBG elliptic curves came from, the response he got was that “NSA had told [them] not to talk about it.”

Also this week, the AMS Notices contains a piece by Richard George, a mathematician who worked at the NSA for 41 years before recently retiring. Presumably this was vetted by the NSA, and is a reasonably accurate version of the case they want to make to the public. Personally I’d describe the whole thing as outrageous, for a long list of reasons, but here I’ll just focus on what it says about Dual-EC-DRBG, since it now seems likely that it is all we will ever get from the NSA about this. It says:

I have never heard of any proven weakness in a cryptographic algorithm that’s linked to NSA; just innuendo.

The reaction from a commenter here (publicly anonymous, but self-identified to me) was:

As a member of a standards committee involved [in] the removal of the mentioned algorithm from a standard, none of the members believe the “innuendo” theory, and all believe it was deliberately weakened.

Read carefully (and I think it was written very carefully…), note that George never directly denies that the NSA back-doored Dual-EC-DRBG, just claims there is no “proven weakness”. In other words, since how they chose the elliptic curves is a classified secret, no one can prove anything about how this was done. All the public has is the Snowden documents, which aren’t “proof”. This is highly reminiscent of the US government’s continuing success at keeping legal challenges to NSA actions out of the courts, even when what is at issue are actions that everyone agrees happened, on the grounds that plaintiffs can’t “prove” that they happened, since they are classified. Snowden’s release of documents may yet allow some of these cases to come to court, just as they were the one thing capable of getting the NIST to acknowledge the Dual-EC-DRBG problem.

I hope that there will be more response to the NSA issue from the Math community than there has been so far. In particular, Rivest’s call for the removal from NIST standards of material from the NSA which the NSA refuses to explain should be endorsed. The innuendo from George is that the NSA may be refusing to explain because they used superior technology to choose better, more secure elliptic curves. If this is the case I don’t see why an official statement to that effect, from the NSA director, under oath, cannot be provided.

On the many other issues the George article raises, I hope that the AMS Notices will see some appropriate responses in the future. Comments here should be restricted to the NIST/NSA story, with those from anyone knowledgeable about this story most encouraged.

Update: The NIST has made available on its website the materials provided to the panel looking into this.

One remarkable thing about the panel’s investigation is that the NSA evidently refused to participate, in particular refusing to make anyone available to answer questions at the panel’s main meeting on May 29 (see page 12 of the report). This appears to be in violation of the Memorandum of Understanding that governs the NIST/NSA relationship, which explicitly states that “The NSA shall … Be responsive to NIST requests for assistance including, but not limited to, all matters related to cryptographic algorithms and cryptographic techniques, information security, and cybersecurity.” All evidence I’ve seen is that the NSA sees itself as above any need to ever justify any of its actions. I can’t see any possible argument as to why they did not have an obligation to participate in the work of this committee.


Update: A new development in this story is a letter from Congressman Grayson to Director of National Intelligence Clapper asking exactly the right questions about what happened at the NIST. It will be interesting to see if a member of Congress can get anything out of the NSA beyond the usual stone-walling.

Posted in Uncategorized | 23 Comments

Mathematics Items

  • For an Oxford conference last week, Langlands contributed a one-hour video talk, filmed in his office. One hour was not enough, so hours two and three are also available, as well as a separate text, and some additional comments.
  • The latest AMS Notices has a long section of excellent articles about Friedrich Hirzebruch and his mathematical work.
  • Also in the AMS Notices is a long defense of the NSA, written by a mathematician who worked there for 41 years. About the main recent controversy here, the Snowden revelation of an NSA backdoor in a NIST standard, all the author has to say is:

    I have never heard of any proven weakness in a cryptographic algorithm that’s linked to NSA; just innuendo.

    This seems to me to exemplify pretty well the disturbing tactic of the US security establishment of claiming there is no problem while refusing to discuss anything problematic since it is classified.

  • Bhargava, Skinner and my colleague Wei Zhang have a new paper out proving that better than 66% of elliptic curves satisfy the BSD conjecture. It seems not implausible that they or others might in the not too distant future get to 100%. One should note though that showing 100% of elliptic curves satisfy BSD (here “100%” means density one when curves are ordered by height, which still allows infinitely many exceptions) wouldn’t be the same thing as showing all elliptic curves satisfy BSD, so wouldn’t be eligible for the $1 million Millennium prize.
  • With the ICM less than a month away, I find it outrageous that no one has yet leaked to me the names of the Fields Medal winners. All I’ve heard is speculation, and the only name I’d bet any money on is Bhargava.


Update: For something both ICM and Langlands related, Michael Harris on his web site has his ICM contribution Automorphic Galois representations and the cohomology of Shimura varieties. Many of the ICM 2014 proceedings contributions are already available on arXiv, via this search.

Posted in Langlands | 23 Comments