Chad Orzel has a very sensible piece at Forbes, headlined What Math Do You Need For Physics? It Depends, which addresses the question of what math a physicist like him (experimental AMO physics) really needs. I’m glad to see that he emphasizes the same basic courses my department offers for non-majors:

- Multivariable calculus
- Differential equations
- Linear algebra

together with statistics (which here at Columbia is handled by a separate department). He disses complex analysis, for reasons that I can understand. That’s a beautiful subject, and the results you can get out of analytic functions and contour integration are often unexpected and seemingly magic, but they’re not of as general use as the other subjects.

One subject he doesn’t mention that I would argue for is Fourier Analysis, which is the class I’ll be teaching next semester. That’s an incredibly useful as well as profound subject that every physicist should know, though it is true that some of its basics often get taught in other courses (for example, in ODE courses as a method for solving differential equations).

Orzel starts off with an amusing discussion of a physics version of “Humiliation”, admitting that he’s never used or really worked through a proof of Noether’s theorem, widely considered “the most profound idea in science”. I’ve argued here for the Hamiltonian over Lagrangian method, in which case a different set of ideas about symmetry is fundamental, with Noether’s theorem playing no role. In the Hamiltonian case symmetries are generated by functions on phase space, and finding the function that generates any symmetry is a matter of Poisson brackets (as an experimentalist, maybe Orzel has never calculated a Poisson bracket either though…).

One says that a function F on phase space generates an infinitesimal transformation if that transformation changes any function G by {G,F} (the Poisson bracket of the functions G and F). A basic example is the Hamiltonian function H, with {G,H} the infinitesimal change of G under time translation, or in other words, Hamilton’s equation

dG/dt={G,H}

When {H,F}=0, we say that the infinitesimal transformation generated by F is a symmetry, since H is left unchanged by such transformations. Using the antisymmetry of the Poisson bracket, this can also be read as {F,H}=0, with Hamilton’s equation then the conservation law that F doesn’t change with time.
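For concreteness, this kind of bracket calculation is easy to carry out symbolically. Here is a minimal sketch (the names poisson, H, Lz are my own illustrative choices, not standard library functions): a rotationally symmetric Hamiltonian in the plane, for which {L_z, H} = 0, so angular momentum generates a symmetry and is conserved.

```python
import sympy as sp

x, y, px, py = sp.symbols('x y p_x p_y')
coords = [(x, px), (y, py)]

def poisson(F, G):
    """Canonical Poisson bracket {F, G} on the (x, y, p_x, p_y) phase space."""
    return sum(sp.diff(F, q) * sp.diff(G, p) - sp.diff(F, p) * sp.diff(G, q)
               for q, p in coords)

# Rotationally symmetric Hamiltonian: kinetic term plus a central potential V(r^2)
V = sp.Function('V')
H = (px**2 + py**2) / 2 + V(x**2 + y**2)

# Angular momentum generates rotations in the plane
Lz = x * py - y * px

print(sp.simplify(poisson(Lz, H)))  # 0: rotations are a symmetry, L_z is conserved
print(sp.simplify(poisson(x, H)))   # p_x: this is just Hamilton's equation dx/dt = {x, H}
```

The second print is Hamilton’s equation in action: {x, H} gives the time derivative of x.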

All this seems to me much more straightforward than the Lagrangian Noether’s theorem approach to symmetry transformations and conservation laws. My own analog of Orzel’s admission would be admitting (which I won’t do) how long it took me in life to understand this fundamental point (I blame my teachers).

For lots and lots more about this, see chapters 14 and 15 here.

To me, complex analysis and Fourier analysis are inseparable. For example, theorems like the Paley-Wiener theorem give direct, explicit, and important connections between complex analytic concepts like analyticity and Fourier analytic concepts like smoothness or rapid decay of Fourier coefficients. Perhaps the problem is that the two subjects are not taught in a way that synthesizes the most useful features for a physics audience.

Interesting. Back when I was a physics major, I did more math than most (ended up a mathematician, of course). What about group theory? Lie groups. Had a course in that as an undergrad, which I guess is pretty unusual. Did it instead of Galois theory. And you need some complex. The only mathematical physics course I took was all complex analysis, mostly funky contour integrals. I agree about Hamiltonians, and Fourier analysis. Of course, I guess it really depends on what sort of physics you do. Differential geometry is helpful, and nowadays graph theory. I guess you need enough time to take some physics courses….

I managed to misread Humiliation for Hamiltonian! As Jeff says above, I’d think you pretty much can’t do quantum mechanics and quantum field theory without group theory, Lie groups, representations thereof, and so on. But you can go a pretty long way without differential geometry.

In practice it is hard to directly measure momenta (we don’t “see” phase space easily). For this reason the Lagrangian description is fundamental. Sometimes one has to think like an experimentalist.

I’m an engineer, not a physicist, but wouldn’t you need tensor calculus for general relativity? Or is GR a specialized enough field that you would only take that math when you needed it?

Looking back at my college years and knowing what I know now, I would have traded my second semester of PDEs for a half advanced linear algebra, half advanced Fourier analysis class; I use those every day but haven’t dug out those fancy PDEs in decades.

I am probably pushing at an open door here, but I would vote for the theory of continuous groups as well. In 1980, as I was finishing my undergrad physics course, there were a lot of talks and TV programs on high-energy physics aimed at the general public (remember those days?). SU(3) would come up a lot. This was fascinating to me as I had no idea what it was. When, as a graduate student, I actually found out, I was almost disappointed. Continuous groups should, IMHO, be taught alongside quantum mechanics at undergrad level.

GR is very much a specialized field. I don’t think many physicists, apart from those working directly on GR or related subjects, know the mathematics needed for that.

I agree that parts of group theory are needed for QM, but still I wouldn’t say it’s something necessary for all physicists.

@sabine: I’ve also misread Humiliation for Hamiltonian, professional deformation I suppose.

@peter: You’re probably familiar with DeWitt’s book (well, a series of books actually) on QFT. In there he advocates various approaches which are not standard in the HEP community. In particular there’s a lot of mention of the Peierls bracket, and things like that. Now, as much as I wanted to read that book cover to cover, I still haven’t found time for it, so I was wondering if the approach that you’re describing here in the post has any relation to DeWitt’s approach?

Cheers!

Off-topic:

Any thoughts on (no, not Trump, but):

https://www.quantamagazine.org/20161115-strange-numbers-found-in-particle-collisions/

Peter,

I’ve worked in engineering as well as physics: Any decent engineer needs to be very comfortable intuitively with Fourier analysis. To be sure, they do not have to remember the difference between the Dirichlet kernel and the Fejer kernel, but they need a strong physical sense for what is happening.

And, I once solved an important engineering problem using the Schwarz-Christoffel transformation. To be sure, most engineers do not remember enough complex analysis to handle that (perhaps that is why the company hired a physicist).

Dave Miller in Sacramento

cthulhu asked:

>I’m an engineer, not a physicist, but wouldn’t you need tensor calculus for general relativity? Or is GR a specialized enough field that you would only take that math when you needed it?

I recently worked out the Schwarzschild solution without using tensors: I ran with the approach Feynman lays out toward the end of volume II.

So, it’s possible to do some basic things in GR without tensors, though I have to admit it is quite a mess: I consider my little exercise to be justification as to why one really needs to learn the tensor apparatus.

Tensors are also used in various fields of engineering (stress, strain, and all that), although not in the same way as in GR. Anyway, I personally do not think someone should call himself a physicist if he does not understand the math and physical ideas underlying GR.

Dave

When I did GR as an undergrad, I taught myself tensors along the way. I had some differential geometry before that, but not a lot. But I was lucky enough that I was doing the GR as an independent study with a great professor, so I had a lot of help.

I kind of wish I had this type of thing when I was an undergrad physics major at CU back in the 80’s. I did take the Fourier Analysis course, along with a whole course on Calculus of Variations (not sure if it’s even still taught).

I ended up swallowing Morse&Feshbach in graduate school for fun.

I’m acquainted with a former GR post-doc who made the jump to medical imaging and does some freelance coding for VR gaming. Given the role tensor maths play in the diverse subjects, and how well-versed he also is with QM and the basics of QFT, I’m guessing he’d be appalled to hear most physicists don’t need tensors.

Lagrangian dynamics is a very useful tool in mechanical engineering, making it possible to write the equations of motion of very complex rigid-body systems, including translations and 3D rotations, with generalized coordinates and complex boundary conditions (see the book “Lagrangian Dynamics” by D.A. Wells).

Fourier analysis (in the complex plane), in continuous and discrete time, along with Hilbert space techniques, is among the most useful tools of modern electrical engineering: communication and information theory, signal analysis, synthesis and processing (audio, cellular systems, radar, sonar), detection and parameter estimation of signals buried in noise; see, e.g., “Digital Signal Processing” by Oppenheim, “Digital Communications” by Proakis, and the classic four-volume series by Van Trees. The communication and Internet revolution of the 1990s would not have been possible without those tools, used daily by hundreds of thousands of engineers worldwide.

Finally, Green’s function methods together with Fourier analysis are used by RF antenna designers, solving Maxwell’s equations from the EM fields induced on the antenna surfaces as a means of calculating the far-field antenna gain pattern (which is actually a two-dimensional Fourier analysis with complex boundary conditions).

Hurrah for those 17th-18th century French mathematicians!

In condensed matter theory, we are importing more and more abstract/modern mathematics to deal with topological insulators, superconductors, and topologically ordered phases. In my view the standard undergraduate training that focuses mainly on pre-20th century mathematics is leading to an increasing gap between what is taught and what is used in research. I’m not sure what the solution is, though, because if you want to apply category theory notions to (e.g.) non-abelian anyons in fractional quantum Hall states, learning category theory from math texts or courses is from my perspective a big circumnavigation around what a physicist really needs: intuitive grasp of the rules and algorithms for applying them. (Of course there are many exceptions–I take it that Greg Moore learned category theory overnight.)

My own small contribution has been to elaborate on R. Cahn’s Lie algebra book, to try to write course modules that could be accessible to an undergrad. My argument (that I made in a grant application at least) was that Lie algebras are kind of the “gateway drug” to many more advanced notions, such as affine Lie algebras and 2D CFT, braiding and fusion in anyon systems, Bethe ansatz solvable systems and quantum groups. Thus if students who want to do theory can learn LA representation theory as undergrads, maybe it further opens the door to assimilating these more modern topics in graduate school. It’s a work in progress, but what I have is available here:

http://mf23.web.rice.edu/

My first year in college ended with a class in vector calculus, including Cartesian tensor calculus. Once you understood how to juggle those indices, you didn’t have to struggle to remember in detail how to deal with those divs, curls and nablas – it all popped out automatically. It was like knowing about the Euler formula, when people taking “simpler” classes had to memorize tons of trig formulas. A real time-saver.
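The index gymnastics described above can be made concrete: writing (curl F)_i = eps_{ijk} d_j F_k with the Levi-Civita symbol, identities like div(curl F) = 0 really do pop out of the antisymmetry. A sketch in sympy (the function names curl and div are my own, for illustration):

```python
import sympy as sp

x = sp.symbols('x0 x1 x2')
F = [sp.Function(f'F{k}')(*x) for k in range(3)]  # a generic vector field

def curl(V):
    """(curl V)_i = eps_{ijk} d_j V_k, written directly with indices."""
    return [sum(sp.LeviCivita(i, j, k) * sp.diff(V[k], x[j])
                for j in range(3) for k in range(3))
            for i in range(3)]

def div(V):
    """div V = d_i V_i."""
    return sum(sp.diff(V[i], x[i]) for i in range(3))

# eps_{ijk} d_i d_j F_k = 0 because mixed partials are symmetric in (i, j)
print(sp.simplify(div(curl(F))))  # 0, automatically
```

No trig-identity-style memorization needed: the antisymmetric symbol does all the work.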

From the viewpoint of my later non-career, however, I would say that group theory is the most important. In particular, tensor calculus and differential geometry are secretly the same as the classical representation theory of the diffeomorphism (or Poincaré, in flat space) group. This just says that tensor calculus deals with objects that have a meaning independent of the coordinate system.

Orzel’s list sounds much like my undergraduate maths curriculum. 3 semesters of calculus, 1 semester of linear algebra, 1 semester of differential equations, and one of those grab bag “mathematical methods” courses. All the rest of the math I learned was either taught piecemeal in various physics courses or picked up on my own as needed.

I sometimes wish I had taken a proper complex analysis course.

Dear all, I think the core observation of the field-theoretic Noether theorems (I and II) is the existence of conserved currents coming from symmetries. The second theorem even covers gauge theory (infinite-dimensional Lie groups). In this sense it is very fundamental.

Two thumbs and ten toes up for Fourier Analysis. I’d encourage you to include at least an overview of how Shor’s algorithm works. The implementation of a Fourier transform as a series of phase-rotation quantum gates is quite elegant. It’s a nice example of how being able to quickly measure periods of functions leads to some clever and unexpected results.
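To illustrate the period-measuring idea classically (and inefficiently; the quantum Fourier transform is what makes this step fast), here is a sketch that recovers the period of a^x mod N from the discrete Fourier spectrum of its values. The numbers a=7, N=15 are a standard toy case, not anything from the comment above:

```python
import numpy as np

a, N = 7, 15
n = 256  # number of sample points (a power of two, like a quantum register)
f = np.array([pow(a, x, N) for x in range(n)], dtype=float)

# The sequence a^x mod N is periodic with period r; its spectrum has
# peaks only at multiples of n/r, so a dominant peak reveals r.
spectrum = np.abs(np.fft.fft(f - f.mean()))
k = int(np.argmax(spectrum[1 : n // 2])) + 1  # dominant peak (here, at n/r)
r = round(n / k)

print(r, pow(a, r, N))  # the recovered period r satisfies a^r = 1 mod N
```

In Shor’s algorithm this period then feeds into the classical number theory that factors N; the point of the comment stands, since the quantum gates compute exactly this transform.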

Bobito,

I’m not convinced that thinking simultaneously about position and momentum is that hard, even for an experimentalist (and often, momentum is easier to measure than position, e.g. for light). What is deep and unintuitive is the Poisson bracket, one can argue more so than the Lagrangian (although my point is that once you understand the Poisson bracket, you have conserved quantities for free, while with the Lagrangian you have non-intuitive work to do to find them).

vskrin,

In my earlier discussion of this that I linked to, there’s a link to this article

https://arxiv.org/abs/1402.1282

which is the best explanation I know of for the relation of the Peierls bracket and DeWitt’s approach to more conventional Hamiltonian methods.

Yatima,

That’s an interesting topic, but I don’t know much about it, and have nothing useful to say here. The mathematical structures being discussed come up in looking at terms of a series expansion; they’re interesting mathematics, and potentially quite useful, but it’s not clear they’re part of the deep structure of the theory rather than an artifact of the approximation.

All,

I obviously am quite devoted to the idea that Lie groups and their representations are the way to understand quantum mechanics (and can recommend a good book…). A problem though is that any young physics student who wants to learn this math is going to find that math department courses covering it only are offered at an advanced level, and not really accessible.

As for GR and geometry, yes, they should be studied together. But, for most physicists GR is disconnected from what they do and they can happily live without knowing anything about it.

My impression was that when physicists talk colloquially about Noether’s Theorem, often they really are thinking about a Hamiltonian picture, not the Lagrangian one in which Noether’s Theorem was literally derived.

More specifically, a *quantum* Hamiltonian picture, really. Some quantum operator commutes with the Hamiltonian, therefore it’s obviously a conserved quantity, and the transformation you get by exponentiating it must preserve the Hamiltonian as well, so there’s your symmetry. It’s all very simple once you’re used to QM.
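The quantum version really is a few lines of linear algebra. A toy sketch (the matrices here are my own illustrative choices): take a Hamiltonian that commutes with S_z, exponentiate S_z, and check that the resulting unitary preserves H.

```python
import numpy as np

# Spin-like generator with a degenerate middle block
Sz = np.diag([1.0, 0.0, 0.0, -1.0])

# A Hamiltonian with nontrivial mixing inside the S_z = 0 subspace,
# chosen so that [H, Sz] = 0 (toy numbers)
H = np.array([[2.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0, 0.0],
              [0.0, 0.0, 0.0, 2.0]])

assert np.allclose(H @ Sz - Sz @ H, 0)  # Sz commutes with H: conserved

# Exponentiating the conserved operator gives the symmetry U = exp(i*theta*Sz)
theta = 0.7
U = np.diag(np.exp(1j * theta * np.diag(Sz)))  # exp of a diagonal matrix

print(np.allclose(U.conj().T @ H @ U, H))  # the rotation leaves H invariant
```

Commuting operator, conserved quantity, symmetry by exponentiation: all three statements are the same allclose check from different angles.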

Peter,

I’m slightly confused by you emphasizing the Hamiltonian over the Lagrangian approach, over and over. Purely at the math level of ODE’s, it is simply a theorem that N second-order ODE’s are equivalent to 2N first-order ODE’s (of course, given certain assumptions), which is all there is to say really regarding the relationship between Lagrangian and Hamiltonian approaches. Naturally, each of the two approaches has its strengths and weaknesses, and every physicist should be familiar with both, and use one formalism or the other as needed, the right tool for the job. But fundamentally, what is all the fuss about?

Regarding Noether theorems, their true power lies not in the discussion of the conserved charges, but in the discussion of gauge symmetries. And in the Hamiltonian approach, the presence of gauge symmetries requires one to understand the Dirac analysis of constrained Hamiltonian systems, first- and second-class constraints, Dirac brackets, Castellani’s construction of the symmetry generator, the proper boundary behavior of variables to construct conserved quantities, the whole shebang.

In other words, in the Lagrangian framework, the consequences of gauge symmetries are neatly packaged into two Noether theorems. On the other hand, in the Hamiltonian framework, one discovers that the evolution is not described by a single phase-space trajectory but an infinite family of trajectories (connected by a gauge transformation), and all hell breaks loose…

As an example, just try to explain the “problem of time” to a student, using the Hamiltonian framework. I just don’t know of any easy way to explain (without appealing to Noether thms) how come the Hamiltonian of a diff-invariant system is always zero, and how one is supposed to work out anything in the Hamiltonian framework if the actual Hamiltonian vanishes.

One always needs to understand some context — Emmy Noether was asked by Hilbert, Einstein and others to clear up the fuzziness around conservation laws in general relativity. That’s how the two theorems arose, and they explained away this fuzziness with two pure and beautiful pieces of math. Hamiltonian description of GR came much later, with the development of the ADM formalism, and it is confusing a lot of students to this very day. On the other hand, in QM the Hamiltonian approach was dominant from the very beginning, but there were no gauge symmetries involved. They came in as an afterthought (and gained mainstream importance only with the development of the Standard Model), but Hamiltonian formalism was never really well-suited to the study of gauge symmetries. The complexity of the Dirac formalism of constrained systems is a testament to that.

It seems to me that you are emphasizing the triviality of the first Noether theorem, while completely ignoring the second one, where all the power really lies. Let me suggest a challenge — try to derive the Bianchi identities in GR using the Hamiltonian (or ADM) formalism and moment maps, and then say it is better, clearer and easier than the Lagrangian approach. I just don’t believe that the Hamiltonian framework could ever possibly be superior there.

Best, 🙂

Marko

Matthew McIrvin,

Yes, I do think that when physicists invoke Noether’s theorem, often they’re misspeaking, having in mind the Hamiltonian rather than Lagrangian formalism. And I completely agree that, in the Hamiltonian formalism, the relation of conserved quantities and symmetry group actions is even simpler in quantum mechanics than in classical mechanics (as you say, because the group action is just given by exponentiating the operator corresponding to the conserved quantity).

Marko,

My comments were really motivated by Orzel’s pointing out that he doesn’t use or really understand Noether’s technique for finding conserved quantities, and I was arguing that for simple physical systems, how conserved quantities work is much clearer in the Hamiltonian formalism.

Yes, for something like GR the Lagrangian formalism and Noether’s second theorem come into their own (and that’s why she developed this). But the issue of gauge and diffeomorphism symmetry is a subtle and thorny one no matter how you deal with it (since it is so hard to cleanly separate out physical and gauge degrees of freedom). I’m not convinced the Lagrangian formalism doesn’t obscure parts of the problem that the Hamiltonian formalism at least brings to the fore.

The Lagrangian version of Noether’s theorem typically is preferred because there are important symmetries which are manifest at the level of the Lagrangian but not in the Hamiltonian. The most obvious examples are the symmetries corresponding to Lorentz transformations. Here the Poisson bracket/Hamiltonian formulation suffers from the fact that the transformations on the phase space variables depend on the precise form of the Hamiltonian, in contrast to the Lagrangian formulation where the transformations depend only on representation content and not on the precise form of the action. To convince yourself of this, try proving the Lorentz invariance of non-abelian gauge theory in the Hamiltonian formulation. It is quite an ordeal, whereas in the Lagrangian formulation it follows by inspection.

I think the importance of complex analysis is much less subjective than claims of its beauty (or perhaps what I write next is just an expression of that beauty): it is absolutely vital as a source of intuition and motivation for a vast amount of modern mathematics, given that the function theory of the 19th and 20th centuries led to the foundational papers in algebraic geometry (e.g., Riemann’s Abelian function paper), algebraic topology and number theory. Covering spaces and (co)homology (via Cauchy’s theorem) arise naturally in a first course of complex analysis. One cannot master algebraic geometry at any level without some understanding of Riemann surfaces.

While it is possible to repurpose a machine for a variety of uses, I feel it is vital to at least have a sense of its original context.

I’m a mathematician, not a physicist, so I am not really qualified to comment, but to my mind a bit of complex analysis is fundamental. (I had to review this material for physics students recently, so it’s on my mind.) Fourier series is of course necessary for physics, but you can’t legitimately teach Fourier series without Euler’s formula (e^i\theta=…), since it simplifies the formulas and proofs so much, and once you are doing that, how can you not make the connection between power series (Laurent series) restricted to the circle and Fourier series? I’m talking about an intuitive connection, proofs not needed, as the details of when what converges in what sense are subtle.

But for the basics you at least need the idea of radius of convergence of a complex power series, the complex expansion of exp, sin and cos. Euler then gives you the polar form of a complex number and the connection with rotation matrices. Once you have the idea of complex derivative the Cauchy-Riemann equations fall out of that. Now Cauchy’s theorem and formula et al are so closely related to vector calculus that it just reinforces that material which is so crucial for E and M. That may be all that is really needed. I agree that computing complicated contour integrals could be largely a waste of time. My favorite Complex Analysis text is Lang (surprisingly given some of his other books, this one is excellent). Marsden, Knopp I and II, and still Ahlfors are all excellent as well. For the professor to delve into more deeply for themselves, Katznelson’s book on Harmonic Analysis is remarkable.

If Fourier transforms are necessary, that’s tougher to do rigorously, as you ideally need some measure theory and functional analysis. But at least they should have the intuitive ideas that the transform of a sine wave is a point mass at that frequency, and that the transform of a Gaussian is a Gaussian, with a thin one getting spread out and vice versa.
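The Gaussian fact is easy to verify numerically even before any measure theory. A sketch (the grid parameters and the function name transform_width are arbitrary choices of mine): estimate the width of the transform of a Gaussian of width s and see that it scales like 1/s.

```python
import numpy as np

def transform_width(s, n=4096, L=100.0):
    """Estimate the standard deviation of |FFT| of a Gaussian of width s."""
    x = np.linspace(-L / 2, L / 2, n, endpoint=False)
    g = np.exp(-x**2 / (2 * s**2))
    G = np.abs(np.fft.fftshift(np.fft.fft(g)))
    # angular frequencies corresponding to the grid
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=L / n)) * 2 * np.pi
    p = G / G.sum()  # treat the spectrum as a probability distribution
    return np.sqrt(np.sum(p * freqs**2) - np.sum(p * freqs)**2)

# A thin Gaussian transforms to a wide one and vice versa (widths ~ 1/s):
print(transform_width(0.5), transform_width(2.0))
```

The continuum statement is that exp(-x^2/(2s^2)) transforms into a Gaussian of width 1/s in angular frequency, which the estimates above reproduce.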

My (winding down) career has been applied/experimental (coincidentally starting with AMO); however, I had earlier undergraduate/graduate theoretical pretensions. I was exposed in that phase to a number of areas that have been concretely helpful throughout my career, most notably complex/Fourier analysis (only somewhat also group theory). I see from above that many posters concur. I also see the mention of Marsden–which sits almost pristine on my shelf from 40 years ago. Perhaps it is my own taste, but the book I pull down the most on these issues is Carrier, Krook and Pearson. Mainly because, without over-simplification, it introduces and addresses techniques like Wiener-Hopf by demonstration and application. Lighthill, for all its skimpiness, is another useful text.

Highly recommended: D. Neuenschwander, “Emmy Noether’s Wonderful Theorem.” A marvelous little book.

Some guy,

My current favored way to think of this is that it is the equations of motion and the symplectic structure on the space of solutions (=phase space) that is fundamental. Lagrangians give a nice way of producing such things (as solutions of the variational problem), although often with a degenerate or singular symplectic structure (and thus descent into “constrained Hamiltonian dynamics”).

This point of view is sometimes called “covariant” phase space, with the names Crnkovic-Witten and Zuckerman attached to it; they wrote down the Yang-Mills case. It’s a point of view that ends up relating Noether’s theorem to the Hamiltonian point of view and Poisson brackets.

Typically one parametrizes solutions of the equations of motion by initial values at t=0, and when you do this you run into a problem with some of the Lorentz generators (the boosts). These aren’t really symmetries in the usual sense though (they don’t commute with the Hamiltonian).

I think historically the preference for Lagrangian methods arose in the development of perturbation series methods in relativistic QFT (Schwinger, Feynman), since understanding renormalization is much more difficult without explicitly Lorentz-invariant terms in such series. Gauge symmetry leads to its own problems with Lorentz invariance, though those are quite fundamental (the inherent conflict between a positive Hilbert space and covariant elimination of the gauge degrees of freedom).

About complex analysis:

santarakshita: no question you should understand Riemann surfaces and complex analysis to have any idea what is going on in algebraic geometry, but physicists can (and do) almost all happily go through their careers knowing nothing about algebraic geometry.

Complexispunny: To discuss Fourier analysis you clearly want to talk about complex valued functions of a real variable, but you can mostly avoid functions of a complex variable and the notion of a holomorphic function (although that may be the best way to understand some facts).

paddy: I second the recommendation of Carrier, Krook and Pearson, which is full of wonderful applications of the subject. That’s actually the book I first learned complex analysis from, in an applied math course. Later on I learned a less applied point of view by teaching the subject in a math department.

Peter: might not be a coincidence on Carrier, Krook et al. My handwriting on the inner leaf says ’74 and I am sure it was for a course in DEAP (?). Cannot recall the name of the professor.

PS: I’ve always felt that my preference for the Hamiltonian formalism (as more intuitive?) was due to my not proceeding further into relativistic QFT than I did. Perhaps someday I will revisit.

paddy,

That’s very funny, not a coincidence. I just dug up my old course notes, seems I took Applied Math 201 Fall 1976, my sophomore year. I have written on the notes as instructors Richard Lindzen and John Hutchinson, but don’t remember much about them. Lindzen went on to fame (notoriety?) in climate science, Hutchinson I guess is still there, see

http://web-static-aws.seas.harvard.edu/hutchinson/

Applied Math 201 was the complex variables class and used Carrier, Krook and Pearson as one of the texts. Presumably this had something to do with the authors having taught this course at Harvard in the 60s.

There was also a second semester of the class, Applied Math 202, about PDEs, which I took spring 1977. That used the book “Partial Differential Equations” by Carrier and Pearson, presumably also written based on that course.

OT a bit I know…but. The professor in ’74 in whatever the course was using CK&P would finish his lecture and, as students funneled out (and some forward to ask him questions), he would–as if rationing–begin unwrapping a pack of Players.

paddy,

I don’t remember this, but could easily have been Lindzen, who was there in 74. There’s this from a 2001 Newsweek interview

“Lindzen clearly relishes the role of naysayer. He’ll even expound on how weakly lung cancer is linked to cigarette smoking. He speaks in full, impeccably logical paragraphs, and he punctuates his measured cadences with thoughtful drags on a cigarette.”

Peter: I looked up Lindzen and you are correct. ‘Twas he. Good course and good teacher. I hope he (unlike I) has stopped smoking.

I had a double major in physics and math and continued to take math in my first year of PhD study when I intended to become a theorist (I was also accepted into math PhD programs). I left theory for experiment, but have continued to work with theorists. I have observed that most theorists and experimentalists could have used more computer science courses and, probably, fewer mathematics courses.

Group theory – right up there in my view as a “preferred” topic to be taught to physicists. Whilst I am here, I’ll say thanks for hosting such an interesting and engaging blog for so long.

Jonathan Miller: I agree that some more formal training in programming would be good for many or even most physicists. The reputation ‘self-taught scientist programmers’ have among software engineers is not entirely undeserved.

Peter, thank you for this illuminating and at times entertaining blog. I did a lot of math and physics years ago at the undergrad level and then dropped out for the outdoors and environmental activism. I would love to revisit some of this stuff. What would you recommend as a good textbook these days on calculus, on differential equations, and on linear algebra? Thank you again.

About complex analysis –

Is an area of math that comes up everywhere in other mathematics also likely to come up lots of places in physics? My inclination is to say “yes”, but quite possibly I’m wrong.

Peter, thanks for reminding us that you took Applied Math 201 in the fall of 1976, our sophomore year at Harvard. By coincidence, I attended the first week of AM201 but then switched to Math 213a, the Math version of the graduate course in complex analysis. Math 213a in the fall of 1976 was the first and only time that John Tate ever taught complex analysis, and he saw it as a grand opportunity to provide his own perspective (not using Lars Ahlfors’s classic text). At the time we did not know what a privilege it was.

Complex analysis, definitely. You can’t even have a coherent discussion of common Green’s functions and common Fourier transforms (in electrodynamics, scattering theory, solid state physics, or QFT propagators) without knowledge of contour integrals.
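As a standard example of the kind of computation meant here (my choice of example, not the commenter’s): the Green’s-function-type integral of e^{ikx}/(k^2 + m^2) over the real k-axis is done by closing the contour in the upper half-plane (for x > 0) and picking up the residue at k = im. A sympy check:

```python
import sympy as sp

k = sp.symbols('k')
m, x = sp.symbols('m x', positive=True)

# Integrand of the propagator-type integral  int dk e^{ikx}/(k^2 + m^2)
f = sp.exp(sp.I * k * x) / (k**2 + m**2)

# Only the pole at k = i*m lies inside the closed upper-half-plane contour
res = sp.residue(f, k, sp.I * m)
integral = sp.simplify(2 * sp.pi * sp.I * res)

print(integral)  # pi*exp(-m*x)/m: the familiar exponential (Yukawa-type) falloff
```

Without the residue theorem there is no comparably clean way to see where that e^{-mx} comes from.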

Same goes for group theory and QM. Even a basic discussion of QM conservation laws in my intro grad QM class stumbled when it came to light that the students had no background in groups. So I had to introduce basic Lie groups (as in translation and rotation) ab initio, at grad level. Quite shocking.

Trailmut,

Sorry, but in my little experience with books for these courses, none stands out as unusual and worth specially recommending. Here at Columbia and many other places, Stewart’s Calculus is the text we use. I’ve always thought that calculus and introductory physics should be taught together, haven’t ever seen a book that does that though.

Mike Weiss,

Thanks for that story. Probably it’s very good that I didn’t do what you did. I got a fair amount out of the Applied Math version, although struggled a bit with that since I wasn’t exactly over-prepared for it. I suspect I would have gotten little out of Tate’s version, which later in life though would have been great to experience.

All in all, physicists should be aware that complex analysis can be taught very differently, depending especially on whether the emphasis is on applications (which are marvelous and something physicists should know) or on theory (which leads to many deep areas of pure math, but is less necessary for physicists).

anon,

I agree completely, and the problem you mention is one motivation for the book I’ve just finished. The material about translations/rotations and what they have to do with momentum/angular momentum should though be part of the standard physics undergrad courses on mechanics and QM.

Mike,

I took 213a the preceding year taught by Ahlfors. That actually seemed like a privilege at the time. Of course, I didn’t understand much of it until I taught the stuff ten years later.

Peter,

I was a TA for Math 22, taught by Bamberg and Sternberg. That made a serious attempt to combine second-year calculus and linear algebra with physics. I remember them applying cohomology to resistor networks and expressing Maxwell’s equations in differential forms.

Trailmut and Peter,

I hope you won’t mind if I quote from the Zentralblatt MATH (now zbMATH) review of my own elementary calculus book, “Calculus: The Elements” (World Scientific, 2002): “A mathematician’s first reaction to the book may well be, Is this a text by a physicist, or by a mathematician. After delving further, one has the feeling that it is a work by a mathematician still in close touch with physics. … The author succeeds well in giving an excellent intuitive introduction while ultimately maintaining a healthy respect for rigor.” A fuller review by Roy Smith of the University of Georgia can be found at the World Scientific website.

One of the reasons why few undergraduates know about group theory is the relative lack of a good group theory book designed for physics majors. They could take a course on abstract algebra, but then they would have to spend time learning about rings and fields, which are not as connected to physics.

Even the currently available material seems to be too rigorous. It is well known that physicists apply Stokes’ theorem on non-compact spaces; physicists are reckless folks indeed.

When I was an undergrad, there was a book, “Group theory in quantum mechanics” or very similar title. I just checked, and it’s now Dover, so you can get it for less than $20. It’s listed as graduate level, but I used it as an undergrad (I was taking a mathematics modern algebra course at the same time). I remember it being a wonderful book.

On group theory: Hamermesh was the text I first learned from. It proved to be quite useful even in a mundane area like understanding the fluorescence of transition metal ions in crystals. Once again my flyleaf indicates ’74, and this time I remember the professor: Glashow.