While quantum mechanics is considered “correct” and classical mechanics merely its approximation in the limit h\to 0, and while for most practically interesting systems physicists already know the laws at the quantum level, the art of inventing new quantum theories starting from a (sometimes more obvious) classical counterpart has been one of the main themes of mathematical physics ever since the early 20th century.

My social experience tells me that the great majority of both physicists and mathematicians feel uncomfortable, at some level, with the advanced study of quantization as an activity, and consider the issue either overrated, unnecessary, or too difficult, or choose to work on only one small side of the story. Another feeling I would like to share (the reasons behind it will likely appear in a later post) is my impression, from the scientific point of view, that one could predict a qualitative surge in the field in the next decade or two.

I assume that the reader knows a few very basic cases of quantization.

Most physicists consider quantization a more or less trivial issue, as they are misled by the culture of university courses focusing on canonical quantization of Poisson brackets. This is of course rather wrong, as canonical quantization depends on the choice of generators of the Poisson algebra; indeed, a no-go theorem (due to Groenewold and van Hove) says that we cannot simply promote all Poisson brackets to commutators, as the textbooks may suggest. We merely promote certain hand-chosen generators to noncommuting operators with prescribed commutation relations. If we make a canonical transformation in the sense of classical mechanics (symplectic geometry), some other set of generators may now look distinguished; the same recipe applied to those generators and their Poisson brackets may in general give a different quantization. In practice there is often indeed some distinguished set of generators, say with special symmetry properties, and it hints at a physically useful canonical quantization recipe. But such simplicity is a very special circumstance. More complicated systems, involving nontrivial phase manifolds, constraints, infinitely many degrees of freedom (say, QFT), geometric backgrounds involving various cocycles like connections on gerbes, open many-body systems and so on, lead to the need to develop sophisticated methods for studying quantization.
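To make the obstruction concrete, here is a minimal sketch, in standard notation on flat phase space, of what the textbook recipe actually does and of the classical polynomial identity on which the no-go argument turns:

```latex
% canonical quantization promotes only the linear generators:
\{q^i, p_j\} = \delta^i_j
\quad\longmapsto\quad
[\hat q^i, \hat p_j] = i\hbar\,\delta^i_j .
% No linear map Q with Q(1)=1, Q(q^i)=\hat q^i, Q(p_j)=\hat p_j and
%   Q(\{f,g\}) = \tfrac{1}{i\hbar}\,[Q(f),\,Q(g)]
% extends consistently to all polynomials. Already in one dimension
%   x^2 p^2 \;=\; \tfrac{1}{9}\{x^3,\,p^3\} \;=\; \tfrac{1}{3}\{x^2 p,\,x p^2\},
% yet the two corresponding commutator expressions differ
% by a nonzero multiple of \hbar^2 times the identity.
```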

Many ideas and theories appeared in the 20th century: various ordering prescriptions, basic notions like polarization, coherent states, geometric quantization, Berezin quantization, various types of “symbols” of operators… All of these live at the intersection of operator theory, geometry and noncommutative algebra. The golden era of these topics was perhaps the late 1960s and 1970s, when the methods of symplectic geometry systematically flooded into the study of quantization, integrable systems and representation theory.

On the other hand, the usage of Feynman path integrals was not that popular at the time, apart from deriving recipes for Feynman rules here and there. Since the 1980s theoretical physics has gradually turned its interest toward the path integral approach to quantization (with all its deficiencies of not being rigorously defined in reasonable generality).

One should next point out that the very procedure of renormalization is also a step in going from the classical picture to the true QFT, so it is itself a method of quantization. The nontriviality of renormalization, say for nonabelian gauge systems, has revealed how difficult the problem is and given hints about the many structures on which the quantization of field theories depends. For constrained systems in general, the introduction of ghost fields is often useful. The BRST and the more general BV quantization procedures are based on such methods (introducing new auxiliary fields for the purposes of quantization), which are of cohomological nature, close to the method of Koszul resolutions in algebra.
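For a first-class constrained system the cohomological mechanism can be sketched as follows (standard conventions, which I state for orientation only: ghosts c^a, ghost momenta \mathcal{P}_a, structure constants f):

```latex
% first-class constraints closing under the Poisson bracket:
\{G_a, G_b\} = f_{ab}{}^{c}\, G_c
% classical BRST charge:
\Omega = c^a G_a - \tfrac{1}{2} f_{ab}{}^{c}\, c^a c^b \mathcal{P}_c,
\qquad \{\Omega, \Omega\} = 0 .
% Physical observables are the ghost-number-zero cohomology of
% s = \{\Omega, -\}; the Koszul-Tate part of the differential
% resolves the restriction to the constraint surface.
```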

Before I try to discuss some more contemporary insights into the story of quantization, I should mention something that most mathematical physicists consider old-fashioned: the WKBJ method of semiclassical approximation. In this method one considers the short-wave asymptotics of the wave function. The textbook version of the method is in one dimension and goes up to first order in the Planck constant; the old Bohr–Sommerfeld type conditions are corollaries. Not very impressive or important from a fundamental point of view, though handy for some easy calculations. However, a more systematic version, due mainly to Maslov (from the late 1950s), revealed some important geometric ingredients for quantization in general (see the references quoted at Maslov index) and enabled the method to be extended to all orders and to many dimensions (to manifolds). In the 1970s the method resonated well, in the form of the stationary phase approximation and the study of (rapidly) oscillating integrals, within the study of partial differential operators (and generalizations like the pseudodifferential operators of Kohn and Nirenberg and the more general Fourier integral operators of Hörmander and Maslov) in harmonic analysis. The importance of studying the behaviour of the wave phase factor is reflected in the prevalence of Lagrangean submanifolds in all that work (see the classical monograph of Duistermaat, for example). Some, especially the Japanese school (Sato, Kashiwara, Jimbo, Miwa…), recast this study within the framework of D-modules, but the basic underlying ideas are of course the same. One of the aspects is also microlocal analysis. The Japanese school typically used a theory of generalized functions more general than the Laurent Schwartz type, namely the theory of hyperfunctions, which may be viewed as limiting boundary values of holomorphic functions and are usually formulated using sheaf theory.
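A quick numerical illustration of why even the textbook version is handy: for the harmonic oscillator the Bohr–Sommerfeld condition \oint p\,dq = 2\pi\hbar(n+1/2) already reproduces the exact spectrum. A sketch (the units m = \omega = \hbar = 1 are my assumption for the illustration):

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

hbar = 1.0

def V(x):
    # harmonic oscillator potential; m = omega = 1 (illustrative units)
    return 0.5 * x**2

def loop_action(E):
    # closed-orbit action: oint p dq = 2 * int sqrt(2(E - V)) between turning points
    a = np.sqrt(2.0 * E)  # turning points at +-a
    I, _ = quad(lambda x: np.sqrt(np.maximum(2.0 * (E - V(x)), 0.0)), -a, a)
    return 2.0 * I

# solve oint p dq = 2 pi hbar (n + 1/2) for E, for the first few levels
energies = [brentq(lambda E, n=n: loop_action(E) - 2.0*np.pi*hbar*(n + 0.5),
                   1e-9, 100.0)
            for n in range(4)]
print(energies)  # Bohr-Sommerfeld reproduces the exact E_n = n + 1/2 here
```

Of course this exactness is special to the quadratic potential; for a generic potential the same condition is only the leading semiclassical approximation.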

The 1980s witnessed a large shift in emphasis: path integrals became more complicated, the BV formalism was discovered, operator methods of quantization were embraced and refined by Connes’ noncommutative geometry; deformation quantization was getting more popular in other circles, and the Leningrad school of the quantum inverse scattering method started employing quantization as a tool in algebra, leading to quantum group theory. Symplectic geometry shifted its emphasis from geometric quantization and the related circle of questions to the Gromov–Floer discovery of the new field of symplectic rigidity and symplectic topology.

In the 1990s cohomological methods were immensely refined by the Kontsevich school, leading to new geometric models like the AKSZ model, the Kontsevich formality theorem, etc. Fukaya and Kontsevich introduced A_\infty-categories as the organizing principle of symplectic geometry and of new phenomena like homological mirror symmetry. Witten, Freed and others came to better understand the quantization of topological field theories, including toy examples like the Dijkgraaf–Witten model. The point is that all of these lead to content of a homotopical and higher categorical nature.

In the last ten years or so homotopy theory has been getting absorbed into higher category theory. We are talking about the program conjectured by Grothendieck in the early 1980s, which is now reaching a mature status. Therefore one can now attempt a more systematic look at the quantization of nonlinear sigma models, topological field theories and the like from a categorical point of view. I have had the luck to be in touch with some people with insight into this area, including Urs Schreiber, who proposed a cleaner version of an idea due to Daniel Freed to do the path integral quantization of some finite models by computing a certain categorical Kan extension. While still imperfect, the recipe is a step in the right direction.

What has been bugging me for the last several months, maybe a year, is that the models covered by the recipes of Freed and of Schreiber are of the type for which the WKB–Maslov method gives the exact result. When studied via the path integral, such models usually get their exact contribution from the stationary points of the action, and the formulas agree with the equivariant localization formulas (Duistermaat, Witten…). Supersymmetry, classical integrability and similar special circumstances make this localization possible. See the book by Richard Szabo on equivariant localization of path integrals for an inspiring discussion of these issues, mainly at the level of rigor of a theoretical physicist (a version can be found at the arXiv).
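To get a feel for what this exactness means, here is the standard toy case: the oscillatory integral of the height function on the two-sphere, where the brute-force integral equals the stationary-phase (Duistermaat–Heckman) sum over the two poles at every value of the parameter, not just asymptotically. A sketch; the normalization conventions are mine:

```python
import numpy as np
from scipy.integrate import quad

def sphere_integral(t):
    # int over S^2 of e^{i t cos(theta)} sin(theta) dtheta dphi, done numerically
    re, _ = quad(lambda th: np.cos(t * np.cos(th)) * np.sin(th), 0.0, np.pi)
    im, _ = quad(lambda th: np.sin(t * np.cos(th)) * np.sin(th), 0.0, np.pi)
    return 2.0 * np.pi * (re + 1j * im)

def fixed_point_sum(t):
    # Duistermaat-Heckman / stationary-phase sum over the two poles,
    # the critical points of H = cos(theta)
    return 2.0 * np.pi * (np.exp(1j * t) - np.exp(-1j * t)) / (1j * t)

# the "semiclassical approximation" is exact here, for every t
errors = [abs(sphere_integral(t) - fixed_point_sum(t)) for t in (0.5, 3.0, 12.0)]
print(errors)
```

The fixed-point sum is exactly what a WKB-type stationary-phase evaluation of the integral would produce; the point of the localization theorems is that for such Hamiltonians there are no corrections.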

My opinion is now the following: there is a part of the story which is of a homotopical nature, and higher category theory could give major insight into those cases of quantization which have no analytic corrections. The semiclassical expansion is close to the expansions in operator theory (heat kernel expansion, geometric measure theory, spectral geometry, the Weyl tube formula, index theorems) in which the topological terms can be sensed as the leading terms. On the other hand, it gives the exact answer precisely in the cases where higher category theory can find the same result in a more systematic way. Hence incorporating the semiclassical method into that higher categorical machinery is the next important thing to try. It seems to me that the subject is getting mature enough for this to be expected, though the comparison is not yet widely recognized.

See also Urs’s QFT manifesto and the nLab entry quantization.

