Descent for functors to come

December 7, 2010

These days I am resuming my work on descent theory. In August and September I was thinking about issues related to descent for
equivariant functors between categories, where each category is itself glued from pieces; equivariant in the sense of coactions of comonads or a similar formalism. One of the motivations is the study of principal bundles over noncommutative schemes. The Čech cocycles are a bit tricky here in full generality: there are phenomena which do not exist, or do not matter, in the commutative context.

These days I will meet Gabi Böhm from Budapest to talk about such issues; she is well versed in comonads and related matters. This will also be an excuse to post here some standard and not so standard background from selected parts of descent theory. This post will grow over the next few days. Stay tuned :)

books: buying, selling, publishing

July 14, 2010

Scientific publishing is a large topic: it includes problems like expensive and rising journal prices, voluntary copyright transfer to publishers alongside our free service on editorial boards and as reviewers, the quality of journals, the low reliability of various impact factors, and so on. But today I will talk not about journals but rather about books. I will start with a shortened personal story and then move to the hotter topic of the contemporary disorientation of publishers and bookstores in publishing and selling good books.

As a child I did not have much access to books in science, although I was interested in science; my first advanced math books I bought with my father, at about age 16, in a second-hand bookstore. Soon after, I was reading Postnikov’s volumes on geometry and algebraic topology and started buying books massively; in a way I would make myself happier by indulging in choosing and buying books. Those books in Russian were cheap at the time, but the problem was that only what was published very recently was available. For example, I could get the second volume of Penrose-Rindler’s book on spinor geometry but not the first, as the first was translated into Russian earlier and was hence no longer available. One of the secrets of the cheap Russian books was that the publisher did not print too many copies and did not finance their staying in stock for years: the whole stock would be sold very quickly. I was told by Russians that in Moscow people would go to exhibitions like the Soviet Exhibition to get books which were often not available in bookstores; while cheap, most of the books were rare finds.

Of course, after some years it became a problem that I could not fit the books in my room and other places, and had to store them in boxes and so on. Then travel came, with my graduate school in Wisconsin; my interests evolved, and new books were coming at a much higher price and a lower pace. And then I became, in a way, saturated. I had a nice personal collection (parts of it were lost, though, in travels, moves and so on), but I was often accessing good libraries and got used to being able to have what I need most of the time, unlike in my early history. One of the reasons for the saturation is that I had to narrow my main interests to professional ones, and read less and less of other subjects like linguistics; and in my own field there are few unnoticed surprises: we know in advance that somebody is writing a major book, so coming to a bookstore will rarely spark great interest. I became a book-quality sceptic: books are either known to me, or bad, or outside my interests.

Now, after nearly 10 years of not buying many books, and even no longer being fond of entering bookstores, I felt some revival of my book appetite in recent weeks and made some spontaneous excursions into bookstores. But now I travel less, and the bookstores which I especially liked, like the one of Cambridge University Press in Cambridge, the former foreign bookstore in Gundulićeva street in Zagreb, the University Bookstore in Wisconsin, one impressive bookstore (I do not recall the name) in Barcelona, and so on, are far from my reach in time or space.

The new generation buys books online and does not bother browsing: the choice is bigger, and often there are online excerpts. But I really get a feeling for whether I like a book mostly only if I browse it with my own hands. Online, I often get a wrong impression of the proportions and of the feel of the style and content. So I would still like to have good bookstores.

In Zagreb you now have very little choice; the only reasonable collection of foreign titles is at Algoritam. Of course you can order anything, but let’s focus on the browsing feel and on real, competent choosing by sitting and browsing within the bookstore.

Well, the collection has a few meters of math, physics and computer science titles, at all levels, mixed together (“mixed” is the bad thing here, though I am sufficiently experienced to find my way through wrongly targeted parts of the stock). But, you know, Zagreb is not a big market for scientific books, and the bookstores should sell books which are of sufficiently broad interest. For example, a proceedings volume of a conference, or an extremely specialized title, is unlikely to find a buyer. So what happens is that such books stay in the pool, and of course the bookstore cannot afford, in space or in time, to keep so many books unsold for long; so once the crap takes over the shelves, it is hard to replace it with more reasonable titles. People tell me that one reason is that some people order books and then decide not to buy them, so the bad books enter the bookstore unplanned. But I see many bad choices in more than one copy, so those books were really ordered not by such an error but otherwise.

So if you look through the math section of the Algoritam bookstore, you see that they did not choose famous and widely sold books and series, but rather some random books from random publishers. For example, you have some expensive PDE textbook written by some local experts in Beijing, but you do not have the most famous textbooks on PDEs, the one by Gilbarg and Trudinger or the one by Evans. In fact there are no books published by the American Mathematical Society, which is a very good, modern and reasonably cheap publisher. There are some Indian reprints of books which, I know, are unlawful under Indian law to be sold outside India, but at Algoritam they say that they bought them from a regular supplier. Strange that they learned of a strange supplier of strange reprints in India but do not know that the AMS publishes a good many quality books in the field. Springer’s Yellow series is underrepresented, and Algoritam does not seem to take part in the Yellow Sale, which is traditionally quite an event for math book fans.

I was also disappointed to find so many new editions of outdated books. For example, some sort of Oxford companion to the philosophy of mathematics, talking so much about 19th-century and earlier metaphysical thinking on nominalism and similar notions, while giving no hint of the thrills brought by modern foundational, semantic and other developments from the schools of Lawvere, or Grothendieck, and so on, who changed mathematics so much.

There are, of course, also the dedicated reprint series, like Dover. Well, I think that youngsters should be warned that the Dover series is largely outdated; it is not what it used to be. There are still great books there, like Goldblatt’s Topoi; Abrikosov, Gor’kov, Dzyaloshinskii, Quantum field theoretical methods in statistical physics; the ever-quoted Classical groups of Weyl, and so on. But Dover has also published reprints of many minor authors, and some books of a historical type, out of copyright and findable online, of interest to extremely rare users (for example, Chandrasekhar’s Mathematical theory of black holes is a masterpiece, but not good for a contemporary student; it is a collection of exact and detailed calculations which only a very rare specialist will study, beyond an occasional consultation in a library). But, you know, a student sees a book which is reasonably well written about a subject not known to her/him, and after reading a few nicely written paragraphs will decide to buy an obsolete book. I mean, something which looks readable may be suboptimal from today’s point of view: in notation, conventions, language, and knowledge, after many newly discovered powerful shortcuts or stronger results which are now standard. So the student may subjectively feel that he has gained a great insight, while the learnt material is not as powerful as what modern introductions offer, or is not organized as a contemporary colleague would expect.

quantization

March 19, 2010

While quantum mechanics is considered “correct” and classical mechanics just an approximation in the limit h\to 0, and for most practically interesting systems physicists already know the laws at the quantum level, the art of inventing new quantum theories starting from a (sometimes more obvious) classical counterpart has been one of the main themes of mathematical physics ever since the early 20th century.

My social experience tells me that the great majority of both physicists and mathematicians feel uncomfortable, at some level, with the advanced study of quantization as an activity, and consider the issue either overrated, unnecessary, or too difficult, or choose one small side of the story. Another feeling which I would like to share (the reasons behind it will likely appear in a later post) is the impression, from the scientific point of view, that one could predict a qualitative surge in the field in the next decade or two.

I assume that the reader knows a few very basic cases of quantization.

Most physicists consider quantization a more or less trivial issue, as they are misled by the culture of university courses focusing on canonical quantization of Poisson brackets. This is rather wrong, of course, as canonical quantization depends on the choice of generators of the Poisson algebra; namely, a no-go theorem says that we cannot simply promote all Poisson brackets to commutators, as the textbooks make it appear. We just promote certain hand-chosen generators to noncommuting operators with prescribed commutation relations. If we make a canonical transformation in the sense of classical mechanics (symplectic geometry), some other set of generators may now look distinguished; the canonical quantization recipe for those generators and their Poisson brackets may in general give a different quantization. In practice there is often indeed some distinguished set of generators, say with special symmetry properties, and it gives a hint of a physically useful canonical quantization recipe. But such simplicity is a very special circumstance. More complicated systems, involving nontrivial phase manifolds, constraints, infinitely many degrees of freedom (say, QFT), geometric backgrounds involving various cocycles like connections on gerbes, open many-body systems, etc., lead to the need to develop sophisticated methods for the study of quantization.
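Since this dependence on the chosen generators is the crux, here is a minimal sketch, in LaTeX, of the standard recipe together with the obstruction just mentioned (the Groenewold-van Hove no-go theorem in its usual textbook form; the wording is mine, not a quotation):

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Canonical quantization: only the hand-chosen generators are promoted.
Canonical quantization promotes the chosen generators $q,p$ with
Poisson bracket $\{q,p\}=1$ to operators obeying
\[
  [\hat q, \hat p] = i\hbar\,\mathrm{id}.
\]
% The naive rule below cannot hold for all observables at once.
The naive extension $\widehat{\{f,g\}} = \tfrac{1}{i\hbar}[\hat f,\hat g]$
for \emph{all} observables is impossible: classically
\[
  q^2 p^2 \;=\; \tfrac{1}{9}\{q^3, p^3\} \;=\; \tfrac{1}{3}\{q^2 p,\; q p^2\},
\]
yet quantizing the two right-hand sides via the symmetric (Weyl)
ordering yields operators differing by a nonzero multiple of $\hbar^2$
(Groenewold's computation), so no consistent assignment
$f \mapsto \hat f$ on all polynomials in $(q,p)$ exists.
\end{document}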

Many ideas and theories appeared in the 20th century: various ordering prescriptions, basic notions like polarization, coherent states, geometric quantization, Berezin quantization, various types of “symbols” of operators… All of these live at the intersection of operator theory, geometry and noncommutative algebra. The golden era of all these topics is maybe the late 1960s and the 1970s, when the methods of symplectic geometry systematically flooded into the study of quantization, integrable systems and representation theory.

On the other hand, the usage of Feynman path integrals was not that popular at the time, apart from deriving, here and there, recipes for Feynman rules. Since the 1980s, theoretical physics has gradually turned its interest toward the path integral approach to quantization (with all its deficiencies of not being rigorously defined in reasonable generality).

One should next point out that the very procedure of renormalization is also a step in going from the classical picture to true QFT, so it is itself a method of quantization. The nontriviality of renormalization, say for nonabelian gauge systems, has revealed how difficult the problem is and gave hints to many structures on which the quantization of field theories depends. As for constrained systems in general, the introduction of ghost fields is often useful. BRST and the more general BV quantization procedures are based on such methods (introducing new auxiliary fields for the purposes of quantization), which are of a cohomological nature, close to the method of Koszul resolutions in algebra.

Before I try to discuss some more contemporary insights into the story of quantization, I should mention something which most mathematical physicists consider old-fashioned: the WKBJ method of semiclassical approximation. In this method one considers the short-wave asymptotics of the wave function. The textbook version of this method is in one dimension and goes up to first order in the Planck constant; the old Bohr-Sommerfeld type conditions are corollaries. Not very impressive or important from a fundamental point of view, though handy for some easy calculations. However, a more systematic version, due mainly to Maslov (from the late 1950s), revealed some important geometric ingredients for quantization in general (see references quoted at Maslov index) and enabled the method to be extended to all orders and to many dimensions (to manifolds). In the 1970s the method resonated well, in the form of the stationary phase approximation and the study of (rapidly) oscillating integrals, within the study of partial differential operators (and generalizations like the pseudodifferential operators of Kohn and Nirenberg and the more general Fourier integral operators of Hörmander and Maslov) in harmonic analysis. The importance of the study of the behaviour of the wave phase factor has its reflection in the prevalence of Lagrangean submanifolds in all that work (see the classical monograph of Duistermaat, for example). Some, especially the Japanese school (Sato, Kashiwara, Jimbo, Miwa…), reflected this study within the framework of D-modules, but the basic underlying ideas are of course the same. One of the aspects is also microlocal analysis. The Japanese school was typically using a more general theory of generalized functions than the Laurent Schwartz type, namely the theory of hyperfunctions, which may be viewed as limiting boundary values of holomorphic functions and are usually formulated using sheaf theory.
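To recall the textbook version mentioned above, here is a minimal LaTeX sketch of the one-dimensional WKBJ method and of the point where the Maslov index enters (standard material, stated without derivation):

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% WKBJ ansatz: short-wave asymptotics of the wave function.
Write $\psi = a\, e^{iS/\hbar}$ and expand the Schr\"odinger equation
in powers of $\hbar$. At leading order one gets the eikonal
(Hamilton--Jacobi) equation
\[
  H\!\left(q, \frac{\partial S}{\partial q}\right) = E,
\]
and at the next order a transport equation for the amplitude $a$.
% Single-valuedness along a closed orbit forces quantization.
Single-valuedness of $\psi$ along a closed classical orbit gives
\[
  \oint p\, dq \;=\; 2\pi\hbar\left(n + \frac{\mu}{4}\right),
  \qquad n = 0, 1, 2, \ldots,
\]
where $\mu$ is the Maslov index of the orbit; for a one-dimensional
potential well $\mu = 2$, recovering the Bohr--Sommerfeld condition
with the familiar $n + \tfrac{1}{2}$ shift.
\end{document}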

The 1980s witnessed a large shift in emphasis: path integrals become more complicated, the BV formalism is discovered, operator methods of quantization are embraced and refined by Connes’ noncommutative geometry; deformation quantization gets more popular in other circles, and the Leningrad school of the quantum inverse scattering method starts employing quantization as a tool in algebra, leading to quantum group theory. Symplectic geometry shifts its emphasis from geometric quantization and the related circle of questions to the Gromov-Floer discovery of the new field of symplectic rigidity and symplectic topology.

In the 1990s cohomological methods get immensely refined by the Kontsevich school, leading to new geometric models like the AKSZ model, the Kontsevich formality theorem, etc. Fukaya and Kontsevich introduce A_\infty-categories as the organizing principle of symplectic geometry and of new phenomena like homological mirror symmetry. Witten, Freed and others better understand the quantization of topological field theories, including toy examples like the Dijkgraaf-Witten model. The point is that all of these lead to content of a homotopical and higher categorical nature.

In the last 10 years or so, homotopy theory has been getting ready to be absorbed by higher category theory. We are talking about the program conjectured by Grothendieck in the early 1980s, now reaching a mature status. Therefore one can now attempt to look more systematically at the quantization of nonlinear sigma models, topological field theories and the like from a categorical point of view. I have had the luck to be in touch with some people having insight into this area, including Urs Schreiber, who proposed a cleaner version of an idea due to Daniel Freed to do the path integral quantization of some finite models via computing a certain categorical Kan extension. While still imperfect, the recipe is a step in the right direction.

What has been bugging me for the last several months, maybe a year, is that the models covered by the recipes of Freed and of Schreiber are of the type for which the WKB-Maslov method gives the exact result. When studied via the path integral, such models usually receive exact contributions from the stationary points of the action, and the formulas agree with the equivariant localization formulas (Duistermaat, Witten…). Supersymmetry, classical integrability and similar special circumstances make this localization possible. See the book by Richard Szabo on equivariant localization of path integrals for an inspiring discussion of these issues, mainly at a theoretical physicist’s level of rigor (a version can be found on the arXiv).
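The prototype of these exactness statements is the Duistermaat-Heckman formula; here is its simplest case in LaTeX, for a Hamiltonian circle action with isolated fixed points (a standard statement, quoted only for orientation; sign and normalization conventions vary in the literature):

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Duistermaat--Heckman: stationary phase is exact in this setting.
Let $(M,\omega)$ be a compact symplectic manifold of dimension $2n$
carrying a Hamiltonian circle action with moment map $H$ whose fixed
points are isolated. Then the stationary phase approximation to the
oscillatory integral is exact:
\[
  \int_M e^{itH}\, \frac{\omega^n}{n!}
  \;=\;
  \left(\frac{2\pi}{it}\right)^{\!n}
  \sum_{p\,:\, dH(p)=0} \frac{e^{itH(p)}}{\prod_{j=1}^n \lambda_j(p)},
\]
% The weights \lambda_j(p) are those of the linearized action.
where $\lambda_j(p)$ are the weights of the linearized circle action
on $T_pM$. The path integral analogues of such formulas are the
equivariant localization results mentioned above.
\end{document}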

My opinion is now the following: there is a part which is of a homotopical nature, and higher category theory could give major insight into the cases of quantization which have no analytic corrections. The semiclassical expansion is close to the expansions in operator theory (heat kernel expansion, geometric measure theory, spectral geometry, the Weyl tube formula, index theorems), in which the topological terms can be sensed as the leading terms. On the other hand, it gives the exact answer precisely in the cases where higher category theory can find the same in a more systematic way. Hence, incorporating the semiclassical method into that higher categorical machinery is the next important thing to try. And it seems to me that the subject is getting mature enough for this to be expected, though the comparison is not yet sufficiently recognized.

See also Urs’s QFT manifesto and the nLab entry quantization.

Beck’s theorem vs. Benabou-Roubaud

January 7, 2010

The late Jon Beck made some profound contributions to category theory, descent theory and homological algebra, mainly via the study of monads, monadic cohomology and monadicity. Unfortunately, some of his early manuscripts were lost, and some of his results might have been lost in their original form, reappearing later from folklore or by being rediscovered. Some time ago on the category theory mailing list there was a discussion (which got heated at moments) about the results of Jean Benabou’s school of descent theory in comparison with Beck’s results. In Canada and the US, Beck’s environment, the indexed categories of R. Paré and D. Schumacher, which are close to the approach via pseudofunctors from the original FGA of Grothendieck, were used. The French school followed Grothendieck-Gabriel fibered categories instead. The comparison of descent in bifibered categories with monadic descent was first done in the Benabou-Roubaud paper; moreover, a certain Beck-Chevalley property was utilized there. In the category list discussion, it seems that some contributors prefer to attribute everything to Beck’s unpublished works only, not taking seriously that the fibered category formalism is nontrivially related and not of the same generality as the monadic approach of Beck. Here is my response to the category list, which was discarded by the moderator because the discussion had already been closed when I posted the comment:

Prof. Benabou wrote:

Sorry Ross, the “indexed categories” are not Pare-Schumacher, and not Lawvere. They are, as I mentioned to Barr, also due to the “fertile brain” of “you know who”.

Later he quotes 1962 as the year and mentions SGA1.6 (which corresponds to the seminars of 1960-61).
However, Grothendieck introduced pseudofunctors into descent theory a bit earlier.
The first published text of Grothendieck dealing with the subject has only pseudofunctors and NOT fibered categories:
it is in those Bourbaki seminars which were LATER collected as FGA (Fondements de la géométrie algébrique. [Extraits du Séminaire Bourbaki, 1957--1962.])

The descent chapters of the Bourbaki seminars are from 1959 and involve pseudofunctors (the indexed category point of view).
Later, in SGA1 (corresponding to 1960-1961), chapter 6 on fibrations and descent was written by Pierre/Peter Gabriel and introduces fibered categories (proper) as the basic setup for the first time. As far as I heard from algebraic geometers, there was indeed in reality some
input from Gabriel, not only Grothendieck, in the rephrasing in the better language of fibered categories; this is only partly supported by the footnote to Ch. 6 saying that the chapter was written down by Gabriel (the clarity of Gabriel’s style, parallel to the style of the Gabriel-Zisman book, is quite recognizable!).
Thus, in any case, a variant of indexed categories, I mean pseudofunctors, was introduced, under a different name, by Grothendieck, not by Paré and Schumacher; and for fibered categories, Grothendieck was the one who introduced
the philosophy of trading structure for property, but some part (unclear in extent) in making the transition
was played by Gabriel as well. This is the story known to most people adhering to the Grothendieck school, and
I am surprised that the category list has that much confusion over it (including never mentioning Gabriel in this context).

As far as the Benabou-Roubaud paper is concerned, I indeed enjoyed the posts by Bunge and Benabou explaining the (sometimes subtle) differences from the known work of Beck (which fit my earlier impressions).
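(For readers of this blog who have not seen it stated, here is a rough LaTeX rendering of the Benabou-Roubaud comparison in its usual modern formulation; this is my paraphrase, not the original wording:)

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Benabou--Roubaud, paraphrased in modern language.
Let $p \colon E \to B$ be a bifibration satisfying the Beck--Chevalley
condition, and let $f \colon A \to B$ be a morphism of the base,
thought of as a cover. The adjunction $f_! \dashv f^*$ between the
fibers induces a monad $T = f^* f_!$ on $E_A$, and the theorem
identifies descent data along $f$ with $T$-algebras:
\[
  \mathrm{Desc}(f) \;\simeq\; (E_A)^{T}.
\]
% Effective descent = monadicity of the comparison functor.
In particular, $f$ is an effective descent morphism precisely when
the comparison functor $E_B \to (E_A)^{T}$ of Beck's monadicity
theorem is an equivalence.
\end{document}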

A few years ago I retyped the C.R. paper into LaTeX, translated into English, though I should still recheck it for typos. I intended to post it to the arXiv, as it is short and historically and pedagogically important while unavailable online, and I hereby ask Prof. Benabou for permission (after rechecking the file together for correctness).
[I should remark that SGA1 has also been on the arXiv, since 2002.]

It would be interesting if the discussion on Beck’s theorem and on the Benabou-Roubaud theorem on this list extended to the case of higher categories. Lurie’s second volume,
Derived algebraic geometry II: noncommutative algebra, arXiv:math/0702299, has the Barr-Beck theorem for (infty,1)-categories. I recall discussing extensively with A. Rosenberg and M. Kontsevich in 2004 the need for a Beck theorem in the setup of A-infty categories,
and I tried to work on it, but insufficiently to obtain it. My motivation was to explain a certain version of the noncommutative flag variety, and they had, long before, another version needing
similar methods. In both cases Cohn universal localization was involved, and we took a look at the Neeman-Ranicki papers on the use of localization theory in algebraic K-theory, which implicitly involve higher Massey products in understanding the derived picture coming from Cohn’s localization. Roughly speaking, they understand Cohn’s localization as the H^0 of a certain Bousfield localization at the level of chain complexes (found earlier by Vogel); extending this to comonads coming from a cover by Cohn localizations would give a comonad in the triangulated setup, and one wants to understand the quasicoherent sheaves for our examples, obtained by gluing Cohn localizations, by means of some Beck’s theorem in that setup. On July 16, 2004, MK and AR told me that they had
solved the problem just the day before, with a remark that they did not need to discuss “covers”, but I have never seen the A-infty version in print (some applications were given in a talk by MK at a conference in honour of van den Bergh’s birthday, though, including getting usual formal schemes as noncommutative schemes via gluing along derived epimorphisms).
But Sasha Rosenberg has posted on the Max Planck site his lectures containing Beck’s theorem in the triangulated setup:

A. L. Rosenberg, Topics in noncommutative algebraic geometry, homological algebra and K-theory, preprint MPIM Bonn 2008-57 (be careful: 105 pages), pages 36-37 (triangulated Beck’s theorem),

which contains a version of the Kontsevich-Rosenberg result in the triangulated setup; the proof uses Verdier’s abelianization functor. It would be interesting to compare this result to Lurie’s, as well as to try to see Street’s orientals explicitly in the picture.

Side remark: has anybody ever stated exactly, or even proved, that any sensible notion of pseudo-n-limit for n greater than 2 would be represented by “n-categories of n-descent data”, where the latter are constructed via Street’s orientals? In other words, I do not consider that
the assertion

“the n-descent object can be represented as the n-category of n-descent data”

is a tautology, but rather a conjecture worth working on
(even the n = 2 case of the assertion cannot be found in print).
Second, in Lurie’s framework, can one formulate (similarly to the descent object) the universal property of the appropriate Eilenberg-Moore category in a manner
parallel to the universal property of the descent object in the 1-categorical situation? In other words, one should see how, for general n, the orientals help to satisfy some
n-categorical universal property (the latter subject being, for n greater than 2, almost void in the literature by my inspection anyway). At least for strict n-categories…
[[ Furthermore, what to do with descent data, orientals, and nonabelian cocycles in other (usual) categories? I mean, the combinatorics of nonabelian cocycles in Hopf algebra theory, for example (e.g. the Drinfeld twist, the associator: the factor systems for cleft extensions of Hopf algebras, etc.), seems to some extent superimposable on the one in group cohomology, but is still different, and I do not know if orientals explain such things (cf. the bialgebra nonabelian cohomology in

Shahn Majid, Cross product quantisation, nonabelian cohomology and twisting of Hopf algebras, in: Generalized symmetries in physics (Clausthal, 1993), 13-41, World Sci. Publ., River Edge, NJ, 1994, arXiv:hep-th/9311184). ]]

In the 2-categorical situation, 2-fibered categories are defined in Gray’s work and were then introduced again and discussed at length by Claudio Hermida, who has good ideas on higher n (and I will be thrilled to hear one day that he has found the time to return to the topic and give us good answers). An appendix in

Claudio Hermida, Descent on 2-fibrations and strongly 2-regular 2-categories, Appl. Categ. Structures 12 (2004), no. 5-6, 427-459,

discusses a 2-categorical version of the Chevalley condition (admittedly, the subtle historical and terminological distinctions between the B-R and Beck contributions in the 1-dimensional case are not made there). Beck’s theorem (proper) for pseudomonads is in the paper

I. J. Le Creurer, F. Marmolejo, E. M. Vitale, Beck’s theorem for pseudo-monads, J. Pure Appl. Algebra 173 (2002), no. 3, 293-313.

starting

December 22, 2009

Hopefully, I will be starting soon. For now I have just posted an entry containing my response to the historical discussion on the theorems of Beck and Benabou-Roubaud. At the moment I am more absorbed by the nLab. My personal nLab pages can be found here, and my web page at work is here.

Hello world!

August 31, 2009

Welcome to WordPress.com. This is your first post. Edit or delete it and start blogging!

Unfortunately, I dislike the very words blog and blogging.

