
Are conferences overrated?

May 3, 2020

I have wanted to write this post for many years, but in the Zoom era of scientific communication during the covid-19 crisis, awareness of alternatives to on-site conference communication, and of their limitations, is higher than ever, so I cannot postpone it any longer. A few words about my background: I have well over a hundred conference attendances in my scientific career (mainly in physics and mathematics), I have had a key role in organizing two conferences and one summer school, and I have served as a committee member for another three. I should also say in advance that weighing the impact of conferences has many aspects: the sheer cost in resources and in participants' and organizers' time; the pleasure of travel, but also the environmental (carbon and other) footprint and physical exhaustion (say, developing spine problems from long travel); facade-building; meeting people, and failing to meet them; dissemination, but also propaganda; social support, but also bullying; boosting researchers' sense of security, but also their insecurity; formal participation merely to fulfill grant requirements; and some conferences serving as an excuse for tourism. Some of these aspects are shared with other kinds of intensive communication, some are specific to conferences.

There is also a paradox in the number of conferences available in a field: the more conferences there are, the smaller the probability that a specific top expert will be present at any given one, so in a world with too many conferences it becomes harder to meet many critically interesting experts at a single event. We should discuss how to make conferences cumulatively more effective, for example by posting background information in advance, recording and posting conference videos afterwards, and providing supplementary materials at any time. Finally, there are amplifying benefits in combining on-site conferencing with online participation by some attendees; this should be explored in a planned and thoughtful way (so far it has happened only sporadically and ad hoc).

Of course, we should not rush to conclusions based on the temporary experience of the recent mandate of mainly online communication, which has been dominantly frustrating for some and even liberating for others. First of all, a passage to online communication has to offer well elaborated means to replace (rather than merely complement) real meetings. Thus, in teaching, some distinguish true online coursework from half-fulfilling emergency remote teaching; see e.g. https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning

(UNDER CONSTRUCTION)

Subtracted value in commercial publishing

January 27, 2020

Commercial scientific publishers often boast that their journals add value to papers over the preprint versions. Here I am not talking about the obvious value of the refereeing process, which is done by scientists, or of editorial decisions, which are made by scientific editors typically not paid by the journals for their work, but about the non-scientific quality of the paper. Publishers dwell on the value of “professional” typesetting and the UNIFORMITY of formatting within a journal (really: what is this in-journal uniformity for? Readers usually pick one article at a time, and the next article they read will typically be in another journal anyway, with another style), especially with respect to the referencing/bibliography style, font formatting, position of figure captions, color pictures, grammatical proofreading, advertising, and so on.

While that is sometimes true, most treatments of the debate about the need for commercial scientific journal publishing, even the magna charta of the recent The Cost of Knowledge action against Elsevier’s practices, neglect the value that journals subtract from the papers in the process.

  • many journals require a short reference format, thus forcing authors to strip out the title, the MR number, or full reference information in general
  • many journals ask authors to omit the arXiv number for commercially published papers: if a cited paper has a published version, it is listed as the only version
  • many journals require adding irrelevant information simply because they like it: for example, requiring that one write not just Cambridge University Press, but Cambridge University Press, Cambridge. This can be considered better for some purposes, but isn’t the author the one truly interested in weighing which presentation is optimal for his targeted audience; isn’t he the best informed about that audience and its needs? Or is the money-blinded and money-blooded publisher more concerned, the one which bundles, for instance, Chaos, Solitons and Fractals (which was a bandit journal for a long period) into its diet package for a library sale? In ordinary life this goes under the name of bullying.
  • very often journal size limits or preferences force authors to shorten their writing, often making it hard to comprehend, with large parts of the created material omitted
  • very often the journal’s proofreaders introduce new errors in the final stage of the process, such as spelling errors in names, changes in wording that alter the meaning, and errors in formulas, without giving the author a chance to approve or overrule the change

In personal communications with colleagues I have heard hundreds of such complaints.

While in mathematics this is not so much of a concern, in some other sciences publishers sometimes ban posting the preprint version on a web page, either before or after acceptance, or ever. For example, papers under consideration by journals like Science and Nature cannot be available anywhere; any news about the result being publicly available beforehand automatically leads to rejection and perhaps more serious consequences. Another aspect is that some journals offer the following policy: an expensive pre-editing service. In other words, before the paper goes to the referees, they will polish its cosmetics, color pictures and so on. One pays for the service, pays for the colors, and so on. If your paper arrives in high color and pre-polished, it may receive favorable status from the editors and be easier to get accepted (some journals nowadays claim the opposite, but I believe genuine independence of the decision is rare). If you insist on not retouching before the decision, or on not adding color pages with color page charges at all, you are likely not to get friendly treatment; possibly you will be rejected before the paper even reaches expert referees.

It is also significant that adapting to all the specific style requirements of journals (as opposed to generic internet archives, e.g. arxiv.org) is quite a hassle for authors who publish with a variety of publishers. It takes time (and even psychological adaptation) to adjust to the formatting and style requirements: title pages, citation styles, theorem styles, emphasis styles, section breaks, the level of detail allowed in the bibliography, and so on. With publishers changing their formatting macros from time to time, it even becomes messy to self-archive the sources. Publishers see this as standardization, since they use their own standard, but for the author it means breaking her or his own working standard over and over again with each new paper. Moreover, one sometimes changes the emphasis in a working version and submits to another journal, which triggers a new cycle of changes, and the same happens if the paper is rejected and has to be resubmitted elsewhere. It is an additional problem for inexperienced PhD students, who write their theses under formal requirements typically quite different from those of the journals publishing the same material, so the style cannot be universally planned in advance. All this time would be better spent doing research. The situation differs between fields: in experimental sciences the texts have less internal structure, as one mainly describes and lists results along with illustrations and data, while in mathematics there is extensive and highly structured lemma-theorem-proof-equation internal referencing, and many authors have optimized their own systems of macros to keep track of it effectively; having to break all of these and replace them with the journal’s artifacts wastes time and lowers the author’s psychological control and intellectual clarity. When this adds to other frustrations in the writing and publishing process, it adds to the hesitation of many authors to start, or to finish in time, their writing.

It is important to note that there is an alternative: establishing overlay academic journals, where refereeing certifies the quality of papers that are otherwise posted on public archives and which, based on the refereeing and editorial process, simply receive a stamp or certificate of quality, with only very mild conformity requests. And, more generally, other free or cheap online journals that respect authors and their creative choices.

Reviving this blog

October 14, 2018

(the entry updated in Jan 2020)

I have been too busy for about 8 years (as of October 2018) and the blog has been dormant, though I have often been pondering an agenda of things I wanted to write about, plus finishing the posts I promised before (and am still interested in).

My main subject used to be the study of Hopf algebra related constructions in noncommutative algebraic geometry, including locally trivial fibre bundles where locality is understood in the sense of noncommutative localization theory and the structure group is replaced by a Hopf algebra (or some abstraction of it via comonads, actions of monoidal categories, etc.). I completed some of my conjectures from the early 2000s in 2011, in one significant part with Gabriella Bohm, but the manuscripts were not unified into a publication-ready paper at the time. I am now preparing the main one of these papers, an upgrade of my old manuscript Globalizing Hopf-Galois extensions, now a joint work. At the end of the year I worked on related questions in descent theory with M. Stojić (during my stay at IHES, Oct 1-Dec 31, 2019). In the first month at IHES I was still working on the subject to which I had dedicated much of my research time (though I spent much of the last several years on teaching, so altogether it is not so tragic): the, in my opinion, less attractive topic of Hopf algebroids. One focal point of my interest in that period were certain completed Hopf algebroids over a noncommutative base which are a sort of Heisenberg doubles of universal enveloping algebras, viewed as Hopf algebras. In Spring 2008 I somewhat accidentally found one minor application in ordinary differential equations, which I suspect might also be of interest for certain formulas in the renormalization of QFTs. At IHES I figured out some basic things about extending bialgebroids to a nonassociative base, especially in relation to bialgebroid twistings by 2-cochains (somewhat similar to the twistings of quasibialgebras). A Tannaka-type theorem is in a sense built into the formalism, which is however not yet complete. As a byproduct, I figured out how to extend the twisting of antipodes for Hopf algebroids via Drinfeld-Xu bialgebroid 2-cocycles, and I also found an interesting family of twists related to multiplicative unitaries. I hope to write at least the article about the antipode by some time in February 2020. Then I thought about nonabelian cocycles leading to the noncommutative cross-ratios of (Gelfand and) Retakh; during November I discussed the topic intensively with Sharygin and Rubtsov. These nonabelian cocycles should be interpreted via a version of the noncommutative bundles I worked on in the early 2000s. This is one of the reasons I returned to the topic of noncommutative descent and fibre bundles for Hopf algebras, with significant new results, especially in collaboration with M. Stojić, and to extending my old unpublished work from 2011 with G. Bohm (I have resumed writing those papers). This will be my main topic of work and writing in 2020.

Another of my developing interests is related to strong shape theory. Jacob Lurie wrote about the (infinity,1)-topos point of view on strong shape in his book Higher Topos Theory. In fact, we owe this perspective to much earlier work of Guenther from the early 1990s, influenced by his PhD advisor, the Frankfurt topologist Friedrich Bauer. Šime Ungar wrote about a version of the Blakers-Massey theorem in the setup of ordinary shape theory. For strong shape, it seems there is no major result yet. Given that there is a Blakers-Massey result for infinity-topoi, it is natural to try to interpret it in the case of strong shape.

In 2018, led by some applied problems, I started to become interested in a rather different subject: the study of the balance of incentives among agents in a generalized ecosystem. This is more general than what game theory teaches. An ecosystem, for me, is a system involving units with some level of autonomous behaviour. In game theory such units are called players (rather than actors) and they have preferences, possibly multiple, which they want to satisfy; they also have a strategy. One wants to find an optimal strategy for some player when a model of the other players' strategies is given. In modern society, in organizations, on the internet and so on, the strategies of the rest of the world change continuously and somewhat stochastically, and our information about them is only partial. What actions and interactions can we propose to learn more about such a world? How do we measure the model of the game from inside? This is clearly more general than game theory, and it is not only a theory but an engineering concept. Second, managers of organizations are interested in designing frameworks for ecosystems in which certain goals are met by locking in a balance of agents' behaviours. For this the agents need to be given certain incentives. This is very different from what is called business intelligence. BI is static, rule-based and given by a company; here we want to see the emerging logic. Of course, one creates certain preconditions, but they have a very different character than BI. An interesting example of a different kind is designing the protocols for a blockchain technology so that the ecosystem balance works well. The very principles of blockchain work in large part because the consensus and other protocols count on the economic incentives of the participants. But there are secondary phenomena, such as the growth of the system, the appearance of new intermediaries, pools, resource usage and so on, which require further development of the algorithms and an understanding of the wider context as well. In 2018/2019 I was quite focused on the study of smart contracting in blockchain, and in particular on the interaction with the real world (off-chain effects), with significant new ideas there. In September 2019 I understood the importance of time stamp negotiation in a truly distributed computing environment: I think little has been done in the right direction there, but it will be crucial for future applications of blockchain closer to real time.


Descent for functors to come

December 7, 2010

These days I am resuming my work on descent theory. In August/September I was thinking about issues related to descent for equivariant functors between categories where each category is itself glued from pieces; equivariant in the sense of coactions of comonads or a similar formalism. One of the motivations is the study of principal bundles over noncommutative schemes. The Čech cocycles are a bit tricky here in full generality; there are phenomena which do not exist, or do not matter, in the commutative context.

These days I will be meeting Gabi Bohm from Budapest to talk about such issues; she is well versed in comonads and related matters. This will also be an excuse to post here some standard and not-so-standard background from selected parts of descent theory. This post will grow over the next few days. Stay tuned 🙂

book buying, selling, publishing

July 14, 2010

Scientific publishing is a large topic; it includes problems like expensive and rising journal prices, voluntary copyright transfer to the publisher, our free service on editorial boards and as reviewers, the quality of journals, the low reliability of various impact factors, and so on. But today I will not talk about journals but rather about books. I will start with a shortened personal story and then move to the hotter topic of the contemporary disorientation of publishers and bookstores in publishing and selling good books.

As a child I did not have much access to books in science, although I was interested in science; I bought my first advanced math books with my father, at about age 16, in some second-hand bookstore. Soon after, I was reading Postnikov’s volumes on geometry and algebraic topology and started buying books massively; in a way I would make myself happier by indulging in choosing and buying books. Those books in Russian were cheap at the time, but the problem was that only what had been published very recently was available. For example, I could get the second volume of Penrose-Rindler’s book on spinor geometry but not the first, as the first had been translated into Russian earlier and was hence no longer available. One of the secrets of the cheap Russian books was that the publisher did not print too many copies and did not finance their staying in stock for years: the whole stock would be sold very quickly. I was told by Russians that in Moscow people would go to exhibitions like the Soviet Exhibition to get books which were often not available in bookstores; while cheap, most of the books were rare finds.

Of course, after some years it became a problem that I could not fit the books into my room and other places, and I had to store them in boxes and so on. Then travel came, my graduate school in Wisconsin, and my interests evolved; new books were coming, at a much higher price and a slower pace. And then I got, in a way, saturated. I had a nice personal collection (though parts of it were lost in travels, moves and so on), but I often had access to good libraries and got used to having what I need most of the time, unlike in my early history. One reason for the saturation is that I had to narrow my main interests to professional ones and read less and less in other subjects like linguistics, and in my own field few surprises go unnoticed: we know in advance that somebody is writing a major book, so coming to a bookstore rarely raises great interest. I became a book-quality sceptic: books are either already known to me, or bad, or outside my interests.

Now, after nearly 10 years of not buying many books, and of no longer even being fond of entering bookstores, I have felt some revival of my book appetite in recent weeks and made some spontaneous excursions into bookstores. But now I travel less, and the bookstores I especially liked, such as the Cambridge University Press bookstore in Cambridge, the former foreign-books bookstore in Gundulićeva street in Zagreb, the University Bookstore in Wisconsin, one impressive bookstore (I do not recall the name) in Barcelona, and so on, are far out of my reach in time or space.

The new generation buys books online and does not bother browsing; the choice is bigger, and often there are online excerpts. But I mostly only get a real feeling for whether I like a book if I browse it with my own hands. Online I often get a wrong impression of its proportions and of the feel of its style and content. So I would still like to have good bookstores.

In Zagreb you now have very little choice; the only reasonable collection of foreign titles is at Algoritam. Of course you can order anything, but let’s focus on the browsing feel and on truly competent choosing by sitting and browsing within the bookstore.

Well, the collection has a few meters of math, physics and computer science titles at all levels, mixed together (“mixed” is the bad part here, though I am experienced enough to find my way through the wrongly targeted parts of the stock). But Zagreb is not a big market for scientific books, and the bookstores should sell books of sufficiently broad interest. For example, a conference proceedings volume, or an extremely specialized monograph, is unlikely to find a buyer. So such books stay in the pool, and of course the bookstore cannot afford, in space or time, to keep so many books unsold for long, so once the crap takes over the shelves it is hard to replace it with more reasonable titles. People tell me that one reason is that some customers order books and then decide not to buy them, so bad books enter the bookstore unplanned. But I see many bad choices present in more than one copy, so those books were really ordered deliberately, not by such an error.

So if you look through the math section of the Algoritam bookstore, you see that they did not choose famous and widely sold books and series, but rather random books from random publishers. For example, you have some expensive PDE textbook written by some local experts in Beijing, but you do not have the most famous textbooks on PDEs, the one by Gilbarg and Trudinger or the one by Evans. In fact there are no books published by the American Mathematical Society, which is a very good, modern and reasonably cheap publisher. There are some Indian reprints of books which, as far as I know, are unlawful under Indian law to sell outside India, but at Algoritam they say they bought them from a regular supplier. Strange that they found a strange supplier of strange reprints in India but do not know that the AMS publishes a good many quality books in the field. Springer’s Yellow series is underrepresented, and Algoritam does not seem to take part in the Yellow Sale, which is traditionally quite an event for math book fans.

I was also disappointed to find so many new editions of outdated books. For example, some sort of Oxford companion to the philosophy of mathematics, talking so much about 19th-century and earlier metaphysical thinking on nominalism and similar notions, while giving no hint of the thrills brought by modern foundational, semantic and other developments from the schools of Lawvere, Grothendieck and so on, who changed mathematics so much.

There are of course also the dedicated reprint series, like Dover. Well, I think that youngsters should be warned that the Dover series is largely outdated, not in the way it once was. There are still great books there, like Goldblatt’s Topoi, or Abrikosov, Gor’kov, Dzyaloshinskii, Quantum Field Theoretical Methods in Statistical Physics, the ever-quoted Classical Groups of Weyl, and so on. But Dover has also published reprints of many minor authors, and some out-of-copyright books of a historical type of interest to extremely rare readers (for example, Chandrasekhar’s Mathematical Theory of Black Holes is a masterpiece, but not good for a contemporary student: it is a collection of exact and detailed calculations which only a very rare specialist will study beyond an occasional consultation in a library). But you know, a student sees a reasonably well written book about a subject unknown to her or him, and after reading a few nicely written paragraphs decides to buy an obsolete book. I mean that something which looks readable may be suboptimal from today’s point of view in notation, conventions, language and knowledge, after many newly discovered powerful shortcuts or stronger results which are now standard. So the student may subjectively feel that he has gained great insight, while the material learnt is not as powerful as what modern introductions offer, or is not organized as a contemporary colleague would expect.

quantization

March 19, 2010

While quantum mechanics is considered “correct” and classical mechanics just an approximation in the limit h\to 0, and while for most practically interesting systems physicists already know the laws at the quantum level, the craft of inventing new quantum theories starting from a sometimes more obvious classical counterpart has been one of the main themes of mathematical physics since the early 20th century.

My social experience tells me that the great majority of both physicists and mathematicians feel uncomfortable, at some level, with the advanced study of quantization as an activity, and consider the issue either overvalued, unnecessary or too difficult, or pick out one small side of the story. Another feeling I would like to share (the reasons behind it will likely appear in a later post) is the impression, from the scientific point of view, that one can predict a qualitative surge in the field in the next decade or two.

I assume that the reader knows a few very basic cases of quantization.

Most physicists consider quantization a more or less trivial issue, as they are misled by the culture of university courses focusing on canonical quantization of Poisson brackets. This is of course rather wrong, as canonical quantization depends on the choice of generators of the Poisson algebra; indeed, a no-go theorem says that we cannot simply promote all Poisson brackets to commutators, as the textbooks suggest. We only promote certain hand-chosen generators to noncommuting operators with prescribed commutation relations. If we make a canonical transformation in the sense of classical mechanics (symplectic geometry), some other set of generators may now look distinguished, and the canonical quantization recipe applied to those generators and their Poisson brackets may in general give a different quantization. In practice there is often indeed some distinguished set of generators, say with special symmetry properties, and it hints at a physically useful canonical quantization recipe. But such simplicity is a very special circumstance. More complicated systems, involving nontrivial phase space manifolds, constraints, infinitely many degrees of freedom (say, QFT), geometric backgrounds involving various cocycles like connections on gerbes, open many-body systems, etc., lead to the need for sophisticated methods to study quantization.
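For concreteness, here is the most basic textbook illustration of what “promoting certain hand-chosen generators” means, and of the no-go statement alluded to above; this is only a standard reminder in the usual notation, not part of the argument. For a particle on the line one promotes the canonical pair q, p with \{q,p\} = 1 to operators acting on wave functions,

\hat q\,\psi(q) = q\,\psi(q), \qquad \hat p\,\psi(q) = -i\hbar\,\partial_q \psi(q), \qquad [\hat q, \hat p] = i\hbar.

The Groenewold-van Hove theorem then says that there is no linear assignment f \mapsto \hat f defined on all polynomials in q, p satisfying [\hat f, \hat g] = i\hbar\,\widehat{\{f,g\}} and extending this choice: for instance, evaluating the two sides of the classical identity \{q^3, p^3\} = 3\,\{q^2 p,\; q p^2\} = 9\, q^2 p^2 via that would-be rule produces two operators that differ by a term of order \hbar^2, which is exactly why ordering prescriptions must enter.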

Many ideas and theories appeared in the 20th century: various ordering prescriptions, basic notions like polarization, coherent states, geometric quantization, Berezin quantization, various types of “symbols” of operators… All of them lie at the intersection of operator theory, geometry and noncommutative algebra. The golden era of these topics was perhaps the late 1960s and 1970s, when the methods of symplectic geometry systematically flooded into the study of quantization, integrable systems and representation theory.

On the other hand, the use of Feynman path integrals was not that popular at the time, apart from deriving recipes for Feynman rules here and there. Since the 1980s, theoretical physics has gradually shifted its interest toward the path integral approach to quantization (with all its deficiencies of not being rigorously defined in reasonable generality).

One should next point out that the very procedure of renormalization is also a step in going from the classical picture to a true QFT, so it is itself a method of quantization. The nontriviality of renormalization, say for nonabelian gauge systems, has revealed how difficult the problem is and has given hints about the many structures on which the quantization of field theories depends. For constrained systems in general, the introduction of ghost fields is often useful. BRST and the more general BV quantization procedures are based on such methods (introducing new auxiliary fields for the purposes of quantization), which are of a cohomological nature, close to the method of Koszul resolutions in algebra.

Before I try to discuss some more contemporary insights into the story of quantization, I should mention something that most mathematical physicists consider old-fashioned: the WKBJ method of semiclassical approximation. In this method one considers the short-wave asymptotics of the wave function. The textbook version of the method is in one dimension and goes up to first order in Planck’s constant; the old Bohr-Sommerfeld type conditions are corollaries. Not very impressive or important from a fundamental point of view, though handy for some easy calculations. However, a more systematic version, due mainly to Maslov (from the late 1950s), revealed some important geometric ingredients of quantization in general (see the references quoted at Maslov index) and extended the method to all orders and to many dimensions (to manifolds). In the 1970s the method resonated well, in the form of the stationary phase approximation and the study of (rapidly) oscillating integrals, within the study of partial differential operators (and generalizations like the pseudodifferential operators of Kohn and Nirenberg and the more general Fourier integral operators of Hörmander and Maslov) in harmonic analysis. The importance of studying the behaviour of the phase factor of the wave is reflected in the prevalence of Lagrangian submanifolds throughout that work (see the classical monograph of Duistermaat, for example). Some, especially the Japanese school (Kashiwara, Saito, Jimbo, Miwa…), recast this study within the framework of D-modules, but the basic underlying ideas are of course the same. One of the aspects is also microlocal analysis. The Japanese school typically used a theory of generalized functions more general than the Laurent Schwartz type, namely the theory of hyperfunctions, which may be viewed as limiting boundary values of holomorphic functions and are usually formulated using sheaf theory.
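As a reminder of the textbook one-dimensional version mentioned above (standard notation, first order in \hbar only; nothing here is specific to this post), one substitutes the short-wave ansatz

\psi(q) \approx \frac{C}{\sqrt{p(q)}}\, \exp\Big(\frac{i}{\hbar}\int^q p(q')\, dq'\Big), \qquad p(q) = \sqrt{2m\,(E - V(q))},

into the stationary Schrödinger equation; to leading order the phase solves the Hamilton-Jacobi (eikonal) equation, and matching the solution across the turning points yields the Bohr-Sommerfeld type condition

\oint p\, dq = 2\pi\hbar\,\big(n + \tfrac{1}{2}\big), \qquad n = 0, 1, 2, \ldots,

the extra 1/2 being the simplest instance of the Maslov index correction mentioned above.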

The 1980s witnessed a large shift in emphasis: path integrals became more complicated, the BV formalism was discovered, operator methods of quantization were embraced and refined by Connes’ noncommutative geometry; deformation quantization became more popular in other circles, and the Leningrad school of the quantum inverse scattering method started employing quantization as a tool in algebra, leading to quantum group theory. Symplectic geometry shifted its emphasis from geometric quantization and the related circle of questions to the Gromov-Floer discovery of the new field of symplectic rigidity and symplectic topology.

In the 1990s, cohomological methods were immensely refined by the Kontsevich school, leading to new geometric models like the AKSZ model, the Kontsevich formality theorem, etc. Fukaya and Kontsevich introduced A_\infty-categories as the organizing principle of symplectic geometry and of new phenomena like homological mirror symmetry. Witten, Freed and others came to understand better the quantization of topological field theories, including toy examples like the Dijkgraaf-Witten model. The point is that all of these lead to content of a homotopical and higher categorical nature.

In the last 10 years or so, homotopy theory has been getting ready to be absorbed by higher category theory. We are talking about the program conjectured by Grothendieck in the early 1980s, which is now maturing. One can therefore now attempt to look more systematically at the quantization of nonlinear sigma models, topological field theories and the like from a categorical point of view. I have had the luck to be in touch with some people having insight in this area, including Urs Schreiber, who proposed a cleaner version of an idea due to Daniel Freed to do the path integral quantization of some finite models by computing a certain categorical Kan extension. While still imperfect, the recipe is a step in the right direction.

What has been bugging me for the last several months, maybe a year, is that the models covered by the recipes of Freed and of Schreiber are of the type for which the WKB-Maslov method gives the exact result. Such models, when studied via the path integral, usually receive their exact contribution from the stationary points of the action, and the formulas agree with the equivariant localization formulas (Duistermaat, Witten…). Supersymmetry, classical integrability and similar special circumstances make this localization possible. See the book by Richard Szabo on equivariant localization of path integrals for an inspiring discussion of these issues, mainly at the level of rigor of a theoretical physicist (a version can be found on the arXiv).
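The prototype statement of such exactness, recalled here only as a standard orientation point (conventions for signs and factors of 2\pi and i vary between sources): for a Hamiltonian circle action with moment map H on a compact symplectic manifold (M^{2n}, \omega) with isolated fixed points, the Duistermaat-Heckman formula

\int_M e^{i t H}\, \frac{\omega^n}{n!} \;=\; \Big(\frac{2\pi}{i t}\Big)^{n} \sum_{p\,:\, dH(p) = 0} \frac{e^{i t H(p)}}{\prod_{j=1}^{n} w_j(p)},

where w_j(p) are the weights of the linearized action at the fixed point p, says that the stationary phase approximation is exact; supersymmetric and equivariant localization arguments generalize exactly this pattern.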

My opinion is now the following: there is a part which is of a homotopical nature, and higher category theory could give major insight into the cases of quantization which have no analytic correction. The semiclassical expansion is close to the expansions in operator theory (heat kernel expansion, geometric measure theory, spectral geometry, the Weyl tube formula, index theorems) in which the topological terms appear as the leading terms. On the other hand, it gives the exact answer precisely in the cases where higher category theory can find the same result in a more systematic way. Hence incorporating the semiclassical method into that higher categorical machinery is the next important thing to try. It seems to me that the subject is getting mature enough for this to be expected, though the comparison is not yet sufficiently recognized.

See also Urs’s QFT manifesto and nlab entry quantization.

Beck’s theorem vs. Benabou-Roubaud

January 7, 2010

The late Jon Beck made some profound contributions to category theory, descent theory and homological algebra, mainly via the study of monads, monadic cohomology and monadicity. Unfortunately, some of his early manuscripts were lost, and some of his results may have been lost in their original form, reappearing later from folklore or being rediscovered. Some time ago on the category theory mailing list there was a discussion (which got heated at moments) about the results of Jean Benabou’s school of descent theory in comparison with Beck’s results. In Canada and the US, Beck’s environment, the indexed categories of R. Paré and D. Schumacher, which are close to the approach via pseudofunctors in Grothendieck’s original FGA, were used; the French school followed Grothendieck-Gabriel fibered categories instead. The comparison of descent in bifibered categories with monadic descent was first done in the Benabou-Roubaud paper; moreover, a certain Beck-Chevalley property was utilized there. In the category list discussion, it seems that some contributors prefer to attribute everything to Beck’s unpublished works alone, not taking seriously that the fibered category formalism is nontrivially related to, and not of the same generality as, the monadic approach of Beck. Here is my response to the category list, which was discarded by the moderator because the discussion had already been closed when I posted the comment:

Prof. Benabou wrote:

Sorry Ross, the “indexed categories” are not Pare-Schumacher, and not Lawvere. They are, as I mentioned to Barr, also due to the “fertile brain” of “you know who”.

Later he quotes 1962 as the year and mentions SGA1.6 (which corresponds to the seminars of 1960-61). However, Grothendieck introduced pseudofunctors into descent theory a bit earlier. The first published text of Grothendieck dealing with the subject has only pseudofunctors and NOT fibered categories: it is in those Bourbaki seminars which were LATER collected as FGA (Fondements de la géométrie algébrique. [Extraits du Séminaire Bourbaki, 1957–1962.])

The descent chapters of the Bourbaki seminars are from 1959 and involve pseudofunctors (the indexed category point of view). Later, in SGA1 (corresponding to 1960-1961), chapter 6 on fibrations and descent was written by Pierre/Peter Gabriel and introduces fibered categories (proper) as the basic setup for the first time. As far as I have heard from algebraic geometers, there was indeed in reality some input from Gabriel, not only Grothendieck, in rephrasing things in the better language of fibered categories, which is only partly supported by the footnote to Ch. 6 saying that the chapter was written down by Gabriel (the clarity of Gabriel’s style, parallel to the style of the Gabriel-Zisman book, is quite recognizable!). Thus, in any case, a variant of indexed categories, I mean pseudofunctors, was introduced under a different name by Grothendieck, not by Paré and Schumacher; and for fibered categories, Grothendieck was the one who introduced the philosophy of trading structure for property, but some part (unclear in extent) in making the transition was played by Gabriel as well. This is the story known to most people adhering to the Grothendieck school, and I am surprised that the category list has so much confusion over it (including never mentioning Gabriel in this context).

As far as the Benabou-Roubaud paper is concerned, I indeed enjoyed the posts by Bunge and Benabou explaining the (sometimes subtle) differences from the known work of Beck (which fit my earlier impressions).

A few years ago I retyped the C.R. paper into LaTeX and translated it into English, though I should still recheck it for typos. I intended to post it to the arXiv, as it is short and historically and pedagogically important while unavailable online, and I hereby ask Prof. Benabou for permission (after rechecking the file together for correctness).
[I should remark that SGA1 has also been on the arXiv, since 2002.]

It would be interesting if the discussion on Beck’s theorem and on the Benabou-Roubaud theorem in this list were extended to the case of higher categories. Lurie’s second volume, Derived algebraic geometry II: noncommutative algebra, arXiv:0702299, has the Barr-Beck theorem for (infty,1)-categories. I recall discussing extensively with A. Rosenberg and M. Kontsevich in 2004 the need for a Beck theorem in the setup of A-infty categories, and I tried to work on it, but not enough to obtain it. My motivation was to explain a certain version of a noncommutative flag variety, and they had another version, long before, needing similar methods. In both cases Cohn universal localization was involved, and we took a look at the Neeman-Ranicki papers on the use of localization theory in algebraic K-theory, implicitly involving higher Massey products, to understand the derived picture coming from Cohn localization. Roughly speaking, they understand Cohn localization as H^0 of a certain Bousfield localization at the level of chain complexes (found earlier by Vogel); extending this to comonads coming from a cover by Cohn localizations would give a comonad in the triangulated setup, and one wants to understand the quasicoherent sheaves of our examples, obtained by gluing Cohn localizations, by means of some Beck theorem in that setup. On July 16, 2004, MK and AR told me that they had solved the problem just the day before, with the remark that they did not need to discuss “covers”, but I have never seen the A-infty version in print (some applications were given in a talk by MK at a conference in honour of van den Bergh’s birthday, though, including obtaining usual formal schemes as noncommutative schemes via gluing along derived epimorphisms). But Sasha Rosenberg has posted on the Max Planck site his lectures containing Beck’s theorem in the triangulated setup,

A. L. Rosenberg, Topics in noncommutative algebraic geometry, homological algebra and K-theory, preprint MPIM Bonn 2008-57 (be careful: 105 pages)
link
page 36-37 (triangulated Beck’s theorem)

which contains a version of the Kontsevich-Rosenberg result in the triangulated setup; the proof uses Verdier’s abelianization functor. It would be interesting to compare this to Lurie’s result, as well as to try to see Street’s orientals explicitly in the picture.

Side remark: has anybody ever stated exactly, or even proved, that any sensible notion of pseudo-n-limit for n greater than 2 would be represented by “n-categories of n-descent data”, where the latter are constructed via Street’s orientals? In other words, I do not consider the assertion

“the n-descent object can be represented as the n-category of n-descent data”

to be a tautology, but rather a conjecture worth working on (even the n=2 case of the assertion cannot be found in print). Second, in Lurie’s framework, can one formulate (similarly to the descent object) the universal property of the appropriate Eilenberg-Moore category in a manner parallel to the universal property of the descent object in the 1-categorical situation? In other words, one should see how, for general n, the orientals help satisfy some n-categorical universal property (the latter subject, for n greater than 2, being by my inspection almost void in the literature anyway), at least for strict n-categories…
[[ Furthermore, what to do with descent data, orientals and nonabelian cocycles in other (usual) categories? I mean, the combinatorics of nonabelian cocycles in Hopf algebra theory, for example (e.g. the Drinfeld twist, the associator, the factor systems for cleft extensions of Hopf algebras, etc.), seems to some extent superimposable on the one in group cohomology, but it is still different, and I do not know whether orientals explain such things (cf. the bialgebra nonabelian cohomology in

Shahn Majid, Cross product quantisation, nonabelian cohomology and twisting of Hopf algebras, in: Generalized Symmetries in Physics (Clausthal, 1993), 13–41, World Sci. Publ., River Edge, NJ, 1994. arXiv:hep-th/9311184 ]]

In the 2-categorical situation, 2-fibered categories were defined in Gray’s work and then reintroduced and discussed at length by Claudio Hermida, who has good ideas about higher n (and I will be thrilled to hear one day that he has found the time to return to the topic and give us good answers). An appendix in

Claudio Hermida, Descent on 2-fibrations and strongly 2-regular 2-categories, Appl. Categ. Structures 12 (2004), no. 5-6, 427–459.

discusses a 2-categorical version of the Chevalley condition (admittedly, the subtle historical and terminological distinctions between the B-R and Beck contributions in the 1-dimensional case are not made there). Beck’s theorem (proper) for pseudomonads is in the paper

Le Creurer, I. J.; Marmolejo, F.; Vitale, E. M.
Beck’s theorem for pseudo-monads.
J. Pure Appl. Algebra 173 (2002), no. 3, 293–313.

starting

December 22, 2009

Hopefully, I will be starting soon. For now I have just posted an entry containing my response to the historical discussion on the theorems of Beck and Benabou-Roubaud. At the moment I am more absorbed by the nLab. My personal nLab pages can be found here and my web page at work is here.

Hello world!

August 31, 2009

Welcome to WordPress.com. This is your first post. Edit or delete it and start blogging!

Unfortunately, I dislike the very words blog and blogging.