I was talking to a fellow mathematician and told him that, for my very personal taste, research in algebraic geometry has sailed a bit too far from the shore of the original question that motivated the subject: a classification of the solutions of polynomial equations. I told him that most of the heavy machinery in algebraic geometry can be given very good intuition; however, most courses neglect to do so. Anyway, he was clearly amused and did not quite believe my radical position that unconditional and solitary abstraction isn't the only way to do algebraic geometry. He teased me to then tell him what an *»ample«* line bundle was, without talking about tensor powers or commutative diagrams. So, that's the reason for this post. I answered that ample line bundles are precisely those that induce finite morphisms to projective space, but I couldn't remember where I knew it from. I searched through all the major literature and could not find a reference until someone on MathOverflow helped me out. Since I complained so much about how no one ever gives intuition for these kinds of concepts in algebraic geometry, I decided I'd blog a bit about line bundles and how you should think about them. Do you want to know more?


My colleague and I were confronted with a question concerning the introductory chapter of a text on algebraic geometry. It considers the scenario of a plane curve $C=Z(f)\subseteq \mathbb A_{\Bbbk}^2$ over a field $\Bbbk$, for a nonconstant polynomial $f\in\Bbbk[x,y]$. It was stated that the function field $\mathbb K:=\mathrm{Quot}(\Bbbk[x,y]/\langle f\rangle)$ is obviously of transcendence degree $1$ over $\Bbbk$ because $f(x,y)=0$ is a new relation between $x$ and $y$. The prerequisites for this text are basic undergraduate knowledge of algebra and topology, and the question was about the obviousness of the above statement. We came up with a proof that only really requires some linear algebra. I found it rather cute. Do you want to see it?


I wrote a little text that outlines why vector bundles and locally free sheaves are the same thing. My approach is rather messy, with a lot of gluing, mostly because I did not look at Exercise II.5.18 in Hartshorne right away. The construction given there is much more canonical and preferable to mine. However, I decided to put this online simply because it is different and, personally, it gave me a better feeling for why the two notions coincide.


Zariski's proof ((Oscar Zariski, A new proof of Hilbert's Nullstellensatz, Bulletin of the American Mathematical Society, Volume 53, Number 4 (1947), 362-368.)) of the Hilbert Nullstellensatz makes use of the ineffable Rabinowitsch Trick ((J. L. Rabinowitsch, Zum Hilbertschen Nullstellensatz, Mathematische Annalen, Volume 102, No. 1 (1929), 520.)) (check it out, that has got to be the shortest paper ever). But who is that awesome guy Rabinowitsch? I found out today, and the answer is basically in this MO post: > Rainich was giving a lecture in which he made use of a clever trick which he had discovered. Someone in the audience indignantly interrupted him, pointing out that this was the famous Rabinowitsch trick and berating Rainich for claiming to have discovered it. Without a word Rainich turned to the blackboard, picked up the chalk, and wrote RABINOWITSCH. He then put down the chalk, picked up an eraser and began erasing letters. When he was done, what remained was RAINICH. He then went on with his lecture. Apparently, George Yuri Rainich is the mysterious stranger who went by the name of Rabinowitsch, which was his birth name ((Bruce P. Palka, Editor's Endnotes (May 2004), The American Mathematical Monthly 111 (5): 456–460.)) ((Bruce P. Palka, Editor's Endnotes (December 2004), The American Mathematical Monthly 111 (10): 927–929.)). I even updated the Wikipedia page. Oh right, the reason this even caught my attention: Daniel R. Grayson has a really sweet, short proof of the Nullstellensatz, also using the Rainich Trick.


Back in 1998, Harm Derksen brought us the world's fastest method for computing irreducible characters of the symmetric group. But his preprint, Computing with Characters of the Symmetric Group, leaves some questions open, which I attempt to settle here.
» The code to end all codes «

The world is half a heartbeat old. Since the beginning of runtime, the immortal prophet has been meditating on the তাজমহল, studying the holy 12 lines of C code that solve any SAT instance in $\mathcal{O}(n^{9699690})$. When the pulse becomes an integer, he will awaken and proclaim the truth, and praise shall be sung, and light shall shine forever upon the children of the circuit. For he is the clock, and it shall be at the peak of the signal when he appears to us. Book of Arithmæl Circulæ, verse 21:7. Do you want to know more?



After discovering the toggle command available in MathJax, I immediately went and asked the capable people of tex.stackexchange whether this could be done inside a PDF file. And indeed: it can be done!
\documentclass{article}
\usepackage{animate}

\begin{document}
\begin{animateinline}[step]{1}
  \strut$x=1$
\newframe
  \strut$x=2$
\newframe
  \strut$x=3$
\end{animateinline}
\end{document}
Now, what do I want with this package? I want to do abstract nonsense. A diagram-based proof should, in my opinion, be a slideshow: you start with the diagram that is your assumption, and by simply interacting with the diagram (clicking it), a new arrow is constructed in each step from some universal property. I wanted to write a neat animated PDF with an abstract-nonsense proof of the famous Snake Lemma, and there is a great book by Francis Borceux containing a proof, but unfortunately I was unable to overcome a difficulty with that proof, so this will have to wait until someone answers my question.


Let $K\subseteq L$ be a field extension, and let $V$ be a $K$-vector space. The extension of $V$ by scalars in $L$ is the tensor product $E=V\otimes_KL$. I will prove that every $L$-vector space is obtained as such an extension and that $\dim_L(E)=\dim_K(V)$. Do you want to know more?
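The dimension claim has a one-line sketch (allowing bases to be infinite): a $K$-basis of $V$ turns into an $L$-basis of $E$ after tensoring, because tensoring commutes with direct sums.

```latex
% If (v_i)_{i\in I} is a K-basis of V, then
\[
  E \;=\; V\otimes_K L
    \;=\; \Bigl(\bigoplus_{i\in I} K v_i\Bigr)\otimes_K L
    \;\cong\; \bigoplus_{i\in I} L\,(v_i\otimes 1),
\]
% so (v_i\otimes 1)_{i\in I} is an L-basis of E, whence
% \dim_L(E) = |I| = \dim_K(V).
```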


Most people probably know about the stacks project, but fewer might have heard about the CRing project. Now, there might be certain disadvantages to having a large number of contributors to a single document, but I would like to put some emphasis on the one, big advantage: You actually end up with a very complete reference, not just the narrower standpoint of one or two authors. If done correctly, group efforts of this sort are the perfect way to cover all basic (whatever that means) knowledge in some field. I really hope to see more areas pick up on the idea.


I was doing some math with Lars, and during our session we came up with the following inequality. Let $a_1,\ldots,a_r\in\mathbb N$ be integers with $a_i\ge k$ for all $i$. Then \[ \frac{\sum_{i=1}^r a_i}{\prod_{i=1}^r a_i} = \sum_{i=1}^r \frac{1}{\prod_{j\ne i} a_j} \le \frac{r}{k^{r-1}}= \frac{rk}{k^r} \] The inequality is completely obvious (each of the $r$ products $\prod_{j\ne i} a_j$ is at least $k^{r-1}$), but I found it useful and didn't want to forget about it.
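A quick numeric sanity check of the bound (the trial count and the ranges below are arbitrary choices, not part of the statement):

```python
import random
from math import prod

# Check (a_1 + ... + a_r) / (a_1 * ... * a_r) <= r / k^(r-1)
# for random integers a_i >= k >= 1.
random.seed(0)
for _ in range(1000):
    r = random.randint(1, 6)
    k = random.randint(1, 10)
    a = [random.randint(k, k + 20) for _ in range(r)]
    # small epsilon guards against floating-point rounding
    assert sum(a) / prod(a) <= r / k ** (r - 1) + 1e-12
```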


On the second day of 29c3, we talked about the group law on an elliptic curve based on our old script (in German) called "Das Gruppengesetz auf Elliptischen Kurven". Today, we explained to some people how elliptic curves can be used to factorize the product of two primes, i.e. attack weak RSA keys. When we were done, we left the flipchart paper taped to the wall and sat down. A few minutes passed before two people in passing showed an unusual interest in our notes. A few words into the conversation, I bluntly asked them about their mathematical background, which was met with an amused > I'm a math professor, does that suffice? These people turned out to be Tanja Lange and Daniel Bernstein, two scientists who are rather big shots in mathematical cryptography. I am ashamed to say that I did not even know that, but I certainly understood that they knew a lot more about elliptic curves than me and they were willing to share. To be precise, they were really friendly, and good at explaining it. That's rare. Of course, I eagerly listened as they began elaborating on the advantages of Edwards coordinates on elliptic curves. The two of them had attended the talk by Edwards introducing the concept in 2007 and observed the cryptographic potential: Basically (and leaving out some details), it's all about the fact that \[ x^2+y^2 = 1+dx^2y^2 \] defines an elliptic curve, with all relevant points visible in the affine plane; check out the picture. Choosing $(0,1)$ as your neutral element, a group law on this curve is given by \[ (x_1,y_1) \oplus (x_2,y_2) := \left( \frac{x_1y_2+x_2y_1}{1+dx_1x_2y_1y_2}, \frac{y_1y_2-x_1x_2}{1-dx_1x_2y_1y_2}\right). \] With this shape, you never have to mention projective coordinates, the group law can be explained in a very elementary kind of way — and then it turns out that this shape also yields faster algorithms for point multiplication.
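The addition law above can be sketched in a few lines of Python. The parameters here are toy assumptions chosen only so that brute-force checks are feasible (a tiny prime $p=13$ and $d=2$, which is a non-square mod 13, so the denominators never vanish); they are emphatically not cryptographic.

```python
# Toy Edwards curve x^2 + y^2 = 1 + d*x^2*y^2 over F_p.
# p = 13 and the non-square d = 2 are illustrative assumptions only.
p, d = 13, 2

def on_curve(P):
    x, y = P
    return (x * x + y * y) % p == (1 + d * x * x * y * y) % p

def add(P, Q):
    """Edwards addition with neutral element (0, 1)."""
    x1, y1 = P
    x2, y2 = Q
    t = d * x1 * x2 * y1 * y2
    x3 = (x1 * y2 + x2 * y1) * pow(1 + t, -1, p) % p
    y3 = (y1 * y2 - x1 * x2) * pow(1 - t, -1, p) % p
    return (x3, y3)

# All points found by brute force in the affine plane --
# no projective coordinates or points at infinity needed.
points = [(x, y) for x in range(p) for y in range(p) if on_curve((x, y))]

for P in points:
    assert add((0, 1), P) == P                  # (0,1) is neutral
    assert add(P, (-P[0] % p, P[1])) == (0, 1)  # -(x,y) = (-x,y)
```

Note how closure, the neutral element, and inverses can all be checked by completely elementary modular arithmetic.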
In short: If you're into curves, cryptography or both, I thoroughly encourage you to check out their summary page about Edwards coordinates.


Let's say you have a matrix $A\in\mathrm{GL}_n$. How do we best denote the inverse of its transpose? You would probably write $(A^T)^{-1}$ or $(A^{-1})^T$ because they are the same. However, today at the office we decided to henceforth write $A^{-T}$ instead. It seems abusive at first, but I can make it formal for you, if you care for that kind of stuff. As we all know, the transpose of $A^T$ is $A$. In other words, the transposition operator $\vartheta:\mathrm{GL}_n\to\mathrm{GL}_n$ which maps $A$ to its transpose satisfies $\vartheta^2=1$. So far, the exponentiation map is defined as $\mathrm{GL}_n\times\mathbb{Z}\to\mathrm{GL}_n$ mapping $(A,k)\mapsto A^k$. We instead consider the ring $\mathbb{Z}[T]:=\mathbb{Z}[t]/\langle t^2-1\rangle$ and extend the domain of the exponentiation map to $\mathrm{GL}_n\times\mathbb{Z}[T]_\ast$, where $\mathbb{Z}[T]_\ast$ denotes the homogeneous elements of $\mathbb{Z}[T]$. This way, you can write $A^{kT}$ instead of $(A^k)^T$ and you have $A^{kT^2}=A^k$ as required. Note that by restricting to homogeneous elements in $\mathbb{Z}[T]$, we get $A^{p+q}=A^pA^q$ and $A^{pq}=(A^p)^q$ for all $p,q\in\mathbb{Z}[T]_\ast$.
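For the numerically inclined, a sanity check (with an arbitrary random matrix, shifted to be safely invertible) that the two readings of $A^{-T}$ agree and that $T^2=1$ behaves as claimed:

```python
import numpy as np

# A hypothetical test matrix: random, plus a diagonal shift so that it
# is comfortably invertible.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)

# (A^T)^{-1} and (A^{-1})^T coincide, so A^{-T} is unambiguous.
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)

# Transposing twice is the identity, matching T^2 = 1 in Z[t]/(t^2 - 1).
assert np.allclose(A.T.T, A)
```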


Let's say you have a vector space $V$ and a vector space $W$ over the same field $k$. Then, according to a huge pile of books on representation theory I have read, there is an obvious isomorphism $V^\ast\otimes_k W \cong \mathrm{Hom}_k(V,W)$, where $V^\ast=\mathrm{Hom}_k(V,k)$. I agree that it's not hard to write down, but then why don't they just write it down? It is given by \[ \begin{array}{rcl} \phi: V^\ast\otimes_k W &\longrightarrow& \mathrm{Hom}_k(V,W) \\ f\otimes w &\longmapsto& (v\mapsto f(v)w), \end{array}\] when we assume $W$ to be finite-dimensional. Do you want to see the proof?
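In coordinates, $\phi(f\otimes w)$ is just the rank-one outer product of $w$ with $f$. A concrete illustration for $V=k^3$, $W=k^2$ (the particular vectors below are arbitrary toy choices):

```python
import numpy as np

f = np.array([1.0, 2.0, 3.0])   # a covector f in V* = Hom(V, k)
w = np.array([5.0, 7.0])        # a vector w in W
v = np.array([1.0, 0.0, 2.0])   # a test vector in V

# phi(f ⊗ w) as a matrix: the rank-one outer product w f^T (2x3).
M = np.outer(w, f)

# Applying the map to v gives f(v) * w, as the formula says.
assert np.allclose(M @ v, (f @ v) * w)
```

Since every matrix is a sum of rank-one outer products (e.g. one per nonzero entry), this also makes the surjectivity of $\phi$ in finite dimensions plausible.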


I am currently working through the book on Young Tableaux by Fulton, and I find it a very nice read - in a prose ((Attention Nikolai: Do not touch this book.)) kind of way. As you might notice from the general sound of it, I am getting into representation theory. However, this book is more about the combinatorial aspects of the field. Since combinatorics is a very hands-on kind of math, I really think I should do a certain amount of exercises. I am only skimming through the book since, ultimately, I want to get back to abstract nonsense really badly, but I will write down my solutions for any exercise I do ((I know that the book already contains "solutions", but I'd rather call them "hints".)). Do you want to know more?