The Stacks project

Lemma 10.131.6. In diagram (10.131.4.1), suppose that $S \to S'$ is surjective with kernel $I \subset S$. Then $\Omega _{S/R} \to \Omega _{S'/R'}$ is surjective with kernel generated as an $S$-module by the elements $\text{d}a$, where $a \in S$ is such that $\varphi (a) \in \beta (R')$. (This includes in particular the elements $\text{d}(i)$, $i \in I$.)
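
For instance (a special case worked out here for illustration; it is not part of the statement above), take $R' = R$ and $S' = S/I$ with $\varphi $ the quotient map. Then $\varphi (a) \in \beta (R)$ means $a \in \alpha (R) + I$, and since $\text{d}(\alpha (r)) = 0$ in $\Omega _{S/R}$ the lemma yields

\[ \Omega _{(S/I)/R} \cong \Omega _{S/R}\Big/ \sum \nolimits _{i \in I} S\, \text{d}i. \]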

First proof. Consider the map of presentations (10.131.4.2). Clearly the right vertical map of free modules is surjective (as $\varphi $ is surjective). Thus the map $\Omega _{S/R} \to \Omega _{S'/R'}$ is surjective. Suppose that some element $\eta $ of $\Omega _{S/R}$ maps to zero in $\Omega _{S'/R'}$. Write $\eta $ as the image of $\sum s_ i[a_ i]$ for some $s_ i, a_ i \in S$. Then we see that $\sum \varphi (s_ i)[\varphi (a_ i)]$ is the image of an element

\[ \theta = \sum s_ j'[a_ j', b_ j'] + \sum s_ k'[f_ k', g_ k'] + \sum s_ l'[r_ l'] \]

in the upper left corner of the diagram. Since $\varphi $ is surjective, the terms $s_ j'[a_ j', b_ j']$ and $s_ k'[f_ k', g_ k']$ are in the image of elements in the lower right corner. Thus, modifying $\eta $ and $\theta $ by subtracting the images of these elements, we may assume $\theta = \sum s_ l'[r_ l']$. In other words, we see $\sum \varphi (s_ i)[\varphi (a_ i)]$ is of the form $\sum s'_ l [\beta (r'_ l)]$. Next, we may assume that we have some $a' \in S'$ such that $a' = \varphi (a_ i)$ for all $i$ and $a' = \beta (r_ l')$ for all $l$. This is clear from the direct sum decomposition of the upper right corner of the diagram. Choose $a \in S$ with $\varphi (a) = a'$. Then we can write $a_ i = a + x_ i$ for some $x_ i \in I$. Thus we may assume that all $a_ i$ are equal to $a$ by using the relations that are allowed. But then we may assume our element is of the form $s[a]$. We still know that $\varphi (s)[a'] = \sum s_ l'[\beta (r_ l')]$. Hence either $\varphi (s) = 0$, in which case $s$ and $sa$ lie in $I$ and $s\, \text{d}a = \text{d}(sa) - a\, \text{d}s$ lies in the submodule in question by the Leibniz rule, or $a' = \varphi (a)$ lies in the image of $\beta $ and $\text{d}a$ is one of the listed generators; we are done in either case. $\square$
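
To expand the step "we may assume that all $a_ i$ are equal to $a$" (a small elaboration added for the reader; it is not part of the original argument): writing $a_ i = a + x_ i$ with $x_ i \in I$ we get in $\Omega _{S/R}$

\[ s_ i\, \text{d}a_ i = s_ i\, \text{d}a + s_ i\, \text{d}x_ i, \qquad \varphi (x_ i) = 0 = \beta (0), \]

so each term $s_ i\, \text{d}x_ i$ already lies in the $S$-submodule generated by the elements $\text{d}b$ with $\varphi (b) \in \beta (R')$ and may be discarded.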

Second proof. We will use the universal property of modules of differentials given in Lemma 10.131.3 without further mention.

In (10.131.4.1) let $R'' = S \times _{S'} R'$. Then we have the following diagram:

\[ \xymatrix{ S \ar[r] & S \ar[r] & S' \\ R \ar[r] \ar[u] & R'' \ar[r] \ar[u] & R' \ar[u] } \]

Let $M$ be an $S$-module. It follows immediately from the definitions that an $R$-derivation $D : S \to M$ is an $R''$-derivation if and only if it annihilates the elements in the image of $R'' \to S$. The universal property translates this into the statement that the natural map $\Omega _{S/R} \to \Omega _{S/R''}$ is surjective with kernel generated as an $S$-module by the elements $\text{d}a$ for $a$ in the image of $R'' \to S$.
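
To make this equivalence explicit (an elaboration, not in the original text): if $b \in S$ denotes the image of an element of $R''$ and $D : S \to M$ is an $R$-derivation, the Leibniz rule gives, for every $s \in S$,

\[ D(bs) = b\, D(s) + s\, D(b), \]

so $D$ is $R''$-linear, i.e. $D(bs) = b\, D(s)$ for all $s$, precisely when $D(b) = 0$ (take $s = 1$ for one direction).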

From the previous paragraph we see that it suffices to show that $\Omega _{S/R} \to \Omega _{S'/R'}$ is an isomorphism when $S \to S'$ is surjective and $R = S \times _{S'} R'$. Let $M'$ be an $S'$-module. Observe that any $R'$-derivation $D' : S' \to M'$ gives an $R$-derivation by precomposing with $S \to S'$. Conversely, suppose $M$ is an $S$-module and $D : S \to M$ is an $R$-derivation. If $i \in I$, then there exists an $a \in R$ with $\alpha (a) = i$ (as $R = S \times _{S'} R'$). It follows that $D(i) = 0$ and hence $0 = D(is) = iD(s)$ for all $s \in S$. Thus the image of $D$ is contained in the submodule $M' \subset M$ of elements annihilated by $I$ and moreover the induced map $S \to M'$ factors through an $R'$-derivation $S' \to M'$. It is an exercise to use the universal property to see that this means $\Omega _{S/R} \to \Omega _{S'/R'}$ is an isomorphism; details omitted. $\square$
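
To spell out the Leibniz computation in the last paragraph of the second proof (an expansion, not part of the original): for $i \in I$ and $s \in S$, both $i$ and $is$ lie in $I$, hence in the image of $\alpha $ because $R = S \times _{S'} R'$, so $D(i) = D(is) = 0$ and

\[ 0 = D(is) = i\, D(s) + s\, D(i) = i\, D(s). \]

Thus $I$ annihilates the image of $D$, as claimed.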


Comments (12)

Comment #6681 by Frank

Any hint for the diagram chase in the first step here? I have tried several times, but I cannot get the result because the presentation is not exact on the left.

Comment #6683

Yes, this is horrible (see below) and we should do this another way! I will rewrite the proof the next time I go through all the comments.

Diagram chase (maybe not literally a diagram chase). Suppose that some element $\eta $ of $\Omega _{S/R}$ maps to zero in $\Omega _{S'/R'}$. Write $\eta $ as the image of $\sum s_ i[a_ i]$ for some $s_ i, a_ i \in S$. Then we see that $\sum \varphi (s_ i)[\varphi (a_ i)]$ is the image of some huge sum of terms $s_ j'[a_ j', b_ j']$ and $s_ k'[f_ k', g_ k']$ and $s_ l'[r_ l']$. Since $\varphi $ is surjective, we may assume the terms $s_ j'[a_ j', b_ j']$ and $s_ k'[f_ k', g_ k']$ are not there by modifying our choices. Then we see that $\sum \varphi (s_ i)[\varphi (a_ i)]$ is of the form $\sum s_ l'[\beta (r_ l')]$. I guess now you still have to do a little bit here. First, you can assume that $\varphi (a_ i)$ is a fixed element $a'$ of $S'$ by grouping the sum into a sum of sums. Then you can write $a_ i = a + x_ i$ for some $x_ i \in I$. Thus we may assume that all $a_ i$ are equal to $a$ by using the relations that are allowed. But then we may assume our element is of the form $s[a]$ and the result is clear in that case.

Comment #6684

Undefined control sequence should read .

Comment #7704 by Ryo Suzuki

This lemma can be proved using the universal property.

It suffices to show that $\Omega _{S/R} \cong \Omega _{S'/R'}$ when $S \to S'$ is surjective and diagram (10.131.4.1) is a pullback.

To show the surjectivity of $\Omega _{S/R} \to \Omega _{S'/R'}$, suppose $D' : S' \to M'$ is an $R'$-derivation and assume $D' \circ \varphi = 0$. In this case, $D' = 0$ because $\varphi $ is surjective.

Next, let $D : S \to M$ be an $R$-derivation. If $i \in I$, there exists $a \in R$ such that $\alpha (a) = i$. It is because $R = S \times _{S'} R'$. Hence $D$ factors through $S'$. By taking $M = \Omega _{S/R}$, we see that $\text{d} : S \to \Omega _{S/R}$ factors through $S'$. This means that $\Omega _{S/R} \to \Omega _{S'/R'}$ has a retraction. In particular $\Omega _{S/R} \to \Omega _{S'/R'}$ is injective.

Comment #7710

See Lemma 10.131.12 for the statement about the tensor product algebra whose proof also uses a universal property. Are you suggesting that we deduce this lemma from that one? What is your justification for the sentence: "It suffices to show that ... when ... is surjective and diagram ... is pullback."?

Comment #7713 by Ryo Suzuki

In diagram 10.129.1, let $R''$ be the pullback $S \times _{S'} R'$. Then we have the following diagram:

For any $S$-module $M$, an $R''$-derivation $S \to M$ is also an $R$-derivation. Hence $\Omega _{S/R} \to \Omega _{S/R''}$ is surjective. Moreover, the kernel of this map is generated as an $S$-module by the image of $R''$ under $\text{d}$, by the definition of derivation. So, it suffices to show that $\Omega _{S/R''} \cong \Omega _{S'/R'}$.

Comment #7714 by Ryo Suzuki

Hmm... The diagram is not shown properly. I'll try again.

Comment #7715 by Ryo Suzuki

I guess the preview is not working properly. At first I wrote it like this:

\xymatrix{ S \ar[r] & S\ar[r] & S' \\\\ R \ar[r] \ar[u] & R'' \ar[r] \ar[u] & R' }

This works properly in the preview, but doesn't work in the comment. Then I wrote it like this:

\xymatrix{ S \ar[r] & S\ar[r] & S' \\ R \ar[r] \ar[u] & R'' \ar[r] \ar[u] & R' }

This doesn't work properly in the preview, but does work in the comment.

Comment #8855 by Et

At the end of the first proof, it would be helpful to say that we are done in the case $\varphi (s) = 0$ thanks to the Leibniz rule.

Comment #9234

Dear Et, I think that if $\varphi (s) = 0$ in the last sentence of the first proof, then $\varphi (s)[a'] = 0$ in the direct sum, so we're done by fiat. I did remove a superfluous sentence here.

There are also:

  • 14 comment(s) on Section 10.131: Differentials
