The Stacks project

9.13 Linear independence of characters

Here is the statement.

Lemma 9.13.1. Let $L$ be a field. Let $G$ be a monoid, for example a group. Let $\chi _1, \ldots , \chi _ n : G \to L$ be pairwise distinct homomorphisms of monoids where $L$ is regarded as a monoid by multiplication. Then $\chi _1, \ldots , \chi _ n$ are $L$-linearly independent: if $\lambda _1, \ldots , \lambda _ n \in L$ are not all zero, then $\sum \lambda _ i\chi _ i(g) \not= 0$ for some $g \in G$.

Proof. If $n = 1$ this is true because $\chi _1(e) = 1$ where $e \in G$ is the neutral (identity) element, so $\lambda _1\chi _1(e) = \lambda _1 \not= 0$. We prove the result by induction on $n$ for $n > 1$. Suppose that $\lambda _1, \ldots , \lambda _ n \in L$ are not all zero. If $\lambda _ i = 0$ for some $i$, then we win by induction on $n$ (simply omit the corresponding character from the sum). Since we want to show that $\sum \lambda _ i\chi _ i(g) \not= 0$ for some $g \in G$, we may, after dividing by $-\lambda _ n$, assume that $\lambda _ n = -1$. Then the only way we get in trouble is if

\[ \chi _ n(g) = \sum \nolimits _{i = 1, \ldots , n - 1} \lambda _ i\chi _ i(g) \]

for all $g \in G$. Fix $h \in G$. Then we would also get

\begin{align*} \chi _ n(h)\chi _ n(g) & = \chi _ n(hg) \\ & = \sum \nolimits _{i = 1, \ldots , n - 1} \lambda _ i\chi _ i(hg) \\ & = \sum \nolimits _{i = 1, \ldots , n - 1} \lambda _ i\chi _ i(h) \chi _ i(g) \end{align*}

Multiplying the first displayed relation by $\chi _ n(h)$ and subtracting the second, we obtain

\[ 0 = \sum \nolimits _{i = 1, \ldots , n - 1} \lambda _ i (\chi _ n(h) - \chi _ i(h)) \chi _ i(g) \]

for all $g \in G$. By induction, applied to the pairwise distinct characters $\chi _1, \ldots , \chi _{n - 1}$, all the coefficients $\lambda _ i (\chi _ n(h) - \chi _ i(h))$ must be zero. Since $\lambda _ i \not= 0$ for all $i$ we conclude that $\chi _ n(h) = \chi _ i(h)$ for all $i$. The choice of $h$ above was arbitrary, so we conclude that $\chi _ i = \chi _ n$ for $i \leq n - 1$, which contradicts the assumption that our characters $\chi _ i$ are pairwise distinct. $\square$
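
As a small illustration (an example added here, not part of the Stacks text), take $G = (\mathbf{Z}_{\geq 0}, +)$, $L = \mathbf{Q}$, and the two monoid homomorphisms $\chi_1(m) = 2^m$ and $\chi_2(m) = 3^m$. A relation $\lambda_1\chi_1 + \lambda_2\chi_2 = 0$ evaluated at $m = 0$ and $m = 1$ gives

\[ \lambda_1 + \lambda_2 = 0, \qquad 2\lambda_1 + 3\lambda_2 = 0 \]

and the determinant of this system is $3 - 2 = 1 \not= 0$, so $\lambda_1 = \lambda_2 = 0$, as the lemma predicts.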

Lemma 9.13.2. Let $L$ be a field. Let $n \geq 1$ and let $\alpha _1, \ldots , \alpha _ n$ be pairwise distinct elements of $L$. Then there exists an $e \geq 0$ such that $\sum _{i = 1, \ldots , n} \alpha _ i^ e \not= 0$.

Proof. Apply linear independence of characters (Lemma 9.13.1) to the monoid homomorphisms $\mathbf{Z}_{\geq 0} \to L$, $e \mapsto \alpha _ i^ e$ (with the convention $\alpha _ i^0 = 1$). These homomorphisms are pairwise distinct because the $\alpha _ i$ are, and the statement to be proven is exactly their linear independence with all coefficients equal to $1$. $\square$
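
To see that one cannot always take $e = 0$ (a worked case added here for illustration), let $L = \mathbf{F}_4$, let $\omega$ be a generator of $\mathbf{F}_4^*$, and take $\alpha_1 = 1$, $\alpha_2 = \omega$. Then

\[ \alpha_1^0 + \alpha_2^0 = 1 + 1 = 0, \qquad \alpha_1^1 + \alpha_2^1 = 1 + \omega \not= 0 \]

because $L$ has characteristic $2$; here $e = 1$ works.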

Lemma 9.13.3. Let $K/F$ and $L/F$ be field extensions. Let $\sigma _1, \ldots , \sigma _ n : K \to L$ be pairwise distinct morphisms of $F$-extensions. Then $\sigma _1, \ldots , \sigma _ n$ are $L$-linearly independent: if $\lambda _1, \ldots , \lambda _ n \in L$ are not all zero, then $\sum \lambda _ i\sigma _ i(\alpha ) \not= 0$ for some $\alpha \in K$.

Proof. Apply Lemma 9.13.1 to the restrictions of the $\sigma _ i$ to the group of units $K^*$ of $K$. These restrictions are pairwise distinct homomorphisms of groups, since two ring maps that agree on $K^*$ agree on all of $K$. $\square$
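
A concrete instance (supplied here as an illustration, not taken from the original): take $F = \mathbf{Q}$, $K = \mathbf{Q}(\sqrt{2})$, $L = \mathbf{R}$, and let $\sigma_1, \sigma_2$ be the two embeddings determined by $\sigma_1(\sqrt{2}) = \sqrt{2}$ and $\sigma_2(\sqrt{2}) = -\sqrt{2}$. If $\lambda_1\sigma_1 + \lambda_2\sigma_2 = 0$, then evaluating at $\alpha = 1$ and $\alpha = \sqrt{2}$ gives

\[ \lambda_1 + \lambda_2 = 0, \qquad \sqrt{2}(\lambda_1 - \lambda_2) = 0 \]

forcing $\lambda_1 = \lambda_2 = 0$.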

Lemma 9.13.4. Let $K/F$ and $L/F$ be field extensions with $K/F$ finite separable and $L$ algebraically closed. Then the map

\[ K \otimes _ F L \longrightarrow \prod \nolimits _{\sigma \in \mathop{\mathrm{Hom}}\nolimits _ F(K, L)} L,\quad \alpha \otimes \beta \mapsto (\sigma (\alpha )\beta )_\sigma \]

is an isomorphism of $L$-algebras.

Proof. Choose a basis $\alpha _1, \ldots , \alpha _ n$ of $K$ as a vector space over $F$. By Lemma 9.12.11 (and a tiny omitted argument) the set $\mathop{\mathrm{Hom}}\nolimits _ F(K, L)$ has $n$ elements, say $\sigma _1, \ldots , \sigma _ n$. In particular, the two sides have the same dimension $n$ as vector spaces over $L$. Thus if the map is not an isomorphism, then it has a nonzero kernel. In other words, there would exist $\mu _ j \in L$, $j = 1, \ldots , n$, not all zero, with $\sum \alpha _ j \otimes \mu _ j$ in the kernel, that is, with $\sum \sigma _ i(\alpha _ j)\mu _ j = 0$ for all $i$. This would mean the $n \times n$ matrix with entries $\sigma _ i(\alpha _ j)$ is not invertible. Thus we can find $\lambda _1, \ldots , \lambda _ n \in L$, not all zero, such that $\sum \lambda _ i\sigma _ i(\alpha _ j) = 0$ for all $j$. Now any element $\alpha \in K$ can be written as $\alpha = \sum \beta _ j \alpha _ j$ with $\beta _ j \in F$ and we would get

\[ \sum \lambda _ i\sigma _ i(\alpha ) = \sum \lambda _ i\sigma _ i(\sum \beta _ j \alpha _ j) = \sum \beta _ j \sum \lambda _ i\sigma _ i(\alpha _ j) = 0 \]

which contradicts Lemma 9.13.3. $\square$
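
To make the isomorphism concrete (an example added here, not taken from the original), let $F = \mathbf{Q}$, $K = \mathbf{Q}(\sqrt{2})$, and $L = \mathbf{C}$. The two $F$-embeddings of $K$ into $L$ send $\sqrt{2}$ to $\pm\sqrt{2}$, and the map of the lemma becomes

\[ \mathbf{Q}(\sqrt{2}) \otimes_{\mathbf{Q}} \mathbf{C} \longrightarrow \mathbf{C} \times \mathbf{C}, \quad (a + b\sqrt{2}) \otimes \beta \longmapsto \big((a + b\sqrt{2})\beta , (a - b\sqrt{2})\beta \big) \]

For example, the element $\frac{1}{2}\big(1 \otimes 1 + \sqrt{2} \otimes \tfrac{1}{\sqrt{2}}\big)$ maps to the idempotent $(1, 0)$, exhibiting the product decomposition of the left hand side.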


Comments (6)

Comment #5684 by Yiyang Wang

A small typo: in the proof of 9.13.1, in "Since we want to show that... ", the $\sigma$ should be a $\chi$.

Comment #7276 by peter a g

Possible typo in "Multiplying the previous relation by $\chi _ n(h)$ and substracting we obtain",

where "substracting" should probably be "subtracting". Still, I looked it up: 'substract' is in fact a variant, although the OED says it's (now) non-standard. (The OED also teaches us that in Shakespeare a substractor is a slanderer.)

Comment #7335

OK! Well, then I am going to leave it...

Comment #8603 by DavidePierrat

The base case of Lemma 0CKL need not use that we have an identity element in $G$, but rather that $\chi _1$ is taking values in a field.

Comment #9171

You absolutely need a monoid to have a unit element, and homomorphisms of monoids need to send the unit to the unit; otherwise this thing is wrong (since otherwise all of $G$ could be mapped to $0$ in the field).

