Invariants
Arun Ram
Department of Mathematics and Statistics
University of Melbourne
Parkville, VIC 3010 Australia
aram@unimelb.edu.au
Last update: 30 May 2012
Abstract.
This is a typed version of I.G. Macdonald's lecture notes from lectures at the University of California San Diego from January to March of 1991.
Introduction
a root system in the Weyl group, acting on as a group of isometries. ( could be any finite Coxeter group). Each (in particular, each ) may be regarded as a linear function on
So if we may define as a polynomial function on
Let denote the algebra of polynomial functions on If is a basis of then
- is graded, i.e.
the polynomials of degree
- acts on as a group of automorphisms: if we define
We have
Moreover
respects the grading, i.e.
acts on each
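For concreteness, here is the standard form of this setup, written out as a reconstruction (the symbols $V$, $W$, $S$, $S_d$ are assumed notation, not necessarily that of the original notes):

$$S = S(V^*) \cong \mathbb{R}[x_1,\dots,x_n], \qquad S = \bigoplus_{d \ge 0} S_d,$$
$$(wf)(x) = f(w^{-1}x), \qquad w(fg) = (wf)(wg), \qquad wS_d = S_d \qquad (w \in W,\ f,g \in S).$$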
Let
Then
is divisible by
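The usual form of this statement, given as a reconstruction consistent with the proof below: for a root $\alpha$ and $f \in S$,

$$f - s_\alpha f \ \text{is divisible by}\ \alpha\ \text{in}\ S.$$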
Proof.
Let
so that
for Hence
if i.e. if
Hence divides
Now let
is the
ring of polynomial invariants of
is a
graded ring:
Examples.
- acting on by permuting the coordinates. Then acts on
by permuting the and is the subring of symmetric polynomials:
where are the elementary symmetric functions (spelled out in the reconstruction after these examples).
- acting on hyperplane
in This time
but now the are not independent: they are related by
So now
- Weyl group of type acting on by signed permutations. If
then is unchanged by replacing by hence is a symmetric polynomial in the So this time
where elementary symmetric function of
- Any
is invariant, hence the elementary symmetric functions of the () are invariants.
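A reconstruction of the invariant rings in these examples, using the standard notation $e_i$ for the elementary symmetric functions (assumed notation):

$$W = S_n \ \text{on}\ \mathbb{R}^n: \quad S^W = \mathbb{R}[e_1,\dots,e_n];$$
$$W = S_n \ \text{on the hyperplane}\ x_1 + \cdots + x_n = 0: \quad S^W = \mathbb{R}[e_2,\dots,e_n] \quad (\text{restricted to the hyperplane, where } e_1 = 0);$$
$$W \ \text{of type}\ B_n \ \text{(signed permutations)}: \quad S^W = \mathbb{R}[e_1(x^2),\dots,e_n(x^2)], \qquad e_i(x^2) = e_i(x_1^2,\dots,x_n^2).$$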
Averaging
Define by
- for all
- if
- if
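The averaging operator of (Prop. 2.1) in its standard form (a reconstruction; the name $e$ is assumed):

$$e(f) = \frac{1}{|W|}\sum_{w \in W} wf \qquad (f \in S),$$

so that $e(f) \in S^W$ for every $f$, $e(f) = f$ if $f \in S^W$, and $e(fg) = f\,e(g)$ if $f \in S^W$; moreover $e$ preserves degrees.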
Our first aim is to prove Chevalley's theorem.
There exist homogeneous elements
such that
and are algebraically independent over
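In symbols, Theorem 2.2 asserts (a reconstruction, with $I_1,\dots,I_n$ as assumed names for the fundamental invariants and $n = \dim V$):

$$S^W = \mathbb{R}[I_1,\dots,I_n], \qquad I_i \ \text{homogeneous}, \qquad I_1,\dots,I_n \ \text{algebraically independent over}\ \mathbb{R}.$$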
Let
so that an element lies in if and only if Let be the ideal in generated by so that each element of can be written as a finite sum
with and Since is a polynomial ring, it follows from Hilbert's basis theorem that there is a finite set of homogeneous elements of such that (a) generate the ideal (b) no proper subset does. I claim that these do what we want. So we have three things to prove:
- every element of is a polynomial in the (with real coefficients);
- are algebraically independent over
Proof of 1.
Let without loss of generality homogeneous of degree Induction on is OK; suppose then hence
with Since and the are homogeneous, we may assume homogeneous of degree
(where ). Now apply the averaging operator (Prop. 2.1):
By (Prop. 2.1), and has degree hence, by the inductive hypothesis, is a polynomial in
Hence, so is
For the proof of 2. we require a lemma:
Let be such that is not in ideal of generated by Suppose are homogeneous elements of such that
Then
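The usual form of this lemma, reconstructed (the names $I_i$, $P_i$ and the ideal $\mathcal{I}$ are assumed notation): let $I_1,\dots,I_r \in S^W$ with $I_1$ not in the ideal of $S^W$ generated by $I_2,\dots,I_r$, and let $P_1,\dots,P_r \in S$ be homogeneous with

$$P_1 I_1 + P_2 I_2 + \cdots + P_r I_r = 0.$$

Then $P_1$ lies in the ideal $\mathcal{I}$ of $S$ generated by the homogeneous invariants of positive degree.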
Proof.
Induction on
- then
by averaging the relation
over we get
by (Prop. 2.1). If this shows that a contradiction. Hence
- Let By (Prop. 1.1) we have
where is homogeneous of degree From the relation we deduce
and hence on subtraction
Each is homogeneous, and
so by the inductive hypothesis i.e.
or
Since the generate and is stable it follows that
for all or Averaging over we get
hence (because ).
Proof of 2.
Suppose is a nonzero polynomial in variables such that We may assume that all the monomials in the that occur in have the same degree as elements of and that is minimal. Let then and so
Consider the ideal in generated by We may choose the numbering so that but no proper subset, generate this ideal. Then there exist such that
is homogeneous of degree
Hence we may assume homogeneous of degree (so zero if ).
Since
we have
i.e.
Using (Inv 1) this becomes
for
Now apply the Lemma 2.3: is not in the ideal of generated by and each of the polynomials
is homogeneous (of degree ). Hence
lies in hence is of the form
with homogeneous of degree In particular, so that
Now multiply by and sum over remembering Euler's formula: we get
showing (since ) that lies in the ideal of generated by a contradiction.
It remains to prove that
Proof of 3.
Let
Then Now transcendence degree is additive in towers
So we have to prove that i.e. that is an algebraic extension. Since it is enough to show that each is algebraic over i.e. is a root of an equation with coefficients in Consider the equation in
It has a root, and its coefficients are clearly invariant, i.e. This completes the proof of Theorem 2.2.
In Theorem 2.2 the invariants are not uniquely determined, but their degrees are. To see why this is so we introduce Poincaré series.
Poincaré Series
Let
be a graded vector space (over ), with each finite dimensional. We define the Poincaré series of to be
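In the standard (assumed) notation, writing $M = \bigoplus_{d \ge 0} M_d$ for the graded space in question, the definition reads

$$P(M,t) = \sum_{d \ge 0} (\dim M_d)\, t^d.$$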
If
is another graded vector space (with all ) then and are graded.
from which it follows that
Suppose where is homogeneous, Then
so that
From this and (Prop. 3.1) we deduce
-
-
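The two formulas of (Prop. 3.2), reconstructed in standard form (with $d_1,\dots,d_n$ the degrees of the fundamental invariants, assumed notation):

$$P(S,t) = \frac{1}{(1-t)^n}, \qquad P(S^W,t) = \prod_{i=1}^{n} \frac{1}{1-t^{d_i}}.$$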
Proof.
from which i. follows. Likewise
by Chevalley's Theorem (2.2), which gives ii.
Now suppose that also
where
are homogeneous and algebraically independent over If
then from (Prop. 3.2) we have
from which it follows that (after renumbering if necessary)
The numbers are called the degrees of
(Molien's formula)
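Molien's formula in its usual form (a reconstruction):

$$P(S^W,t) = \frac{1}{|W|} \sum_{w \in W} \frac{1}{\det_V(1 - t\,w)}.$$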
Proof.
In general, if is a finite dimensional vector space (over a field of characteristic 0) and is an idempotent linear transformation (i.e. ) then
(for each eigenvalue of is 0 or 1, hence with respect to a suitable basis of the matrix of is
)
Apply this to the averaging operator acting on
so that
Suppose acting on has eigenvalues (roots of unity, since has finite order). Then the eigenvalues of acting on will all be monomials
with
as one sees by diagonalising Hence
and therefore
We shall use (Prop. 3.3) to show that the degrees satisfy
-
-
where
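A reconstruction of the two statements of (Prop. 3.4), with $N$ the number of reflections in $W$:

$$\text{(i)}\ \ d_1 d_2 \cdots d_n = |W|, \qquad\qquad \text{(ii)}\ \ \sum_{i=1}^{n} (d_i - 1) = N.$$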
Proof.
From (Prop. 3.2) and (Prop. 3.3) we have
- Set Then all terms on the right vanish except that for so the right hand side is equal to and the left hand side is equal to
This proves i.
- Suppose is a reflection. I claim that some For if not, then for some not orthogonal to any root, i.e. But then implies by (1.21) (ref. to Root Systems?), contradiction. Now is a reflection if and only if
Hence the right hand side of (Inv 3) is of the form
where is a rational function of with no pole at Now put in (Inv 3) . The left hand side becomes
On the right hand side, we get
Comparing coefficients of now gives
Moreover if is any positive definite scalar product on then
is a invariant positive definite scalar product (positive definite because
and vanishes if and only if ). So we may assume is a group of isometries (or orthogonal transformations) of (relative to a suitably chosen scalar product).
Hence it makes sense to say that is generated by reflections. Reflection: all eigenvalues equal to 1 except one, which equals -1.
Consider the following four statements:
- is generated by reflections.
- Lemma 2.3 holds.
- are algebraically independent. (C')
-
for suitable positive integers
Then we have seen that A ⇒ B ⇒ C ⇒ D, and C ⇔ C'. Moreover it is easy to see that conversely D ⇒ C, for D shows that the dimension of each
is equal to the number of monomials
it contains.
In fact it is also true that C ⇒ A (Shephard and Todd). Hence A, B, C, D are all equivalent. In particular the weird Lemma 2.3, which appears to have no geometric content, is equivalent to being a reflection group.
Moreover Molien's formula (Prop. 3.3) is valid for any finite Hence we can avoid the appeal to field theory to prove that We have
The left hand side has a pole of order at the right hand side has a pole of order arising from the term
(all other terms have poles of order at ).
Finally we can improve on (Prop. 3.4). For each let
Then
(Shephard and Todd, Solomon)
This implies (Prop. 3.4.i) (set ) and (Prop. 3.4.ii): if and only if is a reflection, hence
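Theorem 3.7 in its usual form (a reconstruction; $V^w$ is the fixed subspace of $w$ and $m_i = d_i - 1$ are the exponents, assumed notation):

$$\sum_{w \in W} t^{\dim V^w} = \prod_{i=1}^{n} (t + m_i).$$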
Exercise:
Verify Theorem 3.7 when acting on by permuting the coordinates.
Decompose into disjoint cycles. Then each cycle gives a single fixed vector (e.g. the cycle
gives ) and hence
where is the cycle type of Hence
Let
be defined by
all Then the right hand side of (Inv 4) is
Now if replaced by a positive integer then
for all so that
for all Hence
Since this is true for all positive integers it is true identically:
giving
in agreement with Theorem 3.7, since the are
In fact there is a fancier version of Theorem 3.7. Let be a positive integer, a primitive root of unity, e.g.
For each let
(this is independent of which primitive root of unity we choose).
where is the truth function, i.e.
gives Theorem 3.7: (So the degrees determine the eigenvalue distributions in )
Proposition 3.8 is a consequence of a fancier version of Molien's formula:
where are independent variables.
I won't prove Proposition 3.9 yet, but I will show you how to deduce Proposition 3.8 (and hence in particular Theorem 3.7) from it.
Proof of Proposition 3.8, assuming Proposition 3.9.
First replace by
then (Prop. 3.9) becomes
Now set where is another variable, and then let
Consider first the left hand side of (Inv 5). Let and let be the eigenvalues of (acting on as usual). Then
Suppose first On putting we get
If on the other hand the corresponding factor in the product is
so when we put the limit as of
is just and hence the left hand side of (Inv 5) gives
Now consider the factors on the right hand side:
Suppose first that so that We can then safely put and get
If on the other hand so that we have to compute the limit as of
which we can do by l'Hôpital's rule: differentiate numerator and denominator with respect to and then set This gives
So the limit as of the right hand side of (Inv 5) is
since we know (Prop. 3.4) that
This completes the proof of Proposition 3.8 (modulo Proposition 3.9 which I will prove later, perhaps).
Examples.
- Take i.e. Then if and only if the right-hand side of (Prop. 3.8) has degree i.e. if and only if all the degrees are even.
- If does not divide any then for all So all eigenvalues of satisfy for some (Is this otherwise obvious?)
- Reference: C.L. Morgan, Can. J. Math. (1979) vol. 31, 252-254.
skew polynomials
Each (acting on ) is an orthogonal transformation, hence
is the sign character of Clearly
i.e. is a homomorphism of into the two-element group In particular, for a reflection Hence if and is a reduced expression for where each is an we have
i.e.,
A polynomial is skew if
for all
Let
the product of the positive roots (if is of type then is the Vandermonde determinant).
The set of skew polynomials in is
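In symbols (a reconstruction, with assumed notation $\pi$ and $\varepsilon$ for the sign character):

$$\pi = \prod_{\alpha > 0} \alpha, \qquad \deg \pi = N,$$

and in type $A_{n-1}$, $\pi = \prod_{i<j}(x_i - x_j)$ is the Vandermonde determinant. The statement just made is that the skew polynomials (those with $wf = \varepsilon(w)f$ for all $w \in W$) are exactly $S^W \pi$.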
Proof.
Suppose is skew. Then for each we have so that
is divisible by by (Prop. 1.1). Hence is divisible by in say
Moreover itself is skew, because by (1.9) (Ref to Root Systems page?)
and therefore
by (Prop. 4.1). Hence in the factorization both and are skew, hence is symmetric, i.e. and conversely each is skew.
As before let be a set of fundamental polynomial invariants.
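The proposition whose proof follows is, in its usual form (a reconstruction): the Jacobian of the fundamental invariants is a nonzero scalar multiple of $\pi$,

$$J = \det\left(\frac{\partial I_i}{\partial x_j}\right)_{1 \le i,j \le n} = c\,\pi \qquad (c \in \mathbb{R},\ c \ne 0).$$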
Proof.
In general, if is a mapping, say
where the have partial derivatives, the Jacobian of is
In particular, if is linear, say
then
and so (constant).
If is another mapping, then by the chain rule we have
Define by
and let Then
(because the are invariant), i.e. So (Inv 6) gives
i.e.
for all whence is skew. But also is a homogeneous polynomial of degree
by (Prop. 3.4.ii), hence by (Prop. 4.2) we have for some and it remains to show that
Consider as before the fields of fractions of respectively. We have
and each is algebraic over Hence for each there is a polynomial
of minimal degree such that Differentiate with respect to
where
Write these equations as a matrix identity, say where
Taking determinants we get
by the minimality of the
Hence in
(Shephard and Todd) Let be a finite subgroup of and suppose that is generated as an algebra by independent homogeneous elements Then is generated by reflections.
Proof.
Let be the subgroup of generated by the reflections in and let be its ring of invariants. By Chevalley's theorem we have
with algebraically independent. We have hence each is (uniquely) a polynomial in the and
so that
By the remark above it follows that
Expanding, some product
Renumbering the we may assume that
Let
Then we must have
On the other hand, from (Prop. 3.4.ii), if is the number of reflections in (or )
So
and therefore But by (Prop. 3.4.i)
and
Hence and finally
Differential operators
Let be coordinates relative to an (orthonormal) basis of so that
Let denote the algebra of linear maps Define an algebra homomorphism
Explicitly, if say
then
We shall write in place of The operators are linear partial differential operators with constant coefficients.
The action of is given by
Let Then
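In standard (assumed) notation, the homomorphism and (Prop. 5.1) read:

$$\partial : S \to \mathrm{End}_{\mathbb{R}}(S), \qquad \partial(x_i) = \frac{\partial}{\partial x_i}, \qquad \partial(fg) = \partial(f)\,\partial(g),$$
$$w\,\partial(f)\,w^{-1} = \partial(wf) \qquad (w \in W,\ f \in S).$$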
Proof.
If this is true for and then it is true for
So it is enough to prove Proposition 5.1 when is linear, say The easiest thing to do is to go back to the definition: if then
and therefore
giving
i.e.
Scalar product on
The scalar product on induces one on which may be defined as follows: if then
If we have
because
As before let be coordinates relative to an orthonormal basis in and let
be multi-indices. Then the definition gives
which is zero if and is equal to
if
It follows that the scalar product is symmetric (not immediately obvious from the definition) and positive definite, and that the monomials form an orthogonal (but not orthonormal) basis of The homogeneous components of are pairwise orthogonal, and on we have our original scalar product.
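A reconstruction of the definition and of the monomial computation just described (the notation $x^\mu$, $\mu!$ is assumed):

$$\langle f, g \rangle = \big(\partial(f)g\big)(0), \qquad \langle x^\mu, x^\nu \rangle = \begin{cases} \mu! = \mu_1!\cdots\mu_n! & \text{if } \mu = \nu,\\ 0 & \text{otherwise.}\end{cases}$$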
Example.
Let
then
and
Since
it follows that
the permanent of the matrix (This is, as far as I am aware, the only place the permanent occurs in nature.)
Let and let
be multiplication by
is the adjoint of
Proof.
Let Then
Harmonic polynomials
A polynomial is said to be harmonic if for all Since is generated by (Thm. 2.2) it is enough to require that
- If are coordinates in relative to an orthonormal basis, then
hence certainly a harmonic polynomial must satisfy
i.e. it is a solution to Laplace's equation, hence is harmonic in the usual sense of the word.
- 1 is harmonic; so is because is skew of degree hence zero.
Let be the vector space of harmonic polynomials. It is a graded vector space:
Moreover each is stable, because if and we have
using (Prop. 5.1). So is a graded module.
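In symbols (a reconstruction, with $H$ as the assumed name):

$$H = \{\, f \in S : \partial(I)f = 0 \ \text{for all homogeneous}\ I \in S^W \ \text{of positive degree} \,\} = \bigoplus_{d \ge 0} H_d,$$

and each $H_d$ is a $W$-submodule of $S_d$.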
Recall that is the ideal in generated by
is the orthogonal complement of in i.e.
Proof.
We have only to apply the definitions:
So Next we have
The mapping defined by is an isomorphism of modules (i.e. commutes with the actions of ).
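In its usual form (a reconstruction), the statement is that the multiplication map

$$S^W \otimes_{\mathbb{R}} H \longrightarrow S, \qquad I \otimes h \longmapsto Ih,$$

is an isomorphism of graded $S^W$-modules commuting with the action of $W$.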
Proof.
We have to show that is (a) surjective, (b) injective.
- We show by induction on that Clear for (since From (Prop. 7.2) we have
By the inductive hypothesis, hence Hence
i.e.
- For each let
The form an basis of Let
be a basis of consisting of homogeneous elements. Then the form a basis of and we have to show that their images under namely are linearly independent over So we suppose we have a nontrivial relation
By homogeneity we may assume that
is constant for each term in the sum. Consider the monomials occurring in (Inv 7), and choose one of least degree (N.B.
). This is not in the ideal of generated by the other occurring in (Inv 7), otherwise the would be linearly dependent over contradicting (Thm. 2.2). Hence by the Lemma 2.3 we have
But hence by (Prop. 7.2) i.e. for each contradiction.
-
-
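A reconstruction of the two statements of this corollary:

$$\text{(i)}\ \ P(H,t) = \prod_{i=1}^{n} \frac{1-t^{d_i}}{1-t} = \prod_{i=1}^{n}\big(1 + t + \cdots + t^{d_i - 1}\big), \qquad\qquad \text{(ii)}\ \ \dim H = |W|.$$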
Proof.
-
From (Prop. 3.1) and (Prop. 7.4) we have
hence
- From (Prop. 7.2) we have hence
as vector spaces, so that
(It follows that if
and that Since we have )
Suppose is a graded vector space over such that acts on each (and each is finite dimensional). ( here could be any finite group.) Then we can define the "graded character" of if and is an indeterminate, we define
If is another such, then acts on each (by the rule ) and we have
Hence, as in (Prop. 3.1) we have
We shall use this to prove
(or equivalently ) affords the regular representation of
Proof.
Since a representation is determined by its character, it is enough to prove that
By (Prop. 7.5) and (Prop. 7.3) we have
and
as in the proof of Molien's formula (Prop. 3.3). Moreover acts as the identity on hence
for each and therefore
Hence
and
If we get and if we get 0, because then vanishes at (the numerator of (Inv 8) is divisible by and the denominator by at most ).
From (Prop. 7.6) it follows that the regular representation of carries a natural grading: (or on ). For each irreducible representation of let denote the number of times occurs in Then we can form the generating function
This polynomial is given by
where is the character of the representation
Proof.
The multiplicity of in is the scalar product (on ) of with the character of i.e.,
Hence
Examples.
- acting on by permuting the coordinates. Here
The irreducible characters of are if has cycle type then
so that
and therefore by (Prop. 7.8)
Check:
as it should.
- If is an irreducible representation, so is with character We have
from (Prop. 7.8) (use
(Prop. 3.4)). In particular, and
Recall:
acts on hence on
- Every element of is of the form where
Proof.
- Let say Multiply numerator and denominator by
then say with
- Let
Then is fixed by if and only if is, i.e. if and only if So
- is a free module of rank
- And basis of is an basis of
- Any basis of is a basis of
Proof.
Let be an basis of
so that
and hence
It now follows from (Prop. 7.3) that
This proves ii. and hence i. As to iii., let be any basis of so that every has a unique expression
with Hence if we have
uniquely. By (Prop. 7.10(i)) this shows that is a basis of as a vector space.
Divided differences
Let Recall (Prop. 1.1) that is divisible by in Hence we may define an operator
by
(i.e.,
).
has degree -1, i.e. it maps into Clearly it kills
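In standard (assumed) notation the operator is

$$\Delta_\alpha f = \frac{f - s_\alpha f}{\alpha} \qquad (f \in S),$$

so that $\Delta_\alpha$ maps $S_d$ into $S_{d-1}$ and kills every $s_\alpha$-invariant polynomial, in particular every element of $S^W$.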
-
-
-
-
Proof.
- We have
hence
-
- If then hence
- If then
so
We shall use these operators only when is a simple root of and we shall write
For any sequence
with each let
Say is reduced if
is a reduced expression, i.e. if
- If is reduced, then depends only on
(write ).
- If is not reduced, then
Proof.
We shall prove (i) and (ii) by induction on the length of the sequence is OK, so assume
-
Let be reduced, We shall compute
when and is linear, using Proposition 8.1.ii:
Now by Proposition 8.1.iv we have
where
By (Root Systems, 1.16) the roots are precisely the positive roots such that is negative. Also, by the inductive hypothesis,
unless
is reduced, i.e. of length moreover
So we may rewrite (Inv 10) as follows:
summed over such that (1) and (2)
Thus the right hand side of (Inv 11) depends only on (and ), not on the reduced word for
Now let be another reduced word for and let
This is an operator of degree so that in particular From (Inv 11) it follows that
if is linear. But then by iteration we see that if are linear then
and hence (Inv 12) is true for all Taking now we have
i.e., and therefore
This completes the induction step for Proposition 8.2.i.
- Suppose now is not reduced, and let
If is not reduced then
by the inductive hypothesis, hence
If is reduced, let
so that and
since it follows that and hence
Consequently
by Proposition 8.1.i.
From Proposition 8.2 it follows that
Let Then
(Consider reduced words for and apply Proposition 8.2.)
Let Then is an operator of the form
where (the field of fractions of ) and in particular
where
[In fact unless (Bruhat order) ... ]
Proof.
Let
Then
in which the coefficient of
is
by (Root Systems, 1.16).
HW:
Show that is a polynomial, i.e. for all
Let be the longest element of Then
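A reconstruction of this formula: with $w_0$ the longest element, so that $\ell(w_0) = N = \deg \pi$,

$$\Delta_{w_0} = \frac{1}{\pi} \sum_{w \in W} \varepsilon(w)\, w, \qquad \text{i.e.} \qquad \Delta_{w_0} f = \frac{1}{\pi} \sum_{w \in W} \varepsilon(w)\, wf.$$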
Proof.
Say
with coefficients
Since
for it follows from Proposition 8.4 that
i.e.
Since the generate we conclude that
for all and hence
Comparing this with the previous expression, we see that
for
Hence it is enough to determine one of the coefficients now by Proposition 8.5 we have
(since ). Hence
The are linearly independent over (Dedekind's lemma)
Dedekind's lemma
Dedekind's lemma. Distinct automorphisms of a field are linearly independent over
Proof.
Suppose we have a linear dependence relation
with May assume minimal (hence the ). Let Then we have
Multiply (Inv 13) by and subtract from (Inv 14).
Since there exists such that
but then
is a nontrivial linear dependence relation of length
Now let be the adjoint of
Since has degree -1, has degree +1. To get an idea of what looks like, we prove (although we shan't use it)
Let Then there exists (not unique) such that (indefinite integral: ). For any such we have
Proof.
Let Then
Since was arbitrary it follows that
Now let
a reduced expression and define
is the adjoint of and we have, from Proposition 8.4:
(Take adjoints: is the adjoint of (in that order).)
- maps into
- maps into
Proof.
- Let Then by Proposition 8.1.iii
so that
which proves (i).
Recall from Proposition 7.2 that is the orthogonal complement of in Hence if
we have
because Hence
Now let
By Proposition 9.4, and is homogeneous of degree
(BGG)
is an basis of
Proof.
Let us first establish that We have
and by Proposition 8.6 and skew symmetry of we have
So
and therefore (It must be a scalar multiple of )
Since (Proposition 7.4) it is enough to show that the are linearly independent over So suppose we have a linear dependence relation
Choose of minimal length such that Let and apply to (Inv 15). We have by Proposition 9.3
and if
because
and if So we obtain
hence
From Theorem 9.5 it follows that for each
and hence
Comparing this with Proposition 8.6 we have the polynomial identity
valid for any finite reflection group.
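The identity (Inv 17) in its standard form (a reconstruction):

$$\sum_{w \in W} t^{\ell(w)} = \prod_{i=1}^{n} \frac{1 - t^{d_i}}{1 - t} = \prod_{i=1}^{n} \big(1 + t + \cdots + t^{d_i - 1}\big);$$

setting $t = 1$ recovers $|W| = d_1 \cdots d_n$.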
Proof of (Inv 17).
We may assume and proceed by induction on If we have
so If consider
and is either or 0, so that
if either 0 or equal to
So in both cases
for all hence (induction hypothesis)
i.e.
for all Hence finally
and (because ) so finally as required.
(The last part of this argument we have encountered before, in the proof of Chevalley's Lemma 2.3.)
We have because if is linear
by Proposition 8.1.iv.
Schubert polynomials
As before let
and define
is homogeneous of degree
(since
).
-
- Let
Then
is an basis of
Proof.
- We have
Suppose this is Then for reasons of degree we must have
i.e., and by Proposition 8.4
which forces So
unless and
and since by Proposition 8.6
we have
- Suppose
Take the scalar product of each side with since
we get
and hence by (i) above. Hence the are linearly independent over and since
they are a basis.
Coxeter elements
Let be a tree, a group and
a mapping
Suppose that commute whenever are not joined by an edge in Then all the products
are conjugate in
|
Proof.
then
is conjugate to So cyclic permutation of a product produces conjugate elements.
We prove the lemma by induction on the number of vertices in When there is nothing to prove. Suppose Since is a tree it has an end vertex say, joined to just one other vertex Consider a product of the Up to conjugacy in we may assume that comes first:
say. Now commutes with every except hence commutes with
hence is conjugate to So it remains to show that all products are conjugate in and to do this we use the inductive hypothesis. Let is a tree with vertices. Define
by
if Now apply the inductive hypothesis to
Now let be an ???? Weyl group (more generally, a finite reflection group), the generators of The Dynkin diagram of is a tree hence
satisfies the conditions of the lemma. Consequently all the products
of the generators are conjugate in Moreover if we had taken a different basis of the generators would be
so all the products (for all bases) are conjugate in These are the Coxeter elements of they form a distinguished conjugacy class.
Example.
Then is an cycle, so the Coxeter elements of are the cycles.
Let be a Coxeter element (we are assuming irreducible). Since conjugate elements of a finite group have the same order, we can define
is the Coxeter number of
Now consider the characteristic polynomial
whose roots are the eigenvalues of Again they do not depend on which we take, and they are roots of unity, say
where
(One shows that 1 is not an eigenvalue of i.e. that has no fixed points in ). Since the eigenvalues are also (because they are the roots of a real polynomial) it follows that the sequence is the sequence in reverse order, i.e.
Adding up these equations, we get
The relevance of this to polynomial invariants is that if
are the degrees of then
Since by Proposition 3.4
it follows that
Again, since there is a polynomial invariant
of degree 2, we have hence
is an eigenvalue of i.e. a root of
Hence all primitive roots of unity are eigenvalues of i.e. each positive integer less than and prime to is an exponent.
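A reconstruction of the eigenvalue statements in this discussion (assumed notation: $c$ a Coxeter element, $h$ its order, $\zeta = e^{2\pi i/h}$, and $m_1 \le \cdots \le m_n$ the exponents):

$$\det(t - c) = \prod_{i=1}^{n} \big(t - \zeta^{m_i}\big), \qquad 1 \le m_i \le h - 1, \qquad m_i + m_{n+1-i} = h,$$
$$\sum_{i=1}^{n} m_i = \frac{nh}{2} = N, \qquad m_i = d_i - 1.$$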
Examples.
- acting irreducibly on the hyperplane in with equation is an cycle, so its eigenvalues are the roots of unity other than 1. So
This agrees with the above, because we have seen earlier that
- Here and so that The positive integers less than 30 and prime to 30 are
There are 8 of them, hence they are precisely the exponents of
-
so The positive integers less than 18 and prime to 18 are
So these are 6 of the exponents, and the other one must be by (1) above.
-
so Hence 1, 5, 7, 11 are four of the exponents. In fact, the other two are 4, 8, but one needs some extra information to deduce this. For example if one knew that
where are the two unknown exponents, then since we have
which together with gives and hence
- so again. Hence the exponents are 1, 5, 7, 11.
In fact the eigenvalues of the Cartan matrix
are
by an ingenious argument due to Coxeter [Bourbaki, p.140 Ex. 3,4].
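For reference, the standard form of this result (a reconstruction): the eigenvalues of the Cartan matrix are

$$4 \sin^2 \frac{\pi m_j}{2h}, \qquad 1 \le j \le n,$$

where $m_1,\dots,m_n$ are the exponents and $h$ is the Coxeter number.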
simple (or almost simple) compact Lie group, the root system of relative to a maximal torus Then (over the reals) has the same cohomology as a product of spheres of dimensions i.e., is an exterior algebra with generators of degrees
Chevalley group over
so
Let be a Coxeter element of and put
Thus
Let
We have
where
is the element of the Cartan matrix Thus
and therefore, summing from to
Likewise, summing from to
since
Let
so that Then (Inv 23), (Inv 24) show that the matrix expressing the in terms of the is and that expressing the in terms of the is Hence if is the matrix of the Coxeter element we have
and hence
Now in general if is an matrix, its determinant is a sum of terms
Express the permutation as a product of disjoint cycles: each cycle
in will contribute
to the monomial (Inv 25). Apply this observation to the determinant above: the Dynkin diagram contains no cycles, hence any containing cycles of length will give zero. In other words, any term in the expansion of this determinant that contains
must also contain (below the diagonal). Hence we have
This vanishes when
i.e. when
i.e. when
Since
it follows that
and hence the eigenvalues of are
The roots
are precisely the positive roots such that is negative (Root systems, 1.18). The formula (Inv 23) shows that they are a basis of Let be the orbit of
Then one can show that the orbits are pairwise disjoint, that each contains elements, and that
References
I.G. Macdonald
Isaac Newton Institute for Mathematical Sciences
20 Clarkson Road
Cambridge CB3 0EH, U.K.
Version: September 20, 2001