## Lectures in Representation Theory

Last update: 20 August 2013

## Lecture 11

Definition 2.16 For an integer $r>0$ we define the power symmetric function ${p}_{r}\left(x\right)$ to be $p_r(x)=p_r(x_1,x_2,\dots,x_n)=x_1^r+x_2^r+\cdots+x_n^r,$ and we extend the definition to sequences $\mu =\left({\mu }_{1},{\mu }_{2},\dots ,{\mu }_{k}\right)$ of positive integers by $p_\mu(x)=p_{\mu_1}(x)\,p_{\mu_2}(x)\cdots p_{\mu_k}(x).$
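As a quick numeric illustration (a sketch in Python, not part of the lecture; the names `p_r` and `p_mu` are ours), the power symmetric functions can be evaluated directly from the definition:

```python
from functools import reduce

def p_r(xs, r):
    # p_r(x_1, ..., x_n) = x_1^r + x_2^r + ... + x_n^r
    return sum(x**r for x in xs)

def p_mu(xs, mu):
    # p_mu = p_{mu_1} p_{mu_2} ... p_{mu_k}
    return reduce(lambda a, b: a * b, (p_r(xs, m) for m in mu), 1)

xs = [1, 2, 3]
print(p_r(xs, 2))        # 1 + 4 + 9 = 14
print(p_mu(xs, (2, 1)))  # p_2 * p_1 = 14 * 6 = 84
```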

As an immediate corollary of 2.14 and 2.15, we have

Corollary 2.17 If $\sigma \in {S}_{m}$ has cycle type $\mu \vdash m,$ then $\mathrm{wt}\left(\sigma \right)={p}_{\mu }\left(x\right)\text{.}$

### 2.2$\phantom{\rule{1em}{0ex}}$Symmetric Functions

Let $W={S}_{n}$ (we use $W$ for Weyl group), and define an action of $W$ on polynomials in $ℂ\left[{x}_{1},{x}_{2},\dots ,{x}_{n}\right]$ in the following way. Let $w\in W$ act on the monomial ${x}_{{i}_{1}}{x}_{{i}_{2}}\cdots {x}_{{i}_{n}}\in ℂ\left[{x}_{1},{x}_{2},\dots ,{x}_{n}\right]$ by $w\,x_{i_1}x_{i_2}\cdots x_{i_n}=x_{i_{w(1)}}x_{i_{w(2)}}\cdots x_{i_{w(n)}},$ and extend the action linearly to all of $ℂ\left[{x}_{1},{x}_{2},\dots ,{x}_{n}\right]\text{.}$ Notice that the action has the property that $w\,x_{i_1}^{\lambda_1}x_{i_2}^{\lambda_2}\cdots x_{i_n}^{\lambda_n}=x_{i_{w(1)}}^{\lambda_1}x_{i_{w(2)}}^{\lambda_2}\cdots x_{i_{w(n)}}^{\lambda_n}=x_{i_1}^{\lambda_{w^{-1}(1)}}x_{i_2}^{\lambda_{w^{-1}(2)}}\cdots x_{i_n}^{\lambda_{w^{-1}(n)}}.$ Moreover, for all $w\in W$ we have $w\,p_r(x)=p_r(x)$ and $w\,p_\mu(x)=p_\mu(x).$
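The identity $w\,x^\lambda=x_{i_1}^{\lambda_{w^{-1}(1)}}\cdots x_{i_n}^{\lambda_{w^{-1}(n)}}$ and the invariance of $p_r$ can be checked directly by representing a polynomial as a dictionary from exponent tuples to coefficients (a small sketch with our own naming; permutations of $\{0,\dots,n-1\}$ stand in for $W={S}_{n}$):

```python
from itertools import permutations

def act(w, expts):
    # (w . x^lam) carries exponent lam_{w^{-1}(i)} in position i
    winv = [0] * len(w)
    for i, wi in enumerate(w):
        winv[wi] = i
    return tuple(expts[winv[i]] for i in range(len(w)))

# p_2 = x1^2 + x2^2 + x3^2, stored as {exponent tuple: coefficient}
p2 = {(2, 0, 0): 1, (0, 2, 0): 1, (0, 0, 2): 1}
for w in permutations(range(3)):
    assert {act(w, e): c for e, c in p2.items()} == p2  # w p_2 = p_2
print("p_2 is W-invariant")
```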

Definition 2.18 A polynomial $f\left(x\right)\in ℂ\left[{x}_{1},{x}_{2},\dots ,{x}_{n}\right]$ is a symmetric polynomial if it satisfies $wf(x)=f(x)\text{ for all }w\in W,$ and a polynomial $g\left(x\right)\in ℂ\left[{x}_{1},{x}_{2},\dots ,{x}_{n}\right]$ is an alternating polynomial or a skew-symmetric polynomial if it satisfies $wg(x)=\epsilon(w)\,g(x)\text{ for all }w\in W.$

Let $\lambda =\left({\lambda }_{1},\cdots ,{\lambda }_{n}\right)$ be a sequence with each ${\lambda }_{i}\in ℕ,$ and let ${x}^{\lambda }$ denote the monomial ${x}_{1}^{{\lambda }_{1}}{x}_{2}^{{\lambda }_{2}}\cdots {x}_{n}^{{\lambda }_{n}}\text{.}$ We construct a symmetric function by “symmetrizing” ${x}^{\lambda }\text{.}$ Let $\text{Re}\left(\lambda \right)$ denote the set of all sequences in ${ℕ}^{n}$ that are rearrangements of the sequence $\lambda \text{.}$ Note that $\text{Re}\left(\lambda \right)$ is the $W\text{-orbit}$ $W\lambda$ of $\lambda$ in ${ℕ}^{n}\text{.}$ Then define the symmetric function ${m}_{\lambda }\left(x\right)$ by $m_\lambda(x)=\sum_{\mu\in\text{Re}(\lambda)}x^\mu=\sum_{\mu\in W\lambda}x^\mu.$ Notice that if $\nu \in \text{Re}\left(\lambda \right),$ then $m_\nu(x)=m_\lambda(x).$ Moreover, there is a unique $\nu \in \text{Re}\left(\lambda \right)$ such that $\nu_1\ge\nu_2\ge\cdots\ge\nu_n\ge 0.$ The polynomials $\left\{m_\nu(x)\ \middle|\ \nu=(\nu_1,\nu_2,\dots,\nu_n),\ \nu_1\ge\nu_2\ge\cdots\ge\nu_n\ge 0\right\}$ are called the monomial symmetric polynomials.

The symmetric polynomials in $ℂ\left[{x}_{1},\dots ,{x}_{n}\right]$ form a $ℂ\text{-vector}$ space which we denote by ${\Lambda }_{n}\text{.}$ If $f\left(x\right)$ is a symmetric polynomial, let ${c}_{\nu }$ denote the coefficient of ${x}^{\nu }$ in $f\left(x\right)\text{.}$ Then, since $f\left(x\right)$ is symmetric, if $\lambda \in \text{Re}\left(\nu \right),$ then ${c}_{\nu }$ is also the coefficient of ${x}^{\lambda }$ in $f\left(x\right)\text{.}$ Therefore, $f(x)=\sum_{\nu_1\ge\nu_2\ge\cdots\ge\nu_n\ge 0}c_\nu\,m_\nu(x),$ and the monomial symmetric polynomials form a basis of ${\Lambda }_{n}\text{.}$
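For instance, for $n=3$ one has the expansion $p_1(x)^2=m_{(2,0,0)}(x)+2\,m_{(1,1,0)}(x),$ which can be checked numerically (a sketch; `m_lambda` is our own helper computing the monomial symmetric polynomial from the definition above):

```python
from itertools import permutations

def m_lambda(lam, xs):
    # m_lambda(x) = sum of x^mu over the distinct rearrangements mu of lam
    total = 0
    for mu in set(permutations(lam)):
        term = 1
        for x, e in zip(xs, mu):
            term *= x ** e
        total += term
    return total

xs = [1, 2, 3]
p1 = sum(xs)
# p_1^2 = m_{(2,0,0)} + 2 m_{(1,1,0)} for n = 3
assert p1**2 == m_lambda((2, 0, 0), xs) + 2 * m_lambda((1, 1, 0), xs)
print(m_lambda((2, 1, 0), xs))  # 48
```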

Analogously, the alternating polynomials in $ℂ\left[{x}_{1},\dots ,{x}_{n}\right]$ form a $ℂ\text{-vector}$ space which we denote by ${A}_{n}\text{.}$ To find a basis for ${A}_{n},$ we anti-symmetrize the monomial ${x}^{\lambda }\text{.}$ That is, we define $a_\lambda(x)=\sum_{w\in W}\epsilon(w)\,wx^\lambda,$ where, as before, $\epsilon \left(w\right)$ is the sign of $w\text{.}$ If $v\in W,$ then $v\,a_\lambda(x)=\sum_{w\in W}\epsilon(w)\,vwx^\lambda=\epsilon(v)\sum_{vw\in W}\epsilon(vw)\,vwx^\lambda=\epsilon(v)\,a_\lambda(x),$ and, therefore, ${a}_{\lambda }$ is alternating.

Lemma 2.19 Let $\lambda =\left({\lambda }_{1},{\lambda }_{2},\dots ,{\lambda }_{n}\right)$ with ${\lambda }_{i}\in ℕ,$ and suppose that ${\lambda }_{i}={\lambda }_{j}$ for some $i\ne j\text{.}$ Then, ${a}_{\lambda }\left(x\right)=0\text{.}$

 Proof. Let ${t}_{ij}\in W$ be the transposition that switches $i$ and $j\text{.}$ Since ${\lambda }_{i}={\lambda }_{j},$ we have ${t}_{ij}{x}^{\lambda }={x}^{\lambda },$ and therefore $a_\lambda(x)=\sum_{w\in W}\epsilon(w)\,w\,x_1^{\lambda_1}\cdots x_i^{\lambda_i}\cdots x_j^{\lambda_j}\cdots x_n^{\lambda_n}=\sum_{w\in W}\epsilon(w)\,wt_{ij}x^\lambda=\epsilon(t_{ij})\sum_{wt_{ij}\in W}\epsilon(wt_{ij})\,wt_{ij}x^\lambda=\epsilon(t_{ij})\,a_\lambda(x)=-a_\lambda(x),$ and, therefore, ${a}_{\lambda }\left(x\right)=0\text{.}$ $\square$
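Both the alternating property and Lemma 2.19 can be verified numerically for small $n$ (a sketch in Python; `sign` and `a_lambda` are our own helpers, with permutations of $\{0,\dots,n-1\}$ standing in for $W$):

```python
from itertools import permutations

def sign(w):
    # epsilon(w): parity of w computed from its cycle lengths
    s, seen = 1, set()
    for i in range(len(w)):
        if i in seen:
            continue
        j, clen = i, 0
        while j not in seen:
            seen.add(j)
            j, clen = w[j], clen + 1
        if clen % 2 == 0:
            s = -s
    return s

def a_lambda(lam, xs):
    # a_lambda(x) = sum_w epsilon(w) x_{w(1)}^{lam_1} ... x_{w(n)}^{lam_n}
    total = 0
    for w in permutations(range(len(lam))):
        term = sign(w)
        for k, e in enumerate(lam):
            term *= xs[w[k]] ** e
        total += term
    return total

xs = [1, 2, 3]
print(a_lambda((2, 1, 0), xs))  # det(x_i^{lam_j}) = (1-2)(1-3)(2-3) = -2
print(a_lambda((2, 2, 0), xs))  # lam_1 = lam_2, so Lemma 2.19 gives 0
```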

If $\nu \in \text{Re}\left(\lambda \right),$ then $±{a}_{\lambda }\left(x\right)={a}_{\nu }\left(x\right),$ so, as in the case of the symmetric polynomials, we see that $\left\{a_\nu(x)\ \middle|\ \nu=(\nu_1,\nu_2,\dots,\nu_n),\ \nu_1>\nu_2>\cdots>\nu_n\ge 0\right\}$ forms a basis of ${A}_{n}\text{.}$

Our next goal is to show that the alternating polynomials and the symmetric polynomials are essentially the same. In fact, we will describe a bijection between ${\Lambda }_{n}$ and ${A}_{n}\text{.}$ To do this we let $\delta=(n-1,n-2,\dots,2,1,0).$ Then if $\lambda \in {ℕ}^{n}$ with ${\lambda }_{1}>{\lambda }_{2}>\cdots >{\lambda }_{n}\ge 0,$ and $\mu=\lambda-\delta=\left(\lambda_1-(n-1),\ \lambda_2-(n-2),\ \dots,\ \lambda_{n-2}-2,\ \lambda_{n-1}-1,\ \lambda_n\right),$ we have $\mu_1\ge\mu_2\ge\cdots\ge\mu_n\ge 0\text{.}$

For example, suppose that $n=8,$ $\delta =\left(7,6,5,4,3,2,1,0\right),$ and $\lambda =\left(13,10,9,8,3,2,1,0\right)\text{.}$ Then $\mu =\lambda -\delta =\left(6,4,4,4,0,0,0,0\right)\text{.}$ We can picture this as follows.

    □ □ □ □ □ □ □ | □ □ □ □ □ □
    □ □ □ □ □ □ | □ □ □ □
    □ □ □ □ □ | □ □ □ □
    □ □ □ □ | □ □ □ □
    □ □ □ |
    □ □ |
    □ |
    |

The sequence $\delta$ is pictured to the left of the wall, the sequence $\mu$ is pictured to the right of the wall, and the row lengths of the entire picture give the sequence $\lambda \text{.}$ In this way we get a bijection between the index sets of the symmetric polynomials and the alternating polynomials.
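The shift by $\delta$ in this example can be checked in a couple of lines (a sketch; the variable names are ours):

```python
n = 8
delta = list(range(n - 1, -1, -1))   # (7, 6, 5, 4, 3, 2, 1, 0)
lam = [13, 10, 9, 8, 3, 2, 1, 0]     # strictly decreasing
mu = [l - d for l, d in zip(lam, delta)]
print(mu)                            # [6, 4, 4, 4, 0, 0, 0, 0]
# strictly decreasing lam gives weakly decreasing mu >= 0
assert all(mu[i] >= mu[i + 1] for i in range(n - 1)) and mu[-1] >= 0
```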

Now we define a map between ${\Lambda }_{n}$ and ${A}_{n}\text{.}$ To this end, we note that ${a}_{\lambda }\left(x\right)=\sum _{w\in {S}_{n}}\epsilon \left(w\right)w{x}_{1}^{{\lambda }_{1}}\cdots {x}_{n}^{{\lambda }_{n}}=\text{det}\left({x}_{i}^{{\lambda }_{j}}\right),$ where by $\left({x}_{i}^{{\lambda }_{j}}\right)$ we mean the $n$ by $n$ matrix whose $i,j\text{-entry}$ is given by ${x}_{i}^{{\lambda }_{j}}\text{.}$ This is called the Vandermonde determinant, and it satisfies the following

Theorem 2.20 [Weyl’s Denominator Formula] $a_\delta(x)=\det\left(x_i^{\delta_j}\right)=\prod_{1\le i<j\le n}(x_i-x_j).$

Proof.

We proceed in several steps.

Step 1. $\prod_{i<j}(x_i-x_j)$ divides ${a}_{\lambda }\left(x\right)$ for all ${\lambda }_{1}>{\lambda }_{2}>\cdots >{\lambda }_{n}\text{.}$

 Proof. We use the evaluation map to send both ${x}_{i}$ and ${x}_{j}$ to $\alpha \in ℂ\text{.}$ Then the $i\text{th}$ and $j\text{th}$ rows of the matrix $\left({x}_{i}^{{\lambda }_{j}}\right)$ are identical, and so ${a}_{\lambda }\left(x\right)=\text{det}\left({x}_{i}^{{\lambda }_{j}}\right)=0$ under this evaluation. This holds for all $\alpha \in ℂ,$ so ${a}_{\lambda }\left(x\right)$ is divisible by $\left({x}_{i}-{x}_{j}\right)\text{.}$ This argument holds for any pair $i<j,$ so the product $\prod_{i<j}(x_i-x_j)$ divides ${a}_{\lambda }\left(x\right)\text{.}$ $\square$

Step 2. The polynomial $\prod_{i<j}(x_i-x_j)$ is alternating.

 Proof. Write the product $\prod_{i<j}(x_i-x_j)$ as follows: $\begin{array}{l}(x_1-x_2)(x_1-x_3)\cdots(x_1-x_i)(x_1-x_{i+1})\cdots(x_1-x_{n-1})(x_1-x_n)\cdot\\ (x_2-x_3)(x_2-x_4)\cdots(x_2-x_i)(x_2-x_{i+1})(x_2-x_{i+2})\cdots(x_2-x_n)\cdot\\ \quad\vdots\\ (x_i-x_{i+1})(x_i-x_{i+2})(x_i-x_{i+3})\cdots(x_i-x_{n-1})(x_i-x_n)\cdot\\ (x_{i+1}-x_{i+2})(x_{i+1}-x_{i+3})(x_{i+1}-x_{i+4})\cdots(x_{i+1}-x_n)\cdot\\ \quad\vdots\\ (x_{n-1}-x_n).\end{array}$ (This is a trick of Littlewood.) Then consider the action of the simple transposition ${s}_{i},$ which switches ${x}_{i}$ and ${x}_{i+1}\text{.}$ We see that ${s}_{i}$ preserves all rows except the $i\text{th}$ and the $\left(i+1\right)\text{st.}$ Moreover, the second factor of the $i\text{th}$ row is exchanged with the 1st factor of the $\left(i+1\right)\text{st}$ row, the third factor of the $i\text{th}$ row is exchanged with the 2nd factor of the $\left(i+1\right)\text{st}$ row, and so on. The first factor of the $i\text{th}$ row, $(x_i-x_{i+1}),$ is sent to its negative. Therefore, $s_i\cdot\prod_{i<j}(x_i-x_j)=-\prod_{i<j}(x_i-x_j)=\epsilon(s_i)\prod_{i<j}(x_i-x_j).$ Since the result holds for each simple transposition ${s}_{i},$ and the ${s}_{i}$ generate ${S}_{n},$ it holds for the entire symmetric group ${S}_{n}\text{.}$ $\square$
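Although the proof is completed next lecture, the formula itself is easy to check numerically (a sketch using NumPy; the sample points are arbitrary):

```python
import numpy as np
from itertools import combinations

xs = np.array([2.0, 3.0, 5.0, 7.0])
n = len(xs)
delta = np.arange(n - 1, -1, -1)       # (n-1, n-2, ..., 1, 0)
V = xs[:, None] ** delta[None, :]      # V[i, j] = x_i ** delta_j
det = np.linalg.det(V)
prod = 1.0
for i, j in combinations(range(n), 2):
    prod *= xs[i] - xs[j]              # prod_{i<j} (x_i - x_j)
print(det, prod)                       # both approximately 240
assert abs(det - prod) < 1e-9 * abs(prod)
```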

Continued next lecture.


## Notes and References

This is a copy of lectures in Representation Theory given by Arun Ram, compiled by Tom Halverson, Rob Leduc and Mark McKinzie.