## Lectures in Representation Theory

Last update: 20 August 2013

## Lecture 12

Continued proof.

Step 3. The polynomial $\prod_{1\le i<j\le n}(x_i-x_j)$ is homogeneous of degree $\binom{n}{2}$.

Proof. Each monomial in the expansion of $\prod_{1\le i<j\le n}(x_i-x_j)$ is obtained by choosing a factor of either $x_i$ or $x_j$ from each factor $(x_i-x_j)$, for $1\le i<j\le n$. Therefore this polynomial is homogeneous of total degree $\left|\{(i,j)\mid 1\le i<j\le n\}\right| = \binom{n}{2}$. $\square$
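The counting argument in Step 3 can be checked directly for small $n$; the following sketch (not part of the lectures) expands $\prod_{i<j}(x_i-x_j)$ over exponent tuples and verifies that every surviving monomial has total degree $\binom{n}{2}$.

```python
# Illustrative check of Step 3: expand prod_{1<=i<j<=n} (x_i - x_j) as a
# dictionary {exponent tuple: coefficient} and verify homogeneity.
from itertools import combinations

def vandermonde_expansion(n):
    """Expand prod_{1<=i<j<=n} (x_i - x_j) over exponent tuples."""
    poly = {tuple([0] * n): 1}  # the empty product is 1
    for i, j in combinations(range(n), 2):
        new = {}
        for exp, c in poly.items():
            for k, sign in ((i, 1), (j, -1)):  # choose x_i or -x_j
                e = list(exp)
                e[k] += 1
                e = tuple(e)
                new[e] = new.get(e, 0) + sign * c
        poly = {e: c for e, c in new.items() if c != 0}
    return poly

n = 4
poly = vandermonde_expansion(n)
degrees = {sum(e) for e in poly}
print(degrees)  # {6}, i.e. every monomial has degree n(n-1)/2 = C(n,2)
```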

Step 4. The coefficient of $x^{\lambda+\delta}$ in $a_{\lambda+\delta}(x)$ is one.

Proof. Observe that $\lambda+\delta$ has distinct parts, so $wx^{\lambda+\delta}=x^{\lambda+\delta}$ implies $w=1$. Then, writing $\cdot\,\big|_{x^{\lambda+\delta}}$ for the coefficient of $x^{\lambda+\delta}$,
$$a_{\lambda+\delta}(x)\Big|_{x^{\lambda+\delta}} = \sum_{w\in W}\varepsilon(w)\,wx^{\lambda+\delta}\Big|_{x^{\lambda+\delta}} = 1.$$
$\square$
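Step 4 is easy to verify by brute force; this sketch (not from the lectures) builds $a_{\lambda+\delta}(x)=\sum_{w}\varepsilon(w)\,x^{w(\lambda+\delta)}$ for a sample partition and reads off the coefficient of $x^{\lambda+\delta}$.

```python
# Illustrative check of Step 4: the coefficient of x^{lambda+delta} in
# a_{lambda+delta}(x) is 1 because lambda+delta has distinct parts.
from itertools import permutations

def alternant(mu):
    """a_mu(x) = sum over w of sign(w) x^{w(mu)}, as {exponent tuple: coeff}."""
    n = len(mu)
    poly = {}
    for perm in permutations(range(n)):
        # sign of the permutation via inversion count
        inv = sum(1 for a in range(n) for b in range(a + 1, n) if perm[a] > perm[b])
        exp = tuple(mu[perm[k]] for k in range(n))
        poly[exp] = poly.get(exp, 0) + (-1) ** inv
    return {e: c for e, c in poly.items() if c != 0}

lam, delta = (2, 1, 0, 0), (3, 2, 1, 0)       # sample lambda, and delta = (n-1,...,1,0)
mu = tuple(l + d for l, d in zip(lam, delta))  # lambda + delta = (5, 3, 1, 0): distinct parts
a = alternant(mu)
print(a[mu])  # 1: only w = id fixes the strictly decreasing exponent tuple
```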

Step 5. $\displaystyle\Bigl(\prod_{1\le i<j\le n}(x_i-x_j)\Bigr)\Big|_{x^{\delta}} = 1$; that is, the coefficient of $x^{\delta}$ in $\prod_{1\le i<j\le n}(x_i-x_j)$ is one.

Proof. To obtain the monomial $x^{\delta}=x_1^{n-1}x_2^{n-2}\cdots x_{n-1}$ as a term in the expansion of $\prod_{1\le i<j\le n}(x_i-x_j)$, it is necessary to choose the factor $x_1$ from every factor of the form $(x_1-x_j)$, $j>1$. The only remaining sources of factors $x_2$ are then the factors of the form $(x_2-x_j)$, $j>2$, from each of which we must choose $x_2$. Proceeding in this manner, we are forced to choose $x_i$ from every factor $(x_i-x_j)$ with $i<j$, for $1\le i\le n-1$. Hence there is exactly one monomial equal to $x^{\delta}$ in the expansion of this product, and it occurs with coefficient one. $\square$
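The forced-choice argument in Step 5 can be confirmed by enumerating all ways of picking one term from each factor; this sketch (not from the lectures) does so for $n=4$.

```python
# Illustrative check of Step 5: the coefficient of
# x^delta = x_1^{n-1} x_2^{n-2} ... x_{n-1} in prod_{i<j} (x_i - x_j) is one.
from itertools import combinations, product

n = 4
delta = tuple(n - 1 - k for k in range(n))  # (3, 2, 1, 0)
pairs = list(combinations(range(n), 2))     # the factors (x_i - x_j), i < j

coeff = 0
# each choice picks x_i (sign +1) or -x_j (sign -1) from every factor
for choice in product((0, 1), repeat=len(pairs)):
    exp = [0] * n
    sign = 1
    for (i, j), pick in zip(pairs, choice):
        if pick == 0:
            exp[i] += 1
        else:
            exp[j] += 1
            sign = -sign
    if tuple(exp) == delta:
        coeff += sign
print(coeff)  # 1: only the all-x_i choice produces x^delta
```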


We now conclude the proof of the Weyl Denominator Formula for Type A.

Proof [WDF]. The polynomials $a_{\lambda+\delta}(x)$ form a basis for the space of alternating polynomials, hence
$$\prod_{1\le i<j\le n}(x_i-x_j) = \sum_{\lambda}c_{\lambda}\,a_{\lambda+\delta}(x)$$
for some $c_{\lambda}\in\mathbb{C}$. The left-hand side is homogeneous of degree $\binom{n}{2}$ by Step 3; however, among the $a_{\lambda+\delta}$, only $a_{\delta}$ (the case $\lambda=0$) has this degree, by Step 1. Thus
$$\prod_{1\le i<j\le n}(x_i-x_j) = c_{\delta}\,a_{\delta}(x).$$
Comparing coefficients of $x^{\delta}$ using Steps 4 (with $\lambda=0$) and 5 yields $c_{\delta}=1$. $\square$
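As a sanity check of the identity $a_{\delta}(x)=\prod_{i<j}(x_i-x_j)$ just proved, the following sketch (an illustration, not part of the lectures) evaluates both sides at sample points, using exact `Fraction` arithmetic.

```python
# Exact numerical check of the type-A Weyl denominator formula:
# a_delta(x) = prod_{i<j} (x_i - x_j) at sample rational points.
from fractions import Fraction
from itertools import combinations, permutations

def a_delta(xs):
    """a_delta(x) = sum_w sign(w) prod_k x_{w(k)}^{n-1-k}, with delta = (n-1,...,1,0)."""
    n = len(xs)
    total = Fraction(0)
    for perm in permutations(range(n)):
        inv = sum(1 for a in range(n) for b in range(a + 1, n) if perm[a] > perm[b])
        term = Fraction((-1) ** inv)
        for k in range(n):
            term *= xs[perm[k]] ** (n - 1 - k)
        total += term
    return total

xs = [Fraction(1, 2), Fraction(3), Fraction(-2, 5), Fraction(7, 3)]
vdm = Fraction(1)
for i, j in combinations(range(len(xs)), 2):
    vdm *= xs[i] - xs[j]
print(a_delta(xs) == vdm)  # True
```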

Definition 2.21 The Schur function $s_{\lambda}(x)$ associated to the partition $\lambda\vdash n$ is the symmetric function defined by
$$s_{\lambda}(x) = \frac{a_{\lambda+\delta}(x)}{a_{\delta}(x)}.$$

Note that the Schur function is indeed a symmetric function. If $f(x)=\sum_{\alpha}f_{\alpha}x^{\alpha}$ and $g(x)=\sum_{\beta}g_{\beta}x^{\beta}$ are arbitrary polynomials in $\mathbb{C}[x_1,x_2,\dots,x_n]$, then for all $w\in W$
$$w\cdot(fg) = \sum_{\alpha,\beta}f_{\alpha}g_{\beta}\,w\cdot x^{\alpha+\beta} = \sum_{\alpha,\beta}f_{\alpha}g_{\beta}\,x^{w\alpha+w\beta} = \Bigl(\sum_{\alpha}f_{\alpha}x^{w\alpha}\Bigr)\Bigl(\sum_{\beta}g_{\beta}x^{w\beta}\Bigr) = (w\cdot f)(w\cdot g).$$
In particular, $w\cdot\bigl(a_{\delta}(x)s_{\lambda}(x)\bigr) = \varepsilon(w)\,a_{\delta}(x)\,\bigl(w\cdot s_{\lambda}(x)\bigr)$. However,
$$w\cdot\bigl(a_{\delta}(x)s_{\lambda}(x)\bigr) = w\cdot a_{\lambda+\delta}(x) = \varepsilon(w)\,a_{\lambda+\delta}(x),$$
from which it follows that $w\cdot s_{\lambda}(x)=s_{\lambda}(x)$ for all $w\in W$.
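The symmetry of $s_{\lambda}$ can also be observed numerically; this sketch (not from the lectures) evaluates $s_{\lambda}(x)=a_{\lambda+\delta}(x)/a_{\delta}(x)$ at sample points and checks that the value is unchanged under every permutation of the variables.

```python
# Illustrative check: s_lambda(x) = a_{lambda+delta}(x) / a_delta(x) takes the
# same value for every ordering of the sample points, i.e. it is symmetric.
from fractions import Fraction
from itertools import permutations

def alternant_eval(mu, xs):
    """Evaluate a_mu(x) = sum_w sign(w) prod_k x_{w(k)}^{mu_k}."""
    n = len(xs)
    total = Fraction(0)
    for perm in permutations(range(n)):
        inv = sum(1 for a in range(n) for b in range(a + 1, n) if perm[a] > perm[b])
        term = Fraction((-1) ** inv)
        for k in range(n):
            term *= xs[perm[k]] ** mu[k]
        total += term
    return total

def schur_eval(lam, xs):
    n = len(xs)
    delta = tuple(n - 1 - k for k in range(n))
    mu = tuple(l + d for l, d in zip(lam, delta))
    return alternant_eval(mu, xs) / alternant_eval(delta, xs)

lam = (2, 1, 0)  # lambda = (2,1) padded to n = 3 parts
xs = [Fraction(2), Fraction(1, 3), Fraction(-1, 2)]
vals = {schur_eval(lam, list(p)) for p in permutations(xs)}
print(len(vals))  # 1: the same value for every ordering of the variables
```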

Moreover, we may define a linear map $\Lambda^n\to A^n$ by sending a symmetric function $f(x)$ to $a_{\delta}(x)f(x)\in A^n$. The inverse map $A^n\to\Lambda^n$ defined by $g(x)\mapsto \frac{g(x)}{a_{\delta}(x)}$ is well defined, since the polynomials $a_{\lambda+\delta}(x)$ form a basis for $A^n$ and each is divisible by $a_{\delta}(x)$. Hence this map is a vector space isomorphism $\Lambda^n\cong A^n$. Furthermore, since the Schur functions map onto a basis of $A^n$, we have

Proposition 2.22 The Schur functions $\{s_{\lambda}(x)\mid\lambda\vdash n\}$ form a basis for the vector space $\Lambda^n$.

Remark. This works for any finite Weyl group $W\text{.}$

We will next establish a very interesting relationship between the Schur functions and the power symmetric functions. First, we will need the following determinant formula due to Cauchy.

Lemma 2.23 (Cauchy’s Determinant)
$$\left|\frac{1}{1-x_iy_j}\right|_{1\le i,j\le n} = \frac{\displaystyle\prod_{1\le i<j\le n}(x_i-x_j)(y_i-y_j)}{\displaystyle\prod_{i,j=1}^{n}(1-x_iy_j)}.$$

Proof. We roll up our sleeves and calculate, proceeding by induction on $n$; the case $n=1$ is trivial. Let $\Delta = \left|\frac{1}{1-x_iy_j}\right|_{1\le i,j\le n}$. Subtract the first row from all other rows; for $2\le i\le n$ the $(i,j)$ entry becomes
$$\frac{1}{1-x_iy_j} - \frac{1}{1-x_1y_j} = \frac{(x_i-x_1)\,y_j}{(1-x_iy_j)(1-x_1y_j)}.$$
Hence we may pull out a common factor of $(x_i-x_1)$ from row $i$ for $i\ge 2$. Note that the product of these common factors may be written $(-1)^{n-1}\prod_{i=2}^{n}(x_1-x_i)$. The determinant then becomes
$$\Delta = (-1)^{n-1}\prod_{i=2}^{n}(x_1-x_i)\begin{vmatrix} \frac{1}{1-x_1y_1} & \frac{1}{1-x_1y_2} & \cdots & \frac{1}{1-x_1y_n} \\ \frac{y_1}{(1-x_2y_1)(1-x_1y_1)} & \frac{y_2}{(1-x_2y_2)(1-x_1y_2)} & \cdots & \frac{y_n}{(1-x_2y_n)(1-x_1y_n)} \\ \vdots & & & \vdots \\ \frac{y_1}{(1-x_ny_1)(1-x_1y_1)} & \frac{y_2}{(1-x_ny_2)(1-x_1y_2)} & \cdots & \frac{y_n}{(1-x_ny_n)(1-x_1y_n)} \end{vmatrix}.$$
Extracting a common factor of $(1-x_1y_j)^{-1}$ from the $j$th column for $1\le j\le n$, we obtain
$$\Delta = (-1)^{n-1}\prod_{i=2}^{n}(x_1-x_i)\prod_{j=1}^{n}\frac{1}{1-x_1y_j}\begin{vmatrix} 1 & 1 & \cdots & 1 \\ \frac{y_1}{1-x_2y_1} & \frac{y_2}{1-x_2y_2} & \cdots & \frac{y_n}{1-x_2y_n} \\ \vdots & & & \vdots \\ \frac{y_1}{1-x_ny_1} & \frac{y_2}{1-x_ny_2} & \cdots & \frac{y_n}{1-x_ny_n} \end{vmatrix}.$$
Next subtract column one from each of the remaining columns. For rows $2\le i\le n$, the $(i,j)$ entry becomes
$$\frac{y_j}{1-x_iy_j} - \frac{y_1}{1-x_iy_1} = \frac{y_j-y_1}{(1-x_iy_j)(1-x_iy_1)}.$$
Hence we may extract a factor of $(-1)(y_1-y_j)$ from column $j$ for $2\le j\le n$. Combining these with the factor of $(-1)^{n-1}$, we obtain
$$\begin{aligned}\Delta &= \prod_{i=2}^{n}(x_1-x_i)(y_1-y_i)\prod_{j=1}^{n}\frac{1}{1-x_1y_j}\begin{vmatrix} 1 & 0 & \cdots & 0 \\ \frac{y_1}{1-x_2y_1} & \frac{1}{(1-x_2y_2)(1-x_2y_1)} & \cdots & \frac{1}{(1-x_2y_n)(1-x_2y_1)} \\ \vdots & & & \vdots \\ \frac{y_1}{1-x_ny_1} & \frac{1}{(1-x_ny_2)(1-x_ny_1)} & \cdots & \frac{1}{(1-x_ny_n)(1-x_ny_1)} \end{vmatrix} \\ &= \prod_{i=2}^{n}(x_1-x_i)(y_1-y_i)\prod_{j=1}^{n}\frac{1}{1-x_1y_j}\left|\frac{1}{(1-x_iy_j)(1-x_iy_1)}\right|_{2\le i,j\le n}\end{aligned}$$
by expanding along the first row. We may pull out a factor of $(1-x_iy_1)^{-1}$ from each row $(2\le i\le n)$; it then follows from the inductive hypothesis applied to the variables $x_2,x_3,\dots,x_n,y_2,y_3,\dots,y_n$ that
$$\begin{aligned}\Delta &= \prod_{i=2}^{n}(x_1-x_i)(y_1-y_i)\prod_{j=1}^{n}\frac{1}{1-x_1y_j}\prod_{i=2}^{n}\frac{1}{1-x_iy_1}\left|\frac{1}{1-x_iy_j}\right|_{2\le i,j\le n} \\ &= \prod_{i=2}^{n}(x_1-x_i)(y_1-y_i)\prod_{j=1}^{n}\frac{1}{1-x_1y_j}\prod_{i=2}^{n}\frac{1}{1-x_iy_1}\cdot\frac{\prod_{2\le i<j\le n}(x_i-x_j)(y_i-y_j)}{\prod_{2\le i,j\le n}(1-x_iy_j)} \\ &= \frac{\prod_{1\le i<j\le n}(x_i-x_j)(y_i-y_j)}{\prod_{1\le i,j\le n}(1-x_iy_j)}.\end{aligned}$$
$\square$
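Cauchy's determinant identity can be verified exactly for small $n$; this sketch (an illustration, not part of the lectures) compares both sides at sample rational points using cofactor expansion and `Fraction` arithmetic.

```python
# Exact check of Cauchy's determinant for n = 3 at sample rational points.
from fractions import Fraction
from itertools import combinations

def det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    total = Fraction(0)
    for j in range(len(m)):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

xs = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 5)]
ys = [Fraction(1, 7), Fraction(2, 3), Fraction(-1, 4)]
n = len(xs)

lhs = det([[1 / (1 - xs[i] * ys[j]) for j in range(n)] for i in range(n)])

num = Fraction(1)
for i, j in combinations(range(n), 2):
    num *= (xs[i] - xs[j]) * (ys[i] - ys[j])
den = Fraction(1)
for i in range(n):
    for j in range(n):
        den *= 1 - xs[i] * ys[j]
print(lhs == num / den)  # True
```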

## Notes and References

This is a copy of lectures in Representation Theory given by Arun Ram, compiled by Tom Halverson, Rob Leduc and Mark McKinzie.