## Complex Reflection Groups

Last update: 10 January 2014

## Section I

Let $V$ be a complex vector space and let $H$ be a Hermitian form on $V$ (i.e., $H(x,y)=\overline{H(y,x)}$). Suppose $a\in V$ with $H(a,a)\ne 0$ and $\lambda\in\mathbb{C}\setminus\{0\}$. Define $R_{a,\lambda}:V\to V$ by
$$R_{a,\lambda}(v)=v+(\lambda-1)\,\frac{H(v,a)}{H(a,a)}\,a\qquad(v\in V).$$
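The map $R_{a,\lambda}$ is easy to realize numerically. The sketch below is mine (not from the thesis): it uses the standard Hermitian form $H(v,w)=\sum_i v_i\overline{w}_i$ on $\mathbb{C}^2$ and arbitrary sample data, and checks that $R_{a,\lambda}(a)=\lambda a$ while $a^{\perp}$ is fixed pointwise.

```python
# A minimal numerical sketch (mine, not the thesis's) of R_{a,lambda}, using the
# standard Hermitian form H(v, w) = sum_i v_i conj(w_i) on C^2 and sample data.
import numpy as np

def H(v, w):
    return np.vdot(w, v)            # np.vdot conjugates its FIRST argument

def R_matrix(a, lam):
    """Matrix of R_{a,lam}(v) = v + (lam - 1) H(v, a)/H(a, a) a."""
    a = np.asarray(a, dtype=complex)
    # H(v, a) = conj(a)^T v, so the map is I + (lam - 1) a conj(a)^T / H(a, a)
    return np.eye(len(a)) + (lam - 1) * np.outer(a, a.conj()) / H(a, a)

a = np.array([1.0, 2.0], dtype=complex)
lam = np.exp(2j * np.pi / 3)        # a primitive cube root of unity
Ra = R_matrix(a, lam)
assert np.allclose(Ra @ a, lam * a)                            # R_{a,lam}(a) = lam a
b = np.array([2.0, -1.0], dtype=complex)                       # H(b, a) = 0
assert np.allclose(Ra @ b, b)                                  # a-perp is fixed pointwise
assert np.allclose(np.linalg.matrix_power(Ra, 3), np.eye(2))   # a unitary reflection of order 3
```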

Proposition 1.

(a) $R_{a,\lambda}\in GL(V)$.

(b) $R_{a,\lambda}\cdot R_{a,\mu}=R_{a,\lambda\mu}$.

(c) $H(R_{a,\lambda}(v),R_{a,\lambda}(w))=H(v,w)$ for all $v,w\in V$ $\Leftrightarrow$ $\lambda\overline{\lambda}=1$.

(d) If one of $\lambda$ or $\mu$ is not $1$, then $R_{a,\lambda}=R_{b,\mu}$ $\Leftrightarrow$ $\mathbb{C}a=\mathbb{C}b$ and $\lambda=\mu$.

(e) If $T\in GL(V)$ and $H(Tv,Tw)=H(v,w)$ for all $v,w\in V$, then $TR_{a,\lambda}T^{-1}=R_{Ta,\lambda}$.

Proof. Note that $R_{a,\lambda}(a)=\lambda a$ and if $v\in a^{\perp}$ (i.e., if $H(v,a)=0$), then $R_{a,\lambda}(v)=v$. The linearity of $R_{a,\lambda}$ is obvious. Since $R_{a,1}=I$, (b) implies that $R_{a,\lambda}$ is invertible, and thus (a) follows from (b). To prove (b) we let $v\in V$ and compute
$$\begin{aligned}
R_{a,\lambda}R_{a,\mu}(v)&=R_{a,\lambda}\Bigl(v+(\mu-1)\frac{H(v,a)}{H(a,a)}a\Bigr)\\
&=v+(\lambda-1)\frac{H(v,a)}{H(a,a)}a+(\mu-1)\frac{H(v,a)}{H(a,a)}\cdot\lambda a\\
&=v+(\lambda\mu-1)\frac{H(v,a)}{H(a,a)}\cdot a\\
&=R_{a,\lambda\mu}(v).
\end{aligned}$$
For (c) we let $v,w\in V$ and compute
$$H(R_{a,\lambda}(v),R_{a,\lambda}(w))=H\Bigl(v+(\lambda-1)\frac{H(v,a)}{H(a,a)}a,\;w+(\lambda-1)\frac{H(w,a)}{H(a,a)}a\Bigr)=H(v,w)+(\lambda\overline{\lambda}-1)\frac{H(a,w)H(v,a)}{H(a,a)},$$
which implies (c). In (d) we suppose without loss of generality that $\lambda\ne 1$. To prove the implication from left to right we evaluate both sides of $R_{a,\lambda}=R_{b,\mu}$ at $a$ to obtain
$$(1)\qquad \lambda a=a+(\mu-1)\frac{H(a,b)}{H(b,b)}b\quad\text{or}\quad(\lambda-1)a=(\mu-1)\frac{H(a,b)}{H(b,b)}\cdot b.$$
Thus $\lambda-1\ne 0$ implies there is a $\gamma\in\mathbb{C}$ with $b=\gamma a$. Inserting this in (1) we obtain $\lambda-1=\mu-1$ and hence $\lambda=\mu$. The implication from right to left is trivial. For (e) we let $v\in V$ and compute
$$\begin{aligned}
TR_{a,\lambda}T^{-1}(v)&=T\Bigl(T^{-1}v+(\lambda-1)\frac{H(T^{-1}v,a)}{H(a,a)}a\Bigr)\\
&=v+(\lambda-1)\frac{H(T^{-1}v,a)}{H(a,a)}Ta\\
&=v+(\lambda-1)\frac{H(v,Ta)}{H(Ta,Ta)}Ta\\
&=R_{Ta,\lambda}(v).
\end{aligned}$$
$\square$
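Parts (b), (c), and (e) can also be spot-checked numerically. The sketch below is my own, using the standard Hermitian form on $\mathbb{C}^2$ (for which $H$-preserving maps are the unitary matrices) and arbitrary sample vectors and scalars.

```python
# Numerical spot-checks (mine, not the thesis's) of Proposition 1 (b), (c), (e),
# with the standard Hermitian form on C^2 and arbitrary sample data.
import numpy as np

def H(v, w):
    return np.vdot(w, v)   # sum_i v_i conj(w_i)

def R(a, lam):
    a = np.asarray(a, dtype=complex)
    return np.eye(len(a)) + (lam - 1) * np.outer(a, a.conj()) / H(a, a)

a = np.array([1 + 1j, 2.0])
lam, mu = np.exp(2j * np.pi / 5), np.exp(2j * np.pi / 7)

# (b): R_{a,lam} R_{a,mu} = R_{a,lam mu}
assert np.allclose(R(a, lam) @ R(a, mu), R(a, lam * mu))

# (c): |lam| = 1 makes R_{a,lam} preserve H, i.e. unitary for this form
Ra = R(a, lam)
assert np.allclose(Ra.conj().T @ Ra, np.eye(2))

# (e): T R_{a,lam} T^{-1} = R_{Ta,lam} for an H-preserving (here: unitary) T
rng = np.random.default_rng(0)
T, _ = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))
assert np.allclose(T @ Ra @ np.linalg.inv(T), R(T @ a, lam))
```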

Definition. If $\lambda$ is a primitive $p^{\text{th}}$ root of unity we call $R_{a,\lambda}$ a unitary reflection of order $p$. Writing $V=\mathbb{C}a\oplus a^{\perp}$ we see that there is a basis in which the matrix for $R_{a,\lambda}$ is
$$\begin{pmatrix}\lambda&&&\\&1&&\\&&\ddots&\\&&&1\end{pmatrix}.$$

Lemma 1: Let $\Gamma \in {𝒞}_{\ell }\text{.}$ Suppose for some pair of distinct integers $1\le i,j\le \ell$ there is an edge joining the ${i}^{\text{th}}$ and ${j}^{\text{th}}$ vertices. Then ${\text{cos}}^{2}\left(\pi /{q}_{ij}\right)-{\text{sin}}^{2}\left(\frac{\pi }{2{p}_{i}}-\frac{\pi }{2{p}_{j}}\right)\ge 0\text{.}$

Proof. Let $a=\pi/q_{ij}$ and $b=|\frac{\pi}{2p_i}-\frac{\pi}{2p_j}|$, so that $a,b\ge 0$. If $q_{ij}$ is odd then $p_i=p_j$ and $b=0$; if $q_{ij}$ is even then $a\le\pi/4$ and $b<\pi/4$. In either case $a+b\le\pi/2$. Hence $\cos(a)\ge\cos(\pi/2-b)=\sin(b)$, yielding the lemma. $\square$
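As a finite sanity check (my own, not in the thesis), one can scan a range of labels and confirm the quantity in Lemma 1 is never negative, using the convention stated with Table 1 below that $p_i=p_j$ whenever $q_{ij}$ is odd:

```python
# Exhaustive finite check (not in the thesis) of the inequality in Lemma 1,
# assuming the convention that p_i = p_j when the edge label q_ij is odd.
import math

for q in range(3, 40):
    for pi in range(2, 40):
        for pj in range(2, 40):
            if q % 2 == 1 and pi != pj:
                continue  # odd edge labels force equal vertex labels
            val = math.cos(math.pi/q)**2 - math.sin(math.pi/(2*pi) - math.pi/(2*pj))**2
            assert val >= -1e-12, (q, pi, pj)
```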

Let $\Gamma\in{𝒞}_{\ell}$ and let $V$ be an $\ell$-dimensional complex vector space with basis $\{v_1,\dots,v_\ell\}$. We can associate with $\Gamma$ a Hermitian form $H=H(\Gamma)$ on $V$ as follows. If $v=\sum\lambda_iv_i$ and $w=\sum\mu_iv_i$ are vectors in $V$, we define
$$H(v,w)=\sum_{i,j}\alpha_{ij}\lambda_i\overline{\mu}_j$$
where $A=(\alpha_{ij})$ is the matrix with
$$\alpha_{ii}=\sin(\pi/p_i),\qquad \alpha_{ij}=-\Bigl\{\cos^2(\pi/q_{ij})-\sin^2\Bigl(\frac{\pi}{2p_i}-\frac{\pi}{2p_j}\Bigr)\Bigr\}^{\frac{1}{2}}$$
if there is an edge joining the $i^{\text{th}}$ and $j^{\text{th}}$ vertices, and $\alpha_{ij}=0$ otherwise [Cox1967, pp. 129-130]. Notice that the quantity under the radical is nonnegative by Lemma 1, so that $A$ is a real symmetric matrix.
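The construction of $A=(\alpha_{ij})$ can be sketched as a small helper. This is my own illustration, not thesis code; the function name and the input encoding of the graph (a list of vertex labels and a dictionary of edge labels) are my choices, and the sample graph is two vertices labelled $3$ joined by an edge labelled $4$.

```python
# A sketch (mine) of building the real symmetric matrix A = (alpha_ij) from a
# labelled graph.  p: list of vertex labels p_i; q: dict {(i, j): q_ij}, i < j.
import math

def alpha_matrix(p, q):
    n = len(p)
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = math.sin(math.pi / p[i])          # diagonal entries sin(pi/p_i)
    for (i, j), qij in q.items():
        c2 = math.cos(math.pi/qij)**2 - math.sin(math.pi/(2*p[i]) - math.pi/(2*p[j]))**2
        A[i][j] = A[j][i] = -math.sqrt(c2)          # c2 >= 0 by Lemma 1
    return A

A = alpha_matrix([3, 3], {(0, 1): 4})
assert abs(A[0][0] - math.sin(math.pi/3)) < 1e-12
assert abs(A[0][1] + math.sqrt(0.5)) < 1e-12        # -{cos^2(pi/4)}^{1/2}
```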

We fix a notation as follows: For $\Gamma \in {𝒞}_{\ell },$ we let $W=W\left(\Gamma \right)$ denote the group given by the presentation corresponding to $\Gamma$ and let ${r}_{1},\dots ,{r}_{\ell }$ be the generators corresponding to the vertices of $\Gamma \text{.}$ Let ${\epsilon }_{j}={e}^{2\pi i/{p}_{j}},$ $1\le j\le \ell ,$ and define ${S}_{j}={R}_{{v}_{j},{\epsilon }_{j}},$ $1\le j\le \ell ,$ using the Hermitian form $H=H\left(\Gamma \right)$ and the vector space $V$ with basis $\left\{{v}_{1},\dots ,{v}_{\ell }\right\}$ as described above. Finally let $G=G\left(\Gamma \right)=⟨{S}_{1},\dots ,{S}_{\ell }⟩\le GL\left(V\right)\text{.}$

Theorem 1: Let $\Gamma \in {𝒞}_{\ell },$ $W=W\left(\Gamma \right),$ and $G=G\left(\Gamma \right)\text{.}$ The correspondence ${r}_{i}\to {S}_{i}$ can be extended to a homomorphism $\theta$ of $W$ onto $G\text{.}$

Proof. It suffices to show that for each $1\le i\le\ell$ we have $S_i^{p_i}=I$, and that for each pair of distinct integers $1\le i,j\le\ell$ we have $S_iS_jS_i\cdots=S_jS_iS_j\cdots$ with $q_{ij}$ factors on each side. The first of these conditions is obviously satisfied from the definition of $S_i$; in fact $S_i$ has order $p_i$. The second is vacuously satisfied if the graph $\Gamma$ has only one vertex. So, to begin, we assume the graph $\Gamma$ has two vertices. If the vertices are not joined by an edge then, using $\{v_1,v_2\}$ as a basis, the matrices for $S_1$ and $S_2$ are both diagonal and hence commute. So we can assume $\Gamma$ consists of two vertices labelled $p_1$ and $p_2$ joined by an edge labelled $q$ (where $q=q_{12}$). The verification of the theorem for this graph occurs in [Cox1962]. We will give a somewhat different argument here. We first compute the eigenvalues of $S_1S_2$. The characteristic equation of $S_1S_2$ is $\lambda^2-\operatorname{tr}(S_1S_2)\lambda+\det(S_1S_2)=0$. Now in the basis $\{v_1,v_2\}$
$$S_1=\begin{pmatrix}\epsilon_1&\dfrac{(1-\epsilon_1)c}{\sin(\pi/p_1)}\\[2pt]0&1\end{pmatrix},\qquad S_2=\begin{pmatrix}1&0\\[2pt]\dfrac{(1-\epsilon_2)c}{\sin(\pi/p_2)}&\epsilon_2\end{pmatrix},$$
and thus
$$S_1S_2=\begin{pmatrix}\epsilon_1+\dfrac{c^2(1-\epsilon_1)(1-\epsilon_2)}{\sin(\pi/p_1)\sin(\pi/p_2)}&\dfrac{\epsilon_2(1-\epsilon_1)c}{\sin(\pi/p_1)}\\[2pt]\dfrac{(1-\epsilon_2)c}{\sin(\pi/p_2)}&\epsilon_2\end{pmatrix}$$
where $c=\bigl\{\cos^2(\pi/q)-\sin^2\bigl(\frac{\pi}{2p_1}-\frac{\pi}{2p_2}\bigr)\bigr\}^{\frac{1}{2}}$. Letting $\theta_1=e^{\pi i/p_1}$, $\theta_2=e^{\pi i/p_2}$ and using $\sin(x)=\frac{1}{2i}(e^{ix}-e^{-ix})$ we see that $\operatorname{tr}(S_1S_2)=\theta_1\theta_2\bigl(\frac{\theta_1}{\theta_2}+\frac{\theta_2}{\theta_1}-4c^2\bigr)$. Then putting $\Phi=\frac{\pi}{p_1}-\frac{\pi}{p_2}$ we use $e^{i\Phi}+e^{-i\Phi}=2\cos\Phi=2-4\sin^2(\Phi/2)$ to obtain
$$\operatorname{tr}(S_1S_2)=\theta_1\theta_2\bigl(2-4\sin^2(\Phi/2)-4c^2\bigr)=\theta_1\theta_2\bigl(2-4\cos^2(\pi/q)\bigr)=-2\theta_1\theta_2\cos(2\pi/q).$$
Hence the characteristic equation of $S_1S_2$ is
$$0=\lambda^2+2\theta_1\theta_2\cos(2\pi/q)\lambda+\theta_1^2\theta_2^2=\lambda^2+\theta_1\theta_2\bigl(e^{2\pi i/q}+e^{-2\pi i/q}\bigr)\lambda+\theta_1^2\theta_2^2=\bigl(\lambda+\theta_1\theta_2e^{2\pi i/q}\bigr)\bigl(\lambda+\theta_1\theta_2e^{-2\pi i/q}\bigr)$$
yielding the roots
$$\lambda_1=e^{\pi i\left(\frac{1}{p_1}+\frac{1}{p_2}+(1-2/q)\right)}\quad\text{and}\quad\lambda_2=e^{\pi i\left(\frac{1}{p_1}+\frac{1}{p_2}-(1-2/q)\right)}.$$
Note that $q\ne 2$ forces $\lambda_1\ne\lambda_2$ and thus there is an invertible matrix $P$ such that $P^{-1}S_1S_2P=\begin{bmatrix}\lambda_1&0\\0&\lambda_2\end{bmatrix}$. If $q$ is even the relation we must check is $(S_1S_2)^{q/2}=(S_2S_1)^{q/2}$. But a glance at the expressions for $\lambda_1$ and $\lambda_2$ reveals that if $q$ is even, then $\lambda_1^{q/2}=\lambda_2^{q/2}$. Let $a$ denote this common value. Then
$$P^{-1}(S_1S_2)^{q/2}P=(P^{-1}S_1S_2P)^{q/2}=\begin{bmatrix}\lambda_1^{q/2}&0\\0&\lambda_2^{q/2}\end{bmatrix}=aI_2.$$
Hence $(S_1S_2)^{q/2}=aI_2$ and thus $(S_2S_1)^{q/2}=aI_2$ also. Now we assume $q$ is odd.
Since $q$ is odd we have $p_1=p_2$; call the common value $p$. Thus $\epsilon_1=\epsilon_2=e^{2\pi i/p}$, and we write $\epsilon=e^{2\pi i/p}$. Define $T_1=P^{-1}S_1P$, $T_2=P^{-1}S_2P$, and let $D=P^{-1}S_1S_2P$. Finally, put $T_1=\begin{bmatrix}x&u\\v&y\end{bmatrix}$. Then
$$T_2=P^{-1}S_2P=P^{-1}S_1^{-1}P\,P^{-1}S_1S_2P=T_1^{-1}D=\begin{bmatrix}y/\epsilon&-u/\epsilon\\-v/\epsilon&x/\epsilon\end{bmatrix}\begin{bmatrix}\lambda_1&0\\0&\lambda_2\end{bmatrix}=\begin{bmatrix}\dfrac{y\lambda_1}{\epsilon}&-\dfrac{u\lambda_2}{\epsilon}\\[2pt]-\dfrac{v\lambda_1}{\epsilon}&\dfrac{x\lambda_2}{\epsilon}\end{bmatrix}=\begin{bmatrix}y\gamma&-u\overline{\gamma}\\-v\gamma&x\overline{\gamma}\end{bmatrix}$$
where $\gamma=\frac{\lambda_1}{\epsilon}=e^{\pi i(1-2/q)}$. (Here $\overline{\gamma}=\lambda_2/\epsilon$, since $\lambda_1\lambda_2=\epsilon_1\epsilon_2=\epsilon^2$ and $|\gamma|=1$.) Now comparing the traces of $S_1$ and $T_1$, and those of $S_2$ and $T_2$, we obtain the equations $1+\epsilon=x+y$ and $1+\epsilon=\gamma y+\overline{\gamma}x$. Thus $x+y=\gamma y+\overline{\gamma}x$. Using $\gamma\overline{\gamma}=1$ we obtain $\gamma y-x=\overline{\gamma}(\gamma y-x)$. Since $q\ne 2$ we have $\gamma\ne 1$ and thus $\gamma y=x$. Since $q$ is odd the relation we must check is $(S_1S_2)^{\frac{q-1}{2}}S_1=S_2(S_1S_2)^{\frac{q-1}{2}}$. Clearly it suffices to show $P^{-1}(S_1S_2)^{\frac{q-1}{2}}S_1P=P^{-1}S_2(S_1S_2)^{\frac{q-1}{2}}P$. Now
$$P^{-1}(S_1S_2)^{\frac{q-1}{2}}S_1P=(P^{-1}S_1S_2P)^{\frac{q-1}{2}}P^{-1}S_1P=D^{\frac{q-1}{2}}T_1=\begin{bmatrix}\lambda_1^{\frac{q-1}{2}}x&\lambda_1^{\frac{q-1}{2}}u\\\lambda_2^{\frac{q-1}{2}}v&\lambda_2^{\frac{q-1}{2}}y\end{bmatrix}$$
and
$$P^{-1}S_2(S_1S_2)^{\frac{q-1}{2}}P=P^{-1}S_2P\,(P^{-1}S_1S_2P)^{\frac{q-1}{2}}=T_2D^{\frac{q-1}{2}}=\begin{bmatrix}\lambda_1^{\frac{q-1}{2}}\gamma y&-\lambda_2^{\frac{q-1}{2}}\overline{\gamma}u\\-\lambda_1^{\frac{q-1}{2}}\gamma v&\lambda_2^{\frac{q-1}{2}}\overline{\gamma}x\end{bmatrix}.$$
Thus we will have the desired equality if and only if $\gamma y=x$ and $\lambda_1^{\frac{q-1}{2}}=-\lambda_2^{\frac{q-1}{2}}\overline{\gamma}$. The first of these conditions was obtained above. For the second we compute $(\lambda_1\lambda_2^{-1})^{\frac{q-1}{2}}=(e^{-4\pi i/q})^{\frac{q-1}{2}}=e^{\frac{2\pi i}{q}}=-\overline{\gamma}$. We now have the theorem for graphs with one or two vertices.
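The two-vertex case just proved can be spot-checked numerically. The sketch below is mine, not the thesis's; the labels $(p_1,p_2,q)=(3,4,6)$ and $(p,q)=(3,5)$ are arbitrary samples (for odd $q$, $p_1=p_2=p$ is forced).

```python
# Numerical spot-check (not from the thesis) of the two-vertex case.
import numpy as np

def S_pair(p1, p2, q):
    """The matrices S1, S2 in the basis {v1, v2}, as computed in the text."""
    e1, e2 = np.exp(2j*np.pi/p1), np.exp(2j*np.pi/p2)
    c = np.sqrt(np.cos(np.pi/q)**2 - np.sin(np.pi/(2*p1) - np.pi/(2*p2))**2)
    S1 = np.array([[e1, (1 - e1)*c/np.sin(np.pi/p1)], [0, 1]])
    S2 = np.array([[1, 0], [(1 - e2)*c/np.sin(np.pi/p2), e2]])
    return S1, S2

# even q: the eigenvalues of S1 S2 and the relation (S1 S2)^{q/2} = (S2 S1)^{q/2}
p1, p2, q = 3, 4, 6
S1, S2 = S_pair(p1, p2, q)
lams = [np.exp(1j*np.pi*(1/p1 + 1/p2 + s*(1 - 2/q))) for s in (+1, -1)]
assert np.allclose(sorted(np.linalg.eigvals(S1 @ S2), key=np.angle),
                   sorted(lams, key=np.angle))
assert np.allclose(np.linalg.matrix_power(S1 @ S2, q//2),
                   np.linalg.matrix_power(S2 @ S1, q//2))

# odd q: the relation (S1 S2)^{(q-1)/2} S1 = S2 (S1 S2)^{(q-1)/2}
p, q = 3, 5
S1, S2 = S_pair(p, p, q)
M = np.linalg.matrix_power(S1 @ S2, (q - 1)//2)
assert np.allclose(M @ S1, S2 @ M)
```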
Now consider a graph $\Gamma$ with $\ell\ge 3$ vertices. Let $1\le i,j\le\ell$ be distinct. Put $P=\langle v_i,v_j\rangle$ and $Q=P^{\perp}$ (where $\perp$ is taken with respect to $H=H(\Gamma)$). Now if $H$ is nondegenerate on $P$ we will have $V=P\oplus Q$. But by the argument given for the case where the graph had two vertices we see that $S_i$ and $S_j$ satisfy the required relation when restricted to $P$. And $S_i$ and $S_j$ obviously satisfy the condition on $Q$, as they are both the identity transformation on $Q$. Hence $S_iS_jS_i\cdots=S_jS_iS_j\cdots$ on the whole space $V$. So we are led to assume that $H$ is degenerate on $P$. Now $H$ is not identically zero on $P$, as $H(v_i,v_i)=\sin(\pi/p_i)\ne 0$. In fact we must further have $H(v_i,v_j)\ne 0$, for otherwise $H$ would be nondegenerate on $P$. Thus we see that $\dim(P\cap Q)=1$, and since $\dim(P)+\dim(Q)\ge\ell$ we have $\dim(Q)\ge\ell-2$. If $\dim(Q)=\ell-1$ we have $V=P+Q$ and there is an $(\ell-2)$-dimensional subspace $Q'$ of $Q$ such that $V=P\oplus Q'$. Since $S_i$ and $S_j$ are the identity on $Q'$ we can argue as in the nondegenerate case to obtain the desired result.
So we assume $Q$ has dimension $\ell-2$. Hence $V\ne P+Q$; therefore there is some basis vector $v_k$ not in $P+Q$ and we have $V=P+\langle v_k\rangle+Q$. Hence there is an $(\ell-3)$-dimensional subspace $Q'$ of $Q$ such that $V=P\oplus\langle v_k\rangle\oplus Q'$. Now $S_i$ and $S_j$ are the identity transformation on $Q'$, so it suffices to check $S_iS_jS_i\cdots=S_jS_iS_j\cdots$ on the subspace $P\oplus\langle v_k\rangle=\langle v_i,v_j,v_k\rangle$. Recall we are assuming that $H$ is degenerate on $\langle v_i,v_j\rangle$. Thus
$$\det\begin{pmatrix}\alpha_{ii}&\alpha_{ij}\\\alpha_{ij}&\alpha_{jj}\end{pmatrix}=0.$$
When expanded the equation is $\sin(\pi/p_i)\sin(\pi/p_j)-\cos^2(\pi/q_{ij})+\sin^2\bigl(\frac{\pi}{2p_i}-\frac{\pi}{2p_j}\bigr)=0$. Using $\sin^2(a+b)-\sin^2(a-b)=\sin(2a)\sin(2b)$ together with some half angle formulas one obtains the equivalent condition:
$$\cos\Bigl(\frac{\pi}{p_i}+\frac{\pi}{p_j}\Bigr)=\cos\Bigl(\pi-\frac{2\pi}{q_{ij}}\Bigr).$$
Since the arguments on each side are in the interval $[0,\pi]$, over which cos is one to one, we have that $H$ is degenerate on $\langle v_i,v_j\rangle$ if and only if
$$\frac{1}{p_i}+\frac{1}{p_j}+\frac{2}{q_{ij}}=1$$
[Cox1974, p. 110].
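A finite scan (my own check, not in the thesis) confirms both directions of this equivalence and recovers exactly the seven solutions tabulated below:

```python
# Check that the 2x2 determinant vanishes exactly when 1/p_i + 1/p_j + 2/q_ij = 1,
# scanning a small range of labels (with p_i = p_j when q_ij is odd).
import math

found = set()
for q in range(3, 13):
    for pi in range(2, 13):
        for pj in range(2, 13):
            if q % 2 == 1 and pi != pj:
                continue
            det = (math.sin(math.pi/pi) * math.sin(math.pi/pj)
                   - math.cos(math.pi/q)**2
                   + math.sin(math.pi/(2*pi) - math.pi/(2*pj))**2)
            degenerate = abs(1/pi + 1/pj + 2/q - 1) < 1e-12
            assert (abs(det) < 1e-9) == degenerate, (q, pi, pj)
            if degenerate:
                found.add((q, min(pi, pj), max(pi, pj)))

# the solutions found are exactly the columns of Table 1
assert sorted(found) == [(3, 6, 6), (4, 3, 6), (4, 4, 4),
                         (6, 2, 6), (6, 3, 3), (8, 2, 4), (12, 2, 3)]
```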
The solutions to this equation subject to the restrictions $p_i,p_j\ge 2$, $q_{ij}\ge 3$, and $p_i=p_j$ if $q_{ij}$ is odd are given in the table (with $p_i\le p_j$):

Table 1

| $q_{ij}$ | 3 | 4 | 4 | 6 | 6 | 8 | 12 |
|----------|---|---|---|---|---|---|----|
| $p_i$    | 6 | 3 | 4 | 2 | 3 | 2 | 2  |
| $p_j$    | 6 | 6 | 4 | 6 | 3 | 4 | 3  |

Now recall that $v_k$ is not in $P+Q$. So in particular $v_k$ is not orthogonal to both $v_i$ and $v_j$. Hence the portion of the graph $\Gamma$ involving the $i^{\text{th}}$, $j^{\text{th}}$, and $k^{\text{th}}$ vertices is a triangle with vertices labelled $p_i$, $p_j$, $p_k$ and edges labelled $q_{ij}$, $q_{ik}$, $q_{jk}$, in which the edges joining the $i^{\text{th}}$ and $k^{\text{th}}$, and the $j^{\text{th}}$ and $k^{\text{th}}$, vertices may or may not occur, but at least one of those two edges actually occurs. Further, the numbers $q_{ij}$, $p_i$, $p_j$ occur in one of the columns of Table 1. For example, one possibility is $p_i=3$, $p_j=6$, $q_{ij}=4$, $p_k=p$, $q_{ik}=q$, $q_{jk}=q'$. Let $U=\langle v_i,v_j,v_k\rangle$. Using the given basis $\{v_i,v_j,v_k\}$ for $U$, the matrix for $S_i$ restricted to $U$ is
$$\begin{bmatrix}\omega&\frac{1-\omega}{\alpha}&x\\0&1&0\\0&0&1\end{bmatrix}$$
while the matrix for $S_j$ restricted to $U$ is
$$\begin{bmatrix}1&0&0\\-\omega\alpha&-\omega^2&y\\0&0&1\end{bmatrix}.$$
Here $\omega=e^{2\pi i/3}$, $\alpha=3^{1/4}$, $x=(\omega-1)\frac{H(v_k,v_i)}{H(v_i,v_i)}$, and $y=(-\omega^2-1)\frac{H(v_k,v_j)}{H(v_j,v_j)}$. So
$$S_iS_j|_U=\begin{bmatrix}\omega^2&\frac{1-\omega^2}{\alpha}&x+\frac{(1-\omega)y}{\alpha}\\-\omega\alpha&-\omega^2&y\\0&0&1\end{bmatrix},$$
and thus
$$(S_iS_j|_U)^2=\begin{bmatrix}1&0&-\omega x+\frac{(1-\omega)y}{\alpha}\\0&1&-\omega\alpha x+(1-\omega)y\\0&0&1\end{bmatrix}.$$
Similarly,
$$S_jS_i|_U=\begin{bmatrix}\omega&\frac{1-\omega}{\alpha}&x\\-\omega^2\alpha&-\omega&-\omega\alpha x+y\\0&0&1\end{bmatrix},$$
and thus
$$(S_jS_i|_U)^2=\begin{bmatrix}1&0&-\omega x+\frac{(1-\omega)y}{\alpha}\\0&1&-\omega\alpha x+(1-\omega)y\\0&0&1\end{bmatrix}.$$
So we have verified $(S_iS_j)^2=(S_jS_i)^2$. A similar computation can be done in the remaining six cases to verify that the desired
condition, $S_iS_jS_i\cdots=S_jS_iS_j\cdots$, is always satisfied. For the case $p_i=p_j=6$, $q_{ij}=3$ one finds that
$$S_iS_jS_i=S_jS_iS_j=\begin{pmatrix}0&-1&\omega y+x\\-1&0&\omega x+y\\0&0&1\end{pmatrix}.$$
With $p_i=p_j=4$, $q_{ij}=4$ one computes that
$$(S_iS_j)^2=(S_jS_i)^2=\begin{pmatrix}1&0&(1-i)(x+y)\\0&1&(1-i)(x+y)\\0&0&1\end{pmatrix}.$$
In the case of $p_i=2$, $p_j=6$, $q_{ij}=6$ one has that
$$(S_iS_j)^3=(S_jS_i)^3=\begin{pmatrix}1&0&\sqrt{2}(1-\omega)y+(\omega^2-\omega)x\\0&1&(2-2\omega)y+\sqrt{2}(\omega^2-\omega)x\\0&0&1\end{pmatrix}.$$
For $p_i=p_j=3$, $q_{ij}=6$ one finds that
$$(S_iS_j)^3=(S_jS_i)^3=\begin{pmatrix}1&0&-3\omega(x+y)\\0&1&-3\omega(x+y)\\0&0&1\end{pmatrix}.$$
If we have $p_i=2$, $p_j=4$, $q_{ij}=8$ we calculate that
$$(S_iS_j)^4=(S_jS_i)^4=\begin{pmatrix}1&0&-4ix+2\beta^3(1-i)y\\0&1&-4\beta ix+4(1-i)y\\0&0&1\end{pmatrix}$$
where $\beta=2^{1/4}$. Finally, for $p_i=2$, $p_j=3$, $q_{ij}=12$ one finds that
$$(S_iS_j)^6=(S_jS_i)^6=\begin{pmatrix}1&0&6(\omega^2-\omega)x-6\sqrt{2}\,\alpha\omega y\\0&1&\frac{6\sqrt{2}(\omega^2-\omega)}{\alpha}x-12\omega y\\0&0&1\end{pmatrix}$$
where $\alpha=3^{1/4}$. (In each of these cases the third vertex carries the label $p_k=p$ and is joined to the $i^{\text{th}}$ and $j^{\text{th}}$ vertices by possible edges labelled $q$ and $q'$, at least one of which occurs.) These verifications complete the proof of the theorem. $\square$
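All seven degenerate three-vertex cases can be spot-checked at once. The sketch below is mine, not the thesis's: it builds $S_i$ and $S_j$ restricted to $\langle v_i,v_j,v_k\rangle$ directly from the definition of $H$ (with the negative sign of $\alpha_{ij}$ as defined above; a sign flip of $\alpha_{ij}$ is only a basis change and does not affect the relation), and treats the off-block entries $x,y$ as arbitrary placeholder values, since the identities above hold for any $x,y$.

```python
# Numerical spot-check (not from the thesis) of the braid relation
# S_i S_j S_i ... = S_j S_i S_j ... (q_ij factors per side) for all
# seven columns of Table 1, with arbitrary placeholder x, y.
import numpy as np

def refl_pair(pi, pj, q, x, y):
    """3x3 matrices of S_i, S_j on <v_i, v_j, v_k> for labels (p_i, p_j, q_ij)."""
    aii, ajj = np.sin(np.pi/pi), np.sin(np.pi/pj)
    aij = -np.sqrt(np.cos(np.pi/q)**2 - np.sin(np.pi/(2*pi) - np.pi/(2*pj))**2)
    ei, ej = np.exp(2j*np.pi/pi), np.exp(2j*np.pi/pj)
    Si = np.array([[ei, (ei - 1)*aij/aii, x], [0, 1, 0], [0, 0, 1]])
    Sj = np.array([[1, 0, 0], [(ej - 1)*aij/ajj, ej, y], [0, 0, 1]])
    return Si, Sj

def braid(A, B, n):
    """Alternating product A B A B ... with n factors."""
    M = np.eye(3, dtype=complex)
    for k in range(n):
        M = M @ (A if k % 2 == 0 else B)
    return M

x, y = 0.3 - 0.2j, -0.1 + 0.4j   # arbitrary placeholder off-block entries
for q, pi, pj in [(3, 6, 6), (4, 3, 6), (4, 4, 4), (6, 2, 6),
                  (6, 3, 3), (8, 2, 4), (12, 2, 3)]:
    Si, Sj = refl_pair(pi, pj, q, x, y)
    assert np.allclose(braid(Si, Sj, q), braid(Sj, Si, q)), (q, pi, pj)
```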

Corollary 1. Let $\Gamma \in 𝒞\text{.}$ Then the order of ${r}_{i}$ in $W\left(\Gamma \right)$ is ${p}_{i}\text{.}$

## Notes and references

This is a typed version of David W. Koster's thesis Complex Reflection Groups.

This thesis was submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy (Mathematics) at the University of Wisconsin - Madison, 1975.