Last update: 10 January 2014
Let $V$ be a complex vector space and let $\langle\,,\,\rangle$ be a Hermitian form on $V$ (i.e., $\langle x,y\rangle$ is linear in $x$ and $\langle y,x\rangle = \overline{\langle x,y\rangle}$). Suppose $a \in V$ with $\langle a,a\rangle \neq 0$ and $\zeta \in \mathbb{C}$. Define $r_{a,\zeta}\colon V \to V$ by
$$ r_{a,\zeta}(x) \;=\; x + (\zeta - 1)\,\frac{\langle x,a\rangle}{\langle a,a\rangle}\,a. $$
Proposition 1.
(a)
(b)
(c) for all
(d) If one of or is not then and
(e) If and for all then
Proof.
Note that and if (i.e., if then Now the linearity of is obvious. Since (b) would imply that is invertible and thus (a) follows from (b). To prove (b) we let and compute For (c) we let and compute which implies (c). In (d) we suppose without loss of generality that To prove the implication from left to right we evaluate both sides of at to obtain Thus implies there is a with Inserting this in (1) we obtain and hence The implication from right to left is trivial. For (e) we let and compute
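The computations above rest on the explicit formula for $r_{a,\zeta}$. As a quick illustration, here is a minimal numerical sketch (not from the thesis) that assumes the standard complex-reflection formula $r_{a,\zeta}(x) = x + (\zeta-1)\langle x,a\rangle/\langle a,a\rangle\,a$ with $\langle x,y\rangle = x^{T}B\bar{y}$, and checks, for a sample form $B$ and root $a$ (both illustrative choices, not notation from the thesis), that the form is preserved when $|\zeta| = 1$ and that $a$ is scaled by $\zeta$:

    import numpy as np

    # Assumed standard formula (a sketch, not the thesis's own text):
    #     r_{a,zeta}(x) = x + (zeta - 1) * <x,a> / <a,a> * a,
    # with <x,y> = x^T B conj(y) for a Hermitian matrix B.
    B = np.array([[1.0, 0.5],
                  [0.5, 1.0]])                   # sample Hermitian (here real symmetric) form
    a = np.array([1.0, 0.0], dtype=complex)      # sample root with <a,a> = 1, in particular nonzero
    zeta = np.exp(2j * np.pi / 3)                # a primitive cube root of unity

    def herm(x, y):
        return x @ B @ np.conj(y)                # the form <x,y>

    # Matrix of r_{a,zeta}: the rank-one update I + ((zeta-1)/<a,a>) * a * (B conj(a))^T.
    R = np.eye(2, dtype=complex) + ((zeta - 1) / herm(a, a)) * np.outer(a, B @ np.conj(a))

    x = np.array([1.0, 2.0]); y = np.array([3.0, -1.0])
    print(np.allclose(herm(R @ x, R @ y), herm(x, y)))   # True: the form is preserved
    print(np.allclose(R @ a, zeta * a))                  # True: r_{a,zeta}(a) = zeta * a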
Definition. If $\zeta$ is a primitive $n$th root of unity we call $r_{a,\zeta}$ a unitary reflection of order $n$. Writing $V = \mathbb{C}a \oplus a^{\perp}$ (possible since $\langle a,a\rangle \neq 0$), we see that there is a basis in which the matrix for $r_{a,\zeta}$ is $\operatorname{diag}(\zeta, 1, \dots, 1)$.
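In the same spirit (same assumed formula as in the previous sketch; the form $B$ and root $a$ below are again illustrative), one can check numerically that when $\zeta$ is a primitive $n$th root of unity the reflection has eigenvalues $\zeta, 1, \dots, 1$ and order $n$, matching the diagonal matrix above:

    import numpy as np

    # Sketch under the assumed formula r_{a,zeta}(x) = x + (zeta-1)<x,a>/<a,a> a:
    # r_{a,zeta} fixes the hyperplane <x,a> = 0 pointwise and sends a to zeta*a,
    # so its eigenvalues are zeta, 1, ..., 1 and it has order n.
    B = np.array([[1.0, 0.5, 0.0],
                  [0.5, 1.0, 0.5],
                  [0.0, 0.5, 1.0]])              # sample real symmetric (Hermitian) form
    a = np.array([1.0, 1.0, 0.0], dtype=complex) # sample root with <a,a> = 3 != 0
    n = 5
    zeta = np.exp(2j * np.pi / n)                # primitive 5th root of unity

    aa = a @ B @ np.conj(a)                      # <a,a>
    R = np.eye(3, dtype=complex) + ((zeta - 1) / aa) * np.outer(a, B @ np.conj(a))

    print(np.allclose(np.linalg.matrix_power(R, n), np.eye(3)))   # True: R has order n
    print(np.round(np.sort(np.linalg.eigvals(R)), 6))             # zeta together with 1, 1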
Lemma 1: Let Suppose for some pair of distinct integers there is an edge joining the and vertices. Then
Proof.
Letting and we see that and Hence yielding the lemma.
Let and let be an dimensional complex vector space with basis We can associate with a Hermitian form on as follows. If and are vectors in we define where is the matrix with if there is an edge joining the and vertices, and otherwise. [Cox1967, pp. 129-130] Notice that the quantity under the radical is non-negative by Lemma 1, so that is a real symmetric matrix.
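The explicit entries of the matrix are not reproduced above, so the sketch below shows only the shape of the construction: a real symmetric matrix indexed by the vertices of the graph, with an off-diagonal entry exactly where the graph has an edge. The function edge_entry is a hypothetical placeholder for the thesis's formula (the one whose radicand Lemma 1 shows to be non-negative), and the sample value $-1/2$ is purely illustrative:

    import numpy as np

    # Shape of the construction only: edge_entry stands in for the thesis's formula.
    def gram_matrix(n, edges, edge_entry):
        B = np.eye(n)                            # diagonal entries (normalised to 1 here)
        for i, j in edges:
            B[i, j] = B[j, i] = edge_entry(i, j) # off-diagonal entry when {i, j} is an edge
        return B

    # Example: a path on three vertices with a placeholder entry -1/2 on each edge.
    B = gram_matrix(3, [(0, 1), (1, 2)], lambda i, j: -0.5)
    print(np.allclose(B, B.T))                   # True: B is real symmetric
    print(np.linalg.eigvalsh(B))                 # eigenvalues, showing whether the form is degenerate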
We fix notation as follows: For we let denote the group given by the presentation corresponding to and let be the generators corresponding to the vertices of Let and define using the Hermitian form and the vector space with basis as described above. Finally let
Theorem 1: Let and The correspondence can be extended to a homomorphism of onto
Proof.
It suffices to show that for each we have and that for each pair of distinct integers we have with factors on each side. The first of these conditions is obviously satisfied from the definition of in fact has order The second is vacuously satisfied if the graph has only one vertex. So, to begin, we assume the graph has two vertices. If the vertices are not joined by an edge then using as a basis the matrices for and are both diagonal and hence commute. So we can assume is The verification of the theorem for this graph occurs in [Cox1962]. We will give a somewhat different argument here. We first compute the eigenvalues of The characteristic equation of is Now in the basis and thus, where Letting and using we see that Then putting we use to obtain Hence the characteristic equation of is yielding the roots Note that forces and thus there is an invertible matrix such that If is even the relation we must check is But a glance at the expressions for and reveals that if is even, then Let denote this common value. Then Hence and thus also.

Now we assume is odd. So here we have say Thus and let Define and let Finally, put Then where Now comparing the traces of and and those of and we obtain the equations and Thus Using we obtain Since we have and thus Since is odd the relation we must check is Clearly it suffices to show Thus we will have the desired equality if and only if and The first of these conditions was obtained above. For the second we compute

We now have the theorem for graphs with one or two vertices. Now consider a graph with vertices. Let be distinct. Put and is taken with respect to Now if is non-degenerate on we will have But by the argument given for the case where the graph had two vertices we see that and satisfy the required relationship when restricted to Now and obviously satisfy the condition on as they are both the identity transformation on Hence on the whole space

So we are led to assume that is degenerate on Now is not identically zero on as In fact we must further have for otherwise would be non-degenerate on Thus we see that and since we have If we have and there is an dimensional subspace of such that Since and are the identity on we can argue as in the non-degenerate case to obtain the desired result. So we assume has dimension Hence therefore there is some basis vector not in and we have Hence there is an dimensional subspace of such that Now and are the identity transformation on so it suffices to check on the subspace Recall we are assuming that is degenerate on Thus When expanded the equation is Using together with some half-angle formulas one obtains the equivalent condition: Since the arguments on each side are in the interval over which cos is one-to-one we have that is degenerate on if and only if [Cox1974, p. 110]. The solutions to this equation subject to the restrictions and if is odd are given in the table: Now recall that is not in So in particular is not orthogonal to both and Hence the portion of the graph involving the and vertices must look like: where the dotted lines joining the and and the and vertices indicate that there may or may not be an edge joining those vertices but that at least one of those two edges actually occurs. Further the numbers occur in one of the columns of Table 1.
For example, one possibility is Let Using the given basis for the matrix for restricted to is while the matrix for restricted to is Here and So and thus, Similarly, and thus, So we have verified A similar computation can be done in the remaining six cases to verify that the desired condition, is always satisfied. For one finds that With one computes that In the case of one has that For one finds that If we have we calculate that where Finally, for one finds that where These verifications complete the proof of the theorem.
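As a small sanity check on the kind of relation verified in the proof (the alternating product with $q$ factors on each side), here is a sketch of the simplest special case: two Euclidean reflections of order 2 whose mirrors meet at an angle $\pi/q$. The general case treated above uses unitary reflections of higher order and the Hermitian form defined earlier; the helper names below are illustrative:

    import numpy as np

    def euclidean_reflection(a):
        # Order-2 reflection x -> x - 2 (x.a / a.a) a in the Euclidean plane.
        a = np.asarray(a, dtype=float)
        return np.eye(2) - 2.0 * np.outer(a, a) / (a @ a)

    def alternating(first, second, q):
        # Product first * second * first * ... with q factors.
        P = np.eye(2)
        for k in range(q):
            P = P @ (first if k % 2 == 0 else second)
        return P

    q = 5
    r1 = euclidean_reflection([1.0, 0.0])
    r2 = euclidean_reflection([np.cos(np.pi - np.pi / q), np.sin(np.pi - np.pi / q)])

    # r1 r2 r1 ... = r2 r1 r2 ...  (q factors on each side), the dihedral braid relation
    print(np.allclose(alternating(r1, r2, q), alternating(r2, r1, q)))   # True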
Corollary 1. Let Then the order of in is
This is a typed version of David W. Koster's thesis Complex Reflection Groups.
This thesis was submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy (Mathematics) at the University of Wisconsin-Madison, 1975.