Week 6 Problem Sheet
Group Theory and Linear Algebra
Semester II 2011

Arun Ram
Department of Mathematics and Statistics
University of Melbourne
Parkville, VIC 3010 Australia
aram@unimelb.edu.au

Last updated: 26 August 2011

(1) Week 6: Vocabulary
(2) Week 6: Results
(3) Week 6: Examples and computations

Week 6: Vocabulary

Define Hermitian form and inner product and give some illustrative examples.
Define length, orthogonal and orthonormal and give some illustrative examples.
Define matrix of a Hermitian form with respect to a basis and give some illustrative examples.
Define orthogonal complement and give some illustrative examples.
Define adjoint of a linear transformation and give some illustrative examples.
Define adjoint of a matrix and give some illustrative examples.
Define symmetric, orthogonal and normal linear transformations and give some illustrative examples.
Define symmetric, orthogonal and normal matrices and give some illustrative examples.
Define Hermitian, unitary and normal linear transformations and give some illustrative examples.
Define Hermitian, unitary and normal matrices and give some illustrative examples.

Week 6: Results

Let W be a finite dimensional inner product space. Show that an orthonormal subset of W is linearly independent.
Let W be a finite dimensional inner product space. Show that an orthonormal subset of W can be extended to an orthonormal basis.
(Bessel's inequality) Let $S=\{v_1,\dots,v_n\}$ be an orthonormal subset of an inner product space $V$. Let $v\in V$ and set $a_i=(v,v_i)$ for $i=1,2,\dots,n$. Show that $\sum_{i=1}^n |a_i|^2 \le \|v\|^2$.
Let $S=\{v_1,\dots,v_n\}$ be an orthonormal subset of an inner product space $V$. Let $v\in V$. Show that $v-\sum_{i=1}^n (v,v_i)v_i$ is orthogonal to each $v_j$.
Let $S=\{v_1,\dots,v_n\}$ be an orthonormal subset of an inner product space $V$. Let $v\in V$ and set $a_i=(v,v_i)$ for $i=1,2,\dots,n$. Show that if $S$ is a basis of $V$ then $v=\sum_{i=1}^n a_iv_i$ and $\sum_{i=1}^n |a_i|^2 = \|v\|^2$.
(Schwarz's inequality) Show that if $v$ and $w$ are elements of an inner product space $V$ then $|(v,w)| \le \|v\|\,\|w\|$.
(Triangle inequality) Show that if $v$ and $w$ are elements of an inner product space $V$ then $\|v+w\| \le \|v\| + \|w\|$.
Let $V$ be a finite dimensional inner product space and let $W$ be a subspace of $V$. Show that $W^\perp$ is a subspace of $V$ and $V=W\oplus W^\perp$.
Let $f\colon V\to V$ be a linear transformation on a finite dimensional inner product space $V$. Show that the adjoint $f^*$ exists and is unique.
Assume that $f\colon V\to V$ and $g\colon V\to V$ are linear transformations on an inner product space $V$ such that if $v,w\in V$ then $(f(v),w)=(g(v),w)$. Show that $f=g$.
Let $V$ be an inner product space with an orthonormal basis $\mathcal{B}=\{v_1,\dots,v_n\}$. Suppose that a linear transformation $f\colon V\to V$ has matrix $A$ with respect to $\mathcal{B}$. Show that the matrix of $f^*$ with respect to $\mathcal{B}$ is the matrix $A^*$ given by $(A^*)_{ij}=\overline{A_{ji}}$.
Let $f\colon V\to V$ be a linear transformation on an inner product space $V$. Show that the following are equivalent:
(a)   $f^*f=1$;
(b)   If $u,v\in V$ then $(f(u),f(v))=(u,v)$;
(c)   If $v\in V$ then $\|f(v)\|=\|v\|$.
Let $f\colon V\to V$ be a linear transformation on an inner product space $V$. Let $W$ be an $f$-invariant subspace of $V$. Show that $W^\perp$ is $f^*$-invariant.
Let $f\colon V\to V$ be a linear transformation on a finite dimensional real vector space $V$. Show that $V$ has an $f$-invariant subspace of dimension 1 or 2.
Let $f\colon V\to V$ be an orthogonal linear transformation on a finite dimensional real vector space $V$. Show that there is an orthonormal basis of $V$ of the form $\{u_1,v_1,u_2,v_2,\dots,u_k,v_k,w_1,\dots,w_\ell\}$ so that, for some $\theta_1,\dots,\theta_k$, $f(u_i)=(\cos\theta_i)u_i+(\sin\theta_i)v_i$ and $f(v_i)=(-\sin\theta_i)u_i+(\cos\theta_i)v_i$, and $f(w_i)=\pm w_i$.
(Spectral theorem: first version) Let f:VV be a normal linear transformation on a finite dimensional complex inner product space V. Show that there is an orthonormal basis for V such that the matrix of f with respect to this basis is diagonal.
Let $f\colon V\to V$ be a normal linear transformation on a finite dimensional complex inner product space $V$. Show that there is a non-zero element of $V$ which is an eigenvector for both $f$ and $f^*$. Show that the two corresponding eigenvalues are complex conjugates.
(Spectral theorem: second version) Let $f\colon V\to V$ be a normal linear transformation on a finite dimensional complex inner product space $V$. Show that there exist self-adjoint (Hermitian) linear transformations $e_1\colon V\to V,\dots,e_k\colon V\to V$ and scalars $a_1,\dots,a_k$ such that
(a)   If $i\ne j$ then $a_i\ne a_j$,
(b)   $e_i^2=e_i$ and $e_i\ne 0$,
(c)   $e_1+\cdots+e_k=1$,
(d)   $a_1e_1+\cdots+a_ke_k=f$.
Let f:VV be a linear transformation on a finite dimensional complex inner product space V. Show that
(a)   If f is unitary then the eigenvalues of f are of absolute value 1.
(b)   If f is self-adjoint then the eigenvalues of f are real.
Let $f\colon V\to V$ be a linear transformation on a finite dimensional complex inner product space $V$. Show that the following are equivalent:
(a)   $f$ is self-adjoint and all eigenvalues of $f$ are nonnegative,
(b)   There exists a self-adjoint $g\colon V\to V$ such that $f=g^2$,
(c)   There exists $h\colon V\to V$ such that $f=hh^*$,
(d)   $f$ is self-adjoint and if $v\in V$ then $(f(v),v)\ge 0$.
Let f:VV be a linear transformation on a finite dimensional complex inner product space V. Show that there exist a nonnegative linear transformation p:VV and a unitary linear transformation u:VV such that f=pu.
Let $f\colon V\to V$ and $g\colon V\to V$ be normal linear transformations on a finite dimensional complex inner product space $V$. Assume that $fg=gf$. Show that there exists an orthonormal basis $B$ of $V$ such that the matrices of $f$ and $g$ with respect to the basis $B$ are diagonal.
Let $f\colon V\to V$ and $g\colon V\to V$ be normal linear transformations on a finite dimensional complex inner product space $V$. Show that $fg=gf$ if and only if there exist a normal linear transformation $h\colon V\to V$ and polynomials $p,q\in\mathbb{C}[x]$ such that $f=p(h)$ and $g=q(h)$.

Week 6: Examples and computations

Let $V=\mathbb{R}^n$ and define $\langle\,,\,\rangle\colon V\times V\to\mathbb{R}$ by $\langle(a_1,a_2,\dots,a_n),(b_1,b_2,\dots,b_n)\rangle=a_1b_1+a_2b_2+\cdots+a_nb_n$. Show that $\langle\,,\,\rangle$ is a positive definite Hermitian form.
Let $V=\mathbb{C}^n$ and define $\langle\,,\,\rangle\colon V\times V\to\mathbb{C}$ by $\langle(a_1,a_2,\dots,a_n),(b_1,b_2,\dots,b_n)\rangle=a_1\overline{b_1}+a_2\overline{b_2}+\cdots+a_n\overline{b_n}$. Show that $\langle\,,\,\rangle$ is a positive definite Hermitian form.
Let $V$ be any $n$-dimensional vector space over $\mathbb{R}$ and let $\{v_1,v_2,\dots,v_n\}$ be a basis of $V$. Define $\langle\,,\,\rangle\colon V\times V\to\mathbb{R}$ by $\langle a_1v_1+\cdots+a_nv_n,\,b_1v_1+\cdots+b_nv_n\rangle=a_1b_1+a_2b_2+\cdots+a_nb_n$. Show that $\langle\,,\,\rangle$ is a positive definite Hermitian form.
Let $V$ be any $n$-dimensional vector space over $\mathbb{C}$ and let $\{v_1,v_2,\dots,v_n\}$ be a basis of $V$. Define $\langle\,,\,\rangle\colon V\times V\to\mathbb{C}$ by $\langle a_1v_1+\cdots+a_nv_n,\,b_1v_1+\cdots+b_nv_n\rangle=a_1\overline{b_1}+a_2\overline{b_2}+\cdots+a_n\overline{b_n}$. Show that $\langle\,,\,\rangle$ is a positive definite Hermitian form.
Let $V=M_{n\times n}(\mathbb{R})$. Define $\langle\,,\,\rangle\colon V\times V\to\mathbb{R}$ by $\langle A,B\rangle=\mathrm{trace}(AB^t)$, where $\mathrm{trace}(C)$, for a square matrix $C$, is the sum of the diagonal entries. Show that $\langle\,,\,\rangle$ is a positive definite Hermitian form.
Let $V=\mathbb{R}[x]$ be the vector space of polynomials with coefficients in $\mathbb{R}$. Define $\langle\,,\,\rangle\colon V\times V\to\mathbb{R}$ by $\langle p(x),q(x)\rangle=\int_0^1 p(x)q(x)\,dx$. Show that $\langle\,,\,\rangle$ is a positive definite Hermitian form.
Let $V=C([a,b],\mathbb{R})$ be the vector space of continuous functions $f\colon[a,b]\to\mathbb{R}$, where $[a,b]$ is the closed interval $\{t\in\mathbb{R}\mid a\le t\le b\}$. Define $\langle\,,\,\rangle\colon V\times V\to\mathbb{R}$ by $\langle f,g\rangle=\int_a^b f(t)g(t)\,dt$. Show that $\langle\,,\,\rangle$ is a positive definite Hermitian form.
Using the standard inner product on $\mathbb{R}^3$ (as in Problem (1)) apply the Gram-Schmidt algorithm to the basis $\{\frac{1}{\sqrt{2}}(1,1,0),\ \frac{1}{\sqrt{3}}(1,-1,1),\ (0,0,1)\}$ of $\mathbb{R}^3$ to obtain an orthonormal basis of $\mathbb{R}^3$.
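The hand computation can be sanity-checked numerically. The sketch below (plain numpy; the `gram_schmidt` helper is my own, not part of the sheet) orthonormalizes the given basis and verifies that the resulting Gram matrix is the identity.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of vectors w.r.t. the standard inner product."""
    basis = []
    for v in vectors:
        # Subtract the projection of v onto the span of the vectors so far.
        w = v - sum(np.dot(u, v) * u for u in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

b1 = np.array([1, 1, 0]) / np.sqrt(2)
b2 = np.array([1, -1, 1]) / np.sqrt(3)
b3 = np.array([0.0, 0.0, 1.0])

u1, u2, u3 = gram_schmidt([b1, b2, b3])

# The output is orthonormal exactly when its Gram matrix is the identity.
G = np.array([[np.dot(x, y) for y in (u1, u2, u3)] for x in (u1, u2, u3)])
print(np.allclose(G, np.eye(3)))  # True
```

Since $b_1$ is already a unit vector, the algorithm returns it unchanged as the first basis vector.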
Using the standard inner product on polynomials (as in Problem (6)) apply the Gram-Schmidt algorithm to the basis $\{1,x\}$ of $\mathcal{P}_1(\mathbb{R})=\{a_0+a_1x\mid a_0,a_1\in\mathbb{R}\}$ to obtain an orthonormal basis of $\mathcal{P}_1(\mathbb{R})$.
Show that the orthogonal complement to a plane through the origin in 3 is the normal through the origin.
Show that the orthogonal complement to a line through the origin in 3 is the plane through the origin to which it is normal.
Show that the orthogonal complement to the set of diagonal matrices in $M_{n\times n}(\mathbb{R})$ is the set of matrices with zero entries on the diagonal.
Let A be an m×n matrix with real entries. Show that the row space of A is the orthogonal complement of the nullspace of A.
Show that if a linear transformation is represented by a symmetric matrix with respect to an orthonormal basis then it is self-adjoint.
Show that the matrices $A=\begin{pmatrix}1&2\\2&5\end{pmatrix}$ and $B=\begin{pmatrix}1&2-i\\2+i&3\end{pmatrix}$ are self-adjoint (Hermitian).
A skew-symmetric matrix is a square matrix $A$ with real entries such that $A=-A^t$. Show that a skew-symmetric matrix is normal. Determine which skew-symmetric matrices are self-adjoint.
Show that the matrix $\begin{pmatrix}1&1\\i&3+2i\end{pmatrix}$ is normal but is not self-adjoint, skew-symmetric or unitary.
Show that in dimension 2, the possibilities for orthogonal matrices up to similarity are $\begin{pmatrix}1&0\\0&-1\end{pmatrix}$ and $\begin{pmatrix}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{pmatrix}$ for some $\theta\in[0,2\pi]$.
Find the length of $(2+i,3-2i,-1)$ with respect to the standard inner product on $\mathbb{C}^3$.
Find the length of $x^2-3x+1$ with respect to the standard inner product on polynomials.
Find the length of $\begin{pmatrix}3&2\\1&4\end{pmatrix}$ with respect to the standard inner product on matrices.
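The three length computations above can be cross-checked numerically. This is a sketch only; it assumes the standard inner products defined in the earlier problems ($\sum z_i\overline{w_i}$ on $\mathbb{C}^3$, $\int_0^1 pq\,dx$ on polynomials, $\mathrm{trace}(AB^t)$ on matrices).

```python
import numpy as np

# Length of (2+i, 3-2i, -1) in C^3: sqrt(|2+i|^2 + |3-2i|^2 + |-1|^2).
v = np.array([2 + 1j, 3 - 2j, -1])
len_v = np.sqrt(np.vdot(v, v).real)            # sqrt(5 + 13 + 1) = sqrt(19)

# Length of x^2 - 3x + 1 with (p, q) = integral_0^1 p(x) q(x) dx:
# integrate the square of the polynomial over [0, 1].
p = np.polynomial.Polynomial([1, -3, 1])       # coefficients in increasing degree
sq = p * p
len_p = np.sqrt(sq.integ()(1) - sq.integ()(0)) # sqrt(11/30)

# Length of [[3, 2], [1, 4]] with (A, B) = trace(A B^t).
A = np.array([[3, 2], [1, 4]])
len_A = np.sqrt(np.trace(A @ A.T))             # sqrt(9 + 4 + 1 + 16) = sqrt(30)

print(len_v, len_p, len_A)
```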
An exercise (from an anonymous textbook) claims that, if $V$ is an inner product space and $u,v\in V$ then $\|u+v\|+\|u-v\|=2\|u\|+2\|v\|$. Prove that this is false. Explain what was intended.
Let f:VV and g:VV be linear transformations on a finite dimensional inner product space V. Show that (f+g)* = f*+g*.
Let A be a transition matrix between orthonormal bases. Show that A is an isometry.
Let f:VV be a linear transformation on an inner product space V. Show that if f is self adjoint then the eigenvalues of f are real.
Let $f\colon V\to V$ be a linear transformation on an inner product space $V$. Show that if $f$ is an isometry then the eigenvalues of $f$ have absolute value 1.
Let $f\colon V\to V$ be a linear transformation on a finite dimensional inner product space $V$. Show that $\mathrm{im}\,f^*$ is the orthogonal complement of $\ker f$. Deduce that the rank of $f$ is equal to the rank of $f^*$.
Show that the linear transformation $d\colon\mathbb{R}[x]\to\mathbb{R}[x]$ given by differentiation with respect to $x$ has no adjoint with respect to the standard inner product on polynomials. (Hint: Try to find what $d^*(1)$ should be.)
Show that a triangular matrix which is self-adjoint is diagonal.
Show that a triangular matrix which is unitary is diagonal.
Let $f\colon V\to V$ be a linear transformation on an inner product space $V$. Assume that $f^*\colon V\to V$ is a function which satisfies $\langle f(u),w\rangle=\langle u,f^*(w)\rangle$ for all $u,w\in V$. Show that $f^*$ is a linear transformation.
Explain why $\langle z,w\rangle=z_1w_1+4z_2w_2$, for $z=(z_1,z_2)$ and $w=(w_1,w_2)$, does not define an inner product on $\mathbb{C}^2$.
Explain why $\langle z,w\rangle=z_1\overline{w_1}-z_2\overline{w_2}$, for $z=(z_1,z_2)$ and $w=(w_1,w_2)$, does not define an inner product on $\mathbb{C}^2$.
Explain why $\langle z,w\rangle=z_1\overline{w_1}$, for $z=(z_1,z_2)$ and $w=(w_1,w_2)$, does not define an inner product on $\mathbb{C}^2$.
Find the length of $(1-2i,2+3i)$ using the complex dot product on $\mathbb{C}^2$.
Let $W$ be the subspace of $\mathbb{R}^4$ spanned by $(0,1,0,1)$ and $(2,0,-3,-1)$. Find a basis for the orthogonal complement $W^\perp$ using the dot product as inner product.
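One way to check a candidate answer numerically: $W^\perp$ is the nullspace of the matrix whose rows are the spanning vectors, and a nullspace basis can be read off from the SVD. This is an illustrative sketch, not the intended hand method.

```python
import numpy as np

# Rows of M span W; the orthogonal complement W-perp is the nullspace of M.
M = np.array([[0, 1, 0, 1],
              [2, 0, -3, -1]], dtype=float)

# In the SVD M = U S V^t, the rows of V^t beyond rank(M) span the nullspace.
_, _, Vt = np.linalg.svd(M)
rank = np.linalg.matrix_rank(M)
W_perp = Vt[rank:]           # here: 2 orthonormal vectors in R^4

# Every vector in this basis is orthogonal to both spanning vectors of W.
print(np.allclose(M @ W_perp.T, 0))  # True
```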
Let f:VV and g:VV be linear transformations on a finite dimensional inner product space V. Show that (fg)* =g*f*.
Which of the following matrices are (i) Hermitian, (ii) unitary, (iii) normal? $A=\begin{pmatrix}2&i\\-i&3\end{pmatrix}$, $B=\begin{pmatrix}1&i\\0&1\end{pmatrix}$, $C=\begin{pmatrix}0&i\\-i&0\end{pmatrix}$, $D=\begin{pmatrix}1&i\\1&2+i\end{pmatrix}$.
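Each property can be tested directly from its defining identity. The helper names below are my own; the sketch simply checks $M=M^*$, $MM^*=I$ and $MM^*=M^*M$ for each matrix.

```python
import numpy as np

def is_hermitian(M):
    return np.allclose(M, M.conj().T)

def is_unitary(M):
    return np.allclose(M @ M.conj().T, np.eye(len(M)))

def is_normal(M):
    return np.allclose(M @ M.conj().T, M.conj().T @ M)

A = np.array([[2, 1j], [-1j, 3]])
B = np.array([[1, 1j], [0, 1]])
C = np.array([[0, 1j], [-1j, 0]])
D = np.array([[1, 1j], [1, 2 + 1j]])

for name, M in [("A", A), ("B", B), ("C", C), ("D", D)]:
    print(name, is_hermitian(M), is_unitary(M), is_normal(M))
```

Note that $D$ is a reminder that a matrix can be normal without being either Hermitian or unitary.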
Find an orthonormal basis for $\mathbb{C}^2$ containing a multiple of $(1+i,1-i)$.
Let $W$ be a subspace of an inner product space $V$. Show that $W\subseteq (W^\perp)^\perp$.
Let $W$ be a subspace of an inner product space $V$. Show that if $\dim(V)$ is finite then $W=(W^\perp)^\perp$.
Let $f\colon V\to V$ be a linear transformation on an inner product space $V$. Show that $\ker f^*=(\mathrm{im}\,f)^\perp$.
Let $V$ be a vector space with a complex inner product $(\,,\,)$. Show that if $u,v\in V$ then $4(u,v)=\|u+v\|^2-\|u-v\|^2+i\|u+iv\|^2-i\|u-iv\|^2$.
Let $\ell^2$ be the vector space of sequences $a=(a_1,a_2,\dots)$ with $a_i\in\mathbb{C}$ such that $\sum_{i=1}^\infty |a_i|^2<\infty$. Let $(\,,\,)$ be the form on $\ell^2$ given by $(a,b)=\sum_{i=1}^\infty a_i\overline{b_i}$. Prove that this series is absolutely convergent and defines an inner product on $\ell^2$.
Let $(\,,\,)$ be an inner product on a complex inner product space $V$. Then $\langle v,w\rangle=\mathrm{Re}(v,w)$ defines a real inner product on $V$ regarded as a real vector space. Show that $(v,w)=\langle v,w\rangle+i\langle v,iw\rangle$. Deduce that $(v,w)=0$ if and only if $\langle v,w\rangle=0$ and $\langle v,iw\rangle=0$.
Find a unitary matrix $U$ such that $U^*AU$ is diagonal, where $A=\begin{pmatrix}1&i\\-i&1\end{pmatrix}$.
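A numerical check of the answer is possible because $A$ is Hermitian: numpy's `eigh` returns real eigenvalues together with a unitary matrix of eigenvectors. This sketch verifies the diagonalization rather than deriving it by hand.

```python
import numpy as np

A = np.array([[1, 1j], [-1j, 1]])

# For a Hermitian matrix, eigh returns real eigenvalues (ascending) and
# a unitary U whose columns are the corresponding eigenvectors.
eigvals, U = np.linalg.eigh(A)

D = U.conj().T @ A @ U   # should equal diag(eigvals)
print(np.round(eigvals, 10))  # eigenvalues 0 and 2 (trace 2, det 0)
```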
Prove that if $A$ is Hermitian then $A+iI$ is invertible.
Prove that if $Q$ is orthogonal then $Q+\frac{1}{2}I$ is invertible.
Show that any square matrix A can be written uniquely as a sum A=B+C, where B is Hermitian and C satisfies C* =-C. Show that A is normal if and only if B and C commute.
Let $F$ be the $n\times n$ "Fourier matrix" with $F_{jk}=\frac{1}{\sqrt{n}}\,\omega^{jk}$, where $\omega=e^{2\pi i/n}$. Show that $F$ is unitary. (This arises in the theory of the "Fast Fourier transform".)
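The unitarity claim can be checked numerically for a particular $n$; the orthogonality of the rows comes from the geometric-series identity $\sum_k \omega^{(j-l)k}=n\delta_{jl}$. A sketch (the `fourier_matrix` helper is my own):

```python
import numpy as np

def fourier_matrix(n):
    """n x n Fourier matrix F_jk = omega^(jk) / sqrt(n), omega = e^(2*pi*i/n)."""
    j = np.arange(n).reshape(-1, 1)   # row index
    k = np.arange(n).reshape(1, -1)   # column index
    return np.exp(2j * np.pi * j * k / n) / np.sqrt(n)

F = fourier_matrix(8)
# F F* = I is exactly the statement that F is unitary.
print(np.allclose(F @ F.conj().T, np.eye(8)))  # True
```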
Show that if A=UDU* where D is a diagonal matrix and U is unitary, then A is a normal matrix.
Show that a linear transformation $f\colon V\to V$ on a complex inner product space $V$ is normal if and only if $f$ satisfies $\langle f(u),f(v)\rangle=\langle f^*(u),f^*(v)\rangle$ for all $u,v\in V$.
Show that every normal matrix $A$ has a square root; that is, there exists a matrix $B$ such that $B^2=A$.
Must every complex matrix have a square root? Explain thoroughly.
Two linear transformations $f$ and $g$ on a finite dimensional complex inner product space are unitarily equivalent if there is a unitary linear transformation $u$ such that $g=u^{-1}fu$. Two matrices are unitarily equivalent if their linear transformations, with respect to some fixed orthonormal basis, are unitarily equivalent. Decide whether the matrices $\begin{pmatrix}1&1\\0&1\end{pmatrix}$ and $\begin{pmatrix}0&0\\1&0\end{pmatrix}$ are unitarily equivalent. Always explain your reasoning.
Decide whether the matrices $\begin{pmatrix}0&0&2\\0&0&0\\2&0&0\end{pmatrix}$ and $\begin{pmatrix}1&1&0\\1&1&0\\0&0&-1\end{pmatrix}$ are unitarily equivalent. Always explain your reasoning.
Decide whether the matrices $\begin{pmatrix}0&1&0\\-1&0&0\\0&0&-1\end{pmatrix}$ and $\begin{pmatrix}-1&0&0\\0&i&0\\0&0&-i\end{pmatrix}$ are unitarily equivalent. Always explain your reasoning.
Let f:VV be a linear transformation on an inner product space V. Are f and f* always unitarily equivalent?
If $f$ is a normal linear transformation on a finite dimensional inner product space, and if $f^2=f^3$, show that $f=f^2$. Show also that $f$ is self-adjoint.
If f is a normal linear transformation on a finite dimensional inner product space show that f*=p(f) for some polynomial p.
If f and g are normal linear transformations on a finite dimensional inner product space, and fg=gf, show that f*g=gf*.
Let V be an inner product space, let g:V V be a linear transformation and let f:V V be a normal linear transformation. Show that if fg=gf then f*g=gf*.
Let $V$ be an inner product space and let $f\colon V\to V$ be a linear transformation. Assume that $f(f^*f)=(f^*f)f$.
(a)   Show that $f^*f$ is normal.
(b)   Choose an orthonormal basis so that the matrix of $f^*f$ takes the block diagonal form $\mathrm{diag}(A_1,\dots,A_m)$, where $A_i=\lambda_i I_{m_i}$ and $\lambda_i=\lambda_j$ only if $i=j$.
(c)   Show that $f$ has matrix, with respect to this basis, of the block diagonal form $\mathrm{diag}(B_1,\dots,B_m)$, for some $m_i\times m_i$ matrices $B_i$.
(d)   Deduce that $B_i^*B_i=A_i$ and that $B_i^*B_i=B_iB_i^*$.
(e)   Show that $f$ is normal.
The following is a question (unedited) submitted to an Internet news group:
Hello,
I have a question hopefully any of you can help.

As you all know:

If we have a square matrix A, we can always find another
square matrix X such that

	X(-1) * A * X = J

where J is the matrix with Jordan normal form.  Column
vectors of X are called principal vectors of A.

(If J is a diagonal matrix, then the diagonal memebers are
the eigenvalues and column vectors of X are eigenvectors.)

It is also known that if A is real and symmetric matrix,
then we can find X such that X is "orthogonal" and J is
diagonal.

The question:

Are there any less strict conditions of A so that we can
guarantee X orthogonal, with J not necessarily a diagonal?

I would appreciate any answers and/or pointers to any
references.
Can you help?
