Last updated: 26 August 2011
(1) Week 6: Vocabulary
(2) Week 6: Results
(3) Week 6: Examples and computations
| Define Hermitian form and inner product and give some illustrative examples. | |
| Define length, orthogonal and orthonormal and give some illustrative examples. | |
| Define matrix of a Hermitian form with respect to a basis and give some illustrative examples. | |
| Define orthogonal complement and give some illustrative examples. | |
| Define adjoint of a linear transformation and give some illustrative examples. | |
| Define adjoint of a matrix and give some illustrative examples. | |
| Define symmetric, orthogonal and normal linear transformations and give some illustrative examples. | |
| Define symmetric, orthogonal and normal matrices and give some illustrative examples. | |
| Define Hermitian, unitary and normal linear transformations and give some illustrative examples. | |
| Define Hermitian, unitary and normal matrices and give some illustrative examples. | |
| Let $V$ be a finite dimensional inner product space. Show that an orthonormal subset of $V$ is linearly independent. | ||
| Let $V$ be a finite dimensional inner product space. Show that an orthonormal subset of $V$ can be extended to an orthonormal basis of $V$. | ||
| (Bessel's inequality) Let $\{v_1, \ldots, v_n\}$ be an orthonormal subset of an inner product space $V$. Let $w \in V$ and set $c_i = \langle w, v_i \rangle$ for $i = 1, \ldots, n$. Show that $\sum_{i=1}^n |c_i|^2 \le \|w\|^2$. | ||
| Let $\{v_1, \ldots, v_n\}$ be an orthonormal subset of an inner product space $V$. Let $w \in V$. Show that $w - \sum_{i=1}^n \langle w, v_i \rangle v_i$ is orthogonal to each $v_j$. | ||
| Let $\{v_1, \ldots, v_n\}$ be an orthonormal subset of an inner product space $V$. Let $w \in V$ and set $c_i = \langle w, v_i \rangle$ for $i = 1, \ldots, n$. Show that if $\{v_1, \ldots, v_n\}$ is a basis of $V$ then $w = \sum_{i=1}^n c_i v_i$. | ||
| (Schwarz's inequality) Show that if $u$ and $v$ are elements of an inner product space then $|\langle u, v \rangle| \le \|u\| \, \|v\|$. | ||
| (Triangle inequality) Show that if $u$ and $v$ are elements of an inner product space then $\|u + v\| \le \|u\| + \|v\|$. | ||
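As a concrete sanity check, both inequalities can be verified numerically for the standard Hermitian inner product on $\mathbb{C}^2$; the vectors below are my own sample data, not taken from the notes:

```python
# Numeric check of the Schwarz and triangle inequalities in C^2
# with the standard Hermitian inner product <u, v> = sum u_i conj(v_i).
# The vectors u, v are arbitrary sample data.

def inner(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v))

def norm(u):
    # inner(u, u) is real and nonnegative
    return abs(inner(u, u)) ** 0.5

u = [1 + 2j, 3 - 1j]
v = [2 - 1j, 1 + 1j]

assert abs(inner(u, v)) <= norm(u) * norm(v)                      # Schwarz
assert norm([a + b for a, b in zip(u, v)]) <= norm(u) + norm(v)   # triangle
```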
| Let $V$ be a finite dimensional inner product space and let $W$ be a subspace of $V$. Show that $V = W \oplus W^{\perp}$. | ||
| Let $T$ be a linear transformation on a finite dimensional inner product space $V$. Show that the adjoint $T^*$ exists and is unique. | ||
| Assume that $T$, $S_1$ and $S_2$ are linear transformations on an inner product space $V$ such that $\langle T v, w \rangle = \langle v, S_1 w \rangle = \langle v, S_2 w \rangle$ for all $v, w \in V$. Show that $S_1 = S_2$. | ||
| Let $V$ be an inner product space with an orthonormal basis $B = \{v_1, \ldots, v_n\}$. Suppose that a linear transformation $T$ has a matrix $A$ with respect to $B$. Show that the matrix of $T^*$ with respect to $B$ is the matrix $A^*$ given by $(A^*)_{ij} = \overline{A_{ji}}$. | ||
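The conjugate-transpose rule can be checked numerically: with the standard inner product on $\mathbb{C}^2$, the defining identity of the adjoint, $\langle A v, w \rangle = \langle v, A^* w \rangle$, holds when $A^*$ is taken to be the conjugate transpose. The matrix and vectors below are arbitrary sample data:

```python
# Check <Av, w> = <v, A*w> in C^2, where A* is the conjugate transpose.
# A, v, w are arbitrary sample data.

def inner(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v))

def mat_vec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def conj_transpose(M):
    return [[M[j][i].conjugate() for j in range(len(M))] for i in range(len(M[0]))]

A = [[1 + 1j, 2], [0, 3 - 2j]]
v = [1, 1j]
w = [2 - 1j, 1]

lhs = inner(mat_vec(A, v), w)
rhs = inner(v, mat_vec(conj_transpose(A), w))
assert abs(lhs - rhs) < 1e-12
```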
Let $T$ be a linear transformation on an inner product space $V$. Show that the following are equivalent: (a) $T^* T = \mathrm{id}_V$; (b) $\langle T v, T w \rangle = \langle v, w \rangle$ for all $v, w \in V$; (c) $\|T v\| = \|v\|$ for all $v \in V$. | ||
| Let $T$ be a linear transformation on an inner product space $V$. Let $W$ be a $T$-invariant subspace of $V$. Show that $W^{\perp}$ is $T^*$-invariant. | ||
| Let $T$ be a linear transformation over a finite dimensional real vector space $V$. Show that $V$ has a $T$-invariant subspace of dimension $1$ or $2$. | ||
| Let $T$ be an orthogonal linear transformation on a finite dimensional real vector space $V$. Show that there is an orthonormal basis of $V$ of the form $\{u_1, \ldots, u_r, w_1, w_1', \ldots, w_s, w_s'\}$ so that, for some angles $\theta_1, \ldots, \theta_s$, $T u_i = \pm u_i$ for each $i$ and $T$ acts on the plane spanned by $w_j, w_j'$ as rotation through the angle $\theta_j$ for each $j$. | ||
| (Spectral theorem: first version) Let $T$ be a normal linear transformation on a finite dimensional complex inner product space $V$. Show that there is an orthonormal basis for $V$ such that the matrix of $T$ with respect to this basis is diagonal. | ||
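A small worked instance of the spectral theorem: the rotation matrix $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ is normal (it is orthogonal), with eigenvalues $\pm i$, and the unitary matrix of normalized eigenvectors diagonalizes it. This specific example is mine, not taken from the notes:

```python
# Unitary diagonalization of the normal matrix A = [[0, -1], [1, 0]]:
# eigenvalues are i and -i, with normalized eigenvectors (1, -i)/sqrt(2)
# and (1, i)/sqrt(2) as the columns of U. Then U* A U is diagonal.

s = 2 ** -0.5
A = [[0, -1], [1, 0]]
U = [[s, s], [-1j * s, 1j * s]]   # columns: eigenvectors for i and -i

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def conj_transpose(M):
    return [[M[j][i].conjugate() for j in range(2)] for i in range(2)]

D = mat_mul(conj_transpose(U), mat_mul(A, U))
assert abs(D[0][0] - 1j) < 1e-12 and abs(D[1][1] + 1j) < 1e-12
assert abs(D[0][1]) < 1e-12 and abs(D[1][0]) < 1e-12
```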
| Let $T$ be a normal linear transformation on a finite dimensional complex inner product space $V$. Show that there is a non-zero element of $V$ which is an eigenvector for both $T$ and $T^*$. Show that the two corresponding eigenvalues are complex conjugates. | ||
(Spectral theorem: second version) Let $T$ be a normal linear transformation on a finite dimensional complex inner product space $V$. Show that there exist self-adjoint (Hermitian) linear transformations $P_1, \ldots, P_k$ and scalars $\lambda_1, \ldots, \lambda_k$ such that $T = \lambda_1 P_1 + \cdots + \lambda_k P_k$, $P_i P_j = 0$ for $i \ne j$, and $P_1 + \cdots + P_k = \mathrm{id}_V$. | ||
Let $T$ be a linear transformation on a finite dimensional complex inner product space $V$. Show that there is an orthonormal basis of $V$ with respect to which the matrix of $T$ is upper triangular. | ||
Let $T$ be a linear transformation on a finite dimensional complex inner product space $V$. Show that the following are equivalent: (a) $T$ is normal, i.e. $T T^* = T^* T$; (b) $\|T v\| = \|T^* v\|$ for all $v \in V$. | ||
| Let $T$ be a linear transformation on a finite dimensional complex inner product space $V$. Show that there exist a nonnegative self-adjoint linear transformation $P$ and a unitary linear transformation $U$ such that $T = U P$. | ||
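A worked polar decomposition: for the sample matrix $A = \begin{pmatrix} 0 & -2 \\ 1 & 0 \end{pmatrix}$ one finds $A^T A = \mathrm{diag}(1, 4)$, so its nonnegative square root is $P = \mathrm{diag}(1, 2)$ and $U = A P^{-1}$ is a rotation. The example is mine, chosen so the square root can be read off by hand:

```python
# Polar decomposition A = U P with P = sqrt(A^T A) nonnegative and
# U orthogonal, for the sample matrix A = [[0, -2], [1, 0]].

A = [[0, -2], [1, 0]]
P = [[1, 0], [0, 2]]     # nonnegative square root of A^T A = diag(1, 4)
U = [[0, -1], [1, 0]]    # A = U P; U is rotation by 90 degrees

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

assert mat_mul(U, P) == A                       # A = U P
UT = [[U[j][i] for j in range(2)] for i in range(2)]
assert mat_mul(UT, U) == [[1, 0], [0, 1]]       # U is orthogonal
AT = [[A[j][i] for j in range(2)] for i in range(2)]
assert mat_mul(P, P) == mat_mul(AT, A)          # P^2 = A^T A
```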
| Let $S$ and $T$ be linear transformations on a finite dimensional complex inner product space $V$. Assume that $S$ and $T$ are normal and $S T = T S$. Show that there exists an orthonormal basis of $V$ such that the matrices of $S$ and $T$ with respect to the basis are diagonal. | ||
| Let $S$ and $T$ be normal linear transformations on a finite dimensional complex inner product space $V$. Show that $S T = T S$ if and only if there exist a normal linear transformation $N$ and polynomials $p$ and $q$ such that $S = p(N)$ and $T = q(N)$. |
| Let $V = \mathbb{R}^n$ and define $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{R}$ by $\langle u, v \rangle = u_1 v_1 + \cdots + u_n v_n$ for $u = (u_1, \ldots, u_n)$ and $v = (v_1, \ldots, v_n)$. Show that $\langle \cdot, \cdot \rangle$ is a positive definite Hermitian form. | |
| Let $V = \mathbb{C}^n$ and define $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{C}$ by $\langle u, v \rangle = u_1 \overline{v_1} + \cdots + u_n \overline{v_n}$ for $u = (u_1, \ldots, u_n)$ and $v = (v_1, \ldots, v_n)$. Show that $\langle \cdot, \cdot \rangle$ is a positive definite Hermitian form. | |
| Let $V$ be any $n$-dimensional vector space over $\mathbb{R}$ and let $\{b_1, \ldots, b_n\}$ be a basis of $V$. Define $\langle \cdot, \cdot \rangle$ by $\langle u, v \rangle = u_1 v_1 + \cdots + u_n v_n$, where $u = u_1 b_1 + \cdots + u_n b_n$ and $v = v_1 b_1 + \cdots + v_n b_n$. Show that $\langle \cdot, \cdot \rangle$ is a positive definite Hermitian form. | |
| Let $V$ be any $n$-dimensional vector space over $\mathbb{C}$ and let $\{b_1, \ldots, b_n\}$ be a basis of $V$. Define $\langle \cdot, \cdot \rangle$ by $\langle u, v \rangle = u_1 \overline{v_1} + \cdots + u_n \overline{v_n}$, where $u = u_1 b_1 + \cdots + u_n b_n$ and $v = v_1 b_1 + \cdots + v_n b_n$. Show that $\langle \cdot, \cdot \rangle$ is a positive definite Hermitian form. | |
| Let $V = M_n(\mathbb{C})$. Define $\langle \cdot, \cdot \rangle$ by $\langle A, B \rangle = \mathrm{tr}(B^* A)$, where for a square matrix $C$, $\mathrm{tr}(C)$ is the sum of the diagonal entries. Show that $\langle \cdot, \cdot \rangle$ is a positive definite Hermitian form. | |
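Positivity of the trace form follows because $\mathrm{tr}(A^* A) = \sum_{i,j} |a_{ij}|^2$; this can be checked numerically for a sample $2 \times 2$ matrix (the matrix below is my own sample data, and whether the notes write the form as $\mathrm{tr}(B^* A)$ or $\mathrm{tr}(A B^*)$ is my assumption; both give the same form):

```python
# Check that tr(A* A) equals the sum of |a_ij|^2 for a sample 2x2
# complex matrix, so the trace form is real and positive for A != 0.

def conj_transpose(M):
    return [[M[j][i].conjugate() for j in range(2)] for i in range(2)]

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(M):
    return M[0][0] + M[1][1]

A = [[1 + 1j, 2], [0, -3j]]
form = trace(mat_mul(conj_transpose(A), A))   # <A, A> = tr(A* A)
assert abs(form - sum(abs(a) ** 2 for row in A for a in row)) < 1e-12
```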
| Let $V$ be the vector space of polynomials with coefficients in $\mathbb{C}$. Define $\langle \cdot, \cdot \rangle$ by $\langle f, g \rangle = \int_0^1 f(t) \overline{g(t)} \, dt$. Show that $\langle \cdot, \cdot \rangle$ is a positive definite Hermitian form. | |
| Let $V$ be the vector space of continuous functions $f : [a, b] \to \mathbb{C}$, where $[a, b]$ is the closed interval $\{t \in \mathbb{R} \mid a \le t \le b\}$. Define $\langle \cdot, \cdot \rangle$ by $\langle f, g \rangle = \int_a^b f(t) \overline{g(t)} \, dt$. Show that $\langle \cdot, \cdot \rangle$ is a positive definite Hermitian form. | |
| Using the standard inner product on $\mathbb{R}^n$ (as in Problem (1)), apply the Gram-Schmidt algorithm to the given basis to obtain an orthonormal basis of $\mathbb{R}^n$. | |
| Using the standard inner product on polynomials (as in Problem (6)), apply the Gram-Schmidt algorithm to the given basis to obtain an orthonormal basis of its span. | |
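The Gram-Schmidt computations in these exercises can be sketched in code; here is a minimal version over $\mathbb{R}^3$ with the standard dot product, run on a sample basis of my own choosing (the bases stated in the exercises did not survive conversion):

```python
# Minimal Gram-Schmidt over R^3 with the standard dot product.
# The input basis is sample data.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    ortho = []
    for v in basis:
        w = list(v)
        for u in ortho:                        # subtract projection onto u
            c = dot(v, u)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        n = dot(w, w) ** 0.5
        ortho.append([wi / n for wi in w])     # normalize
    return ortho

e = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
for i in range(3):
    for j in range(3):
        assert abs(dot(e[i], e[j]) - (1 if i == j else 0)) < 1e-12
```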
| Show that the orthogonal complement to a plane through the origin in $\mathbb{R}^3$ is the normal line through the origin. | |
| Show that the orthogonal complement to a line through the origin in $\mathbb{R}^3$ is the plane through the origin to which it is normal. | |
| Show that the orthogonal complement to the set of diagonal matrices in $M_n(\mathbb{R})$, with the trace inner product as in Problem (5), is the set of matrices with zero entries on the diagonal. | |
| Let $A$ be an $m \times n$ matrix with real entries. Show that the row space of $A$ is the orthogonal complement of the nullspace of $A$. | |
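A quick numeric illustration of the row space/nullspace relation: every row of a real matrix has zero dot product with every nullspace vector. The matrix and its null vector below are sample data:

```python
# Each row of A is orthogonal to the nullspace vector n (A n = 0),
# illustrating row space ⟂ nullspace. A and n are sample data.

A = [[1, 2, 3], [4, 5, 6]]
n = [1, -2, 1]                      # satisfies A n = 0
assert all(sum(r * x for r, x in zip(row, n)) == 0 for row in A)
```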
| Show that if a linear transformation $T$ is represented by a symmetric matrix with respect to an orthonormal basis then $T$ is self-adjoint. | |
| Show that the matrices are self-adjoint (Hermitian). | |
| A skew-symmetric matrix is a square matrix $A$ with real entries such that $A^t = -A$. Show that a skew-symmetric matrix is normal. Determine which skew-symmetric matrices are self-adjoint. | |
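The normality claim is easy to check in an example: a skew-symmetric $A$ satisfies $A A^t = A^t A = -A^2$, so it commutes with its transpose. The $2 \times 2$ matrix below is sample data:

```python
# A skew-symmetric matrix commutes with its transpose (A A^t = A^t A),
# hence is normal. The 2x2 example is sample data.

A = [[0, 2], [-2, 0]]
AT = [[A[j][i] for j in range(2)] for i in range(2)]   # equals -A

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

assert mat_mul(A, AT) == mat_mul(AT, A)
# Note: a matrix that is both symmetric (A^t = A) and skew-symmetric
# (A^t = -A) must satisfy A = -A, so only the zero matrix is both.
```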
| Show that the matrix is normal but is neither self-adjoint, skew-symmetric, nor unitary. | |
| Show that in dimension 2, the possibilities for orthogonal matrices up to similarity are $\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$ for some $\theta$, and $\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$. | |
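Both families in the classification really are orthogonal, i.e. satisfy $Q^t Q = I$; a numeric check for a sample angle (the value of $\theta$ below is arbitrary):

```python
# Check Q^t Q = I for a rotation and a reflection in dimension 2.
# The angle t is an arbitrary sample value.
import math

t = 0.7
rot = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]
ref = [[1, 0], [0, -1]]

def is_orthogonal(Q):
    QT = [[Q[j][i] for j in range(2)] for i in range(2)]
    prod = [[sum(QT[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]
    return all(abs(prod[i][j] - (1 if i == j else 0)) < 1e-12
               for i in range(2) for j in range(2))

assert is_orthogonal(rot) and is_orthogonal(ref)
```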
| Find the length of the given vector with respect to the standard inner product on $\mathbb{R}^n$. | |
| Find the length of the given polynomial with respect to the standard inner product on polynomials. | |
| Find the length of the given matrix with respect to the standard inner product on matrices. | |
| An exercise (from an anonymous textbook) claims that, if $V$ is an inner product space and $u, v \in V$, then | |
|
Let | |
|
Let | |
|
Let | |
|
Let | |
|
Let | |
|
Show that the linear transformation | |
| Show that a triangular matrix which is self-adjoint is diagonal. | |
| Show that a triangular matrix which is unitary is diagonal. | |
|
Let | |
|
Explain why
| |
|
Explain why
| |
|
Explain why
| |
|
Find the length of | |
|
Let | |
|
Let | |
|
Which of the following matrices are (i) Hermitian, (ii) unitary, (iii) normal?
| |
|
Find an orthonormal basis for | |
|
Let | |
|
Let | |
|
Let | |
|
Let | |
|
Let | |
|
Let | |
|
Find a unitary matrix | |
|
Show that every normal matrix | |
|
Prove that if | |
|
Prove that if | |
|
Show that any square matrix | |
|
Let | |
|
Show that if | |
|
Show that a linear transformation | |
|
Show that every normal matrix | |
| Must every complex matrix have a square root? Explain thoroughly. | |
|
Two linear transformations | |
|
Decide whether
the matrices
| |
|
Decide whether
the matrices
| |
|
Let | |
|
If | |
|
If | |
|
If | |
|
Let | |
Let
| |
The following is a question (unedited) submitted to an Internet news group:
Can you help? Hello, I have a question hopefully any of you can help. As you all know: If we have a square matrix A, we can always find another square matrix X such that X^(-1) * A * X = J where J is the matrix with Jordan normal form. Column vectors of X are called principal vectors of A. (If J is a diagonal matrix, then the diagonal memebers are the eigenvalues and column vectors of X are eigenvectors.) It is also known that if A is real and symmetric matrix, then we can find X such that X is "orthogonal" and J is diagonal. The question: Are there any less strict conditions of A so that we can guarantee X orthogonal, with J not necessarily a diagonal? I would appreciate any answers and/or pointers to any references. |
[GH] J.R.J. Groves and C.D. Hodgson, Notes for 620-297: Group Theory and Linear Algebra, 2009.
[Ra] A. Ram, Notes in abstract algebra, University of Wisconsin, Madison 1994.