Group Theory and Linear Algebra

Semester II 2011

Last updated: 26 August 2011

(1) Week 6: Vocabulary

(2) Week 6: Results

(3) Week 6: Examples and computations

Define Hermitian form and inner product and give some illustrative examples. | |

Define length, orthogonal and orthonormal and give some illustrative examples. | |

Define matrix of a Hermitian form with respect to a basis and give some illustrative examples. | |

Define orthogonal complement and give some illustrative examples. | |

Define adjoint of a linear transformation and give some illustrative examples. | |

Define adjoint of a matrix and give some illustrative examples. | |

Define symmetric, orthogonal and normal linear transformations and give some illustrative examples. | |

Define symmetric, orthogonal and normal matrices and give some illustrative examples. | |

Define Hermitian, unitary and normal linear transformations and give some illustrative examples. | |

Define Hermitian, unitary and normal matrices and give some illustrative examples. |

Let $W$ be a finite dimensional inner product space. Show that an orthonormal subset of $W$ is linearly independent. | ||

Let $W$ be a finite dimensional inner product space. Show that an orthonormal subset of $W$ can be extended to an orthonormal basis. | ||

(Bessel's inequality) Let $S=\{{v}_{1},\dots ,{v}_{n}\}$ be an orthonormal subset of an inner product space $V$. Let $v\in V$ and set ${a}_{i}=(v,{v}_{i})$ for $i=1,2,\dots ,n$. Show that $$\sum _{i=1}^{n}{\left|{a}_{i}\right|}^{2}\le {\Vert v\Vert}^{2}.$$ | ||
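Bessel's inequality can be sanity-checked numerically before attempting the proof. The following plain-Python sketch (the vectors are an arbitrary choice, not part of the problem) tests it in ${\mathbb{R}}^{3}$ with the orthonormal set $\{e_1,e_2\}$:

```python
# Numerical check of Bessel's inequality in R^3 (not a proof).
# Orthonormal subset S = {e1, e2}, arbitrary vector v.
S = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
v = (3.0, -4.0, 12.0)

def inner(x, y):
    # Standard inner product on R^3.
    return sum(a * b for a, b in zip(x, y))

# a_i = (v, v_i); Bessel: sum |a_i|^2 <= ||v||^2.
lhs = sum(inner(v, vi) ** 2 for vi in S)   # 9 + 16 = 25
rhs = inner(v, v)                          # 9 + 16 + 144 = 169
assert lhs <= rhs
```

The gap $169-25=144$ is exactly ${\Vert v-\sum_i a_i v_i\Vert}^2$, which is the idea behind the proof.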

Let $S=\{{v}_{1},\dots ,{v}_{n}\}$ be an orthonormal subset of an inner product space $V$. Let $v\in V$. Show that $v-\sum _{i=1}^{n}(v,{v}_{i}){v}_{i}$ is orthogonal to each ${v}_{j}$. | ||

Let $S=\{{v}_{1},\dots ,{v}_{n}\}$ be an orthonormal subset of an inner product space $V$. Let $v\in V$ and set ${a}_{i}=(v,{v}_{i})$ for $i=1,2,\dots ,n$. Show that if $S$ is a basis of $V$ then $$v=\sum _{i=1}^{n}{a}_{i}{v}_{i}\phantom{\rule{2em}{0ex}}\text{and}\phantom{\rule{2em}{0ex}}\sum _{i=1}^{n}{\left|{a}_{i}\right|}^{2}={\Vert v\Vert}^{2}.$$ | ||

(Schwarz's inequality) Show that if $v$ and $w$ are elements of an inner product space $V$ then $$\left|\right(v,w\left)\right|\le \Vert v\Vert \cdot \Vert w\Vert .$$ | ||

(Triangle inequality) Show that if $v$ and $w$ are elements of an inner product space $V$ then $$\Vert v+w\Vert \le \Vert v\Vert +\Vert w\Vert .$$ | ||

Let $V$ be a finite dimensional inner product space and let $W$ be a subspace of $V$. Show that $${W}^{\perp}\phantom{\rule{.5em}{0ex}}\text{is a subspace of}\phantom{\rule{.5em}{0ex}}V\phantom{\rule{2em}{0ex}}\text{and}\phantom{\rule{2em}{0ex}}V=W\oplus {W}^{\perp}.$$ | ||

Let $f:V\to V$ be a linear transformation on a finite dimensional inner product space $V$. Show that the adjoint ${f}^{*}$ exists and is unique. | ||

Assume that $f:V\to V$ and $g:V\to V$ are linear transformations on an inner product space $V$ such that $$\text{if}\phantom{\rule{.5em}{0ex}}v,w\in V\phantom{\rule{2em}{0ex}}\text{then}\phantom{\rule{1em}{0ex}}\left(f\right(v),w)=\left(g\right(v),w).$$ Show that $f=g$. | ||

Let $V$ be an inner product space with an orthonormal basis $\mathcal{B}=\{{v}_{1},\dots ,{v}_{n}\}$. Suppose that a linear transformation $f:V\to V$ has a matrix $A$ with respect to $\mathcal{B}$. Show that the matrix of ${f}^{*}$ with respect to $\mathcal{B}$ is the matrix ${A}^{*}$ given by $${\left({A}^{*}\right)}_{ij}=\stackrel{\u203e}{{A}_{ji}}.$$ | ||
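The defining property of the adjoint can be checked numerically in ${\u2102}^{2}$ with the standard inner product: for ${A}^{*}$ the conjugate transpose, $(Au,w)=(u,{A}^{*}w)$ holds exactly. A plain-Python sketch (the matrix and vectors are arbitrary choices):

```python
# Check (Au, w) = (u, A*w) in C^2, where A* is the conjugate transpose.
A = [[1 + 2j, 3j], [4, 5 - 1j]]

def conj_transpose(M):
    return [[M[j][i].conjugate() for j in range(len(M))] for i in range(len(M[0]))]

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def inner(x, y):
    # Standard Hermitian inner product on C^n (conjugate-linear in the second slot).
    return sum(a * b.conjugate() for a, b in zip(x, y))

u, w = [1 + 1j, 2 - 3j], [-2j, 1 + 4j]
Astar = conj_transpose(A)
lhs = inner(matvec(A, u), w)
rhs = inner(u, matvec(Astar, w))
assert abs(lhs - rhs) < 1e-12
```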

Let $f:V\to V$
be a linear transformation on an inner product space $V$. Show that
the following are equivalent:
- (a) ${f}^{*}f=1$;
- (b) If $u,v\in V$ then $\left(f\right(u),f(v\left)\right)=(u,v)$;
- (c) If $v\in V$ then $\Vert f\left(v\right)\Vert =\Vert v\Vert $.
| ||

Let $f:V\to V$ be a linear transformation on an inner product space $V$. Let $W$ be an $f$-invariant subspace of $V$. Show that ${W}^{\perp}$ is ${f}^{*}$-invariant. | ||

Let $f:V\to V$ be a linear transformation on a finite dimensional real vector space $V$. Show that $V$ has an $f$-invariant subspace of dimension $\le 2$. | ||

Let $f:V\to V$ be an orthogonal linear transformation on a finite dimensional real vector space $V$. Show that there is an orthonormal basis of $V$ of the form $$\{{u}_{1},{v}_{1},{u}_{2},{v}_{2},\dots ,{u}_{k},{v}_{k},{w}_{1},\dots ,{w}_{\ell}\},$$ so that, for some ${\theta}_{1},\dots ,{\theta}_{k}$, $$f\left({u}_{i}\right)=\left(\mathrm{cos}{\theta}_{i}\right){u}_{i}+\left(\mathrm{sin}{\theta}_{i}\right){v}_{i}\phantom{\rule{2em}{0ex}}\text{and}\phantom{\rule{2em}{0ex}}f\left({v}_{i}\right)=(-\mathrm{sin}{\theta}_{i}){u}_{i}+\left(\mathrm{cos}{\theta}_{i}\right){v}_{i},$$ and $f\left({w}_{i}\right)=\pm {w}_{i}$. | ||

(Spectral theorem: first version) Let $f:V\to V$ be a normal linear transformation on a finite dimensional complex inner product space $V$. Show that there is an orthonormal basis for $V$ such that the matrix of $f$ with respect to this basis is diagonal. | ||

Let $f:V\to V$ be a normal linear transformation on a finite dimensional complex inner product space $V$. Show that there is a non-zero element of $V$ which is an eigenvector for both $f$ and ${f}^{*}$. Show that the two corresponding eigenvalues are complex conjugates. | ||

(Spectral theorem: second version)
Let $f:V\to V$ be a
normal linear transformation on a finite dimensional complex inner product space
$V$. Show that there exist self-adjoint (Hermitian) linear transformations
${e}_{1}:V\to V,\dots ,{e}_{k}:V\to V$
and scalars
${a}_{1},\dots ,{a}_{k}\in \u2102$ such that
- (a) If $i\ne j$ then ${a}_{i}\ne {a}_{j}$,
- (b) ${e}_{i}^{2}={e}_{i}$ and ${e}_{i}\ne 0$,
- (c) ${e}_{1}+\cdots +{e}_{k}=1$,
- (d) ${a}_{1}{e}_{1}+\cdots +{a}_{k}{e}_{k}=f$.
| ||

Let $f:V\to V$ be a
linear transformation on a finite dimensional complex inner product space
$V$. Show that
- (a) If $f$ is unitary then the eigenvalues of $f$ are of absolute value 1.
- (b) If $f$ is self-adjoint then the eigenvalues of $f$ are real.
| ||

Let $f:V\to V$ be a
linear transformation on a finite dimensional complex inner product space
$V$. Show that the following are equivalent:
- (a) $f$ is self adjoint and all eigenvalues of $f$ are nonnegative,
- (b) There exists a self-adjoint $g:V\to V$ such that $f={g}^{2}$,
- (c) There exists $h:V\to V$ such that $f=h{h}^{*}$,
- (d) $f$ is self adjoint and if $v\in V$ then $\left(f\right(v),v)\ge 0$.
| ||

Let $f:V\to V$ be a linear transformation on a finite dimensional complex inner product space $V$. Show that there exist a nonnegative linear transformation $p:V\to V$ and a unitary linear transformation $u:V\to V$ such that $f=pu$. | ||

Let $f:V\to V$ and $g:V\to V$ be normal linear transformations on a finite dimensional complex inner product space $V$. Assume that $fg=gf$. Show that there exists an orthonormal basis $B$ of $V$ such that the matrices of $f$ and $g$ with respect to the basis $B$ are diagonal. | ||

Let $f:V\to V$ and $g:V\to V$ be linear transformations on a finite dimensional complex inner product space $V$. Show that $fg=gf$ if and only if there exists a normal linear transformation $h:V\to V$ and polynomials $p,q\in \u2102\left[x\right]$ such that $f=p\left(h\right)$ and $g=q\left(h\right)$. |

Let $V={\mathbb{R}}^{n}$ and define $\u27e8,\u27e9:V\times V\to \mathbb{R}$ by $$\u27e8({a}_{1},{a}_{2},\dots ,{a}_{n}),({b}_{1},{b}_{2},\dots ,{b}_{n})\u27e9={a}_{1}{b}_{1}+{a}_{2}{b}_{2}+\cdots +{a}_{n}{b}_{n}.$$ Show that $\u27e8,\u27e9$ is a positive definite Hermitian form. | |

Let $V={\u2102}^{n}$ and define $\u27e8,\u27e9:V\times V\to \u2102$ by $$\u27e8({a}_{1},{a}_{2},\dots ,{a}_{n}),({b}_{1},{b}_{2},\dots ,{b}_{n})\u27e9={a}_{1}\stackrel{\u203e}{{b}_{1}}+{a}_{2}\stackrel{\u203e}{{b}_{2}}+\cdots +{a}_{n}\stackrel{\u203e}{{b}_{n}}.$$ Show that $\u27e8,\u27e9$ is a positive definite Hermitian form. | |

Let $V$ be any $n$-dimensional vector space over $\mathbb{R}$ and let $\{{v}_{1},{v}_{2},\dots ,{v}_{n}\}$ be a basis of $V$. Define $\u27e8,\u27e9:V\times V\to \mathbb{R}$ by $$\u27e8{a}_{1}{v}_{1}+\cdots +{a}_{n}{v}_{n},{b}_{1}{v}_{1}+\cdots +{b}_{n}{v}_{n}\u27e9={a}_{1}{b}_{1}+{a}_{2}{b}_{2}+\cdots +{a}_{n}{b}_{n}.$$ Show that $\u27e8,\u27e9$ is a positive definite Hermitian form. | |

Let $V$ be any $n$-dimensional vector space over $\u2102$ and let $\{{v}_{1},{v}_{2},\dots ,{v}_{n}\}$ be a basis of $V$. Define $\u27e8,\u27e9:V\times V\to \u2102$ by $$\u27e8{a}_{1}{v}_{1}+\cdots +{a}_{n}{v}_{n},{b}_{1}{v}_{1}+\cdots +{b}_{n}{v}_{n}\u27e9={a}_{1}\stackrel{\u203e}{{b}_{1}}+{a}_{2}\stackrel{\u203e}{{b}_{2}}+\cdots +{a}_{n}\stackrel{\u203e}{{b}_{n}}.$$ Show that $\u27e8,\u27e9$ is a positive definite Hermitian form. | |

Let $V={M}_{n\times n}\left(\u2102\right)$. Define $\u27e8,\u27e9:V\times V\to \u2102$ by $$\u27e8A,B\u27e9=\mathrm{trace}\left(A{\stackrel{\u203e}{B}}^{t}\right),$$ where $\mathrm{trace}\left(C\right)$ for a square matrix $C$, is the sum of the diagonal entries. Show that $\u27e8,\u27e9$ is a positive definite Hermitian form. | |
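The key identity behind positive definiteness here is $\u27e8A,A\u27e9=\sum_{i,j}{\left|{A}_{ij}\right|}^{2}$, which is $>0$ whenever $A\ne 0$. A plain-Python sketch verifying the identity on one arbitrary matrix (not a proof):

```python
# <A, B> = trace(A * conj(B)^t); check <A, A> = sum of |a_ij|^2 on an example.
A = [[1 + 1j, 2], [3j, -1 - 2j]]

def form(X, Y):
    n = len(X)
    # trace(X * conj(Y)^t) = sum over i, k of X[i][k] * conj(Y[i][k]).
    return sum(X[i][k] * Y[i][k].conjugate() for i in range(n) for k in range(n))

norm_sq = form(A, A)   # = |1+i|^2 + |2|^2 + |3i|^2 + |-1-2i|^2 = 2 + 4 + 9 + 5
assert abs(norm_sq - sum(abs(A[i][j]) ** 2 for i in range(2) for j in range(2))) < 1e-12
```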

Let $V=\u2102\left[x\right]$ be the vector space of polynomials with coefficients in $\u2102$. Define $\u27e8,\u27e9:V\times V\to \u2102$ by $$\u27e8p\left(x\right),q\left(x\right)\u27e9={\int}_{0}^{1}p\left(x\right)\stackrel{\u203e}{q\left(x\right)}\phantom{\rule{.2em}{0ex}}dx.$$ Show that $\u27e8,\u27e9$ is a positive definite Hermitian form. | |

Let $V=C\left(\right[a,b],\u2102)$ be the vector space of continuous functions $f:[a,b]\to \u2102$, where $[a,b]$ is the closed interval $\left\{t\phantom{\rule{.3em}{0ex}}\right|\phantom{\rule{.3em}{0ex}}a\le t\le b\}$. Define $\u27e8,\u27e9:V\times V\to \u2102$ by $$\u27e8f,g\u27e9={\int}_{a}^{b}f\left(t\right)\stackrel{\u203e}{g\left(t\right)}\phantom{\rule{.2em}{0ex}}dt.$$ Show that $\u27e8,\u27e9$ is a positive definite Hermitian form. | |

Using the standard inner product on ${\mathbb{R}}^{3}$ (as in Problem (1)) apply the Gram-Schmidt algorithm to the basis $\left\{\frac{1}{\sqrt{2}}\right(1,1,0),\phantom{\rule{.5em}{0ex}}\frac{1}{\sqrt{3}}(1,-1,1),\phantom{\rule{.5em}{0ex}}(0,0,1\left)\right\}$ of ${\mathbb{R}}^{3}$ to obtain an orthonormal basis of ${\mathbb{R}}^{3}$. | |
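The Gram-Schmidt algorithm itself can be sketched in a few lines of plain Python; the assertions below check only that the output of the algorithm on this basis is orthonormal, without giving the answer away:

```python
import math

def gram_schmidt(vectors):
    """Gram-Schmidt: returns an orthonormal list spanning the same space."""
    basis = []
    for v in vectors:
        w = list(v)
        # Subtract the projections onto the orthonormal vectors found so far.
        for u in basis:
            c = sum(a * b for a, b in zip(w, u))
            w = [a - c * b for a, b in zip(w, u)]
        norm = math.sqrt(sum(a * a for a in w))
        basis.append([a / norm for a in w])
    return basis

s2, s3 = 1 / math.sqrt(2), 1 / math.sqrt(3)
B = gram_schmidt([(s2, s2, 0), (s3, -s3, s3), (0, 0, 1)])
# Orthonormality check: B[i] . B[j] = delta_ij.
for i in range(3):
    for j in range(3):
        dot = sum(a * b for a, b in zip(B[i], B[j]))
        assert abs(dot - (1.0 if i == j else 0.0)) < 1e-12
```

Note that the first two vectors of this basis are already orthonormal, so only the third step does any real work.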

Using the standard inner product on polynomials (as in Problem (6)) apply the Gram-Schmidt algorithm to the basis $\{1,x\}$ of ${\mathcal{P}}_{1}\left(\mathbb{R}\right)=\{{a}_{0}+{a}_{1}x\phantom{\rule{.5em}{0ex}}|\phantom{\rule{.5em}{0ex}}{a}_{0},{a}_{1}\in \mathbb{R}\}$ to obtain an orthonormal basis of ${\mathcal{P}}_{1}\left(\mathbb{R}\right).$ | |
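The same algorithm works with the integral inner product, provided the projections and norms are computed by integrating over $[0,1]$. A plain-Python sketch, representing real polynomials by coefficient lists and integrating exactly (a numerical check of the computation, not a substitute for doing it by hand):

```python
import math

# Polynomials as coefficient lists [a0, a1, ...];
# <p, q> = integral from 0 to 1 of p(x) q(x) dx (real coefficients, no conjugate).
def integrate01(coeffs):
    return sum(c / (k + 1) for k, c in enumerate(coeffs))

def mul(p, q):
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def inner(p, q):
    return integrate01(mul(p, q))

# Gram-Schmidt on {1, x}:
p1 = [1.0]                                    # ||1|| = 1 already
c = inner([0.0, 1.0], p1)                     # <x, 1> = 1/2
w = [0.0 - c * p1[0], 1.0]                    # x - 1/2
p2 = [a / math.sqrt(inner(w, w)) for a in w]  # ||x - 1/2||^2 = 1/12
assert abs(inner(p2, p2) - 1.0) < 1e-12 and abs(inner(p1, p2)) < 1e-12
```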

Show that the orthogonal complement to a plane through the origin in ${\mathbb{R}}^{3}$ is the normal line through the origin. | |

Show that the orthogonal complement to a line through the origin in ${\mathbb{R}}^{3}$ is the plane through the origin to which it is normal. | |

Show that the orthogonal complement to the set of diagonal matrices in ${M}_{n\times n}\left(\mathbb{R}\right)$ is the set of matrices with zero entries on the diagonal. | |

Let $A$ be an $m\times n$ matrix with real entries. Show that the row space of $A$ is the orthogonal complement of the nullspace of $A$. | |

Show that if a linear transformation is represented by a symmetric matrix with respect to an orthonormal basis then it is self-adjoint. | |

Show that the matrices $$A=\left(\begin{array}{cc}1& 2\\ 2& 5\end{array}\right)\phantom{\rule{2em}{0ex}}\text{and}\phantom{\rule{2em}{0ex}}B=\left(\begin{array}{cc}1& 2-i\\ 2+i& 3\end{array}\right)$$ are self adjoint (Hermitian). | |

A skew-symmetric matrix is a square matrix $A$ with real entries such that
$A=-{A}^{t}$. Show that a skew-symmetric matrix
is normal. Determine which skew symmetric matrices are self adjoint.
| |

Show that the matrix $\left(\begin{array}{cc}1& 1\\ i& 3+2i\end{array}\right)$ is normal but is not self-adjoint or skew-symmetric or unitary. | |

Show that in dimension 2, the possibilities for orthogonal matrices up to similarity are $$\left(\begin{array}{cc}1& 0\\ 0& -1\end{array}\right)\phantom{\rule{2em}{0ex}}\text{and}\phantom{\rule{2em}{0ex}}\left(\begin{array}{cc}\mathrm{cos}\theta & -\mathrm{sin}\theta \\ \mathrm{sin}\theta & \mathrm{cos}\theta \end{array}\right)$$ for some $\theta \in [0,2\pi ]$. | |

Find the length of $(2+i,3-2i,-1)$ with respect to the standard inner product on ${\u2102}^{3}$. | |
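The computation reduces to $\Vert v\Vert =\sqrt{\sum_i {\left|{v}_{i}\right|}^{2}}$, which can be checked with Python's built-in complex arithmetic:

```python
import math

# Length with respect to the standard inner product on C^3.
v = [2 + 1j, 3 - 2j, -1]
length = math.sqrt(sum(abs(z) ** 2 for z in v))
# |2+i|^2 + |3-2i|^2 + |-1|^2 = 5 + 13 + 1 = 19, so length = sqrt(19).
assert abs(length - math.sqrt(19)) < 1e-12
```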

Find the length of ${x}^{2}-3x+1$ with respect to the standard inner product on polynomials. | |

Find the length of $\left(\begin{array}{cc}3& 2\\ 1& 4\end{array}\right)$ with respect to the standard inner product on matrices. | |

An exercise (from an anonymous textbook) claims that, if $V$ is an inner product space and $u,v\in V$ then $\Vert u+v\Vert +\Vert u-v\Vert =2\Vert u\Vert +2\Vert v\Vert $. Prove that this is false. Explain what was intended. | |

Let $f:V\to V$ and $g:V\to V$ be linear transformations on a finite dimensional inner product space $V$. Show that ${(f+g)}^{*}={f}^{*}+{g}^{*}$. | |

Let $A$ be a transition matrix between orthonormal bases. Show that $A$ is an isometry. | |

Let $f:V\to V$ be a linear transformation on an inner product space $V$. Show that if $f$ is self adjoint then the eigenvalues of $f$ are real. | |

Let $f:V\to V$ be a linear transformation on an inner product space $V$. Show that if $f$ is an isometry then eigenvalues of $f$ have absolute value 1. | |

Let $f:V\to V$ be a linear transformation on a finite dimensional inner product space $V$. Show that $\mathrm{im}{f}^{*}$ is the orthogonal complement of $\mathrm{ker}f$. Deduce that the rank of $f$ is equal to the rank of ${f}^{*}$. | |

Show that the linear transformation $d:\u2102\left[x\right]\to \u2102\left[x\right]$ given by differentiation with respect to $x$ has no adjoint with respect to the standard inner product on polynomials. (Hint: Try to find what ${d}^{*}\left(1\right)$ should be.) | |

Show that a triangular matrix which is self-adjoint is diagonal. | |

Show that a triangular matrix which is unitary is diagonal. | |

Let $f:V\to V$ be a linear transformation on an inner product space $V$. Assume that ${f}^{*}:V\to V$ is a function which satisfies $$\text{if}\phantom{\rule{.5em}{0ex}}u,w\in V\phantom{\rule{.5em}{0ex}}\text{then}\phantom{\rule{.5em}{0ex}}\u27e8f\left(u\right),w\u27e9=\u27e8u,{f}^{*}\left(w\right)\u27e9.$$ Show that ${f}^{*}$ is a linear transformation. | |

Explain why $$\u27e8z,w\u27e9={z}_{1}{w}_{1}+4{z}_{2}{w}_{2},\phantom{\rule{2em}{0ex}}\text{for}\phantom{\rule{1em}{0ex}}z=({z}_{1},{z}_{2})\phantom{\rule{.5em}{0ex}}\text{and}\phantom{\rule{.5em}{0ex}}w=({w}_{1},{w}_{2}),$$ does not define an inner product on ${\u2102}^{2}$. | |

Explain why $$\u27e8z,w\u27e9={z}_{1}\stackrel{\u203e}{{w}_{1}}-{z}_{2}\stackrel{\u203e}{{w}_{2}},\phantom{\rule{2em}{0ex}}\text{for}\phantom{\rule{1em}{0ex}}z=({z}_{1},{z}_{2})\phantom{\rule{.5em}{0ex}}\text{and}\phantom{\rule{.5em}{0ex}}w=({w}_{1},{w}_{2}),$$ does not define an inner product on ${\u2102}^{2}$. | |

Explain why $$\u27e8z,w\u27e9={z}_{1}\stackrel{\u203e}{{w}_{1}},\phantom{\rule{2em}{0ex}}\text{for}\phantom{\rule{1em}{0ex}}z=({z}_{1},{z}_{2})\phantom{\rule{.5em}{0ex}}\text{and}\phantom{\rule{.5em}{0ex}}w=({w}_{1},{w}_{2}),$$ does not define an inner product on ${\u2102}^{2}$. | |

Find the length of $(1-2i,2+3i)$ using the complex dot product on ${\u2102}^{2}$. | |

Let $W$ be the subspace of ${\mathbb{R}}^{4}$ spanned by $(0,1,0,1)$ and $(2,0,-3,-1)$. Find a basis for the orthogonal complement ${W}^{\perp}$ using the dot product as inner product. | |

Let $f:V\to V$ and $g:V\to V$ be linear transformations on a finite dimensional inner product space $V$. Show that ${\left(fg\right)}^{*}={g}^{*}{f}^{*}$. | |

Which of the following matrices are (i) Hermitian, (ii) unitary, (iii) normal? $$A=\left(\begin{array}{cc}2& i\\ -i& 3\end{array}\right),\phantom{\rule{2em}{0ex}}B=\left(\begin{array}{cc}1& i\\ 0& 1\end{array}\right),\phantom{\rule{2em}{0ex}}C=\left(\begin{array}{cc}0& i\\ -i& 0\end{array}\right),\phantom{\rule{2em}{0ex}}D=\left(\begin{array}{cc}1& i\\ 1& 2+i\end{array}\right).$$ | |
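Each of the three properties can be tested mechanically: $M={M}^{*}$ (Hermitian), $M{M}^{*}=I$ (unitary), $M{M}^{*}={M}^{*}M$ (normal). A plain-Python classifier, useful for checking hand computations on these and similar $2\times 2$ examples:

```python
def conj_transpose(M):
    return [[M[j][i].conjugate() for j in range(len(M))] for i in range(len(M[0]))]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y))) for j in range(len(Y[0]))]
            for i in range(len(X))]

def close(X, Y, tol=1e-12):
    return all(abs(a - b) < tol for r1, r2 in zip(X, Y) for a, b in zip(r1, r2))

def classify(M):
    Ms = conj_transpose(M)
    I = [[1.0 if i == j else 0.0 for j in range(len(M))] for i in range(len(M))]
    return {"hermitian": close(M, Ms),
            "unitary": close(matmul(M, Ms), I),
            "normal": close(matmul(M, Ms), matmul(Ms, M))}

A = [[2, 1j], [-1j, 3]]
C = [[0, 1j], [-1j, 0]]
assert classify(A)["hermitian"] and classify(A)["normal"]
assert classify(C)["hermitian"] and classify(C)["unitary"]
```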

Find an orthonormal basis for ${\u2102}^{2}$ containing a multiple of $(1+i,1-i)$. | |

Let $W$ be a subspace of an inner product space $V$. Show that $W\subseteq {\left({W}^{\perp}\right)}^{\perp}$. | |

Let $W$ be a subspace of an inner product space $V$. Show that if $\mathrm{dim}\left(V\right)$ is finite then $W={\left({W}^{\perp}\right)}^{\perp}$. | |

Let $f:V\to V$ be a linear transformation on an inner product space $V$. Show that $\mathrm{ker}{f}^{*}={\left(\mathrm{im}f\right)}^{\perp}$. | |

Let $V$ be a vector space with a complex inner product $(,)$. Show that if $u,v\in V$ then $$4(u,v)={\Vert u+v\Vert}^{2}-{\Vert u-v\Vert}^{2}+i{\Vert u+iv\Vert}^{2}-i{\Vert u-iv\Vert}^{2}.$$ | |

Let ${\ell}^{2}$ be the vector space of sequences $\overrightarrow{a}=({a}_{1},{a}_{2},\dots )$ with ${a}_{i}\in \u2102$ such that $\sum _{i=1}^{\infty}{\left|{a}_{i}\right|}^{2}<\infty $. Define $(,)$ on ${\ell}^{2}$ by $$(\overrightarrow{a},\overrightarrow{b})=\sum _{i=1}^{\infty}{a}_{i}\stackrel{\u203e}{{b}_{i}}.$$ Prove that this series is absolutely convergent and that $(,)$ defines an inner product on ${\ell}^{2}$. | |

Let $(,)$ be an inner product on a complex inner product space $V$. Further $$\u27e8v,w\u27e9=\mathrm{Re}(v,w)$$ defines a real inner product on $V$ regarded as a real vector space. Show that $$(v,w)=\u27e8v,w\u27e9+i\u27e8v,iw\u27e9.$$ Deduce that $(v,w)=0$ if and only if $\u27e8v,w\u27e9=0$ and $\u27e8v,iw\u27e9=0.$ | |

Find a unitary matrix $U$ such that ${U}^{*}AU$ is diagonal where $A=\left(\begin{array}{cc}1& i\\ -i& 1\end{array}\right).$ | |

Show that every normal matrix $A$ has a square root. | |

Prove that if $A$ is Hermitian then $A+iI$ is invertible. | |

Prove that if $Q$ is orthogonal then $Q+\frac{1}{2}I$ is invertible. | |

Show that any square matrix $A$ can be written uniquely as a sum $A=B+C$, where $B$ is Hermitian and $C$ satisfies ${C}^{*}=-C$. Show that $A$ is normal if and only if $B$ and $C$ commute. | |

Let $F$ be the $n\times n$ "Fourier matrix" with ${F}_{jk}=\frac{1}{\sqrt{n}}{\omega}^{jk}$, where $\omega ={e}^{2\pi i/n}$. Show that $F$ is unitary. (This arises in the theory of the "Fast Fourier transform".) | |
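Unitarity of $F$ amounts to the rows being orthonormal: $\sum _{m}{F}_{jm}\stackrel{\u203e}{{F}_{km}}={\delta}_{jk}$, which follows because $\sum _{m}{\omega}^{(j-k)m}=0$ when ${\omega}^{j-k}\ne 1$. A plain-Python numerical check for $n=4$ (not a proof):

```python
import cmath

def fourier_matrix(n):
    """F[j][k] = omega^(j*k) / sqrt(n), where omega = exp(2*pi*i/n)."""
    omega = cmath.exp(2j * cmath.pi / n)
    s = n ** -0.5
    return [[s * omega ** (j * k) for k in range(n)] for j in range(n)]

n = 4
F = fourier_matrix(n)
# Row orthonormality: row_j . conj(row_k) = delta_jk, i.e. F F* = I.
for j in range(n):
    for k in range(n):
        dot = sum(F[j][m] * F[k][m].conjugate() for m in range(n))
        assert abs(dot - (1.0 if j == k else 0.0)) < 1e-9
```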

Show that if $A=UD{U}^{*}$ where $D$ is a diagonal matrix and $U$ is unitary, then $A$ is a normal matrix. | |

Show that a linear transformation $f:V\to V$ on a complex inner product space $V$ is normal if and only if $f$ satisfies $\u27e8f\left(u\right),f\left(v\right)\u27e9=\u27e8{f}^{*}\left(u\right),{f}^{*}\left(v\right)\u27e9$ for all $u,v\in V$. | |

Show that every normal matrix $A$ has a square root; that is, there exists a matrix $B$ such that ${B}^{2}=A$. | |

Must every complex matrix have a square root? Explain thoroughly. | |

Two linear transformations $f$ and $g$ on a finite dimensional
complex inner product space are unitarily equivalent if there is a unitary linear
transformation $u$ such that $g={u}^{-1}fu$. Two matrices are unitarily equivalent if their linear transformations,
with respect to some fixed orthonormal basis, are unitarily equivalent. Decide whether
the matrices
$$\left(\begin{array}{cc}1& 1\\ 0& 1\end{array}\right)\phantom{\rule{2em}{0ex}}\text{and}\phantom{\rule{2em}{0ex}}\left(\begin{array}{cc}0& 0\\ 1& 0\end{array}\right)$$
are unitarily equivalent. Always explain your reasoning.
| |

Decide whether the matrices $$\left(\begin{array}{ccc}0& 0& 2\\ 0& 0& 0\\ 2& 0& 0\end{array}\right)\phantom{\rule{2em}{0ex}}\text{and}\phantom{\rule{2em}{0ex}}\left(\begin{array}{ccc}1& 1& 0\\ 1& 1& 0\\ 0& 0& -1\end{array}\right)$$ are unitarily equivalent. Always explain your reasoning. | |

Decide whether the matrices $$\left(\begin{array}{ccc}0& 1& 0\\ -1& 0& 0\\ 0& 0& -1\end{array}\right)\phantom{\rule{2em}{0ex}}\text{and}\phantom{\rule{2em}{0ex}}\left(\begin{array}{ccc}-1& 0& 0\\ 0& i& 0\\ 0& 0& -i\end{array}\right)$$ are unitarily equivalent. Always explain your reasoning. | |

Let $f:V\to V$ be a linear transformation on an inner product space $V$. Are $f$ and ${f}^{*}$ always unitarily equivalent? | |

If $f$ is a normal linear transformation on a finite dimensional inner product space, and if ${f}^{2}={f}^{3}$, show that $f={f}^{2}$. Show also that $f$ is self adjoint. | |

If $f$ is a normal linear transformation on a finite dimensional inner product space show that ${f}^{*}=p\left(f\right)$ for some polynomial $p$. | |

If $f$ and $g$ are normal linear transformations on a finite dimensional inner product space, and $fg=gf$, show that ${f}^{*}g=g{f}^{*}$. | |

Let $V$ be an inner product space, let $g:V\to V$ be a linear transformation and let $f:V\to V$ be a normal linear transformation. Show that if $fg=gf$ then ${f}^{*}g=g{f}^{*}$. | |

Let $V$ be an inner product space
and let $f:V\to V$ be a linear transformation.
Assume that
$f\left({f}^{*}f\right)=\left({f}^{*}f\right)f$.
- (a) Show that ${f}^{*}f$ is normal.
- (b) Choose an orthonormal basis so that the matrix of ${f}^{*}f$ takes the block diagonal form $\mathrm{diag}({A}_{1},\dots ,{A}_{m})$, where ${A}_{i}={\lambda}_{i}{I}_{{m}_{i}}$ and ${\lambda}_{i}={\lambda}_{j}$ only if $i=j$.
- (c) Show that $f$ has matrix, with respect to this basis, of the block diagonal form $\mathrm{diag}({B}_{1},\dots ,{B}_{m})$, for some ${m}_{i}\times {m}_{i}$ matrices ${B}_{i}$.
- (d) Deduce that ${B}_{i}^{*}{B}_{i}={A}_{i}$ and that ${B}_{i}^{*}{B}_{i}={B}_{i}{B}_{i}^{*}$.
- (e) Show that $f$ is normal.
| |

The following is a question (unedited) submitted to an Internet news group:
Can you help?Hello, I have a question hopefully any of you can help. As you all know: If we have a square matrix A, we can always find another square matrix X such that X(-1) * A * X = J where J is the matrix with Jordan normal form. Column vectors of X are called principal vectors of A. (If J is a diagonal matrix, then the diagonal memebers are the eigenvalues and column vectors of X are eigenvectors.) It is also known that if A is real and symmetric matrix, then we can find X such that X is "orthogonal" and J is diagonal. The question: Are there any less strict conditions of A so that we can guarantee X orthogonal, with J not necessarily a diagonal? I would appreciate any answers and/or pointers to any references. |
