## Invertibility

Last update: 13 August 2013


• An $n×n$ matrix $A$ is invertible if there is a matrix ${A}^{-1}$ such that $A{A}^{-1}={A}^{-1}A={I}_{n}\text{.}$

${GL}_{n}\left(R\right)$ is a group with identity ${I}_{n}\text{.}$

${B}_{n},{U}_{n},{T}_{n},{S}_{n}$ are all subgroups of ${GL}_{n}\left(𝔽\right)\text{.}$

• The matrices ${e}_{ij}$ with a 1 in the $(i,j)$ entry and zeros in all other entries are called matrix units.
• The matrices of the form
$$I + a\,e_{ij} = \begin{pmatrix} 1 & & a \\ & \ddots & \\ & & 1 \end{pmatrix} \text{ or } \begin{pmatrix} 1 & & \\ & \ddots & \\ a & & 1 \end{pmatrix}, \qquad i\ne j \text{ and } a\in R,$$
$$I + e_{ij} + e_{ji} - e_{ii} - e_{jj} = \begin{pmatrix} 1 & & & & \\ & 0 & & 1 & \\ & & \ddots & & \\ & 1 & & 0 & \\ & & & & 1 \end{pmatrix}, \qquad I + (c-1)e_{ii} = \begin{pmatrix} 1 & & & & \\ & \ddots & & & \\ & & c & & \\ & & & \ddots & \\ & & & & 1 \end{pmatrix}$$

are called elementary matrices.
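Left multiplication by these elementary matrices performs the corresponding row operations. A minimal numpy sketch (numpy is an assumed tool here, not part of the notes):

```python
import numpy as np

def add_multiple(n, i, j, a):
    """I + a*e_ij (i != j): left multiplication adds a*(row j) to row i."""
    E = np.eye(n)
    E[i, j] = a
    return E

def swap(n, i, j):
    """I + e_ij + e_ji - e_ii - e_jj: left multiplication swaps rows i and j."""
    E = np.eye(n)
    E[[i, j]] = E[[j, i]]
    return E

def scale(n, i, c):
    """I + (c-1)*e_ii: left multiplication scales row i by c."""
    E = np.eye(n)
    E[i, i] = c
    return E

A = np.array([[1., 2.], [3., 4.]])
print(add_multiple(2, 0, 1, 5) @ A)   # row 0 becomes row 0 + 5*row 1
```

Right multiplication performs the analogous column operations.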

• The row rank of a matrix $A$ is $\text{dim} \left(\text{span}\left({r}_{i}\right)\right)=r$ where ${r}_{i}$ are the vectors determined by the rows of $A\text{.}$

Any matrix $A\in GL_n(R)$ is a product of elementary matrices.

Any matrix $A\in M_{m×n}(R)$ can be written in the form $$A = P\begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix} Q$$ where $P\in GL_m(R),$ $Q\in GL_n(R)$ and $r$ is the row rank of $A\text{.}$

### Determinants

• The determinant of an $n×n$ matrix $A=(a_{rs})$ is $$\det(A) = \sum_{w\in S_n} \varepsilon(w)\, a_{1w(1)} a_{2w(2)} \cdots a_{nw(n)},$$ where $\varepsilon(w)$ denotes the sign of the permutation $w\text{.}$

Laplace expansion: $$\det\begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix} = \sum_i \rho_i \det\begin{pmatrix} a_{i_1 j_1} & \cdots & a_{i_1 j_p} \\ \vdots & \ddots & \vdots \\ a_{i_p j_1} & \cdots & a_{i_p j_p} \end{pmatrix} \det\begin{pmatrix} a_{i_{p+1} j_{p+1}} & \cdots & a_{i_{p+1} j_n} \\ \vdots & \ddots & \vdots \\ a_{i_n j_{p+1}} & \cdots & a_{i_n j_n} \end{pmatrix}$$ where $j_1<\cdots<j_p,$ $j_{p+1}<\cdots<j_n$ is a fixed division of $1,2,\dots,n$ into two sets, the sum is over all divisions $i_1<\cdots<i_p,$ $i_{p+1}<\cdots<i_n$ of $1,2,\dots,n$ into two sets, and $\rho_i=+1$ or $-1$ according as the permutations $i_1,\dots,i_n$ and $j_1,\dots,j_n$ of $1,2,\dots,n$ have the same or opposite parity.
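The expansion above can be checked directly: fix a set of rows, sum over all complementary column divisions, with sign given by the parity of the index sums. A sketch using sympy (an assumed tool; the function name `laplace_det` is ours):

```python
from itertools import combinations
import sympy as sp

def laplace_det(A, fixed_rows):
    """det(A) via Laplace expansion along the rows in fixed_rows (0-indexed):
    sum over all p-subsets of columns of +/- (minor) * (complementary minor)."""
    n = A.rows
    p = len(fixed_rows)
    comp_rows = [r for r in range(n) if r not in fixed_rows]
    total = sp.S.Zero
    for cols in combinations(range(n), p):
        comp_cols = [c for c in range(n) if c not in cols]
        # 0-indexed index sums have the same parity as the 1-indexed formula
        rho = (-1) ** (sum(fixed_rows) + sum(cols))
        total += (rho * A[list(fixed_rows), list(cols)].det()
                      * A[comp_rows, comp_cols].det())
    return sp.expand(total)

A = sp.Matrix([[1, 2, 0, 3],
               [4, 1, 1, 0],
               [2, 0, 5, 1],
               [0, 3, 1, 2]])
assert laplace_det(A, (0, 1)) == A.det()
```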

If $A$ and $B$ are $n×n$ matrices then $\det(AB)=\det(A)\det(B)\text{.}$

 a) Let $B$ be the matrix obtained by switching two rows of $A\text{.}$ Then $\det(B)=-\det(A)\text{.}$
 b) Let $B$ be the matrix obtained by adding a multiple of a row of $A$ to another row of $A\text{.}$ Then $\det(B)=\det(A)\text{.}$
 c) Let $B$ be the matrix obtained by multiplying a row of $A$ by a constant $c\in R\text{.}$ Then $\det(B)=c\det(A)\text{.}$

If two rows of $A$ are the same then $\text{det}\left(A\right)=0\text{.}$

• The ${(i,j)}^{\text{th}}$ signed minor or cofactor, $A_{ij},$ of $A$ is $(-1)^{i+j}\det(\hat{A})$ where $\hat{A}$ is the matrix $A$ with the ${i}^{\text{th}}$ row and the ${j}^{\text{th}}$ column removed.

$$\sum_{i=1}^{n} a_{ik} A_{ih} = \delta_{hk} \det A.$$

If $\det(A)$ is a unit in $R$ then $A^{-1} = \det(A)^{-1}\,(A_{ji}),$ where $(A_{ji})$ is the transpose of the matrix of cofactors (the adjugate).
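This inverse formula can be checked with sympy (an assumed tool); note that the matrix of cofactors must be transposed:

```python
import sympy as sp

A = sp.Matrix([[2, 1, 0], [1, 3, 1], [0, 1, 2]])
cof = A.cofactor_matrix()     # (i,j) entry is the cofactor A_ij
adj = cof.T                   # adjugate: the transpose (A_ji)
A_inv = adj / A.det()         # valid since det(A) = 8 is a unit in Q
assert A_inv * A == sp.eye(3)
assert adj == A.adjugate()    # sympy's built-in adjugate agrees
```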

Cramer's rule for $AX=B\text{.}$
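Cramer's rule gives $x_i = \det(A_i)/\det(A),$ where $A_i$ is $A$ with its ${i}^{\text{th}}$ column replaced by $B\text{.}$ A sketch using sympy (the helper name `cramer_solve` is ours):

```python
import sympy as sp

def cramer_solve(A, b):
    """Solve A x = b by Cramer's rule: x_i = det(A_i) / det(A),
    where A_i is A with column i replaced by b."""
    d = A.det()
    if d == 0:
        raise ValueError("det(A) must be a unit")
    xs = []
    for i in range(A.cols):
        Ai = A.copy()
        Ai[:, i] = b
        xs.append(Ai.det() / d)
    return sp.Matrix(xs)

A = sp.Matrix([[2, 1], [5, 3]])
b = sp.Matrix([1, 2])
print(cramer_solve(A, b))   # the solution of Ax = b, here (1, -1)
```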

Put Thms VII-X of Hodge and Pedoe as exercises.

### $\lambda \text{-matrices}$

If $A$ is a $p×q$ $\lambda$-matrix of rank $r$ and $E_1(\lambda),\dots,E_r(\lambda)$ are its invariant factors then there exist $M\in GL_p(K[\lambda])$ and $N\in GL_q(K[\lambda])$ such that $$MAN = \begin{pmatrix} E_1(\lambda) & & & \\ & \ddots & & \\ & & E_r(\lambda) & \\ & & & 0 \end{pmatrix},$$ where the final $0$ denotes a zero block.

Two $p×q$ $\lambda$-matrices $A$ and $B$ are equivalent if and only if they have the same invariant factors, which holds if and only if they have the same elementary divisors.

Note that these proofs work for any Euclidean domain.
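The invariant factors can be computed from the determinantal divisors: $d_k$ is the gcd of all $k×k$ minors (with $d_0=1$) and $E_k = d_k/d_{k-1}\text{.}$ A sketch over $ℚ[\lambda]$ using sympy (the helper name `invariant_factors` is ours):

```python
from functools import reduce
from itertools import combinations
import sympy as sp

lam = sp.symbols('lam')

def invariant_factors(A, x=lam):
    """Invariant factors E_1 | E_2 | ... of a lambda-matrix, from the
    determinantal divisors d_k = gcd of all k x k minors (d_0 = 1):
    E_k = d_k / d_{k-1}, normalized to be monic."""
    n = min(A.rows, A.cols)
    d_prev = sp.S.One
    factors = []
    for k in range(1, n + 1):
        minors = [A[list(r), list(c)].det()
                  for r in combinations(range(A.rows), k)
                  for c in combinations(range(A.cols), k)]
        d_k = reduce(sp.gcd, minors)
        if d_k == 0:          # all k x k minors vanish: rank(A) = k - 1
            break
        factors.append(sp.Poly(sp.cancel(d_k / d_prev), x).monic().as_expr())
        d_prev = d_k
    return factors

A = sp.Matrix([[lam, 0], [0, lam*(lam - 1)]])
print(invariant_factors(A))   # E_1 = lam, E_2 = lam*(lam - 1)
```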

• The characteristic polynomial of $A$ is the polynomial $\det(A - tI_n)\text{.}$

Cayley-Hamilton Theorem: $\text{ch}_A(A)=0\text{.}$
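Substituting $A$ for $t$ in the characteristic polynomial (with the constant term times $I$) gives the zero matrix. A quick sympy check (sympy assumed):

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[1, 2], [3, 4]])
ch = (A - t * sp.eye(2)).det()       # det(A - tI) = t**2 - 5*t - 2
p = sp.Poly(ch, t)
# evaluate ch at the matrix A: sum of coeff * A**degree
chA = sum((c * A**k
           for k, c in zip(range(p.degree(), -1, -1), p.all_coeffs())),
          start=sp.zeros(2))
assert chA == sp.zeros(2)            # Cayley-Hamilton
```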

Note the proof of Theorem II, §10 of Hodge and Pedoe.
Theorem III (Hodge and Pedoe) gives the minimal polynomial.

• A bilinear form on a vector space $V$ over $F$ is a map $⟨,⟩:V×V\to F$ such that $$⟨c_1v_1+c_2v_2, w⟩ = c_1⟨v_1,w⟩ + c_2⟨v_2,w⟩ \quad\text{and}\quad ⟨v, c_1w_1+c_2w_2⟩ = c_1⟨v,w_1⟩ + c_2⟨v,w_2⟩$$ for all $v,v_1,v_2,w,w_1,w_2\in V$ and $c_1,c_2\in F\text{.}$
• A bilinear form $⟨,⟩:V×V\to F$ is symmetric if $⟨v,w⟩=⟨w,v⟩$ for all $v,w\in V\text{.}$
• A bilinear form $⟨,⟩:V×V\to F$ is skew-symmetric if $⟨v,w⟩ = -⟨w,v⟩$ for all $v,w\in V\text{.}$
• A Hermitian form $⟨,⟩:V×V\to ℂ$ is a map such that
 a) $⟨v,w⟩ = \overline{⟨w,v⟩}$ for all $v,w\in V,$ and
 b) $⟨c_1v_1+c_2v_2,w⟩ = c_1⟨v_1,w⟩+c_2⟨v_2,w⟩$ and $⟨v,c_1w_1+c_2w_2⟩ = \overline{c_1}⟨v,w_1⟩+\overline{c_2}⟨v,w_2⟩,$
for all $v,v_1,v_2,w,w_1,w_2\in V$ and $c_1,c_2\in ℂ\text{.}$

• A bilinear form $⟨,⟩:V×V\to ℝ$ is positive definite if $⟨v,v⟩\ge 0$ for all $v\in V$ and $⟨v,v⟩=0$ if and only if $v=0\text{.}$
• A bilinear form $⟨,⟩:V×V\to ℝ$ is positive semidefinite if $⟨v,v⟩\ge 0$ for all $v\in V\text{.}$
• A bilinear form $⟨,⟩:V×V\to ℝ$ is negative definite if $⟨v,v⟩<0$ for all $v\ne 0,$ $v\in V\text{.}$
• A bilinear form $⟨,⟩:V×V\to ℝ$ is indefinite if there exists $v,w\in V$ such that $⟨v,v⟩>0$ and $⟨w,w⟩<0\text{.}$
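For real symmetric matrices, positive definiteness can be tested numerically via Sylvester's criterion (a standard fact not stated in these notes): all leading principal minors are positive. A numpy sketch (the helper name is ours):

```python
import numpy as np

def is_positive_definite(A, tol=1e-12):
    """Sylvester's criterion: a real symmetric matrix is positive definite
    iff every leading principal minor det(A[:k, :k]) is > 0."""
    A = np.asarray(A, dtype=float)
    return all(np.linalg.det(A[:k, :k]) > tol
               for k in range(1, A.shape[0] + 1))

print(is_positive_definite([[2, 1], [1, 3]]))   # True
print(is_positive_definite([[1, 2], [2, 1]]))   # False: det = -3 < 0
```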

• The adjoint $T^*:V\to V$ of a linear transformation $T:V\to V$ is the map determined by $$⟨Tv,w⟩ = ⟨v,T^*w⟩ \quad\text{for all } v,w\in V.$$

• A matrix $A\in {M}_{n}\left(F\right)$ is symmetric if $A={A}^{t}\text{.}$
• A matrix $A\in {M}_{n}\left(F\right)$ is skew-symmetric if $A=-{A}^{t}\text{.}$
• A matrix $A\in M_n(ℂ)$ is Hermitian if $A = \overline{A}^t\text{.}$
• A linear transformation $T:V\to V$ is self-adjoint if $T={T}^{*}\text{.}$

When is the adjoint well defined?

 a) If $V$ is a vector space over $ℝ$ and $V$ has a symmetric positive definite bilinear form, then $T$ is symmetric if and only if $⟨v,Tw⟩=⟨Tv,w⟩$ for all $v,w\in V\text{.}$
 b) If $V$ is a vector space over $ℂ$ and $V$ has a Hermitian form, then $T$ is Hermitian if and only if $⟨v,Tw⟩=⟨Tv,w⟩$ for all $v,w\in V\text{.}$
 c) If $V$ is a vector space over $F$ and $V$ has a skew-symmetric form, then $T$ is skew-symmetric if and only if $⟨v,Tw⟩=⟨Tv,w⟩$ for all $v,w\in V\text{.}$

 a) If $V$ is a vector space over $ℝ$ and $V$ has a symmetric positive definite bilinear form, then $T$ is orthogonal if and only if $⟨v,w⟩=⟨Tv,Tw⟩$ for all $v,w\in V\text{.}$
 b) If $V$ is a vector space over $ℂ$ and $V$ has a Hermitian form, then $T$ is unitary if and only if $⟨v,w⟩=⟨Tv,Tw⟩$ for all $v,w\in V\text{.}$
 c) If $V$ is a vector space over $F$ and $V$ has a skew-symmetric form, then $T$ is symplectic if and only if $⟨v,w⟩=⟨Tv,Tw⟩$ for all $v,w\in V\text{.}$

• A matrix $A\in M_n(ℂ)$ is normal if $AA^* = A^*A\text{.}$

 a) If $A\in M_n(ℝ)$ is symmetric then there exists $P\in O_n(ℝ)$ such that $P^{-1}AP$ is diagonal.
 b) If $A\in M_n(ℂ)$ is Hermitian then there exists $P\in U_n(ℂ)$ such that $P^{-1}AP$ is diagonal and $P^{-1}AP\in M_n(ℝ)\text{.}$
 c) $A\in M_n(ℂ)$ is normal if and only if there exists $P\in U_n(ℂ)$ such that $P^{-1}AP$ is diagonal.
 d) If $A\in M_{2n}(F)$ is skew-symmetric then there exists $P\in Sp_{2n}(F)$ such that $P^{-1}AP$ is diagonal.
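Part a) can be checked numerically: `numpy.linalg.eigh` diagonalizes a real symmetric matrix by an orthogonal matrix. A sketch (numpy assumed):

```python
import numpy as np

A = np.array([[2., 1.], [1., 2.]])      # real symmetric
evals, P = np.linalg.eigh(A)            # columns of P are orthonormal eigenvectors
assert np.allclose(P.T @ P, np.eye(2))  # P is orthogonal, so P^{-1} = P^t
D = P.T @ A @ P                         # = P^{-1} A P
assert np.allclose(D, np.diag(evals))   # diagonal, eigenvalues 1 and 3
```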

 a) Let $V$ be a vector space over $ℝ$ with symmetric positive definite bilinear form and $T:V\to V$ a linear transformation. Then $T$ is orthogonal if and only if $T$ is a change of basis matrix between orthonormal bases.
 b) Let $V$ be a vector space over $ℂ$ with Hermitian form and $T:V\to V$ a linear transformation. Then $T$ is unitary if and only if $T$ is a change of basis matrix between orthonormal bases.
 c) Let $V$ be a vector space over $F$ with skew-symmetric form and $T:V\to V$ a linear transformation. Then $T$ is symplectic if and only if $T$ is a change of basis between symplectic bases.

 aa) Let $V$ be a vector space over $ℝ$ with positive definite symmetric bilinear form. Then there exists an orthonormal basis.
 ab) Let $A\in M_n(ℝ)$ be a symmetric positive definite matrix. Then there exists $Q\in GL_n(ℝ)$ such that $QAQ^t = I_n\text{.}$
 ba) Let $V$ be a vector space over $ℂ$ with a positive definite Hermitian form. Then there exists an orthonormal basis.
 bb) Let $A\in M_n(ℂ)$ be a positive definite Hermitian matrix. Then there exists $Q\in GL_n(ℂ)$ such that $QAQ^* = I_n\text{.}$
 ca) Let $V$ be a vector space over $ℝ$ with a symmetric bilinear form. Then there exists an orthogonal basis such that $⟨v_i,v_i⟩ = \pm 1$ or $0\text{.}$
 cb) Let $A\in M_n(ℝ)$ be symmetric. Then there exists $Q\in GL_n(ℝ)$ such that $$QAQ^t = \begin{pmatrix} I_p & & \\ & -I_q & \\ & & 0 \end{pmatrix}.$$
 da) Let $V$ be a vector space over $F$ with a skew-symmetric form. Then there exists a symplectic basis.
 db) Let $A\in M_{2n}(F)$ be a skew-symmetric matrix. Then there exists $Q\in GL_{2n}(F)$ such that $$QAQ^t = \begin{pmatrix} 0 & I_n \\ -I_n & 0 \end{pmatrix}.$$

Examples.

 1) Schwarz inequality.
 2) Triangle inequality.
 3) Conics and quadrics.
 4) Positive definite examples.

Let $T:V\to V$ be a linear transformation.

• $\text{rank}(T) = \text{dim}(\text{im}\, T)$
• $\text{nullity}(T) = \text{dim}(\text{ker}\, T)$
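These two numbers satisfy the rank-nullity theorem, $\text{rank}(T)+\text{nullity}(T)=\text{dim}\, V\text{.}$ A sympy check (sympy assumed):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3], [2, 4, 6], [1, 0, 1]])   # a map F^3 -> F^3
rank = A.rank()                   # dim(im T): row 2 = 2 * row 1, so rank 2
nullity = len(A.nullspace())      # dim(ker T)
assert rank + nullity == A.cols   # rank-nullity: 2 + 1 = 3
```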

Let $A\in M_{m×n}(F)\text{.}$ Then there exist $Q\in GL_m(F)$ and $P\in GL_n(F)$ such that $$QAP = \begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix},$$ where $r$ is the rank of $A\text{.}$

• The matrix of a linear transformation $T$ with respect to a basis $B=\{b_1,\dots,b_n\}$ is $A=(a_{ij})$ where $$Tb_j = \sum_{i=1}^n b_i a_{ij}.$$

Let $B$ and $B'$ be bases and let $T:V\to V$ be a linear transformation. Let $A$ and $A'$ be the matrices of $T$ with respect to $B$ and $B'$ respectively, and let $P$ be the change of basis matrix, $P⟨b⟩=⟨b'⟩\text{.}$ Then $$A' = PAP^{-1}.$$

• An eigenvector of a linear transformation $T:V\to V$ is a nonzero vector $v\in V$ such that $Tv=\lambda v$ for some $\lambda\in F\text{.}$
• An eigenvalue of a linear transformation $T:V\to V$ is a scalar $\lambda\in F$ such that $Tv=\lambda v$ for some nonzero $v\in V\text{.}$

• The characteristic polynomial of a linear transformation $T:V\to V$ is $\text{ch}_T(t)=\det(T-tI)\in F[t]\text{.}$
• The minimal polynomial of a linear transformation $T:V\to V$ is the monic polynomial $m_T(t)\in F[t]$ such that
 a) $m_T(T)=0,$ and
 b) if $f(t)\in F[t]$ and $f(T)=0$ then $m_T(t)$ divides $f(t)$ in $F[t]\text{.}$

Ex. Matrix in Jordan form, with ${\text{ch}}_{T}\ne {m}_{T}\text{.}$
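For instance, a diagonal matrix (already in Jordan form) with a repeated eigenvalue has $m_T$ a proper divisor of $\text{ch}_T\text{.}$ A sympy check (sympy assumed):

```python
import sympy as sp

t = sp.symbols('t')
A = sp.diag(2, 2, 3)            # Jordan form: eigenvalue 2 in two 1x1 blocks
ch = A.charpoly(t).as_expr()    # sympy uses det(tI - A); same roots as det(A - tI)
assert sp.expand(ch) == sp.expand((t - 2)**2 * (t - 3))
# (t-2)(t-3) already kills A, so deg m_T = 2 < 3 = deg ch_T
assert (A - 2*sp.eye(3)) * (A - 3*sp.eye(3)) == sp.zeros(3)
```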

${m}_{T}\left(t\right)$ exists and is unique.

Let $$m_T(t) = p_1(t)^{e_1} p_2(t)^{e_2}\cdots p_k(t)^{e_k}$$ be the prime factorization of the minimal polynomial of $T,$ let $q_i(t)=\frac{m_T(t)}{p_i(t)^{e_i}},$ and choose $a_i(t)$ with $1 = q_1(t)a_1(t)+\cdots+q_k(t)a_k(t)\text{.}$ Let $E_i = q_i(T)a_i(T),$ then $$E_i^2 = E_i, \qquad E_iE_j = 0 \text{ if } i\ne j, \qquad 1 = E_1+E_2+\cdots+E_k.$$
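The idempotents can be computed concretely: find the $a_i$ with the extended Euclidean algorithm in $F[t]$ and substitute $T\text{.}$ A sympy sketch for $m(t)=(t-2)(t-3)$ (the helper name `poly_at` is ours):

```python
import sympy as sp

t = sp.symbols('t')

def poly_at(f, A):
    """Evaluate the polynomial f(t) at a matrix A (constants become c*I)."""
    p = sp.Poly(f, t)
    return sum((c * A**k
                for k, c in zip(range(p.degree(), -1, -1), p.all_coeffs())),
               start=sp.zeros(A.rows))

A = sp.diag(2, 2, 3)              # minimal polynomial m(t) = (t-2)(t-3)
q1, q2 = t - 3, t - 2             # q_i = m(t) / p_i(t)**e_i
a1, a2, g = sp.gcdex(q1, q2, t)   # extended gcd: a1*q1 + a2*q2 = g = 1
assert g == 1
E1, E2 = poly_at(a1 * q1, A), poly_at(a2 * q2, A)
assert E1**2 == E1 and E2**2 == E2     # idempotent
assert E1 * E2 == sp.zeros(3)          # orthogonal
assert E1 + E2 == sp.eye(3)            # resolution of the identity
```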

$\lambda$ is an eigenvalue of $T$ if and only if $\lambda$ is a root of the characteristic polynomial of $T,$ ${\text{ch}}_{T}\left(t\right)\text{.}$

 a) Let $A\in M_n(F)$ be such that $\text{ch}_A(t)$ factors into linear factors in $F[t]\text{.}$ Then there exists $P\in GL_n(F)$ such that $PAP^{-1}$ is triangular.
 b) Let $A\in M_n(F)$ and suppose $\text{ch}_A(t)$ has $n$ distinct roots in $F\text{.}$ Then there exists $P\in GL_n(F)$ such that $PAP^{-1}$ is diagonal.
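Part b) in action with sympy (sympy assumed): a matrix whose characteristic polynomial has distinct roots diagonalizes.

```python
import sympy as sp

A = sp.Matrix([[0, 1], [-2, 3]])   # ch(t) = (t - 1)(t - 2): distinct roots
P, D = A.diagonalize()             # sympy convention: A = P D P^{-1}
assert P * D * P.inv() == A
assert D.is_diagonal()
assert {D[0, 0], D[1, 1]} == {1, 2}
```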

Ex. Does the converse hold?

### Proofs for §X.

Let $$m_T(t) = p_1(t)^{e_1} p_2(t)^{e_2}\cdots p_k(t)^{e_k}$$ be the prime factorization of the minimal polynomial of $T,$ let $q_i(t)=\frac{m_T(t)}{p_i(t)^{e_i}},$ and choose $a_i(t)$ with $1 = q_1(t)a_1(t)+\cdots+q_k(t)a_k(t)\text{.}$ Let $E_i = q_i(T)a_i(T),$ then $$E_i^2 = E_i, \qquad E_iE_j = 0 \text{ if } i\ne j, \qquad 1 = E_1+E_2+\cdots+E_k.$$

Proof.

Let $q_i(x) = \frac{m(x)}{p_i(x)^{e_i}}\text{.}$
Then the ${q}_{i}\left(x\right)$ have $\text{gcd}=1\text{.}$
Since $F[x]$ is a Euclidean domain there exist $a_i(x)$ such that

$$1 = q_1(x)a_1(x) + \cdots + q_k(x)a_k(x).$$

Let $f_i(x) = q_i(x)a_i(x)\text{.}$
Then $1 = f_1(x)+\cdots+f_k(x)\text{.}$
So a) $1 = f_1(T)+\cdots+f_k(T)\text{.}$

To show:

 b) ${f}_{i}\left(T\right){f}_{j}\left(T\right)=0$ if $i\ne j\text{.}$

 Proof. $$f_i(x)f_j(x) = q_i(x)a_i(x)\,q_j(x)a_j(x) = \frac{m(x)}{p_i(x)^{e_i}}\,\frac{m(x)}{p_j(x)^{e_j}}\, a_i(x)a_j(x).$$ Since $i\ne j,$ $p_i(x)^{e_i}p_j(x)^{e_j}$ divides $m(x),$ so $m(x) \mid f_i(x)f_j(x)\text{.}$ So $f_i(T)f_j(T)=0\text{.}$ $\square$

To show:

 c) ${f}_{i}{\left(T\right)}^{2}={f}_{i}\left(T\right)\text{.}$

 Proof. By a) and b), $$f_i(T)\cdot 1 = f_i(T)\Big(\sum_j f_j(T)\Big) = f_i(T)^2.$$ $\square$

To show:

 d) ${f}_{i}\left(T\right)V\ne 0\text{.}$

 Proof. By a), $V=\sum_j f_j(T)V\text{.}$ If $f_i(T)V=0$ then $V=\sum_{j\ne i} f_j(T)V\text{.}$ So $$q_i(T)V = \sum_{j\ne i} q_i(T)q_j(T)a_j(T)V = 0,$$ since $m(x) \mid q_i(x)q_j(x)$ for $j\ne i\text{.}$ But $m(x)$ is the minimal polynomial and $\deg q_i(x) < \deg m(x),$ so $q_i(T)\ne 0,$ a contradiction. So $f_i(T)V\ne 0\text{.}$ $\square$

To show:

 e) ${f}_{i}\left(T\right)V=\text{nullsp}\left({p}_{i}{\left(T\right)}^{{e}_{i}}\right)\text{.}$

Proof.

 ea) $f_i(T)V \subseteq \text{nullsp}(p_i(T)^{e_i})\text{:}$ Let $v\in f_i(T)V,$ say $v = f_i(T)w\text{.}$ Then $$p_i(T)^{e_i}v = p_i(T)^{e_i}q_i(T)a_i(T)w = m(T)a_i(T)w = 0.$$ So $v\in \text{nullsp}(p_i(T)^{e_i})\text{.}$
 eb) Let $v\in \text{nullsp}(p_i(T)^{e_i})\text{.}$ By a), $v = f_1(T)v+\cdots+f_k(T)v\text{.}$ For $j\ne i,$ $p_i(x)^{e_i}$ divides $q_j(x),$ so $$f_j(T)v = a_j(T)\,\frac{q_j}{p_i^{e_i}}(T)\,p_i(T)^{e_i}v = 0.$$ So $v = f_i(T)v \in f_i(T)V\text{.}$

$\square$

$\square$