3.4 Representation of Transformations by Matrices

This section uses Theorem 11 and Theorem 12 to set up a one-to-one correspondence between linear transformations and matrices, and Theorem 13 shows that composition of linear transformations corresponds naturally to matrix multiplication. Theorem 14 then shows that, in the case of a linear operator, the matrices of the same operator in different bases are similar, and the matrix effecting the similarity is precisely the matrix of one basis relative to the other.

Exercises

  1. Let $T$ be the linear operator on $C^2$ defined by $T(x_1,x_2)=(x_1,0)$. Let $\mathfrak B$ be the standard ordered basis for $C^2$ and let $\mathfrak B'=\{\alpha_1,\alpha_2\}$ be the ordered basis defined by $\alpha_1=(1,i),\alpha_2=(-i,2)$.
    (a) What is the matrix of $T$ relative to the pair $\mathfrak B,\mathfrak B'$?
    (b) What is the matrix of $T$ relative to the pair $\mathfrak B',\mathfrak B$?
    (c) What is the matrix of $T$ in the ordered basis $\mathfrak B'$?
    (d) What is the matrix of $T$ in the ordered basis $\{\alpha_2,\alpha_1\}$?
    Solution:
    (a) $T\epsilon_1=2\alpha_1-i\alpha_2$, $T\epsilon_2=0\alpha_1+0\alpha_2$, thus the matrix of $T$ relative to the pair $\mathfrak B,\mathfrak B'$ is $\begin{bmatrix}2&0\\-i&0\end{bmatrix}$.
    (b) $T\alpha_1=\epsilon_1$, $T\alpha_2=-i\epsilon_1$, thus the matrix of $T$ relative to the pair $\mathfrak B',\mathfrak B$ is $\begin{bmatrix}1&-i\\0&0\end{bmatrix}$.
    (c) $T\alpha_1=2\alpha_1-i\alpha_2$, $T\alpha_2=-2i\alpha_1-\alpha_2$, thus $[T]_{\mathfrak B'}=\begin{bmatrix}2&-2i\\-i&-1\end{bmatrix}$.
    (d) $T\alpha_2=-\alpha_2-2i\alpha_1$, $T\alpha_1=-i\alpha_2+2\alpha_1$, thus $[T]_{\{\alpha_2,\alpha_1\}}=\begin{bmatrix}-1&-i\\-2i&2\end{bmatrix}$.
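The coordinate computations above can be spot-checked numerically. The sketch below (pure Python with built-in complex numbers; the helper `comb` is introduced here for illustration, not from the text) verifies the columns found in (c):

```python
# Spot-check of Exercise 1(c): the B'-coordinates claimed for T(alpha_1), T(alpha_2).
T = lambda x: (x[0], 0)            # T(x1, x2) = (x1, 0)
a1, a2 = (1, 1j), (-1j, 2)         # the ordered basis B' = {alpha_1, alpha_2}

def comb(c1, v1, c2, v2):
    # c1*v1 + c2*v2, componentwise
    return tuple(c1*u + c2*w for u, w in zip(v1, v2))

assert T(a1) == comb(2, a1, -1j, a2)    # T(alpha_1) = 2*alpha_1 - i*alpha_2
assert T(a2) == comb(-2j, a1, -1, a2)   # T(alpha_2) = -2i*alpha_1 - alpha_2
```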

  2. Let $T$ be the linear transformation from $R^3$ into $R^2$ defined by
    $$T(x_1,x_2,x_3)=(x_1+x_2,2x_3-x_1)$$
    (a) If $\mathfrak B$ is the standard ordered basis for $R^3$ and $\mathfrak B'$ is the standard ordered basis for $R^2$, what is the matrix of $T$ relative to the pair $\mathfrak B,\mathfrak B'$?
    (b) If $\mathfrak B=\{\alpha_1,\alpha_2,\alpha_3\}$ and $\mathfrak B'=\{\beta_1,\beta_2\}$, where
    $$\alpha_1=(1,0,-1),\quad\alpha_2=(1,1,1),\quad\alpha_3=(1,0,0),\quad\beta_1=(0,1),\quad\beta_2=(1,0)$$
    what is the matrix of $T$ relative to the pair $\mathfrak B,\mathfrak B'$?
    Solution:
    (a) Let $\epsilon_1,\epsilon_2,\epsilon_3$ be the standard ordered basis for $R^3$ and $\epsilon_1',\epsilon_2'$ the standard ordered basis for $R^2$; then
    $$T\epsilon_1=(1,-1),\quad T\epsilon_2=(1,0),\quad T\epsilon_3=(0,2)$$
    thus the matrix of $T$ relative to the pair $\mathfrak B,\mathfrak B'$ is
    $$A=\begin{bmatrix}1&1&0\\-1&0&2\end{bmatrix}$$
    (b) A direct calculation shows
    $$T\alpha_1=(1,-3)=-3\beta_1+\beta_2,\quad T\alpha_2=(2,1)=\beta_1+2\beta_2,\quad T\alpha_3=(1,-1)=-\beta_1+\beta_2$$
    thus the matrix of $T$ relative to the pair $\mathfrak B,\mathfrak B'$ is
    $$A=\begin{bmatrix}-3&1&-1\\1&2&1\end{bmatrix}$$
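A quick numeric check of part (b), not part of the original solution: the columns of the matrix are the $\mathfrak B'$-coordinates of each $T\alpha_j$, and since $\beta_1=(0,1),\beta_2=(1,0)$, a vector $(y_1,y_2)$ has coordinates $(y_2,y_1)$.

```python
# Spot-check of Exercise 2(b): columns of the B,B'-matrix of T.
T = lambda x: (x[0] + x[1], 2*x[2] - x[0])    # T(x1,x2,x3) = (x1+x2, 2x3-x1)
alphas = [(1, 0, -1), (1, 1, 1), (1, 0, 0)]
# beta_1 = (0,1), beta_2 = (1,0), so (y1, y2) = y2*beta_1 + y1*beta_2
cols = [(T(a)[1], T(a)[0]) for a in alphas]
assert cols == [(-3, 1), (1, 2), (-1, 1)]     # columns of the matrix A above
```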

  3. Let $T$ be a linear operator on $F^n$, let $A$ be the matrix of $T$ in the standard ordered basis for $F^n$, and let $W$ be the subspace of $F^n$ spanned by the column vectors of $A$. What does $W$ have to do with $T$?
    Solution: Writing $A=\begin{bmatrix}A_{11}&\cdots&A_{1n}\\\vdots&\ddots&\vdots\\A_{n1}&\cdots&A_{nn}\end{bmatrix}$, we have $T\epsilon_j=\sum_{i=1}^nA_{ij}\epsilon_i=A_j$, the $j$-th column vector of $A$. Since the range of $T$ is spanned by $T\epsilon_1,\dots,T\epsilon_n$, and $W$ is spanned by $A_1,\dots,A_n$, it follows that $W=\text{range }T$.

  4. Let $V$ be a two-dimensional vector space over the field $F$, and let $\mathfrak B$ be an ordered basis for $V$. If $T$ is a linear operator on $V$ and
    $$[T]_{\mathfrak B}=\begin{bmatrix}a&b\\c&d\end{bmatrix}$$
    prove that $T^2-(a+d)T+(ad-bc)I=0$.
    Solution: Write $\mathfrak B=\{\alpha_1,\alpha_2\}$; then any $\alpha\in V$ can be written $\alpha=x_1\alpha_1+x_2\alpha_2$ with $x_1,x_2\in F$. Notice that
    $$[T\alpha_1]_{\mathfrak B}=[T]_{\mathfrak B}[\alpha_1]_{\mathfrak B}=\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}1\\0\end{bmatrix}=\begin{bmatrix}a\\c\end{bmatrix},\quad [T\alpha_2]_{\mathfrak B}=[T]_{\mathfrak B}[\alpha_2]_{\mathfrak B}=\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}0\\1\end{bmatrix}=\begin{bmatrix}b\\d\end{bmatrix},$$
    $$[T^2\alpha_1]_{\mathfrak B}=[T]_{\mathfrak B}[T\alpha_1]_{\mathfrak B}=\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}a\\c\end{bmatrix}=\begin{bmatrix}a^2+bc\\ac+cd\end{bmatrix},\quad [T^2\alpha_2]_{\mathfrak B}=[T]_{\mathfrak B}[T\alpha_2]_{\mathfrak B}=\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}b\\d\end{bmatrix}=\begin{bmatrix}ab+bd\\bc+d^2\end{bmatrix}$$
    thus
    $$\begin{aligned}(T^2-(a+d)T+(ad-bc)I)\alpha_1&=T^2\alpha_1-(a+d)T\alpha_1+(ad-bc)\alpha_1\\&=(a^2+bc)\alpha_1+(ac+cd)\alpha_2-(a+d)(a\alpha_1+c\alpha_2)+(ad-bc)\alpha_1\\&=(a^2+bc-a^2-ad+ad-bc)\alpha_1+(ac+cd-ac-cd)\alpha_2=0\end{aligned}$$
    $$\begin{aligned}(T^2-(a+d)T+(ad-bc)I)\alpha_2&=T^2\alpha_2-(a+d)T\alpha_2+(ad-bc)\alpha_2\\&=(ab+bd)\alpha_1+(bc+d^2)\alpha_2-(a+d)(b\alpha_1+d\alpha_2)+(ad-bc)\alpha_2\\&=(ab+bd-ab-bd)\alpha_1+(bc+d^2-ad-d^2+ad-bc)\alpha_2=0\end{aligned}$$
    Since $(T^2-(a+d)T+(ad-bc)I)\alpha_1=(T^2-(a+d)T+(ad-bc)I)\alpha_2=0$, linearity gives
    $$(T^2-(a+d)T+(ad-bc)I)\alpha=x_1(T^2-(a+d)T+(ad-bc)I)\alpha_1+x_2(T^2-(a+d)T+(ad-bc)I)\alpha_2=0$$
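Exercise 4 is the $2\times 2$ case of the Cayley–Hamilton theorem, and since the operator identity is equivalent to the same identity for the matrix $[T]_{\mathfrak B}$, it can be spot-checked at the matrix level. A minimal sketch (the helpers `mul` and `poly` are illustrative, not from the text):

```python
# Check M^2 - (a+d)M + (ad-bc)I = 0 for sample 2x2 matrices M.
def mul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def poly(M):
    # evaluate M^2 - (trace M)*M + (det M)*I entrywise
    a, b, c, d = M[0][0], M[0][1], M[1][0], M[1][1]
    M2 = mul(M, M)
    I = [[1, 0], [0, 1]]
    return [[M2[i][j] - (a+d)*M[i][j] + (a*d - b*c)*I[i][j] for j in range(2)] for i in range(2)]

assert poly([[1, 2], [3, 4]]) == [[0, 0], [0, 0]]
assert poly([[0, -1], [1, 0]]) == [[0, 0], [0, 0]]
```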

  5. Let $T$ be the linear operator on $R^3$, the matrix of which in the standard ordered basis is
    $$A=\begin{bmatrix}1&2&1\\0&1&1\\-1&3&4\end{bmatrix}$$
    Find a basis for the range of $T$ and a basis for the null space of $T$.
    Solution: By Exercise 3 the range of $T$ is spanned by the column vectors of $A$; using elementary column operations we have
    $$A=\begin{bmatrix}1&2&1\\0&1&1\\-1&3&4\end{bmatrix}\to\begin{bmatrix}1&0&0\\0&1&1\\-1&5&5\end{bmatrix}\to\begin{bmatrix}1&0&0\\0&1&0\\-1&5&0\end{bmatrix}$$
    so a basis for the range of $T$ is $(1,0,-1),(0,1,5)$. Notice that $T\epsilon_3-T\epsilon_1=T\epsilon_2-2T\epsilon_1$, thus
    $$T(\epsilon_1-\epsilon_2+\epsilon_3)=T(1,-1,1)=0$$
    and since $\dim\text{range }T=2$, the dimension of the null space of $T$ is $1$, so a basis for the null space of $T$ is $(1,-1,1)$.
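Both claimed bases can be verified directly, as a sketch (not part of the original solution): the null vector is killed by $A$, and every column of $A$ is a combination of $(1,0,-1)$ and $(0,1,5)$.

```python
# Spot-check of Exercise 5: null-space vector and range basis.
A = [[1, 2, 1], [0, 1, 1], [-1, 3, 4]]
v = (1, -1, 1)
assert all(sum(A[i][j]*v[j] for j in range(3)) == 0 for i in range(3))   # A v = 0
# every column of A lies in span{(1,0,-1), (0,1,5)}: coefficients are the top two entries
for j in range(3):
    x, y = A[0][j], A[1][j]
    assert [A[i][j] for i in range(3)] == [x, y, -x + 5*y]
```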

  6. Let $T$ be the linear operator on $R^2$ defined by
    $$T(x_1,x_2)=(-x_2,x_1)$$
    (a) What is the matrix of $T$ in the standard ordered basis for $R^2$?
    (b) What is the matrix of $T$ in the ordered basis $\mathfrak B=\{\alpha_1,\alpha_2\}$, where $\alpha_1=(1,2)$ and $\alpha_2=(1,-1)$?
    (c) Prove that for every real number $c$ the operator $(T-cI)$ is invertible.
    (d) Prove that if $\mathfrak B$ is any ordered basis for $R^2$ and $[T]_{\mathfrak B}=A$, then $A_{12}A_{21}\neq 0$.
    Solution:
    (a) $T\epsilon_1=(0,1)=\epsilon_2$, $T\epsilon_2=(-1,0)=-\epsilon_1$, so the matrix of $T$ in the standard ordered basis for $R^2$ is $\begin{bmatrix}0&-1\\1&0\end{bmatrix}$.
    (b) We have
    $$T\alpha_1=(-2,1)=-\frac{1}{3}\alpha_1-\frac{5}{3}\alpha_2,\quad T\alpha_2=(1,1)=\frac{2}{3}\alpha_1+\frac{1}{3}\alpha_2$$
    thus
    $$[T]_{\mathfrak B}=\begin{bmatrix}-1/3&2/3\\-5/3&1/3\end{bmatrix}$$
    (c) The matrix of $T-cI$ in the standard ordered basis for $R^2$ is $\begin{bmatrix}-c&-1\\1&-c\end{bmatrix}$, thus
    $$(T-cI)\epsilon_1=(-c,1),\quad (T-cI)\epsilon_2=(-1,-c)$$
    Since $\begin{vmatrix}-c&-1\\1&-c\end{vmatrix}=c^2+1\neq 0$ for every $c\in R$, the vectors $(-c,1),(-1,-c)$ are linearly independent, thus a basis of $R^2$, which means $T-cI$ is invertible.
    (d) Let $\mathfrak B'=\{\epsilon_1,\epsilon_2\}$, so $[T]_{\mathfrak B'}=\begin{bmatrix}0&-1\\1&0\end{bmatrix}$. Given any $\mathfrak B$, we can find an invertible $P$ such that
    $$[T]_{\mathfrak B}=A=P[T]_{\mathfrak B'}P^{-1}$$
    Write $P=\begin{bmatrix}a&b\\c&d\end{bmatrix}$, then $P^{-1}=\frac{1}{ad-bc}\begin{bmatrix}d&-b\\-c&a\end{bmatrix}$. Since $P$ is invertible, neither of its rows is zero, so $a^2+b^2>0$ and $c^2+d^2>0$. Now
    $$\begin{aligned}A&=\frac{1}{ad-bc}\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}0&-1\\1&0\end{bmatrix}\begin{bmatrix}d&-b\\-c&a\end{bmatrix}\\&=\frac{1}{ad-bc}\begin{bmatrix}b&-a\\d&-c\end{bmatrix}\begin{bmatrix}d&-b\\-c&a\end{bmatrix}=\frac{1}{ad-bc}\begin{bmatrix}bd+ac&-b^2-a^2\\d^2+c^2&-bd-ac\end{bmatrix}\end{aligned}$$
    thus $A_{12}A_{21}=-\dfrac{(a^2+b^2)(c^2+d^2)}{(ad-bc)^2}<0$, and in particular $A_{12}A_{21}\neq 0$.
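The closed form for $A_{12}A_{21}$ in (d) can be sanity-checked with exact rational arithmetic; a sketch (the `check` helper and the sample matrices are illustrative):

```python
# Conjugating R = [[0,-1],[1,0]] by invertible P and comparing A12*A21 with the
# closed form -(a^2+b^2)(c^2+d^2)/(ad-bc)^2 derived above.
from fractions import Fraction as F

def mul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def check(a, b, c, d):
    det = a*d - b*c                      # must be nonzero for P to be invertible
    P = [[F(a), F(b)], [F(c), F(d)]]
    Pinv = [[F(d, det), F(-b, det)], [F(-c, det), F(a, det)]]
    R = [[F(0), F(-1)], [F(1), F(0)]]    # [T] in the standard ordered basis
    A = mul(mul(P, R), Pinv)
    assert A[0][1]*A[1][0] == F(-(a*a + b*b)*(c*c + d*d), det*det)

check(1, 2, 3, 5)
check(2, -1, 0, 3)
```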

  7. Let $T$ be the linear operator on $R^3$ defined by
    $$T(x_1,x_2,x_3)=(3x_1+x_3,-2x_1+x_2,-x_1+2x_2+4x_3)$$
    (a) What is the matrix of $T$ in the standard ordered basis for $R^3$?
    (b) What is the matrix of $T$ in the ordered basis $\{\alpha_1,\alpha_2,\alpha_3\}$, where $\alpha_1=(1,0,1),\alpha_2=(-1,2,1),\alpha_3=(2,1,1)$?
    (c) Prove that $T$ is invertible and give a rule for $T^{-1}$ like the one which defines $T$.
    Solution:
    (a) The matrix of $T$ in the standard ordered basis for $R^3$ is $\begin{bmatrix}3&0&1\\-2&1&0\\-1&2&4\end{bmatrix}$.
    (b) Form $P$ with columns $P_j=[\alpha_j]_{\{\epsilon_1,\epsilon_2,\epsilon_3\}}$, so $P=\begin{bmatrix}1&-1&2\\0&2&1\\1&1&1\end{bmatrix}$, and row-reduce $[P\mid I]$:
    $$\begin{aligned}\begin{bmatrix}1&-1&2&1&0&0\\0&2&1&0&1&0\\1&1&1&0&0&1\end{bmatrix}&\to\begin{bmatrix}1&-1&2&1&0&0\\0&2&1&0&1&0\\0&2&-1&-1&0&1\end{bmatrix}\to\begin{bmatrix}1&-1&2&1&0&0\\0&2&1&0&1&0\\0&0&-2&-1&-1&1\end{bmatrix}\\&\to\begin{bmatrix}1&-1&0&0&-1&1\\0&2&0&-1/2&1/2&1/2\\0&0&1&1/2&1/2&-1/2\end{bmatrix}\to\begin{bmatrix}1&0&0&-1/4&-3/4&5/4\\0&1&0&-1/4&1/4&1/4\\0&0&1&1/2&1/2&-1/2\end{bmatrix}\end{aligned}$$
    thus
    $$P^{-1}=\frac{1}{4}\begin{bmatrix}-1&-3&5\\-1&1&1\\2&2&-2\end{bmatrix}$$
    and then
    $$\begin{aligned}[T]_{\{\alpha_1,\alpha_2,\alpha_3\}}=P^{-1}[T]_{\{\epsilon_1,\epsilon_2,\epsilon_3\}}P&=\frac{1}{4}\begin{bmatrix}-1&-3&5\\-1&1&1\\2&2&-2\end{bmatrix}\begin{bmatrix}3&0&1\\-2&1&0\\-1&2&4\end{bmatrix}\begin{bmatrix}1&-1&2\\0&2&1\\1&1&1\end{bmatrix}\\&=\frac{1}{4}\begin{bmatrix}-2&7&19\\-6&3&3\\4&-2&-6\end{bmatrix}\begin{bmatrix}1&-1&2\\0&2&1\\1&1&1\end{bmatrix}=\frac{1}{4}\begin{bmatrix}17&35&22\\-3&15&-6\\-2&-14&0\end{bmatrix}\end{aligned}$$
    (c) It is enough to prove that $[T]_{\{\epsilon_1,\epsilon_2,\epsilon_3\}}$ is invertible, which is true since
    $$\begin{bmatrix}3&0&1\\-2&1&0\\-1&2&4\end{bmatrix}^{-1}=\begin{bmatrix}4/9&2/9&-1/9\\8/9&13/9&-2/9\\-1/3&-2/3&1/3\end{bmatrix}$$
    Since $[T^{-1}]_{\{\epsilon_1,\epsilon_2,\epsilon_3\}}=([T]_{\{\epsilon_1,\epsilon_2,\epsilon_3\}})^{-1}$, we can describe $T^{-1}$ by
    $$T^{-1}(x_1,x_2,x_3)=\left(\frac{4}{9}x_1+\frac{2}{9}x_2-\frac{1}{9}x_3,\ \frac{8}{9}x_1+\frac{13}{9}x_2-\frac{2}{9}x_3,\ -\frac{1}{3}x_1-\frac{2}{3}x_2+\frac{1}{3}x_3\right)$$
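The rule for $T^{-1}$ in (c) can be spot-checked with exact arithmetic, as a sketch (not part of the original solution): applying $T^{-1}$ after $T$ must return every input unchanged.

```python
# Spot-check of Exercise 7(c): T^{-1} as written really inverts T.
from fractions import Fraction as F

T = lambda x: (3*x[0] + x[2], -2*x[0] + x[1], -x[0] + 2*x[1] + 4*x[2])
Tinv = lambda y: (F(4, 9)*y[0] + F(2, 9)*y[1] - F(1, 9)*y[2],
                  F(8, 9)*y[0] + F(13, 9)*y[1] - F(2, 9)*y[2],
                  -F(1, 3)*y[0] - F(2, 3)*y[1] + F(1, 3)*y[2])

# checking on a basis (plus one extra vector) suffices by linearity
for x in [(1, 0, 0), (0, 1, 0), (0, 0, 1), (2, -3, 5)]:
    assert Tinv(T(x)) == x
```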

  8. Let $\theta$ be a real number. Prove that the following two matrices are similar over the field of complex numbers:
    $$\begin{bmatrix}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{bmatrix},\quad\begin{bmatrix}e^{i\theta}&0\\0&e^{-i\theta}\end{bmatrix}$$
    Solution: Let $T$ be the linear operator on $C^2$ which is represented by the first matrix in the standard ordered basis, so $T(1,0)=(\cos\theta,\sin\theta)$ and $T(0,1)=(-\sin\theta,\cos\theta)$. Let $\alpha_1=(i,1),\alpha_2=(1,i)$; then
    $$T\alpha_1=i(\cos\theta,\sin\theta)+(-\sin\theta,\cos\theta)=(i\cos\theta-\sin\theta,\,i\sin\theta+\cos\theta)=e^{i\theta}(i,1)=e^{i\theta}\alpha_1$$
    and similarly $T\alpha_2=e^{-i\theta}\alpha_2$. It is easy to see that $\{\alpha_1,\alpha_2\}$ is linearly independent, thus a basis of $C^2$. Since $[T]_{\{\alpha_1,\alpha_2\}}=\begin{bmatrix}e^{i\theta}&0\\0&e^{-i\theta}\end{bmatrix}$, the two matrices are similar, with $P=\begin{bmatrix}i&1\\1&i\end{bmatrix}$ and
    $$[T]_{\{\alpha_1,\alpha_2\}}=P^{-1}\begin{bmatrix}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{bmatrix}P$$
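The eigenvector claims can be checked numerically at a sample angle; a sketch (the angle $\theta=0.7$ is an arbitrary choice):

```python
# Spot-check of Exercise 8: the rotation matrix scales alpha_1 = (i,1) by e^{i*theta}
# and alpha_2 = (1,i) by e^{-i*theta}.
import cmath, math

theta = 0.7
c, s = math.cos(theta), math.sin(theta)
R = [[c, -s], [s, c]]
apply = lambda M, v: tuple(sum(M[i][j]*v[j] for j in range(2)) for i in range(2))

a1, a2 = (1j, 1), (1, 1j)
assert all(abs(u - cmath.exp(1j*theta)*w) < 1e-12 for u, w in zip(apply(R, a1), a1))
assert all(abs(u - cmath.exp(-1j*theta)*w) < 1e-12 for u, w in zip(apply(R, a2), a2))
```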

  9. Let $V$ be a finite-dimensional vector space over the field $F$ and let $S$ and $T$ be linear operators on $V$. We ask: When do there exist ordered bases $\mathfrak B$ and $\mathfrak B'$ for $V$ such that $[S]_{\mathfrak B}=[T]_{\mathfrak B'}$? Prove that such bases exist if and only if there is an invertible linear operator $U$ on $V$ such that $T=USU^{-1}$.
    Solution: If $[S]_{\mathfrak B}=[T]_{\mathfrak B'}$, let $U$ be the linear operator that carries $\mathfrak B$ onto $\mathfrak B'$: writing
    $$\mathfrak B=\{a_1,\dots,a_n\},\quad \mathfrak B'=\{b_1,\dots,b_n\}$$
    define $U$ by $Ua_i=b_i,i=1,\dots,n$. Then $U$ is invertible since it carries a basis onto another basis, and $U^{-1}b_i=a_i,i=1,\dots,n$. If we denote $[S]_{\mathfrak B}=[T]_{\mathfrak B'}=A=(A_{ij})$, then by definition $Sa_j=\sum_{i=1}^nA_{ij}a_i$ and $Tb_j=\sum_{i=1}^nA_{ij}b_i$, so
    $$USU^{-1}(b_j)=USa_j=U\left(\sum_{i=1}^nA_{ij}a_i\right)=\sum_{i=1}^nA_{ij}Ua_i=\sum_{i=1}^nA_{ij}b_i=Tb_j,\quad j=1,\dots,n$$
    Since $USU^{-1}$ and $T$ agree on a basis of $V$, we have $T=USU^{-1}$.
    Conversely, if $T=USU^{-1}$ for some invertible $U$, let $\mathfrak B=\{a_1,\dots,a_n\}$ be an ordered basis for $V$ and $\mathfrak B'=\{Ua_1,\dots,Ua_n\}$; since $U$ is invertible, $\mathfrak B'$ is a basis for $V$. Notice that if $\alpha\in V$, then
    $$\alpha=k_1a_1+\dots+k_na_n,\quad k_1,\dots,k_n\in F$$
    thus $[\alpha]_{\mathfrak B}=\begin{bmatrix}k_1\\\vdots\\k_n\end{bmatrix}$, and $U\alpha=k_1Ua_1+\dots+k_nUa_n$, so $[U\alpha]_{\mathfrak B'}=\begin{bmatrix}k_1\\\vdots\\k_n\end{bmatrix}$, i.e. $[\alpha]_{\mathfrak B}=[U\alpha]_{\mathfrak B'}$. From this and the fact that $TU=US$ we obtain
    $$[S]_{\mathfrak B}[\alpha]_{\mathfrak B}=[S\alpha]_{\mathfrak B}=[US\alpha]_{\mathfrak B'}=[TU\alpha]_{\mathfrak B'}=[T]_{\mathfrak B'}[U\alpha]_{\mathfrak B'}=[T]_{\mathfrak B'}[\alpha]_{\mathfrak B}$$
    Since this holds for every $\alpha\in V$, it follows that $[S]_{\mathfrak B}=[T]_{\mathfrak B'}$.
    [Alternatively, an easier proof uses Theorem 14 and the corollary of Theorem 13, since in this case
    $$[T]_{\mathfrak B'}=[U^{-1}]_{\mathfrak B}[T]_{\mathfrak B}[U]_{\mathfrak B}=[U^{-1}TU]_{\mathfrak B}=[U^{-1}(USU^{-1})U]_{\mathfrak B}=[S]_{\mathfrak B}$$
    ]

  10. We have seen that the linear operator $T$ on $R^2$ defined by $T(x_1,x_2)=(x_1,0)$ is represented in the standard ordered basis by the matrix
    $$A=\begin{bmatrix}1&0\\0&0\end{bmatrix}.$$
    This operator satisfies $T^2=T$. Prove that if $S$ is a linear operator on $R^2$ such that $S^2=S$, then $S=0$, or $S=I$, or there is an ordered basis $\mathfrak B$ for $R^2$ such that $[S]_{\mathfrak B}=A$ (above).
    Solution: If $S=0$ or $S=I$, we obviously have $S^2=S$. Now suppose $S\neq 0$, $S\neq I$, but $S^2=S$; then we can find $\alpha,\beta\in R^2$ such that $S\alpha\neq 0$ and $S\beta\neq\beta$. Since $S\alpha\in\text{range }S$, we have $\dim\text{range }S\geq 1$; also, since $S(S\beta-\beta)=S^2\beta-S\beta=0$ and $S\beta-\beta\neq 0$, we have $S\beta-\beta\in\text{null }S$ and $\dim\text{null }S\geq 1$. As we are working in $R^2$, $\dim\text{range }S+\dim\text{null }S=2$, thus $\dim\text{range }S=\dim\text{null }S=1$. Now let $a=S\alpha$, $b=S\beta-\beta$, and $\mathfrak B=\{a,b\}$. If $k_1a+k_2b=0$, applying $S$ gives $k_1Sa=k_1S^2\alpha=k_1S\alpha=k_1a=0$, so $k_1=0$ and then $k_2b=0$ forces $k_2=0$; hence $\mathfrak B$ is an ordered basis for $R^2$. Since $Sa=S^2\alpha=S\alpha=a$ and $Sb=0$, we get $[S]_{\mathfrak B}=A$.
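The construction in the solution can be illustrated on a concrete idempotent; a sketch (the sample operator $S(x_1,x_2)=(x_1+x_2,0)$ and the choices of $\alpha,\beta$ are mine, not from the text):

```python
# Spot-check of Exercise 10: for a sample S with S^2 = S, S != 0, S != I,
# the vectors a = S(alpha), b = S(beta) - beta satisfy Sa = a and Sb = 0,
# i.e. [S]_{a,b} = [[1,0],[0,0]].
S = lambda x: (x[0] + x[1], 0)
alpha, beta = (1, 0), (0, 1)                      # S(alpha) != 0 and S(beta) != beta
a = S(alpha)                                      # a = (1, 0), spans range S
b = tuple(u - v for u, v in zip(S(beta), beta))   # b = (1, -1), spans null S
assert S(S(alpha)) == S(alpha)                    # idempotence on a sample vector
assert S(a) == a and S(b) == (0, 0)               # exactly the columns [1,0] and [0,0]
```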

  11. Let $W$ be the space of all $n\times 1$ column matrices over a field $F$. If $A$ is an $n\times n$ matrix over $F$, then $A$ defines a linear operator $L_A$ on $W$ through left multiplication: $L_A(X)=AX$. Prove that every linear operator on $W$ is left multiplication by some $n\times n$ matrix, i.e., is $L_A$ for some $A$. Now suppose $V$ is an $n$-dimensional vector space over the field $F$, and let $\mathfrak B$ be an ordered basis for $V$. For each $\alpha$ in $V$, define $U\alpha=[\alpha]_{\mathfrak B}$. Prove that $U$ is an isomorphism of $V$ onto $W$. If $T$ is a linear operator on $V$, then $UTU^{-1}$ is a linear operator on $W$; accordingly, $UTU^{-1}$ is left multiplication by some $n\times n$ matrix $A$. What is $A$?
    Solution: If $T$ is a linear operator on $W$, let $\mathfrak B'=\{\epsilon_1,\dots,\epsilon_n\}$ be the standard ordered basis of $W$. For each $X=\begin{bmatrix}x_1\\\vdots\\x_n\end{bmatrix}$ we have $X=\sum_{j=1}^nx_j\epsilon_j$, and if we define $A:=[T]_{\mathfrak B'}$, then
    $$T(X)=T\left(\sum_{j=1}^nx_j\epsilon_j\right)=\sum_{j=1}^nx_jT\epsilon_j=\sum_{j=1}^nx_j\sum_{i=1}^nA_{ij}\epsilon_i=\sum_{i=1}^n\left(\sum_{j=1}^nA_{ij}x_j\right)\epsilon_i=AX$$
    thus $T=L_A$.
    For the second question, let $\mathfrak B=\{a_1,\dots,a_n\}$. If $[\alpha]_{\mathfrak B}=[\beta]_{\mathfrak B}=\begin{bmatrix}x_1\\\vdots\\x_n\end{bmatrix}$, then $\alpha=\beta=\sum_{j=1}^nx_ja_j$; this shows $(U\alpha=U\beta)\Rightarrow(\alpha=\beta)$, so $U$ is injective. Also, for any $X=\begin{bmatrix}x_1\\\vdots\\x_n\end{bmatrix}\in W$, define $\alpha=\sum_{j=1}^nx_ja_j$; then $U\alpha=X$, so $U$ is surjective. Combined, $U$ is an isomorphism.
    If $T$ is a linear operator on $V$, let $X=\begin{bmatrix}x_1\\\vdots\\x_n\end{bmatrix},Y=\begin{bmatrix}y_1\\\vdots\\y_n\end{bmatrix}\in W$; by definition
    $$\begin{aligned}UTU^{-1}(cX+Y)&=UT\left(\sum_{j=1}^n(cx_j+y_j)a_j\right)=U\left(\sum_{j=1}^n(cx_j+y_j)Ta_j\right)=\sum_{j=1}^n(cx_j+y_j)UTa_j\\&=c\,UT\left(\sum_{j=1}^nx_ja_j\right)+UT\left(\sum_{j=1}^ny_ja_j\right)=c(UTU^{-1})(X)+(UTU^{-1})(Y)\end{aligned}$$
    thus $UTU^{-1}$ is a linear operator on $W$.
    Finally, let $C=[T]_{\mathfrak B}$, i.e. $Ta_j=\sum_{i=1}^nC_{ij}a_i$. By the first part, $A=[UTU^{-1}]_{\mathfrak B'}$; to compute $A$, note that $Ua_i=[a_i]_{\mathfrak B}=\epsilon_i$, so
    $$UTU^{-1}(\epsilon_j)=UTa_j=U\left(\sum_{i=1}^nC_{ij}a_i\right)=\sum_{i=1}^nC_{ij}Ua_i=\sum_{i=1}^nC_{ij}\epsilon_i$$
    thus $A=[UTU^{-1}]_{\mathfrak B'}=C=[T]_{\mathfrak B}$.
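The first claim of the exercise can be illustrated concretely; a sketch (the sample operator `T` is an arbitrary choice, not from the text): the matrix whose columns are $T\epsilon_j$ reproduces $T$ by left multiplication.

```python
# Spot-check of Exercise 11: T equals L_A where the columns of A are T(eps_j).
T = lambda X: (X[0] + 2*X[1], 3*X[1])          # a sample linear operator on 2x1 columns
eps = [(1, 0), (0, 1)]
A = list(zip(*[T(e) for e in eps]))            # A[i][j] = i-th entry of T(eps_j)
LA = lambda X: tuple(sum(A[i][j]*X[j] for j in range(2)) for i in range(2))
for X in [(1, 0), (0, 1), (5, -7)]:
    assert T(X) == LA(X)                       # T = L_A
```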

  12. Let $V$ be an $n$-dimensional vector space over the field $F$, and let $\mathfrak B=\{\alpha_1,\dots,\alpha_n\}$ be an ordered basis for $V$.
    (a) According to Theorem 1, there is a unique linear operator $T$ on $V$ such that
    $$T\alpha_j=\alpha_{j+1},\qquad j=1,\dots,n-1,\qquad T\alpha_n=0.$$
    What is the matrix $A$ of $T$ in the ordered basis $\mathfrak B$?
    (b) Prove that $T^n=0$ but $T^{n-1}\neq 0$.
    (c) Let $S$ be any linear operator on $V$ such that $S^n=0$ but $S^{n-1}\neq 0$. Prove that there is an ordered basis $\mathfrak B'$ for $V$ such that the matrix of $S$ in the ordered basis $\mathfrak B'$ is the matrix $A$ of part (a).
    (d) Prove that if $M$ and $N$ are $n\times n$ matrices over $F$ such that $M^n=N^n=0$ but $M^{n-1}\neq 0\neq N^{n-1}$, then $M$ and $N$ are similar.
    Solution:
    (a) A direct computation shows $A=\begin{bmatrix}0&0&\cdots&0&0\\1&0&\cdots&0&0\\0&1&\cdots&0&0\\\vdots&&\ddots&&\vdots\\0&0&\cdots&1&0\end{bmatrix}$, the matrix with $1$'s just below the main diagonal and $0$'s elsewhere.
    (b) We have $T^k\alpha_{n+1-k}=0$ for $k=1,\dots,n$, so $T^n\alpha_j=T^{j-1}(T^{n+1-j}\alpha_j)=0$ for every $j$, thus $T^n=0$; but $T^{n-1}\alpha_1=\alpha_n\neq 0$.
    (c) Since $S^{n-1}\neq 0$, we can choose $\alpha$ such that $S^{n-1}\alpha\neq 0$; note $\alpha\neq 0$ and $S^n\alpha=0$. The set $\{\alpha,S\alpha,\dots,S^{n-1}\alpha\}$ is linearly independent, for if
    $$k_1\alpha+k_2S\alpha+\dots+k_nS^{n-1}\alpha=0$$
    then applying $S^{n-1}$ gives $k_1S^{n-1}\alpha=0$, thus $k_1=0$, and the above becomes
    $$k_2S\alpha+\dots+k_nS^{n-1}\alpha=0$$
    then applying $S^{n-2}$ gives $k_2S^{n-1}\alpha=0$, thus $k_2=0$; continuing in this way we eventually get $k_1=\dots=k_n=0$. Thus we can define $\mathfrak B'=\{\alpha,S\alpha,\dots,S^{n-1}\alpha\}$, and since $S$ carries each vector of $\mathfrak B'$ to the next one and the last to $0$, we have $[S]_{\mathfrak B'}=A$.
    (d) Let $T$ and $S$ be the linear operators on $F^n$ satisfying $[T]_{\mathfrak B}=M$ and $[S]_{\mathfrak B}=N$, where $\mathfrak B=\{\epsilon_1,\dots,\epsilon_n\}$. From (c) we can find ordered bases $\mathfrak B_1$ and $\mathfrak B_2$ such that $[T]_{\mathfrak B_1}=[S]_{\mathfrak B_2}=A$. By Theorem 14, let $P$ be the $n\times n$ matrix with columns $P_j=[\epsilon_j]_{\mathfrak B_1}$ and $Q$ the $n\times n$ matrix with columns $Q_j=[\epsilon_j]_{\mathfrak B_2}$; then
    $$M=[T]_{\mathfrak B}=P^{-1}AP,\quad N=[S]_{\mathfrak B}=Q^{-1}AQ$$
    thus $A=QNQ^{-1}$ and $M=P^{-1}QNQ^{-1}P=(Q^{-1}P)^{-1}N(Q^{-1}P)$, so $M$ and $N$ are similar.
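The nilpotency claims in (a) and (b) are easy to check numerically; a sketch for $n=4$ (the `mul` helper is illustrative):

```python
# Spot-check of Exercise 12(a)-(b): the lower-shift matrix A satisfies
# A^n = 0 but A^{n-1} != 0, here for n = 4.
n = 4
A = [[1 if i == j + 1 else 0 for j in range(n)] for i in range(n)]

def mul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

P = A
for _ in range(n - 2):
    P = mul(P, A)                                  # P = A^(n-1)
assert any(any(row) for row in P)                  # A^(n-1) != 0
assert not any(any(row) for row in mul(P, A))      # A^n = 0
```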

  13. Let $V$ and $W$ be finite-dimensional vector spaces over the field $F$ and let $T$ be a linear transformation from $V$ into $W$. If
    $$\mathfrak B=\{\alpha_1,\dots,\alpha_n\}\text{ and }\mathfrak B'=\{\beta_1,\dots,\beta_m\}$$
    are ordered bases for $V$ and $W$, respectively, define the linear transformations $E^{p,q}$ as in the proof of Theorem 5: $E^{p,q}(\alpha_i)=\delta_{iq}\beta_p$. Then the $E^{p,q}$, $1\leq p\leq m$, $1\leq q\leq n$, form a basis for $L(V,W)$, and so
    $$T=\sum_{p=1}^m\sum_{q=1}^nA_{pq}E^{p,q}$$
    for certain scalars $A_{pq}$ (the coordinates of $T$ in this basis for $L(V,W)$). Show that the matrix $A$ with entries $A(p,q)=A_{pq}$ is precisely the matrix of $T$ relative to the pair $\mathfrak B,\mathfrak B'$.
    Solution: $T\alpha_j=\left(\sum_{p=1}^m\sum_{q=1}^nA_{pq}E^{p,q}\right)(\alpha_j)=\sum_{p=1}^m\sum_{q=1}^nA_{pq}E^{p,q}(\alpha_j)=\sum_{p=1}^m\sum_{q=1}^nA_{pq}\delta_{jq}\beta_p=\sum_{p=1}^mA_{pj}\beta_p$, which says exactly that the $j$-th column of the matrix of $T$ relative to the pair $\mathfrak B,\mathfrak B'$ is $(A_{1j},\dots,A_{mj})$, and the conclusion follows.



Reposted from blog.csdn.net/christangdt/article/details/104002737