Question

In: Math

Let A be a rotation matrix of the plane. Using the concept of a linear transformation, explain why it has no real eigenvector.

Solutions

Expert Solution

There is an intuitive way to view the eigenvalues and eigenvectors, and it ties in with geometric ideas as well (without resorting to four dimensions!).

The matrix is unitary (more specifically, it is real, so it is called orthogonal), and so there is an orthogonal basis of eigenvectors. Here the eigenvectors are $v_1 = (1, -i)$ and $v_2 = (1, i)$, with eigenvalues $e^{i\theta}$ and $e^{-i\theta}$ respectively; for $\theta$ not a multiple of $\pi$ these eigenvalues are non-real, which is exactly why no real eigenvector can exist. The pair $v_1, v_2$ forms a basis of $\mathbb{C}^2$, and so we can write any element of $\mathbb{R}^2$ in terms of $v_1$ and $v_2$ as well, since $\mathbb{R}^2$ is a subset of $\mathbb{C}^2$. (And we normally think of rotations as occurring in $\mathbb{R}^2$! Please note that $\mathbb{C}^2$ is a two-dimensional vector space with components in $\mathbb{C}$ and need not be considered as four-dimensional with components in $\mathbb{R}$.)

We can then represent any vector in $\mathbb{R}^2$ uniquely as a linear combination of these two vectors, $x = \lambda_1 v_1 + \lambda_2 v_2$, with $\lambda_i \in \mathbb{C}$. So, if we call the linear map that the matrix represents $R$:

$R(x) = R(\lambda_1 v_1 + \lambda_2 v_2) = \lambda_1 R(v_1) + \lambda_2 R(v_2) = e^{i\theta}\lambda_1 v_1 + e^{-i\theta}\lambda_2 v_2$

In other words, when working in the basis $v_1, v_2$:

$R\begin{pmatrix}\lambda_1\\\lambda_2\end{pmatrix} = \begin{pmatrix}e^{i\theta}\lambda_1\\ e^{-i\theta}\lambda_2\end{pmatrix}$

And we know that multiplying a complex number by $e^{i\theta}$ is an anticlockwise rotation by $\theta$. So the rotation of a vector, when represented in the basis $v_1, v_2$, is the same as just rotating the individual components of the vector in the complex plane!
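The argument can be checked numerically. A minimal sketch with NumPy (the names theta, R, and v1 are mine, and θ = π/4 is an arbitrary angle that is not a multiple of π):

```python
import numpy as np

# Rotation of the plane by an angle that is not a multiple of pi.
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The eigenvalues come out as e^{i*theta} and e^{-i*theta} ...
eigvals = np.linalg.eigvals(R)
assert np.allclose(sorted(eigvals, key=lambda z: z.imag),
                   [np.exp(-1j * theta), np.exp(1j * theta)])

# ... and neither is real, so R can have no real eigenvector.
assert not np.isreal(eigvals).any()

# v1 = (1, -i) is an eigenvector for e^{i*theta}, as claimed above.
v1 = np.array([1.0, -1.0j])
assert np.allclose(R @ v1, np.exp(1j * theta) * v1)
print("all checks pass")
```

Running it for other non-degenerate angles gives the same picture: both eigenvalues stay on the unit circle but off the real axis.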


Related Solutions

In parts a and b, A is a matrix representation of a linear transformation in the...
In parts a and b, A is a matrix representation of a linear transformation in the standard basis. Find a matrix representation of the linear transformation in the new basis. Show all steps. a. A = 2x2 matrix with the first row being 2 and -1, and the second row being 1 and 3; new basis = {<1, 2>, <1, 1>} b. A = 3x3 matrix with the first row being 2, 1, -1, the second row...
Let V and W be finite dimensional vector spaces and let T: V → W be a linear transformation....
Let V and W be finite dimensional vector spaces and let T: V → W be a linear transformation. We say a linear transformation S: W → V is a left inverse of T if ST = I_V, where I_V denotes the identity transformation on V. We say a linear transformation S: W → V is a right inverse of T if TS = I_W, where I_W denotes the identity transformation on W. Finally, we say a linear transformation S: W → V is an inverse of T if it is both a left and right...
Let T : R^n → R^m be a linear transformation and A the...
Let T : R^n → R^m be a linear transformation and A the standard matrix of T. (a) Let B_N = {~v_1, . . . , ~v_r} be a basis for ker(T) (i.e. Null(A)). We extend B_N to a basis for R^n and denote it by B = {~v_1, . . . , ~v_r, ~u_{r+1}, . . . , ~u_n}. Show the set B_R = {T(~u_{r+1}), . . . , T(~u_n)} is a...
Let T : R2 → R3 be a linear transformation such that T( e⃗1 ) =...
Let T : R2 → R3 be a linear transformation such that T(e⃗1) = (2,3,-5) and T(e⃗2) = (-1,0,1). Determine the standard matrix of T. Calculate T(u⃗), the image of u⃗ = (4,2) under T. Suppose T(v⃗) = (3,2,2) for a certain v⃗ in R2. Calculate the image of w⃗ = 2u⃗ − v⃗. 4. Find a vector v⃗ in R2 that is mapped to 0⃗ in R3.
How can you determine the standard matrix of a linear transformation? What is the difference between...
How can you determine the standard matrix of a linear transformation? What is the difference between a one-to-one linear transformation and an onto linear transformation?
Let T : Rn → Rm be a linear transformation. (a) If {v1,v2,...,vk} is a linearly dependent...
Let T : Rn → Rm be a linear transformation. (a) If {v1,v2,...,vk} is a linearly dependent subset of Rn, prove that {T(v1),T(v2),...,T(vk)} is a linearly dependent subset of Rm. (b) Suppose the kernel of T is {0}. (Recall that the kernel of a linear transformation T : Rn → Rm is the set of all x ∈ Rn such that T(x) = 0.) If {w1,w2,...,wp} is a linearly independent subset of Rn, then show that {T(w1),T(w2),...,T(wp)} is a linearly independent...
Let T: R2 -> R2 be a linear transformation defined by T(x1 , x2) = (x1...
Let T: R2 -> R2 be a linear transformation defined by T(x1, x2) = (x1 + 2x2, 2x1 + 4x2). a. Find the standard matrix of T. b. Find ker(T) and nullity(T). c. Is T one-to-one? Explain.
Let T : Rn -> Rm be an onto linear transformation. Select all the true statements...
Let T : Rn -> Rm be an onto linear transformation. Select all the true statements from the following list. (Incorrect choices will earn you negative points.) For any y in Rm there exists a unique x in Rn such that T(x) = y. For any y in Rm there is at most one x in Rn such that T(x) = y. The range of T is the entire Rm. For any y in Rm there is at least one...
The linear transformation is such that for any v in R2, T(v) = Av. a) Use...
The linear transformation is such that for any v in R2, T(v) = Av. a) Use this relation to find the image of the vectors v1 = [-3, 2]^T and v2 = [2, 3]^T. For the following transformations take k = 0.5 first, then k = 3: T1(x,y) = (kx, y), T2(x,y) = (x, ky), T3(x,y) = (x+ky, y), T4(x,y) = (x, kx+y). For T5 take theta = pi/4 and then theta = pi/2: T5(x,y) = (cos(theta)x - sin(theta)y, sin(theta)x + cos(theta)y). b) Plot v1 and...
Consider the general linear model y = Xβ + ε. Use matrix algebra to show that...
Consider the general linear model y = Xβ + ε. Use matrix algebra to show that β̂ is an unbiased estimator of β.