Prove the following statements:
Suppose that S is a linearly independent set of vectors in the
vector space V and let w be a vector of V that is not in the
span of S. Then the set obtained by adding w to S is linearly
independent in V.
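One possible outline, assuming S = {s1, ..., sn} is finite: suppose
c·w + c1·s1 + ... + cn·sn = 0. If c were nonzero, then
w = -(c1/c)·s1 - ... - (cn/c)·sn would lie in the span of S, a
contradiction; so c = 0, and linear independence of S then forces
c1 = ... = cn = 0.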
If U is a subspace of a finite-dimensional vector space V and
dim(U) = dim(V), then U = V.
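One possible outline: take a basis B of U; it is a linearly
independent set of dim(U) = dim(V) vectors in V, hence a basis of V,
so U = span(B) = V.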
2. Find all eigenvalues and corresponding linearly independent
eigenvectors of the 2x2 matrix
A = [ 2  0 ]
    [ 3  4 ]
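A quick symbolic check of problem 2 (a sketch using sympy, assuming
the listed entries fill the matrix row by row, i.e. rows [2 0] and
[3 4]):

from sympy import Matrix

A = Matrix([[2, 0],
            [3, 4]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenspace basis)
for val, mult, vecs in A.eigenvects():
    print(val, mult, [list(v) for v in vecs])
# Expected: eigenvalues 2 and 4, each with a one-dimensional eigenspace.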
4. Find all eigenvalues and corresponding linearly independent
eigenvectors of the 3x3 matrix
A = [ 1  0  1 ]
    [ 0  2  3 ]
    [ 0  0  3 ]
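The same check for problem 4 (a sketch, again assuming a row-by-row
reading: rows [1 0 1], [0 2 3], [0 0 3]):

from sympy import Matrix, symbols

lam = symbols('lambda')
A = Matrix([[1, 0, 1],
            [0, 2, 3],
            [0, 0, 3]])

# A is upper triangular, so the eigenvalues are the diagonal entries 1, 2, 3
print(A.charpoly(lam).as_expr().factor())
print(A.eigenvects())   # one independent eigenvector for each eigenvalue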
6. Find all eigenvalues and corresponding eigenvectors of the 3x3 matrix
A = [ 1  2  3 ]
    [ 0  1  2 ]
    [ 0  0  1 ]
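Problem 6 is different in character: the matrix is triangular with 1
in every diagonal position, so λ = 1 is the only eigenvalue (algebraic
multiplicity 3), and its eigenspace can be smaller. A sketch to check
the dimension of that eigenspace:

from sympy import Matrix, eye

A = Matrix([[1, 2, 3],
            [0, 1, 2],
            [0, 0, 1]])

# Eigenvectors for lambda = 1 form the null space of A - I;
# here that null space is one-dimensional, spanned by (1, 0, 0).
print((A - eye(3)).nullspace())
print(A.eigenvects())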
Use the method of Frobenius to obtain two linearly independent
series solutions about x = 0. Form the general solution on
(0, ∞).
2x^2y'' - xy' + (x^2 + 1)y = 0
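A starting point (a sketch of the indicial step): substituting
y = x^r (a0 + a1 x + a2 x^2 + ...) with a0 ≠ 0 into
2x^2y'' - xy' + (x^2 + 1)y = 0, the lowest power of x gives the
indicial equation
2r(r - 1) - r + 1 = 0, i.e. 2r^2 - 3r + 1 = (2r - 1)(r - 1) = 0,
so r = 1 and r = 1/2. The roots do not differ by an integer, so the
method of Frobenius yields two linearly independent series solutions
valid on (0, ∞).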
Use power series to find two linearly independent solutions
centered at the point x=0
1) y'' + 2y' - 2y = 0
2) 2x^2y'' + x(x-1)y' - 2y = 0
please show work, thank you!
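A sketch of how the two cases differ: for 1), x = 0 is an ordinary
point, so substituting y = a0 + a1 x + a2 x^2 + ... into
y'' + 2y' - 2y = 0 and shifting indices gives, for n >= 0,
(n + 2)(n + 1)a_{n+2} + 2(n + 1)a_{n+1} - 2a_n = 0,
i.e. a_{n+2} = (2a_n - 2(n + 1)a_{n+1}) / ((n + 2)(n + 1)),
and the choices (a0, a1) = (1, 0) and (0, 1) generate two linearly
independent solutions. For 2), x = 0 is a regular singular point
(the leading coefficient 2x^2 vanishes there), so the substitution
y = x^r (a0 + a1 x + ...) is needed; its indicial equation is
2r(r - 1) - r - 2 = (2r + 1)(r - 2) = 0, giving r = 2 and r = -1/2.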
Let v1 = (1,0,1,0), v2 = (0,−1,1,−1), v3 = (1,1,1,1) be linearly
independent vectors in ℝ⁴.
a. Apply the Gram-Schmidt algorithm to orthonormalise the vectors
{v1, v2, v3}, obtaining an orthonormal set {u1, u2, u3}.
b. Find a vector u4 such that {u1, u2, u3, u4} is an orthonormal basis
for ℝ⁴ (where ℝ⁴ is the Euclidean space, that is, the
inner product is the dot product).
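A quick way to check the hand computation (a sketch using sympy; the
labels v1, v2, v3 and u1, ..., u4 follow the statement above):

from sympy import Matrix, GramSchmidt

v1 = Matrix([1, 0, 1, 0])
v2 = Matrix([0, -1, 1, -1])
v3 = Matrix([1, 1, 1, 1])

# Part a: Gram-Schmidt; orthonormal=True also divides each vector by its norm
u1, u2, u3 = GramSchmidt([v1, v2, v3], orthonormal=True)

# Part b: u4 must be a unit vector orthogonal to u1, u2, u3, i.e. a
# normalised basis vector of the null space of the matrix with rows u1, u2, u3
u4 = Matrix.vstack(u1.T, u2.T, u3.T).nullspace()[0]
u4 = u4 / u4.norm()
print(u1.T, u2.T, u3.T, u4.T)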