Question

In: Advanced Math

Let V be a vector space, and suppose that U and W are both subspaces of V. Show that U ∩ W := {v | v ∈ U and v ∈ W} is a subspace of V.

Solutions

Expert Solution

Definition: Let V be a vector space. A subset A ⊆ V is said to be a subspace of V if the following two conditions hold:

(i) if x ∈ A and y ∈ A, then x + y ∈ A;

(ii) if x ∈ A and c is a scalar, then cx ∈ A.

Now suppose that U and W are both subspaces of V.

(i) Let x, y ∈ U ∩ W.

Then x, y ∈ U and x, y ∈ W. Since both U and W are subspaces of V, x + y belongs to both U and W.

This implies x + y ∈ U ∩ W. So x ∈ U ∩ W and y ∈ U ∩ W together imply x + y ∈ U ∩ W.

So condition (i) is satisfied.

(ii) Suppose x ∈ U ∩ W and let c be a scalar.

Since x ∈ U ∩ W, we have x ∈ U and x ∈ W.

Now, as both U and W are subspaces of V,

cx ∈ U and cx ∈ W, so cx ∈ U ∩ W.

So x ∈ U ∩ W and c a scalar imply cx ∈ U ∩ W.

So condition (ii) is satisfied.

Hence U ∩ W is a subspace of V.
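The two closure conditions can be illustrated numerically. The following is a minimal sketch, assuming a hypothetical concrete example in R³ (the subspaces U and W below are chosen for illustration and are not part of the original problem):

```python
import numpy as np

# Hypothetical example in R^3:
#   U = span{e1, e2}  (vectors whose third coordinate is 0)
#   W = span{e2, e3}  (vectors whose first coordinate is 0)
# Then U ∩ W = span{e2} (first and third coordinates both 0).

def in_U(v):
    return bool(np.isclose(v[2], 0.0))

def in_W(v):
    return bool(np.isclose(v[0], 0.0))

def in_intersection(v):
    # v ∈ U ∩ W means v ∈ U and v ∈ W, exactly as in the definition above.
    return in_U(v) and in_W(v)

# Pick two vectors in U ∩ W and a scalar, then check both closure conditions.
x = np.array([0.0, 3.0, 0.0])
y = np.array([0.0, -1.5, 0.0])
c = 7.0

assert in_intersection(x) and in_intersection(y)
assert in_intersection(x + y)   # condition (i): closed under addition
assert in_intersection(c * x)   # condition (ii): closed under scalar multiplication
```

Of course, a finite numerical check is not a proof; the proof above holds for arbitrary vectors and scalars, while this sketch only spot-checks the closure conditions for one choice of x, y, and c.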

