1- Let W1, W2 be two subspaces of a vector space V. Show that both W1 ∩ W2 and W1 + W2 are subspaces, and show that W1 ∪ W2 is a subspace only when W1 ⊂ W2 or W2 ⊂ W1.
(Recall that W1 + W2 = {x + y | x ∈ W1, y ∈ W2}.)
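(A hint added for intuition on the union statement, not part of the original problem: in R², take W1 to be the x-axis and W2 the y-axis. Then (1,0) ∈ W1 and (0,1) ∈ W2, but (1,0) + (0,1) = (1,1) lies in neither, so W1 ∪ W2 is not closed under addition and is not a subspace.)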
Suppose that U and W are subspaces of a vector space V with V = U ⊕ W. Suppose also that u1, …, um is a basis of U and w1, …, wn is a basis of W. Prove that u1, …, um, w1, …, wn is a basis of V.
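(A possible outline, added as a hint: the combined list spans V because V = U + W, and it is linearly independent because any dependence relation would give a vector lying in both U and W, hence in U ∩ W = {0}, forcing all coefficients to vanish by the independence of each basis.)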
Let U be a subset of a vector space V. Show that span U is the intersection of all the subspaces of V that contain U. What does this say if U = ∅? A proof is needed.
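(A remark on the U = ∅ case, under the standard convention that the span of the empty set is {0}: the intersection of all subspaces of V is also {0}, since every subspace contains the zero vector, so the characterization still holds.)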
(10pt) Let V and W be vector spaces over R. Show that V × W together with
(v0, w0) + (v1, w1) = (v0 + v1, w0 + w1) for v0, v1 ∈ V, w0, w1 ∈ W, and
λ · (v, w) = (λ · v, λ · w) for λ ∈ R, v ∈ V, w ∈ W
is a vector space over R.
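(As a concrete illustration, not part of the problem: taking V = W = R recovers R² with the usual componentwise addition and scalar multiplication.)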
(5pt) Let V be a vector space over R, λ, μ ∈ R, and u, v ∈ V. Prove that
(λ + μ)(u + v) = ((λu + λv) + μu) + μv.
(In your proof, carefully refer to which axioms of a vector space you use for every equality. Use brackets and refer to Axiom 2 if and when you change them.)
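(A sketch of the expected chain of equalities; the axiom labels in brackets are descriptive, since the course's own numbering is not given here:
(λ + μ)(u + v) = (λ + μ)u + (λ + μ)v [distributivity over vector addition]
= (λu + μu) + (λv + μv) [distributivity over scalar addition]
= ((λu + μu) + λv) + μv [associativity of vector addition]
= ((λu + λv) + μu) + μv [commutativity and associativity of vector addition].)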
QUESTION 1
Vector Space Axioms
Let V be a set on which two operations, called vector addition and scalar multiplication, have been defined. If u and v are in V, the sum of u and v is denoted by u + v, and if k is a scalar, the scalar multiple of u is denoted by ku. If the following axioms are satisfied for all u, v, and w in V and for all scalars k...
Let T and S be linear transformations of a vector space V with TS = ST.
(a) Show that T preserves the eigenspaces and generalized eigenspaces of S.
(b) Suppose V is a vector space over R with dim V = 4, and S has minimal polynomial (t − 2)²(t − 3)². What is the Jordan canonical form of S?
(c) Show that the characteristic polynomial of T has at most 2 distinct roots and splits completely.
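(A hedged note on part (b), assuming the standard theory: since dim V = 4 and the minimal polynomial (t − 2)²(t − 3)² already has degree 4, the characteristic polynomial of S equals (t − 2)²(t − 3)², and the largest Jordan block for each eigenvalue has size 2; so the Jordan form of S consists of one 2×2 block for eigenvalue 2 and one 2×2 block for eigenvalue 3.)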
Let S be a subset of a vector space V . Show that span(S) =
span(span(S)). Show that span(S) is the unique smallest linear
subspace of V containing S as a subset, and that it is the
intersection of all subspaces of V that contain S as a subset.
A full mathematical explanation is needed.
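(A possible outline, added as a hint: span(S) is itself a subspace containing S, so the intersection of all subspaces containing S is contained in span(S); conversely, every subspace containing S is closed under linear combinations and therefore contains span(S), so span(S) lies in the intersection. Applying the same characterization to the subspace span(S) gives span(span(S)) = span(S).)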
1. A weight vector is given as (w1 = 3, w2 = −2, w0 = 1) for a linear model with a soft-threshold (sigmoid) function f(x). Define the decision boundary as the set of feature vectors x for which f(x) = 0.5, and plot this boundary in two dimensions.
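A minimal plotting sketch, assuming the usual form f(x) = 1/(1 + exp(−(w1·x1 + w2·x2 + w0))), so that f(x) = 0.5 exactly where 3·x1 − 2·x2 + 1 = 0; the plotting range and labels below are illustrative choices, not given in the problem.

import numpy as np
import matplotlib.pyplot as plt

# Given weights of the linear model inside the sigmoid.
w1, w2, w0 = 3.0, -2.0, 1.0

# f(x) = 0.5 exactly where the linear part vanishes:
# w1*x1 + w2*x2 + w0 = 0  =>  x2 = -(w1*x1 + w0) / w2
x1 = np.linspace(-5, 5, 200)        # illustrative range
x2 = -(w1 * x1 + w0) / w2           # boundary points with f(x) = 0.5

plt.plot(x1, x2, label="3*x1 - 2*x2 + 1 = 0  (f(x) = 0.5)")
plt.xlabel("x1")
plt.ylabel("x2")
plt.title("Decision boundary of the sigmoid linear model")
plt.legend()
plt.show()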
2. Generating training samples: in the two-dimensional feature space with augmented vectors x = (x1, x2, 1), generate 20 random samples, with different values of (x1, x2), belonging to two different classes C1 (1) and C2...