Proof:
Let S ⊆ V be a subset of a vector space V over a field F. We claim that S is linearly dependent if and only if there exist distinct vectors v1, v2, …, vn ∈ S such that vi is a linear combination of v1, v2, …, vi−1, vi+1, …, vn for some 1 ≤ i ≤ n.
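As a concrete illustration of the statement (an example of ours in R², not part of the original text): the third vector below is a sum of the first two, so the set containing all three is linearly dependent.

```latex
% Illustrative example in R^2 (our assumption, for exposition only):
% v3 = v1 + v2, so {v1, v2, v3} is a linearly dependent set.
\[
v_1 = (1,0), \qquad v_2 = (0,1), \qquad v_3 = (1,1), \qquad
v_3 = 1\cdot v_1 + 1\cdot v_2 .
\]
```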
Let us first assume that there are distinct vectors v1, v2, …, vn ∈ S such that some vi, 1 ≤ i ≤ n, is a linear combination of v1, v2, …, vi−1, vi+1, …, vn. Then there exist scalars a1, a2, …, ai−1, ai+1, …, an such that vi = a1v1 + a2v2 + … + ai−1vi−1 + ai+1vi+1 + … + anvn. Subtracting vi from both sides gives a1v1 + a2v2 + … + ai−1vi−1 + (−1)vi + ai+1vi+1 + … + anvn = 0. This is a linear combination of v1, v2, …, vn that equals zero and whose coefficients are not all zero, since the coefficient of vi is −1. Hence, by the definition of linear dependence, the vectors v1, v2, …, vn are linearly dependent, and therefore the set S is linearly dependent.
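In the R² illustration above, this rearrangement step looks as follows (again a sketch of ours, not part of the proof): the relation v3 = v1 + v2 becomes a dependence relation whose coefficient on v3 is −1.

```latex
% Forward direction on the R^2 example: subtracting v3 yields a
% linear combination equal to zero with a nonzero coefficient (-1) on v3.
\[
v_3 = v_1 + v_2
\quad\Longrightarrow\quad
1\cdot v_1 + 1\cdot v_2 + (-1)\cdot v_3 = 0 .
\]
```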
Now, let us assume that S is a linearly dependent set. By the definition of linear dependence, there are distinct vectors v1, v2, …, vn ∈ S and scalars a1, a2, …, an, not all zero, such that a1v1 + a2v2 + … + anvn = 0. Since the coefficients are not all zero, ai ≠ 0 for some 1 ≤ i ≤ n. Moving the other terms to the right-hand side and dividing by ai gives vi = −(1/ai)(a1v1 + a2v2 + … + ai−1vi−1 + ai+1vi+1 + … + anvn). Thus, vi is a linear combination of v1, v2, …, vi−1, vi+1, …, vn, which completes the proof.
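To make the sign in the last step concrete, here is the solving step carried out on the same illustrative R² relation (our example, not the original author's): with coefficients a1 = 1, a2 = 1, a3 = −1, the nonzero coefficient a3 lets us solve for v3.

```latex
% Backward direction on the R^2 example: from 1*v1 + 1*v2 + (-1)*v3 = 0,
% the coefficient a3 = -1 is nonzero, so we may solve for v3.
\[
v_3 = -\frac{1}{a_3}\,(a_1 v_1 + a_2 v_2)
    = -\frac{1}{-1}\,(1\cdot v_1 + 1\cdot v_2)
    = v_1 + v_2 .
\]
```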