The Student
Junior Member
My notes have the following in bold.
Proposition. If X is linearly dependent, and X is a subset of a finite set Y, then Y is also linearly dependent.
Proof. Suppose that
X = {v1,...,vm}, Y = {v1,...,vm,w1,...,wk}.
Since X is linearly dependent, there are scalars a1,...,am, not all 0, such that a1v1 + ··· + amvm = 0.
Then
a1v1 + ··· + amvm + 0w1 + ··· + 0wk = 0,
and the coefficients in this combination are still not all 0 (the ai are unchanged), so Y is linearly dependent.
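For concreteness, a small worked example in the spirit of the proof (the specific vectors are my own illustration, not from the notes):

\[
X = \{(1,0),\,(2,0)\}, \qquad Y = \{(1,0),\,(2,0),\,(0,1)\}
\]

Here X is linearly dependent, since \(2(1,0) - (2,0) = (0,0)\). Padding that same relation with a zero coefficient on the extra vector,

\[
2(1,0) - 1(2,0) + 0(0,1) = (0,0),
\]

exhibits a nontrivial linear combination of the elements of Y equal to zero, so Y is also linearly dependent.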
The proposition seems to be false. If the coefficients on the w vectors are not zero, and the w vectors are not linear combinations of the v vectors, then wouldn't that mean that some of Y is linearly independent of X?