1. If a subset S of a vector space V contains 0, then S is linearly dependent. We will prove this by contradiction: we assume that a subset S of a vector space V contains 0, but that S is nevertheless linearly independent.

If a subset S = {0, s1, s2, ..., sn} of V contains 0, then we can write the following linear combination of the elements of S:
    
    x1*0 + x2*s1 + x3*s2 + ... + x(n+1)*sn = 0

where x1 is any non-zero scalar and x2 = x3 = ... = x(n+1) = 0.


Thus we have a linear combination of elements of S which is equal to zero and which does NOT have all zero coefficients (note that x1 is non-zero). But recall the following theorem from the web notes:

Theorem. A subset S of a vector space V is linearly independent if and only if there exists exactly one linear combination of elements of S which is equal to 0, the one with all zero coefficients.

Thus we have a contradiction, and S cannot be linearly independent. We also know that a set is linearly dependent if it is not linearly independent. Thus S is linearly dependent and the proof is complete.
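
A quick numerical illustration of this argument (a minimal sketch, assuming NumPy is available; the specific vectors and the value of x1 are invented for the example):

    import numpy as np

    # A set S in R^3 containing the zero vector plus two arbitrary vectors.
    zero = np.zeros(3)
    s1 = np.array([1.0, 2.0, 3.0])
    s2 = np.array([4.0, 5.0, 6.0])

    # The combination from the proof: x1 is any non-zero scalar, the rest are 0.
    x1, x2, x3 = 5.0, 0.0, 0.0
    combo = x1*zero + x2*s1 + x3*s2

    # The combination equals 0 even though x1 != 0, so S is linearly dependent.
    print(np.allclose(combo, 0))  # True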

2. A set S with exactly two vectors is linearly dependent if and only if one of these vectors is a scalar multiple of the other. Let S contain the two vectors s1 and s2. Negating both sides of the theorem about linearly independent sets of elements in a vector space tells us that S is linearly dependent if and only if we can write

a1*s1 + a2*s2 = 0


for a1, a2 not both zero.

If we assume S to be linearly dependent, then we can write a1*s1 = -a2*s2. At least one of the ai is non-zero, so we can always solve for at least one of the elements of S; we find that
 
    s1 = -(a2/a1)*s2   (if a1 is non-zero)
or  s2 = -(a1/a2)*s1   (if a2 is non-zero).


In either case, one of the elements of S is a scalar multiple of the other.
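
As a concrete check of this direction (again a sketch assuming NumPy; the coefficients a1, a2 and the vector s2 are invented):

    import numpy as np

    # A dependence relation a1*s1 + a2*s2 = 0 with a1, a2 not both zero.
    a1, a2 = 2.0, -4.0
    s2 = np.array([1.0, 3.0])
    s1 = -(a2/a1)*s2              # solve for s1 as in the proof (a1 != 0)

    # s1 satisfies the relation and is a scalar multiple of s2.
    print(np.allclose(a1*s1 + a2*s2, 0))  # True
    print(np.allclose(s1, 2.0*s2))        # True: -(a2/a1) = 2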

If one of the elements is a scalar multiple of the other, then for some scalar x, s1 = x*s2 or s2 = x*s1. By relabeling s1 and s2 if necessary, we may assume without loss of generality that s1 = x*s2.

Whatever value x has, we can choose scalars a1 and a2 with x = -a2/a1; simply take a1 = 1 and a2 = -x. So

s1 = -(a2/a1)*s2,


or

a1*s1 = -a2*s2.


Thus a1*s1 + a2*s2 = 0, where a1 and a2 are not both zero. So s1 and s2 are linearly dependent by the negation of the theorem mentioned above.

So S is linearly dependent if and only if one element is a scalar multiple of the other.
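
The whole equivalence can be spot-checked numerically: two vectors are linearly dependent exactly when the matrix having them as columns has rank less than 2 (a sketch assuming NumPy; the helper name dependent and the test vectors are my own):

    import numpy as np

    def dependent(s1, s2):
        # Rank below 2 means the columns admit a non-trivial zero combination.
        return np.linalg.matrix_rank(np.column_stack([s1, s2])) < 2

    s1 = np.array([1.0, 2.0])
    print(dependent(s1, 3.0*s1))                # True: s2 = 3*s1
    print(dependent(s1, np.array([0.0, 1.0])))  # False: neither is a multiple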

3. If a subset of a set S is linearly dependent, then the whole set is linearly dependent. The web notes state that a set is linearly dependent if it is not linearly independent. The web notes also state that a set S = {s1, ..., sn} is linearly independent if and only if

0 = a1*s1 + a2*s2 + ... + an*sn

implies

a1 = a2 = ... = an = 0.

Thus, if a subset {s1, ..., si} of S is linearly dependent, we can write a linear combination of the elements of this subset:

0 = a1*s1 + a2*s2 + ... + ai*si


where a1, ..., ai are not all equal to zero (that is, at least one of them is non-zero).

Every element of this subset is, by the definition of a subset, also contained in the whole set S.

Thus we can write a linear combination of the whole set S:

0 = a1*s1 + ... + ai*si + bj*sj + ... + bn*sn

where sj, ..., sn are the elements of S not in the subset {s1, ..., si};
bj = ... = bn = 0;
and a1, ..., ai are the same scalars as before, so they are not all equal to zero.
 

Thus we have a linear combination of the whole set S that is equal to zero without having all zero coefficients, so by the theorem about linear independence in the web notes, S cannot be linearly independent. And we know from the web notes that a set is linearly dependent if it is not linearly independent. Thus, if a subset of S is linearly dependent, then the whole set S is linearly dependent and the proof is complete.
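
A numerical illustration of this padding construction (a sketch assuming NumPy; the subset {s1, s2}, the extra element s3, and the scalars are invented):

    import numpy as np

    # A dependent subset {s1, s2} of S = {s1, s2, s3}: here s2 = -2*s1.
    s1 = np.array([1.0, 0.0, 1.0])
    s2 = -2.0*s1
    s3 = np.array([0.0, 1.0, 0.0])   # the element of S outside the subset

    # Non-trivial relation on the subset (2*s1 + 1*s2 = 0), padded with a
    # zero coefficient for s3, as in the proof.
    coeffs = np.array([2.0, 1.0, 0.0])
    combo = coeffs[0]*s1 + coeffs[1]*s2 + coeffs[2]*s3

    # Zero combination with not-all-zero coefficients: S is dependent.
    print(np.allclose(combo, 0), np.any(coeffs != 0))  # True True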