1. If S is a basis of a vector space
V then every vector in V has exactly one representation as a linear combination of elements of S.
Proof. 1. Let S be a basis of
a vector space V. Then by the
definition of a basis, V=span(S), so every vector in V is equal to a linear
combination of vectors from S. It remains to prove that this linear
combination is unique. By contradiction, suppose that there exists a vector
a in V which is equal to two different linear combinations of vectors s1,...,sk from S (we may assume that both combinations use the same elements s1,...,sk, adding terms with zero coefficients if necessary):
a = x1s1+x2s2+...+xksk and a = y1s1+y2s2+...+yksk.
Subtracting the first equality from the second one, we get:
0 = (y1-x1)s1+(y2-x2)s2+...+(yk-xk)sk.
Thus the zero vector
is equal to a linear combination of elements from S. Since xi is not equal to yi for some i, some of the coefficients of this linear combination are not equal
to zero. By the theorem
about linearly independent sets of vectors, S is not linearly independent.
But this contradicts the assumption that S is a basis. Thus we assumed that a is represented by
two different linear combinations of elements of S and deduced a
contradiction. This completes the proof of part 1.
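For a concrete numerical illustration (our own example, not part of the proof), take V=R^2 with the basis {(1,1), (1,-1)}: the coordinates of a vector in this basis are the unique solution of an invertible linear system. A minimal sketch in Python:

```python
import numpy as np

# Our own example: the basis {(1,1), (1,-1)} of R^2.
# The columns of B are the basis vectors.
B = np.array([[1.0,  1.0],
              [1.0, -1.0]])
a = np.array([3.0, 1.0])

# Because the columns of B form a basis, B is invertible, so the
# coordinate vector x with B @ x = a exists and is unique.
x = np.linalg.solve(B, a)
print(x)                      # [2. 1.]  ->  a = 2*(1,1) + 1*(1,-1)
print(np.allclose(B @ x, a))  # True
```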
2.1.
If V has a basis with n elements then every set of vectors in V which has
more than n elements is linearly dependent.
Proof. Let {v1,...,vn}
be a basis of V and let S be a subset of V with m>n elements. We need to
prove that S is linearly dependent.
Since {v1,...,vn} is a basis of V, every element of S={s1,...,sm} is a linear
combination of the v's:
s1 = a11v1+a21v2+...+an1vn
s2 = a12v1+a22v2+...+an2vn
...
sm = a1mv1+a2mv2+...+anmvn
Consider the following n-vectors (the vectors of coefficients in these equalities):
t1 = (a11,a21,...,an1), t2 = (a12,a22,...,an2), ..., tm = (a1m,a2m,...,anm).
Claim: these vectors are linearly dependent. In order to prove that,
by the theorem about linearly independent sets, we need
to find numbers x1,...,xm, not all equal to zero, such that
x1t1+x2t2+...+xmtm = 0 (the zero n-vector).
This leads to a homogeneous system of linear equations with n equations and
m unknowns. Since m>n, the number of equations is smaller than the number
of unknowns. By the theorem
about homogeneous systems of equations, this system has a non-zero solution
x1,...,xm.
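As a quick computational sanity check of this step (a toy system of our own, not from the text): for 2 equations in 3 unknowns, a computer algebra system produces a non-zero solution of the homogeneous system directly.

```python
from sympy import Matrix

# Our own toy coefficient matrix: n = 2 equations, m = 3 unknowns.
A = Matrix([[1, 2, 3],
            [4, 5, 6]])

# nullspace() returns a basis of the solution set of A*x = 0; because there
# are more unknowns than equations, this basis is guaranteed to be non-empty.
x = A.nullspace()[0]
print(x.T)        # Matrix([[1, -2, 1]])  -- a non-zero solution
print((A * x).T)  # Matrix([[0, 0]])
```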
Now return to the equalities expressing s1,...,sm through v1,...,vn.
Multiply the first equality by x1, the second one by x2, ...,
the last one by xm, and add them all. It is easy to see that on the
left we shall get the vector x1s1+...+xmsm. On the right we shall
get p1v1+...+pnvn where pi (i=1,...,n) is the i-th coordinate
of the vector x1t1+...+xmtm. Since this vector is zero, all its coordinates
are equal to 0, so the right hand side is equal to the zero vector.
Thus x1s1+...+xmsm=0, so the zero vector is equal to
a linear combination of elements of S in which not all coefficients are 0.
According to the theorem about linearly independent sets, this means that S is linearly dependent. The proof of part 2.1 is complete.
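To see the whole argument on a toy example of our own: take V=R^2 with the standard basis (so n=2) and m=3 vectors s1, s2, s3. Their coordinate vectors t1, t2, t3 form the columns of a 2-by-3 matrix, and any non-zero solution of the corresponding homogeneous system gives a dependence among s1, s2, s3.

```python
import numpy as np

# Our own toy example: V = R^2 with the standard basis v1=(1,0), v2=(0,1),
# and m = 3 > n = 2 vectors s1, s2, s3.
s1, s2, s3 = np.array([1.0, 4.0]), np.array([2.0, 5.0]), np.array([3.0, 6.0])

# With respect to the standard basis the coordinate vectors t1, t2, t3 are the
# vectors themselves; put them as columns of a 2x3 matrix.
T = np.column_stack([s1, s2, s3])

# A non-zero solution of the homogeneous system T @ x = 0
# (found as in the previous snippet; here we only verify it).
x = np.array([1.0, -2.0, 1.0])
print(T @ x)                         # [0. 0.]
print(x[0]*s1 + x[1]*s2 + x[2]*s3)   # [0. 0.]  ->  s1 - 2*s2 + s3 = 0
```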
2.2. If V has a basis with n elements then every set of vectors with fewer than n elements does not span V.
Proof. Suppose that V has a basis S with n elements and suppose that T is a subset of V with fewer than n
elements. We need to show that T does not span V. By contradiction, assume that
T spans V. By the theorem
about throwing away elements, we can find a subset T' of T which is linearly
independent and also spans V. Since V=span(T') and T' is linearly independent,
T' is a basis of V. Being a subset of T, the set T' has fewer than n elements. Since the
number of elements in S is bigger than the number of elements in this basis T',
we can conclude (using part 2.1
of our theorem) that S is
linearly dependent. But this contradicts the assumption that S is a basis.
The proof of part 2.2 is complete.
3.
If V has a basis with n elements then all
bases of V have the same number of elements.
Proof. This follows directly from parts 2.1 and 2.2: let B be any basis of V. Since B is linearly independent, part 2.1 shows that B has at most n elements, and since B spans V, part 2.2 shows that B has at least n elements. Hence B has exactly n elements.