Problem 1: Let \mathbf{u}=\begin{bmatrix}3\\2\\-4\end{bmatrix}, \mathbf{v}=\begin{bmatrix}-6\\1\\7\end{bmatrix}, \mathbf{w}=\begin{bmatrix}0\\-5\\2\end{bmatrix} and \mathbf{z}=\begin{bmatrix}3\\7\\-5\end{bmatrix}.
- Are the sets \{\mathbf{u}, \mathbf{v}\}, \{\mathbf{u},\mathbf{w}\}, \{\mathbf{u},\mathbf{z}\}, \{\mathbf{v},\mathbf{w}\}, \{\mathbf{v},\mathbf{z}\} and \{\mathbf{w},\mathbf{z}\} each linearly independent? Why or why not?
- Does the answer to the first part imply that \{\mathbf{u}, \mathbf{v}, \mathbf{w}, \mathbf{z}\} is linearly independent?
- To determine whether \{\mathbf{u}, \mathbf{v}, \mathbf{w}, \mathbf{z}\} is linearly dependent, is it wise to check whether, say, \mathbf{w} is a linear combination of \mathbf{u}, \mathbf{v}, and \mathbf{z}?
- Is \{\mathbf{u}, \mathbf{v}, \mathbf{w}, \mathbf{z}\} linearly dependent?
Solution:
- Each of these sets contains exactly two vectors, and in every case neither vector is a scalar multiple of the other, so all six sets are linearly independent.
- The answer to the first part does not imply that \{\mathbf{u}, \mathbf{v}, \mathbf{w}, \mathbf{z}\} is linearly independent. In general, pairwise linear independence does not guarantee that the whole set is linearly independent.
- It is not wise to check whether \mathbf{w} is a linear combination of \mathbf{u}, \mathbf{v}, and \mathbf{z} in order to decide whether \{\mathbf{u}, \mathbf{v}, \mathbf{w}, \mathbf{z}\} is linearly dependent. If \mathbf{w} does happen to be a linear combination of \mathbf{u}, \mathbf{v}, and \mathbf{z}, then \{\mathbf{u}, \mathbf{v}, \mathbf{w}, \mathbf{z}\} is indeed linearly dependent. However, if \mathbf{w} is not such a linear combination, we cannot conclude that the set is linearly independent: some other vector in the set could still be a linear combination of the remaining ones. For example, in the set \{\mathbf{e_1}, 2\mathbf{e_1}, \mathbf{e_2}\} in \mathbb{R}^2, the vector \mathbf{e_2} is not a linear combination of \mathbf{e_1} and 2\mathbf{e_1}, yet the set is linearly dependent.
- The set \{\mathbf{u}, \mathbf{v}, \mathbf{w}, \mathbf{z}\} is linearly dependent because it contains four vectors in \mathbb{R}^3, and any set of more than n vectors in \mathbb{R}^n is linearly dependent; an explicit dependence relation is shown below.
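Although the dimension argument already settles the question, one can verify a dependence relation directly: 3\mathbf{u}+\mathbf{v} = 3\begin{bmatrix}3\\2\\-4\end{bmatrix}+\begin{bmatrix}-6\\1\\7\end{bmatrix} = \begin{bmatrix}3\\7\\-5\end{bmatrix} = \mathbf{z}, so 3\mathbf{u}+\mathbf{v}+0\mathbf{w}-\mathbf{z}=\mathbf{0} is a nontrivial dependence relation among the four vectors.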
Problem 2: For what values of h is \mathbf{v_3} in \mathrm{span}(\mathbf{v_1}, \mathbf{v_2}) and for what values of h is \{\mathbf{v_1}, \mathbf{v_2}, \mathbf{v_3}\} linearly dependent? Justify each answer. \mathbf{v_1}=\begin{bmatrix}1\\-3\\2\end{bmatrix}, \mathbf{v_2}=\begin{bmatrix}-3\\9\\-6\end{bmatrix}, \mathbf{v_3}=\begin{bmatrix}5\\-7\\h\end{bmatrix}.
Solution: Consider the vector equation x_1\mathbf{v_1}+x_2\mathbf{v_2}=\mathbf{v_3}. The corresponding augmented matrix is \begin{bmatrix}1 & -3 & 5\\-3 & 9 & -7\\2 & -6 & h\end{bmatrix}. After one row operation (adding 3 times the first row to the second row), we obtain \begin{bmatrix}1 & -3 & 5\\0 & 0 & 8\\2 & -6 & h\end{bmatrix}, whose second row corresponds to the equation 0=8, a contradiction. Hence the vector equation has no solution, so \mathbf{v_3} is not in \mathrm{span}(\mathbf{v_1}, \mathbf{v_2}) for any value of h.
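In detail, the row operation R_2 \to R_2 + 3R_1 replaces the second row by (-3+3\cdot 1,\; 9+3\cdot(-3),\; -7+3\cdot 5) = (0,\; 0,\; 8), which is exactly the row expressing the impossible equation 0x_1+0x_2=8.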
On the other hand, consider the vector equation x_1\mathbf{v_1}+x_2\mathbf{v_2}+x_3\mathbf{v_3}=\mathbf{0}. The corresponding augmented matrix is \begin{bmatrix}1 & -3 & 5 & 0\\-3 & 9 & -7 & 0\\2 & -6 & h & 0\end{bmatrix}. After three row operations, we obtain \begin{bmatrix}1 & -3 & 5 & 0\\0 & 0 & 8 & 0\\0 & 0 & 0 & 0\end{bmatrix}, in which the second column is not a pivot column, so x_2 is a free variable. Hence the vector equation has infinitely many solutions, and \{\mathbf{v_1}, \mathbf{v_2}, \mathbf{v_3}\} is linearly dependent for every value of h.
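To make the dependence explicit, note that \mathbf{v_2} = -3\mathbf{v_1}, so (x_1, x_2, x_3) = (3, 1, 0) is one nontrivial solution: 3\mathbf{v_1}+\mathbf{v_2}+0\mathbf{v_3} = 3\begin{bmatrix}1\\-3\\2\end{bmatrix}+\begin{bmatrix}-3\\9\\-6\end{bmatrix} = \begin{bmatrix}0\\0\\0\end{bmatrix}, independently of h.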
Problem 3: Describe the possible echelon forms of a 2\times 2 matrix with linearly dependent columns.
Solution: The echelon form of a 2\times 2 matrix A can be written as \begin{bmatrix}a & b\\0 & d\end{bmatrix}. Because A has linearly dependent columns, the equation A\mathbf{x}=\mathbf{0} has a nontrivial solution, and since A is row equivalent to its echelon form, the equation \begin{bmatrix}a & b\\0 & d\end{bmatrix}\mathbf{x}=\mathbf{0} also has a nontrivial solution. This forces a=0 or d=0, so the matrix has at most one pivot. Moreover, in an echelon form a=0 forces d=0 as well: otherwise either a zero row would sit above a nonzero row, or both rows would have their leading entries in the second column. Hence the second row must be zero, and the possible echelon forms are \begin{bmatrix}a & b\\0 & 0\end{bmatrix} with a\neq 0, \begin{bmatrix}0 & b\\0 & 0\end{bmatrix} with b\neq 0, and the zero matrix \begin{bmatrix}0 & 0\\0 & 0\end{bmatrix}.
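As a concrete illustration (an example chosen here, not part of the problem statement): the matrix \begin{bmatrix}1 & 2\\2 & 4\end{bmatrix} has linearly dependent columns, and the row operation R_2 \to R_2 - 2R_1 gives the echelon form \begin{bmatrix}1 & 2\\0 & 0\end{bmatrix}, which matches the first pattern above.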
Problem 4: Suppose an m\times n matrix A has n pivot columns. Explain why for each \mathbf{b} in \mathbb{R}^m the equation A\mathbf{x}=\mathbf{b} has at most one solution.
Solution: Assume, for the sake of contradiction, that A\mathbf{x}=\mathbf{b} has two different solutions, say \mathbf{x_1} and \mathbf{x_2}. Then A\mathbf{x_1}=\mathbf{b} and A\mathbf{x_2}=\mathbf{b} imply, by linearity, that A\mathbf{x_0}=A\mathbf{x_1}-A\mathbf{x_2}=\mathbf{b}-\mathbf{b}=\mathbf{0}, where \mathbf{x_0} = \mathbf{x_1} - \mathbf{x_2} \neq \mathbf{0}. Since A has n pivot columns, every column of A is a pivot column, so the equation A\mathbf{x}=\mathbf{0} has no free variables and therefore only the trivial solution \mathbf{x}=\mathbf{0}. However \mathbf{x_0} is a nonzero vector satisfying A\mathbf{x_0}=\mathbf{0}, a contradiction. Hence A\mathbf{x}=\mathbf{b} has at most one solution for each \mathbf{b} in \mathbb{R}^m.
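A small illustration (an example chosen here, not part of the problem): take m=3, n=2 and A=\begin{bmatrix}1 & 0\\0 & 1\\0 & 0\end{bmatrix}, which has 2 pivot columns. For \mathbf{b}=\begin{bmatrix}1\\2\\0\end{bmatrix} the equation A\mathbf{x}=\mathbf{b} has exactly one solution, \mathbf{x}=\begin{bmatrix}1\\2\end{bmatrix}, while for \mathbf{b}=\begin{bmatrix}1\\2\\3\end{bmatrix} it has no solution; in both cases there is at most one solution, as the argument predicts.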