1. Linear Dependence and Independence
01 August 2019
Prerequisite Knowledge: A basic understanding of vectors
Introduction
The concept of linear dependence/independence is one that, in my opinion, isn't explored particularly well at VCE level. Students encountering it for the first time may wonder why they are bothering to learn something so abstract and seemingly random. Fear not: those who pursue further studies in mathematics will meet the world of linear algebra, where this concept becomes very important. In this guide, I'll hopefully clear up some misconceptions and muddy points about linear dependence/independence!
Definitions
All three of the following definitions regarding linear dependence/independence are equivalent.
Definition 1: A set of vectors \(\{\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_n\}\) is said to be linearly dependent if there exist \(k_1,\dots,k_n\in\mathbb{R}\), not all zero, such that \(k_1\mathbf{v}_1+\dots+k_n\mathbf{v}_n=\mathbf{0}\).
Definition 2: A set of vectors is said to be linearly dependent if at least one of its members can be expressed as a linear combination of the remaining members.
Definition 3: A set of vectors \(\{\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_n\}\) is said to be linearly independent if \(k_1\mathbf{v}_1+\dots+k_n\mathbf{v}_n=\mathbf{0}\implies k_1=\dots=k_n=0\).
Essentially, to determine whether a set of vectors is linearly dependent or independent, you should form the linear system given by \[k_1\mathbf{v}_1+\dots+k_n\mathbf{v}_n=\mathbf{0}\] and aim to solve for \(k_1,\dots,k_n\). Using our definitions from above, if you find that the only solution is \(k_1=\dots=k_n=0\) (note that this will always be a solution), then the set is linearly independent. If you find there are alternative solutions for which not all \(k_i\) are \(0\), then the set is linearly dependent.
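To illustrate the procedure on a small (made-up) example: consider \(\mathbf{a}=\mathbf{i}+2\mathbf{j}\) and \(\mathbf{b}=2\mathbf{i}+4\mathbf{j}\). Setting \(k_1\mathbf{a}+k_2\mathbf{b}=\mathbf{0}\) and comparing components gives \[\begin{cases} k_1+2k_2=0\\ 2k_1+4k_2=0\end{cases}\] The second equation is just twice the first, so any pair with \(k_1=-2k_2\) (for instance \(k_1=-2,\ k_2=1\)) is a solution. Since these are not all zero, the set \(\{\mathbf{a},\mathbf{b}\}\) is linearly dependent.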
A Few Useful Theorems
Theorem 1: Two vectors are linearly dependent if and only if they are proportional.
(You should be able to prove this yourself; one possible proof is given below.)
\((\implies)\) Suppose that \(\mathbf{a}\) and \(\mathbf{b}\) are linearly dependent. Since there are only two vectors, by Definition 2, one of them, say \(\mathbf{a}\), can be written as a linear 'combination' of the remaining vector \(\mathbf{b}\). That is, for some \(k\in\mathbb{R}\), we have \(\mathbf{a}=k\mathbf{b}\), and so \(\mathbf{a}\) and \(\mathbf{b}\) are proportional.
\((\impliedby)\) Suppose that \(\mathbf{a}\) and \(\mathbf{b}\) are proportional. That is, for some \(k\in\mathbb{R}\), we have \(\mathbf{a}=k\mathbf{b}\). Then \(1\mathbf{a}-k\mathbf{b}=\mathbf{0}\), where the coefficient of \(\mathbf{a}\) is non-zero, and so by Definition 1, \(\mathbf{a}\) and \(\mathbf{b}\) are linearly dependent. \(\Box\)
Theorem 2: A set of \(n+1\) vectors in \(\mathbb{R}^n\) is necessarily linearly dependent.
(You do not need to be able to prove this).
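As a quick illustration of Theorem 2 (using an example of my own choosing): in \(\mathbb{R}^2\), the three vectors \(\mathbf{i}\), \(\mathbf{j}\) and \(\mathbf{i}+\mathbf{j}\) satisfy \[\mathbf{i}+\mathbf{j}-(\mathbf{i}+\mathbf{j})=\mathbf{0},\] so they are linearly dependent, and the theorem guarantees that no choice of three vectors in \(\mathbb{R}^2\) can avoid this.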
Some Common Misconceptions
Misconception 1: If the vectors \(\mathbf{a},\mathbf{b},\mathbf{c}\in\mathbb{R}^3\) are linearly dependent then \(\mathbf{c}=k_1\mathbf{a}+k_2\mathbf{b}\) for some \(k_1,k_2\in\mathbb{R}\).
Misconception 2: The vectors \(\mathbf{a},\mathbf{b},\mathbf{c}\in\mathbb{R}^3\) are linearly independent if \(\mathbf{c}\neq k_1\mathbf{a}+k_2\mathbf{b}\) for all \(k_1,k_2\in\mathbb{R}\).
These are both incorrect. Definition 2 states that a set of vectors is linearly dependent if at least one of its members can be expressed as a linear combination of the remaining members. That is, \(\mathbf{c}\) need not be that vector. Consider the vectors \[\mathbf{a}=2\mathbf{i}-\mathbf{j}+\mathbf{k},\ \ \mathbf{b}=-4\mathbf{i}+2\mathbf{j}-2\mathbf{k},\ \ \mathbf{c}=3\mathbf{i}+2\mathbf{j}+\mathbf{k}.\] Since \(\mathbf{b}=-2\mathbf{a}\), every linear combination \(m\mathbf{a}+n\mathbf{b}=(m-2n)\mathbf{a}\) is a scalar multiple of \(\mathbf{a}\), and \(\mathbf{c}\) clearly is not one, so \(\mathbf{c}\neq m\mathbf{a}+n\mathbf{b}\ \ \forall\,m,n\in\mathbb{R}\). Yet it's obvious that \(\mathbf{a},\mathbf{b},\mathbf{c}\) are linearly dependent, since \[2\mathbf{a}+\mathbf{b}+0\mathbf{c}=\mathbf{0}.\] Here, \(\mathbf{a}\) can be written as a linear combination of \(\mathbf{b}\) (and vice versa).
The converses of the above statements are true.
Example (from TWM Publications Free Specialist Exam 2)
Question 14
The vectors \(\mathbf{a}=3\mathbf{i}-3\mathbf{j}+\alpha\mathbf{k}\), \(\mathbf{b}=\mathbf{i}+\mathbf{j}-3\mathbf{k}\) and \(\mathbf{c}=2\mathbf{i}-\mathbf{j}+4\mathbf{k}\) are linearly independent if
A. \(\alpha =1\)
B. \(\alpha =11\)
C. \(\alpha \in \mathbb{R}\setminus \{1\}\)
D. \(\alpha \in \mathbb{R}\setminus \{11\}\)
E. \(\alpha \in \mathbb{R}\)
Solution
Since \(\mathbf{b}\) and \(\mathbf{c}\) are not proportional, they are linearly independent on their own, so the set can only be linearly dependent if \(\mathbf{a}\) is a linear combination of the other two. We therefore consider \(\mathbf{a}=k_1\mathbf{b}+k_2\mathbf{c}\) and look for the value of \(\alpha\) giving linear dependence. Comparing vector components, we have \[\begin{cases} 3=k_1+2k_2\\ -3=k_1-k_2\\ \alpha=-3k_1+4k_2\end{cases}\] Solving the linear system gives \(k_1=-1,\ k_2=2\) and hence \(\alpha=11\), for which \(\mathbf{a},\mathbf{b},\mathbf{c}\) are linearly dependent. Thus, for linear independence, we require \(\alpha\neq 11\).
The answer is D.
A Useful Test for Linear Independence/Dependence (for Exam 2)
Let \(\mathbf{v}_1,\dots,\mathbf{v}_n\in\mathbb{R}^n\) and construct the matrix \[A=\begin{bmatrix}\mid & & \mid \\ \mathbf{v}_1 & \dots & \mathbf{v}_n\\ \mid & & \mid \end{bmatrix}\] obtained by placing the vectors as columns.
If \(\det(A)\neq 0\), then \(\mathbf{v}_1,\dots,\mathbf{v}_n\) are linearly independent. If \(\det(A)=0\), then \(\mathbf{v}_1,\dots,\mathbf{v}_n\) are linearly dependent.
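Away from a CAS, this determinant test can also be sketched in a few lines of Python. The helpers below (`det3` and `columns_matrix` are just names I've made up for this sketch) check the vectors from Question 14 for two values of \(\alpha\):

```python
def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows (cofactor expansion along the first row)."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def columns_matrix(*vectors):
    """Build the matrix whose columns are the given vectors."""
    return [list(row) for row in zip(*vectors)]

# b and c from Question 14; a depends on alpha.
b = (1, 1, -3)
c = (2, -1, 4)

for alpha in (1, 11):
    a = (3, -3, alpha)
    d = det3(columns_matrix(a, b, c))
    status = "dependent" if d == 0 else "independent"
    print(f"alpha = {alpha}: det = {d}, linearly {status}")
# alpha = 1 gives a non-zero determinant (independent);
# alpha = 11 gives determinant 0 (dependent), matching the solution above.
```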
Conclusion
So, here's the first miniature guide for Specialist Maths! I totally underestimated the workload I had in semester 1, and so I apologise for not contributing anything to the thread. It's the start of semester 2 now, and since I did a summer subject earlier this year, I get to underload, which means I should\(^1\) have a bit more time on my hands.
\(^1\) We'll see of course...
To request a topic, reply to this thread or send me a message!