it's just a definition. I guess it's good to distinguish it from other possible "combinations". I remember a lecturer once called something a "polynomial combination of x and y", or said that x and y are "polynomially independent" (probably slang just to add some humour and contrast with the linear case)
If in R^2, a set of 3 vectors will have one redundant vector, is that only true if the set is linearly dependent?
Having a "redundant vector" is actually slang for being linearly dependent, so trivially yes. The more interesting point is that in R^2 any set of three vectors is indeed linearly dependent.
If we have a set of three 2D vectors, will one always be redundant?
Yes by what I said before.
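To see why, you can stack the three 2D vectors as rows of a 3x2 matrix: its rank is at most 2, so the three rows can never be linearly independent. A quick numpy sketch (the specific vectors here are made up for illustration, not from the question):

```python
import numpy as np

# Three arbitrary vectors in R^2 (illustrative values; here c happens to equal a + b)
a = np.array([1.0, 2.0])
b = np.array([3.0, 1.0])
c = np.array([4.0, 3.0])

# Stack them as rows of a 3x2 matrix; the rank of a 3x2 matrix is at most 2,
# which is less than 3, so the three vectors must be linearly dependent.
M = np.vstack([a, b, c])
rank = np.linalg.matrix_rank(M)
print(rank)  # at most 2, so at least one vector is "redundant"
```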
If so, could we choose any two vectors and solve for the scalar factors (not sure if that's the right term — just the constants in front of them), rather than solving a 3-variable, 2-equation system and picking some number for one of the scalars?
I'm not sure what your point is. Maybe you are suggesting: if you want to represent some arbitrary vector v in R^2 as a linear combination of a, b, c, can you delete one and just represent v as a linear combination of the other two? Yes, you can, though you have to be careful which one you delete (for example, if a and b are parallel and c is not, then by deleting c you lose a "dimension", so there are some vectors you can't represent in terms of just a and b)