Linear Algebra 3: Vector Equations

Vector Equations and Spans

tenzin migmar (t9nz)
Oct 14, 2023 · 9 min read


Preface

Welcome back to the third essay of my ongoing series on the basics of Linear Algebra, the foundational math behind machine learning. In my previous article, I walked through the echelon forms of a matrix. This article takes a look at vectors, spans, and linear combinations, and connects these new ideas to what we’ve already learned. It will best serve readers when read alongside Linear Algebra and Its Applications by David C. Lay, Steven R. Lay, and Judi J. McDonald. Consider this series a companion resource.

Feel free to share thoughts, questions, and critique.

Vectors in ℝ², ℝ³, and ℝⁿ

So far, we’ve learned about matrices, which are rectangular arrays of numbers. What if we had just a single column of numbers? Behold the vector: a special kind of matrix of size m x 1, where m denotes the number of rows, or entries, in the vector. Recall that the notation for matrix size is m x n, where m is the number of rows and n the number of columns. A vector always has exactly one column, but may have any number of rows.

The set of all vectors with two entries is ℝ². ℝ denotes the entire set of real numbers, so it makes sense that ℝ² is the two-dimensional space of all possible points (x, y) with real coordinates.

Vectors can live in ℝ², ℝ³, ℝ⁴, … ℝⁿ; note that the dimension of the vector space corresponds to the number of entries in the vector.

You will eventually encounter the peculiar zero vector (written simply as 0), a vector whose entries are all zero. While it may seem like a minor detail, we’ll later find that it has important implications for some of the most important ideas in linear algebra.
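
If you’d like to experiment alongside the text, here is a minimal sketch in Python with NumPy (my own addition, not from the textbook) showing vectors in ℝ² and ℝ³ stored as m x 1 column arrays, along with a zero vector:

    import numpy as np

    # A vector in R^2: a 2 x 1 column matrix.
    u = np.array([[3], [-1]])

    # A vector in R^3: a 3 x 1 column matrix.
    v = np.array([[1], [4], [2]])

    # The zero vector in R^3: every entry is zero.
    zero = np.zeros((3, 1))

    print(u.shape)   # (2, 1): two rows (entries), one column
    print(v.shape)   # (3, 1): three rows (entries), one column
    print(zero)      # a 3 x 1 column of zeros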

Geometric Visualization

Up until now, matrices and vectors have been described, explained, and notated mathematically, while vectors in physics are described as quantities with both magnitude and direction. Both are equally correct; the graphical visualization of vectors in ℝ² below unites the two definitions.

It is important to bear in mind that vectors in ℝ² are ordered pairs and vectors in higher-dimensional vector spaces are ordered tuples (lists of numbers with a defined order). Two vectors may have exactly the same numbers as their entries, but if the order of their entries differs, the vectors are not the same, as seen in the diagram above.

Vectors in ℝ³ can also be visualized; we simply add a third axis for the additional entry. Beyond ℝ³, graphing vectors becomes far more convoluted, as it is difficult to picture higher-dimensional spaces.

Algebraic Properties of Vectors

For all vectors u, v, w in any given vector space and scalars c and d, the following algebraic properties¹ hold:

(i) commutative*: u + v = v + u

(ii) associative*: (u + v) + w = u + (v + w)

(iii) additive identity: u + 0 = 0 + u = u

(iv) additive inverse: u + (-u) = -u + u = 0

(v) distributive with vectors: c(u + v) = cu + cv

(vi) distributive with scalars: (c + d)u = cu + du

(vii) associative with scalars: c(du) = (cd)u

These properties are tied to the operations of vector addition and scalar multiplication.

To add two vectors, the corresponding entries are summed to produce the vector sum. This means that vector addition for two vectors of different sizes is undefined: in order to add two vectors, they must have the same number of entries! This condition arises directly from how vector addition is performed.

With scalar multiplication, for a given scalar c and vector u, the scalar multiple is the vector cu, in which every entry of u has been multiplied by the scalar c.
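
Here is a quick sketch of both operations in NumPy (again my own illustration, with made-up vectors u and v), including a numerical check of two of the properties listed above and of the size-mismatch rule:

    import numpy as np

    u = np.array([[1], [3]])
    v = np.array([[-2], [5]])
    c = 4

    # Vector addition: corresponding entries are summed.
    print(u + v)    # the vector (-1, 8)

    # Scalar multiplication: every entry of u is multiplied by c.
    print(c * u)    # the vector (4, 12)

    # Two of the algebraic properties, checked numerically.
    print(np.array_equal(u + v, v + u))                  # commutative: True
    print(np.array_equal(c * (u + v), c * u + c * v))    # distributive: True

    # Adding vectors with different numbers of entries is undefined;
    # NumPy raises an error if we try.
    w = np.array([[1], [2], [3]])
    # u + w  # ValueError: operands could not be broadcast together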

These two operations may be used together, and as you’ll find in the next section, they combine to form a concept central to Linear Algebra: linear combinations.

Linear Combinations

Suppose we have the vectors v₁, v₂, … vₚ in ℝⁿ and we’re given scalars (also known as weights) c₁, c₂, … cₚ, which may be any real numbers, including zero. The linear combination of these vectors with these weights is the vector defined by the sum of the scalar multiples, c₁v₁ + c₂v₂ + … + cₚvₚ.²
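
As a tiny concrete example (my own numbers, chosen purely for illustration): with v₁ = (1, 2), v₂ = (3, 1) and weights c₁ = 2, c₂ = -1, the linear combination c₁v₁ + c₂v₂ is the vector (-1, 3).

    import numpy as np

    v1 = np.array([[1], [2]])
    v2 = np.array([[3], [1]])
    c1, c2 = 2, -1

    # The linear combination c1*v1 + c2*v2.
    b = c1 * v1 + c2 * v2
    print(b)    # the vector (-1, 3)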

Previously, we explored the concept of existence in Linear Algebra: given a system’s augmented matrix, does there exist at least one solution? In other words, does the (reduced) row echelon form of the matrix contain an inconsistent row? If so, no solutions exist; if not, at least one solution does. This fundamental existence question is tied to many ideas in Linear Algebra, and linear combinations are no different.

We say that a vector b is a linear combination of a set of vectors v₁, v₂, … vₚ in ℝⁿ if there exists a set of weights c₁, c₂, … cₚ (a solution) such that c₁v₁ + c₂v₂ + … + cₚvₚ = b.

To determine if b is a linear combination, we can use the operations of vector addition and scalar multiplication to rearrange the equation c₁v₁ + c₂v₂ + … + cₚvₚ = b into a notation we have become very familiar with: an augmented matrix [v₁ v₂ … vₚ | b] that we can row reduce. This process of rearrangement also sheds some light on exactly why figuring out whether the vector b is a linear combination of a set of vectors is an existence problem.

The above explanation is meant to emphasize why the existence problem and matrix row reduction are connected to linear combinations, and it demonstrates the idea in a general sense. Let’s take a look at a more specific example.
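
Here is one way to carry out that check programmatically, a sketch of my own using SymPy’s rref() in place of hand row reduction, with illustrative vectors v₁, v₂ and target b: stack them into the augmented matrix [v₁ v₂ | b], row reduce, and ask whether the last column is a pivot column.

    from sympy import Matrix

    # Illustrative vectors v1, v2 in R^3 and a target vector b.
    v1 = Matrix([1, -2, -5])
    v2 = Matrix([2, 5, 6])
    b = Matrix([7, 4, -3])

    # Augmented matrix [v1 v2 | b].
    augmented = Matrix.hstack(v1, v2, b)

    rref_matrix, pivot_columns = augmented.rref()
    print(rref_matrix)    # the reduced row echelon form

    # The system is consistent (b is a linear combination of v1 and v2)
    # exactly when the last column is NOT a pivot column.
    is_combination = (augmented.cols - 1) not in pivot_columns
    print(is_combination)    # True: here b = 3*v1 + 2*v2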

In the above example, after row-reducing the augmented matrix to reduced row echelon form, we found that a solution does indeed exist!

However, let’s consider the case of an augmented matrix whose reduced row echelon form contains a row of the form [0, 0, …, 0 | d] with d ≠ 0. That would mean that the vector b cannot be written as a linear combination of the set of vectors. Put another way, the vector b is out of reach for our set of vectors, or (and this is a nice segue into the upcoming section) the vector b is not within the span of the set of vectors.
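
To make that failing case concrete, here is a small self-contained check of my own using NumPy’s matrix rank: if appending b as an extra column increases the rank, the corresponding system is inconsistent and b lies outside the span.

    import numpy as np

    # Two illustrative vectors in R^3; their span is a plane through the origin.
    v1 = np.array([[1.0], [0.0], [0.0]])
    v2 = np.array([[0.0], [1.0], [0.0]])

    # A vector that points out of that plane.
    b = np.array([[0.0], [0.0], [1.0]])

    A = np.hstack([v1, v2])
    augmented = np.hstack([A, b])

    # If rank([A | b]) > rank(A), the system c1*v1 + c2*v2 = b is inconsistent,
    # i.e. b is not a linear combination of v1 and v2.
    in_span = np.linalg.matrix_rank(augmented) == np.linalg.matrix_rank(A)
    print(in_span)    # False: b is not in Span{v1, v2}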

Span of a Set of Vectors

The set of all possible linear combinations of the vectors v₁, v₂, … vₚ in ℝⁿ is referred to as the subset of ℝⁿ spanned by v₁, v₂, … vₚ. The span of the vectors v₁, v₂, … vₚ is denoted Span{v₁, v₂, … vₚ} and is the set of vectors that can be written as c₁v₁ + c₂v₂ + … + cₚvₚ.³ Another way of thinking about it: the span contains all the vectors that can be written as a linear combination of v₁, v₂, … vₚ.

We can find the span of any set of vectors, however many it contains. Suppose we have a set consisting of a single vector, v₁. Span{v₁} is then the set of all scalar multiples of v₁, because the only operation that can be applied in this case is scalar multiplication (at least two vectors are needed to perform vector addition). Span{v₁} contains all of the vectors that can be reached by scaling v₁.
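
One quick way to test whether some vector b sits in Span{v₁} (again, a sketch of my own rather than anything from the book) is to check whether the matrix [v₁ | b] still has rank 1, i.e. whether b is just a scalar multiple of v₁:

    import numpy as np

    v1 = np.array([[2.0], [4.0]])

    b_on_line = np.array([[3.0], [6.0]])     # 1.5 * v1
    b_off_line = np.array([[3.0], [5.0]])    # not a scalar multiple of v1

    def in_span_of_v1(b):
        # b is in Span{v1} exactly when [v1 | b] has rank 1.
        return np.linalg.matrix_rank(np.hstack([v1, b])) == 1

    print(in_span_of_v1(b_on_line))     # True
    print(in_span_of_v1(b_off_line))    # False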

If we were to visualize the span, it would be a straight line that goes through v₁ and the origin, because with only one vector the linear combinations (scalar multiples) cannot change direction, only length. This point is further illustrated in the diagram below.

Now consider the span of two vectors in ℝ² that point in different directions: what are the possible linear combinations these two vectors could make? In other words, which vectors in ℝ² can be written as a linear combination of those two vectors?

For the above case, after further investigation, it appears that u and v span all of ℝ²! That means that any vector in ℝ² can be written as a linear combination of u and v. In a future article, we’ll explore the concept of linear independence which will be used to prove concretely that u and v span ℝ².
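
In the meantime, here is a numerical preview of that fact (my own sketch, using a rank check rather than the linear independence argument to come): two vectors span ℝ² exactly when the 2 x 2 matrix with u and v as its columns has rank 2.

    import numpy as np

    # Hypothetical u and v pointing in different directions in R^2.
    u = np.array([[1.0], [2.0]])
    v = np.array([[3.0], [1.0]])

    A = np.hstack([u, v])

    # Rank 2 means every b in R^2 can be written as c1*u + c2*v,
    # so Span{u, v} is all of R^2.
    print(np.linalg.matrix_rank(A) == 2)    # True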

Conclusion

Vectors, linear combinations, and spans bring us one step deeper into the lush field of Linear Algebra. These fundamental concepts help us understand the structure of vector spaces and the relationships between different sets of vectors. As we progress further, you’ll find these ideas continually resurfacing because they’re linked to other core concepts. I also hope that you’ll take some time to think about how everything we’ve learned so far (the existence of solutions, row echelon forms) is deeply connected to these new concepts.

Summary

In this chapter, we learned about:

  • Vectors in ℝ², ℝ³, and ℝⁿ: a vector is a special kind of matrix with a size of m x 1. A vector may have any number of entries but only one column. We also discovered the zero vector, a vector whose entries are all zero.
  • The geometric visualization of vectors: vectors can be graphically represented which helps in understanding where the ideas of magnitude and direction come from.
  • Algebraic properties of vectors: the following algebraic properties of vectors hold for all vectors and scalars; commutative, associative, additive identity, additive inverse, distributive with vectors, distributive with scalars, and associative with scalars.
  • Linear Combinations: a linear combination is the vector defined by the sum of the scalar multiples c₁v₁ + c₂v₂ + … + cₚvₚ. The weights c₁, c₂, … cₚ may be any scalars, including zero.
  • Vector spans: the span of vectors v₁, v₂, … vₚ is denoted Span{v₁, v₂, … vₚ} and is the set of vectors that can be written as c₁v₁ + c₂v₂ + … + cₚvₚ.

Notes

¹Algebraic properties of vectors referenced from https://cs.brown.edu/stc/summer/94GeoTrans/94GeoTrans_17.html

²Definition for linear combinations referenced from Linear Algebra and Its Applications 6th Edition by David C. Lay, Steven R. Lay, and Judi J. McDonald.

³Definition for span referenced from Linear Algebra and Its Applications 6th Edition by David C. Lay, Steven R. Lay, and Judi J. McDonald.

*All images created by the author unless otherwise noted.

*The associative property means that for the operations of addition and multiplication, numbers may be grouped together in any way, and the result will remain the same. For example, (5 + 2) + 3 = 5 + (2 + 3) = 10 and (5 x 2) x 3 = 5 x (2 x 3) = 30.

*Commutative means that for the operations of addition and multiplication, numbers may be added or multiplied in any order and the result will remain the same. For example, 5 + 2 = 2 + 5 = 7 and 5 x 2 = 2 x 5 = 10.
