# Vector Spaces and The Four Fundamental Subspaces
## The Problem of Solving Linear Equations
We want to solve a system of $m$ linear equations in $n$ unknowns, written as $Ax = b$. In the "row picture," each of these $m$ equations defines a hyperplane in $n$-dimensional space. The goal is to find a solution $x$, which is a single point of intersection that lies on all of these hyperplanes.
This geometric view presents three possibilities:
- One Solution: The hyperplanes intersect at a single point.
- No Solution: The hyperplanes have no common intersection point (e.g., two planes are parallel).
- Infinite Solutions: The hyperplanes intersect on a larger set, such as a line or a plane (e.g., three planes intersect on a common line).
The homogeneous case is a related problem where $b = 0$. Since all hyperplanes must pass through the origin, $x = 0$ (the "trivial solution") is always one answer. The fundamental question becomes: do the hyperplanes intersect only at the origin, or do they also intersect along a larger set (like a line or plane) that passes through the origin?
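The three possibilities are easy to reproduce numerically. Here is a minimal sketch, assuming SymPy as the tool (an illustrative choice, not something these notes prescribe), with one tiny $2 \times 2$ system per case:

```python
# Three 2x2 systems, one per geometric outcome (illustrative SymPy sketch).
from sympy import Matrix, linsolve, symbols

x, y = symbols("x y")

# One solution: the lines x + y = 2 and x - y = 0 cross at a single point.
print(linsolve((Matrix([[1, 1], [1, -1]]), Matrix([2, 0])), x, y))  # {(1, 1)}

# No solution: x + y = 2 and x + y = 3 are parallel lines.
print(linsolve((Matrix([[1, 1], [1, 1]]), Matrix([2, 3])), x, y))   # EmptySet

# Infinitely many solutions: 2x + 2y = 4 is the same line as x + y = 2.
print(linsolve((Matrix([[1, 1], [2, 2]]), Matrix([2, 4])), x, y))   # {(2 - y, y)}
```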
## Basis, Dimension, and Rank
### Basis
- Definition: A set of vectors $\{v_1, v_2, \dots, v_k\}$ in a vector space $V$ that is both linearly independent and spans the space.
  - Linearly Independent: The only combination $c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0$ is when all coefficients $c_i$ are zero. This means there is no redundancy in the set.
  - Spans the Space: Every vector $v$ in the space can be expressed as a linear combination $v = c_1 v_1 + c_2 v_2 + \cdots + c_k v_k$.
- Intuition: A basis is the smallest possible set of "building blocks" or "coordinate axes" for a vector space.
- It has just enough vectors:
  - Not too few (or it couldn't span the whole space).
  - Not too many (or the vectors would be dependent, and some would be redundant).
- Example: The standard basis for $\mathbb{R}^2$ (the x-y plane) is $\{(1, 0), (0, 1)\}$.
- Example: A different basis for $\mathbb{R}^2$ is $\{(1, 1), (1, -1)\}$. A vector space can have infinitely many different bases; a quick check of both sets appears in the sketch below.
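Both basis properties can be checked in code. This is a sketch assuming SymPy (the basis $\{(1,1),(1,-1)\}$ is the example above): independence via rank, spanning via solving for the coefficients.

```python
# Checking that two sets of column vectors are bases of R^2 (SymPy sketch).
from sympy import Matrix

standard = Matrix([[1, 0], [0, 1]])      # columns (1, 0) and (0, 1)
alternative = Matrix([[1, 1], [1, -1]])  # columns (1, 1) and (1, -1)

# Linearly independent + spanning <=> full rank (2 vectors, rank 2).
print(standard.rank(), alternative.rank())  # 2 2

# "Spans the space": any vector, e.g. (3, 1), is a combination of the basis.
coeffs = alternative.solve(Matrix([3, 1]))  # c1*(1,1) + c2*(1,-1) = (3,1)
print(coeffs)                               # Matrix([[2], [1]])
```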
### Dimension
- Definition: The number of vectors in any basis for a vector space $V$.
- Key Theorem: All bases for a specific vector space have the exact same number of vectors. This fixed number is the dimension.
- Intuition: The dimension is the "number of independent directions" or "degrees of freedom" of a space.
- A line has dimension 1.
- A plane has dimension 2.
- $\mathbb{R}^n$ has dimension $n$.
- If the nullspace of a matrix is a line (all multiples of a single vector $v$), its basis is $\{v\}$ and its dimension is 1.
### Rank
- Definition: The dimension of the column space $C(A)$ of a matrix $A$.
- Key Theorem: The dimension of the column space is equal to the dimension of the row space. This common number is the rank $r$.
- Intuition: The rank is the "true dimension" of a matrix. It counts the number of "essential" or "independent" columns (or rows).
- A matrix like $\begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$ looks 2D, but its columns lie on the same line. Its column space has dimension 1, so its rank is 1.
- How to Find Rank: The rank is the number of pivots found by Gaussian elimination in the echelon form $U$ or the reduced form $R$.
  - Example: $\begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$ reduces to $U = \begin{bmatrix} 1 & 2 \\ 0 & 0 \end{bmatrix}$.
  - It has only one pivot (the 1).
  - Therefore, the rank is 1 (confirmed in the sketch below).
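The pivot count is easy to confirm in code; a minimal sketch, again assuming SymPy:

```python
# Rank = number of pivots in the echelon form (illustrative SymPy sketch).
from sympy import Matrix

A = Matrix([[1, 2], [2, 4]])

R, pivot_cols = A.rref()  # reduced row echelon form + pivot column indices
print(R)                  # Matrix([[1, 2], [0, 0]])
print(pivot_cols)         # (0,)  -> exactly one pivot
print(A.rank())           # 1
```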
### What is the difference between Dimension and Rank?
- $\operatorname{rank}(A) = \dim C(A)$ (the dimension of the column space)
- $\operatorname{rank}(A) = \dim C(A^T)$ (the dimension of the row space)
| Feature | Dimension | Rank |
|---|---|---|
| What it describes | A vector space (or subspace). | A matrix. |
| What it measures | The number of vectors in any basis for the space. | The dimension of the column space (or row space) of the matrix. |
| How to find it | Find a basis for the space and count the vectors. | Find the number of pivots in the matrix's echelon form. |
| Example | The plane $x + 2y + 3z = 0$ is a subspace of $\mathbb{R}^3$ with dimension 2. (A basis is $\{(-2, 1, 0), (-3, 0, 1)\}$.) | The matrix $\begin{bmatrix} 1 & 2 & 3 \end{bmatrix}$ (which has that plane as its nullspace) has rank 1. |
## The Four Fundamental Subspaces
To answer these questions, we analyze the matrix $A$. The key is to find its rank $r$, which is the number of pivots in its echelon form. For our examples, we will use this $3 \times 3$ matrix with rank $r = 2$, together with the echelon form $U$ that elimination produces:

$$A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 9 \\ 5 & 7 & 12 \end{bmatrix}, \qquad U = \begin{bmatrix} 1 & 2 & 3 \\ 0 & -3 & -3 \\ 0 & 0 & 0 \end{bmatrix}$$
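A quick verification of this matrix's rank and pivots, assuming SymPy (illustrative sketch):

```python
# The running example, verified with SymPy (an illustrative sketch).
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 9],
            [5, 7, 12]])

R, pivot_cols = A.rref()
print(A.rank())    # 2
print(pivot_cols)  # (0, 1)  -> pivots in columns 1 and 2
print(R)           # Matrix([[1, 0, 1], [0, 1, 1], [0, 0, 0]])
```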
### The Column Space $C(A)$
#### Definition
The subspace of $\mathbb{R}^m$ consisting of all linear combinations of the columns of $A$.
#### Intuition and Properties
- $C(A)$ is the span of the columns. This is the set of all vectors $b$ that can be constructed from linear combinations of the columns of $A$.
- The question "Does $Ax = b$ have a solution?" is identical to the question "Is $b$ in the column space of $A$?" In other words, can $b$ be built from the columns?
Example: For our matrix $A$, the columns are $(1, 4, 5)$, $(2, 5, 7)$, and $(3, 9, 12)$. We can see that $\text{col}_1 + \text{col}_2 = \text{col}_3$, so the third column is dependent and adds no new vectors to the span. Therefore, $C(A)$ is not all of $\mathbb{R}^3$; it is a 2-dimensional plane inside $\mathbb{R}^3$.
#### Formal Math
- $b \in C(A)$ if and only if $b = Ax$ for some $x$.
- The dimension of $C(A)$ is the rank $r$. For our example, $r = 2$.
- A basis for $C(A)$ is formed by the pivot columns of the original matrix $A$. In our example, the pivots are in columns 1 and 2, so one basis for $C(A)$ is $\{(1, 4, 5), (2, 5, 7)\}$ (checked in the sketch after this list).
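A minimal SymPy check of this basis (illustrative; `columnspace()` returns the pivot columns of the original matrix):

```python
# Basis of C(A) = pivot columns of the original A (SymPy sketch).
from sympy import Matrix

A = Matrix([[1, 2, 3], [4, 5, 9], [5, 7, 12]])

print(A.columnspace())  # [Matrix([1, 4, 5]), Matrix([2, 5, 7])]

# The dependency col1 + col2 = col3, seen directly:
print(A.col(0) + A.col(1) == A.col(2))  # True
```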
### The Nullspace $N(A)$
#### Definition
The subspace of $\mathbb{R}^n$ consisting of all solutions $x$ to the homogeneous equation $Ax = 0$.
#### Intuition and Properties
- The nullspace is the set of all "recipes" $x$ that combine the columns of $A$ to produce the zero vector. Hence, it describes the dependencies among the columns (i.e., the redundancy in the matrix).
- This space represents the ambiguity in the solutions to $Ax = b$. If $x_p$ is one solution, the complete solution is $x = x_p + x_n$, where $x_n$ is any vector in the nullspace. Therefore, if $N(A) = \{0\}$, solutions (if they exist) are unique.
- In short, if the nullspace is just the zero vector, the columns are independent; if the nullspace contains non-zero vectors, they are dependent.
Example #1: Consider this matrix $B$, which has independent columns:

$$B = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$$

The "recipe" to combine these columns to get the zero vector is the equation $Bx = 0$:

$$c_1 \begin{bmatrix} 1 \\ 3 \end{bmatrix} + c_2 \begin{bmatrix} 2 \\ 4 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$

The only recipe that makes this happen is $c_1 = 0$ and $c_2 = 0$, the trivial combination; the zero vector is in every nullspace.
- Conclusion: The only vector in the nullspace is the zero vector $(0, 0)$.
- Meaning: There is no dependency between the columns. You can't use one column to "cancel out" the other. The columns are independent.
Example #2: Now consider this matrix $C$, which has dependent columns:

$$C = \begin{bmatrix} 1 & 2 \\ 3 & 6 \end{bmatrix}$$

Column 2 is just 2 times Column 1. This is a redundancy. The columns are linearly dependent.
There exists a non-zero recipe that makes $Cx = 0$ work:

$$-2 \begin{bmatrix} 1 \\ 3 \end{bmatrix} + 1 \begin{bmatrix} 2 \\ 6 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$

So, the vector $x = (-2, 1)$ is a non-zero solution.
- Conclusion: The vector $(-2, 1)$ (and all its multiples) is in the nullspace.
- Meaning: The nullspace contains the vector $(-2, 1)$, which is the exact recipe that describes how the columns are dependent. It literally tells you that $\text{col}_2 = 2 \cdot \text{col}_1$.
Example #3: For our $A$ above, we solve $Ax = 0$. The dependency $\text{col}_1 + \text{col}_2 = \text{col}_3$ can be rewritten as $\text{col}_1 + \text{col}_2 - \text{col}_3 = 0$, or $A \,(1, 1, -1)^T = 0$. This shows the vector $(1, 1, -1)$ is in the nullspace. Since the dimension is 1 (see below), this single vector is a basis. The nullspace is a line in $\mathbb{R}^3$.
#### Formal Math
- $x \in N(A)$ if and only if $Ax = 0$.
- The dimension is $n - r$ (number of columns minus rank). This is the number of free variables.
- In our example, $n = 3$ and $r = 2$, so the dimension is $3 - 2 = 1$. This matches our intuition that the nullspace is a line.
- A basis is formed by the "special solutions". We find these from the reduced form $R$:

$$R = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{bmatrix}$$

  The free variable is $x_3$. Set $x_3 = 1$; back-substitution gives $x_1 = -1$ and $x_2 = -1$. The basis for $N(A)$ is the single special solution $s = (-1, -1, 1)$. (Note: this is just a different multiple of the vector we found by inspection.) The sketch below repeats this computation.
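The same special solution can be computed directly, again assuming SymPy (`nullspace()` returns one basis vector per free variable):

```python
# Special solutions = a basis for N(A) (illustrative SymPy sketch).
from sympy import Matrix

A = Matrix([[1, 2, 3], [4, 5, 9], [5, 7, 12]])

ns = A.nullspace()        # one special solution per free variable
print(ns)                 # [Matrix([-1, -1, 1])]
print(A * ns[0])          # Matrix([0, 0, 0])  -- it really solves Ax = 0
print(A.cols - A.rank())  # 1 = n - r, the dimension of N(A)
```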
### The Row Space $C(A^T)$
#### Definition
The subspace of $\mathbb{R}^n$ consisting of all linear combinations of the rows of $A$.
#### Intuition and Properties
- This space represents the "effective input" of $A$. Any input vector $x$ can be split into a row-space part ($x_r$) and a nullspace part ($x_n$). The matrix only "sees" the row-space part: $Ax = A(x_r + x_n) = Ax_r + 0 = Ax_r$.
- Gaussian elimination does not change the row space. The row space of $A$ is the same as the row space of its echelon form $U$ or $R$.
- The dimension of $C(A^T)$ is also the rank $r$. This is the "row rank = column rank" theorem.
Example: For our $A$, the row space is a plane in $\mathbb{R}^3$ (since $r = 2$).
#### Formal Math
- A basis for $C(A^T)$ is formed by the non-zero rows of the echelon form $U$ or $R$.
- From our example, the basis for $C(A^T)$ is the set of non-zero rows of $U$: $\{(1, 2, 3), (0, -3, -3)\}$.
- The row space and the nullspace are orthogonal complements. Every vector in one is perpendicular to every vector in the other.
  - Check: Take the basis vectors from $C(A^T)$ and $N(A)$.
    - $(1, 2, 3) \cdot (-1, -1, 1) = -1 - 2 + 3 = 0$.
    - $(0, -3, -3) \cdot (-1, -1, 1) = 0 + 3 - 3 = 0$.
- Their dimensions add up to the full dimension of the input space: $r + (n - r) = 2 + 1 = 3 = n$.
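A short sketch (SymPy, illustrative) confirming both facts: the orthogonality of the two bases, and that adding a nullspace component to the input leaves $Ax$ unchanged:

```python
# Row space is perpendicular to nullspace; A only "sees" the row-space part.
from sympy import Matrix

A = Matrix([[1, 2, 3], [4, 5, 9], [5, 7, 12]])

row_basis = A.T.columnspace()   # a basis for the row space C(A^T)
null_basis = A.nullspace()      # a basis for N(A)

for r_vec in row_basis:
    for n_vec in null_basis:
        print(r_vec.dot(n_vec))          # 0 every time

# Adding a nullspace component to x does not change Ax.
x = Matrix([2, 0, 1])
print(A * x == A * (x + null_basis[0]))  # True
```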
### The Left Nullspace $N(A^T)$
#### Definition
The subspace of $\mathbb{R}^m$ consisting of all solutions $y$ to the equation $A^T y = 0$.
#### Intuition and Properties
- It is called the "left" nullspace because $A^T y = 0$ is the same as $y^T A = 0^T$. Here $y$ is a vector of coefficients that combines the rows of $A$ to produce the zero row.
- This space gives the solvability condition for $Ax = b$. If $y^T A = 0$ (a combination of the rows is zero), then $y^T b = y^T A x$ must also be zero for a solution to exist.
Example: For our $A$, we saw from elimination that $\text{row}_3 - \text{row}_1 - \text{row}_2 = 0$. The combination vector is $y = (-1, -1, 1)$. This vector is a basis for $N(A^T)$. It is a line in $\mathbb{R}^3$.
#### Formal Math
- $y \in N(A^T)$ if and only if $A^T y = 0$.
- The dimension is $m - r$ (number of rows minus rank). This is the number of zero rows in the echelon form. (Example: $m - r = 3 - 2 = 1$, matching the one zero row of $U$.)
- The left nullspace and the column space are orthogonal complements.
- Check: Take the basis vectors.
  - $C(A)$ basis: $(1, 4, 5)$ and $(2, 5, 7)$.
  - $N(A^T)$ basis: $(-1, -1, 1)$.
  - $(1, 4, 5) \cdot (-1, -1, 1) = -1 - 4 + 5 = 0$.
  - $(2, 5, 7) \cdot (-1, -1, 1) = -2 - 5 + 7 = 0$.
- Their dimensions add up to the full dimension of the output space: $r + (m - r) = 2 + 1 = 3 = m$.
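A minimal SymPy sketch of the left nullspace (illustrative): the nullspace of $A^T$ gives $y$, and the dot products against the column-space basis come out zero:

```python
# The left nullspace N(A^T), and its orthogonality to C(A) (SymPy sketch).
from sympy import Matrix

A = Matrix([[1, 2, 3], [4, 5, 9], [5, 7, 12]])

y = A.T.nullspace()[0]   # Matrix([-1, -1, 1])
print(y.T * A)           # Matrix([[0, 0, 0]]) -- y combines the rows to zero

for c_vec in A.columnspace():
    print(y.dot(c_vec))  # 0, 0  -- N(A^T) is perpendicular to C(A)
```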
## Integrating Concepts: The Complete Solution
The four subspaces provide a complete and rigorous answer to the problems from the first section.
### When does $Ax = b$ have a solution? (Existence)
- The Problem: Do the hyperplanes intersect?
- The Subspace Answer: A solution exists if and only if $b$ is in the Column Space $C(A)$.
- The Practical Test: We know $C(A)$ is orthogonal to $N(A^T)$. This gives a perfect test for solvability: $Ax = b$ has a solution if and only if $b$ is orthogonal to the left nullspace ($y^T b = 0$ for all $y$ that satisfy $A^T y = 0$).
- Example: For our $A$, the left nullspace is spanned by $y = (-1, -1, 1)$. Both checks below are repeated in the sketch after this list.
  - Does $Ax = (1, 4, 5)$ have a solution? Yes, because $y \cdot (1, 4, 5) = -1 - 4 + 5 = 0$.
  - Does $Ax = (1, 4, 6)$ have a solution? No, because $y \cdot (1, 4, 6) = -1 - 4 + 6 = 1 \neq 0$.
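Both checks, repeated in code (SymPy sketch, illustrative); `linsolve` also shows the full solution set directly:

```python
# Solvability test: Ax = b is consistent iff y . b = 0 (illustrative sketch).
from sympy import Matrix, linsolve, symbols

A = Matrix([[1, 2, 3], [4, 5, 9], [5, 7, 12]])
y = Matrix([-1, -1, 1])             # basis vector of the left nullspace

b_good, b_bad = Matrix([1, 4, 5]), Matrix([1, 4, 6])
print(y.dot(b_good), y.dot(b_bad))  # 0 1  -> first solvable, second not

x1, x2, x3 = symbols("x1 x2 x3")
print(linsolve((A, b_good), x1, x2, x3))  # {(1 - x3, -x3, x3)} -- a line
print(linsolve((A, b_bad), x1, x2, x3))   # EmptySet
```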
### How many solutions are there? (Uniqueness)
- The Problem: Is the intersection a single point, a line, or a plane?
- The Subspace Answer: This is controlled entirely by the Nullspace $N(A)$.
- The Complete Answer: The full solution set is $x = x_p + x_n$, where $x_p$ is one particular solution and $x_n$ is any vector from the nullspace.
- Unique Solution: This happens if $r = n$. The nullspace is just $\{0\}$. The only solution is $x = x_p$.
- Infinite Solutions: This happens if $r < n$. The solution set is a line or plane that is parallel to the nullspace, just "shifted" away from the origin by $x_p$.
  - Example: For our $A$ (with $r = 2 < n = 3$) and $b = (1, 4, 5)$, one particular solution is $x_p = (1, 0, 0)$. The complete solution is the line $x = (1, 0, 0) + c \,(-1, -1, 1)$, verified in the sketch after this list.
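A final symbolic check (SymPy, illustrative) that every point on this line really solves $Ax = b$:

```python
# Complete solution x = x_p + c * x_n, verified symbolically (sketch).
from sympy import Matrix, symbols

A = Matrix([[1, 2, 3], [4, 5, 9], [5, 7, 12]])
b = Matrix([1, 4, 5])

x_p = Matrix([1, 0, 0])     # particular solution: b is exactly column 1 of A
x_n = A.nullspace()[0]      # Matrix([-1, -1, 1])

c = symbols("c")
print(A * (x_p + c * x_n))  # Matrix([1, 4, 5]) -- equals b for every c
```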
## Summary

Figure: Visual representation of the four fundamental subspaces and their relationships. [1]
## References
1. MIT OpenCourseWare, 18.06SC Linear Algebra: The Four Fundamental Subspaces.
2. Strang, Gilbert. *Introduction to Linear Algebra* (Chapter 2).