
Vector Spaces and The Four Fundamental Subspaces


The Problem of Solving Linear Equations

We want to solve a system of $m$ linear equations in $n$ unknowns, written as $Ax=b$. In the "row picture," each of these $m$ equations defines a hyperplane in $n$-dimensional space. The goal is to find a solution $x$: a point of intersection that lies on all $m$ of these hyperplanes.

This geometric view presents three possibilities:

  1. One Solution: The hyperplanes intersect at a single point.
  2. No Solution: The hyperplanes have no common intersection point (e.g., two planes are parallel).
  3. Infinite Solutions: The hyperplanes intersect on a larger set, such as a line or a plane (e.g., three planes intersect on a common line).

The homogeneous case $Ax=0$ is a related problem where $b=0$. Since all hyperplanes must pass through the origin, $x=0$ (the "trivial solution") is always one answer. The fundamental question becomes: Do the hyperplanes intersect only at the origin, or do they also intersect along a larger set (like a line or plane) that passes through the origin?

Basis, Dimension, and Rank

Basis

  • Definition: A set of vectors $\{v_1, \dots, v_k\}$ in a vector space $V$ that is both linearly independent and spans the space $V$.

    • Linearly Independent: The only combination $c_1v_1 + \dots + c_kv_k = 0$ is when all coefficients $c_i$ are zero. This means there is no redundancy in the set.
    • Spans the Space: Every vector $v$ in the space $V$ can be expressed as a linear combination $v = c_1v_1 + \dots + c_kv_k$.
  • Intuition: A basis is the smallest possible set of "building blocks" or "coordinate axes" for a vector space.

    • It has just enough vectors:

      • Not too few (or it couldn't span the whole space).
      • Not too many (or the vectors would be dependent, and some would be redundant).
    • Example: The standard basis for $\mathbb{R}^2$ (the x-y plane) is $\left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix} \right\}$.

    • Example: A different basis for $\mathbb{R}^2$ is $\left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ -1 \end{bmatrix} \right\}$. A vector space can have infinitely many different bases.
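Because a basis spans the space and is independent, every vector has exactly one coordinate vector in it. Here is a small numerical sketch (using NumPy; not part of the original notes) that finds a vector's coordinates in the second basis by solving a 2×2 system:

```python
import numpy as np

# Columns of B are the alternative basis vectors {[1,1], [1,-1]}
B = np.array([[1.0,  1.0],
              [1.0, -1.0]])

v = np.array([3.0, 1.0])      # a vector given in the standard basis

# The coordinates c of v in basis B satisfy B @ c = v
c = np.linalg.solve(B, v)

assert np.allclose(c, [2.0, 1.0])   # v = 2*[1,1] + 1*[1,-1]
assert np.allclose(B @ c, v)        # reconstruction confirms the coordinates
```

Because $B$ is square with independent columns, `solve` succeeds and the coordinates are unique; that uniqueness is exactly what makes the set a basis.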

Dimension

  • Definition: The number of vectors in any basis for a vector space $V$.
    • Key Theorem: All bases for a specific vector space have the exact same number of vectors. This fixed number is the dimension.
  • Intuition: The dimension is the "number of independent directions" or "degrees of freedom" of a space.
    • A line has dimension 1.
    • A plane has dimension 2.
    • $\mathbb{R}^n$ has dimension $n$.
    • The nullspace of $A = \begin{bmatrix} 1 & 1 \\ 2 & 2 \end{bmatrix}$ is the line of all multiples of $\begin{bmatrix} 1 \\ -1 \end{bmatrix}$; its basis is $\left\{ \begin{bmatrix} 1 \\ -1 \end{bmatrix} \right\}$ and its dimension is 1.

Rank

  • Definition: The dimension of the column space $C(A)$ of a matrix $A$.
    • Key Theorem: The dimension of the column space equals the dimension of the row space. This common number is the rank $r$.
  • Intuition: The rank is the "true dimension" of a matrix. It counts the number of "essential" or "independent" columns (or rows).
    • A matrix like $A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$ looks 2D, but its columns lie on the same line. Its column space has dimension 1, so its rank is 1.
  • How to Find Rank: The rank $r$ is the number of pivots found by Gaussian elimination in the echelon form $U$ or $R$.
    • Example: $A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \end{bmatrix}$ reduces to $U = \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 0 \end{bmatrix}$.
    • It has only one pivot (the 1).
    • Therefore, the rank is 1.
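As a quick sanity check (a NumPy sketch, not part of the original notes), the numerical rank of this matrix agrees with the pivot count from elimination:

```python
import numpy as np

# The example matrix: row 2 is twice row 1, so elimination leaves one pivot
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# matrix_rank counts the numerically nonzero singular values, which for a
# well-scaled matrix matches the number of pivots in the echelon form
r = np.linalg.matrix_rank(A)
assert r == 1
```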

What is the difference between Dimension and Rank?

  • $\text{rank}(A) = \dim(C(A))$ (the dimension of the column space)
  • $\text{rank}(A) = \dim(C(A^T))$ (the dimension of the row space)

| Feature | Dimension | Rank |
| --- | --- | --- |
| What it describes | A vector space (or subspace). | A matrix. |
| What it measures | The number of vectors in any basis for the space. | The dimension of the column space (or row space) of the matrix. |
| How to find it | Find a basis for the space and count the vectors. | Find the number of pivots in the matrix's echelon form. |
| Example | The plane $x+y+z=0$ is a subspace of $\mathbb{R}^3$ with dimension 2; a basis is $\left\{ \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} \right\}$. | The matrix $A = \begin{bmatrix} 1 & 1 & 1 \\ 2 & 2 & 2 \end{bmatrix}$, which has that plane as its nullspace, has rank 1. |

The Four Fundamental Subspaces

To answer these questions, we analyze the $m \times n$ matrix $A$. The key is to find its rank $r$, which is the number of pivots in its echelon form. For our examples, we will use this $3 \times 3$ matrix $A$ with rank $r=2$:

$$A = \begin{bmatrix} 1 & 1 & 2 \\ 2 & 3 & 5 \\ 3 & 4 & 7 \end{bmatrix} \xrightarrow{\text{Elimination}} U = \begin{bmatrix} 1 & 1 & 2 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{bmatrix} \xrightarrow{\text{Reduced Form}} R = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{bmatrix}$$

The Column Space $C(A)$

Definition

The subspace of $\mathbb{R}^m$ consisting of all linear combinations of the columns of $A$.

Intuition and Properties

  1. $C(A)$ is the span of the columns. This is the set of all vectors that can be constructed from linear combinations of the columns of $A$.
  2. The question "Does $Ax=b$ have a solution?" is identical to the question "Is $b$ in the column space of $A$?" In other words, can $b$ be built from the columns? Example: For our matrix $A$, the columns are $c_1 = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$, $c_2 = \begin{bmatrix} 1 \\ 3 \\ 4 \end{bmatrix}$, and $c_3 = \begin{bmatrix} 2 \\ 5 \\ 7 \end{bmatrix}$. We can see that $c_1 + c_2 = c_3$, so the third column is dependent and adds no new vectors to the span. Therefore, $C(A)$ is not all of $\mathbb{R}^3$; it is a 2-dimensional plane inside $\mathbb{R}^3$.

Formal Math

  1. $b \in C(A)$ if and only if $b = Ax$ for some $x$.
  2. The dimension of $C(A)$ is the rank $r$. For our example, $\dim(C(A)) = r = 2$.
  3. A basis for $C(A)$ is formed by the $r$ pivot columns of the original matrix $A$. In our example, the pivots are in columns 1 and 2. So, one basis for $C(A)$ is $\left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \begin{bmatrix} 1 \\ 3 \\ 4 \end{bmatrix} \right\}$.
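One numerical way to test "Is $b$ in $C(A)$?" (a NumPy sketch, not part of the original notes; the helper name `in_column_space` is just for illustration): appending $b$ as an extra column leaves the rank unchanged exactly when $b$ is already a combination of the columns of $A$:

```python
import numpy as np

A = np.array([[1.0, 1.0, 2.0],
              [2.0, 3.0, 5.0],
              [3.0, 4.0, 7.0]])

def in_column_space(A, b):
    # rank([A | b]) == rank(A) iff b adds no new direction,
    # i.e. b is a linear combination of A's columns
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

assert in_column_space(A, np.array([1.0, 2.0, 3.0]))       # b is column 1 of A
assert not in_column_space(A, np.array([1.0, 2.0, 4.0]))   # b leaves the plane C(A)
```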

The Nullspace $N(A)$

Definition

The subspace of $\mathbb{R}^n$ consisting of all solutions $x$ to the homogeneous equation $Ax=0$.

Intuition and Properties

  1. The nullspace is the set of all "recipes" $x$ that combine the columns of $A$ to produce the zero vector. Hence, it describes the dependencies among the columns (i.e., the redundancy in the matrix).
  2. This space represents the ambiguity in the solutions to $Ax=b$. If $x_p$ is one solution, the complete solution is $x_p + x_n$, where $x_n$ is any vector in the nullspace. Therefore, if $N(A) = \{0\}$, solutions (if they exist) are unique.
  3. In short, if the nullspace is just the zero vector, the columns are independent; if the nullspace contains nonzero vectors, they are dependent.

Example #1: Consider this matrix $A$, which has independent columns:

$$A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$$ The "recipe" to combine these columns to get the zero vector is the equation $Ax=0$: $$x_1 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + x_2 \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$ The only recipe $x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$ that makes this happen is $x_1 = 0$ and $x_2 = 0$: the trivial solution.

  • Conclusion: The only vector in the nullspace is $x=0$.
  • Meaning: There is no dependency between the columns. You can't use one column to "cancel out" the other. The columns are independent.

Example #2: Now consider this matrix $A$, which has dependent columns:

$$A = \begin{bmatrix} 1 & 2 \\ 3 & 6 \end{bmatrix}$$

Column 2 is just 2 times Column 1. This is a redundancy. The columns are linearly dependent.

There exists a non-zero recipe $x$ that makes $Ax=0$ work: $$-2 \begin{bmatrix} 1 \\ 3 \end{bmatrix} + 1 \begin{bmatrix} 2 \\ 6 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$

So, the vector $x = \begin{bmatrix} -2 \\ 1 \end{bmatrix}$ is a non-zero solution.

  • Conclusion: The vector $x = \begin{bmatrix} -2 \\ 1 \end{bmatrix}$ (and all its multiples) is in the nullspace.
  • Meaning: The nullspace contains the vector $\begin{bmatrix} -2 \\ 1 \end{bmatrix}$, which is the exact recipe that describes how the columns are dependent. It literally tells you that $1 \times (\text{Column 2}) - 2 \times (\text{Column 1}) = 0$.

Example #3: For our $A$ above, we solve $Ax=0$. The dependency $c_1 + c_2 = c_3$ can be rewritten as $c_1 + c_2 - c_3 = 0$, or $A \begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$. This shows the vector $\begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix}$ is in the nullspace. Since the dimension is 1 (see below), this single vector is a basis. The nullspace is a line in $\mathbb{R}^3$.

Formal Math

  1. $x \in N(A)$ if and only if $Ax=0$.
  2. The dimension is $\dim(N(A)) = n - r$ (number of columns minus rank). This is the number of free variables.
    • In our example, $n=3$ and $r=2$, so the dimension is $3 - 2 = 1$. This matches our intuition that the nullspace is a line.
  3. A basis is formed by the $n-r$ "special solutions". We find these from the reduced form $R$: $$R = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{bmatrix} \implies \begin{array}{r} x_1 + x_3 = 0 \\ x_2 + x_3 = 0 \\ 0 = 0 \end{array}$$ The free variable is $x_3$. Set $x_3=1 \implies x_1 = -1, x_2 = -1$. The basis for $N(A)$ is the single special solution $s_1 = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix}$. (Note: this is just a different multiple of the vector we found by inspection.)
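Both claims, the dimension count $n - r$ and the special solution itself, can be verified numerically (a NumPy sketch, not part of the original notes):

```python
import numpy as np

A = np.array([[1.0, 1.0, 2.0],
              [2.0, 3.0, 5.0],
              [3.0, 4.0, 7.0]])

n = A.shape[1]                     # number of columns
r = np.linalg.matrix_rank(A)       # numerical rank
assert n - r == 1                  # the nullspace is a line

s = np.array([-1.0, -1.0, 1.0])    # the special solution read off from R
assert np.allclose(A @ s, 0.0)     # s really solves Ax = 0
```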

The Row Space $C(A^T)$

Definition

The subspace of $\mathbb{R}^n$ consisting of all linear combinations of the rows of $A$.

Intuition and Properties

  1. This space represents the "effective input" of $A$. Any input vector $x$ can be split into a row-space part ($x_r$) and a nullspace part ($x_n$). The matrix $A$ only "sees" the $x_r$ part: $A(x_r + x_n) = Ax_r + Ax_n = Ax_r + 0$.
  2. Gaussian elimination does not change the row space. The row space of $A$ is the same as the row space of its echelon form $U$ or $R$.
  3. The dimension of $C(A^T)$ is also the rank $r$. This is the "row rank = column rank" theorem.

Example: For our $A$, the row space is a plane in $\mathbb{R}^3$ (since $r=2$).

Formal Math

  1. A basis for $C(A^T)$ is formed by the $r$ non-zero rows of the echelon form $U$ or $R$.
    • From our example, the basis for $C(A^T)$ is the set of non-zero rows of $R$: $\left\{ \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} \right\}$.
  2. The row space $C(A^T)$ and the nullspace $N(A)$ are orthogonal complements. Every vector in one is perpendicular to every vector in the other.
    • Check: Take the basis vectors from $N(A)$ and $C(A^T)$.
      • $\begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} \cdot \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} = (-1)(1) + (-1)(0) + (1)(1) = 0$.
      • $\begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} \cdot \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} = (-1)(0) + (-1)(1) + (1)(1) = 0$.
    • Their dimensions add up to the full dimension of the input space: $\dim(C(A^T)) + \dim(N(A)) = r + (n-r) = 2 + 1 = 3 = n$.
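The two dot-product checks above can be done in one line (a NumPy sketch, not part of the original notes): stacking the row-space basis as the rows of a matrix and multiplying by the nullspace vector computes both inner products at once.

```python
import numpy as np

# Rows of this matrix are the row-space basis (the nonzero rows of R)
row_basis = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0]])

s = np.array([-1.0, -1.0, 1.0])    # nullspace basis vector

# Each entry of row_basis @ s is one of the dot products; both are zero,
# confirming the row space and nullspace are orthogonal
assert np.allclose(row_basis @ s, 0.0)

# Dimensions add up to n: r + (n - r) = 2 + 1 = 3
assert row_basis.shape[0] + 1 == 3
```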

The Left Nullspace $N(A^T)$

Definition

The subspace of $\mathbb{R}^m$ consisting of all solutions $y$ to the equation $A^Ty=0$.

Intuition and Properties

  1. It is called the "left" nullspace because $A^Ty=0$ is the same as $y^TA = 0^T$. This $y$ is a vector of coefficients that combines the rows of $A$ to produce the zero vector.
  2. This space gives the solvability condition for $Ax=b$. If $y^TA=0$ (a combination of rows is zero), then $y^Tb$ must also be zero for a solution to exist.

Example: For our $A$, we saw from elimination that $-(\text{row } 1) - (\text{row } 2) + (\text{row } 3) = \begin{bmatrix} 0 & 0 & 0 \end{bmatrix}$. The combination vector is $y = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix}$. This vector is a basis for $N(A^T)$. It is a line in $\mathbb{R}^3$.

Formal Math

  1. $y \in N(A^T)$ if $A^Ty=0$.
  2. The dimension is $\dim(N(A^T)) = m - r$ (number of rows minus rank). This is the number of zero rows in the echelon form. (Example: $m=3, r=2$, so $\dim = 1$.)
  3. The left nullspace $N(A^T)$ and the column space $C(A)$ are orthogonal complements.
    • Check: Take the basis vectors.
      • $C(A)$ basis: $\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$ and $\begin{bmatrix} 1 \\ 3 \\ 4 \end{bmatrix}$.
      • $N(A^T)$ basis: $\begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix}$.
      • $\begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} \cdot \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} = -1 - 2 + 3 = 0$.
      • $\begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} \cdot \begin{bmatrix} 1 \\ 3 \\ 4 \end{bmatrix} = -1 - 3 + 4 = 0$.
    • Their dimensions add up to the full dimension of the output space: $\dim(C(A)) + \dim(N(A^T)) = r + (m-r) = 2 + 1 = 3 = m$.
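Both properties of $y$, that it combines the rows to zero and that it is perpendicular to the column space, can be confirmed numerically (a NumPy sketch, not part of the original notes):

```python
import numpy as np

A = np.array([[1.0, 1.0, 2.0],
              [2.0, 3.0, 5.0],
              [3.0, 4.0, 7.0]])

y = np.array([-1.0, -1.0, 1.0])    # the combination of rows found above

# y^T A = 0^T: this combination of the rows of A is the zero row
assert np.allclose(y @ A, 0.0)

# y is orthogonal to the pivot columns of A, the basis of C(A)
assert np.isclose(y @ A[:, 0], 0.0)
assert np.isclose(y @ A[:, 1], 0.0)
```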

Integrating Concepts: The Complete Solution

The four subspaces provide a complete and rigorous answer to the problems from the first section.

When does $Ax=b$ have a solution? (Existence)

  • The Problem: Do the $m$ hyperplanes intersect?

  • The Subspace Answer: A solution exists if and only if $b$ is in the Column Space $C(A)$.

  • The Practical Test: We know $C(A)$ is orthogonal to $N(A^T)$. This gives a perfect test for solvability:

    $Ax=b$ has a solution if and only if $b$ is orthogonal to the left nullspace.

    ($y^T b = 0$ for all $y$ that satisfy $A^Ty = 0$.)

  • Example: For $A = \begin{bmatrix} 1 & 1 & 2 \\ 2 & 3 & 5 \\ 3 & 4 & 7 \end{bmatrix}$, the left nullspace is spanned by $y = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix}$.

    • Does $Ax = b = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$ have a solution?

      Yes, because $y^Tb = \begin{bmatrix} -1 & -1 & 1 \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} = -1 - 2 + 3 = 0$.

    • Does $Ax = b = \begin{bmatrix} 1 \\ 2 \\ 4 \end{bmatrix}$ have a solution?

      No, because $y^Tb = \begin{bmatrix} -1 & -1 & 1 \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ 4 \end{bmatrix} = -1 - 2 + 4 = 1 \neq 0$.
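The practical test is a single dot product per candidate $b$; here is a sketch (NumPy, not part of the original notes; the helper name `solvable` is just for illustration):

```python
import numpy as np

y = np.array([-1.0, -1.0, 1.0])    # spans the left nullspace of A

def solvable(b):
    # Ax = b has a solution iff b is orthogonal to every left-nullspace
    # vector; here one vector y spans that space, so one check suffices
    return np.isclose(y @ b, 0.0)

assert solvable(np.array([1.0, 2.0, 3.0]))        # -1 - 2 + 3 = 0
assert not solvable(np.array([1.0, 2.0, 4.0]))    # -1 - 2 + 4 = 1
```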

How many solutions are there? (Uniqueness)

  • The Problem: Is the intersection a single point, a line, or a plane?

  • The Subspace Answer: This is controlled entirely by the Nullspace $N(A)$.

  • The Complete Answer: The full solution set is $x = x_p + x_n$, where $x_p$ is one particular solution and $x_n$ is any vector from the nullspace.

    • Unique Solution: This happens if $\dim(N(A)) = n-r = 0$. The nullspace is just $\{0\}$. The only solution is $x = x_p$.

    • Infinite Solutions: This happens if $\dim(N(A)) = n-r > 0$. The solution set $x = x_p + x_n$ is a line or plane that is parallel to the nullspace, just "shifted" away from the origin by $x_p$.

      • Example: For our $A$ (with $n-r=1$) and $b = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$, one particular solution is $x_p = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$. The complete solution is the line $x = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + c \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix}$.
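A numerical check of that solution line (a NumPy sketch, not part of the original notes): every point $x_p + c\,x_n$ along the nullspace direction solves $Ax=b$, for any scalar $c$.

```python
import numpy as np

A = np.array([[1.0, 1.0, 2.0],
              [2.0, 3.0, 5.0],
              [3.0, 4.0, 7.0]])
b = np.array([1.0, 2.0, 3.0])

x_p = np.array([1.0, 0.0, 0.0])    # particular solution (b is column 1 of A)
x_n = np.array([-1.0, -1.0, 1.0])  # nullspace direction

# Shifting x_p by any multiple of x_n adds A @ x_n = 0, so b is unchanged
for c in [-2.0, 0.0, 0.5, 3.0]:
    assert np.allclose(A @ (x_p + c * x_n), b)
```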

Summary

The Four Fundamental Subspaces

Figure: Visual representation of the four fundamental subspaces and their relationships. [1]


References

  1. MIT OpenCourseWare - 18.06SC Linear Algebra - The Four Fundamental Subspaces
  2. Strang, Gilbert - Introduction to Linear Algebra (Chapter 2)