Section 2: Spectral Theory & Matrix Decompositions
Overview
Focus on eigenvalue analysis, positive definiteness, and the singular value decomposition — the tools that reveal the intrinsic geometry of matrices.
Topics Covered
Chapter 5: Eigenvalues and Eigenvectors
- Eigenvalue equation and characteristic polynomial
- Diagonalization and matrix powers
- Spectral theorem for symmetric matrices
- Similarity transformations
- Difference equations and matrix exponentials
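The core ideas of this chapter — diagonalization of a symmetric matrix and fast matrix powers via A^k = V D^k Vᵀ — can be sketched in a few lines of NumPy (a minimal illustration, not part of any assigned material):

```python
import numpy as np

# A symmetric matrix: the spectral theorem guarantees real eigenvalues
# and an orthonormal eigenbasis.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, V = np.linalg.eigh(A)   # eigh is specialized for symmetric matrices
D = np.diag(eigvals)

# Diagonalization: A = V D V^T (V is orthogonal because A is symmetric)
assert np.allclose(A, V @ D @ V.T)

# Matrix powers via the eigendecomposition: A^k = V D^k V^T
k = 5
A_pow = V @ np.diag(eigvals**k) @ V.T
assert np.allclose(A_pow, np.linalg.matrix_power(A, k))
```

For this A the eigenvalues are 1 and 3, so A^k grows like 3^k along the dominant eigenvector — the mechanism behind the difference-equation applications listed above.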
Chapter 6: Positive Definite Matrices
- Quadratic forms and their geometry (bowls, saddles, domes)
- Five equivalent tests for positive definiteness
- Cholesky decomposition (deeper treatment)
- Gram matrix
- Rayleigh quotient and min-max principles
- Ellipsoids and principal axes
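Three of the equivalent positive-definiteness tests can be checked numerically — positive eigenvalues, a successful Cholesky factorization, and a positive quadratic form. A minimal NumPy sketch (the matrix below is just an example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Test 1: all eigenvalues are positive
assert np.all(np.linalg.eigvalsh(A) > 0)

# Test 2: Cholesky succeeds (it raises LinAlgError otherwise): A = L L^T
L = np.linalg.cholesky(A)
assert np.allclose(A, L @ L.T)

# Test 3: the quadratic form x^T A x is positive for nonzero x (spot check)
rng = np.random.default_rng(0)
for _ in range(100):
    x = rng.standard_normal(2)
    assert x @ A @ x > 0
```

Geometrically, a positive quadratic form means the surface z = xᵀAx is a bowl, which is why these tests connect to optimization.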
Chapter 6 (cont.): Singular Value Decomposition
- Factorization A = UΣVᵀ exists for any matrix
- Geometric interpretation: rotate-stretch-rotate
- Low-rank approximation (Eckart-Young theorem)
- Pseudoinverse and minimum-norm least squares
- Condition number via singular values
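The SVD topics above fit together in one short computation: factor a matrix, form the best rank-1 approximation (Eckart-Young), build the pseudoinverse, and read off the condition number. A minimal NumPy sketch with a random example matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Thin SVD: A = U diag(s) V^T with singular values sorted descending
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vt)

# Rank-1 approximation; by Eckart-Young it is optimal, and the
# spectral-norm error equals the first discarded singular value
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])
assert np.isclose(np.linalg.norm(A - A1, 2), s[1])

# Pseudoinverse from the SVD: A+ = V diag(1/s) U^T
A_pinv = Vt.T @ np.diag(1 / s) @ U.T
assert np.allclose(A_pinv, np.linalg.pinv(A))

# Condition number = largest / smallest singular value
assert np.isclose(s[0] / s[-1], np.linalg.cond(A))
```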
Learning Objectives
- Compute eigenvalues/eigenvectors and diagonalize matrices
- Understand the spectral theorem and why symmetric matrices are special
- Test for positive definiteness and connect it to optimization geometry
- Decompose any matrix via SVD and interpret the components
- Apply low-rank approximation to compress and denoise data
- Connect spectral methods to PCA, spectral clustering, and gradient dynamics
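The last objective — connecting the SVD to PCA — can be previewed in a few lines: center the data, take the SVD, and the right singular vectors are the principal directions. A minimal sketch on synthetic data (all names and numbers here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data: 200 points stretched strongly along the first axis
X = rng.standard_normal((200, 2)) @ np.diag([3.0, 0.5])

Xc = X - X.mean(axis=0)                 # center the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rows of Vt are the principal directions;
# variance explained by component i is s_i^2 / (n - 1)
variances = s**2 / (len(X) - 1)
assert variances[0] > variances[1]      # first component dominates

# Project onto the first principal component (dimensionality reduction)
X_reduced = Xc @ Vt[0]                  # shape (200,)
```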