Section 2: Matrix Calculus and Automatic Differentiation

Overview

This section focuses on differentiating matrix and vector expressions, and on the algorithmic computation of gradients that powers modern deep learning.

Topics Covered

Chapter 1: Matrix Calculus

  • Derivatives of scalar functions of matrices (trace, determinant, inverse)
  • Numerator and denominator layout conventions
  • Common matrix calculus identities and the cookbook approach
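A quick way to trust a cookbook identity is to check it numerically. The sketch below (illustrative code, using NumPy) verifies the standard result that the gradient of log det(X) with respect to X is (X⁻¹)ᵀ, by comparing the analytic formula against central finite differences on each entry:

```python
import numpy as np

# Verify the identity d/dX log det(X) = (X^{-1})^T numerically.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
X = A @ A.T + 4 * np.eye(4)  # well-conditioned, det(X) > 0

analytic = np.linalg.inv(X).T

# Central finite differences, perturbing one entry at a time.
eps = 1e-6
numeric = np.zeros_like(X)
for i in range(4):
    for j in range(4):
        E = np.zeros_like(X)
        E[i, j] = eps
        numeric[i, j] = (
            np.log(np.linalg.det(X + E)) - np.log(np.linalg.det(X - E))
        ) / (2 * eps)

print(np.max(np.abs(analytic - numeric)))  # should be tiny
```

The same finite-difference harness works for checking trace and inverse identities before relying on them in a derivation.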

Chapter 2: Automatic Differentiation

  • Forward-mode AD and dual numbers
  • Reverse-mode AD and backpropagation
  • Computational graphs; Jacobian-vector products (JVPs) vs vector-Jacobian products (VJPs)
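Forward-mode AD can be implemented in a few lines with dual numbers: each value carries a tangent alongside it, and arithmetic rules propagate derivatives via the chain rule. The class and function names below are illustrative, not from any particular library:

```python
from dataclasses import dataclass
import math

@dataclass
class Dual:
    val: float  # primal value
    dot: float  # tangent (derivative) carried alongside

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x: Dual) -> Dual:
    # chain rule through sin
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# Differentiate f(x) = x*sin(x) + 3x at x = 2 by seeding dot = 1.
x = Dual(2.0, 1.0)
y = x * sin(x) + 3 * x
print(y.val, y.dot)  # f(2) and f'(2) = sin(2) + 2*cos(2) + 3
```

Seeding the tangent with 1 computes one JVP per pass; one forward pass per input coordinate is needed for a full gradient, which is why reverse mode dominates when inputs outnumber outputs.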

Learning Objectives

  • Differentiate expressions involving traces, determinants, and matrix inverses
  • Navigate numerator and denominator layout conventions
  • Understand forward-mode and reverse-mode automatic differentiation
  • Relate reverse-mode AD to backpropagation in neural networks
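The connection between reverse-mode AD and backpropagation can be made concrete with a tiny tape-based sketch: each operation records its parents and local partial derivatives, and a reverse sweep over the computational graph accumulates vector-Jacobian products. Class and method names here are illustrative:

```python
class Var:
    def __init__(self, val, parents=()):
        self.val = val
        self.parents = parents  # pairs of (parent Var, local partial)
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # local partials of a product: d(uv)/du = v, d(uv)/dv = u
        return Var(self.val * other.val,
                   [(self, other.val), (other, self.val)])
    __rmul__ = __mul__

    def backward(self):
        # Topologically order the graph, then sweep in reverse,
        # accumulating grad contributions (this IS backpropagation).
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p, _ in v.parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, local in v.parents:
                p.grad += v.grad * local

# Example: f(x, y) = x*y + x  ->  df/dx = y + 1, df/dy = x
x, y = Var(3.0), Var(4.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

One reverse sweep yields the gradient with respect to every input at once, which is exactly why neural-network training uses reverse mode.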