Section 2: Problems
University-level exam questions for Matrix Calculus and Automatic Differentiation.
Matrix Calculus
Problem 1.1
Compute $\frac{\partial}{\partial X}\operatorname{tr}(AXB)$, where $A$, $X$, and $B$ are matrices of compatible dimensions.
Difficulty: Medium
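The identity $\frac{\partial}{\partial X}\operatorname{tr}(AXB) = A^\top B^\top$ can be checked numerically. The sketch below (assuming the $\operatorname{tr}(AXB)$ reading of the problem) compares the analytic gradient against a central finite difference:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
X = rng.standard_normal((4, 5))
B = rng.standard_normal((5, 3))

def f(X):
    return np.trace(A @ X @ B)

# Analytic gradient: d tr(AXB)/dX = (BA)^T = A^T B^T
grad_analytic = A.T @ B.T

# Central finite-difference check, one entry of X at a time
eps = 1e-6
grad_fd = np.zeros_like(X)
for i in range(X.shape[0]):
    for j in range(X.shape[1]):
        E = np.zeros_like(X)
        E[i, j] = eps
        grad_fd[i, j] = (f(X + E) - f(X - E)) / (2 * eps)
```

The derivation itself is one line: $d\operatorname{tr}(AXB) = \operatorname{tr}(A\,dX\,B) = \operatorname{tr}(BA\,dX)$, so the gradient is $(BA)^\top$.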
Problem 1.2
Show that $\frac{\partial}{\partial X}\log\det X = X^{-\top}$ for a positive definite matrix $X$.
Difficulty: Hard
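A quick numerical sanity check of $\frac{\partial}{\partial X}\log\det X = X^{-\top}$ (a sketch, using a randomly generated symmetric positive definite $X$):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
X = M @ M.T + 4 * np.eye(4)  # symmetric positive definite by construction

def f(X):
    return np.log(np.linalg.det(X))

# From d log det X = tr(X^{-1} dX), the (i, j) gradient entry is (X^{-T})_{ij}
grad_analytic = np.linalg.inv(X).T

eps = 1e-6
grad_fd = np.zeros_like(X)
for i in range(4):
    for j in range(4):
        E = np.zeros_like(X)
        E[i, j] = eps
        grad_fd[i, j] = (f(X + E) - f(X - E)) / (2 * eps)
```

Note the finite difference perturbs each entry independently (not symmetrically), which is exactly what the unconstrained gradient $X^{-\top}$ describes; for symmetric $X$ this coincides with $X^{-1}$.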
Problem 1.3
Derive $\frac{\partial X^{-1}}{\partial X}$ using the identity $X X^{-1} = I$.
Difficulty: Medium
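Differentiating $X X^{-1} = I$ gives $dX\,X^{-1} + X\,d(X^{-1}) = 0$, hence $d(X^{-1}) = -X^{-1}\,dX\,X^{-1}$. A minimal check of this directional derivative (assuming a well-conditioned random $X$ and perturbation direction $V$):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((4, 4)) + 4 * np.eye(4)  # comfortably invertible
V = rng.standard_normal((4, 4))                  # perturbation direction

Xinv = np.linalg.inv(X)
# From differentiating X X^{-1} = I: directional derivative is -X^{-1} V X^{-1}
d_analytic = -Xinv @ V @ Xinv

# Central finite difference of the matrix inverse in direction V
eps = 1e-6
d_fd = (np.linalg.inv(X + eps * V) - np.linalg.inv(X - eps * V)) / (2 * eps)
```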
Problem 1.4
For the linear regression loss $L(\mathbf{w}) = \|X\mathbf{w} - \mathbf{y}\|^2$, compute the gradient $\nabla_{\mathbf{w}} L$ and the Hessian $\nabla_{\mathbf{w}}^2 L$ using matrix calculus.
Difficulty: Medium
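Assuming the squared-error form $L(\mathbf{w}) = \|X\mathbf{w} - \mathbf{y}\|^2$, the answers are $\nabla L = 2X^\top(X\mathbf{w} - \mathbf{y})$ and $\nabla^2 L = 2X^\top X$. A sketch verifying the gradient by finite differences:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((20, 5))
y = rng.standard_normal(20)
w = rng.standard_normal(5)

def L(w):
    r = X @ w - y
    return r @ r  # ||Xw - y||^2

grad_analytic = 2 * X.T @ (X @ w - y)
hess_analytic = 2 * X.T @ X  # constant, since L is quadratic in w

# Finite-difference gradient, one coordinate direction at a time
eps = 1e-6
I = np.eye(5)
grad_fd = np.array(
    [(L(w + eps * I[i]) - L(w - eps * I[i])) / (2 * eps) for i in range(5)]
)
```

Because $L$ is quadratic, the Hessian is constant in $\mathbf{w}$, which is why $2X^\top X$ has no dependence on the current iterate.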
Automatic Differentiation
Problem 2.1
Draw the computational graph for a scalar function $f(x, y)$ and compute its gradient using reverse-mode AD.
Difficulty: Medium
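A minimal reverse-mode sketch: each node records its parents and the local partial derivatives, and the backward pass accumulates adjoints along every path. The function used here, $f(x, y) = xy + \sin x$, is an illustrative choice, not the one from the original problem:

```python
import math

class Var:
    """Node in a computational graph: value, accumulated gradient, parent links."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents  # pairs (parent_node, d(output)/d(parent))

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

def vsin(v):
    return Var(math.sin(v.value), ((v, math.cos(v.value)),))

def backward(v, adjoint=1.0):
    # Path-enumeration backprop: correct for small graphs; real systems
    # traverse a reverse topological order instead to avoid repeated work.
    v.grad += adjoint
    for parent, local in v.parents:
        backward(parent, adjoint * local)

x, y = Var(2.0), Var(3.0)
f = x * y + vsin(x)   # f(x, y) = xy + sin(x); df/dx = y + cos(x), df/dy = x
backward(f)
```

Note that $x$ feeds two nodes (the product and the sine), so its gradient is the sum of contributions from both paths, which is exactly the multivariate chain rule.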
Problem 2.2
Explain why forward-mode AD is efficient for functions $f: \mathbb{R}^n \to \mathbb{R}^m$ with $n \ll m$, while reverse-mode AD is efficient when $n \gg m$. What is the computational cost of each?
Difficulty: Medium
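The standard operation-count argument can be summarized as follows: one forward pass yields one Jacobian column (a JVP), one reverse pass yields one Jacobian row (a VJP), and each pass costs a small constant multiple of evaluating $f$ itself:

```latex
\underbrace{Jv}_{\text{one forward pass}} \;\Rightarrow\; \text{full } J \text{ in } n \text{ passes:}\quad \mathcal{O}\!\left(n \cdot \mathrm{cost}(f)\right),
\qquad
\underbrace{v^{\top} J}_{\text{one reverse pass}} \;\Rightarrow\; \text{full } J \text{ in } m \text{ passes:}\quad \mathcal{O}\!\left(m \cdot \mathrm{cost}(f)\right).
```

For a scalar-valued loss ($m = 1$), reverse mode delivers the entire gradient in a single backward pass, which is why it is the default in deep learning.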
Problem 2.3
Implement dual numbers for forward-mode AD and verify on a test function that the derivative computed via dual arithmetic matches the analytic derivative.
Difficulty: Medium
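One possible implementation sketch: a dual number $a + b\varepsilon$ with $\varepsilon^2 = 0$ carries the value in $a$ and the derivative in $b$, so evaluating $f$ at $x + 1\varepsilon$ yields $f(x) + f'(x)\varepsilon$. The test function $f(x) = x^2 + \sin x$ is an illustrative choice:

```python
import math

class Dual:
    """Dual number a + b*eps with eps^2 = 0; b carries the derivative."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule falls out of (a1 + b1 eps)(a2 + b2 eps), eps^2 = 0
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def dsin(x):
    return Dual(math.sin(x.a), math.cos(x.a) * x.b)

def derivative(f, x):
    """Seed the dual part with 1 and read off f'(x)."""
    return f(Dual(x, 1.0)).b

f = lambda x: x * x + dsin(x)   # f'(x) = 2x + cos(x)
```

Usage: `derivative(f, 1.3)` agrees with `2 * 1.3 + math.cos(1.3)` to machine precision, since dual arithmetic is exact rather than a finite-difference approximation.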
Challenge Problems
Problem 3.1
Derive the backpropagation equations for a two-layer neural network with ReLU activations and cross-entropy loss, identifying each step as a VJP computation.
Difficulty: Very Hard
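A sketch of the structure the derivation should produce, with one VJP per line of the backward pass. The architecture (bias-free layers, softmax cross-entropy averaged over a batch) and all names here are assumptions for illustration, checked against finite differences:

```python
import numpy as np

rng = np.random.default_rng(4)
n_in, n_hid, n_out, batch = 4, 5, 3, 8
W1 = rng.standard_normal((n_in, n_hid)) * 0.5
W2 = rng.standard_normal((n_hid, n_out)) * 0.5
x = rng.standard_normal((batch, n_in))
labels = rng.integers(0, n_out, size=batch)

def forward(W1, W2):
    z1 = x @ W1                              # layer 1 pre-activation
    h = np.maximum(z1, 0.0)                  # ReLU
    z2 = h @ W2                              # logits
    z2 = z2 - z2.max(axis=1, keepdims=True)  # stabilized softmax
    p = np.exp(z2) / np.exp(z2).sum(axis=1, keepdims=True)
    loss = -np.log(p[np.arange(batch), labels]).mean()
    return loss, (z1, h, p)

loss, (z1, h, p) = forward(W1, W2)

# Backward pass: each line is a VJP through one forward operation.
dz2 = p.copy()
dz2[np.arange(batch), labels] -= 1.0
dz2 /= batch                  # VJP of mean cross-entropy through softmax: (p - onehot)/B
dW2 = h.T @ dz2               # VJP of z2 = h @ W2 with respect to W2
dh = dz2 @ W2.T               # VJP of z2 = h @ W2 with respect to h
dz1 = dh * (z1 > 0)           # VJP of the ReLU: mask by the active set
dW1 = x.T @ dz1               # VJP of z1 = x @ W1 with respect to W1
```

The point of the exercise is that every backward line has this matrix form: the adjoint flowing in, multiplied by the transpose of the local Jacobian, with no full Jacobian ever materialized.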
Problem 3.2
Prove that the memory cost of reverse-mode AD is proportional to the number of operations in the computational graph.
Difficulty: Hard
Solutions
Solutions are available in the implementation file with verification code.