Section 3: Problems
University-level exam questions for Advanced Topics in ML.
Exponential Families
Problem 1.1
Write the Bernoulli distribution in exponential family form. Identify the natural parameter, sufficient statistic, and log-partition function.
Difficulty: Medium
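A quick numerical sanity check for this problem: writing the Bernoulli pmf as $\exp(\eta T(x) - A(\eta))$ with $T(x) = x$, $\eta = \log\frac{p}{1-p}$, and $A(\eta) = \log(1 + e^\eta)$ should reproduce $p^x(1-p)^{1-x}$ exactly. The value $p = 0.3$ below is an arbitrary choice for illustration.

```python
import math

def bernoulli_exp_family(x, p):
    """Bernoulli pmf in exponential family form: exp(eta*T(x) - A(eta)), h(x) = 1."""
    eta = math.log(p / (1 - p))       # natural parameter (log-odds)
    A = math.log(1 + math.exp(eta))   # log-partition function
    return math.exp(eta * x - A)      # sufficient statistic T(x) = x

p = 0.3
for x in (0, 1):
    direct = p ** x * (1 - p) ** (1 - x)
    assert abs(bernoulli_exp_family(x, p) - direct) < 1e-12
```

This is a verification sketch, not a derivation; the exam answer should derive $\eta$, $T$, and $A$ symbolically.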
Problem 1.2
Show that the mean of the sufficient statistic equals the gradient of the log-partition function: $\mathbb{E}[T(x)] = \nabla A(\eta)$.
Difficulty: Hard
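The identity can be checked numerically in the Bernoulli case: with $A(\eta) = \log(1 + e^\eta)$, the derivative $A'(\eta)$ should equal $\mathbb{E}[T(x)] = \sigma(\eta)$, the sigmoid of the natural parameter. The value $\eta = 0.7$ is an arbitrary test point.

```python
import math

def A(eta):
    """Bernoulli log-partition function."""
    return math.log(1 + math.exp(eta))

eta = 0.7
h = 1e-6
grad = (A(eta + h) - A(eta - h)) / (2 * h)   # central finite difference of A
mean_T = 1 / (1 + math.exp(-eta))            # E[T(x)] = sigmoid(eta) for Bernoulli
assert abs(grad - mean_T) < 1e-8
```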
Problem 1.3
Identify the conjugate prior for the Poisson distribution and derive the posterior after observing $n$ i.i.d. samples $x_1, \dots, x_n$.
Difficulty: Medium
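The conjugate prior here is the Gamma distribution (in the rate parameterization), and the update is $\mathrm{Gamma}(\alpha, \beta) \to \mathrm{Gamma}(\alpha + \sum_i x_i,\ \beta + n)$. The sketch below checks this against prior $\times$ likelihood on hypothetical data; the specific values of `alpha`, `beta`, and `data` are arbitrary.

```python
import math

def gamma_poisson_posterior(alpha, beta, data):
    """Conjugate update: Gamma(alpha, beta) prior on a Poisson rate lambda."""
    return alpha + sum(data), beta + len(data)

alpha, beta = 2.0, 1.0
data = [3, 0, 2, 4]
a_post, b_post = gamma_poisson_posterior(alpha, beta, data)

def unnorm_post(lam):
    """Unnormalized posterior = prior density * Poisson likelihood."""
    prior = lam ** (alpha - 1) * math.exp(-beta * lam)
    lik = math.prod(math.exp(-lam) * lam ** x / math.factorial(x) for x in data)
    return prior * lik

# Ratios of the unnormalized posterior must match Gamma(a_post, b_post) ratios.
ratio = unnorm_post(2.0) / unnorm_post(3.0)
expected = (2.0 ** (a_post - 1) * math.exp(-2.0 * b_post)) / (
    3.0 ** (a_post - 1) * math.exp(-3.0 * b_post))
assert abs(ratio / expected - 1) < 1e-9
```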
Information Theory
Problem 2.1
Compute the entropy of a fair coin flip and of a biased coin with $p \neq 1/2$. Which has higher entropy and why?
Difficulty: Easy
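A one-liner confirms the expected answer: the fair coin attains the maximum entropy of 1 bit, and any biased coin falls below it. The bias $p = 0.9$ below is an arbitrary example value.

```python
import math

def H(p):
    """Binary entropy in bits."""
    return -sum(q * math.log2(q) for q in (p, 1 - p) if q > 0)

assert abs(H(0.5) - 1.0) < 1e-12   # fair coin: exactly 1 bit
assert H(0.9) < H(0.5)             # any bias reduces entropy
```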
Problem 2.2
Prove that $D_{\mathrm{KL}}(p \,\|\, q) \ge 0$ with equality iff $p = q$ (Gibbs' inequality).
Difficulty: Hard
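The proof goes through Jensen's inequality applied to $-\log$; a brute-force numerical check on random distributions, as below, is a useful companion (random seed and support size are arbitrary choices).

```python
import math
import random

def kl(p, q):
    """KL divergence sum_i p_i log(p_i / q_i), with 0 log 0 = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

random.seed(0)
for _ in range(100):
    w = [random.random() for _ in range(5)]
    v = [random.random() for _ in range(5)]
    p = [x / sum(w) for x in w]
    q = [x / sum(v) for x in v]
    assert kl(p, q) >= -1e-12   # Gibbs: KL is nonnegative
assert kl(p, p) == 0.0          # equality when the distributions coincide
```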
Problem 2.3
Show that minimizing the cross-entropy $H(p, q)$ over $q$ is equivalent to minimizing $D_{\mathrm{KL}}(p \,\|\, q)$.
Difficulty: Medium
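The key decomposition is $H(p, q) = H(p) + D_{\mathrm{KL}}(p \,\|\, q)$, where $H(p)$ does not depend on $q$; a numerical check of the identity (on arbitrary example distributions) is below.

```python
import math

def cross_entropy(p, q):
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
# H(p, q) = H(p) + KL(p||q); H(p) is constant in q, so the two minimizations agree.
assert abs(cross_entropy(p, q) - (entropy(p) + kl(p, q))) < 1e-12
```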
Problem 2.4
Compute the mutual information $I(X; Y)$ for jointly Gaussian random variables with correlation coefficient $\rho$.
Difficulty: Hard
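For unit-variance jointly Gaussian $X, Y$ the answer is $I(X;Y) = -\tfrac{1}{2}\log(1-\rho^2)$. The sketch below checks this via $I = h(X) + h(Y) - h(X,Y)$ using the Gaussian differential entropy formulas; $\rho = 0.6$ is an arbitrary test value.

```python
import math

rho = 0.6
# Unit-variance pair: covariance matrix [[1, rho], [rho, 1]], det = 1 - rho^2.
hX = 0.5 * math.log(2 * math.pi * math.e)                        # h(X) = h(Y)
hXY = 0.5 * math.log((2 * math.pi * math.e) ** 2 * (1 - rho ** 2))
I = 2 * hX - hXY                                                 # I = h(X)+h(Y)-h(X,Y)
assert abs(I - (-0.5 * math.log(1 - rho ** 2))) < 1e-12
```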
Concentration Inequalities
Problem 3.1
Use Markov's inequality to bound $P(X \ge a)$ when $X \ge 0$ and $\mathbb{E}[X] = \mu$.
Difficulty: Easy
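Markov's inequality gives $P(X \ge a) \le \mu / a$ for any nonnegative $X$. An empirical check using an exponential distribution (mean $\mu = 2$, threshold $a = 10$, both arbitrary illustrative values) shows the bound holds, though it is typically loose.

```python
import random

random.seed(0)
mu, a = 2.0, 10.0
# Exponential(rate = 1/mu) is nonnegative with mean mu.
samples = [random.expovariate(1 / mu) for _ in range(100_000)]
empirical = sum(s >= a for s in samples) / len(samples)
bound = mu / a                      # Markov bound: 0.2
assert empirical <= bound           # here the true tail (~e^-5) is far below it
```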
Problem 3.2
Let $X_1, \dots, X_n$ be i.i.d. with $X_i \in [0, 1]$ and $\mathbb{E}[X_i] = \mu$. Use Hoeffding's inequality to find $n$ such that $P\left(\left|\frac{1}{n}\sum_{i=1}^n X_i - \mu\right| \ge \epsilon\right) \le \delta$.
Difficulty: Medium
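Setting the two-sided Hoeffding bound $2\exp(-2n\epsilon^2) \le \delta$ and solving for $n$ gives $n \ge \frac{1}{2\epsilon^2}\log\frac{2}{\delta}$. A small helper (with arbitrary example values $\epsilon = 0.1$, $\delta = 0.05$):

```python
import math

def hoeffding_n(eps, delta):
    """Smallest n with 2*exp(-2*n*eps^2) <= delta, for X_i in [0, 1]."""
    return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

n = hoeffding_n(0.1, 0.05)
assert 2 * math.exp(-2 * n * 0.1 ** 2) <= 0.05   # the bound is indeed met at n
```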
Problem 3.3
Explain how Hoeffding's inequality leads to the sample complexity bound in PAC learning: $n \ge \frac{1}{2\epsilon^2} \log \frac{2|\mathcal{H}|}{\delta}$.
Difficulty: Hard
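The bound comes from applying Hoeffding to each hypothesis and taking a union bound over the finite class $\mathcal{H}$, giving $2|\mathcal{H}|\exp(-2n\epsilon^2) \le \delta$. A helper to compute it, with arbitrary example values ($\epsilon = 0.05$, $\delta = 0.01$, $|\mathcal{H}| = 1000$):

```python
import math

def pac_sample_complexity(eps, delta, H_size):
    """n >= log(2|H|/delta) / (2 eps^2): union bound + per-hypothesis Hoeffding."""
    return math.ceil(math.log(2 * H_size / delta) / (2 * eps ** 2))

n = pac_sample_complexity(0.05, 0.01, 1000)
# Verify the union-bounded failure probability is at most delta at this n.
assert 2 * 1000 * math.exp(-2 * n * 0.05 ** 2) <= 0.01
```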
Challenge Problems
Problem 4.1
Derive the ELBO (Evidence Lower Bound) using Jensen's inequality and show that maximizing the ELBO is equivalent to minimizing $D_{\mathrm{KL}}\!\left(q(z) \,\|\, p(z \mid x)\right)$, where $p(z \mid x)$ is the true posterior.
Difficulty: Very Hard
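The core identity is $\log p(x) = \mathrm{ELBO}(q) + D_{\mathrm{KL}}(q(z) \,\|\, p(z \mid x))$: since $\log p(x)$ is fixed, maximizing the ELBO minimizes the KL term. This can be verified exactly on a toy discrete model; the joint values `p_xz` and the variational distribution `q` below are hypothetical illustration choices.

```python
import math

# Toy model: latent z in {0, 1}, observed x fixed; p_xz[z] = p(x, z).
p_xz = [0.1, 0.3]
p_x = sum(p_xz)                       # evidence p(x) = 0.4
post = [v / p_x for v in p_xz]        # true posterior p(z|x)
q = [0.5, 0.5]                        # variational distribution

elbo = sum(qi * math.log(pj / qi) for qi, pj in zip(q, p_xz))   # E_q[log p(x,z) - log q(z)]
kl = sum(qi * math.log(qi / pi) for qi, pi in zip(q, post))     # KL(q || p(z|x))

assert abs(elbo - (math.log(p_x) - kl)) < 1e-12   # log p(x) = ELBO + KL
assert elbo <= math.log(p_x) + 1e-12              # ELBO is a lower bound (Jensen)
```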
Solutions
Solutions are available in the implementation file with verification code.