Section 2: Problems
University-level exam questions for Multivariate Distributions and Estimation.
Joint, Marginal, and Conditional Distributions
Problem 1.1
Let $(X, Y)$ have joint density $f(x, y)$ on its support. Find the marginal densities $f_X(x)$ and $f_Y(y)$, and the conditional density $f_{X \mid Y}(x \mid y)$.
Difficulty: Medium
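A minimal numeric sketch of the marginalization and conditioning steps, using an assumed example density $f(x, y) = x + y$ on the unit square (not necessarily the density intended by the problem):

```python
# Verification sketch for marginal/conditional densities, using the
# assumed example joint density f(x, y) = x + y on [0, 1]^2.

def f_joint(x, y):
    return x + y  # integrates to 1 over the unit square

def marginal_x(x, n=10_000):
    # f_X(x) = integral of f(x, y) dy, via a midpoint Riemann sum
    h = 1.0 / n
    return sum(f_joint(x, (j + 0.5) * h) for j in range(n)) * h

def conditional_x_given_y(x, y):
    # f_{X|Y}(x | y) = f(x, y) / f_Y(y); by symmetry f_Y(y) = y + 1/2 here
    return f_joint(x, y) / (y + 0.5)

# closed form for this example: f_X(x) = x + 1/2
assert abs(marginal_x(0.3) - 0.8) < 1e-6
```

The conditional density integrates to 1 in $x$ for each fixed $y$, which is a quick sanity check on any computed conditional.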
Problem 1.2
Prove the law of total variance: $\operatorname{Var}(Y) = \mathbb{E}[\operatorname{Var}(Y \mid X)] + \operatorname{Var}(\mathbb{E}[Y \mid X])$.
Difficulty: Hard
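The identity can be checked exactly on a small discrete joint distribution; the pmf below is an assumed toy example:

```python
# Exact check of Var(Y) = E[Var(Y|X)] + Var(E[Y|X]) on a discrete pmf.
pmf = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 2): 0.4}

def E(g):
    return sum(p * g(x, y) for (x, y), p in pmf.items())

var_y = E(lambda x, y: y ** 2) - E(lambda x, y: y) ** 2

# marginal pmf of X and conditional moments of Y given X = x0
xs = {x for x, _ in pmf}
px = {x0: sum(p for (x, _), p in pmf.items() if x == x0) for x0 in xs}

def cond_moment(x0, k):
    return sum(p * y ** k for (x, y), p in pmf.items() if x == x0) / px[x0]

e_var = sum(px[x0] * (cond_moment(x0, 2) - cond_moment(x0, 1) ** 2) for x0 in xs)
var_e = sum(px[x0] * cond_moment(x0, 1) ** 2 for x0 in xs) - E(lambda x, y: y) ** 2

assert abs(var_y - (e_var + var_e)) < 1e-12
```

The decomposition holds term by term: $\mathbb{E}[Y^2]$ is split across the conditional second moments, and the cross terms cancel.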
The Multivariate Gaussian
Problem 2.1
Let $X = (X_1, X_2)^\top \sim \mathcal{N}(\mu, \Sigma)$ be bivariate Gaussian. Find the conditional mean $\mathbb{E}[X_1 \mid X_2 = x_2]$ and the conditional variance $\operatorname{Var}(X_1 \mid X_2 = x_2)$.
Difficulty: Medium
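A Monte Carlo sketch of the Gaussian conditioning formulas $\mathbb{E}[X_1 \mid X_2 = x_2] = \mu_1 + \frac{\sigma_{12}}{\sigma_{22}}(x_2 - \mu_2)$ and $\operatorname{Var}(X_1 \mid X_2) = \sigma_{11} - \sigma_{12}^2 / \sigma_{22}$; the values of $\mu$ and $\Sigma$ below are assumed illustrative parameters:

```python
# Empirical check of bivariate Gaussian conditioning (assumed example).
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8], [0.8, 1.0]])

x2 = -2.5
cond_mean = mu[0] + Sigma[0, 1] / Sigma[1, 1] * (x2 - mu[1])
cond_var = Sigma[0, 0] - Sigma[0, 1] ** 2 / Sigma[1, 1]

# keep samples whose second coordinate lands near x2, then compare moments
samples = rng.multivariate_normal(mu, Sigma, size=400_000)
near = samples[np.abs(samples[:, 1] - x2) < 0.05, 0]
assert abs(near.mean() - cond_mean) < 0.05
```

Conditioning by rejection (keeping samples in a thin window around $x_2$) is crude but enough to confirm the closed form.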
Problem 2.2
Show that the Mahalanobis distance $d_\Sigma(x, \mu) = \sqrt{(x - \mu)^\top \Sigma^{-1} (x - \mu)}$ reduces to the Euclidean distance when $\Sigma = I$.
Difficulty: Easy
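A quick numeric sketch: with $\Sigma = I$ the quadratic form collapses to $\|x - \mu\|^2$ (the points below are arbitrary illustrative values):

```python
# Mahalanobis distance equals Euclidean distance when Sigma = I.
import numpy as np

def mahalanobis(x, mu, Sigma):
    d = x - mu
    return float(np.sqrt(d @ np.linalg.solve(Sigma, d)))

x = np.array([3.0, -1.0, 2.0])
mu = np.array([1.0, 0.0, 0.0])
assert abs(mahalanobis(x, mu, np.eye(3)) - np.linalg.norm(x - mu)) < 1e-12
```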
Problem 2.3
Prove that if $X \sim \mathcal{N}(\mu, \Sigma)$ and $Y = AX + b$, then $Y \sim \mathcal{N}(A\mu + b,\; A \Sigma A^\top)$.
Difficulty: Medium
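An empirical sketch that an affine map of a Gaussian has mean $A\mu + b$ and covariance $A \Sigma A^\top$; the matrices $A$, $b$, $\mu$, $\Sigma$ below are assumed example values:

```python
# Monte Carlo check of the affine-transformation property of Gaussians.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([0.5, -1.0])
Sigma = np.array([[1.0, 0.3], [0.3, 2.0]])
A = np.array([[2.0, 0.0], [1.0, -1.0]])
b = np.array([1.0, 0.0])

X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = X @ A.T + b  # apply Y = AX + b row-wise

assert np.allclose(Y.mean(axis=0), A @ mu + b, atol=0.03)
assert np.allclose(np.cov(Y.T), A @ Sigma @ A.T, atol=0.1)
```

This only checks the first two moments; normality of $Y$ follows from the characteristic-function argument the proof asks for.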
Maximum Likelihood Estimation
Problem 3.1
Derive the MLE for the parameters $\mu$ and $\sigma^2$ of a Gaussian distribution from $n$ i.i.d. samples.
Difficulty: Medium
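The Gaussian MLEs are the sample mean and the uncorrected sample variance; a sketch below checks on simulated data (assumed example parameters) that perturbing either estimate lowers the log-likelihood:

```python
# Check that (mu_hat, var_hat) maximizes the Gaussian log-likelihood.
import math, random

random.seed(0)
data = [random.gauss(3.0, 2.0) for _ in range(500)]
n = len(data)

mu_hat = sum(data) / n
var_hat = sum((x - mu_hat) ** 2 for x in data) / n  # MLE, divisor n

def log_lik(mu, var):
    return sum(-0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
               for x in data)

best = log_lik(mu_hat, var_hat)
for dm in (-0.1, 0.1):
    assert log_lik(mu_hat + dm, var_hat) < best
for dv in (-0.1, 0.1):
    assert log_lik(mu_hat, var_hat + dv) < best
```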
Problem 3.2
Show that the MLE for the variance is biased, and compute the bias correction factor.
Difficulty: Medium
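A simulation sketch of the bias: averaging the MLE variance over many small samples gives $\frac{n-1}{n}\sigma^2$, and multiplying by the correction factor $\frac{n}{n-1}$ removes the bias (simulation sizes below are assumed):

```python
# Empirical bias of the MLE variance estimator for small n.
import random

random.seed(42)
n, trials, true_var = 5, 200_000, 1.0

acc = 0.0
for _ in range(trials):
    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    m = sum(xs) / n
    acc += sum((x - m) ** 2 for x in xs) / n  # MLE with divisor n
mean_mle_var = acc / trials

# E[var_hat] = (n-1)/n * sigma^2 = 0.8 here; n/(n-1) corrects it
assert abs(mean_mle_var - (n - 1) / n * true_var) < 0.01
assert abs(mean_mle_var * n / (n - 1) - true_var) < 0.02
```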
Problem 3.3
Compute the Fisher information $I(p)$ for the Bernoulli($p$) distribution and state the Cramér–Rao lower bound for any unbiased estimator of $p$.
Difficulty: Hard
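For Bernoulli($p$), $I(p) = \frac{1}{p(1-p)}$, so the Cramér–Rao bound for $n$ samples is $\frac{p(1-p)}{n}$. A sketch (with assumed values of $p$ and $n$) checks that the sample mean attains it:

```python
# The sample mean of Bernoulli trials attains the Cramer-Rao bound.
import random

random.seed(7)
p, n, trials = 0.3, 20, 100_000

fisher = 1.0 / (p * (1 - p))
crlb = 1.0 / (n * fisher)  # = p(1-p)/n

est = [sum(random.random() < p for _ in range(n)) / n for _ in range(trials)]
mean_est = sum(est) / trials
emp_var = sum((e - mean_est) ** 2 for e in est) / trials

assert abs(emp_var - crlb) < 0.0005
```

The sample mean is unbiased and efficient here, which is why its empirical variance matches the bound rather than merely exceeding it.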
Bayesian Inference
Problem 4.1
Given a Beta(2, 2) prior on $\theta$ and observing 7 heads in 10 coin flips, compute the posterior distribution and the posterior mean.
Difficulty: Medium
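The conjugate Beta–Binomial update for these numbers, kept exact with rationals: the posterior is Beta$(2+7,\, 2+3)$ = Beta$(9, 5)$ with mean $9/14$:

```python
# Conjugate Beta-Binomial update: posterior = Beta(a0 + heads, b0 + tails).
from fractions import Fraction

a0, b0 = 2, 2          # Beta(2, 2) prior
heads, tails = 7, 3    # 7 heads in 10 flips

a_post, b_post = a0 + heads, b0 + tails
post_mean = Fraction(a_post, a_post + b_post)

assert (a_post, b_post) == (9, 5)
assert post_mean == Fraction(9, 14)
```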
Problem 4.2
For a Gaussian likelihood $x_i \sim \mathcal{N}(\mu, \sigma^2)$ with known variance $\sigma^2$ and a Gaussian prior $\mu \sim \mathcal{N}(\mu_0, \tau_0^2)$, derive the posterior distribution of $\mu$ given $n$ observations.
Difficulty: Hard
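A sketch of the closed-form answer, posterior precision $\frac{1}{\tau_0^2} + \frac{n}{\sigma^2}$ with precision-weighted mean, checked against the unnormalized prior-times-likelihood (data and hyperparameters below are assumed examples):

```python
# Closed-form Normal-Normal posterior vs. unnormalized prior x likelihood.
import math, random

random.seed(3)
sigma2, mu0, tau0_2 = 4.0, 0.0, 1.0
data = [random.gauss(2.0, math.sqrt(sigma2)) for _ in range(50)]
n, xbar = len(data), sum(data) / len(data)

post_prec = 1.0 / tau0_2 + n / sigma2
post_var = 1.0 / post_prec
post_mean = post_var * (mu0 / tau0_2 + n * xbar / sigma2)

def log_unnorm(mu):
    # log prior + log likelihood, dropping mu-free constants
    return (-(mu - mu0) ** 2 / (2 * tau0_2)
            - sum((x - mu) ** 2 for x in data) / (2 * sigma2))

# the log posterior is quadratic; differences from the mode must match
# the claimed Gaussian log density exactly
for mu in (0.5, 1.0, 2.5):
    lhs = log_unnorm(mu) - log_unnorm(post_mean)
    rhs = -(mu - post_mean) ** 2 / (2 * post_var)
    assert abs(lhs - rhs) < 1e-8
```

Comparing log-density differences sidesteps the normalizing constant, which is exactly the trick used in the derivation itself.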
Problem 4.3
Explain why the posterior mean is always between the prior mean and the MLE, and show this algebraically for the Normal-Normal conjugate model.
Difficulty: Medium
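The algebraic core of the argument: in the Normal–Normal model the posterior mean is the convex combination $w\,\bar{x} + (1-w)\,\mu_0$ with $w = \frac{n/\sigma^2}{n/\sigma^2 + 1/\tau_0^2} \in (0, 1)$. A sketch with assumed example values:

```python
# Posterior mean is a convex combination of the prior mean and the MLE.
sigma2, tau0_2, mu0 = 2.0, 0.5, -1.0

for n, xbar in [(1, 3.0), (10, 3.0), (100, -4.0)]:
    w = (n / sigma2) / (n / sigma2 + 1.0 / tau0_2)
    post_mean = w * xbar + (1 - w) * mu0
    assert 0.0 < w < 1.0
    assert min(mu0, xbar) <= post_mean <= max(mu0, xbar)
```

As $n$ grows, $w \to 1$ and the posterior mean shrinks toward the MLE $\bar{x}$.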
Challenge Problems
Problem 5.1
Derive the posterior predictive distribution for the Normal-Normal model (Gaussian likelihood with Gaussian prior on the mean).
Difficulty: Very Hard
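A Monte Carlo sketch of the target result: if the posterior is $\mu \mid x_{1:n} \sim \mathcal{N}(\mu_n, \tau_n^2)$, the posterior predictive is $\mathcal{N}(\mu_n,\, \sigma^2 + \tau_n^2)$. Sampling $\mu$ from the posterior and then $x$ from the likelihood should reproduce that mean and variance (the hyperparameters below are assumed examples):

```python
# Posterior predictive variance = likelihood variance + posterior variance.
import math, random

random.seed(11)
sigma2, mu_n, tau_n2 = 1.5, 0.8, 0.4  # posterior N(mu_n, tau_n2) taken as given

draws = []
for _ in range(400_000):
    mu = random.gauss(mu_n, math.sqrt(tau_n2))      # draw mu ~ posterior
    draws.append(random.gauss(mu, math.sqrt(sigma2)))  # then x ~ likelihood

mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)

assert abs(mean - mu_n) < 0.01
assert abs(var - (sigma2 + tau_n2)) < 0.02
```

The two-stage sampling mirrors the derivation: integrate the likelihood against the posterior, and the variances add because the stages are independent.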
Solutions
Solutions are available in the implementation file with verification code.