Section 1: Probability Foundations

Overview

This section covers the core probability theory that underpins all of machine learning: the axioms of probability, random variables, common distributions, and summary statistics such as expectation and variance.

Topics Covered

Chapter 1: Probability Basics

  • Probability axioms and sample spaces
  • Conditional probability and Bayes' theorem
  • Independence and the law of total probability
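To make Bayes' theorem and the law of total probability concrete, here is a minimal sketch in Python. The function name and the diagnostic-test numbers are hypothetical illustration values, not part of the chapter itself:

```python
# Sketch: Bayes' theorem, with the evidence term P(E) expanded
# via the law of total probability. Numbers are hypothetical.

def posterior(prior, likelihood, likelihood_given_not):
    """P(H | E) = P(E | H) P(H) / P(E), where
    P(E) = P(E | H) P(H) + P(E | not H) P(not H)."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Example: a test with 95% sensitivity and a 5% false-positive rate,
# applied to a condition with 1% prior prevalence.
p = posterior(prior=0.01, likelihood=0.95, likelihood_given_not=0.05)
# Despite the accurate test, the posterior is only about 16%,
# because the prior is so small.
```

The example illustrates the classic base-rate effect: updating a small prior with strong evidence still yields a modest posterior.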

Chapter 2: Random Variables

  • Discrete and continuous random variables
  • PMF, PDF, and CDF
  • Transformations of random variables
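The relationship between a PMF and a CDF can be sketched directly from the definitions. As an assumed example (not drawn from the text), here is a Binomial(n, p) PMF with its CDF built as a running sum:

```python
import math

# Sketch: PMF and CDF of a Binomial(n, p) random variable,
# using the definition P(X = k) = C(n, k) p^k (1 - p)^(n - k).

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def binom_cdf(k, n, p):
    # The CDF F(k) = P(X <= k) is the cumulative sum of the PMF.
    return sum(binom_pmf(j, n, p) for j in range(k + 1))

# Sanity checks from the axioms: the PMF sums to 1 over the support,
# and the CDF is nondecreasing with F(n) = 1.
total = sum(binom_pmf(k, 10, 0.3) for k in range(11))
```

For a continuous random variable the sum becomes an integral of the PDF, but the CDF plays the same role in both cases.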

Chapter 3: Common Distributions

  • Bernoulli, Binomial, Poisson
  • Gaussian, Exponential, Gamma, Beta
  • Categorical, Multinomial, Uniform
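One way to build intuition for these distributions is to simulate them and compare sample means against the theoretical expectations. This sketch uses only Python's standard library; the seed and parameter choices are illustrative assumptions:

```python
import random

# Sketch: simulating a few common distributions and checking
# sample means against their theoretical expectations.

random.seed(0)  # fixed seed so the run is reproducible

def bernoulli(p):
    return 1 if random.random() < p else 0

def binomial(n, p):
    # A Binomial(n, p) draw is a sum of n independent Bernoulli(p) draws.
    return sum(bernoulli(p) for _ in range(n))

n_samples = 10_000
mean_binom = sum(binomial(10, 0.3) for _ in range(n_samples)) / n_samples
mean_expo = sum(random.expovariate(2.0) for _ in range(n_samples)) / n_samples
mean_gauss = sum(random.gauss(1.0, 2.0) for _ in range(n_samples)) / n_samples

# Theory: E[Binomial(10, 0.3)] = np = 3, E[Exponential(rate=2)] = 1/2,
# E[Gaussian(mu=1, sigma=2)] = 1. The sample means should land nearby.
```

The Binomial-as-sum-of-Bernoullis construction also previews the linearity of expectation covered in Chapter 4.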

Chapter 4: Expectation and Variance

  • Expectation and its properties
  • Variance, covariance, and correlation
  • Moments and the law of the unconscious statistician
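Expectation, variance, and covariance can all be computed directly from a joint PMF by summing over the support, which is also how the law of the unconscious statistician is applied in the discrete case. The small joint table below is a made-up example:

```python
import math

# Sketch: summary statistics from a hypothetical joint PMF over (x, y).

joint = {  # P(X = x, Y = y); the probabilities sum to 1
    (0, 0): 0.2, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.6,
}

# LOTUS in the discrete case: E[g(X, Y)] = sum over (x, y) of g(x, y) P(x, y).
E_x = sum(p * x for (x, y), p in joint.items())
E_y = sum(p * y for (x, y), p in joint.items())
E_xy = sum(p * x * y for (x, y), p in joint.items())

var_x = sum(p * (x - E_x) ** 2 for (x, y), p in joint.items())
var_y = sum(p * (y - E_y) ** 2 for (x, y), p in joint.items())
cov_xy = E_xy - E_x * E_y  # Cov(X, Y) = E[XY] - E[X] E[Y]
corr = cov_xy / math.sqrt(var_x * var_y)  # correlation in [-1, 1]
```

With these numbers the covariance is positive (X and Y tend to be 1 together), so the correlation is positive as well; if X and Y were independent, `E_xy` would equal `E_x * E_y` and the covariance would be zero.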

Learning Objectives

  • State and apply the axioms of probability
  • Use Bayes' theorem to update beliefs given evidence
  • Work with common discrete and continuous distributions
  • Compute expectations, variances, and covariances