Week 1
Prove the Addition Theorem: \(P(A \cup B) + P(A \cap B) = P(A) + P(B)\).
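Before attempting the proof, it can help to see the identity hold exactly on a small finite sample space. A minimal sanity check (not a proof), with a fair die and two illustrative events chosen here:

```python
from fractions import Fraction

# Fair six-sided die; A = "even", B = "at least 4" (illustrative events)
omega = range(1, 7)
A = {2, 4, 6}
B = {4, 5, 6}

def prob(event):
    # Equally likely outcomes: P(E) = |E| / |Omega|
    return Fraction(len(event), len(omega))

lhs = prob(A | B) + prob(A & B)
rhs = prob(A) + prob(B)
assert lhs == rhs  # P(A ∪ B) + P(A ∩ B) = P(A) + P(B)
```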
Week 2
Prove Bayes' Theorem for evidence \(E\) and a partition of hypotheses \(H_1, \ldots, H_n\).
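A worked numeric instance is useful before the general proof. The sketch below uses a made-up three-hypothesis partition (three machines producing a defective part); the priors and likelihoods are illustrative, not from the course:

```python
from fractions import Fraction

# Hypothetical partition H1..H3 (e.g. three machines), E = "part is defective".
prior = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]       # P(H_i)
like  = [Fraction(1, 100), Fraction(2, 100), Fraction(5, 100)]  # P(E | H_i)

# Total probability: P(E) = sum_i P(E | H_i) P(H_i)
evidence = sum(p * l for p, l in zip(prior, like))

# Bayes: P(H_i | E) = P(E | H_i) P(H_i) / P(E)
posterior = [p * l / evidence for p, l in zip(prior, like)]

assert sum(posterior) == 1  # posteriors over a partition sum to one
```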
Week 3
Prove the Binomial Theorem, the Borel–Cantelli Lemmas, and Boole's inequality for countably infinite unions.
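For the Binomial Theorem, an exact integer spot-check makes the identity concrete (the values of \(a, b, n\) below are arbitrary):

```python
from math import comb

# (a + b)^n = sum_{k=0}^{n} C(n, k) a^k b^(n-k), checked exactly in integers
a, b, n = 3, 5, 10
lhs = (a + b) ** n
rhs = sum(comb(n, k) * a**k * b**(n - k) for k in range(n + 1))
assert lhs == rhs
```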
Week 4
Prove the Poisson Law: \(b(n, p) \rightarrow P(\lambda)\) as \(n \to \infty\) and \(p \to 0\) with \(\lambda = np\) held fixed.
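A quick numeric check of the limit (not a proof): the worst-case pointwise gap between the \(b(n, \lambda/n)\) pmf and the \(P(\lambda)\) pmf shrinks as \(n\) grows. The choice \(\lambda = 3\) is illustrative:

```python
from math import comb, exp, factorial

lam = 3.0

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)  # comb returns 0 when k > n

def poisson_pmf(lam, k):
    return exp(-lam) * lam**k / factorial(k)

def max_err(n):
    # Largest pointwise pmf gap over the first 30 support points
    return max(abs(binom_pmf(n, lam / n, k) - poisson_pmf(lam, k))
               for k in range(30))

# The approximation improves monotonically in these n values
assert max_err(10_000) < max_err(100) < max_err(10)
```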
Week 5
Prove that a beta prior \(h(\theta) = \text{Beta}(\alpha, \beta)\) is conjugate to a binomial likelihood \(g(x \mid \theta) = b(n, \theta)\): \(f(\theta \mid x) = \text{Beta}(\alpha + x, \beta + n - x)\).
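Conjugacy can be checked numerically before the proof: the unnormalized posterior \(h(\theta)\,g(x \mid \theta)\) should be a constant multiple of the claimed \(\text{Beta}(\alpha + x, \beta + n - x)\) density at every \(\theta\). The hyperparameters and data below are illustrative:

```python
from math import comb, gamma

def beta_pdf(t, a, b):
    return gamma(a + b) / (gamma(a) * gamma(b)) * t**(a - 1) * (1 - t)**(b - 1)

def binom_lik(x, n, t):
    return comb(n, x) * t**x * (1 - t)**(n - x)

a, b, n, x = 2.0, 3.0, 10, 4  # illustrative prior hyperparameters and data

# prior * likelihood vs. claimed posterior, ratio taken on a theta grid
grid = [i / 100 for i in range(1, 100)]
ratios = [beta_pdf(t, a, b) * binom_lik(x, n, t) / beta_pdf(t, a + x, b + n - x)
          for t in grid]

# A constant ratio across theta means identical shape, i.e. conjugacy
assert max(ratios) - min(ratios) < 1e-9
```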
Week 6
Derive all the BEG-CUP pdf moments: binomial, geometric, hypergeometric, negative binomial, Poisson, gamma, exponential, chi-square, beta, uniform, Gaussian.
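Each derived moment can be spot-checked by summing the pmf directly. One example for the geometric distribution (support assumed here to start at \(k = 1\); adjust if the sheet uses \(k = 0\)):

```python
# Geometric on k = 1, 2, ... with success probability p:
# the sheet's moments should be E[X] = 1/p and V[X] = (1-p)/p^2.
p = 0.3
ks = range(1, 2000)  # tail beyond 2000 is negligible for p = 0.3
pmf = [(1 - p) ** (k - 1) * p for k in ks]

mean = sum(k * pk for k, pk in zip(ks, pmf))
var = sum((k - mean) ** 2 * pk for k, pk in zip(ks, pmf))

assert abs(mean - 1 / p) < 1e-9
assert abs(var - (1 - p) / p**2) < 1e-6
```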
Week 7
Prove the sampling Uncertainty Principle: \(s_{XY}^2 \le s_X^2 s_Y^2\). Prove the population Uncertainty Principle: \(\sigma_{XY}^2 \le \sigma_X^2 \sigma_Y^2\).
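Both inequalities are instances of Cauchy–Schwarz, and the sample version can be checked on any dataset. A minimal check on simulated data (the data-generating choices are illustrative):

```python
import random

random.seed(0)
n = 50
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.5 * xi + random.gauss(0, 1) for xi in x]  # correlated with x

mx, my = sum(x) / n, sum(y) / n
s_xx = sum((xi - mx) ** 2 for xi in x) / (n - 1)          # sample variance of X
s_yy = sum((yi - my) ** 2 for yi in y) / (n - 1)          # sample variance of Y
s_xy = sum((xi - mx) * (yi - my)
           for xi, yi in zip(x, y)) / (n - 1)             # sample covariance

# Cauchy-Schwarz: squared covariance never exceeds the product of variances
assert s_xy ** 2 <= s_xx * s_yy
```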
Week 8
State the formal definition for all UC-MOPED convergences. Prove the Weak Law of Large Numbers: \(\overline{X}_n \xrightarrow{p} \mu_X\). Show that the sample variance is unbiased and consistent: \(E[S_X^2(n)] = \sigma_X^2\) for all \(n\) and \(S_X^2(n) \xrightarrow{p} \sigma_X^2\).
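The WLLN statement \(\overline{X}_n \xrightarrow{p} \mu_X\) can be illustrated by simulation: the fraction of runs where \(|\overline{X}_n - \mu| \ge \varepsilon\) should shrink as \(n\) grows. A minimal sketch with Uniform(0, 1) draws (the distribution, \(\varepsilon\), and trial counts are illustrative):

```python
import random

random.seed(1)
mu = 0.5  # true mean of Uniform(0, 1)

def frac_outside(n, eps=0.05, trials=2000):
    # Empirical estimate of P(|X̄_n - mu| >= eps)
    bad = 0
    for _ in range(trials):
        xbar = sum(random.random() for _ in range(n)) / n
        if abs(xbar - mu) >= eps:
            bad += 1
    return bad / trials

# Convergence in probability: the exceedance probability vanishes as n grows
assert frac_outside(400) < frac_outside(25)
```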
Week 9
Derive the population mean and variance of a Doubly Random Sum \(\overline{X}_N\): \(E[\overline{X}_N] = E_N[N] \mu_X\) and \(V[\overline{X}_N] = E_N[N] \sigma_X^2 + V_N[N] \mu_X^2\).
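Both formulas can be checked by Monte Carlo. In the sketch below the count \(N\) is taken Poisson and the summands Gaussian; these are illustrative choices, not part of the exercise (for Poisson \(N\), \(E[N] = V[N] = \lambda\)):

```python
import math
import random

random.seed(2)
lam, mu_x, var_x = 4.0, 2.0, 1.0  # N ~ Poisson(lam); X_i ~ N(mu_x, var_x)

def poisson(lam):
    # Knuth's multiplication method; fine for small lam
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

trials = 20_000
sums = []
for _ in range(trials):
    n = poisson(lam)
    sums.append(sum(random.gauss(mu_x, var_x ** 0.5) for _ in range(n)))

m = sum(sums) / trials
v = sum((s - m) ** 2 for s in sums) / (trials - 1)

# E = E[N]*mu_X = 8,  V = E[N]*sigma_X^2 + V[N]*mu_X^2 = 4 + 16 = 20
assert abs(m - lam * mu_x) < 0.1
assert abs(v - (lam * var_x + lam * mu_x ** 2)) < 1.0
```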
Week 10
Derive the pdf for \(Y = g(X)\) if \(g(x)\) is one-to-one. Prove that Conditioning Reduces Entropy: \(H(Y \mid X) \le H(Y)\).
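The entropy inequality can be verified exactly on any small joint pmf before proving it in general. The \(2 \times 2\) joint pmf below is made up for illustration:

```python
from math import log2

# An illustrative joint pmf p(x, y) on {0,1} x {0,1}
p = {(0, 0): 0.30, (0, 1): 0.20, (1, 0): 0.10, (1, 1): 0.40}

p_x = {x: sum(v for (xx, _), v in p.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(v for (_, yy), v in p.items() if yy == y) for y in (0, 1)}

# H(Y) and H(Y|X) = -sum_{x,y} p(x,y) log2( p(x,y)/p(x) )
H_y = -sum(v * log2(v) for v in p_y.values())
H_y_given_x = -sum(v * log2(v / p_x[x]) for (x, _), v in p.items())

assert H_y_given_x <= H_y  # conditioning never increases entropy
```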
Week 11
Prove the Central Limit Theorem: the standardized sample mean \(Z_n = \text{STD}(\overline{X}_n)\) converges in distribution to \(Z \sim N(0,1)\). Derive all MGFs on the BEG-CUP pdf sheet.
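Before the MGF proof, the CLT can be seen numerically: standardized means of Uniform(0, 1) draws should land inside \(\pm 1\) about as often as a standard normal does (\(\approx 0.6827\)). The sample sizes below are illustrative:

```python
import random

random.seed(3)
n, trials = 40, 5000
mu, sigma = 0.5, (1 / 12) ** 0.5  # mean and sd of Uniform(0, 1)

z = []
for _ in range(trials):
    xbar = sum(random.random() for _ in range(n)) / n
    z.append((xbar - mu) / (sigma / n ** 0.5))  # standardized sample mean

within_1 = sum(abs(zi) <= 1 for zi in z) / trials
# For Z ~ N(0, 1), P(|Z| <= 1) ≈ 0.6827
assert abs(within_1 - 0.6827) < 0.03
```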
Week 12
Derive the system population mean \(E[N]\) and \(V[N]\) from the Global Balance Equations of the \(M / M / 1\) queue.
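Once global balance gives the geometric stationary distribution \(p_n = (1 - \rho)\rho^n\) with \(\rho = \lambda/\mu < 1\), the closed-form moments can be checked by direct summation. The arrival and service rates below are illustrative:

```python
lam, mu = 3.0, 5.0
rho = lam / mu  # utilization; must be < 1 for stability

# Stationary distribution from global balance: p_n = (1 - rho) * rho^n
p = [(1 - rho) * rho ** n for n in range(2000)]  # tail beyond 2000 negligible

EN = sum(n * pn for n, pn in enumerate(p))
VN = sum((n - EN) ** 2 * pn for n, pn in enumerate(p))

# Closed forms: E[N] = rho/(1 - rho),  V[N] = rho/(1 - rho)^2
assert abs(EN - rho / (1 - rho)) < 1e-9
assert abs(VN - rho / (1 - rho) ** 2) < 1e-9
```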
Week 14
Prove that the Ordinary Least Squares estimator \(\hat{\beta}^{\text{OLS}}\) is linear in \(Y\) if \(Y = X \beta + \epsilon\) and \(\epsilon \sim N(\mathbf{0}, \sigma^2 \mathbf{I})\): \(\hat{\beta}^{\text{OLS}} = (X^T X)^{-1} X^T Y\). Show that \(\hat{\beta}^{\text{OLS}} \sim N(\beta, \sigma^2 (X^T X)^{-1})\).
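A concrete instance helps before the matrix proof: simple linear regression is the special case \(X = [\mathbf{1}, x]\), and the normal equations \(X^T(Y - X\hat{\beta}) = \mathbf{0}\) say the residuals are orthogonal to every column of \(X\). The data below are made up, roughly following \(y = 2x\):

```python
# Simple linear regression y = b0 + b1*x + eps via the normal equations,
# a special case of beta_hat = (X^T X)^{-1} X^T Y with X = [1, x].
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]  # illustrative data, roughly y = 2x

n = len(x)
mx, my = sum(x) / n, sum(y) / n
b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
      / sum((xi - mx) ** 2 for xi in x))
b0 = my - b1 * mx

# Normal equations: residuals orthogonal to each column of X
r = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
assert abs(sum(r)) < 1e-9                                  # column of ones
assert abs(sum(ri * xi for ri, xi in zip(r, x))) < 1e-9    # column x
```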