Random variables; probability, probability density and cumulative distribution functions; expected value and variance of a random variable. Bernoulli trials; binomial, Poisson, geometric, hypergeometric and negative binomial distributions, and their inter-relationships. Poisson process. Probability generating functions.
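A minimal numerical sketch (illustrative only, not part of the syllabus) of one of the inter-relationships listed above: the Poisson approximation to the binomial, Bin(n, lambda/n) approaching Poisson(lambda) as n grows. The use of numpy and scipy is an assumption made for the illustration.

# Poisson approximation to the binomial: Bin(n, lam/n) approaches Poisson(lam) as n grows.
import numpy as np
from scipy import stats

lam = 3.0
ks = np.arange(0, 15)
for n in (10, 100, 1000):
    binom_pmf = stats.binom.pmf(ks, n, lam / n)      # Bin(n, lam/n) probabilities
    poisson_pmf = stats.poisson.pmf(ks, lam)         # Poisson(lam) probabilities
    max_err = np.max(np.abs(binom_pmf - poisson_pmf))
    print(f"n={n:5d}  max |Bin - Poisson| over k=0..14: {max_err:.5f}")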
Moment and cumulant generating functions; exponential, gamma, normal, lognormal, uniform, Cauchy and beta distributions.
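A worked example of the moment generating function technique, for the exponential distribution (a standard result, included here only as an illustration):

For $X \sim \mathrm{Exp}(\lambda)$,
\[
  M_X(t) = \mathbb{E}\left[e^{tX}\right]
         = \int_0^\infty e^{tx}\,\lambda e^{-\lambda x}\,dx
         = \frac{\lambda}{\lambda - t}, \qquad t < \lambda,
\]
so $\mathbb{E}[X] = M_X'(0) = 1/\lambda$ and
$\operatorname{Var}(X) = M_X''(0) - M_X'(0)^2 = 2/\lambda^2 - 1/\lambda^2 = 1/\lambda^2$.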
Joint distributions; conditional distributions; independence; conditional expectations. Covariance, correlation.
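For reference, the covariance and correlation named above are (standard definitions, shown only as a reminder):
\[
  \operatorname{Cov}(X,Y) = \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y],
  \qquad
  \rho_{X,Y} = \frac{\operatorname{Cov}(X,Y)}{\sqrt{\operatorname{Var}(X)\,\operatorname{Var}(Y)}},
\]
with $\operatorname{Cov}(X,Y) = 0$ whenever $X$ and $Y$ are independent (the converse fails in general).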
Distributions of functions of random variables, including sums, means, products and ratios. Transformations of random variables; simulating observations from standard distributions; use of Jacobians; marginal distributions.
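A minimal sketch (illustrative only) of simulating observations from a standard distribution by the inverse transform method, here for the exponential distribution; numpy is an assumption made for the illustration.

# Inverse-transform sampling: if U ~ Uniform(0,1) and F is a continuous cdf,
# then F^{-1}(U) has cdf F.  For Exp(lam), F^{-1}(u) = -log(1-u)/lam.
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0
u = rng.uniform(size=100_000)
x = -np.log1p(-u) / lam          # log1p(-u) = log(1-u), computed stably

print("sample mean    :", x.mean(), "(theory 1/lam =", 1 / lam, ")")
print("sample variance:", x.var(), "(theory 1/lam^2 =", 1 / lam**2, ")")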
Proof of the Central Limit Theorem. Derivation of the chi-squared, t and F distributions, and their uses. Distributions of the sample mean and sample variance.
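A small simulation (illustrative only, assuming numpy and scipy) comparing standardised means of Uniform(0,1) samples with the standard normal cdf, as the Central Limit Theorem predicts.

# Standardised means of n i.i.d. Uniform(0,1) observations vs. the N(0,1) cdf.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, reps = 30, 50_000
samples = rng.uniform(size=(reps, n))
z = (samples.mean(axis=1) - 0.5) / np.sqrt(1 / (12 * n))   # Uniform(0,1): mean 1/2, variance 1/12

for q in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(f"P(Z <= {q:+.0f}): empirical {np.mean(z <= q):.4f}   normal {stats.norm.cdf(q):.4f}")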
Estimation: method of moments and maximum likelihood; efficiency, bias, consistency and mean square error; unbiasedness and asymptotic properties of estimators. Confidence intervals for one and two samples for means and variances of normal distributions. Use of paired data.
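A brief sketch (illustrative only, using simulated data; numpy and scipy are assumptions) of maximum likelihood and method of moments estimation for the exponential distribution, and a 95% t-based confidence interval for a normal mean.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Exp(lambda): both maximum likelihood and the method of moments give lambda_hat = 1 / sample mean.
x = rng.exponential(scale=1 / 3.0, size=500)      # simulated data, true lambda = 3
print("lambda_hat (MLE = MoM):", 1 / x.mean())

# Normal mean: xbar +/- t_{n-1, 0.975} * s / sqrt(n).
y = rng.normal(loc=10.0, scale=2.0, size=25)
n, xbar, s = len(y), y.mean(), y.std(ddof=1)
half = stats.t.ppf(0.975, df=n - 1) * s / np.sqrt(n)
print("95% CI for the mean:", (xbar - half, xbar + half))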
Introduction to statistical inference; hypothesis testing: significance level, power and likelihood ratios, particularly demonstrating uses of the chi-squared, t and F distributions; Bayesian inference.
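An illustrative sketch (not part of the syllabus; numpy and scipy assumed) of a one-sample t-test and a Monte Carlo estimate of its power at a given alternative, using a 5% significance level.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Test H0: mu = 0 against a two-sided alternative with n = 20 observations.
y = rng.normal(loc=0.5, scale=1.0, size=20)
t_stat, p_value = stats.ttest_1samp(y, popmean=0.0)
print("t statistic:", t_stat, " p-value:", p_value)

# Power at mu = 0.5: proportion of simulated datasets in which H0 is rejected at the 5% level.
reps, rejections = 5_000, 0
for _ in range(reps):
    sim = rng.normal(loc=0.5, scale=1.0, size=20)
    rejections += stats.ttest_1samp(sim, popmean=0.0).pvalue < 0.05
print("estimated power at mu = 0.5:", rejections / reps)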
Multivariate distributions and moment generating function; multinomial distribution; bivariate normal distribution; correlation. Distributions of maximum and minimum observations.
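The distributions of the extreme observations named above follow directly from independence (standard result, shown as a reminder): for $X_1,\dots,X_n$ i.i.d. with cdf $F$,
\[
  P\left(\max_i X_i \le x\right) = F(x)^n,
  \qquad
  P\left(\min_i X_i \le x\right) = 1 - \bigl(1 - F(x)\bigr)^n .
\]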
Compound distributions: conditional expectations; obtaining the mean and variance of a random variable from conditional expectations via the laws of total expectation and total variance.
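The conditioning identities behind this item are the laws of total expectation and total variance (standard results, included as a reminder):
\[
  \mathbb{E}[X] = \mathbb{E}\bigl[\mathbb{E}[X \mid Y]\bigr],
  \qquad
  \operatorname{Var}(X) = \mathbb{E}\bigl[\operatorname{Var}(X \mid Y)\bigr]
                        + \operatorname{Var}\bigl(\mathbb{E}[X \mid Y]\bigr).
\]
For a compound sum $S = X_1 + \dots + X_N$ with $N$ independent of the i.i.d. $X_i$, these give
$\mathbb{E}[S] = \mathbb{E}[N]\,\mathbb{E}[X_1]$ and
$\operatorname{Var}(S) = \mathbb{E}[N]\operatorname{Var}(X_1) + \operatorname{Var}(N)\,\mathbb{E}[X_1]^2$.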