Statistics Examination Syllabus


Distribution Theory

  • Random experiments; sample spaces; event spaces; properties of probability functions; conditional probability, stochastic independence; calculations of probabilities in the special case of finite sample spaces; combinatorial methods; combinations and permutations.
  • Discrete, continuous and mixed random variables and random vectors; distribution functions and density functions; joint distribution functions and joint density functions; marginal distributions; conditional distributions and densities; Bayes' theorem and its distributional form; the law of total probability. Sums and mixtures of random variables; convolution.
  • Expectation operator and its properties; moments: mean, variance, covariance, correlation, higher moments; Chebyshev's Inequality; Jensen's Inequality; Cauchy-Schwarz Inequality.
  • Moment-generating functions and other generating functions, uniqueness, characterization theorem, properties, examples and applications.
  • Special distributions: uniform, Bernoulli, binomial, hypergeometric, Poisson, geometric, negative binomial, exponential, gamma, beta, normal, lognormal, Pareto, Weibull, multinomial, bivariate normal; exponential family of distributions; computing probabilities and moments for these distributions, their properties and relationships.
  • Finding distributions of transformed random vectors; moment-generating function method, distribution-function method, and the transformation theorem; examples; the probability integral transformation; means, variances, and covariances of linear combinations of random variables.
  • Random samples, i.e. independent and identically distributed random variables; sample statistics and their exact distributions; distributions of sample statistics when a random sample is drawn from a normal population; chi-square distribution; (Fisher) F distribution; (Student) t distribution.
  • Asymptotics; limiting properties of sample statistics.
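
Several of the topics above (random samples, sample statistics, limiting behaviour) can be illustrated with a short simulation. The following Python sketch uses only the standard library, with arbitrary illustrative parameters, to check that the sample mean of i.i.d. Exponential(1) draws behaves as the CLT and the variance-of-a-linear-combination formula predict:

```python
import random
import statistics

random.seed(0)
n, reps = 50, 2000

# Draw `reps` independent samples of size n from an Exponential(1)
# population (mean 1, variance 1) and record each sample mean.
means = [statistics.fmean(random.expovariate(1.0) for _ in range(n))
         for _ in range(reps)]

# By the CLT the sample mean is approximately Normal(mu, sigma^2/n),
# so the simulated means should average about 1 with variance about 1/50.
print(statistics.fmean(means))     # close to the population mean, 1
print(statistics.variance(means))  # close to sigma^2 / n = 0.02
```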

Statistical Inference

  • Elements of a statistical model: family of distribution functions; random sample; parameters of interest; type of inference desired (point estimation, interval estimation, or hypothesis testing).
  • Point estimation; methods of generating estimators: maximum likelihood, method of moments. Evaluating estimators: unbiasedness and asymptotic unbiasedness, consistency, variance, mean square error; Cramér-Rao lower bound; uniform minimum variance unbiased estimators (UMVUE), efficiency, superefficiency, sufficiency, and completeness. Interval estimation: constructing confidence intervals, pivotal quantity method; interpretation of a confidence interval; large- and small-sample confidence intervals; choosing the sample size.
  • Testing hypotheses; Type I and Type II error; power function of a test; level of significance and p-values; Neyman-Pearson fundamental lemma; tests in the exponential family; likelihood ratio tests. Tests arising from the family of normal distributions: t-tests, chi-square tests, and F-tests. Large sample approximations; Chi-square tests of independence and goodness-of-fit. Exact binomial tests. Kolmogorov-Smirnov goodness-of-fit test.
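
The pivotal quantity method and the interpretation of a confidence interval can both be seen in a small coverage simulation. This sketch (standard library only; the parameters mu, sigma, and n are arbitrary) builds the known-variance interval for a normal mean and checks its long-run coverage:

```python
import math
import random
import statistics

random.seed(1)
mu, sigma, n, reps = 0.0, 1.0, 25, 2000
z = 1.96  # standard normal quantile for 95% confidence

# The pivotal quantity (xbar - mu) / (sigma / sqrt(n)) is standard normal,
# giving the interval xbar +/- z * sigma / sqrt(n) when sigma is known.
covered = 0
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = statistics.fmean(xs)
    half = z * sigma / math.sqrt(n)
    if xbar - half <= mu <= xbar + half:
        covered += 1

print(covered / reps)  # empirical coverage, close to the nominal 0.95
```

The interpretation being tested is the frequentist one: roughly 95% of intervals constructed this way capture the fixed true mean.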

Mathematical Statistics

  • Types of convergence and their relations
  • Continuity theorem for moment-generating functions
  • Khinchin's Theorem and Slutsky's Theorem
  • CLT for the non-i.i.d. case
  • Limiting distributions of the sample variance and sample standard deviation
  • Consistency of the MLE
  • CAN (consistent and asymptotically normal) estimators
  • Asymptotic properties of sample quantiles
  • Minimal sufficient statistics and complete statistics
  • Exponential family
  • Basu's Theorem
  • Rao-Blackwell Theorem
  • Neyman-Pearson lemma and UMP tests for composite alternatives
  • Decision theory
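
Consistency of the MLE, one of the topics above, is easy to see numerically. In an Exponential(lambda) model the MLE of the rate is 1/xbar; this sketch (standard library only; the rate 2.0 and sample sizes are arbitrary illustrative choices) shows the estimate settling near the true value as n grows:

```python
import random
import statistics

random.seed(2)
lam = 2.0  # true rate of the Exponential(lam) population

# The MLE of the rate from an i.i.d. exponential sample is 1 / xbar.
# Consistency means the estimate converges to lam as n grows.
estimates = {}
for n in (100, 10_000):
    xs = [random.expovariate(lam) for _ in range(n)]
    estimates[n] = 1.0 / statistics.fmean(xs)

for n, est in estimates.items():
    print(n, est)  # estimates near lam, typically tighter for larger n
```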

Regression and Linear Models

  • Multivariate normal distribution.
  • Distribution of quadratic forms.
  • Least squares estimation; properties of least squares estimators. Simple and multiple linear regression models; tests for significance of regression and individual regression coefficients; interval estimation of model parameters; interval estimation of the mean response; prediction of new observations. Residual analysis and model adequacy checking. Lack-of-fit. Variance stabilizing transformations. Transformations to linearize the model. Measures of influence. Multicollinearity. Polynomial regression.
  • Decomposition of the total sum of squares and distributional properties of its components. Analysis of variance, analysis of variance with two or more predictors, generalized linear models, analysis of deviance as a generalization of the analysis of variance. General linear hypothesis; correlation. Remedial measures and transformations for departures from normality or non-constant variance.
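
The least-squares estimation in the bullets above reduces, in the simple linear regression case, to two closed-form formulas. This sketch computes the slope and intercept from the normal equations on a small synthetic data set (the numbers are hypothetical, chosen only to make the arithmetic visible):

```python
import statistics

# Simple linear regression y ≈ b0 + b1 * x by least squares.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

xbar, ybar = statistics.fmean(x), statistics.fmean(y)
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

b1 = sxy / sxx          # slope estimate: Sxy / Sxx
b0 = ybar - b1 * xbar   # intercept estimate

# Residual sum of squares, the quantity least squares minimizes.
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
print(b0, b1, sse)
```

The same quantities feed the decomposition of the total sum of squares: SST = SSR + SSE, with `sse` above as the error component.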


References

  • Edward J. Dudewicz and Satya N. Mishra, Modern Mathematical Statistics, Wiley, 1988.
  • George Casella and Roger L. Berger, Statistical Inference, 2nd ed., Duxbury, 2002.
  • E. L. Lehmann and George Casella, Theory of Point Estimation, 2nd ed., Springer, 1998.
  • B. Jorgensen, The Theory of Linear Models, Chapman & Hall, 1993.
  • Alvin C. Rencher, Linear Models in Statistics, 2nd ed., Wiley, 2008.