|Unit level:||Level 4|
|Teaching period(s):||Semester 1|
|Offered by||School of Mathematics|
|Available as a free choice unit?:||N|
- MATH20701 - Probability 2 (Compulsory)
- MATH20802 - Statistical Methods (Compulsory)
Additional Requirements: MATH48001 pre-requisites
Students are not permitted to take more than one of MATH38001 or MATH48001 for credit, whether in the same or in different undergraduate years. Students are not permitted to take MATH48001 for credit in an undergraduate programme and then MATH68001 for credit in a postgraduate programme.
This course unit aims to introduce students to the principles of efficient estimation and hypothesis testing and acquaint them with the more successful methods of estimation and of constructing test procedures.
Statistical inference is the body of principles and methods underlying the statistical analysis of data. In this course we introduce desirable properties that good estimators and hypothesis tests should enjoy, and use them as criteria in the development of optimal estimators and test procedures. This is done from both the classical/frequentist and the Bayesian points of view.
On successful completion of this module students will be able to:
- derive Fisher information,
- find the Cramér-Rao lower bound for the variances of unbiased estimators,
- use the Rao-Blackwell theorem to improve an unbiased estimator,
- determine how good an estimator is on a number of criteria,
- formulate estimators and test procedures based both on the maximum likelihood principle and on Bayesian principles,
- state the non-asymptotic and asymptotic properties of the maximum likelihood method of estimation,
- construct confidence intervals for parameters for both finite sample sizes and asymptotically as the sample size tends to infinity,
- conduct generalised likelihood ratio tests.
- Other - 20%
- Written exam - 80%
Assessment Further Information
- Coursework: weighting 20%
- End of semester examination: three hours, weighting 80%
Estimation: point estimation; unbiasedness; mean squared error; consistency; the score function; Fisher information; Cramér-Rao inequality; efficiency; most efficient estimators; sufficiency; factorisation theorem; minimal sufficiency; Rao-Blackwell theorem and its use in improving an estimator.
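As a hedged illustration of the efficiency ideas above (not part of the course materials), the Cramér-Rao machinery can be sketched for a Bernoulli model, where the sample mean attains the bound; the values of p and n below are arbitrary choices:

```python
# Illustrative sketch: for a single observation X ~ Bernoulli(p), the Fisher
# information is I(p) = 1 / (p(1-p)), so the Cramer-Rao lower bound for the
# variance of an unbiased estimator of p from n i.i.d. observations is
# p(1-p)/n. The sample mean attains this bound, i.e. it is efficient.

def fisher_information_bernoulli(p):
    """Fisher information for one Bernoulli(p) observation."""
    return 1.0 / (p * (1.0 - p))

def cramer_rao_bound(p, n):
    """CRLB for the variance of an unbiased estimator of p from n samples."""
    return 1.0 / (n * fisher_information_bernoulli(p))

p, n = 0.3, 50                        # arbitrary toy values
crlb = cramer_rao_bound(p, n)         # p(1-p)/n
var_sample_mean = p * (1 - p) / n     # variance of the sample mean
print(crlb, var_sample_mean)          # equal: the sample mean is efficient
```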
Methods of estimation: maximum likelihood estimators (MLEs) and their asymptotic properties; asymptotic distribution of the score function; confidence intervals based on the MLE and on the score function; restricted MLEs and their asymptotic properties.
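A minimal sketch of an MLE and its Wald-type asymptotic confidence interval, for an exponential model (the data below are invented toy values, assumed for illustration only):

```python
import math

# Sketch: for X_i ~ Exponential(rate lambda), the MLE is lambda_hat = 1/xbar
# and the Fisher information for the sample is I(lambda) = n / lambda^2,
# giving the approximate 95% Wald interval
# lambda_hat +/- 1.96 * lambda_hat / sqrt(n).

data = [0.8, 1.3, 0.4, 2.1, 0.9, 1.6, 0.5, 1.1]   # toy data
n = len(data)
xbar = sum(data) / n
mle = 1.0 / xbar                       # maximum likelihood estimate of lambda
se = mle / math.sqrt(n)                # asymptotic standard error
ci = (mle - 1.96 * se, mle + 1.96 * se)
print(mle, ci)
```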
Hypothesis testing: Wald test; the generalised likelihood ratio test; asymptotic form of the generalised likelihood ratio test; multinomial test; Pearson Chi-squared statistic; the Deviance function (including graphical methods in obtaining confidence regions for parameters). 
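The Pearson and generalised likelihood ratio statistics mentioned above can be sketched for a uniform multinomial null; the observed counts are toy values assumed for illustration:

```python
import math

# Sketch: Pearson chi-squared and generalised likelihood ratio statistics for
# testing a uniform multinomial null. Both are asymptotically chi-squared with
# k - 1 degrees of freedom under H0.

observed = [30, 14, 16]                # toy counts over k = 3 categories
n = sum(observed)
k = len(observed)
expected = [n / k] * k                 # uniform null: equal cell probabilities

pearson = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
glrt = 2 * sum(o * math.log(o / e) for o, e in zip(observed, expected))

# the 95% point of chi-squared with 2 df is about 5.991 (from tables)
print(pearson, glrt, pearson > 5.991, glrt > 5.991)
```

Here both statistics exceed the critical value, so the uniform null would be rejected at the 5% level.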
Bayesian inference: introduction; priors and posteriors; conjugate priors; non-informative priors; Jeffreys' non-informative prior; Bayesian estimation; predictive distributions; accuracy of an estimate; loss functions and expected posterior loss; optimal decisions with respect to a loss function; credible intervals; highest posterior density credible intervals; hypothesis tests; large sample Bayesian approximation.
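The conjugate-prior idea above can be sketched with the standard Beta-Bernoulli update (the prior parameters and data below are toy values, assumed for illustration):

```python
# Sketch: conjugate Bayesian updating for a Bernoulli likelihood with a
# Beta(a, b) prior. After s successes in n trials the posterior is
# Beta(a + s, b + n - s); its mean is the optimal point estimate under
# squared error loss.

def beta_bernoulli_posterior(a, b, successes, n):
    """Posterior Beta parameters after observing s successes in n trials."""
    return a + successes, b + (n - successes)

a, b = 1.0, 1.0                        # Beta(1, 1): a flat prior
s, n = 7, 10                           # toy data: 7 successes in 10 trials
a_post, b_post = beta_bernoulli_posterior(a, b, s, n)
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # 8.0 4.0 0.666...
```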
- Beaumont, G. P., Intermediate Mathematical Statistics, Chapman & Hall, 1980.
- Cox, D. R. and Hinkley, D. V., Theoretical Statistics, Chapman & Hall, 1974.
- Lindgren, B. W., Statistical Theory, 4th edition, Chapman & Hall, 1993.
- Mood, A. M., Graybill, F. A. and Boes, D. C., Introduction to the Theory of Statistics, 3rd edition, McGraw-Hill, 1974.
- Silvey, S. D., Statistical Inference, Chapman & Hall, 1975.
Feedback tutorials provide an opportunity for students' work to be discussed and for feedback to be given on their understanding. Coursework or in-class tests (where applicable) also give students an opportunity to receive feedback. Students can also get feedback on their understanding directly from the lecturer, for example during the lecturer's office hour.
- Lectures - 33 hours
- Tutorials - 11 hours
- Independent study - 106 hours