|Unit level:||Level 4|
|Teaching period(s):||Semester 1|
|Offered by:||School of Mathematics|
|Available as a free choice unit?:||N|
To introduce the student to computational statistics, covering both the underlying theory and its practical application.
Computers are an invaluable tool for the modern statistician. The increasing power of computers has greatly widened the scope of inferential methods and the types of model that can be analysed. This has led to the development of a number of computationally intensive statistical methods, many of which will be introduced in this course.
On successful completion of this course unit students will be able to:
- construct algorithms to simulate random observations from probability distributions using a variety of methods and explain mathematically why they work;
- construct and derive the statistical properties of Monte Carlo estimators, as well as alternatives which seek to reduce variance;
- apply the bootstrap and jackknife to assign measures of accuracy to sample estimates and to derive their statistical properties analytically in some simple cases;
- recognise a non-linear regression model and be able to formulate the Gauss-Newton algorithm to find the parameter estimates from data;
- use the EM algorithm to find maximum likelihood estimators of parameters in some given contexts when we have incomplete sample information;
- implement the methodology discussed in the module (and also carry out simple simulation studies) on a computer using the statistical software R, and present the results of computations informatively and discursively.
- Other - 50%
- Written exam - 50%
Further information on assessment
- Three pieces of coursework: 50%
- End of semester written examination (2 hours): 50%
- Introduction 
- Simulating random variables: inversion of the CDF; rejection sampling; transformations; ratio of uniforms.
- Monte Carlo integration 
- Variance Reduction: importance sampling; control variates. 
- Nonparametric bootstrap methods; the Jackknife. 
- Nonlinear regression: model specification; least squares estimation; Gauss-Newton algorithm. 
- EM algorithm: data augmentation; the multinomial model; mixture distributions; censored data; Monte Carlo EM.
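As a concrete illustration of the first syllabus item (not part of the course materials, and sketched in Python rather than the course's R; all function names are illustrative), here are the two most basic simulation techniques: inversion of the CDF and rejection sampling.

```python
import math
import random

def sample_exponential(rate, u=None):
    """Inversion of the CDF: if U ~ Uniform(0,1), then
    -log(1 - U) / rate has an Exponential(rate) distribution,
    because that expression is the inverse of the exponential CDF."""
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / rate

def sample_halfnormal_by_rejection():
    """Rejection sampling: propose from an Exponential(1) density g and
    accept with probability f(x) / (M * g(x)), where f is the half-normal
    density and M = sqrt(2e/pi) bounds f/g everywhere."""
    M = math.sqrt(2.0 * math.e / math.pi)
    while True:
        x = sample_exponential(1.0)
        f = math.sqrt(2.0 / math.pi) * math.exp(-x * x / 2.0)
        g = math.exp(-x)
        if random.random() <= f / (M * g):
            return x
```

The accepted draws have exactly the target density; the expected number of proposals per accepted draw is M, so a tight bound M makes the sampler efficient.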
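The Monte Carlo integration and variance reduction items can likewise be sketched on a toy integral, I = ∫₀¹ eˣ dx = e − 1, using a control variate (again an illustrative Python sketch, not course code):

```python
import math
import random

def mc_plain(n, rng):
    # Plain Monte Carlo: estimate I = integral of exp(x) on [0,1]
    # by the sample mean of exp(U) with U ~ Uniform(0,1).
    return sum(math.exp(rng.random()) for _ in range(n)) / n

def mc_control_variate(n, rng):
    # Control variate: h(U) = U has known mean 1/2 and is highly
    # correlated with exp(U).  Subtracting b * (Ubar - 1/2), with the
    # optimal b estimated from the sample, reduces the estimator's
    # variance without changing its target.
    us = [rng.random() for _ in range(n)]
    fs = [math.exp(u) for u in us]
    fbar = sum(fs) / n
    ubar = sum(us) / n
    cov = sum((f - fbar) * (u - ubar) for f, u in zip(fs, us)) / n
    var = sum((u - ubar) ** 2 for u in us) / n
    b = cov / var
    return fbar - b * (ubar - 0.5)
```

Both estimators are unbiased for I; the control-variate version typically has a far smaller standard error for the same n.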
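The bootstrap and jackknife items amount to two resampling recipes for attaching a standard error to a statistic; a minimal sketch (illustrative Python, not course material):

```python
import math
import random
import statistics

def bootstrap_se(data, stat, n_boot=2000, seed=0):
    """Nonparametric bootstrap: resample the data with replacement,
    recompute the statistic on each resample, and use the standard
    deviation of the replicates as a standard-error estimate."""
    rng = random.Random(seed)
    n = len(data)
    reps = [stat([rng.choice(data) for _ in range(n)]) for _ in range(n_boot)]
    return statistics.stdev(reps)

def jackknife_se(data, stat):
    """Jackknife: recompute the statistic with each observation left out
    in turn; the scaled spread of the leave-one-out values estimates
    the standard error."""
    n = len(data)
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = sum(loo) / n
    return math.sqrt((n - 1) / n * sum((v - mean_loo) ** 2 for v in loo))
```

For the sample mean the jackknife reproduces the usual s/√n formula exactly, which is one of the "simple cases" whose properties can be derived analytically.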
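For the nonlinear regression item, the Gauss-Newton iteration θ ← θ + (JᵀJ)⁻¹Jᵀr can be written out for a two-parameter model y = a·exp(bx); the model choice and function name below are illustrative, not taken from the course:

```python
import math

def gauss_newton_exp(xs, ys, a, b, n_iter=25):
    """Gauss-Newton least squares for the nonlinear model y = a*exp(b*x).
    Each iteration linearises the model about the current (a, b), then
    solves the normal equations (J^T J) delta = J^T r for the update,
    where r is the residual vector and J the Jacobian of the fit."""
    for _ in range(n_iter):
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        ja = [math.exp(b * x) for x in xs]           # df/da
        jb = [a * x * math.exp(b * x) for x in xs]   # df/db
        # Assemble and solve the 2x2 normal equations directly.
        s_aa = sum(v * v for v in ja)
        s_ab = sum(u * v for u, v in zip(ja, jb))
        s_bb = sum(v * v for v in jb)
        t_a = sum(u * v for u, v in zip(ja, r))
        t_b = sum(u * v for u, v in zip(jb, r))
        det = s_aa * s_bb - s_ab * s_ab
        a += (s_bb * t_a - s_ab * t_b) / det
        b += (s_aa * t_b - s_ab * t_a) / det
    return a, b
```

On noise-free data generated from the model, the iteration recovers the true parameters from a reasonable starting value; with noisy data it converges to the least-squares estimates.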
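Finally, the EM item's "multinomial model" is usually the textbook genetic-linkage example with cell probabilities (1/2 + t/4, (1−t)/4, (1−t)/4, t/4); a short sketch of the E- and M-steps (illustrative Python, not course code):

```python
def em_multinomial(counts, theta=0.5, n_iter=50):
    """EM for the linkage multinomial with cell probabilities
    (1/2 + t/4, (1-t)/4, (1-t)/4, t/4).  The first cell is a mixture of
    a 1/2 component and a t/4 component; the E-step imputes how many of
    its counts belong to the t/4 component (data augmentation), and the
    M-step is then a closed-form binomial-type MLE."""
    x1, x2, x3, x4 = counts
    for _ in range(n_iter):
        # E-step: expected number of x1 observations from the t/4 part.
        e1 = x1 * (theta / 4.0) / (0.5 + theta / 4.0)
        # M-step: maximise the complete-data likelihood given e1.
        theta = (e1 + x4) / (e1 + x2 + x3 + x4)
    return theta
```

The same augment-then-maximise pattern extends directly to the mixture and censored-data settings listed in the syllabus, and Monte Carlo EM replaces the exact E-step expectation with a simulated one.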
- Rizzo, M. L. Statistical Computing with R. Chapman & Hall.
- Ripley, B. D. Stochastic Simulation. Wiley.
- Efron, B. and Tibshirani, R. J. An Introduction to the Bootstrap. Chapman & Hall.
- Wand, M. P. and Jones, M. C. Kernel Smoothing. Chapman & Hall.
Feedback tutorials provide an opportunity for students' work to be discussed and for feedback to be given on their understanding. The coursework is a further source of feedback. Students can also get feedback on their understanding directly from the lecturer, for example during the lecturer's office hour.
- Lectures - 22 hours
- Practical classes & workshops - 22 hours
- Independent study - 106 hours