|Unit level:||Level 3|
|Teaching period(s):||Semester 1|
|Offered by:||School of Mathematics|
|Available as a free choice unit?:||N|
- MATH10202 - Linear Algebra A (Compulsory)
- MATH10212 - Linear Algebra B (Compulsory)
The course aims to introduce students to modern convex optimization and its applications in fields such as machine learning. It is designed to cover practical modelling aspects, algorithm analysis and design, and the theoretical foundations of the subject.
Optimization is the art of optimal decision making under constraints. Convex optimization refers to a set of problems and methods that can be formulated using convex functions and sets; countless problems from science, engineering and statistics can be cast as convex optimization problems and solved using efficient algorithms. The course is intended as an introduction to convex optimization, focussing on the theory, the modelling techniques, and the algorithm analysis and design. Recent developments such as convex regularization and compressed sensing will be discussed. The problem sessions will be used to present applications from machine learning, signal processing, and finance.
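As a first taste of the modelling viewpoint, least-squares is perhaps the simplest convex optimization problem: minimising ||Ax - b||² over x. The sketch below (illustrative random data, not from the course materials) solves it via the normal equations and checks the first-order optimality condition, which convexity guarantees is sufficient for a global minimum.

```python
import numpy as np

# Illustrative data for the convex problem: minimise ||A x - b||^2 over x.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))
b = rng.standard_normal(10)

# Closed-form solution via the normal equations A^T A x = A^T b.
x = np.linalg.solve(A.T @ A, A.T @ b)

# Convexity makes the stationary point a global minimum:
# the gradient 2 A^T (A x - b) vanishes at the solution.
grad = 2 * A.T @ (A @ x - b)
print(np.allclose(grad, 0))
```

The same problem could be passed to a modelling tool such as CVX in a few lines; the point of the hand computation is that convexity lets a local optimality check certify a global optimum.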
On completion of the course, students should be able to:
- recognise problems that can be formulated as convex optimization problems,
- describe and apply gradient descent and Newton’s method, and explain their performance and limitations,
- solve linear, quadratic and semidefinite programming problems using interior-point methods, and evaluate their performance,
- derive the Lagrange dual of various standard optimization problems,
- characterise the solutions of optimization problems using optimality conditions such as the Karush-Kuhn-Tucker (KKT) conditions,
- explain the role of convex optimization in machine learning, signal processing, compressed sensing, and finance.
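The second outcome can be illustrated on a toy problem. The sketch below (an assumed example, not taken from the course notes) compares gradient descent with Newton's method on the strictly convex function f(x) = eˣ + e⁻ˣ, whose minimum is at x = 0: gradient descent converges linearly and needs a step size, while Newton's method converges very rapidly near the minimum at the cost of second-derivative information.

```python
import math

# Toy strictly convex objective: f(x) = exp(x) + exp(-x), minimised at x = 0.
def grad(x):   # f'(x)
    return math.exp(x) - math.exp(-x)

def hess(x):   # f''(x) > 0 everywhere, so f is strictly convex
    return math.exp(x) + math.exp(-x)

# Gradient descent: linear convergence, requires choosing a step size.
x = 1.0
for _ in range(100):
    x -= 0.1 * grad(x)

# Newton's method: no step size here, but each iteration uses f''(x);
# convergence near the minimum is dramatically faster.
y = 1.0
for _ in range(6):
    y -= grad(y) / hess(y)

print(abs(x) < 1e-6, abs(y) < 1e-9)
```

In higher dimensions the trade-off is the same but sharper: Newton steps require solving a linear system with the Hessian, which is exactly the cost that interior-point methods pay for their fast convergence.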
- Other - 20%
- Written exam - 80%
Assessment Further Information
Coursework: Weighting within unit 20%.
2-hour end of semester examination: Weighting within unit 80%.
The lectures and problem sessions will cover the following topics:
(1) Overview and examples of optimization problems;
(2) Least-squares, gradient descent, Newton’s method;
(3) Introduction to CVX;
(4) Fundamentals of convex analysis and geometry: convex sets and functions, subdifferential calculus;
(5) Linear and quadratic programming, semidefinite programming, conic optimization;
(6) Optimality conditions, duality theory, theorems of alternative;
(7) Interior-point methods, augmented Lagrangians, alternating direction method of multipliers;
(8) Applications in machine learning: convex regularization, compressed sensing and matrix completion.
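Topic (6) can be previewed with a small worked example. For the equality-constrained quadratic program "minimise (1/2)||x||² subject to Ax = b", the KKT conditions reduce to a linear system in the primal variable x and the dual variable ν: stationarity gives x + Aᵀν = 0 and primal feasibility gives Ax = b. The sketch below (illustrative random data) assembles and solves that system and verifies both conditions.

```python
import numpy as np

# Equality-constrained QP (illustrative): minimise (1/2)||x||^2  s.t.  A x = b.
rng = np.random.default_rng(1)
n, m = 4, 2
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# KKT system:  [[I, A^T], [A, 0]] [x; nu] = [0; b].
K = np.block([[np.eye(n), A.T],
              [A, np.zeros((m, m))]])
sol = np.linalg.solve(K, np.concatenate([np.zeros(n), b]))
x, nu = sol[:n], sol[n:]

# Stationarity and primal feasibility both hold at the solution,
# which by convexity is the global optimum.
print(np.allclose(x + A.T @ nu, 0), np.allclose(A @ x, b))
```

For this problem class the KKT conditions are linear, so the optimum is found by one linear solve; for general convex problems the same conditions underpin the interior-point and augmented Lagrangian methods of topic (7).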
The main reference is the book
Stephen Boyd and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
The book is available online at http://www.stanford.edu/~boyd/cvxbook/. The lectures will also make use of the CVX software, which is based on MATLAB.
Other useful references include:
- J. Nocedal and S. J. Wright. Numerical Optimization. Springer, 2006.
- A. Ben-Tal and A. Nemirovski. Lectures on Modern Convex Optimization. 2013.
- Y. Nesterov. Introductory lectures on convex optimization: A basic course. Springer, 2004.
Feedback tutorials will provide an opportunity for students' work to be discussed and for feedback to be given on their understanding. Coursework or in-class tests (where applicable) also provide an opportunity for students to receive feedback. Students can also get feedback on their understanding directly from the lecturer, for example during the lecturer's office hour.
- Lectures - 22 hours
- Tutorials - 11 hours
- Independent study hours - 67 hours