Postgraduate Course: Large Scale Optimization for Data Science (MATH11147)
School: School of Mathematics
College: College of Science and Engineering
Credit level (Normal year taken): SCQF Level 11 (Postgraduate)
Availability: Available to all students
Summary: The detailed modelling of real-life problems requires a knowledgeable choice of the objective function and constraints, and often leads to very large optimization problems. The efficient solution of such problems is key to the success of optimization in practice.
Data Science provides numerous instances of problems which can be modelled using optimization. The amount of data in some of these models challenges existing optimization techniques and requires the development of new ones.
This course will address methods for constrained optimization under the assumption that exact (or approximate) second-order information (the Hessian of the Lagrangian) is available. The course will cover interior point methods (IPMs) for various classes of optimization problems, addressing their theory and implementation. It will also cover the alternating direction method of multipliers (ADMM) and touch on the stochastic gradient (SG) method used in deep learning.
The successful application of these techniques to Data Science problems from areas such as statistics, machine learning, engineering, energy and finance will be discussed.
The practical component of this course will consist of computing laboratory work using Matlab. These exercises will reinforce the theoretical analysis of problems, methods and their implementation; a small example of the flavour of such computations is sketched below.
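For a flavour of these computations, consider a minimal Matlab sketch (the data Q, c, A and b below are invented for the illustration): a single Newton step, i.e. one solve of the Karush-Kuhn-Tucker (KKT) system, for an equality-constrained quadratic program. This uses exactly the second-order information mentioned above, and for a quadratic program this one step already yields the minimizer.

% One Newton (KKT) step for: min 0.5*x'*Q*x + c'*x  subject to  A*x = b
Q = [4 1; 1 2];                 % Hessian: exact second-order information
c = [-1; -2];
A = [1 1];  b = 1;              % a single linear equality constraint
n = numel(c);  m = numel(b);
K   = [Q, A'; A, zeros(m)];     % KKT matrix
sol = K \ [-c; b];              % one linear solve
x      = sol(1:n);              % primal minimizer
lambda = sol(n+1:end);          % Lagrange multiplier
fprintf('x = (%g, %g), lambda = %g\n', x(1), x(2), lambda);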
Course description:
- Unconstrained and Constrained Optimization (modelling issues: constraints in optimization)
- Interior Point Methods for linear, quadratic, nonlinear, second-order cone and semidefinite programming (motivation, theory, polynomial complexity, implementation); Newton Method and self-concordant barriers in optimization (see the interior point sketch after this list)
- Implementational aspects of methods for very large scale optimization (sparse matrices, inexact Newton Method)
- Alternating Direction Method of Multipliers (ADMM); Stochastic Gradient (an ADMM sketch for a lasso regression follows the applications list below)
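The interior point sketch referred to above: a minimal Matlab loop of primal-dual path-following iterations for a tiny linear program min c'*x subject to A*x = b, x >= 0. The problem data, the centering parameter sigma, the damping factor 0.99 and the tolerance are all invented for the illustration; a practical IPM would solve a reduced (normal equations) system with sparse factorizations rather than the full Newton matrix, plus the safeguards and heuristics discussed in the course.

% Primal-dual interior point sketch for: min c'*x  s.t.  A*x = b, x >= 0
A = [1 1 1; 1 -1 0];  b = [1; 0];  c = [1; 2; 0];
[m, n] = size(A);
x = ones(n,1);  y = zeros(m,1);  s = ones(n,1);  % interior starting point
sigma = 0.1;                                     % centering parameter
for k = 1:50
    mu = (x'*s)/n;                               % duality measure
    if mu < 1e-8, break; end
    rp = b - A*x;                                % primal residual
    rd = c - A'*y - s;                           % dual residual
    rc = sigma*mu*ones(n,1) - x.*s;              % perturbed complementarity
    J  = [A,          zeros(m,m), zeros(m,n);    % Newton matrix for the
          zeros(n,n), A',         eye(n);        % perturbed KKT conditions
          diag(s),    zeros(n,m), diag(x)];
    d  = J \ [rp; rd; rc];
    dx = d(1:n);  dy = d(n+1:n+m);  ds = d(n+m+1:end);
    alpha = 0.99 / max([1; -dx./x; -ds./s]);     % keep x and s strictly positive
    x = x + alpha*dx;  y = y + alpha*dy;  s = s + alpha*ds;
end
fprintf('x = (%g, %g, %g), mu = %.2e\n', x, mu);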
Data Science Applications:
- Statistics: regression, classification, discriminant analysis
- Machine learning: support vector machines
- Engineering: signal and image processing
- Finance: portfolio optimization, asset and liability management
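The ADMM sketch referred to above, connecting the method to the regression application: a minimal Matlab implementation of ADMM for the lasso problem min_x 0.5*||A*x - b||^2 + mu*||x||_1. The data and the parameters mu and rho are invented for the illustration.

% ADMM sketch for the lasso: min 0.5*||A*x - b||^2 + mu*||x||_1
rng(1);
A = randn(50, 20);
xtrue = zeros(20,1);  xtrue([1 4 9]) = [2; -1; 0.5];  % sparse ground truth
b = A*xtrue + 0.01*randn(50,1);
mu = 0.1;  rho = 1.0;                        % l1 penalty and ADMM parameter
n = size(A, 2);
x = zeros(n,1);  z = zeros(n,1);  u = zeros(n,1);  % primal, split copy, scaled dual
R = chol(A'*A + rho*eye(n));                 % factorize once, reuse every iteration
soft = @(v, t) sign(v).*max(abs(v) - t, 0);  % soft-thresholding: prox of t*||.||_1
for k = 1:200
    x = R \ (R' \ (A'*b + rho*(z - u)));     % x-update: one pair of triangular solves
    z = soft(x + u, mu/rho);                 % z-update: proximal step
    u = u + x - z;                           % scaled dual update
end
fprintf('nonzeros recovered: %d of %d\n', nnz(abs(z) > 1e-4), nnz(xtrue));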
Entry Requirements (not applicable to Visiting Students)
Students MUST have passed: Fundamentals of Optimization (MATH11111)
Other requirements: None
Information for Visiting Students
Pre-requisites: Visiting students are advised to check that they have studied the material covered in the syllabus of each prerequisite course before enrolling.
Course Delivery Information
Academic year 2020/21, Available to all students (SV1)
Learning and Teaching activities (Further Info):
- Lecture Hours: 18
- Seminar/Tutorial Hours: 5
- Supervised Practical/Workshop/Studio Hours: 4
- Programme Level Learning and Teaching Hours: 2
- Directed Learning and Independent Learning Hours
Assessment (Further Info): Written Exam 50%, Coursework 50%
Additional Information (Assessment): There will be three STACK-based assignments (30%) and two Matlab-based assignments (20%), together contributing the 50% coursework component. For the STACK-based assignments, automatic (computer-based) feedback will be provided; for the Matlab-based assignments, written feedback will be provided.
Main Exam Diet S2 (April/May): Large Scale Optimization for Data Science (MATH11147), duration 2:00 (hours:minutes)
Learning Outcomes
On completion of this course, the student will be able to:
- Model real-life problems as optimization problems.
- Choose a solution method appropriate to the characteristics of a given problem and obtain a solution using Matlab-based utilities.
- Explain how complexity analysis can be used to assess the efficiency of optimization techniques.
- Demonstrate the action of optimization methods by solving illustrative problems on paper.
- Explain how the implementation of optimization methods yields problems in numerical linear algebra.
Reading List:
- Numerical Optimization, J. Nocedal and S. J. Wright, Springer, 2nd edition, 2006. ISBN-10: 0387303030.
- Primal-Dual Interior-Point Methods, S. J. Wright, SIAM, Philadelphia, 1997.
Course organiser: Prof Jacek Gondzio, Tel: (0131 6)50 8574
Course secretary: Miss Gemma Aitchison, Tel: (0131 6)50 9268