
Postgraduate Course: Large Scale Optimization for Data Science (MATH11147)

Course Outline
School: School of Mathematics
College: College of Science and Engineering
Credit level (Normal year taken): SCQF Level 11 (Postgraduate)
Availability: Available to all students
SCQF Credits: 10
ECTS Credits: 5
Summary: The detailed modelling of real-life problems requires a knowledgeable choice of objective function and constraints, and often leads to very large optimization problems. The efficient solution of such problems is key to the success of optimization in practice.
Data Science provides numerous instances of problems that can be modelled using optimization. The amount of data in some of these models challenges existing optimization techniques and requires the development of new ones.

This course will address methods for constrained optimization, under the assumption that exact (or approximate) second-order information (the Hessian of the Lagrangian) is available. The course will cover interior point methods for various classes of optimization problems, addressing their theory and implementation. It will also cover the Benders and Dantzig-Wolfe decomposition techniques.
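As a small illustration of the interior point idea mentioned above, here is a pure-Python sketch of a logarithmic-barrier method on a one-dimensional toy problem. The problem, starting point and parameters are invented for illustration only; the course treats the general theory and uses Matlab for its labs.

```python
# Toy problem (invented for illustration): minimize x^2 subject to x >= 1.
# The log-barrier subproblem is  phi_mu(x) = x^2 - mu * log(x - 1).

def newton_barrier(mu, x):
    """Minimize phi_mu by damped Newton steps, staying strictly feasible."""
    for _ in range(100):
        g = 2 * x - mu / (x - 1)        # phi_mu'(x)
        h = 2 + mu / (x - 1) ** 2       # phi_mu''(x) > 0 (convex)
        step = g / h
        while x - step <= 1:            # backtrack to keep x > 1
            step *= 0.5
        x -= step
        if abs(step) < 1e-12:
            break
    return x

x = 2.0                                  # strictly feasible start
mu = 1.0
while mu > 1e-10:                        # follow the central path as mu -> 0
    x = newton_barrier(mu, x)
    mu *= 0.2
print(round(x, 6))                       # prints 1.0: the constrained minimizer
```

As the barrier parameter mu shrinks, the minimizers of the barrier subproblems trace the central path towards the true constrained solution x* = 1, which is the mechanism interior point methods exploit.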

The successful application of these techniques to Data Science problems from areas such as statistics, machine learning, engineering, energy and finance will be discussed.

The practical component of this course will consist of computing laboratory work using Matlab (including the Matlab-based CVX system for convex optimization). These exercises will reinforce the theoretical analysis of problems, methods and their implementation.
Course description:
- Unconstrained and Constrained Optimization (modelling issues: constraints in optimization)
- Interior Point Methods for linear, quadratic, nonlinear, second-order cone and semidefinite programming (motivation, theory, polynomial complexity, implementation); Newton Method and self-concordant barriers in optimization
- Implementation aspects of methods for very large scale optimization (sparse matrices, inexact Newton Method, preconditioning, quasi-Newton)
- Decomposition Techniques (Benders and Dantzig-Wolfe decompositions)
- Data Science Applications:
  - Statistics: regression, classification, discriminant analysis
  - Machine learning: support vector machines
  - Engineering: signal and image processing
  - Energy: uncertainty of sustainable sources (wind)
  - Finance: portfolio optimization, asset and liability management
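To make one of the listed applications concrete, here is a minimal pure-Python sketch of a regression posed as an optimization problem and solved by gradient descent. The data, ridge penalty and step size are invented for illustration; the course labs use Matlab (with CVX) rather than this hand-rolled solver.

```python
# Toy ridge regression: minimize  sum_i (w*x_i + b - y_i)^2 + lam*w^2
# by gradient descent.  Data and parameters are invented for illustration.

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]        # generated by y = 2x + 1 (noise-free)
lam = 1e-6                        # tiny ridge penalty
w, b = 0.0, 0.0
lr = 0.02                         # step size, chosen below 2 / (largest Hessian eigenvalue)

for _ in range(20000):
    gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) + 2 * lam * w
    gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys))
    w -= lr * gw
    b -= lr * gb

print(round(w, 3), round(b, 3))   # prints 2.0 1.0 (the generating slope and intercept)
```

The same model is a few lines in a modelling system such as CVX, which is exactly the kind of exercise the practical component of the course covers; at Data Science scale, the large-scale methods listed above replace this naive loop.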
Entry Requirements (not applicable to Visiting Students)
Pre-requisites: Students MUST have passed: Fundamentals of Optimization (MATH11111)
Prohibited Combinations: None
Other requirements: None
Information for Visiting Students
High Demand Course? Yes
Course Delivery Information