Postgraduate Course: Machine Learning & Pattern Recognition (Level 11) (INFR11073)
|School||School of Informatics
|College||College of Science and Engineering
|Credit level (Normal year taken)||SCQF Level 11 (Year 4 Undergraduate)
|Availability||Available to all students
|Summary||***PLEASE NOTE - this course has been replaced by Machine Learning and Pattern Recognition INFR11130 (20 credit course) from 2016/17.***
Both the study of Artificial Intelligence - understanding how to build learning machines - and the business of developing tools to analyse ever-growing data sources require a systematic understanding of how we can learn from data. A principled approach to this problem is critical given the wide variety of settings in which these methods must be applied.
This course is a foundational course for anyone pursuing machine learning, or interested in the intelligent use of machine learning methods. The primary aim of the course is to enable the student to think coherently and confidently about machine learning problems, and to present the student with a set of practical tools that can be applied to solve real-world problems, coupled with an appropriate, principled approach to formulating a solution.
This course avoids the potential pitfall of simply presenting a set of machine learning tools as if they were an end in themselves; instead it follows the basic principles of machine learning methods, showing how the different tools are developed, how they are related, how they should be deployed, and how they are used in practice. The course presents a number of increasingly used machine learning methods, including Bayesian methods and Gaussian processes.
* Data and Models: Introducing Data, Probability and Bayesian Presumptions.
* Simple Distributions, Maximum Likelihood and Bayesian Estimation.
* Bayesian Sets Example
* The Exponential Family
* Multivariate Gaussians, PCA and PPCA. Bayesian Gaussian
* Linear Parameter Models, Bayesian Regression
* Logistic Regression and Neural Networks
* Approximate Methods: Laplace, Variational Methods, Sampling.
* Naïve Bayes, Class Conditional Gaussians, Gaussian Mixtures and EM.
* Gaussian Processes and Kernel Methods
* Bayesian Decision Theory.
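As a flavour of the contrast between maximum likelihood and Bayesian estimation in the syllabus above, here is a minimal sketch for estimating a Gaussian mean with a conjugate Gaussian prior. This is an illustration only (the course itself works in MATLAB; the data, prior, and parameter values here are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n draws from a Gaussian with unknown mean, known variance.
true_mu, sigma = 2.0, 1.0
x = rng.normal(true_mu, sigma, size=10)
n = len(x)

# Maximum likelihood estimate of the mean: the sample average.
mu_ml = x.mean()

# Bayesian estimation: a conjugate Gaussian prior N(mu0, tau^2) on the mean
# yields a Gaussian posterior whose mean shrinks the ML estimate toward mu0.
mu0, tau = 0.0, 1.0
precision_post = n / sigma**2 + 1 / tau**2
mu_post = (n / sigma**2 * mu_ml + mu0 / tau**2) / precision_post

print(f"ML estimate:       {mu_ml:.3f}")
print(f"Posterior mean:    {mu_post:.3f}")
print(f"Posterior std dev: {precision_post**-0.5:.3f}")
```

With a zero-centred prior the posterior mean is the ML estimate scaled by a factor less than one, and the shrinkage weakens as more data arrive - the "simple distributions" topic above develops exactly this kind of calculation.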
Relevant QAA Computing Curriculum Sections: Artificial Intelligence, Human-Computer Interaction (HCI), Intelligent Information Systems Technologies, Natural Language Computing, Simulation and Modelling, Theoretical Computing.
Entry Requirements (not applicable to Visiting Students)
|Other requirements|| This course is open to all Informatics students, including those on joint degrees. External students for whom this course is not listed in their DPT should seek special permission from the course organiser (lecturer).
Familiarity with basic mathematics, including algebra and calculus, is essential. A reasonable knowledge of computational, logical, geometric and set-theoretic concepts is assumed. Working knowledge of vectors and matrices is also necessary. A basic grasp of probability and partial differentiation is strongly recommended. Programming in a numerical language such as Matlab will be required, but past experience with Matlab is not assumed.
Information for Visiting Students
|Pre-requisites||Visiting students are required to have comparable background to that assumed by the course prerequisites listed in the Degree Regulations & Programmes of Study.
If in doubt, consult the course organiser (lecturer).
|High Demand Course?
Course Delivery Information
|Not being delivered|
|1 - Way of thinking - the course introduces an approach to thinking about machine learning problems. Learning Outcome: Students will be able to describe why a particular model is appropriate in a given situation, formulate the model and use it appropriately.
2 - A strong foundation - the course will provide students with the core techniques and methods needed to use machine learning in any area. Learning Outcome: The student will be able to analytically demonstrate how different models and different algorithms are related to one another.
3 - Practical capability - the course will provide students with the theoretical background needed to assess good practice, along with the practical experience. Learning Outcome: Students will be able to implement a set of practical methods, given example algorithms in MATLAB, and be able to program solutions to some given real world machine learning problems, using the toolbox of practical methods presented in the lectures.
4 - Thoroughness - students will leave the course with a deep understanding of machine learning and its aims and limitations. Learning Outcome: Given a particular situation, students will be able to justify why a given model is or is not appropriate for that situation. Students will be able to develop an appropriate algorithm from a given model, and demonstrate the use of that method.
5 - Coherence - the course provides a unifying, coherent view of machine learning. Learning Outcome: Students will be able to design and compare machine learning methods, discuss how different methods relate to one another, and develop new machine learning methods appropriate for particular problems.
6 - Breadth of Thinking - Learning Outcome: Given a complex problem, students will be able to: (a) identify sub-problems that are amenable to solution using machine learning techniques, (b) provide solutions to those sub-problems, and (c) evaluate those solutions.
|* Self-contained course notes (Barber 2007)|
* C.M. Bishop (2006) Pattern Recognition and Machine Learning. Springer.
* Duda, Hart and Stork (2001). Pattern Classification. Wiley.
|Course organiser||Dr Iain Murray
Tel: (0131 6)51 9078
|Course secretary||Mr Gregor Hall
Tel: (0131 6)50 5194