Postgraduate Course: Machine Learning & Pattern Recognition (Level 11) (INFR11073)
Course Outline
School | School of Informatics |
College | College of Science and Engineering |
Course type | Standard |
Availability | Available to all students |
Credit level (Normal year taken) | SCQF Level 11 (Postgraduate) |
Credits | 10 |
Home subject area | Informatics |
Other subject area | None |
Course website | http://www.inf.ed.ac.uk/teaching/courses/mlpr |
Taught in Gaelic? | No |
Course description | Both the study of Artificial Intelligence - understanding how to build learning machines - and the business of developing tools to analyse the ever-increasing number of data sources involve developing a systematic understanding of how we can learn from data. A principled approach to this problem is critical given the wide range of settings in which these methods need to be used.
This course is a foundational course for anyone pursuing machine learning, or interested in the intelligent use of machine learning methods. The primary aim of the course is to enable the student to think coherently and confidently about machine learning problems, and to present the student with a set of practical tools that can be applied to solve real-world problems in machine learning, coupled with an appropriate, principled approach to formulating a solution.
This course avoids the potential pitfall of simply presenting a set of machine learning tools as if they were an end in themselves; instead it follows the basic principles of machine learning methods, showing how the different tools are developed, how they are related, how they should be deployed, and how they are used in practice. The course presents a number of increasingly used machine learning methods, including Bayesian methods and Gaussian processes.
This course is identical to the Level 10 version except for an additional learning outcome and a consequent difference in assessment. |
Entry Requirements (not applicable to Visiting Students)
Pre-requisites | |
Co-requisites | |
Prohibited Combinations | Students MUST NOT also be taking Machine Learning & Pattern Recognition (Level 10) (INFR10036) |
Other requirements | For Informatics PG and final-year MInf students only, or by special permission of the School. Familiarity with basic mathematics, including algebra and calculus, is essential. A reasonable knowledge of computational, logical, geometric and set-theoretic concepts is assumed. A working knowledge of vectors and matrices is also necessary. A basic grasp of probability and partial differentiation is strongly recommended. |
Additional Costs | None |
Information for Visiting Students
Pre-requisites | None |
Displayed in Visiting Students Prospectus? | No |
Course Delivery Information
Delivery period: 2011/12 Semester 2, Available to all students (SV1) |
WebCT enabled: No |
Quota: None |
Location | Activity | Description | Weeks | Monday | Tuesday | Wednesday | Thursday | Friday |
Central | Lecture | | 1-11 | | 11:10 - 12:00 | | | |
Central | Lecture | | 1-11 | | | | | 11:10 - 12:00 |
First Class | Week 1, Tuesday, 11:10 - 12:00, Zone: Central. George Sq 07 F21 |
Exam Information
Exam Diet | Paper Name | Hours:Minutes |
Main Exam Diet S2 (April/May) | | 2:00 |
Summary of Intended Learning Outcomes
1 - Way of thinking - the course introduces an approach to thinking about machine learning problems. Learning Outcome: Students will be able to describe why a particular model is appropriate in a given situation, formulate the model and use it appropriately.
2 - A strong foundation - the course will provide students with the core techniques and methods needed to use machine learning in any area. Learning Outcome: The student will be able to analytically demonstrate how different models and different algorithms are related to one another.
3 - Practical capability - the course will provide students with the theoretical background needed to assess good practice, along with practical experience. Learning Outcome: Students will be able to implement a set of practical methods, given example algorithms in MATLAB, and be able to program solutions to some given real-world machine learning problems, using the toolbox of practical methods presented in the lectures.
4 - Thoroughness - students will leave the course with a deep understanding of machine learning and its aims and limitations. Learning Outcome: Given a particular situation, students will be able to justify why a given model is or is not appropriate for that situation. Students will be able to develop an appropriate algorithm from a given model, and demonstrate the use of that method.
5 - Coherence - the course provides a unifying, coherent view of machine learning. Learning Outcome: Students will be able to design and compare machine learning methods, discuss how different methods relate to one another, and develop new machine learning methods appropriate for particular problems.
6 - Breadth of Thinking - Learning Outcome: Given a complex problem, students will be able to: (a) identify sub-problems that are amenable to solution using Machine Learning techniques, (b) provide solutions to those sub-problems, and (c) evaluate those solutions. |
Assessment Information
Written Examination 80%
Assessed Assignments 20%
Oral Presentations 0%
Assessment
There will be two assignments for the course, one for each half of the course content. These will involve practical, hands-on data analysis as well as questions about the ideas covered in the course. The Level 11 course will also assess the additional learning outcome.
If delivered in Semester 1, this course will have an option for Semester 1-only visiting undergraduate students, providing assessment prior to the end of the calendar year. |
Special Arrangements
None |
Additional Information
Academic description | Not entered |
Syllabus |
* Data and Models: Introducing Data, Probability and Bayesian Presumptions.
* Simple Distributions, Maximum Likelihood and Bayesian Estimation.
* Bayesian Sets Example
* The Exponential Family
* Multivariate Gaussians, PCA and PPCA. Bayesian Gaussian
* Linear Parameter Models, Bayesian Regression
* Logistic Regression and Neural Networks
* Optimisation
* Approximate Methods: Laplace, Variational Methods, Sampling.
* Naïve Bayes, Class Conditional Gaussians, Gaussian Mixtures and EM.
* Gaussian Processes and Kernel Methods
* Bayesian Decision Theory.
Relevant QAA Computing Curriculum Sections: Artificial Intelligence, Human-Computer Interaction (HCI), Intelligent Information Systems Technologies, Natural Language Computing, Simulation and Modelling, Theoretical Computing. |
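To give a flavour of the "Simple Distributions, Maximum Likelihood and Bayesian Estimation" topic listed in the syllabus above, the sketch below contrasts the maximum likelihood estimate of a Gaussian mean with the corresponding conjugate Bayesian posterior. It is a minimal illustration only, written in Python rather than the MATLAB used for the course's example algorithms, and the data, prior settings and variable names are hypothetical rather than taken from the course materials.

    import numpy as np

    # Hypothetical toy data: noisy observations of an unknown mean.
    rng = np.random.default_rng(0)
    true_mean, noise_var = 2.0, 1.0
    x = rng.normal(true_mean, np.sqrt(noise_var), size=20)

    # Maximum likelihood estimate of the mean of a Gaussian with known
    # variance: simply the sample average.
    mle_mean = x.mean()

    # Bayesian estimate: with a Gaussian prior N(m0, v0) on the mean and a
    # known noise variance, the posterior over the mean is also Gaussian.
    m0, v0 = 0.0, 10.0   # assumed prior mean and variance (illustrative only)
    n = len(x)
    post_var = 1.0 / (1.0 / v0 + n / noise_var)
    post_mean = post_var * (m0 / v0 + x.sum() / noise_var)

    print(f"ML estimate of the mean: {mle_mean:.3f}")
    print(f"Posterior mean: {post_mean:.3f} (posterior variance {post_var:.3f})")

With the broad prior used here, the posterior mean stays close to the maximum likelihood estimate; a tighter prior would pull it towards m0, illustrating the trade-off between prior belief and observed data that Bayesian estimation formalises.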
Transferable skills | Not entered |
Reading list |
* Self-contained course notes (Barber, 2007)
* C. M. Bishop (2006). Pattern Recognition and Machine Learning. Springer.
* Duda, Hart and Stork (2001). Pattern Classification. Wiley. |
Study Abroad | Not entered |
Study Pattern (hours) |
Lectures 20
Tutorials 8
Timetabled Laboratories 0
Non-timetabled assessed assignments 22
Private Study/Other 50
Total 100 |
Keywords | Not entered |
Contacts
Course organiser | Dr Michael Rovatsos
Tel: (0131 6)51 3263
Email: mrovatso@inf.ed.ac.uk |
Course secretary | Miss Kate Weston
Tel: (0131 6)50 2701
Email: Kate.Weston@ed.ac.uk |