Postgraduate Course: Probabilistic Modelling and Reasoning (INFR11134)
|School||School of Informatics
|College||College of Science and Engineering
|Credit level (Normal year taken)||SCQF Level 11 (Postgraduate)
|Availability||Available to all students
|Summary||When dealing with real-world data, we often need to handle uncertainty. For example, short segments of a speech signal are ambiguous, and we need to take context into account in order to make sense of an utterance. Probability theory provides a rigorous method for representing and reasoning with uncertain knowledge. The course covers two main areas: (i) the process of inference in probabilistic reasoning systems and (ii) learning probabilistic models from data. Its aim is to provide a firm grounding in probabilistic modelling and reasoning, and to give a basis which will allow students to go on to develop their interests in more specific areas, such as data-intensive linguistics, automatic speech recognition, probabilistic expert systems, statistical theories of vision, etc.
The course will cover the most important topics in probabilistic modelling and unsupervised learning, and provide a thorough basis for understanding extensions, further developments and applications.
While the precise topics will vary slightly from year to year, the core content will revolve around:
- probabilistic graphical models
- exact inference
- learning from data
- methods for approximate inference and learning
The course will be delivered in a series of lectures and exercises. In addition to working through exercises with pencil and paper, the students will be expected to complete some programming exercises to gain experience with implementing and using the material taught in the course.
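To give a flavour of the kind of programming exercise involved, the sketch below combines the first two core topics: it defines the joint distribution of a small directed graphical model and answers a query by exact inference (summing out variables). This is an illustrative sketch only, not actual course material; the network and all conditional probability table numbers are made up.

```python
from itertools import product

# Toy "sprinkler" Bayesian network: Cloudy -> {Sprinkler, Rain} -> WetGrass.
# All variables are binary; the CPT numbers below are illustrative only.
P_C = 0.5                                          # P(Cloudy = True)
P_S = {True: 0.1, False: 0.5}                      # P(Sprinkler = True | Cloudy)
P_R = {True: 0.8, False: 0.2}                      # P(Rain = True | Cloudy)
P_W = {(True, True): 0.99, (True, False): 0.90,    # P(WetGrass = True | S, R)
       (False, True): 0.90, (False, False): 0.0}

def joint(c, s, r, w):
    """Joint probability from the directed factorisation P(C)P(S|C)P(R|C)P(W|S,R)."""
    p = P_C if c else 1 - P_C
    p *= P_S[c] if s else 1 - P_S[c]
    p *= P_R[c] if r else 1 - P_R[c]
    p *= P_W[(s, r)] if w else 1 - P_W[(s, r)]
    return p

def prob_rain_given_wet():
    """P(Rain = True | WetGrass = True), by summing out the other variables."""
    num = sum(joint(c, s, True, True) for c, s in product([True, False], repeat=2))
    den = sum(joint(c, s, r, True) for c, s, r in product([True, False], repeat=3))
    return num / den

print(round(prob_rain_given_wet(), 4))
```

Enumeration like this is exponential in the number of variables; a large part of the course is about exploiting graph structure to do better.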
The detailed course syllabus is available on the course homepage https://www.inf.ed.ac.uk/teaching/courses/pmr
Entry Requirements (not applicable to Visiting Students)
|Other requirements||This course is open to all Informatics students, including those on joint degrees. External students whose DPT does not list this course should seek special permission from the course organiser.
1. Probability theory: Discrete and continuous univariate random variables. Expectation, variance. Joint and conditional distributions. Univariate and multivariate Gaussian distribution.
2. Linear algebra: Vectors and matrices: definitions, addition. Matrix multiplication, matrix inversion. Eigenvectors, determinants, quadratic forms.
3. Calculus: Functions of several variables. Partial differentiation. Multivariate maxima and minima. Integration: definitions must be known, including multivariate integration.
4. Special functions: The log and exp functions are fundamental.
5. Geometry: Basics of lines, planes and hyperplanes. Coordinate geometry of the circle, sphere, ellipse, ellipsoid and their n-dimensional generalizations.
Programming prerequisite: A basic level of programming is assumed and not covered in lectures.
Having previously taken, and been comfortable with, MLPR is highly recommended for this course: PMR has similar mathematical requirements to MLPR, and students will be assumed to be familiar with machine learning concepts such as those taught in MLPR or an equivalent course.
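As an indication of the level expected, evaluating the log-density of a bivariate Gaussian by hand exercises items 1–4 above in one go: probability (the Gaussian), linear algebra (inversion, determinant, quadratic form) and special functions. A minimal self-contained sketch (the function name and numbers are ours, not from the course):

```python
import math

def gauss2_logpdf(x, mu, cov):
    """Log-density of a 2-D Gaussian N(mu, cov); cov = [[a, b], [b, d]]."""
    (a, b), (_, d) = cov
    det = a * d - b * b                       # determinant of the 2x2 covariance
    inv = [[d / det, -b / det],               # closed-form 2x2 matrix inverse
           [-b / det, a / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    quad = sum(dx[i] * inv[i][j] * dx[j]      # quadratic form (x-mu)^T cov^{-1} (x-mu)
               for i in range(2) for j in range(2))
    return -0.5 * (quad + math.log(det) + 2 * math.log(2 * math.pi))

# At the mean, with identity covariance, the log-density is -log(2*pi):
print(gauss2_logpdf([0.0, 0.0], [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]))
```

Being able to both derive this expression on paper and translate it into working code is roughly the starting point the course assumes.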
Information for Visiting Students
|High Demand Course?
Course Delivery Information
|Academic year 2019/20, Available to all students (SV1)
|Learning and Teaching activities (Further Info)
- Lecture Hours: 30
- Seminar/Tutorial Hours: 10
- Feedback/Feedforward Hours: 2
- Summative Assessment Hours: 2
- Programme Level Learning and Teaching Hours: 4
- Directed Learning and Independent Learning Hours
|Assessment (Further Info)
|Additional Information (Assessment)||Written Exam 100%, Coursework 0%, Practical Exam 0%
The material covered in the lectures, the required reading, and the pen-and-paper and programming exercises are examinable unless otherwise stated. Exam-style questions and their solutions are provided in the tutorials.
|Feedback||Feedback will primarily be provided through tutorials, an online forum, and direct interactions with the tutors, TA and lecturer.
|Main Exam Diet S2 (April/May)||2:00 (hours:minutes)
On completion of this course, the student will be able to:
- Define the joint distribution implied by directed and undirected probabilistic graphical models, convert between different graphical models, and carry out inference in graphical models from first principles by hand.
- Demonstrate understanding of frequentist and Bayesian methods for parameter estimation by deriving estimation equations by hand for specific problems.
- Critically discuss the differences between various latent variable models for data, and derive EM updates for such models. Demonstrate the ability to implement approximate inference and learning methods.
- Explain when and why the methods taught in the course are applicable and demonstrate experience gained from practically implementing them.
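As one concrete instance of the latent-variable outcome above, the EM updates for a two-component 1-D Gaussian mixture can be implemented in a few lines. This is a sketch under simplifying assumptions (fixed iteration count, crude initialisation, made-up data), not a statement of how the course teaches it:

```python
import math

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm2(data, iters=50):
    """EM for the mixture pi*N(mu1, v1) + (1-pi)*N(mu2, v2) on 1-D data."""
    pi, mu1, mu2, v1, v2 = 0.5, min(data), max(data), 1.0, 1.0  # crude init
    for _ in range(iters):
        # E-step: responsibility of component 1 for each data point.
        r = [pi * normal_pdf(x, mu1, v1) /
             (pi * normal_pdf(x, mu1, v1) + (1 - pi) * normal_pdf(x, mu2, v2))
             for x in data]
        # M-step: responsibility-weighted maximum-likelihood updates.
        n1 = sum(r); n2 = len(data) - n1
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        v1 = sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1 + 1e-6
        v2 = sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2 + 1e-6
        pi = n1 / len(data)
    return pi, (mu1, v1), (mu2, v2)

# Two well-separated clusters around 0 and 5:
data = [-0.1, 0.0, 0.1, 4.9, 5.0, 5.1]
pi, (mu1, _), (mu2, _) = em_gmm2(data)
print(round(mu1, 2), round(mu2, 2))
```

The course goes well beyond this, deriving such updates from the general EM lower bound and covering approximate inference where exact E-steps are intractable.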
|Graduate Attributes and Skills||The student will be able to reason about uncertainty, an important transferable skill.
In addition the student will be able to:
- Undertake critical evaluations of a wide range of numerical and graphical data.
- Apply critical analysis, evaluation and synthesis to forefront issues, or issues that are informed by forefront developments in the subject/discipline/sector.
- Identify, conceptualise and define new and abstract problems and issues.
- Develop original and creative responses to problems and issues.
- Critically review, consolidate and extend knowledge, skills, practices and thinking in a subject/discipline/sector.
- Deal with complex issues and make informed judgements in the absence of complete or consistent data/information.
|Keywords||Bayesian Statistics, Unsupervised Learning, Probabilistic Models
|Course organiser||Dr Michael Urs Gutmann
Tel: (0131 6)50 5190
|Course secretary||Ms Lindsay Seal
Tel: (0131 6)50 2701