# DEGREE REGULATIONS & PROGRAMMES OF STUDY 2012/2013 (ARCHIVE: for reference only; this page is out of date)


# Postgraduate Course: Probabilistic Modelling and Reasoning (INFR11050)

- **School:** School of Informatics
- **College:** College of Science and Engineering
- **Course type:** Standard
- **Availability:** Available to all students
- **Credit level (Normal year taken):** SCQF Level 11 (Postgraduate)
- **Credits:** 10
- **Home subject area:** Informatics
- **Other subject area:** None
- **Course website:** http://www.inf.ed.ac.uk/teaching/courses/pmr
- **Taught in Gaelic?** No

**Course description:** When dealing with real-world data we often need to handle uncertainty. For example, short segments of a speech signal are ambiguous, and we need to take context into account in order to make sense of an utterance. Probability theory provides a rigorous method for representing and reasoning with uncertain knowledge. The course covers two main areas: (i) the process of inference in probabilistic reasoning systems, and (ii) learning probabilistic models from data. Its aim is to provide a firm grounding in probabilistic modelling and reasoning, and to give a basis which will allow students to go on to develop their interests in more specific areas, such as data-intensive linguistics, automatic speech recognition, probabilistic expert systems, statistical theories of vision, etc.
**Pre-requisites / Co-requisites / Prohibited combinations / Other requirements:** This course has the following mathematics prerequisites:

1. Probability theory: discrete and continuous univariate random variables; expectation, variance; univariate Gaussian distribution; joint and conditional distributions. (At the level taught in MfI 1 & 4.)
2. Linear algebra: vectors and matrices: definitions, addition; matrix multiplication, matrix inversion; eigenvectors, determinants, quadratic forms. (At the level taught in MfI 2 & 3.)
3. Calculus: functions of several variables; partial differentiation; multivariate maxima and minima; integration: need to know definitions, including multivariate integration. (At the level taught in MfI 1 & 2.)
4. Special functions: log and exp are fundamental. (At the level taught in MfI 1.)
5. Geometry: basics of lines, planes and hyperplanes; coordinate geometry of the circle, sphere, ellipse, ellipsoid and n-dimensional generalizations. (At the level taught in MfI 1 & 4.)

**Additional costs:** None
- **Pre-requisites:** None
- **Displayed in Visiting Students Prospectus?** Yes
**Delivery period:** 2012/13 Semester 1, Available to all students (SV1)

- **Learn enabled:** No
- **Quota:** None

| Location | Activity | Weeks | Time |
| --- | --- | --- | --- |
| Central | Lecture | 1-11 | 10:00 - 10:50 |
| Central | Lecture | 1-11 | 10:00 - 10:50 |

**First Class:** Week 1, Tuesday, 10:00 - 10:50, Zone: Central. LT2 Appleton Tower

**Exam Information:**

| Exam Diet | Paper Name | Hours:Minutes |
| --- | --- | --- |
| Main Exam Diet S1 (December) | | 2:00 |
**Learning outcomes:**

1. Define the joint distribution implied by directed and undirected probabilistic graphical models.
2. Carry out inference in graphical models from first principles by hand, and by using the junction tree algorithm.
3. Demonstrate understanding of maximum likelihood and Bayesian methods for parameter estimation by hand derivation of estimation equations for specific problems.
4. Critically discuss differences between various latent variable models for data.
5. Derive EM updates for various latent variable models (e.g. mixture models).
6. Define entropy, joint entropy, conditional entropy, mutual information, expected code length.
7. Demonstrate ability to design, assess and evaluate belief network models.
8. Use MATLAB code implementing probabilistic graphical models.
9. Demonstrate ability to conduct experimental investigations and draw conclusions from them.
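As a flavour of the information-theoretic outcomes above (entropy and mutual information for discrete variables), here is a minimal illustrative sketch. It is not part of the course materials; the distributions and variable names are invented for the example.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution given as probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Hypothetical joint distribution p(X, Y) over two binary variables:
# rows index X, columns index Y. The variables are positively correlated.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]        # marginal p(X)
py = [sum(col) for col in zip(*joint)]  # marginal p(Y)

# I(X;Y) = H(X) + H(Y) - H(X,Y)
h_joint = entropy([p for row in joint for p in row])
mi = entropy(px) + entropy(py) - h_joint

print(round(mi, 3))  # prints 0.278 (positive: X and Y share information)
```

For independent variables the joint entropy equals the sum of the marginal entropies and the mutual information is zero; the correlation in this toy table is what makes it positive.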
- **Written Examination:** 80%
- **Assessed Assignments:** 20%
- **Oral Presentations:** 0%

**Assessment:** One assignment, mainly focussing on learning probabilistic models of data. If delivered in semester 1, this course will have an option for semester 1 only visiting undergraduate students, providing assessment prior to the end of the calendar year.
 None
**Academic description:** Not entered

**Syllabus:**

- Introduction
- Probability
  - events, discrete variables
  - joint, conditional probability
- Discrete belief networks, inference
- Continuous distributions, graphical Gaussian models
- Learning: Maximum Likelihood parameter estimation
- Decision theory
- Hidden variable models
  - mixture models and the EM algorithm
  - factor analysis
  - ICA, non-linear factor analysis
- Dynamic hidden variable models
  - Hidden Markov models
  - Kalman filters (and extensions)
- Undirected graphical models
  - Markov Random Fields
  - Boltzmann machines
- Information theory
  - entropy, mutual information
  - source coding, Kullback-Leibler divergence
- Bayesian methods for
  - inference on parameters
  - model comparison

**Relevant QAA Computing Curriculum Sections:** Artificial Intelligence

**Transferable skills:** Not entered

**Reading list:**

- The course text is "Pattern Recognition and Machine Learning" by C. M. Bishop (Springer, 2006).
- In addition, David MacKay's book "Information Theory, Inference and Learning Algorithms" (CUP, 2003) is highly recommended.

**Study Abroad:** Not entered

**Study Pattern:**

| Activity | Hours |
| --- | --- |
| Lectures | 20 |
| Tutorials | 7 |
| Timetabled Laboratories | 0 |
| Non-timetabled assessed assignments | 20 |
| Private Study/Other | 53 |
| Total | 100 |

**Keywords:** Not entered
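The syllabus item "mixture models and the EM algorithm" can be illustrated with a small self-contained sketch: EM for a two-component one-dimensional Gaussian mixture. This is an illustrative toy, not course material; the function names and synthetic data are invented for the example.

```python
import math
import random

def gauss_pdf(x, mu, var):
    """Density of a univariate Gaussian with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_mixture(data, n_iter=50):
    """Fit a 2-component 1-D Gaussian mixture by expectation-maximisation."""
    mu = [min(data), max(data)]   # crude initialisation from the data range
    var = [1.0, 1.0]
    weight = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibilities r[n][k] = p(component k | x_n)
        r = []
        for x in data:
            w = [weight[k] * gauss_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(w)
            r.append([wk / s for wk in w])
        # M-step: re-estimate parameters from responsibility-weighted data
        for k in range(2):
            nk = sum(rn[k] for rn in r)
            mu[k] = sum(rn[k] * x for rn, x in zip(r, data)) / nk
            var[k] = sum(rn[k] * (x - mu[k]) ** 2 for rn, x in zip(r, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
            weight[k] = nk / len(data)
    return mu, var, weight

# Synthetic data: two well-separated Gaussian clusters at 0 and 5.
random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)]
        + [random.gauss(5.0, 1.0) for _ in range(200)])
mu, var, weight = em_mixture(data)
print(sorted(mu))  # the fitted means should land near 0 and 5
```

The E-step and M-step each increase (or leave unchanged) the data log-likelihood, which is the convergence argument developed in the course's treatment of EM.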
**Course organiser:** Dr Iain Murray, Tel: (0131 6)51 9078, Email: I.Murray@ed.ac.uk
**Course secretary:** Miss Kate Weston, Tel: (0131 6)50 2701, Email: Kate.Weston@ed.ac.uk
© Copyright 2012 The University of Edinburgh - 14 January 2013 4:10 am