THE UNIVERSITY of EDINBURGH

DEGREE REGULATIONS & PROGRAMMES OF STUDY 2019/2020


Postgraduate Course: Probability, Random Variables and Estimation Theory (PET) (MSc) (PGEE11123)

Course Outline
School: School of Engineering    College: College of Science and Engineering
Credit level (Normal year taken): SCQF Level 11 (Postgraduate)    Availability: Not available to visiting students
SCQF Credits: 10    ECTS Credits: 5
Summary: The Probability, Random Variables and Estimation Theory course introduces the fundamental statistical tools required to analyse and describe advanced signal processing algorithms within the MSc Signal Processing and Communications programme. It provides a unified mathematical framework for describing random events and signals, and for characterising the key properties of random processes. The course covers probability theory, considers the notion of random variables and vectors and how they can be manipulated, and provides an introduction to estimation theory. It is demonstrated that many estimation problems, and therefore signal processing problems, can be reduced to an exercise in either optimisation or integration. While these problems can be solved using deterministic numerical methods, the course introduces Monte Carlo techniques, which are the basis of powerful stochastic optimisation and integration algorithms. These methods rely on being able to sample numbers, or variates, from arbitrary distributions. The course therefore discusses the techniques necessary to understand these methods and, if time permits, considers techniques for random number generation.

This course, in combination with the Statistical Signal Processing course, provides the fundamental knowledge required for the advanced signal, image, and communication modules in the MSc course.
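The Monte Carlo integration idea mentioned in the summary can be illustrated with a minimal Python sketch (illustrative only, not part of the course materials, which use MATLAB): an integral is estimated by averaging the integrand at uniformly sampled points.

```python
import math
import random

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Monte Carlo estimate of the integral of f over [a, b]:
    (b - a) times the average of f at n uniform random points."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += f(rng.uniform(a, b))
    return (b - a) * total / n

# Estimate the integral of sin(x) over [0, pi]; the exact value is 2.
estimate = mc_integrate(math.sin, 0.0, math.pi)
```

The estimator's standard error shrinks as 1/sqrt(n) regardless of dimension, which is why such stochastic methods remain practical where deterministic quadrature does not.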
Course description: Any minor modifications to the latest syllabus and lectures are contained in the lecture handouts.

Course Introduction, Motivation, Prerequisites (2 lectures):

1. Motivating the field of statistical signal processing, along with the role of probability, random variables, and estimation theory as a consistent mathematical analysis framework.
2. Examples of modern signal processing applications.
3. Function norms (signal measures), the Fourier transform, and Laplace transform (revision).

Scalar Random Variables (4 lectures):

1. Notion of a random variable and its formal definition involving experimental outcomes, sample space, probability of events, and assigned values; the concept of the cumulative distribution function (cdf), the probability density function (pdf), and their formal properties.
2. Discrete random variables (RVs), their probability mass function (pmf), the corresponding cdfs and pdfs, as well as mixtures of continuous and discrete random variables.
3. Examples of several common discrete and continuous RVs and their pdfs.
4. Introduction to the probability transformation rule through a conceptual derivation, with examples.
5. Expectations, moments, central moments, and higher-order statistics and cumulants.
6. The characteristic function, the moment generating function (MGF), properties and examples.
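The probability transformation rule in topic 4 can be checked empirically with a short Python sketch (illustrative, standard library only): transforming U ~ Uniform(0, 1) by X = -ln(1 - U)/lambda yields an Exponential(lambda) variable, whose mean 1/lambda and variance 1/lambda^2 the sample moments should reproduce.

```python
import math
import random

def exponential_via_transform(lam, n=200_000, seed=1):
    """Transform U ~ Uniform(0,1) into X = -ln(1 - U)/lam; by the
    probability transformation rule, X is Exponential(lam)."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

lam = 2.0
xs = exponential_via_transform(lam)
mean = sum(xs) / len(xs)                            # close to 1/lam = 0.5
var = sum((x - mean) ** 2 for x in xs) / len(xs)    # close to 1/lam**2 = 0.25
```

This is the inverse-cdf method, itself a standard consequence of the transformation rule, and a building block for the random number generation topics later in the course.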

Random Vectors and Multiple Random Variables (7 lectures):

1. Generalisation of the theory on scalar random variables to multiple random variables.
2. Introduction to the concept and formal definition of a random vector, along with the notion of a joint cdf, joint pdf; cover the properties of joint cdfs and pdfs, the probability of arbitrary events, and calculating joint cdfs from joint pdfs.
3. Introducing marginal cdfs and pdfs, independence, conditional densities, and Bayes's theorem.
4. Popular examples of dependent random variables, including the Monty Hall problem.
5. The probability transformation rule, including calculating the Jacobian, auxiliary variables, and the sum of independent RVs. Examples include the Cartesian-to-polar coordinate transformation.
6. Statistical descriptions of random vectors, including the mean and correlation matrices, cross-correlation, and cross-covariance. Covers the properties of correlation and covariance matrices, and determining whether a correlation or covariance matrix is a valid one.
7. Considers the special case of linear transformations of random vectors; the effect of linear transformations on statistical properties; invariance of the expectation operator.
8. Normally distributed random vectors; derivation of the Gaussian integral identity; the two envelopes problem/paradox; properties of the multivariate Gaussian.
9. Characteristic functions and MGFs for random vectors.
10. Analysis of the sum of independent random variables, and the central limit theorem (CLT), using characteristic functions. MATLAB demo of the CLT.

Principles of Estimation Theory (7 lectures):

1. General introduction to parameter estimation, set in the context of repeated observations of an experiment, possibly as a function of time. Covers examples such as the taxi-cab problem.
2. Properties of estimators: bias, variance, mean-squared error (MSE), and the Cramér-Rao lower bound (CRLB). Discusses the notion of a likelihood function. Includes examples such as the sample mean and sample variance.
3. Efficiency of an estimator, consistency, and estimating multiple parameters.
4. Maximum-likelihood estimator, the principle of least squares.
5. Linear least squares and the normal equations.
6. Introduction to Bayesian estimation: priors, marginalisation, posterior distributions.
7. Overview of problems of optimisation and marginalisation in practice.
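The linear least squares material in topics 5 and 6 can be sketched in a few lines of Python (illustrative only; the data and the straight-line model are invented for the example): fitting y = a + b*x by solving the 2x2 normal equations in closed form.

```python
# Fit y = a + b*x by least squares: solve the normal equations
# (X^T X) theta = X^T y in closed form for a two-parameter model.
# The data here are synthetic and noise-free, so the fit recovers
# the true line exactly.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0 + 0.5 * x for x in xs]    # true line: a = 1, b = 0.5

n = len(xs)
sx = sum(xs)
sy = sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

det = n * sxx - sx * sx             # determinant of X^T X
a = (sxx * sy - sx * sxy) / det     # intercept
b = (n * sxy - sx * sy) / det       # slope
```

With zero-mean Gaussian noise on the observations, this least squares solution coincides with the maximum-likelihood estimate, which is one reason the course treats the two topics together.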
Entry Requirements (not applicable to Visiting Students)
Pre-requisites: none listed    Co-requisites: none listed
Prohibited Combinations: none listed    Other requirements: None
Course Delivery Information
Academic year 2019/20, Not available to visiting students (SS1). Quota: 10
Course Start: Block 1 (Sem 1)
Timetable
Learning and Teaching activities (Further Info): Total Hours: 100 (Lecture Hours 44, Seminar/Tutorial Hours 22, Feedback/Feedforward Hours 11, Programme Level Learning and Teaching Hours 2, Directed Learning and Independent Learning Hours 21)
Assessment (Further Info): Written Exam 100%, Coursework 0%, Practical Exam 0%
Additional Information (Assessment): 100% Examination
Feedback: Not entered
Exam Information
Exam Diet Paper Name Hours & Minutes
Main Exam Diet S1 (December): 1:30
Learning Outcomes
At the end of the Probability, Random Variables and Estimation Theory course, a student should be able to:

1. define, understand and manipulate scalar and multiple random variables, using the theory of probability; this should include the tools of probability transformations and characteristic functions;

2. explain the notion of characterising random variables and random vectors using moments, and be able to manipulate them; understand the relationship between random variables within a random vector;

3. understand the central limit theorem (CLT) and explain its role in the analysis of sums of random variables and its use in estimation theory;

4. understand the principles of estimation theory; understand and be able to apply estimation techniques such as maximum-likelihood, least squares, and Bayesian estimation;

5. be able to characterise the uncertainty in an estimator, as well as characterise the performance of an estimator (bias, variance, and so forth); understand the Cramér-Rao lower bound (CRLB) and the minimum variance unbiased estimator (MVUE);

6. if time permits, explain and apply methods for generating random numbers, or random variates, from an arbitrary distribution, using methods such as accept-reject and Gibbs sampling; understand the notion of stochastic numerical methods for solving integration and optimisation problems.
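The accept-reject method named in outcome 6 can be sketched in Python (illustrative only; the target density f(x) = 2x on [0, 1] and the Uniform(0, 1) proposal are chosen purely for the example): draw a candidate from the proposal and accept it with probability f(x)/M, where M bounds f on the support.

```python
import random

def accept_reject(target_pdf, m_bound, n=50_000, seed=3):
    """Accept-reject sampling with a Uniform(0,1) proposal: draw a
    candidate x, accept it with probability target_pdf(x) / m_bound,
    where m_bound is an upper bound on target_pdf over [0, 1]."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.random()
        if rng.random() * m_bound <= target_pdf(x):
            out.append(x)
    return out

# Target density f(x) = 2x on [0, 1], which has mean 2/3; M = 2 bounds it.
samples = accept_reject(lambda x: 2.0 * x, 2.0)
mean = sum(samples) / len(samples)
```

The average acceptance rate is 1/M, so a tight bound M matters in practice; Gibbs sampling, also named in the outcome, avoids the bound entirely by sampling each coordinate from its conditional distribution.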
Reading List
1. Recommended course textbook: Therrien C. W. and M. Tummala, Probability and Random Processes for Electrical and Computer Engineers, Second edition, CRC Press, 2011. Hardback, ISBN-10: 1439826986, ISBN-13: 978-1439826980.

2. Manolakis D. G., V. K. Ingle, and S. M. Kogon, Statistical and Adaptive Signal Processing: Spectral Estimation, Signal Modeling, Adaptive Filtering and Array Processing, McGraw Hill, Inc., 2000.

3. Kay S. M., Fundamentals of Statistical Signal Processing: Estimation Theory, Prentice-Hall, Inc., 1993.

4. Papoulis A. and S. Pillai, Probability, Random Variables, and Stochastic Processes, Fourth edition, McGraw Hill, Inc., 2002.
Additional Information
Graduate Attributes and Skills: Not entered
Keywords: Probability, random variables, random vectors, and estimation theory.
Contacts
Course organiser: Dr James Hopgood
Tel: (0131 6)50 5571
Email: James.Hopgood@ed.ac.uk
Course secretary: Mrs Megan Inch-Kellingray
Tel: (0131 6)51 7079
Email: M.Inch@ed.ac.uk