Postgraduate Course: Probability, Random Variables and Estimation Theory (PET) (MSc) (PGEE11123)
Course Outline
School  School of Engineering 
College  College of Science and Engineering 
Credit level (Normal year taken)  SCQF Level 11 (Postgraduate) 
Availability  Not available to visiting students 
SCQF Credits  10 
ECTS Credits  5 
Summary  The Probability, Random Variables and Estimation Theory course introduces the fundamental statistical tools required to analyse and describe advanced signal processing algorithms within the MSc Signal Processing and Communications programme. It provides a unified mathematical framework for describing random events and signals, and for characterising the key properties of random processes. The course covers probability theory, considers the notion of random variables and vectors and how they can be manipulated, and provides an introduction to estimation theory. It is demonstrated that many estimation problems, and therefore signal processing problems, can be reduced to an exercise in either optimisation or integration. While these problems can be solved using deterministic numerical methods, the course introduces Monte Carlo techniques, which are the basis of powerful stochastic optimisation and integration algorithms. These methods rely on being able to sample numbers, or variates, from arbitrary distributions. The course therefore discusses the concepts needed to understand these methods and, if time permits, considers techniques for random number generation.
This course, in combination with the Statistical Signal Processing course, provides the fundamental knowledge required for the advanced signal, image, and communication modules in the MSc course. 
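As a brief illustration of the stochastic integration idea mentioned above, the following minimal Python sketch (illustrative only, and not part of the course material; the integrand is chosen arbitrarily) estimates an integral as a sample mean over uniform variates:

    import numpy as np

    rng = np.random.default_rng(0)

    # Estimate I = integral of exp(-x^2) over [0, 1] by Monte Carlo:
    # I = E[exp(-U^2)] for U ~ Uniform(0, 1), approximated by a sample mean.
    n = 100_000
    u = rng.uniform(0.0, 1.0, size=n)
    g = np.exp(-u**2)
    estimate = g.mean()

    # The standard error of the estimator shrinks as 1/sqrt(n).
    std_err = g.std(ddof=1) / np.sqrt(n)
    print(f"Monte Carlo estimate: {estimate:.5f} (std. error {std_err:.5f})")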
Course description 
Any minor modifications to the latest syllabus and lecture content are contained in the lecture handouts.
Course Introduction, Motivation, Prerequisites (2 lectures):
1. Motivating the field of statistical signal processing, along with the role of probability, random variables, and estimation theory as a consistent mathematical analysis framework.
2. Examples of modern signal processing applications.
3. Function norms (signal measures), the Fourier transform, and the Laplace transform (revision); the key definitions are restated after this list.
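For reference, the revision material centres on standard definitions such as the $L_p$ function norm and the Fourier transform of a signal $x(t)$:

    \|x\|_p = \left( \int_{-\infty}^{\infty} |x(t)|^p \, dt \right)^{1/p},
    \qquad
    X(\omega) = \int_{-\infty}^{\infty} x(t) \, e^{-j \omega t} \, dt.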
Scalar Random Variables (4 lectures):
1. Notion of a random variable and its formal definition involving experimental outcomes, sample space, probability of events, and assigned values; the concept of the cumulative distribution function (cdf), the probability density function (pdf), and their formal properties.
2. Discrete random variables (RVs), their probability mass function (pmf), the corresponding cdfs and pdfs, as well as mixtures of continuous and discrete random variables.
3. Examples of several common discrete and continuous RVs and their pdfs.
4. Introduction to the probability transformation rule through a conceptual derivation, with examples; a worked statement is given after this list.
5. Expectations, moments, central moments, higher-order statistics, and cumulants.
6. The characteristic function, the moment generating function (MGF), properties and examples.
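The probability transformation rule referred to in item 4 can be stated as follows: if $Y = g(X)$ for a strictly monotonic, differentiable function $g$, then

    f_Y(y) = f_X\big( g^{-1}(y) \big) \left| \frac{\mathrm{d} g^{-1}(y)}{\mathrm{d} y} \right|.

For example, the linear map $Y = aX + b$ with $a \neq 0$ gives $f_Y(y) = \frac{1}{|a|} f_X\!\left( \frac{y - b}{a} \right)$.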
Random Vectors and Multiple Random Variables (7 lectures):
1. Generalisation of the theory on scalar random variables to multiple random variables.
2. Introduction to the concept and formal definition of a random vector, along with the notion of a joint cdf, joint pdf; cover the properties of joint cdfs and pdfs, the probability of arbitrary events, and calculating joint cdfs from joint pdfs.
3. Introduction to marginal cdfs and pdfs, independence, conditional densities, and Bayes's theorem.
4. Popular examples of dependent random variables, including the Monty Hall problem.
5. The probability transformation rule, including calculating the Jacobian, auxiliary variables, and the sum of independent RVs. Examples include the Cartesian-to-polar coordinate transformation.
6. Statistical descriptions of random vectors, including the mean and correlation matrices, cross-correlation, and cross-covariance. Covers the properties of correlation and covariance matrices, and determining whether a given correlation or covariance matrix is valid.
7. Considers special case of linear transformations of random vectors; effect of linear transformations on statistical properties; invariance of the expectation operator.
8. Normally distributed random vectors; derivation of the Gaussian integral identity; the two envelopes problem/paradox; properties of the multivariate Gaussian.
9. Characteristic functions and MGFs for random vectors.
10. Analysis of the sum of independent random variables, and the central limit theorem (CLT), using characteristic functions. MATLAB demo of the CLT; an illustrative sketch follows this list.
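A minimal Python sketch of the kind of CLT demonstration described in item 10 (the course demo uses MATLAB; this NumPy version is illustrative only, and the choice of uniform summands is an assumption):

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)

    # Sum n i.i.d. Uniform(0, 1) variables; by the CLT, the standardised
    # sum approaches a standard normal distribution as n grows.
    n, trials = 30, 50_000
    sums = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)
    z = (sums - n * 0.5) / np.sqrt(n / 12.0)   # Uniform(0,1): mean 1/2, variance 1/12

    # Compare the empirical histogram against the standard normal pdf.
    plt.hist(z, bins=80, density=True, alpha=0.6, label="standardised sums")
    x = np.linspace(-4.0, 4.0, 400)
    plt.plot(x, np.exp(-x**2 / 2) / np.sqrt(2 * np.pi), label="N(0, 1) pdf")
    plt.legend()
    plt.show()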
Principles of Estimation Theory (7 lectures):
1. General introduction to parameter estimation, set in the context of repeated observations of an experiment, possibly as a function of time. Covers examples such as the taxicab problem.
2. Properties of estimators: bias, variance, mean-squared error (MSE), and the Cramér-Rao lower bound (CRLB). Discusses the notion of a likelihood function. Includes examples such as the sample mean and sample variance.
3. Efficiency of an estimator, consistency, and estimating multiple parameters.
4. The maximum-likelihood estimator and the principle of least squares.
5. Linear least squares and the normal equations; see the sketch after this list.
6. Introduction to Bayesian estimation: priors, marginalisation, posterior distributions.
7. Overview of problems of optimisation and marginalisation in practice.
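As an illustration of items 4 and 5, a minimal Python sketch (illustrative only; the linear model and noise level are chosen for the example) that fits a straight line by solving the normal equations A^T A theta = A^T y:

    import numpy as np

    rng = np.random.default_rng(0)

    # Noisy observations of y = 2 + 3 t, to be fitted by linear least squares.
    t = np.linspace(0.0, 1.0, 50)
    y = 2.0 + 3.0 * t + 0.1 * rng.standard_normal(t.size)

    # Design matrix: a column of ones (intercept) and a column of t (slope).
    A = np.column_stack([np.ones_like(t), t])

    # Solve the normal equations A^T A theta = A^T y.
    theta = np.linalg.solve(A.T @ A, A.T @ y)
    print("estimated [intercept, slope]:", theta)

In practice, np.linalg.lstsq is preferred over forming A^T A explicitly, for numerical stability.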

Entry Requirements (not applicable to Visiting Students)
Prerequisites 

Corequisites  
Prohibited Combinations  
Other requirements  None 
Course Delivery Information
Not being delivered 
Learning Outcomes
On completion of this course, the student will be able to:
1. define, understand, and manipulate scalar and multiple random variables using the theory of probability, including the tools of probability transformations and characteristic functions;
2. explain the notion of characterising random variables and random vectors using moments, and be able to manipulate them; understand the relationships between the random variables within a random vector; understand the central limit theorem (CLT) and explain its role in estimation theory and in analysing sums of random variables;
3. understand the principles of estimation theory, and understand and be able to apply estimation techniques such as maximum-likelihood, least squares, and Bayesian estimation;
4. characterise the uncertainty in an estimator, as well as its performance (bias, variance, and so forth); understand the Cramér-Rao lower bound (CRLB) and minimum-variance unbiased estimators (MVUE);
5. if time permits, explain and apply methods for generating random numbers, or random variates, from an arbitrary distribution, using methods such as accept-reject and Gibbs sampling (a minimal sketch follows below); understand the notion of stochastic numerical methods for solving integration and optimisation problems.
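By way of illustration of the accept-reject method named in the final outcome, a minimal Python sketch (illustrative only; the target density is chosen for the example) that samples from f(x) = 2x on [0, 1] using a uniform proposal:

    import numpy as np

    rng = np.random.default_rng(0)

    # Target f(x) = 2x on [0, 1]; proposal q = Uniform(0, 1), so q(x) = 1.
    # f(x) <= M q(x) holds with envelope constant M = 2.
    M = 2.0

    def sample_target(n):
        out = []
        while len(out) < n:
            x = rng.uniform(0.0, 1.0)        # candidate drawn from q
            u = rng.uniform(0.0, 1.0)        # uniform for the accept test
            if u <= (2.0 * x) / M:           # accept with probability f(x) / (M q(x))
                out.append(x)
        return np.array(out)

    samples = sample_target(10_000)
    print("sample mean:", samples.mean())    # should be close to E[X] = 2/3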

Reading List
1. Recommended course textbook: Therrien C. W. and M. Tummala, Probability and Random Processes for Electrical and Computer Engineers, Second edition, CRC Press, 2011. Hardback, ISBN-10: 1439826986; ISBN-13: 9781439826980.
2. Manolakis D. G., V. K. Ingle, and S. M. Kogon, Statistical and Adaptive Signal Processing: Spectral Estimation, Signal Modeling, Adaptive Filtering and Array Processing, McGraw Hill, Inc., 2000.
3. Kay S. M., Fundamentals of Statistical Signal Processing: Estimation Theory, Prentice-Hall, Inc., 1993.
4. Papoulis A. and S. Pillai, Probability, Random Variables, and Stochastic Processes, Fourth edition, McGraw Hill, Inc., 2002. 
Additional Information
Graduate Attributes and Skills 
Not entered 
Keywords  Probability, random variables, random vectors, and estimation theory.
Contacts
Course organiser  Dr James Hopgood
Tel: (0131 6)50 5571
Email: James.Hopgood@ed.ac.uk 
Course secretary  Mrs Megan Inch-Kellingray
Tel: (0131 6)51 7079
Email: M.Inch@ed.ac.uk 

