Postgraduate Course: Message Passing Programming (INFD11011)
|School||School of Informatics
|College||College of Science and Engineering
|Credit level (Normal year taken)||SCQF Level 11 (Postgraduate)
|Course type||Online Distance Learning
|Availability||Available to all students
|Summary||*This course is delivered online for students on Online Learning programmes within the College of Science and Engineering only. On-campus students interested in the material should refer to INFR11163 - Message-passing Programming*
Parallel programming by definition involves co-operation between processors to solve a common problem. The programmer has to define the tasks that will be executed by the processors, and also how these tasks exchange data and synchronise with each other.
In the message-passing model the tasks are separate processes that communicate by explicitly sending each other messages. All parallel operations are performed via calls to some message-passing interface that is entirely responsible for interfacing with the physical communication network.
This course uses the de facto standard for message passing, the Message Passing Interface (MPI), which is a library callable from C, C++ or Fortran. Parallel programs written using MPI can run on almost any system from a multicore laptop up to the world's largest supercomputers.
Lecture content is delivered both as pre-recorded videos and live via e.g. Blackboard Collaborate. Practical activities are set for students to undertake in their own time in advance of regular tutorial sessions run via Blackboard Collaborate, with discussion and support available from other students, demonstrators and staff via dedicated discussion areas (e.g. Learn Discussion Boards, Slack).
The course will cover the following topics:
- The message-passing model
- Message-passing parallelisation of a regular domain code
- MPI terminology
- The anatomy of send and receive (synchronous and asynchronous)
- Point-to-point message-passing examples
- Non-blocking operations
- Communicator management
- Derived datatypes (focusing mainly on array subsections)
- Practicalities / Hints and Tips
Information for Visiting Students
|Pre-requisites||Ability to program in C, C++ or Fortran.
|High Demand Course?
Course Delivery Information
|Academic year 2019/20, Available to all students (SV1)
|Course Start Date
|Learning and Teaching activities (Further Info)
Online Activities 30,
Programme Level Learning and Teaching Hours 2,
Directed Learning and Independent Learning Hours
|Assessment (Further Info)
|Additional Information (Assessment)||Written Exam 0 %, Coursework 100 %, Practical Exam 0 %
This is a practical course, and the associated programming exercises are at least as important as the lectures. The course is assessed by coursework: students write an MPI program to solve a given problem and produce a report covering the program's design, implementation and performance, demonstrating good scientific writing.
||Provided on assessed work and through discussion of practical exercises in scheduled Blackboard Collaborate class sessions and class discussion areas.
|No Exam Information
On completion of this course, the student will be able to:
- Describe the message-passing model in detail.
- Explain the circumstances which cause issues such as deadlock.
- Implement standard message-passing algorithms in MPI.
- Measure and comment on the performance of MPI programs.
- Design, implement and debug efficient parallel programs to solve regular-grid problems.
|"Using MPI: Portable Parallel Programming with the Message-Passing Interface", William Gropp, Ewing Lusk and Anthony Skjellum.|
|Graduate Attributes and Skills
Project management skills.
Effective written and diagrammatic communication.
Data collection and analysis.
|Additional Class Delivery Information||Fully delivered online
|Keywords||programming, message passing, parallel, distributed memory, MPI, HPC, parallelism
|Course organiser||Dr David Henty
Tel: (0131 6)50 5960
|Course secretary||Mr Ben Morse
Tel: (0131 6)51 3398