An Introduction to MPI: Parallel Programming with the Message Passing Interface

10/14/98


Table of Contents

An Introduction to MPI: Parallel Programming with the Message Passing Interface

Outline

Outline (continued)

Companion Material

The Message-Passing Model

Types of Parallel Computing Models

Cooperative Operations for Communication

One-Sided Operations for Communication

What is MPI?

MPI Sources

Why Use MPI?

A Minimal MPI Program (C)

A Minimal MPI Program (Fortran)

Notes on C and Fortran

Error Handling

Running MPI Programs

Finding Out About the Environment

Better Hello (C)

Better Hello (Fortran)

MPI Basic Send/Receive

What is message passing?

Some Basic Concepts

MPI Datatypes

MPI Tags

MPI Basic (Blocking) Send

MPI Basic (Blocking) Receive

Retrieving Further Information

Simple Fortran Example - 1

Simple Fortran Example - 2

Simple Fortran Example - 3

Why Datatypes?

Tags and Contexts

MPI is Simple

Introduction to Collective Operations in MPI

Example: PI in Fortran - 1

Example: PI in Fortran - 2

Example: PI in Fortran - 3

Example: PI in C - 1

Example: PI in C - 2

Alternative Set of 6 Functions for Simplified MPI

Sources of Deadlocks

Some Solutions to the "unsafe" Problem

Toward a Portable MPI Environment

Extending the Message-Passing Interface

Some Simple Exercises

When to use MPI

When not to use MPI

Summary

Author: William D Gropp

Email: [email protected]

Home Page: http://www.mcs.anl.gov/~gropp
