Intermediate Topics in MPI

This workshop targets programmers in both academia and industry who already have experience with basic MPI and are ready to take the next step to more advanced usage. Topics covered include communicators, groups, derived data types, one-sided communication, non-blocking collectives, and hybrid MPI+threading approaches. Lectures will be interleaved with hands-on exercises. All exercises will be written in C, but the instructors will be able to answer questions about MPI in Fortran and Python.

This workshop is a collaboration between ENCCS, PDC, and HPC2N. The course is free.

The Message Passing Interface (MPI) is the de facto standard for distributed-memory parallelism in high-performance computing (HPC). MPI is the dominant programming model for modern-day supercomputers and will continue to be critical in enabling researchers to scale up their HPC workloads to forthcoming pre-exascale and exascale systems within EuroHPC and elsewhere.

Pre-requisites

  • Familiarity with MPI in C/C++, Fortran, or Python, either from introductory courses or workshops (e.g. PDC’s Introduction to MPI or SNIC’s introduction to parallel programming course) or through self-taught use.
  • Familiarity with C/C++
  • Basic Linux command-line skills
  • Access to a computing cluster, or your own computer with MPI and compilers installed.

Instructors

  • Xin Li, PhD
  • Pedro Ojeda May, PhD
  • Roberto Di Remigio Eikås, PhD

Time and date

Date: 14–17 June 2022
Time: 09:00–12:30 CEST each day

More information, agenda, and registration at the ENCCS event page.

Updated: 2024-03-21, 12:31