Teaching Parallel Programming to Lower Division Undergraduates

Multicore hardware is pervasive, yet current undergraduate CS curricula give students so little exposure to parallelism that most cannot program multicore systems by the time they graduate. To grasp the underpinnings of parallelism, students need to encounter it early in the curriculum. That is why Professor Peter Pacheco (University of San Francisco) began teaching the course Introduction to Parallel Computing to lower-division undergraduates six years ago.

To share the best practices developed over those six years, Professor Pacheco presented at the NSF/IEEE Computer Society Technical Committee on Parallel Processing (NSF/TCPP) workshop on parallel computing education (EduPar-11) at IPDPS 2011. Professor Pacheco's course is based on the following principles:
1) Students need to start writing parallel programs early.
2) Formalism and rigor are less important than hands-on experience and learning to "think in parallel" early in one's CS education.

Watch the Presentation >> 

Professor Pacheco covers the course organization, what he teaches and what he doesn't, the course infrastructure, challenges, and key learnings. Download the presentation slides: Teaching_Parallelism_to_Lower-Division_Undergrads.pdf

Get the course materials >>

- Homework/programming assignments
- Homework solutions
- Syllabus
- Additional resources

Video Interviews 

Parallel Programming Talk #104 - Dr. Peter Pacheco discusses his new book, "An Introduction to Parallel Programming"

Interview with Professor Pacheco at IPDPS 2011


Textbook: An Introduction to Parallel Programming, by Peter Pacheco



Key Features

·  Takes a tutorial approach, starting with small programming examples and building progressively to more challenging examples

·  Focuses on designing, debugging and evaluating the performance of distributed and shared-memory programs

·  Explains how to develop parallel programs using MPI, Pthreads, and OpenMP programming models


Author Peter Pacheco uses a tutorial approach to show students how to develop effective parallel programs with MPI, Pthreads, and OpenMP. The first undergraduate text to directly address compiling and running parallel programs on the new multi-core and cluster architectures, An Introduction to Parallel Programming explains how to design, debug, and evaluate the performance of distributed and shared-memory programs. User-friendly exercises teach students how to compile, run, and modify example programs.


Intended audience: students in undergraduate parallel programming or parallel computing courses, whether designed for the computer science major or offered as a service course to other departments, as well as professionals with no background in parallel computing.


Biography & Research Interests


"My main research interest is in parallel computing. I've been involved in the development of the MPI Standard for message-passing, and I've written a short User's Guide to MPI. My book Parallel Programming with MPI is an elementary introduction to programming parallel systems that use the MPI 1.1 library of extensions to C and Fortran. It is intended for use by students and professionals with some knowledge of programming conventional, single-processor systems, but who have little or no experience programming multiprocessor systems.

I've also worked in computational neuroscience, and my students and I have developed a collection of programs, Parallel Neurosys, for the simulation of large networks of biologically accurate neurons on parallel computers.

Most recently I've written a more general introduction to parallel programming, An Introduction to Parallel Programming. This is also an elementary introduction to parallel programming, but in addition to MPI, it introduces parallel programming in Pthreads and OpenMP."






1 comment


Supporting Materials For Basics Of Parallel Programming.

Rama Chandra,
Ravali Technologies.

