Lab 2 - MPI Broadcast and Reduce

The MPI trapezoid rule program is in the MPI program zip file found at Message Passing using MPI. Download the program files from the MPI_examples link. Be sure to read the description of this program on that page. You will use other files from this set later.

When the instructions say to run the program and record the results, record the average of at least 3 runs. A spike in network traffic or CPU load can make one run last longer than it otherwise would, so always run the same command at least 3 times and record the average. NOTE: If one run is an outlier (say, 3 times larger or smaller than the others), don't include it in the average. Throw that result out and run again.
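A small shell loop can save some typing when collecting these timings. This is just a sketch, assuming a bash shell and the shell's built-in time command; swap in whichever mpi_trap command line you are currently recording:

    # run the same command 3 times, timing each run with the shell's time builtin;
    # average the reported times by hand, discarding any obvious outlier
    for i in 1 2 3; do
        time mpiexec -n 1 ./mpi_trap 1000000
    done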

  1. Unzip the archive and change directory to MPI_Examples/trapIntegration/mpi_trap
  2. Type make to build the program
  3. Using either mpiexec or mpirun, run the program on one process, with a million trapezoids. Record the result in a file. For example:
    • mpiexec -n 1 ./mpi_trap 1000000
  4. Repeat the run on a single machine using 2, 4, 8, and 16 processes. Record the results and label each result with the process count (example command lines for these runs are sketched after this list).
  5. Repeat your program runs from the previous step, but on 4 processors by using the -f host_file option. Make sure you have an appropriate host_file in the directory with your executable program.
  6. You will need to modify host_file first to add a 4th computer to the list. Then, before the 8- and 16-process runs, modify it again to set the number of cores to use on each machine (see below). This ensures the proper number of cores is used per host. Record these results as well.
  7. Graph the speedup versus the number of processes twice, once for each set of runs. Speedup for p processes is the average 1-process run time divided by the average p-process run time. Is communication overhead a significant factor?
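As a reference for steps 4 through 6, the run commands might look like the sketch below. The -f host_file option follows the mpiexec (MPICH/Hydra) style named in step 5; if your lab uses Open MPI's mpirun, the corresponding option is --hostfile host_file:

    # step 4: single machine, varying the process count
    mpiexec -n 2 ./mpi_trap 1000000
    mpiexec -n 4 ./mpi_trap 1000000
    mpiexec -n 8 ./mpi_trap 1000000
    mpiexec -n 16 ./mpi_trap 1000000

    # steps 5 and 6: the same runs spread across the machines listed in host_file
    mpiexec -f host_file -n 2 ./mpi_trap 1000000
    mpiexec -f host_file -n 4 ./mpi_trap 1000000
    mpiexec -f host_file -n 8 ./mpi_trap 1000000
    mpiexec -f host_file -n 16 ./mpi_trap 1000000

    # speedup for p processes = (average 1-process time) / (average p-process time)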

As we may have discussed in lecture, the entries in host_file can include the number of cores to use. Simply add :2 or :4 after each name.
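For example, a host_file for four 4-core machines could be created as shown below. The names machine1 through machine4 are placeholders, not actual lab hostnames; substitute the machines you have been assigned:

    # create a host_file listing 4 hosts with 4 cores each
    # (machine1..machine4 are hypothetical names; use your lab's hostnames)
    cat > host_file <<'EOF'
    machine1:4
    machine2:4
    machine3:4
    machine4:4
    EOF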
