[petsc-users] Parallelization efficiency diagnosis

Sun, Hui hus003 at ucsd.edu
Thu Dec 4 18:49:17 CST 2014


Hello,

I am trying to test the parallelization efficiency of my current PETSc installation by running the following command:


make PETSC_DIR=/Users/Paul/Documents/software/petsc-install PETSC_ARCH= streams NPMAX=4


I get the following results:


Number of MPI processes 1
Process 0 Huis-MacBook-Pro.local
Function      Rate (MB/s)
Copy:       13354.1380
Scale:      13012.7268
Add:        14725.4078
Triad:      14822.7110

Number of MPI processes 2
Process 0 Huis-MacBook-Pro.local
Process 1 Huis-MacBook-Pro.local
Function      Rate (MB/s)
Copy:       14135.6610
Scale:      14071.0462
Add:        15598.0208
Triad:      15717.5890

Number of MPI processes 3
Process 0 Huis-MacBook-Pro.local
Process 1 Huis-MacBook-Pro.local
Process 2 Huis-MacBook-Pro.local
Function      Rate (MB/s)
Copy:       13755.8241
Scale:      13704.7662
Add:        15312.1487
Triad:      15319.4803

Number of MPI processes 4
Process 0 Huis-MacBook-Pro.local
Process 1 Huis-MacBook-Pro.local
Process 2 Huis-MacBook-Pro.local
Process 3 Huis-MacBook-Pro.local
Function      Rate (MB/s)
Copy:       13769.1621
Scale:      13708.0972
Add:        15103.1783
Triad:      15133.8786

------------------------------------------------
np  speedup
1 1.0
2 1.06
3 1.03
4 1.02

Estimation of possible speedup of MPI programs based on Streams benchmark.
It appears you have 1 node(s)
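(If I divide each Triad rate by the single-process Triad rate, I reproduce the speedup column, e.g. 15717.5890 / 14822.7110 ≈ 1.06 for 2 processes and 15133.8786 / 14822.7110 ≈ 1.02 for 4, so I assume the table is computed from the Triad memory bandwidth.)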


Does this result basically say that my MacBook has only one node? However, I know my computer has 4 cores: when I type sysctl -n hw.ncpu in bash, it gives me 4 (see also the core-count checks at the end of this message). What does this really mean? By the way, here is my configuration for this installation:


./configure --download-fblaslapack --download-suitesparse --download-superlu_dist --download-parmetis  --download-metis --download-hypre --prefix=/Users/Paul/Documents/software/petsc-install --with-mpi-dir=/Users/Paul/Documents/software/mpich-3.1.3-install


Is there anything in this configuration command that causes trouble?
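
For reference, here is how I am double-checking the core count (a minimal sketch; I believe these sysctl keys exist on OS X, but I have not pasted their output here):

sysctl -n hw.ncpu          # logical CPUs; this is where the 4 above comes from
sysctl -n hw.physicalcpu   # physical cores, in case hw.ncpu is counting hyperthreads
sysctl -n hw.logicalcpu    # logical cores; should match hw.ncpu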


Best,

Hui

