[petsc-users] Mesh partitioning and MPI calls

Tabrez Ali stali at geology.wisc.edu
Thu Aug 25 16:15:17 CDT 2011


Hello

I have an unstructured FE mesh which I am partitioning using Metis.

In the first case I only use the element partitioning info and discard 
the nodal partitioning info, i.e., the original ordering is the same as 
PETSc's global ordering. In the second case I do use the nodal 
partitioning info and the nodes are distributed accordingly.
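For reference, this is roughly how I get the two partition vectors (just 
a sketch, assuming the METIS 5 interface; eptr/eind are the usual 
CSR-style mesh arrays and the exact arguments differ for METIS 4):

    /* Sketch: obtain both partition vectors from METIS_PartMeshDual.
     * epart (element partition) is used in both cases; npart (nodal
     * partition) is what I discard in case 1 and use in case 2. */
    #include <metis.h>

    void partition_mesh(idx_t ne, idx_t nn, idx_t *eptr, idx_t *eind,
                        idx_t nparts, idx_t *epart, idx_t *npart)
    {
        idx_t ncommon = 3;   /* nodes shared by two elements to form a face */
        idx_t objval;        /* edge cut reported by METIS                  */

        METIS_PartMeshDual(&ne, &nn, eptr, eind,
                           NULL, NULL,          /* no element weights/sizes */
                           &ncommon, &nparts,
                           NULL, NULL,          /* default tpwgts, options  */
                           &objval, epart, npart);
    }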

I would expect that in the 2nd scenario the total number of MPI messages 
(at the end of the solve) would be lower than in the 1st. However, I see 
that the opposite is true. See the plot at http://stali.freeshell.org/mpi.png

The number on the y-axis is the last column of the "MPI Messages:" line 
in the -log_summary output.

Any ideas as to why this is happening? Does relying on the total number 
of MPI messages as a performance measure even make sense? Please excuse 
my ignorance on the subject.

Alternatively, what is a good way to measure the quality of the Metis 
partitioning?
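For example, would counting the interface nodes (nodes touched by 
elements assigned to more than one partition) be a reasonable proxy for 
the communication volume? Something along these lines (hypothetical 
helper, same eptr/eind/epart arrays as in the sketch above):

    /* Sketch: count nodes shared between partitions as a rough
     * measure of how much communication the partitioning implies. */
    #include <metis.h>
    #include <stdlib.h>

    idx_t count_interface_nodes(idx_t ne, idx_t nn,
                                const idx_t *eptr, const idx_t *eind,
                                const idx_t *epart)
    {
        idx_t *owner = malloc(nn * sizeof(idx_t));
        idx_t i, j, n, shared = 0;

        for (n = 0; n < nn; n++) owner[n] = -1;           /* unvisited      */
        for (i = 0; i < ne; i++) {
            for (j = eptr[i]; j < eptr[i + 1]; j++) {
                n = eind[j];
                if (owner[n] == -1) owner[n] = epart[i];  /* first toucher  */
                else if (owner[n] != epart[i] && owner[n] != -2) {
                    owner[n] = -2;                        /* mark interface */
                    shared++;
                }
            }
        }
        free(owner);
        return shared;
    }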

Thanks in advance

Tabrez
