[petsc-users] Mesh partitioning and MPI calls
Matthew Knepley
knepley at gmail.com
Thu Aug 25 23:49:58 CDT 2011
On Thu, Aug 25, 2011 at 9:15 PM, Tabrez Ali <stali at geology.wisc.edu> wrote:
> Hello
>
> I have an unstructured FE mesh which I am partitioning using Metis.
>
> In the first case I only use the element partitioning info and discard the
> nodal partitioning info, i.e., the original ordering is the same as PETSc's
> global ordering. In the second case I do use the nodal partitioning info and
> the nodes are distributed accordingly.
>
> I would expect that in the 2nd scenario the total number of MPI messages
> (at the end of the solve) would be lower than in the 1st. However, I see that
> the opposite is true. See the plot at http://stali.freeshell.org/mpi.png
>
> The number on the y axis is the last column of the "MPI messages:" field
> from the -log_summary output.
>
> Any ideas as to why this is happening? Does relying on the total number of MPI
> messages as a performance measure even make sense? Please excuse my
> ignorance on the subject.
>
> Alternatively, what is a good way to measure how good the Metis partitioning
> is?
>
The thing to do here is to take a case, like 2 procs, that can be completely
understood, and get down to the details. I think there is probably just a
simple misunderstanding here.
The first thing to check is that you are partitioning what you think. By
default, Metis partitions the vertices of a graph, not elements. Thus you
usually have to give Metis the "dual" of your finite element mesh. I would
take a small (maybe 8 or 10 elements) mesh and look at the original and Metis
partitions. If Metis does not look better, something is wrong.
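
For illustration, here is a minimal sketch of what "giving Metis the dual" can
look like, assuming the METIS 5.x METIS_PartMeshDual routine, which builds the
element dual graph internally and partitions the elements directly. The tiny
two-triangle mesh below is made up for this example and is not taken from the
thread:

/* Sketch: partition a toy FE mesh by its element dual graph with METIS 5.x */
#include <stdio.h>
#include <metis.h>

int main(void)
{
  /* Two triangles sharing an edge: elements (0,1,2) and (1,3,2) */
  idx_t ne = 2, nn = 4;
  idx_t eptr[] = {0, 3, 6};          /* element i uses nodes eind[eptr[i]..eptr[i+1]) */
  idx_t eind[] = {0, 1, 2, 1, 3, 2};
  idx_t ncommon = 2;                 /* 2 shared nodes define a dual edge (triangle face) */
  idx_t nparts = 2, objval;
  idx_t epart[2], npart[4];

  int status = METIS_PartMeshDual(&ne, &nn, eptr, eind,
                                  NULL /*vwgt*/, NULL /*vsize*/, &ncommon, &nparts,
                                  NULL /*tpwgts*/, NULL /*options*/,
                                  &objval, epart, npart);
  if (status != METIS_OK) { fprintf(stderr, "METIS error\n"); return 1; }

  printf("dual-graph edge cut = %d\n", (int)objval);
  for (idx_t e = 0; e < ne; e++)
    printf("element %d -> part %d\n", (int)e, (int)epart[e]);
  return 0;
}

The returned objval is the edge cut of the element dual graph, which is one
rough proxy for the communication a given partition will induce; comparing it
between the two orderings is a reasonable first check of partition quality.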
Matt
> Thanks in advance
>
> Tabrez
>
>
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener