[petsc-users] Partitioner in DMPlexDistribute

Matthew Knepley knepley at gmail.com
Mon Oct 28 07:42:59 CDT 2013

On Mon, Oct 28, 2013 at 6:17 AM, Cedric Doucet <cedric.doucet at inria.fr> wrote:

> I was not looking at the latest version of the code (it seems online
> documentation has not been updated yet).
> Now, I can see that partitioner is used as an argument of
> DMPlexCreatePartition.
> However, there is a new argument: a PetscSF.
> I have two questions about this new implementation:
> 1. which partitioner can be used?

Chaco and ParMetis. These are really the only partitioners that
matter. If you want another one, it would not be
hard to add, but the performance of all the others is not really
competitive with these two.

> 2. how to initialize the new PetscSF argument?

You do not need to initialize it if you just call DMPlexDistribute().
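To make both answers concrete, here is a minimal sketch of the call, assuming the petsc-dev interface discussed in this thread (circa October 2013), i.e. DMPlexDistribute(DM, const char partitioner[], PetscInt overlap, PetscSF *sf, DM *dmParallel). The mesh-creation step is elided since it depends on the application; the key points are that the partitioner is named by a string ("chaco" or "parmetis") and that the PetscSF output can simply be passed as NULL when you do not need it:

```c
/* Sketch only: assumes the petsc-dev DMPlexDistribute() signature from
 * October 2013; later releases changed this interface. */
#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist = NULL;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* ... create the serial DMPlex mesh in dm here ... */

  /* Partition with ParMetis ("chaco" also works), zero overlap.
   * The PetscSF output argument is NULL: no initialization is needed
   * when you only want the redistributed mesh. */
  ierr = DMPlexDistribute(dm, "parmetis", 0, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {               /* dmDist is NULL on a single process */
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = dmDist;
  }

  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
```

Note that ParMetis must be enabled at configure time (e.g. --download-parmetis) for the "parmetis" string to work; otherwise the call will error out.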


> ------------------------------
> *From: *"Cedric Doucet" <cedric.doucet at inria.fr>
> *To: *petsc-users at mcs.anl.gov
> *Sent: *Monday, October 28, 2013 11:25:25
> *Subject: *[petsc-users] Partitioner in DMPlexDistribute
> Hello,
> I need to use DMPlexDistribute in a parallel finite element code.
> I would like to know which partitioner can be used with this function.
> I saw in an example that chaco is a possibility.
> I tried to use parmetis instead but I did not manage to do it.
> Furthermore, I do not see where the input partitioner is used in the
> source code of DMPlexDistribute.
> Best regards,
> Cédric Doucet

What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
