[petsc-dev] DMFOREST repartitioning in PETSc

Tobin Isaac tisaac at cc.gatech.edu
Thu Mar 22 07:10:14 CDT 2018



On March 21, 2018 11:26:35 AM MDT, Saurabh Chawdhary <schawdhary at anl.gov> wrote:
>Hello team,
>
>I haven't used PETSc since DMFOREST was released, but I have a question
>regarding repartitioning of the DMFOREST mesh. How is the mesh
>repartitioned over processors after some mesh refinement is carried
>out? Is it done by calling a p4est function, or is the partitioning
>done in PETSc?
>
>I was using p4est (natively) a couple of years ago, and I remember that
>when I tried to partition the grid I could only use the serial METIS,
>and not ParMETIS, with p4est (via the native function
>p4est_connectivity_reorder). So what I want to know is whether
>DMFOREST repartitioning is done in parallel or in serial.

You can use the following workflow:

- Create or read in an unstructured hexahedral mesh as a DMPlex.
- Use ParMETIS to partition it: there is a ParMETIS implementation of PetscPartitioner, which creates the index sets and communication patterns for DMPlexDistribute.
- Convert the DMPlex to DMP4est or DMP8est.
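The workflow above can be sketched roughly as follows. This is an untested outline, not code from the thread: it uses current PetscCall-style error handling, the mesh filename "mesh.msh" is a placeholder, and the exact signature of DMPlexCreateFromFile varies between PETSc versions (newer releases take an extra mesh-name argument). It assumes PETSc was configured with --download-p4est and --download-parmetis.

```c
/* Sketch: read a hex mesh as a DMPlex, distribute it with the ParMETIS
 * PetscPartitioner, then convert it to a p8est-backed forest DM.
 * "mesh.msh" is a placeholder filename, not from the original thread. */
#include <petscdmplex.h>
#include <petscdmforest.h>

int main(int argc, char **argv)
{
  DM               dm, dmDist = NULL, dmForest;
  PetscPartitioner part;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Step 1: create/read the unstructured hexahedral coarse mesh */
  PetscCall(DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", PETSC_TRUE, &dm));

  /* Step 2: select the ParMETIS implementation of PetscPartitioner,
   * then distribute the Plex across the communicator */
  PetscCall(DMPlexGetPartitioner(dm, &part));
  PetscCall(PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS));
  PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
  if (dmDist) { /* NULL on a single rank: nothing to distribute */
    PetscCall(DMDestroy(&dm));
    dm = dmDist;
  }

  /* Step 3: convert the distributed Plex to a forest DM (p8est in 3D) */
  PetscCall(DMConvert(dm, DMP8EST, &dmForest));

  PetscCall(DMDestroy(&dm));
  PetscCall(DMDestroy(&dmForest));
  PetscCall(PetscFinalize());
  return 0;
}
```

Equivalently, the partitioner type can usually be chosen at runtime with the option `-petscpartitioner_type parmetis` instead of hard-coding it.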

This does not avoid the fundamental limitations of p4est: the distributed mesh will be redundantly serialized behind the scenes, and the coarse-mesh ordering derived from ParMETIS remains static for the lifetime of the forest mesh.

>
>
>Thank you.
>
>Saurabh
