[petsc-users] Agglomeration for Multigrid on Unstructured Meshes

Lawrence Mitchell wencel at gmail.com
Tue Jun 2 03:25:42 CDT 2020


Hi Dave,

> On 2 Jun 2020, at 05:43, Dave May <dave.mayhem23 at gmail.com> wrote:
> 
> 
> 
> On Tue 2. Jun 2020 at 03:30, Matthew Knepley <knepley at gmail.com> wrote:
> On Mon, Jun 1, 2020 at 7:03 PM Danyang Su <danyang.su at gmail.com> wrote:
> Thanks, Jed, for the quick response. Yes, I am asking about the repartitioning of coarse grids in geometric multigrid for unstructured meshes. I am happy with AMG. Thanks for letting me know.
> 
> All the pieces are there, we just have not had users asking for this, and it will take some work to put together.
> 
> Matt - I created a branch for you and Lawrence last year which added full support for PLEX within Telescope. This implementation was not a fully automated agglomeration strategy - it utilized the partition associated with the DM returned from DMGetCoarseDM. Hence the job of building the distributed coarse hierarchy was left to the user.
> 
> I’m pretty sure that code got merged into master, as the branch also contained several bug fixes for Telescope. Or am I mistaken?

I think you're right. I didn't manage to get the redistribution of the DMPlex object done last summer (it's bubbling up again).
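
For reference, the hook that approach relies on is just the coarse-DM attachment on the fine DM; something like the following (a rough sketch from memory, assuming both plexes have already been created and partitioned by the user, with illustrative variable names):

  PetscErrorCode ierr;
  DM             dmFine, dmCoarse;  /* assumed already built and distributed as desired */

  /* Attach the coarse mesh so that Telescope/PCMG can find it ... */
  ierr = DMSetCoarseDM(dmFine, dmCoarse);CHKERRQ(ierr);

  /* ... and later recover it with */
  DM dmc;
  ierr = DMGetCoarseDM(dmFine, &dmc);CHKERRQ(ierr);

Everything upstream of that (how the coarse mesh was built, and which ranks own it) is the user's problem, which is the gap I describe below.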

As I see it, for redistributed geometric multigrid on plexes, the missing piece is a function:

DMPlexRedistributeOntoComm(DM old, MPI_Comm comm, DM *new)

I went down a rabbit hole of trying to do this, since I actually think this should replace the current interface to DMPlexDistribute, which is

DMPlexDistribute(DM old, PetscInt overlap, PetscSF *pointDistSF, DM *new)

where the new DM comes out on the same communicator as the old DM, just with a different partition.
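
For reference, the usual calling pattern today is along these lines (error checking as usual, ierr assumed in scope):

  DM dmDist = NULL;

  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {  /* NULL is returned when there is nothing to distribute, e.g. on one process */
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = dmDist;
  }

so there is nowhere in the interface to even express a different target communicator.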

This has lots of follow-on consequences; for example, one can't easily load a mesh on P processes and then compute on Q.
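
To make this concrete for the multigrid case, the sort of thing I would like to be able to write when shrinking a coarse level onto fewer ranks is below (entirely hypothetical: the routine proposed above does not exist, and the variable names are made up):

  MPI_Comm    subcomm;
  PetscMPIInt rank, size;
  DM          dmCoarseSmall = NULL;

  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);
  /* Keep (roughly) the first half of the ranks for the coarse level; the rest get MPI_COMM_NULL */
  MPI_Comm_split(PETSC_COMM_WORLD, rank < PetscMax(size/2, 1) ? 0 : MPI_UNDEFINED, rank, &subcomm);

  /* Proposed routine: presumably collective over the communicator of dmCoarse,
     producing a redistributed DM only on the ranks belonging to subcomm */
  ierr = DMPlexRedistributeOntoComm(dmCoarse, subcomm, &dmCoarseSmall);CHKERRQ(ierr);

Whether one could then simply attach dmCoarseSmall to the fine DM with DMSetCoarseDM, given that the two DMs live on different communicators, is part of what would need working out.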

Unfortunately, collectiveness over MPI_Comm(old) is baked into the redistribution routines everywhere, and I didn't manage to finish things.

Lawrence

