[petsc-users] Load balancing / redistributing a 1D DM

Matthew Knepley knepley at gmail.com
Mon Mar 5 08:07:35 CST 2018

On Mon, Mar 5, 2018 at 9:01 AM, Tobin Isaac <tisaac at cc.gatech.edu> wrote:

> This is a somewhat incomplete description of the steps in linear
> partitioning.  The rest can be accomplished with PetscSF calls, but I
> should wrap it up in a PetscPartitioner because it's a mistake-prone
> operation.
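
For concreteness, the PetscSF piece Toby mentions might look roughly like the
sketch below. It assumes each rank has already computed, for every cell it
will own after repartitioning, the (rank, index) of that cell in the current
distribution; the routine and variable names are placeholders, not an existing
PETSc interface.

  #include <petscsf.h>

  /* Build the migration SF: roots are the cells in the current (old)
     distribution, leaves are the cells each rank will own afterwards. */
  PetscErrorCode BuildMigrationSF(MPI_Comm comm, PetscInt nOldCells, PetscInt nNewCells,
                                  const PetscInt remoteRank[], const PetscInt remoteIndex[],
                                  PetscSF *migration)
  {
    PetscSFNode   *iremote;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = PetscMalloc1(nNewCells, &iremote);CHKERRQ(ierr);
    for (PetscInt c = 0; c < nNewCells; ++c) {
      iremote[c].rank  = remoteRank[c];   /* current owner of the data for new cell c */
      iremote[c].index = remoteIndex[c];  /* its local index on that rank */
    }
    ierr = PetscSFCreate(comm, migration);CHKERRQ(ierr);
    /* NULL ilocal: the leaves are just 0..nNewCells-1 in order */
    ierr = PetscSFSetGraph(*migration, nOldCells, nNewCells, NULL, PETSC_OWN_POINTER,
                           iremote, PETSC_OWN_POINTER);CHKERRQ(ierr);
    /* Cell data then moves old -> new with PetscSFBcastBegin/End on this SF. */
    PetscFunctionReturn(0);
  }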

Jed likes to do everything by hand because it is transparent, but then you
become the maintainer.
I think this is easy to do in Plex, and we maintain the code. It is less
transparent, which is the tradeoff.
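
If the mesh is already a DMPlex, a minimal sketch (leaving out the cell
weighting, and with an arbitrary wrapper name) is just:

  #include <petscdmplex.h>

  PetscErrorCode RedistributePlex(DM dm, DM *dmDist)
  {
    PetscPartitioner part;
    PetscErrorCode   ierr;

    PetscFunctionBegin;
    ierr = DMPlexGetPartitioner(dm, &part);CHKERRQ(ierr);
    ierr = PetscPartitionerSetFromOptions(part);CHKERRQ(ierr); /* e.g. -petscpartitioner_type simple */
    ierr = DMPlexDistribute(dm, 0, NULL, dmDist);CHKERRQ(ierr); /* 0 = no overlap; *dmDist is NULL on 1 rank */
    PetscFunctionReturn(0);
  }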


> On March 5, 2018 8:31:42 AM EST, Jed Brown <jed at jedbrown.org> wrote:
> >Dave May <dave.mayhem23 at gmail.com> writes:
> >
> >> For a 1D problem such as yours, I would use your favourite graph
> >> partitioner (METIS, ParMETIS, Scotch) together with your cell-based
> >> weighting and repartition the data yourself.
> >
> >That's overkill in 1D.  You can MPI_Allreduce(SUM) and MPI_Scan(SUM) the
> >weights, then find the transition indices in each subdomain.  It'll be
> >cheaper, more intuitive/deterministic, and avoid the extra library
> >dependency.  Of course if you think you may want to move to multiple
> >dimensions, it would make sense to consider DMPlex or DMForest.
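
For the record, the Allreduce/Scan computation Jed describes might look like
the following sketch; the variable names are illustrative, and rank boundaries
are evaluated at cell midpoints.

  #include <mpi.h>

  /* Given per-cell weights on each rank (current 1D ordering), compute the
     new owner of every local cell so that each rank ends up with roughly an
     equal share of the total weight. */
  void ComputeNewOwners(MPI_Comm comm, int ncells, const double weight[], int owner[])
  {
    int    size;
    double wlocal = 0.0, wtotal, wbefore;

    MPI_Comm_size(comm, &size);
    for (int c = 0; c < ncells; ++c) wlocal += weight[c];
    MPI_Allreduce(&wlocal, &wtotal, 1, MPI_DOUBLE, MPI_SUM, comm); /* total weight     */
    MPI_Scan(&wlocal, &wbefore, 1, MPI_DOUBLE, MPI_SUM, comm);     /* inclusive prefix */
    wbefore -= wlocal;                                             /* weight on lower ranks */

    double target  = wtotal / size;  /* ideal weight per rank */
    double running = wbefore;
    for (int c = 0; c < ncells; ++c) {
      running += 0.5 * weight[c];            /* evaluate at the cell midpoint */
      int r = (int)(running / target);
      owner[c] = (r < size) ? r : size - 1;  /* clamp roundoff at the top end */
      running += 0.5 * weight[c];
    }
  }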

What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/