[petsc-users] Load balancing / redistributing a 1D DM

Åsmund Ervik Asmund.Ervik at sintef.no
Mon Mar 5 07:56:31 CST 2018


As Jed suggests, computing the (re)partitioning is straightforward in my 1D case. We're not planning to move this to multiple dimensions (we have another type of solver for that).

So if it's possible to expose the repartitioning code for DAs, I'd be very happy to go this route. Is it a lot of work to do this?


I have another question on a similar but different problem in 3D, but I'll send a separate mail about that.

Best regards,
Åsmund


> -----Original Message-----
> From: Jed Brown [mailto:jed at jedbrown.org]
> Sent: Monday, March 5, 2018 2:32 PM
> To: Dave May <dave.mayhem23 at gmail.com>; Åsmund Ervik
> <Asmund.Ervik at sintef.no>
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Load balancing / redistributing a 1D DM
> 
> Dave May <dave.mayhem23 at gmail.com> writes:
> 
> > For a 1D problem such as yours, I would use your favourite graph
> > partitioner (Metis,Parmetis, Scotch) together with your cell based
> > weighting and repartition the data yourself.
> 
> That's overkill in 1D.  You can MPI_Allreduce(SUM) and MPI_Scan(SUM) the
> weights, then find the transition indices in each subdomain.  It'll be cheaper,
> more intuitive/deterministic, and avoid the extra library dependency.  Of
> course if you think you may want to move to multiple dimensions, it would
> make sense to consider DMPlex or DMForest.
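
The weighted-prefix-sum approach Jed describes can be sketched serially (function name and structure are my own, not from PETSc; in the parallel setting the prefix array would come from MPI_Scan(SUM) over per-rank weight totals):

```python
# Serial sketch of weighted 1D repartitioning: sum the per-cell weights,
# take a prefix sum, then place each subdomain boundary where the running
# weight crosses k * total / nranks.
from bisect import bisect_right
from itertools import accumulate

def partition_1d(weights, nranks):
    """Return, for each rank, the index of the first cell it owns."""
    prefix = list(accumulate(weights))   # inclusive prefix sum of cell weights
    total = prefix[-1]
    # Rank k starts at the first cell whose cumulative weight
    # strictly exceeds k * total / nranks.
    return [bisect_right(prefix, k * total / nranks) for k in range(nranks)]
```

For example, with uniform weights [1, 1, 1, 1] on 2 ranks this gives starts [0, 2] (an even split), and with [3, 1, 1, 1, 1, 1] it also gives [0, 2], since the heavy first cell balances against the rest. With highly skewed weights a rank can end up with an empty range; a production version would need to guard against that.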
