[petsc-users] Load balancing / redistributing a 1D DM

Dave May dave.mayhem23 at gmail.com
Mon Mar 5 08:25:07 CST 2018


On 5 March 2018 at 13:56, Åsmund Ervik <Asmund.Ervik at sintef.no> wrote:

> As Jed suggests, computing the (re)partitioning is straightforward in my
> 1D case. We're not planning to move this to multiple dimensions (we have
> another type of solver for that).
>
> So if it's possible to expose the repartitioning code for DAs, I'd be very
> happy to go this route. Is it a lot of work to do this?
>

Exposing the code for repartitioning the DA does not make much sense to me,
given the simplicity of the DMDA and the fact that there are other, more
general methods available in PETSc.

If you want to write the repartitioning code yourself for a DMDA, it would be
straightforward to follow the pattern in Telescope and simplify it for the
1D case.
Start with this file:

http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/pc/impls/telescope/telescope_dmda.c.html
and look at
  PCTelescopeSetUp_dmda()

There are three parts to this function:

[1] Copying the definition of the parent DMDA for the new repartitioned
DMDA (PCTelescopeSetUp_dmda_repart())

[2] Defining the coordinates for the repartitioned DMDA
(PCTelescopeSetUp_dmda_repart_coors())

[3] Defining the permutation which maps vectors from the original DMDA to
the repartitioned DMDA (PCTelescopeSetUp_dmda_permutation_2d())

* Assuming you have the new layout defined (via Metis or MPI_Scan()),
PCTelescopeSetUp_dmda_repart() would look basically the same except you'd
change the call
  DMDASetOwnershipRanges(ctx->dmrepart,NULL,NULL,NULL);
to use an array defining the new layout of the repartitioned DMDA (see the
sketch below).
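
As a rough, untested sketch of the MPI_Allreduce()/MPI_Scan() route for
computing that array: suppose each rank has the weights of its currently
owned cells in a PetscReal array "weight" of length xm, with xs/xm taken
from DMDAGetCorners(). The function name below is made up, and the sketch
assumes the weights are such that every rank ends up owning at least one
cell:

  static PetscErrorCode ComputeNewOwnershipRanges1d(MPI_Comm comm,PetscInt M,
                          PetscInt xs,PetscInt xm,const PetscReal weight[],
                          PetscInt lx[])
  {
    PetscMPIInt    size;
    PetscReal      wlocal = 0.0,wtotal,wbefore,target;
    PetscInt       i,r,*cuts;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = MPI_Comm_size(comm,&size);CHKERRQ(ierr);
    for (i=0; i<xm; i++) wlocal += weight[i];
    ierr = MPI_Allreduce(&wlocal,&wtotal,1,MPIU_REAL,MPI_SUM,comm);CHKERRQ(ierr);
    ierr = MPI_Scan(&wlocal,&wbefore,1,MPIU_REAL,MPI_SUM,comm);CHKERRQ(ierr);
    wbefore -= wlocal;        /* total weight of all cells left of my first cell */
    target   = wtotal/size;   /* ideal weight per rank in the new layout */

    /* cuts[r] = global index of the first cell owned by new rank r; each cut
       falls inside exactly one rank's current range, everyone else
       contributes zero to the reduction below */
    ierr = PetscCalloc1(size,&cuts);CHKERRQ(ierr);
    for (r=1; r<size; r++) {
      PetscReal cut = r*target;
      if (cut >= wbefore && cut < wbefore+wlocal) {
        PetscReal w = wbefore;
        PetscInt  g = xs;
        while (w < cut) { w += weight[g-xs]; g++; }
        cuts[r] = g;
      }
    }
    ierr = MPI_Allreduce(MPI_IN_PLACE,cuts,size,MPIU_INT,MPI_SUM,comm);CHKERRQ(ierr);
    for (r=0; r<size; r++) lx[r] = ((r == size-1) ? M : cuts[r+1]) - cuts[r];
    ierr = PetscFree(cuts);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

The resulting lx[] (one entry per rank, summing to M) is exactly what
DMDASetOwnershipRanges() (or DMDACreate1d()) expects for the x direction.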

* PCTelescopeSetUp_dmda_repart_coors2d() is straightforward to modify for
the 1D case by taking out the loop over j
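
(Aside: if your grid coordinates are uniform, you could skip copying them
altogether and simply set them directly on the repartitioned DMDA. A rough
sketch, assuming the original DMDA "dm" already has coordinates attached and
"dmrepart" is the repartitioned one (both names are placeholders):

  PetscReal      gmin[3],gmax[3];
  PetscErrorCode ierr;

  ierr = DMDAGetBoundingBox(dm,gmin,gmax);CHKERRQ(ierr);
  ierr = DMDASetUniformCoordinates(dmrepart,gmin[0],gmax[0],0.0,0.0,0.0,0.0);CHKERRQ(ierr);

For general, non-uniform coordinates you do need to move the coordinate
values, as PCTelescopeSetUp_dmda_repart_coors2d() does.)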

* PCTelescopeSetUp_dmda_permutation_2d() is also straightforward to modify
for the 1D case by taking out the loop over j and not using the dimensionally
indexed variables (like startI[1]) associated with the j direction. The
following functions support 1D and can be re-used:
  _DMDADetermineRankFromGlobalIJK()
  _DMDADetermineGlobalS0()
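
In fact, for the 1D case you might be able to avoid the permutation matrix
entirely: the global (PETSc) ordering of a 1D DMDA coincides with the natural
ordering, so moving a vector between the two layouts is just a scatter
between matching global index ranges. A rough, untested sketch (the function
name and the DM names "dm"/"dmrepart" are placeholders):

  static PetscErrorCode CreateRepartitionScatter1d(DM dm,DM dmrepart,
                          VecScatter *scatter)
  {
    Vec            x,xrepart;
    IS             is;
    PetscInt       rstart,rend;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMCreateGlobalVector(dm,&x);CHKERRQ(ierr);
    ierr = DMCreateGlobalVector(dmrepart,&xrepart);CHKERRQ(ierr);
    /* each rank pulls the global entries it owns in the new layout */
    ierr = VecGetOwnershipRange(xrepart,&rstart,&rend);CHKERRQ(ierr);
    ierr = ISCreateStride(PetscObjectComm((PetscObject)dm),rend-rstart,
                          rstart,1,&is);CHKERRQ(ierr);
    ierr = VecScatterCreate(x,is,xrepart,is,scatter);CHKERRQ(ierr);
    ierr = ISDestroy(&is);CHKERRQ(ierr);
    ierr = VecDestroy(&x);CHKERRQ(ierr);
    ierr = VecDestroy(&xrepart);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

You would then apply it with VecScatterBegin()/VecScatterEnd() using
INSERT_VALUES and SCATTER_FORWARD, and use SCATTER_REVERSE to move data back
to the original layout.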

Obviously you also wouldn't need guards like
  if (isActiveRank(sred->psubcomm)) {


Thanks,
  Dave


>
>
> I have another question on a similar-but-different problem for 3D, but
> I'll write a separate mail on it.
>
> Best regards,
> Åsmund
>
>
> > -----Original Message-----
> > From: Jed Brown [mailto:jed at jedbrown.org]
> > Sent: Monday, March 5, 2018 2:32 PM
> > To: Dave May <dave.mayhem23 at gmail.com>; Åsmund Ervik
> > <Asmund.Ervik at sintef.no>
> > Cc: petsc-users at mcs.anl.gov
> > Subject: Re: [petsc-users] Load balancing / redistributing a 1D DM
> >
> > Dave May <dave.mayhem23 at gmail.com> writes:
> >
> > > For a 1D problem such as yours, I would use your favourite graph
> > > partitioner (Metis, ParMetis, Scotch) together with your cell-based
> > > weighting and repartition the data yourself.
> >
> > That's overkill in 1D.  You can MPI_Allreduce(SUM) and MPI_Scan(SUM) the
> > weights, then find the transition indices in each subdomain.  It'll be
> cheaper,
> > more intuitive/deterministic, and avoid the extra library dependency.  Of
> > course if you think you may want to move to multiple dimensions, it would
> > make sense to consider DMPlex or DMForest.
>
>