[petsc-users] Load balancing / redistributing a 1D DM

Matthew Knepley knepley at gmail.com
Mon Mar 5 08:29:32 CST 2018


On Mon, Mar 5, 2018 at 9:25 AM, Åsmund Ervik <Asmund.Ervik at sintef.no> wrote:

> My only argument "against" using Plex is that I don't understand how to
> use it. Is there a simple example anywhere that shows how to set up a 1D
> simplicial (?) mesh, and then just get/return data between vectors
> associated with the Plex and (local) Fortran arrays on each proc? I don't
> have any KSP, SNES etc.
>

I have not done a 1D example, because there is not much call for it.
However, I did give the sequence of calls in my last mail.

I am killing myself getting ready for SIAM PP right now, but this will be easy
to write up when I get back.

That said, I am not offended if you want to do it by hand.

  Thanks,

    Matt


> I assume such a code would go something like
>
> <Plex mesh init stuff>
> <Plex initial (uniform) parallel distribution>
> <My code's init stuff>
> <loop over time>
>     <Plex equivalent of DMDAGetCorners>
>     <Plex equivalent of DMDAVecGetArrayF90>
>     <My code that computes things with the array, and finds weights for
> redistribution>
>     <Plex equivalent of DMDAVecRestoreArrayF90>
>     <Plex redistribution>
> <end loop>
>
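Untested and off the top of my head, but in C that sequence looks roughly like
the sketch below; the Fortran interface mirrors these calls with an extra ierr
argument. The 4-segment mesh, the one-dof-per-cell Section, and the 42.0 are
made-up placeholders, not anything prescribed by Plex:

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist;
  PetscSection   sec;
  Vec            lv;
  PetscScalar   *a;
  PetscInt       pStart, pEnd, cStart, cEnd, c, off;
  PetscMPIInt    rank;
  /* placeholder serial mesh: 4 segments on 5 vertices, fed in on rank 0 only */
  const int      cells[]  = {0,1, 1,2, 2,3, 3,4};
  const double   coords[] = {0.0, 0.25, 0.5, 0.75, 1.0};
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  /* <Plex mesh init stuff> */
  if (!rank) {ierr = DMPlexCreateFromCellList(PETSC_COMM_WORLD, 1, 4, 5, 2, PETSC_FALSE, cells, 1, coords, &dm);CHKERRQ(ierr);}
  else       {ierr = DMPlexCreateFromCellList(PETSC_COMM_WORLD, 1, 0, 0, 2, PETSC_FALSE, NULL,  1, NULL,   &dm);CHKERRQ(ierr);}
  /* <Plex initial (uniform) parallel distribution> */
  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}

  /* a Section says how many dofs live on each mesh point: here 1 per cell */
  ierr = DMPlexGetChart(dm, &pStart, &pEnd);CHKERRQ(ierr);
  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells = height 0 */
  ierr = PetscSectionCreate(PETSC_COMM_WORLD, &sec);CHKERRQ(ierr);
  ierr = PetscSectionSetChart(sec, pStart, pEnd);CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {ierr = PetscSectionSetDof(sec, c, 1);CHKERRQ(ierr);}
  ierr = PetscSectionSetUp(sec);CHKERRQ(ierr);
  ierr = DMSetDefaultSection(dm, sec);CHKERRQ(ierr); /* DMSetLocalSection() in newer PETSc */

  /* <Plex equivalent of DMDAGetCorners + DMDAVecGetArrayF90>:
     cStart/cEnd give the local cell range, and the Section gives each
     cell's offset into the raw local array */
  ierr = DMCreateLocalVector(dm, &lv);CHKERRQ(ierr);
  ierr = VecGetArray(lv, &a);CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {
    ierr = PetscSectionGetOffset(sec, c, &off);CHKERRQ(ierr);
    a[off] = 42.0; /* your computation / weight evaluation goes here */
  }
  ierr = VecRestoreArray(lv, &a);CHKERRQ(ierr);

  /* <Plex redistribution>: call DMPlexDistribute() again on the weighted DM
     and move field data with DMPlexDistributeField() using the migration SF
     that DMPlexDistribute() returns */

  ierr = VecDestroy(&lv);CHKERRQ(ierr);
  ierr = PetscSectionDestroy(&sec);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
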
> Regards,
> Åsmund
>
> > -----Original Message-----
> > From: Matthew Knepley [mailto:knepley at gmail.com]
> > Sent: Monday, March 5, 2018 3:08 PM
> > To: Tobin Isaac <tisaac at cc.gatech.edu>
> > Cc: Jed Brown <jed at jedbrown.org>; Dave May
> > <dave.mayhem23 at gmail.com>; Åsmund Ervik <Asmund.Ervik at sintef.no>;
> > petsc-users at mcs.anl.gov
> > Subject: Re: [petsc-users] Load balancing / redistributing a 1D DM
> >
> > On Mon, Mar 5, 2018 at 9:01 AM, Tobin Isaac <tisaac at cc.gatech.edu> wrote:
> >
> >
> >       This is a somewhat incomplete description of the steps in linear
> > partitioning.  The rest can be accomplished with PetscSF calls, but I
> > should wrap it up in a PetscPartitioner because it's a mistake-prone
> > operation.
> >
> >
> >
> > Jed likes to do everything by hand because it is transparent, but then
> > you become the maintainer. I think this is easy to do in Plex, and we
> > maintain the code. It is less transparent, which is the tradeoff.
> >
> >    Matt
> >
> >
> >       On March 5, 2018 8:31:42 AM EST, Jed Brown <jed at jedbrown.org> wrote:
> >       >Dave May <dave.mayhem23 at gmail.com> writes:
> >       >
> >       >> For a 1D problem such as yours, I would use your favourite graph
> >       >> partitioner (METIS, ParMETIS, Scotch) together with your
> >       >> cell-based weighting and repartition the data yourself.
> >       >
> >       >That's overkill in 1D.  You can MPI_Allreduce(SUM) and MPI_Scan(SUM)
> >       >the weights, then find the transition indices in each subdomain.
> >       >It'll be cheaper, more intuitive/deterministic, and avoid the extra
> >       >library dependency.  Of course if you think you may want to move to
> >       >multiple dimensions, it would make sense to consider DMPlex or
> >       >DMForest.
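
For the record, the Allreduce/Scan approach Jed describes above boils down to
something like the following untested C sketch; nlocal, weight, newowner, and
the fake per-cell weights are placeholders for your own local cell count and
computed weights, not part of any library API:

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

/* Decide, for each locally owned cell, which rank should own it after
   rebalancing, using only a prefix sum of the per-cell weights. */
int main(int argc, char **argv)
{
  int rank, size;
  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  /* placeholder: 10 local cells, each with some computed weight */
  int     nlocal = 10;
  double *weight = malloc(nlocal * sizeof(double));
  for (int i = 0; i < nlocal; i++) weight[i] = 1.0 + rank; /* fake weights */

  double wlocal = 0.0;
  for (int i = 0; i < nlocal; i++) wlocal += weight[i];

  double wtotal, wbefore;
  MPI_Allreduce(&wlocal, &wtotal, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
  MPI_Scan(&wlocal, &wbefore, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
  wbefore -= wlocal;                 /* exclusive prefix: weight on lower ranks */

  double target = wtotal / size;     /* ideal weight per rank */
  int   *newowner = malloc(nlocal * sizeof(int));
  double cum = wbefore;
  for (int i = 0; i < nlocal; i++) {
    /* owner = which ideal slab the midpoint of this cell's weight falls in */
    int owner = (int)((cum + 0.5 * weight[i]) / target);
    if (owner > size - 1) owner = size - 1;
    newowner[i] = owner;
    cum += weight[i];
  }
  /* newowner[] encodes the transition indices: ship each cell's data to
     newowner[i] (e.g. with MPI_Alltoallv) and rebuild the local arrays */
  printf("[%d] cells 0..%d -> ranks %d..%d\n", rank, nlocal - 1,
         newowner[0], newowner[nlocal - 1]);

  free(weight);
  free(newowner);
  MPI_Finalize();
  return 0;
}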


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/