[petsc-users] Load balancing / redistributing a 1D DM

Dave May dave.mayhem23 at gmail.com
Mon Mar 5 07:17:46 CST 2018


On 5 March 2018 at 09:29, Åsmund Ervik <Asmund.Ervik at sintef.no> wrote:

> Hi all,
>
> We have a code that solves the 1D multiphase Euler equations, using some
> very expensive thermodynamic calls in each cell in each time step. The
> computational time for different cells varies significantly in the spatial
> direction (due to different thermodynamic states), and varies slowly from
> timestep to timestep.
>
> Currently the code runs in serial, but I would like to use a PETSc DM of
> some sort to run it in parallel. There will be no linear or nonlinear
> PETSc solves etc., just a distributed mesh, at least initially. The code
> is Fortran.
>
> Now for my question: Is it possible to do dynamic load balancing using a
> plain 1D DMDA, somehow? There is some mention of this for PCTELESCOPE, but
> I guess it only works for linear solves? Or could I use an index set or
> some other PETSc structure? Or do I need to use a 1D DMPLEX?
>

I don't think TELESCOPE is what you want to use.

TELESCOPE redistributes a DMDA from one MPI communicator to another MPI
communicator with fewer ranks. I would not describe its functionality as
"load balancing". Re-distribution could be interpreted as load balancing
onto a different communicator, with an equal "load" associated with each
point in the DMDA - but that is not what you are after. In addition, I
didn't add support within TELESCOPE to re-distribute a 1D DMDA as that
use-case almost never arises.

For a 1D problem such as yours, I would use your favourite graph
partitioner (Metis, ParMETIS, Scotch) together with your cell-based
weighting, and repartition the data yourself.
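
As an untested sketch of what the Metis call would look like (the names
partition_chain, cost[] and part[] are made up, and I'm assuming your
per-cell cost is available as an integer weight): a 1D mesh is just a
path graph, so the CSR arrays are trivial to build, and Metis returns in
part[] the rank each cell should live on.

#include <stdlib.h>
#include <metis.h>

/* Partition a 1D chain of ncells cells over nparts ranks, weighting
   each cell by its (integer) thermodynamic cost. On success, part[i]
   holds the rank cell i should be assigned to. */
int partition_chain(idx_t ncells, idx_t nparts, idx_t *cost, idx_t *part)
{
  idx_t  ncon = 1;  /* one balance constraint: the cell cost */
  idx_t  objval, i, e = 0;
  idx_t *xadj   = malloc((ncells + 1) * sizeof(idx_t));
  idx_t *adjncy = malloc(2 * ncells * sizeof(idx_t));
  int    status;

  /* CSR adjacency of a path graph: cell i neighbours i-1 and i+1 */
  xadj[0] = 0;
  for (i = 0; i < ncells; i++) {
    if (i > 0)          adjncy[e++] = i - 1;
    if (i < ncells - 1) adjncy[e++] = i + 1;
    xadj[i + 1] = e;
  }

  status = METIS_PartGraphKway(&ncells, &ncon, xadj, adjncy,
                               cost /* vertex weights */, NULL, NULL,
                               &nparts, NULL, NULL, NULL, &objval, part);
  free(xadj);
  free(adjncy);
  return (status == METIS_OK) ? 0 : 1;
}

Note that Metis does not guarantee each part is a contiguous interval of
cells. If you want strictly contiguous 1D subdomains, a weighted prefix
sum over the costs (cut wherever the running sum passes total/nparts)
does the same job with no library at all.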

This is not a very helpful comment but I'll make it anyway...
If your code were in C or C++, and you didn't want to mess around with any
MPI calls at all from your application code, I think you could use the
DMSWARM object pretty easily to perform the load balancing. I haven't tried
this exact use-case myself, but in principle you could take the output from
Metis (which tells you the rank each point in the graph should move to),
shove this info directly into a DMSWARM object, and then ask it to migrate
your data.
DMSWARM lets you define and migrate (across a communicator) any data type
you like - it doesn't have to be PetscReal or PetscScalar; you can define C
structs, for example. Unfortunately I haven't had time to add Fortran
support for DMSWARM yet.
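
To make that concrete, here is a rough, untested C sketch (migrate_cells,
"cell_state" and target[] are made-up names; I'm relying on the reserved
DMSwarmField_rank field being what the basic migration strategy reads):

#include <petscdmswarm.h>

/* Sketch: create a swarm holding one point per local cell, then migrate
   each point to the rank target[p] chosen by the partitioner. */
static PetscErrorCode migrate_cells(MPI_Comm comm, PetscInt nlocal,
                                    const PetscInt target[])
{
  DM             swarm;
  PetscInt       p, bs;
  PetscInt      *rankfield;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMCreate(comm, &swarm);CHKERRQ(ierr);
  ierr = DMSetType(swarm, DMSWARM);CHKERRQ(ierr);
  ierr = DMSwarmSetType(swarm, DMSWARM_BASIC);CHKERRQ(ierr);

  /* one field per cell datum; DMSwarmRegisterUserStructField would let
     you store a whole C struct per cell instead */
  ierr = DMSwarmRegisterPetscDatatypeField(swarm, "cell_state", 1,
                                           PETSC_REAL);CHKERRQ(ierr);
  ierr = DMSwarmFinalizeFieldRegister(swarm);CHKERRQ(ierr);
  ierr = DMSwarmSetLocalSizes(swarm, nlocal, 0);CHKERRQ(ierr);

  /* ... fill "cell_state" via DMSwarmGetField()/DMSwarmRestoreField() ... */

  /* copy the partitioner's decision into the reserved rank field */
  ierr = DMSwarmGetField(swarm, DMSwarmField_rank, &bs, NULL,
                         (void **)&rankfield);CHKERRQ(ierr);
  for (p = 0; p < nlocal; p++) rankfield[p] = target[p];
  ierr = DMSwarmRestoreField(swarm, DMSwarmField_rank, &bs, NULL,
                             (void **)&rankfield);CHKERRQ(ierr);

  /* points, together with all registered fields, move to their new owners */
  ierr = DMSwarmMigrate(swarm, PETSC_TRUE);CHKERRQ(ierr);

  /* ... read the rebalanced data back out, then clean up ... */
  ierr = DMDestroy(&swarm);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}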


Cheers,
  Dave


>
> If the latter, how do I make a 1D DMPLEX? All the variables are stored in
> cell centers (collocated), so it's a completely trivial "mesh". I tried
> reading the DMPLEX manual, and looking at examples, but I'm having trouble
> penetrating the FEM lingo / abstract nonsense.
>
> Best regards,
> Åsmund
>