[petsc-users] Load imbalance in DMSwarm & DMDA simulations

Mark Adams mfadams at lbl.gov
Tue Mar 17 06:17:26 CDT 2026


Hi Miguel,

This is a common problem. To my knowledge you need to deal with it on your
side, but PETSc does have tools to help.
I would look in the literature to get ideas and familiarize yourself with
the problem.

The main tool that PETSc provides is mesh repartitioning: create a dual
graph (are there PETSc examples of this?), put weights on the vertices
(which are now elements; use the particle count), and repartition.
I am not sure of the best way to proceed from here ... you could create a
new DMSwarm with this new "cell" DM, add the particles from the old swarm
to the new one, delete the old one, and then call DMSwarmMigrate on the new
DMSwarm, which will do the moving that we want.

Others will probably be able to add to this.

Thanks,
Mark


On Tue, Mar 17, 2026 at 4:49 AM Miguel Molinos <m.molinos at upm.es> wrote:

> Dear all,
>
> I am currently running a PIC-like implementation based on DMSwarm and
> DMDA. In my setup, a particle (DMSwarm) discretization represents, for
> example, a sphere placed at the center of a box-shaped domain discretized
> with a regular mesh (DMDA).
>
> The implementation uses both MPI and OpenMP to accelerate the
> computations. The domain decomposition of the sphere follows a PIC-like
> approach: each particle is assigned to a rank based on the mesh element it
> belongs to, and thus inherits the rank that “owns” that element. This is
> essentially the same strategy used in one of the PETSc examples.
>
> However, I have observed that, for certain configurations, some ranks end
> up with very few or even no particles, which leads to load imbalance.
>
> Has anyone experienced a similar issue?
>
> Thanks,
> Miguel
>

