[petsc-users] Load imbalance in DMSwarm & DMDA simulations

Matthew Knepley knepley at gmail.com
Tue Mar 17 10:47:34 CDT 2026


On Tue, Mar 17, 2026 at 11:22 AM Miguel Molinos <m.molinos at upm.es> wrote:

> Hi Matt and Mark,
>
> Thank you for the feedback.
>
> A disadvantage is that you have to work harder to get structured variable
> access, which DMDA has
> automatically. Depending on your discretization on this grid, this can be
> restored, but it is some work.
>
>
> I’ve been trying to avoid DMPlex because I find it harder to create ghost
> particles (define padding regions) using an unstructured mesh.
>

Oh, you can make DMPlex be geometrically structured very easily. So this
should not be a problem.


> Regarding variable access, this is not a limitation since I use the
> background mesh for domain decomposition purposes. The particles carry all
> the information.
>
> What does the algorithm look like?
>
>
> My code reproduces interaction between atoms (some sort of MD). The
> purposes of the mesh are:
> - Create a domain decomposition
> - Particle migration
> - Definition of boundary conditions (it can work like a supercell)
>

Oh, cool. DMPlex should be very easy then. You just create a BoxMesh (I
could probably do it from the command line), and then have it load balanced
exactly as Mark said. I can help you if anything does not make sense.
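For concreteness, the steps above might look roughly like this in recent PETSc (a minimal sketch, not code from this thread; the "mass" field and the command-line options are illustrative, and `PetscCall` assumes PETSc 3.18 or later):

```c
#include <petscdmplex.h>
#include <petscdmswarm.h>

int main(int argc, char **argv)
{
  DM       dm, sw;
  PetscInt dim;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Background box mesh, shaped from the command line, e.g.
     -dm_plex_dim 3 -dm_plex_box_faces 10,10,10 */
  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetFromOptions(dm));

  /* Distribute the mesh across ranks (no overlap) */
  {
    DM dmDist = NULL;
    PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
    if (dmDist) {
      PetscCall(DMDestroy(&dm));
      dm = dmDist;
    }
  }

  /* PIC swarm that uses the Plex as its cell DM */
  PetscCall(DMGetDimension(dm, &dim));
  PetscCall(DMCreate(PETSC_COMM_WORLD, &sw));
  PetscCall(DMSetType(sw, DMSWARM));
  PetscCall(DMSetDimension(sw, dim));
  PetscCall(DMSwarmSetType(sw, DMSWARM_PIC));
  PetscCall(DMSwarmSetCellDM(sw, dm));
  PetscCall(DMSwarmRegisterPetscDatatypeField(sw, "mass", 1, PETSC_REAL));
  PetscCall(DMSwarmFinalizeFieldRegister(sw));

  /* ... insert particles, run time steps ... */

  /* After particles move, hand each one to the rank owning its cell */
  PetscCall(DMSwarmMigrate(sw, PETSC_TRUE));

  PetscCall(DMDestroy(&sw));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```

Run with something like `mpiexec -n 8 ./ex -dm_plex_dim 3 -dm_plex_box_faces 20,20,20`; adding `-dm_view` will show how the mesh was distributed.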

  Thanks,

     Matt



> Thanks,
> Miguel
>
> On 17 Mar 2026, at 15:11, Matthew Knepley <knepley at gmail.com> wrote:
>
> On Tue, Mar 17, 2026 at 8:53 AM Mark Adams <mfadams at lbl.gov> wrote:
>
>> This tool works at the Mat level so it does not matter, but you need to
>> work at the DM level anyway.
>>
>> I don't know of DM tools for this. Let Matt chime in.
>>
>
> Hi Miguel,
>
> DMPlex can definitely do point location and load balancing, which is an
> advantage.
>
> A disadvantage is that you have to work harder to get structured variable
> access, which DMDA has
> automatically. Depending on your discretization on this grid, this can be
> restored, but it is some work.
>
> What does the algorithm look like?
>
>   Thanks,
>
>       Matt
>
>
>> On Tue, Mar 17, 2026 at 7:54 AM Miguel Molinos <m.molinos at upm.es> wrote:
>>
>>> Thank you Mark. Indeed my question was oriented towards the tools PETSc
>>> can offer to address this problem. Currently I’m using DMDA, perhaps DMPlex
>>> is more suited?
>>>
>>> Thanks,
>>> Miguel
>>>
>>> On 17 Mar 2026, at 12:17, Mark Adams <mfadams at lbl.gov> wrote:
>>>
>>> Hi Miguel,
>>>
>>> This is a common problem. To my knowledge you need to deal with it on
>>> your side, but PETSc does have tools to help.
>>> I would look in the literature to get ideas and acclimate yourself to
>>> the problem.
>>>
>>> The tool that PETSc has is mesh repartitioning: create a dual
>>> graph (any petsc examples?), put weights on the "vertices" (now elements;
>>> use the particle count), and repartition.
>>> I am not sure of the best way to proceed from here ... you could create
>>> a new DMSwarm with this new "cell" DM, add the particles from the old
>>> swarm to the new one, delete the old, and call DMSwarmMigrate on the new
>>> DMSwarm, and that will do the moving that we want.
>>>
>>> Others will probably be able to add to this.
>>>
>>> Thanks,
>>> Mark
>>>
>>>
>>> On Tue, Mar 17, 2026 at 4:49 AM Miguel Molinos <m.molinos at upm.es> wrote:
>>>
>>>>
>>>> Dear all,
>>>>
>>>> I am currently running a PIC-like implementation based on DMSwarm and
>>>> DMDA. In my setup, a particle (DMSwarm) discretization represents, for
>>>> example, a sphere placed at the center of a box-shaped domain discretized
>>>> with a regular mesh (DMDA).
>>>>
>>>> The implementation uses both MPI and OpenMP to accelerate the
>>>> computations. The domain decomposition of the sphere follows a PIC-like
>>>> approach: each particle is assigned to a rank based on the mesh element it
>>>> belongs to, and thus inherits the rank that “owns” that element. This is
>>>> essentially the same strategy used in one of the PETSc examples.
>>>>
>>>> However, I have observed that, for certain configurations, some ranks
>>>> end up with very few or even no particles, which leads to load imbalance.
>>>>
>>>> Has anyone experienced a similar issue?
>>>>
>>>> Thanks,
>>>> Miguel
>>>>
>>>
>>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

