[petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

Matthew Knepley knepley at gmail.com
Fri Dec 23 10:14:46 CST 2022


On Thu, Dec 22, 2022 at 3:08 PM Matteo Semplice <
matteo.semplice at uninsubria.it> wrote:

>
> On 22/12/22 20:06, Dave May wrote:
>
>
>
> On Thu 22. Dec 2022 at 10:27, Matteo Semplice <
> matteo.semplice at uninsubria.it> wrote:
>
>> Dear Dave and Matt,
>>
>>     I am really dealing with two different use cases in a code that will
>> compute a levelset function passing through a large set of points. If I had
>> DMSwarmSetMigrateType() and if it were safe to switch the migration mode
>> back and forth in the same swarm, this would cover all my use cases here.
>> Is it safe to add it back to petsc? Details below if you are curious.
>>
>> 1) During preprocessing I am loading a point cloud from disk (in whatever
>> order it comes) and need to send the particles to the right ranks. Since
>> the background DM is a DMDA I can easily figure out the destination rank.
>> This would be covered by your suggestion not to attach the DM, except that
>> later I need to locate these points with respect to the background cells in
>> order to initialize data on the Vecs associated with the DMDA.
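>>
>> In pseudo-code, the preprocessing I have in mind is roughly the
>> following (just a sketch with error checking omitted; the variable
>> names are mine):
>>
>>   DMCreate(PETSC_COMM_WORLD, &sw);
>>   DMSetType(sw, DMSWARM);
>>   DMSetDimension(sw, 2);
>>   DMSwarmSetType(sw, DMSWARM_PIC);
>>   DMSwarmSetCellDM(sw, da);                  /* the background DMDA */
>>   DMSwarmRegisterPetscDatatypeField(sw, "original_rank", 1, PETSC_INT);
>>   DMSwarmFinalizeFieldRegister(sw);
>>   DMSwarmSetLocalSizes(sw, n_read_locally, 4);
>>   DMSwarmGetField(sw, DMSwarmPICField_coor, NULL, NULL, (void **)&coor);
>>   /* ... copy the coordinates read from disk into coor[] ... */
>>   DMSwarmRestoreField(sw, DMSwarmPICField_coor, NULL, NULL, (void **)&coor);
>>   DMSwarmMigrate(sw, PETSC_TRUE);   /* send particles to the owning rank */
>>   /* locate the received particles w.r.t. the background cells */
>>   DMSwarmSortGetAccess(sw);
>>   DMSwarmSortGetPointsPerCell(sw, cell, &npoints, &plist);
>>   /* ... initialize the DMDA Vecs from the particles in each cell ... */
>>   DMSwarmSortRestoreAccess(sw);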
>>
>> 2) Then I need to implement a semilagrangian time evolution scheme. For
>> this I'd like to send particles around at the "foot of characteristic",
>> collect data there and then send them back to the originating point. The
>> first migration would be based on particle coordinates
>> (DMSwarmMigrate_DMNeighborScatter and the restriction to only neighbouring
>> ranks is perfect), while for the second move it would be easier to just
>> send them back to the originating rank, which I can easily store in an Int
>> field in the swarm. Thus at each timestep I'd need to swap migrate types in
>> this swarm (DMNeighborScatter for moving them to the feet and BASIC to send them
>> back).
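>>
>> Per timestep this would schematically be (assuming
>> DMSwarmSetMigrateType() comes back; the migrate-type names below are
>> my reading of the DMSwarmMigrateType enum):
>>
>>   /* ... move each particle to the foot of its characteristic by
>>          updating DMSwarmPICField_coor ... */
>>   DMSwarmSetMigrateType(sw, DMSWARM_MIGRATE_DMCELLNSCATTER);
>>   DMSwarmMigrate(sw, PETSC_TRUE);    /* scatter to neighbouring ranks */
>>   /* ... collect data at the feet ... */
>>   /* ... copy "original_rank" into the DMSwarm_rank field ... */
>>   DMSwarmSetMigrateType(sw, DMSWARM_MIGRATE_BASIC);
>>   DMSwarmMigrate(sw, PETSC_TRUE);    /* send everything straight back */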
>>
>
> When you use BASIC, you would have to explicitly call the point location
> routine from your code as BASIC does not interact with the DM.
>
> Based on what I see in the code, switching migrate modes between BASIC
> and DMNeighborScatter should be safe.
>
> If you are fine calling the point location from your side then what you
> propose should work.
>
> If I understood the code correctly, BASIC will just migrate particles,
> sending each one to the rank stored in DMSwarmField_rank, right? That'd be
> easy, since I can create a DMSwarm with all the data I need plus an extra int
> field (say "original_rank") and copy the original ranks into DMSwarmField_rank
> before calling migrate for the "going back" step. After this backward migration I
> do not need to locate particles again (e.g. I do not need
> DMSwarmSortGetAccess after the BASIC migration, but only after the
> DMNeighborScatter one).
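>
> Concretely, the "going back" step would be something like this sketch
> (assuming the DMSwarm_rank field holds PetscInt values and that the
> migrate type can be switched again):
>
>   PetscInt *rankval, *orig, p, n;
>   DMSwarmGetLocalSize(sw, &n);
>   DMSwarmGetField(sw, DMSwarmField_rank, NULL, NULL, (void **)&rankval);
>   DMSwarmGetField(sw, "original_rank", NULL, NULL, (void **)&orig);
>   for (p = 0; p < n; ++p) rankval[p] = orig[p];  /* back to the origin */
>   DMSwarmRestoreField(sw, "original_rank", NULL, NULL, (void **)&orig);
>   DMSwarmRestoreField(sw, DMSwarmField_rank, NULL, NULL, (void **)&rankval);
>   DMSwarmSetMigrateType(sw, DMSWARM_MIGRATE_BASIC);  /* once restored */
>   DMSwarmMigrate(sw, PETSC_TRUE);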
>
> Thus having DMSwarmSetMigrateType() back should be enough for me.
>
Hi Matteo,

I have done this in

  https://gitlab.com/petsc/petsc/-/merge_requests/5941

I also hope to get the fix for your DMDA issue in there.

  Thanks,

    Matt


> Thanks
>
>     Matteo
>
>
> Cheers
> Dave
>
>
>
>> Thanks
>>
>>     Matteo
>> On 22/12/22 18:40, Dave May wrote:
>>
>> Hey Matt,
>>
>> On Thu 22. Dec 2022 at 05:02, Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> On Thu, Dec 22, 2022 at 6:28 AM Matteo Semplice <
>>> matteo.semplice at uninsubria.it> wrote:
>>>
>>>> Dear all
>>>>
>>>>     please ignore my previous email and read this one: I have localized
>>>> the problem better. Maybe DMSwarmMigrate is designed to migrate
>>>> particles only to the first neighbouring ranks?
>>>>
>>> Yes, I believe that was the design.
>>>
>>> Dave, is this correct?
>>>
>>
>> Correct. DMSwarmMigrate_DMNeighborScatter() only scatters points to the
>> neighbouring ranks, where neighbours are defined by the DM provided to
>> represent the mesh.
>>
>> DMSwarmMigrate_DMNeighborScatter() is selected by default if you attach
>> a DM.
>>
>> The scatter method can be overridden with
>>
>> DMSwarmSetMigrateType()
>>
>> however it appears this method no longer exists.
>>
>> If one can determine the exact rank where points should be sent
>> and it is not going to be the neighbour rank (given by the DM), I would
>> suggest not attaching the DM at all.
>>
>> However, if this is not possible and one wanted to scatter to, say, the
>> neighbours' neighbours, we will have to add a new interface and refactor
>> things a little bit.
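>>
>> Without a cell DM the swarm falls back to the basic migration, which
>> (if I read the code correctly) just honours the DMSwarm_rank field,
>> so the "exact rank" route would look roughly like the untested sketch
>> below, where owner_rank_of_point() stands for whatever lookup you
>> implement from the DMDA decomposition:
>>
>>   DMCreate(comm, &sw);
>>   DMSetType(sw, DMSWARM);
>>   DMSwarmSetType(sw, DMSWARM_BASIC);          /* no cell DM attached */
>>   /* ... register fields, set sizes, fill in the coordinates ... */
>>   DMSwarmGetLocalSize(sw, &n);
>>   DMSwarmGetField(sw, DMSwarmField_rank, NULL, NULL, (void **)&rankval);
>>   for (p = 0; p < n; ++p) rankval[p] = owner_rank_of_point(p);
>>   DMSwarmRestoreField(sw, DMSwarmField_rank, NULL, NULL, (void **)&rankval);
>>   DMSwarmMigrate(sw, PETSC_TRUE);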
>>
>> Cheers
>> Dave
>>
>>
>>
>>>   Thanks,
>>>
>>>     Matt
>>>
>>>
>>>> On 22/12/22 11:44, Matteo Semplice wrote:
>>>>
>>>> Dear everybody,
>>>>
>>>>     I have dug a bit into the code and I am able to add more
>>>> information.
>>>> On 02/12/22 12:48, Matteo Semplice wrote:
>>>>
>>>> Hi.
>>>> I am sorry to take this up again, but further tests show that it's not
>>>> right yet.
>>>>
>>>> On 04/11/22 12:48, Matthew Knepley wrote:
>>>>
>>>> On Fri, Nov 4, 2022 at 7:46 AM Matteo Semplice <
>>>> matteo.semplice at uninsubria.it> wrote:
>>>>
>>>>> On 04/11/2022 02:43, Matthew Knepley wrote:
>>>>>
>>>>> On Thu, Nov 3, 2022 at 8:36 PM Matthew Knepley <knepley at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> On Thu, Oct 27, 2022 at 11:57 AM Semplice Matteo <
>>>>>> matteo.semplice at uninsubria.it> wrote:
>>>>>>
>>>>>>> Dear Petsc developers,
>>>>>>>     I am trying to use a DMSwarm to locate a cloud of points with
>>>>>>> respect to a background mesh. In the real application the points will be
>>>>>>> loaded from disk, but I have created a small demo in which
>>>>>>>
>>>>>>>    - each processor creates Npart particles, all within the domain
>>>>>>>    covered by the mesh, but not all in the local portion of the mesh
>>>>>>>    - the particles are migrated
>>>>>>>
>>>>>>> After migration most particles are no longer in the DMSwarm (how
>>>>>>> many and which ones seem to depend on the number of CPUs, but it never
>>>>>>> happens that all particles survive the migration process).
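>>>>>>>
>>>>>>> In outline the demo does (sketch only, error checking stripped):
>>>>>>>
>>>>>>>   DMSwarmSetLocalSizes(sw, Npart, 4);
>>>>>>>   DMSwarmGetField(sw, DMSwarmPICField_coor, NULL, NULL, (void **)&coor);
>>>>>>>   /* place the Npart local particles anywhere in the global domain */
>>>>>>>   for (p = 0; p < Npart; ++p) { coor[2*p] = ...; coor[2*p+1] = ...; }
>>>>>>>   DMSwarmRestoreField(sw, DMSwarmPICField_coor, NULL, NULL, (void **)&coor);
>>>>>>>   DMSwarmGetSize(sw, &n_before);
>>>>>>>   DMSwarmMigrate(sw, PETSC_TRUE);
>>>>>>>   DMSwarmGetSize(sw, &n_after);   /* n_after < n_before: particles lost */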
>>>>>>>
>>>>>> Thanks for sending this. I found the problem. Someone has some
>>>>>> overly fancy code inside DMDA to figure out the local bounding box from the
>>>>>> coordinates.
>>>>>> It is broken for DM_BOUNDARY_GHOSTED, but we never tested with this.
>>>>>> I will fix it.
>>>>>>
>>>>>
>>>>> Okay, I think this fix is correct
>>>>>
>>>>>   https://gitlab.com/petsc/petsc/-/merge_requests/5802
>>>>>
>>>>> I incorporated your test as src/dm/impls/da/tests/ex1.c. Can you take
>>>>> a look and see if this fixes your issue?
>>>>>
>>>>> Yes, we have tested 2d and 3d, with various combinations of
>>>>> DM_BOUNDARY_* along different directions and it works like a charm.
>>>>>
>>>>> On a side note, neither DMSwarmViewXDMF nor DMSwarmMigrate seems to be
>>>>> implemented for 1D: I get
>>>>>
>>>>> [0]PETSC ERROR: No support for this operation for this object type
>>>>> [0]PETSC ERROR: Support not provided for 1D
>>>>>
>>>>> However, currently I have no need for this feature.
>>>>>
>>>>> Finally, if the test is meant to stay in the source, you may remove
>>>>> the call to DMSwarmRegisterPetscDatatypeField as in the attached
>>>>> patch.
>>>>>
>>>>> Thanks a lot!!
>>>>>
>>>> Thanks! Glad it works.
>>>>
>>>>    Matt
>>>>
>>>> There are still problems when not using 1, 2 or 4 CPUs. Any other number
>>>> of CPUs that I've tested does not work correctly.
>>>>
>>>> I have now modified private_DMDALocatePointsIS_2D_Regular to print out
>>>> some debugging information. I see that this is called twice during
>>>> migration, once before and once after DMSwarmMigrate_DMNeighborScatter. If
>>>> I understand correctly, the second call to
>>>> private_DMDALocatePointsIS_2D_Regular should be able to locate all
>>>> particles owned by the rank but it fails for some of them because they have
>>>> been sent to the wrong rank (despite being well away from process
>>>> boundaries).
>>>>
>>>> For example, running the example src/dm/impls/da/tests/ex1.c with Nx=21
>>>> (20x20 Q1 elements on [-1,1]X[-1,1]) with 3 processors,
>>>>
>>>> - the particles (-0.191,-0.462) and (0.191,-0.462) are sent to cpu2
>>>> instead of cpu0
>>>>
>>>> - those at (-0.287,-0.693) and (0.287,-0.693) are sent to cpu1 instead
>>>> of cpu0
>>>>
>>>> - those at (0.191,0.462) and (-0.191,0.462) are sent to cpu0 instead of
>>>> cpu2
>>>>
>>>> (This is 2d and thus not affected by the 3d issue mentioned yesterday
>>>> on petsc-dev. Tests were made based on the release branch pulled out this
>>>> morning, i.e. on commit bebdc8d016f).
>>>>
>>>> I see: particles are sent "all around" and not only to the destination
>>>> rank.
>>>>
>>>> Still however, running the example src/dm/impls/da/tests/ex1.c with
>>>> Nx=21 (20x20 Q1 elements on [-1,1]X[-1,1]) with 3 processors, there are 2
>>>> particles initially owned by rank2 (at y=-0.6929 and x=+/-0.2870) that are
>>>> sent only to rank1 and never make it to rank0 and are thus lost in the end
>>>> since rank1, correctly, discards them.
>>>>
>>>> Thanks
>>>>
>>>>     Matteo
>>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>>
>> --
>> ---
>> Professore Associato in Analisi Numerica
>> Dipartimento di Scienza e Alta Tecnologia
>> Università degli Studi dell'Insubria
>> Via Valleggio, 11 - Como
>>
>> --
> ---
> Professore Associato in Analisi Numerica
> Dipartimento di Scienza e Alta Tecnologia
> Università degli Studi dell'Insubria
> Via Valleggio, 11 - Como
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>

