[petsc-users] DMSTAG Gathering Vector on single process
Colton Bryant
coltonbryant2021 at u.northwestern.edu
Wed Dec 13 10:21:43 CST 2023
Hi,
Thanks for the help last week. The suggestions made my implementation much
cleaner. I have one follow-up question: is there a way to undo this
operation? I know the VecScatter can be run in reverse to redistribute the
arrays, but I didn't see an easy way to migrate the DMDA vectors back into
the DMStag object.
Thanks for any advice.
-Colton
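[Editor's note: a minimal sketch of the "reverse" direction mentioned above,
reusing the objects from the forward gather discussed later in this thread.
Variable names are illustrative, and the remaining DMDA-to-DMStag step is
exactly the open question here.]

  /* Assumes petscdmda.h is included and that tozero, xzero, xnat, xda, dmda
     were created for the forward (gather) path discussed below.            */
  PetscCall(VecScatterBegin(tozero, xzero, xnat, INSERT_VALUES, SCATTER_REVERSE));
  PetscCall(VecScatterEnd(tozero, xzero, xnat, INSERT_VALUES, SCATTER_REVERSE));
  PetscCall(DMDANaturalToGlobalBegin(dmda, xnat, INSERT_VALUES, xda));
  PetscCall(DMDANaturalToGlobalEnd(dmda, xnat, INSERT_VALUES, xda));
  /* xda now holds the redistributed, DMDA-ordered data; moving it back into
     the DMStag vector is the step asked about above.                       */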
On Wed, Dec 6, 2023 at 8:18 PM Barry Smith <bsmith at petsc.dev> wrote:
>
>
> On Dec 6, 2023, at 8:35 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
> On Wed, Dec 6, 2023 at 8:10 PM Barry Smith <bsmith at petsc.dev> wrote:
>
>>
>> Depending on the serial library, you may not need to split the vector
>> into per-component DMDA vectors with DMStagVecSplitToDMDA(). Just do a
>> global-to-natural reordering and a scatter-to-zero on the full vector;
>> the full vector is then on the first rank, and you can access whatever
>> you need from that one vector, if the serial library allows it.
>>
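[Editor's note: a minimal sketch of the scatter-to-zero part of this
suggestion on the full DMStag vector. Names are placeholders, and the
natural-ordering step is the catch discussed just below.]

  /* Assumes petscdmstag.h is included and dmStag already exists. Gathers the
     entire DMStag global vector, all components at once, onto rank 0.       */
  Vec        x, xzero;
  VecScatter tozero;
  PetscCall(DMCreateGlobalVector(dmStag, &x));
  PetscCall(VecScatterCreateToZero(x, &tozero, &xzero));
  PetscCall(VecScatterBegin(tozero, x, xzero, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecScatterEnd(tozero, x, xzero, INSERT_VALUES, SCATTER_FORWARD));
  /* Note: xzero is in PETSc's parallel (distribution-dependent) ordering,
     since DMStag has no GlobalToNatural, as discussed below.                */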
>
> Does DMStag have a GlobalToNatural?
>
>
> Good point, it does not appear to have such a thing, though it could.
>
> Also, the serial code would have to have identical interleaving.
>
> Thanks,
>
> Matt
>
>> On Dec 6, 2023, at 6:37 PM, Colton Bryant <
>> coltonbryant2021 at u.northwestern.edu> wrote:
>>
>> Ah, excellent! I was not aware it was possible to preallocate the objects
>> and just migrate the data each time.
>>
>> Thanks!
>> -Colton
>>
>> On Wed, Dec 6, 2023 at 5:18 PM Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> On Wed, Dec 6, 2023 at 5:54 PM Colton Bryant <
>>> coltonbryant2021 at u.northwestern.edu> wrote:
>>>
>>>> Hello,
>>>>
>>>> I am working on a code in which a DMStag object is used to solve a fluid
>>>> flow problem, and at each timestep of my simulation I need to gather the
>>>> flow data on a single process to interact with an existing (serial)
>>>> library. After looking around, the approach I've tried is (sketched in
>>>> code after the list):
>>>>
>>>> -use DMStagVecSplitToDMDA to extract vectors of each component of the
>>>> flow
>>>> -use DMDACreateNaturalVector and DMDAGlobalToNatural to get the
>>>> components naturally ordered
>>>> -use VecScatterCreateToZero to set up and then do the scatter to gather
>>>> on the single process
>>>>
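[Editor's note: a minimal per-timestep sketch of these three steps for a
single element-centered component. The stencil location, component index,
and variable names are illustrative assumptions, not from the original post.]

  /* Assumes petscdmstag.h and petscdmda.h are included and that dmStag and
     its global vector x already exist.                                      */
  DM         dmda;
  Vec        xda, xnat, xzero;
  VecScatter tozero;

  /* 1) extract one component into a vector on a compatible DMDA */
  PetscCall(DMStagVecSplitToDMDA(dmStag, x, DMSTAG_ELEMENT, 0, &dmda, &xda));

  /* 2) reorder into the natural (i,j,k) ordering */
  PetscCall(DMDACreateNaturalVector(dmda, &xnat));
  PetscCall(DMDAGlobalToNaturalBegin(dmda, xda, INSERT_VALUES, xnat));
  PetscCall(DMDAGlobalToNaturalEnd(dmda, xda, INSERT_VALUES, xnat));

  /* 3) gather the naturally ordered vector onto rank 0 */
  PetscCall(VecScatterCreateToZero(xnat, &tozero, &xzero));
  PetscCall(VecScatterBegin(tozero, xnat, xzero, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecScatterEnd(tozero, xnat, xzero, INSERT_VALUES, SCATTER_FORWARD));

  /* ... pass xzero (full length on rank 0, length 0 elsewhere) to the serial
     library, then destroy everything above; this per-step create/destroy is
     the overhead described below ...                                        */
  PetscCall(VecScatterDestroy(&tozero));
  PetscCall(VecDestroy(&xzero));
  PetscCall(VecDestroy(&xnat));
  PetscCall(VecDestroy(&xda));
  PetscCall(DMDestroy(&dmda));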
>>>> Unless I'm misunderstanding something, this method results in a lot of
>>>> memory allocation and freeing at each step of the evolution, and I was
>>>> wondering if there is a way to perform such a scatter directly from the
>>>> DMStag object without the splitting I'm doing here.
>>>>
>>>
>>> 1) You can see here:
>>>
>>>
>>> https://petsc.org/main/src/dm/impls/stag/stagda.c.html#DMStagVecSplitToDMDA
>>>
>>> that this function is small. You can do the DMDA creation manually, and
>>> then just call DMStagMigrateVecDMDA() each time, which will not create
>>> anything.
>>>
>>> 2) You can create the natural vector upfront, and just scatter each time.
>>>
>>> 3) You can create the serial vector upfront, and just scatter each time
>>> (a combined sketch of 1)-3) follows below).
>>>
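[Editor's note: a hedged sketch of this create-once, reuse-each-step pattern
for a single element-centered component. Names and the stencil location are
illustrative; check the exact DMStagMigrateVecDMDA() signature against the
stagda.c source linked above.]

  /* One-time setup: create the DMDA, its vector, the natural and rank-0
     vectors, and the scatter by doing the split once.                      */
  DM         dmda;
  Vec        xda, xnat, xzero;
  VecScatter tozero;
  PetscCall(DMStagVecSplitToDMDA(dmStag, x, DMSTAG_ELEMENT, 0, &dmda, &xda));
  PetscCall(DMDACreateNaturalVector(dmda, &xnat));
  PetscCall(VecScatterCreateToZero(xnat, &tozero, &xzero));

  /* Each timestep: migrate into the existing DMDA vector and reuse the
     scatters; no new objects are created.                                  */
  PetscCall(DMStagMigrateVecDMDA(dmStag, x, DMSTAG_ELEMENT, 0, dmda, xda));
  PetscCall(DMDAGlobalToNaturalBegin(dmda, xda, INSERT_VALUES, xnat));
  PetscCall(DMDAGlobalToNaturalEnd(dmda, xda, INSERT_VALUES, xnat));
  PetscCall(VecScatterBegin(tozero, xnat, xzero, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecScatterEnd(tozero, xnat, xzero, INSERT_VALUES, SCATTER_FORWARD));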
>>> This is some data movement. You can compress the global-to-natural and
>>> to-zero scatters using
>>>
>>> https://petsc.org/main/manualpages/PetscSF/PetscSFCompose/
>>>
>>> as an optimization.
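[Editor's note: since a VecScatter is a PetscSF in recent PETSc versions, the
composition might look roughly like the following, assuming both scatters are
already in hand. sctG2N is a hypothetical handle to the global-to-natural
scatter; only the VecScatterCreateToZero context comes directly from the
steps sketched above.]

  /* Compose the two scatters into a single SF so the per-step gather is one
     communication (sketch only).                                            */
  PetscSF combined;
  PetscCall(PetscSFCompose((PetscSF)sctG2N, (PetscSF)tozero, &combined));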
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>>
>>>> Any advice would be much appreciated!
>>>>
>>>> Best,
>>>> Colton Bryant
>>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>> <http://www.cse.buffalo.edu/~knepley/>
>>>
>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> <http://www.cse.buffalo.edu/~knepley/>
>
>
>