[petsc-users] Fwd: Interpolation in staggered grid
Matthew Knepley
knepley at gmail.com
Tue Oct 16 05:06:27 CDT 2018
On Tue, Oct 16, 2018 at 3:57 AM Anton Popov <popov at uni-mainz.de> wrote:
> Hi Manuel,
>
> you might find it useful to take a look at this code:
> https://bitbucket.org/bkaus/lamem, which is a multi-DMDA staggered-grid
> implementation.
>
> Also, as Matt pointed out, there is direct PETSc support for staggered
> grids (DMStag), recently contributed by Patrick Sanan.
>
Anton is right, LaMEM is a great example. Did you also have a look at SNES
ex30? It is staggered-grid finite volumes and works in parallel.
Thanks,
Matt
> Thanks,
>
> Anton
>
> On 15.10.2018 23:22, Manuel Valera wrote:
>
>
> Thanks Matthew, I have made some progress, but I am still unsure how to
> proceed to make the DMDAs work as intended. I will try to lay out what I am
> trying now:
>
> I was able to port the interpolation into the DMDA model, but it works only
> in serial runs; it becomes unstable with more than one processor.
>
> What I do is roughly this (see the sketch after this list):
>
> - Calculate density on a cell-centered DMDA object, using a local
> vector so I can access the ghost rows (indices -1 and max+1).
> - Interpolate onto a face-centered DMDA object for velocities, also
> using a local vector.
>
> Doing this I get the right results, using the same interpolation I used for
> my non-PETSc implementation of the model, as long as I use only one
> processor. The doubts I have are:
>
> - How do I use the local vectors properly: is operating on them the
> recommended course in this case?
> - How can I access the ghost indices in the global vector so I can
> then communicate GlobalToLocal? Would this be a better strategy?
> - I feel it is wrong to interpolate a cell-centered vector onto a
> face-centered vector using the indices of the latter; what strategy
> would work best in this case?
>
> I also tried opening a global vector with a different DA layout than the
> one it was created with (opening global density on the velocities DMDA
> layout), and this introduced an error in the GlobalToLocal, LocalToLocal,
> and LocalToGlobal communication; the error did not appear if I used the
> local versions of these vectors instead.
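>
> Concretely, the failing pattern is roughly this (hypothetical names); my
> understanding is that a vector can only be scattered through the DM it was
> created from, since the layouts have to match:
>
>   /* density vector laid out by the cell-centered DMDA ... */
>   DMCreateGlobalVector(da_cen, &rho_glob);
>   /* ... but scattered through the face-centered DMDA: the
>      layouts differ, so the communication errors out */
>   DMGlobalToLocalBegin(da_vel, rho_glob, INSERT_VALUES, rho_loc);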
>
> Thanks for your help,
>
> Manuel
>
>
>
>
>
> On Sat, Oct 6, 2018 at 4:45 AM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Fri, Oct 5, 2018 at 6:49 PM Manuel Valera <mvalera-w at sdsu.edu> wrote:
>>
>>> Hello,
>>>
>>> I'm trying to do a simple variable interpolation, from a cell center to
>>> a face, in a staggered grid. My model's data management is done with
>>> DMDAs, with two different DMs, one for each cell position.
>>>
>>> I already did this task in a Fortran-only version of the model, using the
>>> 4 closest neighbors of the scalars (cell centers) to interpolate at the
>>> velocity site (cell face), with a loop over the domain. Somehow this easy
>>> task is not translating into the DMDA framework.
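>>>
>>> For concreteness, the Fortran loop computes a plain average of the 4
>>> surrounding cell centers, something like
>>>
>>>   u(i,j) = 0.25 * ( s(i,j) + s(i-1,j) + s(i,j-1) + s(i-1,j-1) )
>>>
>>> with s the cell-centered scalar and u the face value (indices here are
>>> illustrative).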
>>>
>>
>> It's not clear to me what problem you are having. We have done this
>> before. For example,
>>
>>
>> https://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex30.c.html
>>
>> I would note that the development version of PETSc now has DMStag, which
>> supports staggered-grid discretizations directly.
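>>
>> For reference, creating a 2D staggered DM there looks roughly like this
>> (a sketch; the dof counts and grid sizes are just assumptions for a
>> velocity/pressure layout):
>>
>>   DM dm;
>>   /* 0 dof per vertex, 1 per face (normal velocity), 1 per element */
>>   DMStagCreate2d(PETSC_COMM_WORLD,
>>                  DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, /* boundary types  */
>>                  64, 64,                             /* global elements */
>>                  PETSC_DECIDE, PETSC_DECIDE,         /* ranks per dim   */
>>                  0, 1, 1,                            /* vertex/face/element dof */
>>                  DMSTAG_STENCIL_BOX, 1,              /* stencil type, width */
>>                  NULL, NULL,                         /* ownership ranges */
>>                  &dm);
>>   DMSetFromOptions(dm);
>>   DMSetUp(dm);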
>>
>> Thanks,
>>
>> Matt
>>
>>
>>> I'm not sure what I'm doing wrong, or whether I'm unaware of an easier
>>> way PETSc may have for this task. If you could point out an easier
>>> strategy or an example, I would be grateful.
>>>
>>> Thanks,
>>>
>>>
>>>
>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/