[petsc-users] need help with vector interpolation on nonuniform DMDA grids

Matthew Knepley knepley at gmail.com
Tue Nov 6 09:01:04 CST 2018


On Tue, Nov 6, 2018 at 9:45 AM Francesco Magaletti <
francesco_magaletti at fastwebnet.it> wrote:

> Hi Matt,
> thanks for the reply, but I didn’t catch exactly your point. What do you
> mean with “first bin the points into cells, and then use the
> interpolant from the discretization” ? To me it sounds like my “naive”
> approach, but maybe I missed something in your suggestion.
>

I did not read the whole mail before. Your naive approach is exactly right.
However, in parallel, you need to keep track of the bounds of each process,
send the points to the right process, and then do local location and
interpolation. It is some bookkeeping.

Explaining this gave me another idea. I think that DMSwarm might do this
automatically. You create a DMSwarm, insert all the points at which you need
to interpolate, and set its cellDM to the DMDA used for interpolation. Then
you call DMSwarmMigrate() to put the points on the correct process, and then
interpolate. Tell me if this does not work. I am Cc'ing the author, Dave May,
in case I have made an error.
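For concreteness, the DMSwarm call sequence I have in mind is roughly the following. This is an untested sketch, not a working program: error checking, the DMDA setup, and the final interpolation loop are elided, and `da`, `nlocal`, and `y` stand for the user's cell DM, local point count, and target coordinates.

```c
DM         sw;
PetscReal *coor;

DMCreate(PETSC_COMM_WORLD, &sw);
DMSetType(sw, DMSWARM);
DMSetDimension(sw, 1);
DMSwarmSetType(sw, DMSWARM_PIC);
DMSwarmSetCellDM(sw, da);                /* da = DMDA used for interpolation */
DMSwarmFinalizeFieldRegister(sw);
DMSwarmSetLocalSizes(sw, nlocal, 0);     /* nlocal = points created locally */

/* Fill in the coordinates of the target points */
DMSwarmGetField(sw, DMSwarmPICField_coor, NULL, NULL, (void **) &coor);
/* ... set coor[i] = y[i] for each local point ... */
DMSwarmRestoreField(sw, DMSwarmPICField_coor, NULL, NULL, (void **) &coor);

/* Move each point to the process whose cells contain it */
DMSwarmMigrate(sw, PETSC_TRUE);

/* ... then locate each point in its local cell and interpolate ... */
```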

  Thanks,

    Matt


> My problem comes from the solution of a 1D time evolving pde coupled with
> a hand-made mesh adapting algorithm. Therefore I’m already using a DMDA to
> manage all the communication issues among processors. Every n timesteps I
> move the grid and then need to evaluate the old solution at the new grid
> points, which is why I need an efficient way to do it. Is it feasible to
> let DMDA objects talk to DMPlexes, so as to keep the previous code
> structure unaltered and use DMPlex only for the interpolation routine?
>
> Thanks
>
>
>
> Il giorno 06/nov/2018, alle ore 15:14, Matthew Knepley <knepley at gmail.com>
> ha scritto:
>
> On Tue, Nov 6, 2018 at 5:11 AM Francesco Magaletti via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Dear all,
>>
>> I would like to ask whether there is an easy and efficient way, in
>> parallel (using PETSc functions), to interpolate a DMDA vector associated
>> with a nonuniform 1D grid onto another DMDA vector of the same length but
>> associated with a different nonuniform grid.
>>
>> Let me rephrase it as clearly as I can:
>>
>> I have two structured nonuniform 1D grids with coordinate vectors x[i]
>> and y[i]. Both the domains have been discretized with the same number of
>> points, but the coordinate vectors x and y are different. I have a
>> discretized field u[i] = u(x[i]) and I would like to use these point values
>> to evaluate the values u(y[i]) in the points of the second grid.
>>
>> I read on the manual pages that functions like DMCreateInterpolation or
>> similar work only with different but uniform DMDAs. Did I understand
>> correctly?
>>
>> A naive approach, in a serial code, would be to find the points x[i] and
>> x[i+1] that surround each point y[j] and then linearly interpolate between
>> the values u[i] and u[i+1]. I suspect that this is not the most efficient
>> way to do it. Moreover, it won't work in parallel since, in principle, I
>> do not know beforehand how many ghost nodes would be necessary to perform
>> all the interpolations.
>>
>> Thank you in advance for your help!
>>
>
> This has not been written, but is not that hard. You would first bin the
> points into cells, and then use the
> interpolant from the discretization. This is how we do it in the
> unstructured case. Actually, if you wrote
> your nonuniform 1D meshes as DMPlexes, I think it would work right now
> (have not tested).
>
>   Thanks,
>
>    Matt
>
>
>> Francesco
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
>
>
