[petsc-users] Interpolation Between DMSTAG Objects

Matthew Knepley knepley at gmail.com
Wed Jun 14 14:16:57 CDT 2023


On Wed, Jun 7, 2023 at 10:46 AM Colton Bryant <
coltonbryant2021 at u.northwestern.edu> wrote:

> Hello,
>
> I am new to PETSc so apologies in advance if there is an easy answer to
> this question I've overlooked.
>
> I have a problem in which the computational domain is divided into two
> overlapping regions (overset grids). I would like to discretize each region
> as a separate DMSTAG object. What I do not understand is how to go about
> interpolating a vector from say DM1 onto nodes in DM2. My current (likely
> inefficient) idea is to create vectors of query points on DM2, share these
> vectors among all processes, perform the interpolations on DM1, and then
> insert the results into the vector on DM2.
>
> Before I embark on manually setting up the communication here, I wanted to
> ask whether there is any native support for this kind of operation in PETSc
> that I may be missing.
>
> Thanks in advance for any advice!
>

This sounds like a good first step. We do not currently have support for
this, but I would like to support this pattern. I was thinking:

 1) Create a DMSwarm() with your query points

 2) Call DMSwarmMigrate() to put the points on the correct processors

     This needs some implementation work, but is not super hard. We need to
     preserve the map so that you can send the results back.

 3) Interpolate on the DMStag

 4) Use the PetscSF that migrated the particles to send back the results

Joe and I are working on this support.
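
Very roughly, steps 1 and 2 might look like the sketch below (untested; it
assumes a DMSwarm can use your DMStag, or a DMDA derived from it, as its
cell DM, and names like dm1, n_query, and "interp_vals" are just placeholders):

  #include <petscdmswarm.h>

  DM        swarm;
  PetscReal *coor;
  PetscInt  n_query;   /* number of local query points coming from DM2 */

  /* Step 1: a swarm whose "particles" are the DM2 query points */
  PetscCall(DMCreate(PETSC_COMM_WORLD, &swarm));
  PetscCall(DMSetType(swarm, DMSWARM));
  PetscCall(DMSetDimension(swarm, 2));
  PetscCall(DMSwarmSetType(swarm, DMSWARM_PIC));
  PetscCall(DMSwarmSetCellDM(swarm, dm1));   /* dm1: the DMStag interpolated from */
  PetscCall(DMSwarmRegisterPetscDatatypeField(swarm, "interp_vals", 1, PETSC_SCALAR));
  PetscCall(DMSwarmFinalizeFieldRegister(swarm));
  PetscCall(DMSwarmSetLocalSizes(swarm, n_query, 0));

  /* copy the DM2 node coordinates into the swarm coordinate field */
  PetscCall(DMSwarmGetField(swarm, DMSwarmPICField_coor, NULL, NULL, (void **)&coor));
  /* fill coor[2*p], coor[2*p+1] for each local query point p */
  PetscCall(DMSwarmRestoreField(swarm, DMSwarmPICField_coor, NULL, NULL, (void **)&coor));

  /* Step 2: move each point to the rank that owns it in dm1's decomposition */
  PetscCall(DMSwarmMigrate(swarm, PETSC_FALSE));

  /* Step 3: evaluate the dm1 field at each local point and store it in
     "interp_vals"; step 4 (scattering the results back to DM2 using the
     PetscSF from the migration) is the part that needs implementation work. */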

  Thanks,

    Matt


> Best,
> Colton Bryant
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/