[petsc-users] parallel interpolation?

Matthew Knepley knepley at gmail.com
Tue Feb 17 09:11:38 CST 2015


On Tue, Feb 17, 2015 at 8:15 AM, Gideon Simpson <gideon.simpson at gmail.com>
wrote:

> I’m gathering from your suggestions that I would need, a priori, knowledge
> of how many ghost points are required, is that right?
>

We have to be more precise about "a priori". You can certainly create a
VecScatter on the fly every time if your communication pattern is changing.
However, how will you know what needs to be communicated?
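
For concreteness, a minimal sketch of the on-the-fly approach (the names
nghost, ghosts[], and xghost below are placeholders for quantities you would
compute each step, not anything from the original mail):

    /* Sketch: rebuild the scatter whenever the communication pattern changes.
       nghost and ghosts[] (global indices of x needed locally this step) and
       the sequential work vector xghost (length nghost) are placeholders.   */
    IS             from;
    VecScatter     scatter;
    PetscErrorCode ierr;

    ierr = ISCreateGeneral(PETSC_COMM_SELF, nghost, ghosts, PETSC_COPY_VALUES, &from);CHKERRQ(ierr);
    ierr = VecScatterCreate(x, from, xghost, NULL, &scatter);CHKERRQ(ierr); /* NULL: fill xghost in order */
    ierr = VecScatterBegin(scatter, x, xghost, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
    ierr = VecScatterEnd(scatter, x, xghost, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
    /* ... interpolate using the owned part of x plus the gathered xghost ... */
    ierr = VecScatterDestroy(&scatter);CHKERRQ(ierr);
    ierr = ISDestroy(&from);CHKERRQ(ierr);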

   Matt


> -gideon
>
> On Feb 17, 2015, at 9:10 AM, Matthew Knepley <knepley at gmail.com> wrote:
>
> On Tue, Feb 17, 2015 at 7:46 AM, Gideon Simpson <gideon.simpson at gmail.com>
> wrote:
>
>> Suppose I have data in Vec x and Vec y, and I want to interpolate this
>> onto Vec xx, storing the values in Vec yy.  All vectors have the same
>> layout.  The problem is that, for example, some of the values in xx on
>> processor 0 may need the values of x and y on processor 1, and so on.
>> Aside from just using sequential vectors, so that everything is local, is
>> there a reasonable way to make this computation?
>>
>
> At the most basic linear algebra level, you would construct a VecScatter
> which maps the pieces you need from other processes, along with the local
> portion, into a local vector; you would use that to calculate values, which
> you then put back into your owned portion of a global vector. Thus local
> vectors have halos and global vectors do not.
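
An illustrative sketch of that last step (rows[], vals[], and nowned are
hypothetical names, not from the original mail): once the interpolated values
for your owned entries are computed from the haloed local data, they can be
inserted back into the owned portion of the global vector:

    /* Sketch: rows[] are global indices owned by this process and vals[]
       the interpolated results computed from the local + halo data.        */
    ierr = VecSetValues(yy, nowned, rows, vals, INSERT_VALUES);CHKERRQ(ierr);
    ierr = VecAssemblyBegin(yy);CHKERRQ(ierr);
    ierr = VecAssemblyEnd(yy);CHKERRQ(ierr);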
>
> If your halo regions (values you need from other processes) have a common
> topology, then we have simpler support that will make the VecScatter for
> you. For example, if your values lie on a Cartesian grid and you just need
> neighbors within distance k, you can use a DMDA to express this and
> automatically make the VecScatter. Likewise, if your values lie on an
> unstructured mesh and you need a distance-k adjacency, DMPlex can create
> the scatter for you.
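
For the Cartesian case, a rough sketch of the DMDA route (the grid size M and
stencil width k are placeholders; newer PETSc releases may also expect
DMSetFromOptions/DMSetUp after the create call):

    /* Sketch: a 1D DMDA with stencil width k; DMGlobalToLocal fills the
       ghost entries of the local vector from neighboring processes.        */
    DM  da;
    Vec xg, xl;

    ierr = DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, M, 1, k, NULL, &da);CHKERRQ(ierr);
    ierr = DMCreateGlobalVector(da, &xg);CHKERRQ(ierr);
    ierr = DMGetLocalVector(da, &xl);CHKERRQ(ierr);
    ierr = DMGlobalToLocalBegin(da, xg, INSERT_VALUES, xl);CHKERRQ(ierr);
    ierr = DMGlobalToLocalEnd(da, xg, INSERT_VALUES, xl);CHKERRQ(ierr);
    /* ... compute with xl, which now contains owned values plus ghosts ... */
    ierr = DMRestoreLocalVector(da, &xl);CHKERRQ(ierr);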
>
> If you are creating the VecScatter yourself, it might be easier to use the
> new PetscSF instead since it only needs one-sided information, and performs
> the same job. This is what DMPlex uses to do the communication.
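
A minimal sketch of the one-sided description PetscSF needs (nroots, nleaves,
owner[], offset[], rootdata, and leafdata are placeholders; note the exact
PetscSFBcast signature has changed slightly across PETSc versions):

    /* Sketch: each process lists only what it needs (leaves) and where each
       piece lives (owning rank + index there); PetscSF derives the pattern. */
    PetscSF      sf;
    PetscSFNode *remote;
    PetscInt     i;

    ierr = PetscMalloc1(nleaves, &remote);CHKERRQ(ierr);
    for (i = 0; i < nleaves; i++) { remote[i].rank = owner[i]; remote[i].index = offset[i]; }
    ierr = PetscSFCreate(PETSC_COMM_WORLD, &sf);CHKERRQ(ierr);
    ierr = PetscSFSetGraph(sf, nroots, nleaves, NULL, PETSC_COPY_VALUES, remote, PETSC_OWN_POINTER);CHKERRQ(ierr);
    ierr = PetscSFBcastBegin(sf, MPIU_SCALAR, rootdata, leafdata);CHKERRQ(ierr);
    ierr = PetscSFBcastEnd(sf, MPIU_SCALAR, rootdata, leafdata);CHKERRQ(ierr);
    ierr = PetscSFDestroy(&sf);CHKERRQ(ierr);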
>
>   Thanks,
>
>      Matt
>
>
>> -gideon
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener