[petsc-users] Question about DMLocalToLocal for DM_BOUNDARY_GHOSTED conditions
Matthew Knepley
knepley at gmail.com
Mon May 25 19:22:22 CDT 2020
On Mon, May 25, 2020 at 11:19 AM Lucas Banting <bantingl at myumanitoba.ca>
wrote:
> So DM_BOUNDARY_GHOSTED values are not updated in the DMDALocalToLocal
> routines. Should I instead use DM_BOUNDARY_NONE and make the domain larger
> by 2 elements if I need those values to be shared between processes?
> Is that the best approach?
>
I think we need to clarify what PETSc means by these things.
We divide the original grid into non-overlapping pieces. The vertices a
process owns are its "local" vertices, and these are the ones ordered by
the global numbering. In order to calculate the value at a local vertex, we
need information from neighboring vertices, some of which lie outside the
portion we own. We call these "ghost" vertices. Thus

  Global Numbering --> local vertices
  Local Numbering  --> local + ghost vertices
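On a 2D DMDA you can see the two numberings directly in the two corner
queries. A small C sketch, assuming `da` is your DMDA:

    PetscInt xs, ys, xm, ym;     /* start and size of the owned patch    */
    PetscInt gxs, gys, gxm, gym; /* start and size of owned plus ghosts  */

    /* local vertices: the patch this process owns (global numbering) */
    DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);
    /* local + ghost vertices: the patch covered by the local vector  */
    DMDAGetGhostCorners(da, &gxs, &gys, NULL, &gxm, &gym, NULL);

Loops that assemble the global system run over the first patch; stencil
loops that read neighbor values index into the second.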
At a boundary, we have to decide how our stencil behaves. If there is just
nothing there, we have type NONE, and some stencil indices for a boundary
vertex are invalid. If instead the stencil wraps around the domain,
indexing into vertices possibly owned by another process, that is type
PERIODIC. If the stencil reflects back onto this process, it is type
MIRROR. If the stencil indexes into extra storage space in which the user
can put boundary values, it is type GHOSTED.
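For example, creating a DMDA that is GHOSTED in x and NONE in y looks
something like this (C sketch with made-up grid sizes):

    #include <petscdmda.h>

    DM da;
    /* GHOSTED in x: one extra layer of user-managed slots per stencil
       width on each side; NONE in y: the stencil is cut off at the edge */
    DMDACreate2d(PETSC_COMM_WORLD,
                 DM_BOUNDARY_GHOSTED, DM_BOUNDARY_NONE, DMDA_STENCIL_STAR,
                 64, 64,                     /* global grid size (example)   */
                 PETSC_DECIDE, PETSC_DECIDE, /* let PETSc split the domain   */
                 1, 1,                       /* dof per vertex, stencil width */
                 NULL, NULL, &da);
    DMSetFromOptions(da);
    DMSetUp(da);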
Now consider GlobalToLocal, where we copy unknowns from global storage to
local storage. This is just 1-to-1 if we have NONE, since no extra unknowns
were added. For PERIODIC or MIRROR, the extra local unknowns are mapped to
other global unknowns, and thus we have a copy, possibly with
communication. For GHOSTED, the extra local unknowns are just storage and
do not correspond to other global unknowns. Thus there is no copy.
We can think of LocalToLocal as just LocalToGlobal followed by
GlobalToLocal, but done in one step. Since the GHOSTED slots are not
touched by either step, nothing happens to them. I am not sure what you are
trying to achieve with the boundary condition.
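If what you want is fixed boundary values in those slots, you write them
into the local vector yourself after the scatter. A sketch in C, assuming
the DMDA above (M = 64, GHOSTED in x), a global vector `g`, and
hypothetical boundary values `bcLeft`/`bcRight`:

    Vec           loc;
    PetscScalar **a;
    PetscScalar   bcLeft = 1.0, bcRight = 0.0; /* hypothetical BC values */
    PetscInt      gxs, gys, gxm, gym, j;

    DMGetLocalVector(da, &loc);
    /* fills ghost points owned by other processes, not the GHOSTED slots */
    DMGlobalToLocalBegin(da, g, INSERT_VALUES, loc);
    DMGlobalToLocalEnd(da, g, INSERT_VALUES, loc);

    /* the GHOSTED slots sit at global indices -1 and M (here M = 64) */
    DMDAVecGetArray(da, loc, &a);
    DMDAGetGhostCorners(da, &gxs, &gys, NULL, &gxm, &gym, NULL);
    for (j = gys; j < gys + gym; ++j) {
      if (gxs < 0)        a[j][-1] = bcLeft;  /* rank touches the x = 0 side */
      if (gxs + gxm > 64) a[j][64] = bcRight; /* rank touches the x = M side */
    }
    DMDAVecRestoreArray(da, loc, &a);
    /* ... use loc in your stencil code ... */
    DMRestoreLocalVector(da, &loc);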
Thanks,
Matt
> Thanks,
>
> Lucas
>
> ------------------------------
> *From:* Matthew Knepley <knepley at gmail.com>
> *Sent:* Friday, May 22, 2020 8:03 PM
> *To:* Lucas Banting <bantingl at myumanitoba.ca>
> *Cc:* PETSc <petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] Question about DMLocalToLocal for
> DM_BOUNDARY_GHOSTED conditions
>
> On Fri, May 22, 2020 at 4:34 PM Lucas Banting <bantingl at myumanitoba.ca>
> wrote:
>
> Hello,
>
> I am converting a serial code to parallel in fortran with petsc. I am
> using the DMDA to manage communication of the information that used to be
> in old two-dimensional fortran arrays.
>
> I noticed that when using DMLocalToLocalBegin/End, not all the ghost
> values in the array in the DM_BOUNDARY_GHOSTED area are updated. Is this
> expected behaviour?
>
>
> I believe so. GHOSTED is user managed space. We do not touch it.
>
>
> I read through this thread:
> https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2016-May/029252.html
> and saw someone had a similar question, but the answer was not clear to me.
>
> If this is expected behaviour, how should I instead update these values in
> my arrays? I was using DM_BOUNDARY_GHOSTED as I needed the extra ghost
> cells for some subroutines, but I do not need them in my matrix from
> DMCreateMatrix.
>
>
> You fill them in the local vector.
>
> Thanks,
>
> Matt
>
>
> I am using PETSc 3.12.4 and Open MPI 3.1.4.
>
> Thanks,
>
> Lucas Banting
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/