[petsc-users] DMPlex distribution with FVM adjacency
Matthew Knepley
knepley at gmail.com
Thu May 25 13:46:01 CDT 2017
On Thu, May 25, 2017 at 1:38 PM, Lawrence Mitchell <lawrence.mitchell at imperial.ac.uk> wrote:
>
> > On 25 May 2017, at 19:23, Matthew Knepley <knepley at gmail.com> wrote:
> >
> > Ok, let me clarify.
> >
> > Given shared facets, I'd like closure(support(facet)); this is a subset
> > of the FEM adjacency: "add in the cell and its closure from the remote
> > rank". This doesn't include remote cells I can only see through vertices.
> > Without sending data evaluated at facet quad points, I think this is the
> > adjacency I need to compute facet integrals: all the dofs in
> > closure(support(facet)).
> >
> > This seems incoherent to me. For FV, dofs reside in the cells, so you
> > should only need the cell for adjacency. If you need dofs defined at
> > vertices, then you should also need cells which are only attached by
> > vertices. How could this scheme be consistent without this?
>
> OK, so what I think is this:
>
> I need to compute integrals over cells and facets.
>
Sounds like DG. I will get out my dead chicken for the incantation.
> So I do:
>
> GlobalToLocal(INSERT_VALUES)
> ComputeIntegralsOnOwnedEntities
> LocalToGlobal(ADD_VALUES)
>
> That way, an integration is performed on every entity exactly once, and
> LocalToGlobal ensures that I get a consistent assembled Vec.
>
> OK, so if I only compute cell integrals, then the zero overlap
> distribution with all the points in the closure of the cell (including some
> remote points) is sufficient.
>
Yep.
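For concreteness, that pattern looks roughly like the sketch below. This is
a minimal sketch, not code from this thread: AssembleResidual and the
commented-out assembly loop are hypothetical placeholders, while the DM and
Vec calls are the standard PETSc API.

#include <petscdmplex.h>

/* Sketch of the GlobalToLocal / compute / LocalToGlobal pattern above.
 * AssembleResidual and ComputeIntegralsOnOwnedEntities are placeholders. */
PetscErrorCode AssembleResidual(DM dm, Vec u, Vec f)
{
  Vec            locU, locF;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = DMGetLocalVector(dm, &locU);CHKERRQ(ierr);
  ierr = DMGetLocalVector(dm, &locF);CHKERRQ(ierr);
  /* Pull ghost values so integrals on owned entities see current coefficients */
  ierr = DMGlobalToLocalBegin(dm, u, INSERT_VALUES, locU);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm, u, INSERT_VALUES, locU);CHKERRQ(ierr);
  ierr = VecZeroEntries(locF);CHKERRQ(ierr);

  /* ComputeIntegralsOnOwnedEntities(dm, locU, locF):
     each cell and facet integral is computed exactly once, on its owner */

  /* Add contributions written to ghosted dofs back onto their owners */
  ierr = VecZeroEntries(f);CHKERRQ(ierr);
  ierr = DMLocalToGlobalBegin(dm, locF, ADD_VALUES, f);CHKERRQ(ierr);
  ierr = DMLocalToGlobalEnd(dm, locF, ADD_VALUES, f);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(dm, &locU);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(dm, &locF);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The zero-overlap distribution that makes the cell-integral case work is just
DMPlexDistribute(dm, 0, NULL, &dmDist).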
> If I compute facet integrals, I need both cells (and their closure) in the
> support of the facet. Again, each facet is only integrated by one process,
> and the LocalToGlobal adds in contributions to remote dofs. This is the
> same as cell integrals; I just need a bit more data, no?
>
> The other option is to notice that what I actually need when I compute a
> facet integral is the test function and/or any coefficients evaluated at
> quadrature points on the facet. So if I don't want the extra overlapped
> halo, then what I need to do is for the remote process to evaluate any
> coefficients at the quad points, then send the evaluated data to the facet
> owner. Now the facet owner can compute the integral, and again
> LocalToGlobal adds in contributions to remote dofs.
That seems baroque. So this is just another adjacency pattern. You should
be able to define it easily, or, if you are a patient person, wait for me
to do it. It's here:
https://bitbucket.org/petsc/petsc/src/01c3230e040078628f5e559992965c1c4b6f473d/src/dm/impls/plex/plexdistribute.c?at=master&fileviewer=file-view-default#plexdistribute.c-239
I am more than willing to make this overridable by the user through
function composition or another mechanism.
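As a sketch of what such a user-defined adjacency could look like in Plex
terms, computing closure(support(p)) for a facet point p: the callback name
FacetAdjacency and its registration hook are assumptions (the overridable
mechanism is exactly what is not settled yet), while the support and
transitive-closure calls are existing DMPlex API. Error handling is trimmed.

#include <petscdmplex.h>

/* Hypothetical adjacency callback: adj(p) = closure(support(p)).
 * On input *adjSize is the capacity of adj[]; on output, the fill. */
static PetscErrorCode FacetAdjacency(DM dm, PetscInt p, PetscInt *adjSize, PetscInt adj[], void *ctx)
{
  const PetscInt *support;
  PetscInt        supportSize, s, c, i, n = 0;
  PetscErrorCode  ierr;

  PetscFunctionBegin;
  ierr = DMPlexGetSupportSize(dm, p, &supportSize);CHKERRQ(ierr);
  ierr = DMPlexGetSupport(dm, p, &support);CHKERRQ(ierr);
  for (s = 0; s < supportSize; ++s) { /* the (at most two) cells on the facet */
    PetscInt *closure = NULL, closureSize;

    ierr = DMPlexGetTransitiveClosure(dm, support[s], PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);
    for (c = 0; c < closureSize; ++c) {
      const PetscInt q = closure[2*c]; /* points interleave with orientations */
      PetscBool      seen = PETSC_FALSE;

      for (i = 0; i < n; ++i) if (adj[i] == q) {seen = PETSC_TRUE; break;}
      if (!seen && n < *adjSize) adj[n++] = q;
    }
    ierr = DMPlexRestoreTransitiveClosure(dm, support[s], PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);
  }
  *adjSize = n;
  PetscFunctionReturn(0);
}

Distributing with this adjacency would pull in exactly the remote cell and
its closure across each shared facet, and nothing reachable only through
vertices.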
Thanks,
Matt
>
> Lawrence
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
http://www.caam.rice.edu/~mk51/