[petsc-users] DMPlex for cell centred finite volume
Edoardo alinovi
edoardo.alinovi at gmail.com
Tue Aug 27 03:54:24 CDT 2019
Hello Matt,
Thanks for your kind reply. Using DMPlex and changing the data structure
is not a straightforward task, as you can imagine. Up to now I am in the
preliminaries, trying to understand how to reuse as many of my
subroutines as possible (I am using Fortran), such as flux and
coefficient assembly and the numerical schemes, so that I do not have to
rewrite the whole code :p I will start with your first two suggestions
and see if it is an affordable task :)
For the face values, I am not interested in Riemann solvers. Being
incompressible, I use central difference, QUICK, SOU, upwind, or
TVD-family schemes, but maybe I can reuse my own routines for that.
I think I can be happy enough having the ghost cell centers available
and some way to deal with the BCs (I guess this will be the trickiest
part). Basically, I would like to use DMPlex at a low level, just as
support for parallel mesh management.
Have you got an example with P0 finite elements and overlap=1?
Thank you! :)
On Tue, Aug 27, 2019 at 09:33 Matthew Knepley <knepley at gmail.com>
wrote:
> On Tue, Aug 27, 2019 at 4:07 AM Edoardo alinovi via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Hello PETSc users and developers,
>> I hope you are doing well! Today I have a general question about
>> DMPlex, to see whether it can be useful to me or not.
>>
>> I have my fancy finite volume solver (cell-centered, incompressible
>> NS), which uses PETSc as its linear solver engine. Thanks to your
>> suggestions so far, it is now running well :)
>>
>> I would like to take a further step and encapsulate the mesh
>> management part in PETSc as well. I think that DMPlex is the way to
>> go, since my code is fully unstructured.
>>
>> I have read some papers and lectures by Matt around the web, but I
>> still have not found an answer to this question: can DMPlex deal with
>> a cell-centered data arrangement and provide some support for basic
>> operations (e.g. interpolation between partitions, face value
>> calculation, etc.)?
>>
>
> 1) Cell-centered data
>
> Definitely yes. This is the same data layout as P0 finite elements. You
> just assign the PetscSection k dofs per cell.
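>
> A minimal sketch in C (error checking with ierr/CHKERRQ; the dof count
> k and the helper name SetupP0Section are just illustrative, and
> DMSetLocalSection is called DMSetSection in some older releases):
>
>   #include <petscdmplex.h>
>
>   /* Attach k cell-centered dofs to every cell of a Plex (P0 layout) */
>   static PetscErrorCode SetupP0Section(DM dm, PetscInt k)
>   {
>     PetscSection   s;
>     PetscInt       cStart, cEnd, c;
>     PetscErrorCode ierr;
>
>     PetscFunctionBeginUser;
>     /* Height 0 points of the Plex are the cells */
>     ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr);
>     ierr = PetscSectionCreate(PetscObjectComm((PetscObject) dm), &s);CHKERRQ(ierr);
>     ierr = PetscSectionSetChart(s, cStart, cEnd);CHKERRQ(ierr);
>     for (c = cStart; c < cEnd; ++c) {
>       ierr = PetscSectionSetDof(s, c, k);CHKERRQ(ierr);
>     }
>     ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
>     ierr = DMSetLocalSection(dm, s);CHKERRQ(ierr);
>     ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);
>     PetscFunctionReturn(0);
>   }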
>
> 2) Interpolation between partitions
>
> I assume you mean ghost values from other parallel partitions. You can
> do this by using overlap=1 in DMPlexDistribute().
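>
> A sketch of the call sequence (here dm is the serial Plex; the
> distributed DM comes back NULL on a single process):
>
>   DM dmDist = NULL;
>
>   /* overlap = 1 adds one layer of ghost cells from neighboring ranks */
>   ierr = DMPlexDistribute(dm, 1, NULL, &dmDist);CHKERRQ(ierr);
>   if (dmDist) {
>     ierr = DMDestroy(&dm);CHKERRQ(ierr);
>     dm   = dmDist;
>   }
>
> Ghost values are then refreshed by DMGlobalToLocalBegin/End() on a
> local vector.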
>
> 3) Face value calculation
>
> You can do Riemann solves by looping over faces, grabbing values from the
> neighboring cells, doing the calculation, and
> updating cell values. We carry out this kind of computation in TS ex11.
> That example attempts to do everything, so it is
> messy, but we can help you understand what it is doing in any part that is
> unclear.
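>
> The skeleton of such a face loop looks roughly like this, here with a
> plain central average standing in for a Riemann solve (it assumes the
> P0 section above and a global solution vector X; names are
> illustrative):
>
>   const PetscScalar *x;
>   Vec                locX;
>   PetscInt           fStart, fEnd, f;
>
>   ierr = DMGetLocalVector(dm, &locX);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm, X, INSERT_VALUES, locX);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm, X, INSERT_VALUES, locX);CHKERRQ(ierr);
>   ierr = VecGetArrayRead(locX, &x);CHKERRQ(ierr);
>   /* Height 1 points of the Plex are the faces */
>   ierr = DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd);CHKERRQ(ierr);
>   for (f = fStart; f < fEnd; ++f) {
>     const PetscInt *cells;
>     PetscInt        ncells;
>
>     /* The support of a face is the set of cells sharing it */
>     ierr = DMPlexGetSupportSize(dm, f, &ncells);CHKERRQ(ierr);
>     ierr = DMPlexGetSupport(dm, f, &cells);CHKERRQ(ierr);
>     if (ncells == 2) { /* interior face */
>       const PetscScalar *uL, *uR;
>
>       ierr = DMPlexPointLocalRead(dm, cells[0], x, &uL);CHKERRQ(ierr);
>       ierr = DMPlexPointLocalRead(dm, cells[1], x, &uR);CHKERRQ(ierr);
>       /* e.g. central-difference face value: 0.5*(uL[0] + uR[0]);
>          swap in QUICK/SOU/TVD here */
>     }
>   }
>   ierr = VecRestoreArrayRead(locX, &x);CHKERRQ(ierr);
>   ierr = DMRestoreLocalVector(dm, &locX);CHKERRQ(ierr);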
>
> 4) FV boundary conditions
>
> You can use DMPlexConstructGhostCells() to put artificial cells around the
> boundary that you use to prescribe fluxes.
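>
> For example (a sketch; "Face Sets" is the label that the Plex mesh
> readers commonly attach to boundary faces, and passing NULL selects it
> as well):
>
>   DM       dmGhost = NULL;
>   PetscInt numGhost;
>
>   /* Add one ghost cell behind every labeled boundary face */
>   ierr = DMPlexConstructGhostCells(dm, "Face Sets", &numGhost, &dmGhost);CHKERRQ(ierr);
>   ierr = DMDestroy(&dm);CHKERRQ(ierr);
>   dm = dmGhost;
>
> You then impose boundary conditions by writing the appropriate state
> into those ghost cells before each flux evaluation.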
>
> Thanks,
>
> Matt
>
>
>> Thank you very much!
>>
>> Edo
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>