[petsc-users] MWE for DMPlexCreateCGNS

Matthew Knepley knepley at gmail.com
Tue Feb 5 09:27:21 CST 2019


On Tue, Feb 5, 2019 at 9:47 AM Andrew Parker via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Does anyone have a MWE for DMPlexCreateCGNS to use in parallel? Ideally,
> read in parallel, distribute in parallel, and construct ghost cells (for
> parallel comms and halos for physical boundaries)? It's for a
> cell-centered solver working with CGNS meshes. Is there any limitation on
> cell types?
>

1) CGNS is a terrible format, unfortunately. Are you sure about committing
to it? Always best to question design decisions before lots of code has
been written.

2) DMPlexCreateCGNS() probably works, but it has not been tested
exhaustively.
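
    For what it's worth, a minimal serial read might look like this (a
sketch, not tested here; it assumes PETSc was configured with
--download-cgns, "mesh.cgns" is a placeholder filename, and you should
double-check the exact creation call against your PETSc version):

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM             dm;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    /* Read the CGNS file and interpolate, i.e. also create faces/edges */
    ierr = DMPlexCreateCGNSFromFile(PETSC_COMM_WORLD, "mesh.cgns", PETSC_TRUE, &dm);CHKERRQ(ierr);
    /* Inspect what came in using -dm_view */
    ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }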

3) It does not work in parallel, and is unlikely to in the near future. To
read in parallel, the format should let you cleanly select blocks from disk
and should use local BC specification (rather than global numbers). MED is
a nice format like this: it is the CASCADE format, and Gmsh uses it
internally. We can now read MED in parallel. Without too much work, I think
we could also read ExodusII in parallel.
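
    If you went the MED route, only the read call would change relative to
the sketch above (again a sketch, assuming PETSc was configured with
--download-med; the filename is a placeholder):

  /* DMPlexCreateFromFile() dispatches on the file extension, so a .med
     file goes through the MED reader */
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.med", PETSC_TRUE, &dm);CHKERRQ(ierr);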

4) Once read, you get a regular Plex, so you can redistribute, make ghost
cells, etc., just as for any Plex mesh.
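
    Concretely, continuing the sketch above (hedged: the overlap value and
the reliance on the default "Face Sets" label are assumptions to adapt to
your solver):

  DM       dmDist = NULL, dmGhost = NULL;
  PetscInt numGhost;

  /* Redistribute with an overlap of 1 cell, which gives the halo cells a
     cell-centered scheme needs for parallel comms; dmDist comes back NULL
     if the mesh was not actually moved */
  ierr = DMPlexDistribute(dm, 1, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
  /* Add ghost cells outside the physical boundary faces; passing NULL for
     the label name uses the default "Face Sets" from the mesh file */
  ierr = DMPlexConstructGhostCells(dm, NULL, &numGhost, &dmGhost);CHKERRQ(ierr);
  if (dmGhost) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmGhost;}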

5) As long as all you want Plex to do is manage topology and field data,
you can have whatever mix of cell types you want.

6) Limitations on cell types come from routines that calculate geometric
quantities, integrate, etc.
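
    For example, the FVM geometry helper is one of those routines (a sketch
of a loop over cells; unsupported cell shapes would error out here):

  PetscInt  c, cStart, cEnd;
  PetscReal vol, centroid[3], normal[3];

  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells */
  for (c = cStart; c < cEnd; ++c) {
    ierr = DMPlexComputeCellGeometryFVM(dm, c, &vol, centroid, normal);CHKERRQ(ierr);
  }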

  Thanks,

     Matt


> Thanks,
> Andy
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/