[petsc-users] Parallel IO of DMPlex

Matthew Knepley knepley at gmail.com
Sat Jun 27 06:01:03 CDT 2015


On Sat, Jun 27, 2015 at 2:05 AM, Justin Chang <jychang48 at gmail.com> wrote:

> Hi everyone,
>
> I see that parallel IO of various input mesh formats for DMPlex is not
> currently supported. However, is there a way to read a custom mesh file in
> parallel?
>
> For instance, I have a binary data file formatted like this:
>
> <No. of elements>
> <No. of vertices>
> <No. of nodes per element>
> <Spatial dimension>
> <Connectivity array ...
> ....>
> <Coordinates array ...
> ....>
> <No. of exterior nodes>
> <List of exterior nodes ...
> ....>
>
> Reading this will allow me to create a DMPlex with DMPlexCreateFromDAG().
> Currently, only the root process reads this binary mesh file and creates
> the DMPlex whereas the other processes create an empty DMPlex. Then all
> processes invoke DMPlexDistribute(). This one-to-all distribution seems to
> be a bottleneck, and I want to know if it's possible to have each process
> read a local portion of the connectivity and coordinates arrays and let
> DMPlex/METIS/ParMETIS handle the load balancing and redistribution.
> Intuitively this would be easy to write, but again I want to know how to do
> this by leveraging the existing functions and routines within DMPlex.
>
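
For reference, the serial pattern described above looks roughly like this (a
minimal sketch only: ReadMyMesh() is a placeholder for a reader of your binary
format, dim must end up known on every rank, and depending on your PETSc
version DMPlexDistribute() may also take a partitioner name argument):

  MPI_Comm       comm = PETSC_COMM_WORLD;
  DM             dm, dmDist;
  PetscInt       dim, depth, *numPoints, *coneSizes, *cones, *coneOrientations;
  PetscScalar   *coords;
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr);
  if (!rank) {
    /* ReadMyMesh() is a stand-in for a reader of the binary format above */
    ierr = ReadMyMesh(&dim, &depth, &numPoints, &coneSizes, &cones, &coneOrientations, &coords);CHKERRQ(ierr);
  }
  ierr = MPI_Bcast(&dim, 1, MPIU_INT, 0, comm);CHKERRQ(ierr);
  ierr = DMPlexCreate(comm, &dm);CHKERRQ(ierr);
  ierr = DMSetDimension(dm, dim);CHKERRQ(ierr);
  if (!rank) {
    ierr = DMPlexCreateFromDAG(dm, depth, numPoints, coneSizes, cones, coneOrientations, coords);CHKERRQ(ierr);
  } else {
    PetscInt emptyPoints[2] = {0, 0};  /* no cells, no vertices on this rank */
    ierr = DMPlexCreateFromDAG(dm, 1, emptyPoints, NULL, NULL, NULL, NULL);CHKERRQ(ierr);
  }
  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);  /* the one-to-all step */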

This is on our agenda for the fall, but I can describe the process (rough
sketches for some of the steps follow the list):

  a) Do a naive partition of the cells (easy)

  b) Read the connectivity for "your" cells (easy)

  c) Read "your" coordinates (tricky)

  d) Read "your" exterior (tricky)

  e) Create local DAGs (easy)

  f) Create SF for shared boundary (hard)

  g) Repartition and redistribute (easy)
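
For (a) and (b), a rough sketch using MPI-IO against the format above. This is
not an existing Plex routine; it assumes the header and connectivity are stored
as PetscInt-sized integers, that the read counts fit in a PetscMPIInt, and that
comm and filename are in scope:

  MPI_File       fh;
  PetscInt       header[4];  /* numCells, numVertices, nodesPerElem, dim */
  PetscInt       numCells, npe, cStart, cEnd, *cells;
  MPI_Offset     off;
  PetscMPIInt    rank, size;
  PetscErrorCode ierr;

  ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr);
  ierr = MPI_Comm_size(comm, &size);CHKERRQ(ierr);
  ierr = MPI_File_open(comm, filename, MPI_MODE_RDONLY, MPI_INFO_NULL, &fh);CHKERRQ(ierr);
  ierr = MPI_File_read_all(fh, header, 4, MPIU_INT, MPI_STATUS_IGNORE);CHKERRQ(ierr);
  numCells = header[0]; npe = header[2];
  /* (a) naive partition: a contiguous block of cells per rank */
  cStart = (rank*numCells)/size;
  cEnd   = ((rank+1)*numCells)/size;
  ierr = PetscMalloc1((cEnd-cStart)*npe, &cells);CHKERRQ(ierr);
  /* (b) collective read of just this rank's slice of the connectivity */
  off  = 4*sizeof(PetscInt) + (MPI_Offset)cStart*npe*sizeof(PetscInt);
  ierr = MPI_File_read_at_all(fh, off, cells, (PetscMPIInt)((cEnd-cStart)*npe), MPIU_INT, MPI_STATUS_IGNORE);CHKERRQ(ierr);
  /* (c) is the tricky part: the vertices you need are whatever shows up in
     'cells', so you collect the unique vertex numbers and read only those
     coordinates, e.g. with an indexed MPI datatype or chunked MPI_File_read_at() */
  ierr = MPI_File_close(&fh);CHKERRQ(ierr);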

You could start writing this for your format and we could help. I probably
will not get to the generic one until late in the year.
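
Once (e) is done, (f) amounts to filling a PetscSF whose leaves are your local
shared-boundary points and whose roots are the owning (rank, point) pairs, and
(g) is then just another DMPlexDistribute(). A very rough sketch, assuming dm
and comm from the earlier fragments; computing nroots, nleaves, locals, and
remotes from your format is exactly the hard matching step:

  PetscSF        sf;
  PetscSFNode   *remotes;  /* remotes[i].rank / remotes[i].index: owner of leaf i */
  PetscInt      *locals;   /* local point number of each shared (leaf) point */
  PetscInt       nroots, nleaves;  /* nroots = number of local DAG points */
  DM             dmBalanced;
  PetscErrorCode ierr;

  /* (f) the point SF describing the shared boundary */
  ierr = PetscSFCreate(comm, &sf);CHKERRQ(ierr);
  ierr = PetscSFSetGraph(sf, nroots, nleaves, locals, PETSC_COPY_VALUES, remotes, PETSC_COPY_VALUES);CHKERRQ(ierr);
  ierr = DMSetPointSF(dm, sf);CHKERRQ(ierr);
  ierr = PetscSFDestroy(&sf);CHKERRQ(ierr);
  /* (g) let the partitioner (METIS/ParMETIS) rebalance the naive partition */
  ierr = DMPlexDistribute(dm, 0, NULL, &dmBalanced);CHKERRQ(ierr);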

  Thanks,

    Matt

> Thanks,
> Justin



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener