<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Sat, Jun 27, 2015 at 2:05 AM, Justin Chang <span dir="ltr"><<a href="mailto:jychang48@gmail.com" target="_blank">jychang48@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div><div><div><div><div><div><div><div><div><div>Hi everyone,<br><br></div>I see that parallel IO of various input mesh formats for DMPlex is not currently supported. However, is there a way to read a custom mesh file in parallel?<br><br></div>For instance, I have a binary data file formatted like this:<br><br></div><No. of elements><br></div><No. of vertices><br></div><No. of nodes per element><br></div><Spatial dimension><br></div><Connectivity array ...<br>....><br></div><Coordinates array ...<br>....><br></div><No. of exterior nodes><br></div><List of exterior nodes ...<br><div>....><br><br></div><div>Reading this will allow me to create a DMPlex with DMPlexCreateFromDAG(). Currently, only the root process reads this binary mesh file and creates the DMPlex whereas the other processes create an empty DMPlex. Then all processes invoke DMPlexDistribute(). This one-to-all distribution seems to be a bottleneck, and I want to know if it's possible to have each process read a local portion of the connectivity and coordinates arrays and let DMPlex/METIS/ParMETIS handle the load balancing and redistribution. Intuitively this would be easy to write, but again I want to know how to do this through leveraging the functions and routines within DMPlex.<br></div></div></blockquote><div><br></div><div>This is on our agenda for the fall, but I can describe the process:</div><div><br></div><div> a) Do a naive partition of the cells (easy)</div><div><br></div><div> b) Read the connectivity for "your" cells (easy)</div><div><br></div><div> c) Read "your" coordinates (tricky)</div><div><br></div><div> d) Read "your" exterior (tricky)</div><div><br></div><div> e) Create local DAGs (easy)</div><div><br></div><div> f) Create SF for shared boundary (hard)</div><div><br></div><div> g) Repartition and redistribute (easy)</div><div><br></div><div>You could start writing this for your format and we could help. I probably will not get to the generic one until late in the year.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div><br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div></div><div>Thanks,<br></div><div>Justin<br></div></div>
  Thanks,

     Matt

> Thanks,
> Justin

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener