[petsc-users] Speedup studies using DMPlex

Justin Chang jychang48 at gmail.com
Thu Dec 11 04:45:35 CST 2014


I am manually creating a structured tetrahedral mesh within my code and
using the DMPlexCreateFromDAG function to make a DMPlex out of it. If I go
with your suggestion, do I simply call DMRefine(...) after the mesh is
distributed? I ask because I notice that regular refinement is present in
the PETSc 3.5.2 version of SNES ex12.c but not in the PETSc development
version (which I am using).
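
To make sure I have the ordering right, here is a rough sketch of what I think
you are suggesting: distribute the serial DMPlex first, then regularly refine
the distributed mesh. The single tetrahedron built with DMPlexCreateFromCellList
is just a stand-in for my DMPlexCreateFromDAG mesh, and I am assuming the
development-version DMPlexDistribute signature that takes the partitioner from
the command line, so please correct me if any of this is off:

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist, dmRefined;
  /* Stand-in mesh on rank 0: one tetrahedron (my real mesh comes from DMPlexCreateFromDAG) */
  const int      cells[]  = {0, 1, 2, 3};
  const double   coords[] = {0.,0.,0., 1.,0.,0., 0.,1.,0., 0.,0.,1.};
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  /* Serial mesh lives on rank 0 only; the other ranks pass empty cell/vertex counts */
  ierr = DMPlexCreateFromCellList(PETSC_COMM_WORLD, 3, rank ? 0 : 1, rank ? 0 : 4, 4,
                                  PETSC_TRUE, cells, 3, coords, &dm);CHKERRQ(ierr);

  /* Distribute first; the partitioner is picked with -petscpartitioner_type parmetis */
  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}

  /* Then regularly (uniformly) refine the already-distributed mesh */
  ierr = DMPlexSetRefinementUniform(dm, PETSC_TRUE);CHKERRQ(ierr);
  ierr = DMRefine(dm, PetscObjectComm((PetscObject) dm), &dmRefined);CHKERRQ(ierr);
  if (dmRefined) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmRefined;}

  /* ... set up the problem and solve, as in SNES ex12.c ... */
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

For the strong-scaling runs I would then just call DMRefine repeatedly on the
distributed mesh until I reach the resolution I want. Is that the intended usage?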

Thanks,
Justin

On Thu, Dec 11, 2014 at 4:07 AM, Matthew Knepley <knepley at gmail.com> wrote:

> On Wed, Dec 10, 2014 at 7:34 PM, Justin Chang <jychang48 at gmail.com> wrote:
>
>> Hi all,
>>
>> So I am trying to run a speedup (i.e., strong scaling) study by solving
>> a diffusion problem much like the one in SNES ex12.c, and I plan on using
>> up to 1k cores on LANL's Mustang HPC system. However, it seems that
>> DMPlexDistribute is taking an extremely long time. I am using
>> -petscpartitioner_type parmetis on the command line, but distribution still
>> accounts for over 50% of the code's execution time. Is this normal, or is
>> there a "better" way to conduct such a study?
>>
>
> 0) What mesh are you using? The most scalable way of running now is to
> read and distribute a coarse mesh and use regular refinement in parallel.
>
> 1) This is pure overhead in the sense that it is one-to-many communication,
> and it is done once, so most people do not report the time.
>
> 2) I agree it is too slow. There is a branch in next that completely reworks
> distribution. We have run it up to 8K cores on Hector and it is faster.
>
> 3) Early next year we plan to have parallel startup working, where each
> process reads a chunk of the mesh, and it is then redistributed for load
> balance.
>
>   Thanks,
>
>      Matt
>
>
>> Thanks,
>> Justin
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>