[petsc-users] Bad memory scaling with PETSc 3.10

Myriam Peyrounette myriam.peyrounette at idris.fr
Mon Apr 15 07:45:15 CDT 2019


Hi,

You'll find the new scaling attached (green line). I used version
3.11 and the four scalability options:
-matptap_via scalable
-inner_diag_matmatmult_via scalable
-inner_offdiag_matmatmult_via scalable
-mat_freeintermediatedatastructures
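
For reference, here is a sketch of the full run line with these options
appended (the launcher, process count, executable name and problem
arguments below are placeholders, not the actual job script):

    mpiexec -n 1024 ./ex42 <problem options> \
        -matptap_via scalable \
        -inner_diag_matmatmult_via scalable \
        -inner_offdiag_matmatmult_via scalable \
        -mat_freeintermediatedatastructures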

The scaling is much better! The code even uses less memory for the
smallest cases. There is still a memory increase for the largest case.

With regard to the time scaling, I used KSPView and LogView for the two
previous runs (blue and yellow lines) but not for the last one (green
line), so we can't really compare them, am I right? However, the new
time scaling looks quite good: the run time increases from ~8 s to ~27 s.

Unfortunately, the computations are expensive, so I would like to avoid
re-running them if possible. How important would a proper time scaling
be for you?
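
(For completeness: as far as I understand, the KSPView and LogView
output can also be requested purely at run time with the -ksp_view and
-log_view options, so collecting the same diagnostics on a future run
would not require any code change, e.g.

    mpiexec -n 1024 ./ex42 <problem options> \
        <scalability options as above> \
        -ksp_view -log_view

with the same placeholder run line as above.)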

Myriam


On 04/12/19 at 18:18, Zhang, Hong wrote:
> Myriam :
> Thanks for your effort. It will help us improve PETSc.
> Hong
>
>     Hi all,
>
>     I used the wrong script; that's why it diverged. Sorry about that.
>     I tried again with the right script on a tiny problem (~200
>     elements). I can see a small difference in memory usage (a gain of
>     ~1 MB) when adding the -mat_freeintermediatedatastructures option.
>     I still have to run larger cases to plot the scaling. The
>     supercomputer I usually run my jobs on is really busy at the
>     moment, so it takes a while. I hope I'll send you the results on
>     Monday.
>
>     Thanks everyone,
>
>     Myriam
>
>
>     On 04/11/19 at 06:01, Jed Brown wrote:
>     > "Zhang, Hong" <hzhang at mcs.anl.gov <mailto:hzhang at mcs.anl.gov>>
>     writes:
>     >
>     >> Jed:
>     >>>> Myriam,
>     >>>> Thanks for the plot. '-mat_freeintermediatedatastructures'
>     >>>> should not affect the solution. It releases almost half of the
>     >>>> memory in C=PtAP if C is not reused.
>     >>> And yet if turning it on causes divergence, that would imply a
>     >>> bug.
>     >>> Hong, are you able to reproduce the experiment to see the memory
>     >>> scaling?
>     >> I'd like to test her code using an ALCF machine, but my hands
>     >> are full right now. I'll try it as soon as I find time,
>     >> hopefully next week.
>     > I have now compiled and run her code locally.
>     >
>     > Myriam, thanks for your last mail adding the configuration and
>     > removing the MemManager.h dependency.  I ran with and without
>     > -mat_freeintermediatedatastructures and don't see a difference in
>     > convergence.  What commands did you run to observe that difference?
>
>     -- 
>     Myriam Peyrounette
>     CNRS/IDRIS - HLST
>     --
>
>

-- 
Myriam Peyrounette
CNRS/IDRIS - HLST
--

[Attachments: mem_scaling_big_cases_ex42_irene_SCALABLE.png (memory scaling plot), time_scaling_big_cases_ex42_irene_SCALABLE.png (time scaling plot), S/MIME cryptographic signature]

