Here is the time weak scaling from the same study. The 3.10.2 version
seems to be much more stable with regard to execution time, but it is
not necessarily faster for "large-scale" simulations (problem size =
1e8).

I didn't use -mat_freeintermediatedatastructures. I tested it this
morning and the solver diverges when this option is enabled
(KSPConvergedReason -3).
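
For reference, the failure can also be caught programmatically rather
than via -ksp_converged_reason; here is a minimal sketch, assuming a
KSP already configured as in ex42 (the helper name SolveAndCheck is
illustrative only). Reason -3 corresponds to KSP_DIVERGED_ITS, i.e.
the iteration limit was reached before convergence:

    #include <petscksp.h>

    /* Minimal sketch: solve, then report why the KSP stopped. ksp, b and
       x are assumed to be set up elsewhere (e.g. as in ex42). */
    PetscErrorCode SolveAndCheck(KSP ksp, Vec b, Vec x)
    {
      KSPConvergedReason reason;
      PetscErrorCode     ierr;

      PetscFunctionBeginUser;
      ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
      ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr);
      if (reason < 0) {
        /* -3 is KSP_DIVERGED_ITS: iteration limit reached before convergence */
        ierr = PetscPrintf(PetscObjectComm((PetscObject)ksp),
                           "Solve diverged: reason %d\n", (int)reason);CHKERRQ(ierr);
      }
      PetscFunctionReturn(0);
    }
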
Myriam

On 04/09/19 at 17:23, Zhang, Hong wrote:

Myriam,
Do you have an 'execution time scalability' plot? Did you use
'-mat_freeintermediatedatastructures' for PETSc 3.10.2?
We made several computing optimizations in MatPtAP(), which might trade
memory for speed. It would be helpful to see a complete comparison.
Hong
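
For context, MatPtAP() computes the Galerkin product C = P^T A P that
GAMG uses to build each coarse-grid operator. A minimal sketch of a
direct call, with A and P assumed assembled elsewhere and the helper
name purely illustrative:

    #include <petscmat.h>

    /* Minimal sketch of the triple product C = P^T * A * P. The fill
       argument only estimates nnz(C) for preallocation; 2.0 is a guess. */
    PetscErrorCode GalerkinCoarsen(Mat A, Mat P, Mat *C)
    {
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = MatPtAP(A, P, MAT_INITIAL_MATRIX, 2.0, C);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }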
<div class="gmail_quote">
<div dir="ltr" class="gmail_attr">On Tue, Apr 9, 2019 at 7:43
AM Myriam Peyrounette via petsc-users <<a
href="mailto:petsc-users@mcs.anl.gov" target="_blank"
moz-do-not-send="true">petsc-users@mcs.anl.gov</a>>
wrote:<br>
</div>
<blockquote class="gmail_quote" style="margin:0px 0px 0px
0.8ex;border-left:1px solid
rgb(204,204,204);padding-left:1ex">

Hi,

in my first mail, I provided a memory scaling for PETSc example #42.
You'll find attached the main files used (one for PETSc 3.6.4, one for
PETSc 3.10.2) and the corresponding memory scaling.

In the main files, I modified the solver/preconditioner so that it
corresponds to my problem. You'll find the modifications by searching
for the keyword "TopBridge". In particular, I use GAMG.
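
For readers without the attachments, the core of that modification
amounts to something like the sketch below (the helper name is
illustrative; A is assumed assembled elsewhere, and -pc_gamg_* options
can still override the setup at runtime):

    #include <petscksp.h>

    /* Minimal sketch of a GAMG-preconditioned solver setup. */
    PetscErrorCode SetupSolver(Mat A, KSP *ksp)
    {
      PC             pc;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = KSPCreate(PetscObjectComm((PetscObject)A), ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(*ksp, A, A);CHKERRQ(ierr);
      ierr = KSPGetPC(*ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetType(pc, PCGAMG);CHKERRQ(ierr);   /* algebraic multigrid */
      ierr = KSPSetFromOptions(*ksp);CHKERRQ(ierr); /* honor command-line options */
      PetscFunctionReturn(0);
    }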

Note that the example is about solving the Stokes equations, so GAMG
may not be well suited. However, the memory gap appears, and that is
the point, regardless of whether the results are correct.

Are these scripts useful for you? Let me know.

Thanks,

Myriam

On 04/04/19 at 00:09, Jed Brown wrote:
> Myriam Peyrounette via petsc-users <petsc-users@mcs.anl.gov> writes:
>
>> Hi all,
>>
>> for your information, you'll find attached the comparison of the weak
>> memory scalings when using:
>>
>> - PETSc 3.6.4 (reference)
>> - PETSc 3.10.4 without specific options
>> - PETSc 3.10.4 with the three scalability options you mentioned
>>
>> Using the scalability options does improve the memory scaling.
>> However, the 3.6 version still has a better one...
> Yes, this still looks significant. Is this an effect we can still
> reproduce with a PETSc example and/or using a memory profiler (such as
> massif or gperftools)? I think it's important for us to narrow down
> what causes this difference (it looks like almost 2x on your 1e8
> problem size) so we can fix it.
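
As a complement to massif or gperftools, PETSc's own counters can
bracket the gap. A minimal sketch, assuming
PetscMemorySetGetMaximumUsage() was called right after
PetscInitialize() so the high-water mark is tracked (the helper name is
illustrative):

    #include <petscsys.h>

    /* Minimal sketch of PETSc's built-in memory reporting. */
    PetscErrorCode ReportMemory(MPI_Comm comm)
    {
      PetscLogDouble current, maximum;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = PetscMemoryGetCurrentUsage(&current);CHKERRQ(ierr);
      ierr = PetscMemoryGetMaximumUsage(&maximum);CHKERRQ(ierr);
      ierr = PetscPrintf(comm, "Memory: current %g, max %g (bytes)\n",
                         current, maximum);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }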

--
Myriam Peyrounette
CNRS/IDRIS - HLST
--
<pre class="moz-signature" cols="72">--
Myriam Peyrounette
CNRS/IDRIS - HLST
--
</pre>