<div dir="ltr"><div dir="ltr">On Thu, Oct 29, 2020 at 3:04 PM Su,D.S. Danyang <<a href="mailto:dsu@eoas.ubc.ca">dsu@eoas.ubc.ca</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<div lang="EN-CA" style="overflow-wrap: break-word;">
<div class="gmail-m_-6518904428935477041WordSection1">
<p class="MsoNormal"><span style="color:black">Dear PETSc users,<u></u><u></u></span></p>
<p class="MsoNormal"><span style="color:black"><u></u> <u></u></span></p>
<p class="MsoNormal"><span style="color:black">This is a question bother me for some time. I have the same code running on different clusters and both clusters have good speedup. However, I noticed some thing quite strange. On one cluster, the solver is quite
stable in computing time while on another cluster, the solver is unstable in computing time. As shown in the figure below, the local calculation almost has no communication and the computing time in this part is quite stable. However, PETSc solver on Cluster
B jumps quite a lot and the performance is not as good as Cluster A, even though the local calculation is a little better on Cluster B. There are some difference on hardware and PETSc configuration and optimization. Cluster A uses OpenMPI + GCC compiler and
Cluster B uses MPICH + GCC compiler. The number of processors used is 128 on Cluster A and 120 on Cluster B. I also tested different number of processors but the problem is the same. Does anyone have any idea which part might cause this problem?</span></p></div></div></blockquote><div><br></div><div>First question: Does the solver take more iterates when the time bumps up?</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div lang="EN-CA" style="overflow-wrap: break-word;"><div class="gmail-m_-6518904428935477041WordSection1">
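You can check this without code changes by running with -ksp_converged_reason or -ksp_monitor and comparing time steps. Programmatically, here is a minimal sketch in C (assuming a time-stepping application that calls KSPSolve() once per step; the wrapper function and the "step" argument are illustrative, not from Danyang's code):

    #include <petscksp.h>

    /* Log the KSP iteration count for each time step's linear solve.
     * Only the PETSc calls are real API; the wrapper is hypothetical. */
    PetscErrorCode SolveAndReport(KSP ksp, Vec b, Vec x, PetscInt step)
    {
      PetscErrorCode     ierr;
      PetscInt           its;
      KSPConvergedReason reason;

      PetscFunctionBeginUser;
      ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
      ierr = KSPGetIterationNumber(ksp, &its);CHKERRQ(ierr);    /* iterates used by this solve */
      ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr); /* why the solve stopped */
      ierr = PetscPrintf(PETSC_COMM_WORLD, "step %D: %D iterations, reason %d\n",
                         step, its, (int)reason);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

If the iteration counts stay flat while the wall time jumps, the variability is more likely machine-side (network contention, process placement) than algorithmic, and the per-event timings from -log_view would help localize it.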
<p class="MsoNormal"><img width="512" height="455" style="width: 5.3333in; height: 4.7395in;" id="gmail-m_-6518904428935477041Picture_x0020_1" src="cid:1757709ac374cff311"><u></u><u></u></p>
<p class="MsoNormal"><u></u> <u></u></p>
<p class="MsoNormal"><img width="512" height="455" style="width: 5.3333in; height: 4.7395in;" id="gmail-m_-6518904428935477041Picture_x0020_2" src="cid:1757709ac385b16b22"><u></u><u></u></p>
<p class="MsoNormal"><u></u> <u></u></p>
<p class="MsoNormal">Thanks and regards,<u></u><u></u></p>
<p class="MsoNormal"><u></u> <u></u></p>
<p class="MsoNormal">Danyang<u></u><u></u></p>
<p class="MsoNormal"><span style="font-size:11pt"><u></u> <u></u></span></p>
</div>
</div>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>