[petsc-users] Poor weak scaling when solving successive linear systems

Smith, Barry F. bsmith at mcs.anl.gov
Tue May 29 13:03:55 CDT 2018


Here is the bar chart I mentioned you should generate. As you can see, the larger problem spends 1.5096 extra seconds in VecTDot and 3.9548 extra seconds in VecNorm, is 1.266 seconds faster in MatMult, and is 9.697 seconds slower in MatMultAdd, 1.047 seconds slower in MatMultTranspose, and 5.006 seconds slower in MatSOR. Together these account for the total extra time of 19.64 seconds quite well (19.94 compared to 19.64).
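(For the record, summing those signed per-event differences gives the 19.94 quoted above:

1.5096 + 3.9548 - 1.266 + 9.697 + 1.047 + 5.006 = 19.9484 s, compared with 19.64 s of total extra time.)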

   The extra time in the dot products and norms could be explained by the reductions becoming somewhat slower with 8 times as many processes. I cannot explain the slowdown in MatSOR at all, since that is embarrassingly parallel and should scale perfectly, and I also cannot explain the huge jump in MatMultAdd()!

   This is why I am asking you to run again, to see whether the numbers are consistent.

   Barry



[Embedded image: bar chart of per-event timings (Untitled.png, see attachment link below)]

On May 29, 2018, at 6:18 AM, Michael Becker <Michael.Becker at physik.uni-giessen.de> wrote:

Hello again,

here are the updated log_view files for 125 and 1000 processors. I ran both problems twice: the first time with all processes per node allocated ("-1.txt"), the second time with only half as many processes per node on twice the number of nodes ("-2.txt").
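For reference, the stage-separated timings in those files are produced along these lines; this is only a minimal sketch with the stage names taken from the log excerpt further down, not the actual application code:

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscLogStage  stage_first, stage_remaining;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* Stage names match the -log_view excerpt below. */
  ierr = PetscLogStageRegister("First Solve", &stage_first);CHKERRQ(ierr);
  ierr = PetscLogStageRegister("Remaining Solves", &stage_remaining);CHKERRQ(ierr);

  ierr = PetscLogStagePush(stage_first);CHKERRQ(ierr);
  /* ... set up and run the first KSPSolve() here ... */
  ierr = PetscLogStagePop();CHKERRQ(ierr);

  ierr = PetscLogStagePush(stage_remaining);CHKERRQ(ierr);
  /* ... run the remaining 999 time steps, each with one KSPSolve() ... */
  ierr = PetscLogStagePop();CHKERRQ(ierr);

  ierr = PetscFinalize();
  return ierr;
}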

On May 24, 2018, at 12:24 AM, Michael Becker <Michael.Becker at physik.uni-giessen.de> wrote:

I noticed that for every individual KSP iteration, six vector objects are created and destroyed (with CG, more with e.g. GMRES).

   Hmm, it is certainly not intended that vectors be created and destroyed within each KSPSolve(); could you please point us to the code that makes you think they are being created and destroyed? We create all the work vectors in KSPSetUp() and destroy them in KSPReset(), not during the solve. Not that this would be a measurable difference.
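A minimal sketch of the intended usage pattern (illustrative only; the function name and arguments here are placeholders, not your application's code): the solver is set up once and the work vectors are then reused by every KSPSolve().

#include <petscksp.h>

/* Sketch: set the solver up once and reuse it for many right-hand sides. */
static PetscErrorCode SolveManyTimes(Mat A, Vec b, Vec x, PetscInt nsteps)
{
  PetscErrorCode ierr;
  KSP            ksp;
  PetscInt       step;

  PetscFunctionBeginUser;
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);              /* work vectors are created here */
  for (step = 0; step < nsteps; step++) {
    /* ... update b for this time step ... */
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);      /* no new work vectors per solve */
  }
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);           /* work vectors are freed here (or in KSPReset()) */
  PetscFunctionReturn(0);
}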


I mean this, right in the log_view output:

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

...

--- Event Stage 1: First Solve

...

--- Event Stage 2: Remaining Solves

              Vector    23904          23904   1295501184     0.
I logged the exact number of KSP iterations over the 999 time steps, and it's exactly 23904/6 = 3984.
Michael


On May 24, 2018, at 19:50, Smith, Barry F. wrote:
  Please send the log file for 1000 processes with CG as the solver.

   You should make a bar chart of each event for the two cases to see which ones are taking more time and which are taking less (we cannot tell from the two logs you sent us, since they are for different solvers).




On May 24, 2018, at 12:24 AM, Michael Becker <Michael.Becker at physik.uni-giessen.de> wrote:

I noticed that for every individual KSP iteration, six vector objects are created and destroyed (with CG, more with e.g. GMRES).

   Hmm, it is certainly not intended that vectors be created and destroyed within each KSPSolve(); could you please point us to the code that makes you think they are being created and destroyed? We create all the work vectors in KSPSetUp() and destroy them in KSPReset(), not during the solve. Not that this would be a measurable difference.




This seems rather wasteful; is it supposed to be like this? Could this even be the reason for my problems? Apart from that, everything looks quite normal to me (but I'm not the expert here).


Thanks in advance.

Michael



<log_view_125procs.txt><log_view_1000procs.txt>


<log_view_125procs-1.txt><log_view_125procs-2.txt><log_view_1000procs-1.txt><log_view_1000procs-2.txt>

-------------- next part --------------
A non-text attachment was scrubbed...
Name: Untitled.png
Type: image/png
Size: 285204 bytes
Desc: Untitled.png
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20180529/9bc379d3/attachment-0001.png>

