[petsc-users] GAMG setup scalability

Mark F. Adams mark.adams at columbia.edu
Fri Aug 17 09:50:36 CDT 2012

Yes, John, the GAMG setup is not great, so we know there are problems.  We do have a new staff person who will be working on matrix product methods soon.  Matrix products are hard to make fast, and GAMG setup relies on them quite a bit.

A few things to note:

1) This is in the mesh setup phase, so many applications can amortize this cost.  (MatPtAP is a "matrix" setup cost and so it is amortized for linear problems or non-full Newton solves, but not for full nonlinear solves).
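One way to check that the setup really is amortized is to look at the event counts in the log.  A minimal runtime-options sketch (the executable name ./app is a placeholder, and the -ksp_reuse_preconditioner option is assumed to be available in your PETSc version):

```shell
# Reuse the preconditioner across repeated solves with the same operator;
# the GAMG setup (MatPtAP, MatTransposeMatMult, ...) then runs once and
# later KSPSolve calls pay only the solve cost.
mpiexec -n 8 ./app -ksp_type cg -pc_type gamg \
    -ksp_reuse_preconditioner \
    -log_summary
# In the -log_summary output, the setup events should show a count of 1
# even when KSPSolve is called many times.
```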

2) You do not seem to be smoothing (-pc_gamg_agg_nsmooths 1); this is very useful for elliptic operators, but it adds a MatMatMult (so the setup cost gets worse).
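If the extra MatMatMult cost is acceptable, smoothed aggregation can be enabled from the command line (a runtime-options fragment; the executable name ./app is a placeholder):

```shell
mpiexec -n 8 ./app -pc_type gamg \
    -pc_gamg_agg_nsmooths 1 \
    -log_summary
# -pc_gamg_agg_nsmooths 1 smooths the prolongator once, which typically
# improves convergence for elliptic problems; the added MatMatMult time
# will be visible in the -log_summary event table.
```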

3) This test is getting less than a 2x speedup in KSPSolve on 8x the processors.  So this problem looks hard: small or poorly partitioned, and not in the range where we want people to run to get good performance.

4) I have found that the setup times are about twice those of ML, which uses a similar algorithm, and about 5x slower than hypre, which uses a very different algorithm.  So if you cannot amortize these setup costs, then ML or hypre would probably work better for you.
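Switching to either alternative is a one-line change at runtime (assuming PETSc was configured with ML and hypre support, e.g. --download-ml and --download-hypre; the executable name ./app is a placeholder):

```shell
# ML (smoothed aggregation, similar algorithm to GAMG):
mpiexec -n 8 ./app -pc_type ml -log_summary

# hypre's BoomerAMG (classical AMG, very different algorithm):
mpiexec -n 8 ./app -pc_type hypre -pc_hypre_type boomeramg -log_summary
```

Comparing the PCSetUp and KSPSolve lines in the three -log_summary outputs makes the setup/solve trade-off concrete for your problem.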


On Aug 17, 2012, at 10:11 AM, John Fettig wrote:

> GAMG without any options is using MatTransposeMatMult (in petsc-dev)
> during setup at line 1031 of agg.c.  What I find is that this
> operation takes up a majority of the total setup time, and doesn't
> scale well.  Is there anything that can be done about this?
> I am a little surprised that it is taking significantly more time than
> the RAP construction of the coarse grid operators done by MatPtAP.  On
> 1 processor, for example, it takes 5% of the setup time, and on 8
> processors it takes ~4%.  The MatTransposeMatMult, on the other hand,
> takes 67% of setup on 1 processor and 71% of setup on 8.
> I've attached the log_summary for 1 processor and 8 processors.  You
> can also see that the solve time is completely eclipsed by the setup
> time.
> Regards,
> John
> <gamg_np1.out><gamg_np8.out>
