[petsc-users] Effect of -pc_gamg_threshold vs PETSc version
Jeremy Theler
jeremy at seamplex.com
Fri Apr 14 07:53:58 CDT 2023
Hi Mark. So glad you answered.
> 0) what is your test problem? eg, 3D Laplacian with Q1 finite
> elements.
I said in my first email that it was linear elasticity (and I gave a
link where you can see the geometry, BCs, etc.) but I did not specify
further details.
It is linear elasticity with a displacement-based FEM formulation
using unstructured curved 10-node tetrahedra.
The matrix is marked as SPD with MatSetOption() and the solver is
indeed CG and not the default GMRES.
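For reference, the relevant setup looks roughly like this (a sketch
assuming Mat A, KSP ksp and a coordinate Vec coords already exist; it
is not the actual FeenoX code):

  /* mark the operator as SPD, attach the rigid-body near nullspace
     and select CG instead of the default GMRES */
  MatNullSpace ns;
  PetscCall(MatSetOption(A, MAT_SPD, PETSC_TRUE));
  PetscCall(MatNullSpaceCreateRigidBody(coords, &ns));
  PetscCall(MatSetNearNullSpace(A, ns));
  PetscCall(MatNullSpaceDestroy(&ns));
  PetscCall(KSPSetType(ksp, KSPCG));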
> First, you can get GAMG diagnostics by running with '-info :pc' and
> grep on GAMG.
Great advice. Now I have a lot more information, but I'm not sure how
to analyze it. Attached you will find, for each combination of
threshold and PETSc version, the output of -info :pc -ksp_monitor
-ksp_view
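For the record, each case was run with something along these lines (a
sketch: the executable name is hypothetical and the actual runs used a
few more options):

  $ ./solve -ksp_type cg -pc_type gamg -pc_gamg_threshold 1e-4 \
        -info :pc -ksp_monitor -ksp_view 2>&1 | tee log.txt
  $ grep GAMG log.txt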
In general it looks like 3.18 and 3.19 need fewer KSP iterations than
3.17, but the overall wall time is larger.
> Anyway, sorry for the changes.
> I hate changing GAMG for this reason and I hate AMG for this reason!
No need to apologize, I just want to understand how to better exploit
your code!
Thanks
--
jeremy
>
> Thanks,
> Mark
>
>
>
> On Thu, Apr 13, 2023 at 8:17 AM Jeremy Theler <jeremy at seamplex.com>
> wrote:
> > When using GAMG+cg for linear elasticity and providing the near
> > nullspace computed by MatNullSpaceCreateRigidBody(), I used to find
> > "experimentally" that a small value of -pc_gamg_threshold in the
> > order
> > of 0.0001 would slightly decrease the solve time.
> >
> > Starting with 3.18, I began to see that any positive value for the
> > threshold would increase the solve time. I did a quick parametric
> > (serial) run solving an elastic problem with a matrix of size
> > approx. 570k x 570k for different values of the GAMG threshold and
> > different PETSc versions (compiled with the same compiler, options
> > and flags).
> >
> > I noted that
> >
> > 1. starting from 3.18, a threshold of 0.0001 that used to improve
> > the speed now worsens it.
> > 2. PETSc 3.17 looks like a "sweet spot" for speed.
> >
> > I would like to hear any comments you might have.
> >
> > The wall time shown includes the time needed to read the mesh and
> > assemble the stiffness matrix. The problem is a refined version of
> > the NAFEMS LE10 benchmark described here:
> > https://seamplex.com/feenox/examples/mechanical.html#nafems-le10-thick-plate-pressure-benchmark
> >
> > If you want, I could dump the matrix, RHS and near-nullspace
> > vectors and share them.
> >
> > --
> > jeremy theler
> >
-------------- next part --------------
[Attachment: log.tar.gz, application/x-compressed-tar, 21926 bytes]
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20230414/79e1ea77/attachment-0001.bin>