[petsc-users] Effect of -pc_gamg_threshold vs PETSc version

Matthew Knepley knepley at gmail.com
Fri Apr 14 08:36:12 CDT 2023


On Fri, Apr 14, 2023 at 8:54 AM Jeremy Theler <jeremy at seamplex.com> wrote:

> Hi Mark. So glad you answered.
>
> > 0) what is your test problem? e.g., 3D Laplacian with Q1 finite
> > elements.
>
> I said in my first email that it was linear elasticity (and I gave a
> link where you can see the geometry, BCs, etc.), but I did not specify
> further details.
> It is linear elasticity with a displacement-based FEM formulation using
> unstructured curved 10-noded tetrahedra.
>

I believe our jargon for this would be a "P_2 Lagrange element".


> The matrix is marked as SPD with MatSetOption() and the solver is
> indeed CG and not the default GMRES.
>
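
For reference, that setup looks something like the sketch below (error
checking trimmed; A and ksp are assumed to already exist in the
application, and the function name is a placeholder; PetscCall() needs
a recent PETSc -- use ierr = ...; CHKERRQ(ierr); in older versions):

  #include <petscksp.h>

  PetscErrorCode MarkSPDAndUseCG(Mat A, KSP ksp)
  {
    PetscFunctionBeginUser;
    /* tell PETSc the operator is symmetric positive definite */
    PetscCall(MatSetOption(A, MAT_SPD, PETSC_TRUE));
    /* use CG instead of the default GMRES */
    PetscCall(KSPSetType(ksp, KSPCG));
    PetscFunctionReturn(0);
  }
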
> > First, you can get GAMG diagnostics by running with '-info :pc' and
> > grep on GAMG.
>
> Great advice. Now I have a lot more information, but I'm not sure how
> to analyze it. Attached, for each combination of threshold and PETSc
> version, is the output of -info :pc -ksp_monitor -ksp_view.
>
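
For reference, such a run looks something like this (the executable
name and the rest of its arguments are placeholders):

  ./solver -ksp_type cg -pc_type gamg -info :pc | grep GAMG
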
> In general it looks like 3.18 and 3.19 take fewer KSP iterations than
> 3.17, but the overall time is larger.
>

I will also look and see if I can figure out the change. This kind of
behavior usually means that we somehow made the coarse problem larger.
That can make the preconditioner more accurate (fewer iterations) but
more costly to apply. It also makes sense that this is sensitive to the
threshold parameter, but the threshold is not the only thing that
controls the sparsity: there is also squaring the graph.
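
For experimenting, the relevant knobs look something like the line
below. Note that the option controlling the squaring depends on the
version: older versions use -pc_gamg_square_graph <N>, while newer
ones replace it with -pc_gamg_aggressive_coarsening <N>; check the
-help output of your build.

  -pc_type gamg -pc_gamg_threshold 0.0001 -pc_gamg_square_graph 1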

Mark, do you know if we changed the default for squaring?

  Thanks,

    Matt


> > Anyway, sorry for the changes.
> > I hate changing GAMG for this reason and I hate AMG for this reason!
>
> No need to apologize, I just want to better understand how to exploit
> your code!
>
> Thanks
> --
> jeremy
>
> >
> > Thanks,
> > Mark
> >
> >
> >
> > On Thu, Apr 13, 2023 at 8:17 AM Jeremy Theler <jeremy at seamplex.com>
> > wrote:
> > > When using GAMG+CG for linear elasticity and providing the near
> > > nullspace computed by MatNullSpaceCreateRigidBody(), I used to find
> > > "experimentally" that a small value of -pc_gamg_threshold on the
> > > order of 0.0001 would slightly decrease the solve time.
> > >
> > > Starting with 3.18, I started seeing that any positive value for
> > > the threshold would increase the solve time. I did a quick
> > > parametric (serial) run solving an elastic problem with a matrix
> > > size of approx. 570k x 570k for different values of the GAMG
> > > threshold and different PETSc versions (compiled with the same
> > > compiler, options and flags).
> > >
> > > I noted that
> > >
> > >  1. starting from 3.18, a threshold of 0.0001 that used to improve
> > > the speed now worsens it, and
> > >  2. PETSc 3.17 looks like a "sweet spot" for speed.
> > >
> > > I would like to hear any comments you might have.
> > >
> > > The wall time shown includes the time needed to read the mesh and
> > > assemble the stiffness matrix. It is a refined version of the NAFEMS
> > > LE10 benchmark described here:
> > > https://seamplex.com/feenox/examples/mechanical.html#nafems-le10-thick-plate-pressure-benchmark
> > >
> > > If you want, I could dump the matrix, RHS, and near nullspace
> > > vectors and share them.
> > >
> > > --
> > > jeremy theler
> > >
>
>
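
Regarding the rigid-body near nullspace mentioned above: for
completeness, attaching it typically looks like the sketch below
(error checking trimmed; 'coords' is a Vec of nodal coordinates that
the application has to provide, and the function name is a
placeholder):

  #include <petscksp.h>

  PetscErrorCode AttachRigidBodyModes(Mat A, Vec coords)
  {
    MatNullSpace nearnull;

    PetscFunctionBeginUser;
    /* build the rigid-body modes (3 translations + 3 rotations in
       3D) from the nodal coordinates */
    PetscCall(MatNullSpaceCreateRigidBody(coords, &nearnull));
    /* attach them to the operator so GAMG can use them */
    PetscCall(MatSetNearNullSpace(A, nearnull));
    PetscCall(MatNullSpaceDestroy(&nearnull));
    PetscFunctionReturn(0);
  }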

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

