[petsc-users] Block preconditioning for 3d problem

Jed Brown jed at jedbrown.org
Mon Oct 14 20:43:22 CDT 2019


Dave Lee <davelee2804 at gmail.com> writes:

> Thanks Jed,
>
> I will reconfigure my PETSc with MUMPS or SuperLU and see if that helps.
> (My code is configured to run in parallel only on 6*n^2 processors (n^2
> procs on each face of a cubed sphere), which is a little annoying for
> situations like these where a serial LU solver would be handy for testing.)
>
> I've tried setting a tight tolerance on the slab-wise block solve, and
> preconditioning based on the pressure, but these haven't helped.
>
> Unlike other atmosphere codes, I'm using slab (layer)-minor indexing: my
> horizontal discretisation is arbitrary order, while the vertical is only
> piecewise constant/linear, so the slab-wise dofs are already close together
> in memory.

Being high order does not necessarily mean more tightly coupled.  A
useful test is to compute a column of the Jacobian inverse (e.g., by
solving with a right-hand side that is a column of the identity) and
plot it to see how it extends in each spatial dimension.
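
For concreteness, something along these lines should do it (a minimal
sketch, not taken from your code: it assumes you already have a Mat J and
a tightly converged KSP ksp, and the global probe index j is a placeholder
you choose; PetscCall() is the error-checking macro from recent PETSc
releases):

  /* Probe column j of the Jacobian inverse by solving J x = e_j,
     where e_j is column j of the identity. */
  Vec      e, col;
  PetscInt lo, hi;
  PetscCall(MatCreateVecs(J, &col, &e));   /* col: solution, e: right-hand side */
  PetscCall(VecZeroEntries(e));
  PetscCall(VecGetOwnershipRange(e, &lo, &hi));
  if (j >= lo && j < hi) PetscCall(VecSetValue(e, j, 1.0, INSERT_VALUES));
  PetscCall(VecAssemblyBegin(e));
  PetscCall(VecAssemblyEnd(e));
  PetscCall(KSPSolve(ksp, e, col));        /* col now approximates J^{-1} e_j */
  PetscCall(VecView(col, PETSC_VIEWER_STDOUT_WORLD));  /* dump for plotting */
  PetscCall(VecDestroy(&e));
  PetscCall(VecDestroy(&col));

Plot the entries of col against their physical locations and look at how
fast they decay in the vertical versus the horizontal.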

> I feel like the construction of the Hessenberg matrix during the Arnoldi
> iteration and/or the least-squares minimisation in the GMRES solve may be
> introducing additional coupling between the slabs, which could be degrading
> the convergence. Just a thought...

GMRES does not care how you order degrees of freedom.
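
To spell out why (a sketch of the standard permutation-invariance argument,
in LaTeX notation, where $P$ is the permutation that reorders the unknowns):

  \[
    \hat{A} = P A P^{T}, \quad \hat{b} = P b, \quad \hat{x}_0 = P x_0
    \;\Longrightarrow\;
    \mathcal{K}_k(\hat{A}, \hat{r}_0) = P\,\mathcal{K}_k(A, r_0), \quad
    \hat{x}_k = P x_k, \quad
    \|\hat{b} - \hat{A}\hat{x}_k\|_2 = \|b - A x_k\|_2 .
  \]

In exact arithmetic the residual history is identical at every iteration,
provided the preconditioner is permuted the same way.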

My guess based on what I've seen in this thread is that there is a bug
in your discretization.

