[petsc-users] Block preconditioning for 3d problem

Dave Lee davelee2804 at gmail.com
Mon Oct 14 22:00:43 CDT 2019


Hey Jed,

I take your point that high order != tightly coupled. What I was trying to
say is that I'm deliberately using slab dofs as the minor index, since the
stencil for the Helmholtz operator is wider within a slab than between
slabs (apologies for the confusion).

I'll keep digging to see if I can find a bug in my matrix assembly. I agree
that this is the most likely explanation. Concurrently I'll also see if
using FieldSplit (with each slab as a field) helps.
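For the FieldSplit experiment, a possible starting point is runtime options along these lines. This is only a sketch: the field definitions themselves would have to come from PCFieldSplitSetIS() (one IS per slab) or a DM, and the additive split with a direct sub-solve per slab is just one illustrative choice, not a recommendation from this thread:

```
-pc_type fieldsplit
-pc_fieldsplit_type additive
-fieldsplit_ksp_type preonly
-fieldsplit_pc_type lu
```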

Cheers, Dave.

On Tue, Oct 15, 2019 at 12:43 PM Jed Brown <jed at jedbrown.org> wrote:

> Dave Lee <davelee2804 at gmail.com> writes:
>
> > Thanks Jed,
> >
> > I will reconfigure my PETSc with MUMPS or SuperLU and see if that helps.
> > (My code is configured to run in parallel only on 6*n^2 processors (n^2
> > procs on each face of a cubed sphere), which is a little annoying for
> > situations like these, where a serial LU solver would be handy for
> > testing.)
> >
> > I've tried setting a tight tolerance on the slab-wise block solve, and
> > preconditioning based on the pressure, but these haven't helped.
> >
> > Unlike other atmosphere codes I'm using slab (layer)-minor indexing, as
> > my horizontal discretisation is arbitrary order, while I'm only piecewise
> > constant/linear in the vertical, so I've already got the slab-wise dofs
> > close together in memory.
>
> Being high order does not necessarily mean more tightly coupled.  A
> useful test is to compute a column of the Jacobian inverse (e.g., by
> solving with a right-hand side that is a column of the identity) and
> plot it to see how it extends in each spatial dimension.
>
> > I feel like maybe the construction of the Hessenberg during the Arnoldi
> > iteration and/or the least squares minimisation for the GMRES solve is
> > leading to additional coupling between the slabs, which is maybe
> > degrading the convergence, just a thought...
>
> GMRES does not care how you order degrees of freedom.
>
> My guess based on what I've seen in this thread is that there is a bug
> in your discretization.
>
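Jed's suggested test above (probe a column of the Jacobian inverse by solving with an identity-column right-hand side, then look at how far the response extends) can be illustrated with a small NumPy sketch. The operator here is a made-up two-slab block matrix with strong in-slab and weak slab-to-slab coupling, purely to show the probing idea; with PETSc this would be one KSP or direct solve per probed column on the actual assembled matrix:

```python
import numpy as np

# Toy "two-slab" operator: tridiagonal within each slab, with a weak
# slab-to-slab coupling term (hypothetical numbers, for illustration only).
n = 8                          # dofs per slab
A = np.zeros((2 * n, 2 * n))
for s in range(2):             # assemble each slab's tridiagonal block
    for i in range(n):
        k = s * n + i
        A[k, k] = 4.0
        if i > 0:
            A[k, k - 1] = -1.0
        if i < n - 1:
            A[k, k + 1] = -1.0
for i in range(n):             # weak coupling between the two slabs
    A[i, n + i] = A[n + i, i] = -0.01

# Column j of A^{-1}: solve A x = e_j with e_j a column of the identity.
j = n // 2
e = np.zeros(2 * n)
e[j] = 1.0
x = np.linalg.solve(A, e)

# Compare the magnitude of the response in the probed slab with the
# magnitude leaking into the other slab.
in_slab = np.abs(x[:n]).max()
cross_slab = np.abs(x[n:]).max()
print(in_slab, cross_slab)
```

If the matrix really is weakly coupled between slabs, the cross-slab response is orders of magnitude smaller than the in-slab response; plotting `x` reshaped by slab makes the decay in each direction visible.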

