[petsc-users] Block preconditioning for 3d problem

Dave Lee davelee2804 at gmail.com
Fri Oct 11 23:04:54 CDT 2019


Thanks Jed,

I will reconfigure my PETSc with MUMPS or SuperLU and see if that helps.
(My code is configured to run in parallel only on 6*n^2 processors, with n^2
procs on each face of a cubed sphere, which is a little annoying for
situations like these, where a serial LU solver would be handy for testing.)
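
For the direct-solve test, something like the following is roughly what I
have in mind (a sketch only, assuming ksp is the already-created solver for
the 3D linear problem and PETSc has been configured with MUMPS, e.g.
--download-mumps --download-scalapack; error checking omitted):

    KSPGetPC(ksp, &pc);
    KSPSetType(ksp, KSPPREONLY);                  /* apply the factorisation directly, no Krylov iterations */
    PCSetType(pc, PCLU);
    PCFactorSetMatSolverType(pc, MATSOLVERMUMPS); /* or MATSOLVERSUPERLU_DIST for parallel SuperLU */
    KSPSetFromOptions(ksp);
    KSPSetUp(ksp);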

I've tried setting a tight tolerance on the slab-wise block solve, and
preconditioning based on the pressure, but these haven't helped.
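
One thing I still need to check is whether each individual slab block is
actually converging inside the 3D solve. A minimal sketch of how I might do
that, reusing the names from the block-Jacobi setup quoted below and relying
on the "sub_" options prefix (error checking omitted):

    PCBJacobiGetSubKSP(pc, &nlocal, &first_local, &subksp);
    for (int ii = 0; ii < nlocal; ii++) {
        KSPSetFromOptions(subksp[ii]); /* then running with -sub_ksp_monitor_true_residual
                                          -sub_ksp_converged_reason should report each
                                          slab block's convergence separately */
    }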

Unlike other atmosphere codes, I'm using slab (layer)-minor indexing, since
my horizontal discretisation is arbitrary order while I'm only piecewise
constant/linear in the vertical, so I've already got the slab-wise DOFs
close together in memory.

I feel like the construction of the Hessenberg matrix during the Arnoldi
iteration, and/or the least-squares minimisation in the GMRES solve, may be
introducing additional coupling between the slabs and degrading the
convergence. Just a thought...

Thanks again, Dave.



On Sat, Oct 12, 2019 at 12:18 AM Jed Brown <jed at jedbrown.org> wrote:

> Dave Lee <davelee2804 at gmail.com> writes:
>
> > Hey Jed,
> >
> > Right now they are uncoupled purely for testing purposes. I just want to
> > get the uncoupled problem right before moving on to the coupled problem.
>
> Okay, well you may not want to change your dof ordering to put slabs
> together since you'll likely have some parts of your code that do
> vertical processing (column integrals/physics or PDE stiffness in the
> vertical).  It's common for climate/weather codes to use some horizontal
> index (within a blocking scheme or an entire subdomain array index) as
> the innermost to facilitate vectorization of the column processing
> (which has high enough arithmetic intensity that vectorization matters).
>
> > When I add the vertical dynamics back in, both the outer nonlinear
> > convergence and the inner linear convergence are roughly the same as for
> > when I solve for the 2D slabs only as a 3D system.
>
> Which you've said is terrible compared to separate 2D slabs, so we have
> to get to the bottom of that.  We outlined a number of tests you can do
> to understand why it isn't converging as expected, but haven't heard
> back on the results so it'll be hard to advise further.
>
> > I like your scaling idea. I tried using the pressure multiplied by the
> > mass matrix as a preconditioner for the inner linear problem, but this
> > didn't help. Perhaps some sort of scaling is the way to go though.
> >
> > Cheers, Dave.
> >
> > On Fri, Oct 11, 2019 at 2:45 PM Jed Brown <jed at jedbrown.org> wrote:
> >
> >> Why are your slabs decoupled at present?  (Have you done a transform in
> >> the vertical?)  Is the linear convergence significantly different when
> >> you include the multiple layers?
> >>
> >> Dave Lee <davelee2804 at gmail.com> writes:
> >>
> >> > Hi Jed and Mark,
> >> >
> >> > thanks for your helpful comments. Yes, the nonlinear outer problem is
> >> > uncoupled between the slabs; it is only the linear inner problem where
> >> > they are coupled.
> >> >
> >> > I've tried to make the slab DOFs close in memory, and also tried using
> >> > a tight tolerance on the outer KSP (1.0e-20), but without success.
> >> >
> >> > You make some really good points about scaling issues Jed and Mark.
> >> > This is a solve for a global atmosphere, and each 2D slab is a
> >> > horizontal layer of the atmosphere, so the pressure (which the linear
> >> > solve is for) will vary dramatically between slabs. Perhaps I can
> >> > additionally precondition the linear problem to normalise the pressure
> >> > in each slab so that they stay close to unity.
> >> >
> >> > Cheers, Dave.
> >> >
> >> > On Fri, Oct 11, 2019 at 2:52 AM Mark Adams <mfadams at lbl.gov> wrote:
> >> >
> >> >> I can think of a few sources of coupling in the solver: 1) the line
> >> >> search, 2) Krylov, and 3) the residual test (scaling issues). You
> >> >> could turn the line search off and use Richardson (with a fixed
> >> >> number of iterations) or exact solves, as Jed suggested. As far as
> >> >> scaling goes, can you use the same NL problem on each slab? This
> >> >> should fix all the problems anyway. Or, on the good decoupled solves,
> >> >> if the true residual is of the same scale and *all* of the slabs
> >> >> converge well, then you should be OK on scaling. If this works, then
> >> >> start adding stuff back in and see what breaks it.
> >> >>
> >> >> On Thu, Oct 10, 2019 at 11:01 AM Jed Brown via petsc-users <
> >> >> petsc-users at mcs.anl.gov> wrote:
> >> >>
> >> >>> Dave Lee via petsc-users <petsc-users at mcs.anl.gov> writes:
> >> >>>
> >> >>> > Hi PETSc,
> >> >>> >
> >> >>> > I have a nonlinear 3D problem for a set of uncoupled 2D slabs
> >> >>> > (which I ultimately want to couple once this problem is solved).
> >> >>> >
> >> >>> > When I solve the inner linear problem for each of these 2D slabs
> >> >>> > individually (using KSPGMRES), the convergence of the outer
> >> >>> > nonlinear problem is good. However, when I solve the inner linear
> >> >>> > problem as a single 3D problem (with no coupling between the 2D
> >> >>> > slabs, so the matrix is effectively a set of uncoupled blocks, one
> >> >>> > for each 2D slab), the outer nonlinear convergence degrades
> >> >>> > dramatically.
> >> >>>
> >> >>> Is the nonlinear problem also decoupled between slabs?
> >> >>>
> >> >>> If you solve the linear problem accurately (tight tolerances on the
> >> >>> outer KSP, or global direct solve), is the outer nonlinear
> >> >>> convergence good again?  If not, test that your Jacobian is correct
> >> >>> (it likely isn't or you have inconsistent scaling leading to ill
> >> >>> conditioning).  SNES has automatic tests for that, but you aren't
> >> >>> using SNES so you'd have to write some code.
> >> >>>
> >> >>> What happens if you run the 2D problem (where convergence is
> >> >>> currently good) with much smaller subdomains (or -pc_type pbjacobi)?
> >> >>>
> >> >>> > Note that I am not using SNES, just my own quasi-Newton approach
> >> >>> > for the outer nonlinear problem.
> >> >>> >
> >> >>> > I suspect that the way to recover the convergence for the 3D
> >> >>> > coupled problem is to use some sort of PCBJACOBI or PCFIELDSPLIT
> >> >>> > preconditioner, but I'm not entirely sure. I've tried following
> >> >>> > this example:
> >> >>> > https://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex7.c.html
> >> >>> > but without any improvement in the convergence.
> >> >>> >
> >> >>> > On each processor the slab index is the major index and the
> >> >>> > internal slab DOF index is the minor index. The parallel
> >> >>> > decomposition is within the 2D slab dimensions only, not between
> >> >>> > slabs.
> >> >>>
> >> >>> For convergence, you usually want the direction of tight coupling
> >> >>> (sounds like that is within slabs) to be close in memory.
> >> >>>
> >> >>> In general, use -ksp_monitor_true_residual -ksp_converged_reason.
> >> >>>
> >> >>> > I'm configuring the 3D inner linear solver as
> >> >>> >
> >> >>> > KSPGetPC(ksp, &pc);
> >> >>> > PCSetType(pc, PCBJACOBI);
> >> >>> > PCBJacobiSetLocalBlocks(pc, num_slabs, NULL);
> >> >>> > KSPSetUp(ksp);
> >> >>> > PCBJacobiGetSubKSP(pc, &nlocal, &first_local, &subksp);
> >> >>> > for(int ii = 0; ii < nlocal; ii++) {
> >> >>> >     KSPGetPC(subksp[ii], &subpc);
> >> >>> >     PCSetType(subpc, PCJACOBI);
> >> >>> >     KSPSetType(subksp[ii], KSPGMRES);
> >> >>> >     KSPSetTolerances(subksp[ii], 1.e-12, PETSC_DEFAULT,
> >> >>> >                      PETSC_DEFAULT, PETSC_DEFAULT);
> >> >>> > }
> >> >>> >
> >> >>> > However, this does not seem to change the outer nonlinear
> >> >>> > convergence. When I run with -ksp_view the local block sizes look
> >> >>> > correct: the number of local 2D slab DOFs for each block on each
> >> >>> > processor.
> >> >>> >
> >> >>> > Any ideas on what sort of KSP/PC configuration I should be using
> >> >>> > to recover the convergence of the uncoupled 2D slab linear solve
> >> >>> > for this 3D linear solve? Should I be rethinking my node indexing
> >> >>> > to help with this?
> >> >>> >
> >> >>> > Cheers, Dave.
> >> >>>
> >> >>
> >>
>