[petsc-users] Block preconditioning for 3d problem

Jed Brown jed at jedbrown.org
Thu Oct 10 10:00:53 CDT 2019


Dave Lee via petsc-users <petsc-users at mcs.anl.gov> writes:

> Hi PETSc,
>
> I have a nonlinear 3D problem for a set of uncoupled 2D slabs (which I
> ultimately want to couple once this problem is solved).
>
> When I solve the inner linear problem for each of these 2D slabs
> individually (using KSPGMRES), the convergence of the outer nonlinear
> problem is good. However, when I solve the inner linear problem as a single
> 3D problem (with no coupling between the 2D slabs, so the matrix is
> effectively a set of uncoupled blocks, one for each 2D slab), the outer
> nonlinear convergence degrades dramatically.

Is the nonlinear problem also decoupled between slabs?

If you solve the linear problem accurately (tight tolerances on the
outer KSP, or a global direct solve), is the outer nonlinear convergence
good again?  If not, test that your Jacobian is correct (it likely isn't,
or you have inconsistent scaling leading to ill-conditioning).  SNES has
automatic tests for that, but since you aren't using SNES you'd have to
write some code.
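
One way to do that without SNES is to compare the action of your
hand-coded Jacobian on an arbitrary direction against a first-order
finite difference of your residual.  A rough sketch follows; CheckJacobian,
FormResidual, ctx, and the fixed step h are placeholders for your own
routines and a sensible differencing parameter, not anything in PETSc:

  #include <petsc.h>

  /* Sketch only: reports || (F(u+h*du) - F(u))/h - J*du || relative to the
     finite-difference term.  A large ratio suggests the Jacobian (or its
     scaling) is wrong. */
  PetscErrorCode CheckJacobian(Mat J, Vec u, Vec du,
                               PetscErrorCode (*FormResidual)(Vec, Vec, void*),
                               void *ctx)
  {
    Vec            Fu, Fp, up, Jdu;
    PetscReal      h = 1.e-7, nfd, nerr;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = VecDuplicate(u, &Fu);CHKERRQ(ierr);
    ierr = VecDuplicate(u, &Fp);CHKERRQ(ierr);
    ierr = VecDuplicate(u, &up);CHKERRQ(ierr);
    ierr = VecDuplicate(u, &Jdu);CHKERRQ(ierr);

    ierr = (*FormResidual)(u, Fu, ctx);CHKERRQ(ierr);   /* F(u)             */
    ierr = VecWAXPY(up, h, du, u);CHKERRQ(ierr);        /* up = u + h*du    */
    ierr = (*FormResidual)(up, Fp, ctx);CHKERRQ(ierr);  /* F(u + h*du)      */

    ierr = VecAXPY(Fp, -1.0, Fu);CHKERRQ(ierr);         /* F(u+h*du) - F(u) */
    ierr = VecScale(Fp, 1.0/h);CHKERRQ(ierr);           /* FD approx of J*du */
    ierr = MatMult(J, du, Jdu);CHKERRQ(ierr);           /* hand-coded J*du  */

    ierr = VecNorm(Fp, NORM_2, &nfd);CHKERRQ(ierr);
    ierr = VecAXPY(Fp, -1.0, Jdu);CHKERRQ(ierr);
    ierr = VecNorm(Fp, NORM_2, &nerr);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD, "relative Jacobian error: %g\n",
                       (double)(nerr/nfd));CHKERRQ(ierr);

    ierr = VecDestroy(&Fu);CHKERRQ(ierr);
    ierr = VecDestroy(&Fp);CHKERRQ(ierr);
    ierr = VecDestroy(&up);CHKERRQ(ierr);
    ierr = VecDestroy(&Jdu);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

If that ratio is far from zero for several random directions, fix the
Jacobian before tuning preconditioners.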

What happens if you run the 2D problem (where convergence is currently
good) with much smaller subdomains (or -pc_type pbjacobi)?

> Note that I am not using SNES, just my own quasi-Newton approach for the
> outer nonlinear problem.
>
> I suspect that the way to recover the convergence for the 3D coupled
> problem is to use some sort of PCBJACOBI or PCFIELDSPLIT preconditioner,
> but I'm not entirely sure. I've tried following this example:
> https://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex7.c.html
> but without any improvement in the convergence.
>
> On each processor the slab index is the major index and the internal slab
> DOF index is the minor index. The parallel decomposition is within the 2D
> slab dimensions only, not between slabs.

For convergence, you usually want the direction of tight coupling
(sounds like that is within slabs) to be close in memory.
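
For example, with a slab-major local ordering each slab's DOFs are
contiguous in memory, whereas a DOF-major ordering strides the
within-slab direction across all slabs.  The helper names below are
illustrative only:

  /* Illustration only: two ways to flatten a (slab, within-slab DOF) pair
     into one local index. */
  PetscInt SlabMajorIndex(PetscInt slab, PetscInt dof, PetscInt ndof_per_slab)
  {
    return slab*ndof_per_slab + dof;  /* within-slab neighbors adjacent      */
  }

  PetscInt DofMajorIndex(PetscInt slab, PetscInt dof, PetscInt nslabs)
  {
    return dof*nslabs + slab;         /* within-slab neighbors strided apart */
  }

Your slab-major layout already does the former, which also matches how
PCBJacobiSetLocalBlocks carves contiguous local row ranges into blocks.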

In general, use -ksp_monitor_true_residual -ksp_converged_reason.
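
If the inner solver calls KSPSetFromOptions(), you can switch these
diagnostics (and the preconditioner experiments above) on from the
command line without recompiling.  A minimal sketch, assuming "inner_"
as an example prefix for your inner KSP:

  /* Sketch: give the inner solver its own options prefix (name is arbitrary)
     and let it read the command line, e.g.
       -inner_ksp_monitor_true_residual -inner_ksp_converged_reason
       -inner_pc_type pbjacobi
  */
  KSPSetOptionsPrefix(ksp, "inner_");
  KSPSetFromOptions(ksp);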

> I'm configuring the 3D inner linear solver as
>
> KSPGetPC(ksp, &pc);
> PCSetType(pc, PCBJACOBI);
> PCBJacobiSetLocalBlocks(pc, num_slabs, NULL);
> KSPSetUp(ksp);
> PCBJacobiGetSubKSP(pc, &nlocal, &first_local, &subksp);
> for(int ii = 0; ii < nlocal; ii++) {
>     KSPGetPC(subksp[ii], &subpc);
>     PCSetType(subpc, PCJACOBI);
>     KSPSetType(subksp[ii], KSPGMRES);
> KSPSetTolerances(subksp[ii], 1.e-12, PETSC_DEFAULT, PETSC_DEFAULT,
>                  PETSC_DEFAULT);
> }
>
> However, this does not seem to change the outer nonlinear convergence. When
> I run with -ksp_view the local block sizes look correct: the number of local
> 2D slab DOFs for each block on each processor.
>
> Any ideas on what sort of KSP/PC configuration I should be using to recover
> the convergence of the uncoupled 2D slab linear solve for this 3D linear
> solve? Should I be rethinking my node indexing to help with this?
>
> Cheers, Dave.

