[petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

Alexander Lindsay <alexlindsay239@gmail.com>
Thu Apr 13 16:07:07 CDT 2023


Here's the result:

    0 KSP unpreconditioned resid norm 1.033076851740e+03 true resid norm 1.033076851740e+03 ||r(i)||/||b|| 1.000000000000e+00
      Residual norms for fieldsplit_u_ solve.
      0 KSP Residual norm           -nan
      Residual norms for fieldsplit_p_ solve.
      0 KSP Residual norm           -nan
      Residual norms for fieldsplit_u_ solve.
      0 KSP Residual norm           -nan
      1 KSP Residual norm           -nan
      Residual norms for fieldsplit_u_ solve.
      0 KSP Residual norm           -nan
  Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
                 PC failed due to SUBPC_ERROR

I probably should have read the FAQ on `-fp_trap` before sending my first
email.
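For the archives: those `-nan` norms are ordinary quiet-NaN propagation. Once a single invalid floating-point operation produces a NaN, every later norm is NaN too, which is why `-fp_trap` (stop on the first invalid operation, under a debugger) pinpoints the culprit rather than the symptom. A quick sketch of the propagation, nothing PETSc-specific:

```python
import math

# inf - inf is an invalid operation: it yields a quiet NaN
r = float("inf") - float("inf")
print(math.isnan(r))     # True

# NaN contaminates everything it touches, so one bad vector entry
# is enough to make every subsequent residual norm print as -nan
norm = math.sqrt(sum(x * x for x in [1.0, 2.0, r]))
print(math.isnan(norm))  # True
```
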

Working with this stack trace:

 (gdb) bt
#0  0x00007fffe83a4286 in hypre_ParMatmul._omp_fn.1 () at par_csr_matop.c:1124
#1  0x00007ffff4982a16 in GOMP_parallel () from /lib/x86_64-linux-gnu/libgomp.so.1
#2  0x00007fffe83abfd1 in hypre_ParMatmul (A=<optimized out>, B=B@entry=0x55555da2ffa0)
    at par_csr_matop.c:967
#3  0x00007fffe82f09bf in hypre_BoomerAMGSetup (amg_vdata=<optimized out>, A=<optimized out>,
    f=<optimized out>, u=<optimized out>) at par_amg_setup.c:2790
#4  0x00007fffe82d54f0 in HYPRE_BoomerAMGSetup (solver=<optimized out>, A=<optimized out>,
    b=<optimized out>, x=<optimized out>) at HYPRE_parcsr_amg.c:47
#5  0x00007fffe940d33c in PCSetUp_HYPRE (pc=<optimized out>)
    at /home/lindad/projects/moose/petsc/src/ksp/pc/impls/hypre/hypre.c:418
#6  0x00007fffe9413d87 in PCSetUp (pc=0x55555d5ef390)
    at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:1017
#7  0x00007fffe94f856b in KSPSetUp (ksp=ksp@entry=0x55555d5eecb0)
    at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:408
#8  0x00007fffe94fa6f4 in KSPSolve_Private (ksp=ksp@entry=0x55555d5eecb0, b=0x55555d619730,
    x=<optimized out>) at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:852
#9  0x00007fffe94fd8b1 in KSPSolve (ksp=ksp@entry=0x55555d5eecb0, b=<optimized out>,
    x=<optimized out>) at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086
#10 0x00007fffe93d84a1 in PCApply_FieldSplit_Schur (pc=0x555555bef790, x=0x555556d5a510,
    y=0x555556d59e30)
    at /home/lindad/projects/moose/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1185
#11 0x00007fffe9414484 in PCApply (pc=pc@entry=0x555555bef790, x=x@entry=0x555556d5a510,
    y=y@entry=0x555556d59e30)
    at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:445
#12 0x00007fffe9415ad7 in PCApplyBAorAB (pc=0x555555bef790, side=PC_RIGHT, x=0x555556d5a510,
    y=y@entry=0x555556e922a0, work=0x555556d59e30)
    at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:727
#13 0x00007fffe9451fcd in KSP_PCApplyBAorAB (w=<optimized out>, y=0x555556e922a0,
    x=<optimized out>, ksp=0x555556068fc0)
    at /home/lindad/projects/moose/petsc/include/petsc/private/kspimpl.h:421
#14 KSPGMRESCycle (itcount=itcount@entry=0x7fffffffcca0, ksp=ksp@entry=0x555556068fc0)
    at /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:162
#15 0x00007fffe94536f9 in KSPSolve_GMRES (ksp=0x555556068fc0)
    at /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:247
#16 0x00007fffe94fb1c4 in KSPSolve_Private (ksp=0x555556068fc0, b=b@entry=0x55555568e510,
    x=<optimized out>, x@entry=0x55555607cce0)
    at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:914
#17 0x00007fffe94fd8b1 in KSPSolve (ksp=<optimized out>, b=b@entry=0x55555568e510,
    x=x@entry=0x55555607cce0)
    at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086
#18 0x00007fffe9582850 in SNESSolve_NEWTONLS (snes=0x555556065610)
    at /home/lindad/projects/moose/petsc/src/snes/impls/ls/ls.c:225
#19 0x00007fffe959c7ee in SNESSolve (snes=0x555556065610, b=0x0, x=<optimized out>)
    at /home/lindad/projects/moose/petsc/src/snes/interface/snes.c:4809
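For context on what the setup in frames #2-#3 is fed: with `-pc_fieldsplit_schur_precondition selfp`, PETSc assembles the preconditioning matrix Sp = A11 - A10 inv(diag(A00)) A01 and hands that to hypre. A toy dense sketch of the product (plain Python, illustrative only; PETSc naturally does this with sparse matrix-matrix kernels, and the block values below are made up):

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def selfp(A00, A01, A10, A11):
    # "selfp" inverts only the diagonal of A00, not A00 itself
    Dinv = [[1.0 / A00[i][i] if i == j else 0.0
             for j in range(len(A00))] for i in range(len(A00))]
    T = matmul(matmul(A10, Dinv), A01)     # A10 diag(A00)^-1 A01
    return [[A11[i][j] - T[i][j] for j in range(len(A11[0]))]
            for i in range(len(A11))]

A00 = [[4.0, 1.0], [1.0, 4.0]]   # velocity-velocity block
A01 = [[1.0], [2.0]]             # pressure-gradient block
A10 = [[1.0, 2.0]]               # divergence block
A11 = [[0.0]]                    # pressure-pressure block (often zero)
print(selfp(A00, A01, A10, A11))  # [[-1.25]]
```
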

On Thu, Apr 13, 2023 at 1:54 PM Barry Smith <bsmith at petsc.dev> wrote:

>
>   It would be useful to see the convergences inside the linear solve so
> perhaps start with
>
> -ksp_monitor_true_residual
>
> -fieldsplit_u_ksp_type richardson    (this is to allow the monitor below
> to work)
> -fieldsplit_u_ksp_max_its 1
> -fieldsplit_u_ksp_monitor
>
> Perhaps others; Matt, Jed, Pierre, or Stefano likely know better off the
> cuff than I do.
>
> We should have a convenience option like -pc_fieldsplit_schur_monitor,
> similar to the existing -pc_fieldsplit_gkb_monitor.
>
>
>
> On Apr 13, 2023, at 4:33 PM, Alexander Lindsay <alexlindsay239@gmail.com>
> wrote:
>
> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds
> numbers. My options table is
>
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
>
> This works wonderfully for a low Reynolds number of 2.2: the solver
> performance crushes LU as I scale up the problem. However, not
> surprisingly, this options table struggles when I bump the Reynolds number
> to 220. I've read that use of AIR (approximate ideal restriction) can
> improve performance for advection-dominated problems. I've tried
> setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion
> problem and the option works fine. However, when applying it to my
> field-split preconditioned Navier-Stokes system, I get immediate
> non-convergence:
>
>  0 Nonlinear |R| = 1.033077e+03
>       0 Linear |R| = 1.033077e+03
>   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>
> Does anyone have an idea as to why this might be happening? If not, I'd
> take a suggestion on where to set a breakpoint to start my own
> investigation. Alternatively, I welcome other preconditioning suggestions
> for an advection-dominated problem.
>
> Alex
>
