[petsc-users] Debugging suggestions: GAMG

Sanjay Govindjee s_g at berkeley.edu
Sat Jun 13 14:48:06 CDT 2020


Mark,
   The problem is 2D plane elasticity, definitely SPD.
If I add -pc_gamg_esteig_ksp_type cg, it works perfectly.
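
For reference, the same setup can also be done from code instead of the
command line; the following is only a minimal sketch (not FEAP's actual
driver), with the matrix assembly and the solve itself omitted:

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      KSP            ksp;
      PC             pc;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
      ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);            /* -ksp_type cg */
      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetType(pc, PCGAMG);CHKERRQ(ierr);             /* -pc_type gamg */
      ierr = PCGAMGSetType(pc, PCGAMGAGG);CHKERRQ(ierr);      /* -pc_gamg_type agg */
      ierr = PCGAMGSetNSmooths(pc, 1);CHKERRQ(ierr);          /* -pc_gamg_agg_nsmooths 1 */
      /* Reach the eigen estimator KSP inside GAMG through the options
         database; this mirrors -pc_gamg_esteig_ksp_type cg. */
      ierr = PetscOptionsSetValue(NULL, "-pc_gamg_esteig_ksp_type", "cg");CHKERRQ(ierr);
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
      /* ... KSPSetOperators() and KSPSolve() would go here ... */
      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }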

   I also tried adding -info, but that caused a different kind of crash;
the options I used were

    -ksp_type cg -ksp_monitor -log_view -pc_type gamg -pc_gamg_type agg
    -pc_gamg_agg_nsmooths 1 -pc_gamg_esteig_ksp_type cg -info  -options_left

The resulting error was:

    [0] petscinitialize_internal(): (Fortran):PETSc successfully started: procs 1
    [0] petscinitialize_internal(): Running on machine: vulture.ce.berkeley.edu
      --> ERRORS OCCURRED: For details see file: Ofo

    [0] PetscFinalize(): PetscFinalize() called
    [0] PetscCommDuplicate(): Duplicating a communicator 140367846615616 28408096 max tags = 2147483647
    ************************************************************************************************************************
    ***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
    ************************************************************************************************************************

    ---------------------------------------------- PETSc Performance Summary: ----------------------------------------------



           ##########################################################
           #                                                        #
           #                       WARNING!!!                       #
           #                                                        #
           #   This code was compiled with a debugging option.      #
           #   To get timing results run ./configure                #
           #   using --with-debugging=no, the performance will      #
           #   be generally two or three times faster.              #
           #                                                        #
           ##########################################################


    /home/sg/Feap/ver86/parfeap/feap on a gnu named vulture.ce.berkeley.edu with 1 processor, by sg Sat Jun 13 12:47:06 2020
    Using Petsc Release Version 3.13.2, unknown

                              Max       Max/Min     Avg       Total
    Time (sec):           1.516e-03     1.000   1.516e-03
    Objects:              1.000e+00     1.000   1.000e+00
    Flop:                 0.000e+00     0.000   0.000e+00  0.000e+00
    Flop/sec:             0.000e+00     0.000   0.000e+00  0.000e+00
    Memory:               5.122e+04     1.000   5.122e+04  5.122e+04
    MPI Messages:         0.000e+00     0.000   0.000e+00  0.000e+00
    MPI Message Lengths:  0.000e+00     0.000   0.000e+00  0.000e+00
    MPI Reductions:       0.000e+00     0.000

    Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                                 e.g., VecAXPY() for real vectors of length N --> 2N flop
                                 and VecAXPY() for complex vectors of length N --> 8N flop

    Summary of Stages:   ----- Time ------  ----- Flop ------  --- Messages ---  -- Message Lengths --  -- Reductions --
                            Avg     %Total     Avg     %Total    Count   %Total     Avg         %Total    Count   %Total
     0:      Main Stage: 1.5061e-03  99.3%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  0.000e+00   0.0%

    ------------------------------------------------------------------------------------------------------------------------
    See the 'Profiling' chapter of the users' manual for details on interpreting output.
    Phase summary info:
        Count: number of times phase was executed
        Time and Flop: Max - maximum over all processors
                       Ratio - ratio of maximum to minimum over all processors
        Mess: number of messages sent
        AvgLen: average message length (bytes)
        Reduct: number of global reductions
        Global: entire computation
        Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
           %T - percent time in this phase         %F - percent flop in this phase
           %M - percent messages in this phase     %L - percent message lengths in this phase
           %R - percent reductions in this phase
        Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors)
    ------------------------------------------------------------------------------------------------------------------------


           ##########################################################
           #                                                        #
           #                       WARNING!!!                       #
           #                                                        #
           #   This code was compiled with a debugging option.      #
           #   To get timing results run ./configure                #
           #   using --with-debugging=no, the performance will      #
           #   be generally two or three times faster.              #
           #                                                        #
           ##########################################################


    Event                Count      Time (sec)     Flop                              --- Global ---  --- Stage ----  Total
                           Max Ratio  Max     Ratio   Max  Ratio  Mess   AvgLen  Reduct  %T %F %M %L %R  %T %F %M %L %R  Mflop/s
    ------------------------------------------------------------------------------------------------------------------------

    --- Event Stage 0: Main Stage

    ------------------------------------------------------------------------------------------------------------------------

    Memory usage is given in bytes:

    Object Type          Creations   Destructions     Memory  Descendants' Mem.
    Reports information only for process 0.

    --- Event Stage 0: Main Stage

                   Viewer     1              0            0     0.
    ========================================================================================================================
    Average time to get PetscTime(): 4.38071e-08
    #PETSc Option Table entries:
    -info
    -ksp_monitor
    -ksp_type cg
    -log_view
    -options_left
    -pc_gamg_agg_nsmooths 1
    -pc_gamg_esteig_ksp_type cg
    -pc_gamg_type agg
    -pc_type gamg
    #End of PETSc Option Table entries
    Compiled without FORTRAN kernels
    Compiled with full precision matrices (default)
    sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
    Configure options: --download-parmetis --download-superlu_dist --download-openmpi --download-ml --download-hypre --download-metis --download-mumps --download-scalapack --download-blacs
    -----------------------------------------
    Libraries compiled on 2020-06-13 05:13:15 on vulture.ce.berkeley.edu
    Machine characteristics: Linux-5.6.13-100.fc30.x86_64-x86_64-with-fedora-30-Thirty
    Using PETSc directory: /home/sg/petsc-3.13.2
    Using PETSc arch: gnu
    -----------------------------------------

    Using C compiler: /home/sg/petsc-3.13.2/gnu/bin/mpicc  -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -g3
    Using Fortran compiler: /home/sg/petsc-3.13.2/gnu/bin/mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g
    -----------------------------------------

    Using include paths: -I/home/sg/petsc-3.13.2/include -I/home/sg/petsc-3.13.2/gnu/include
    -----------------------------------------

    Using C linker: /home/sg/petsc-3.13.2/gnu/bin/mpicc
    Using Fortran linker: /home/sg/petsc-3.13.2/gnu/bin/mpif90
    Using libraries: -Wl,-rpath,/home/sg/petsc-3.13.2/gnu/lib -L/home/sg/petsc-3.13.2/gnu/lib -lpetsc -Wl,-rpath,/home/sg/petsc-3.13.2/gnu/lib -L/home/sg/petsc-3.13.2/gnu/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/9 -L/usr/lib/gcc/x86_64-redhat-linux/9 -lHYPRE -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -lsuperlu_dist -lml -llapack -lblas -lX11 -lparmetis -lmetis -lm -lstdc++ -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lstdc++ -ldl
    -----------------------------------------



           ##########################################################
           #                                                        #
           #                       WARNING!!!                       #
           #                                                        #
           #   This code was compiled with a debugging option.      #
           #   To get timing results run ./configure                #
           #   using --with-debugging=no, the performance will      #
           #   be generally two or three times faster.              #
           #                                                        #
           ##########################################################


    [0] Petsc_DelViewer(): Removing viewer data attribute in an MPI_Comm 28408096
    [0] Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc communicator embedded in a user MPI_Comm 28408096
    [0] Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 140367846615616 is being unlinked from inner PETSc comm 28408096
    [0] PetscCommDestroy(): Deleting PETSc MPI_Comm 28408096
    [0] Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an MPI_Comm 28408096
    [0] PetscCommDuplicate(): Duplicating a communicator 140367846615616 28408096 max tags = 2147483647
    #PETSc Option Table entries:
    -info
    -ksp_monitor
    -ksp_type cg
    -log_view
    -options_left
    -pc_gamg_agg_nsmooths 1
    -pc_gamg_esteig_ksp_type cg
    -pc_gamg_type agg
    -pc_type gamg
    #End of PETSc Option Table entries
    [0] Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc communicator embedded in a user MPI_Comm 28408096
    [0] Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 140367846615616 is being unlinked from inner PETSc comm 28408096
    [0] PetscCommDestroy(): Deleting PETSc MPI_Comm 28408096
    [0] Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an MPI_Comm 28408096
    WARNING! There are options you set that were not used!
    WARNING! could be spelling mistake, etc!
    There are 6 unused database options. They are:
    Option left: name:-ksp_monitor (no value)
    Option left: name:-ksp_type value: cg
    Option left: name:-pc_gamg_agg_nsmooths value: 1
    Option left: name:-pc_gamg_esteig_ksp_type value: cg
    Option left: name:-pc_gamg_type value: agg
    Option left: name:-pc_type value: gamg
    [sg@vulture main]$
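
Since the divide by zero originates inside the system LAPACK (the
dgesvd_/dlasq2_ frames in the trace quoted below), a standalone check of
that same entry point may help isolate whether /lib64/liblapack.so.3 itself
is at fault. A minimal sketch, with arbitrary matrix entries, built with
"gcc svdtest.c -llapack":

    #include <stdio.h>

    /* Fortran LAPACK entry point (gfortran underscore name mangling). */
    extern void dgesvd_(const char *jobu, const char *jobvt, int *m, int *n,
                        double *a, int *lda, double *s, double *u, int *ldu,
                        double *vt, int *ldvt, double *work, int *lwork,
                        int *info);

    int main(void)
    {
      int    n = 3, one = 1, lwork = 64, info = 0;
      double a[9] = {4, 1, 0, 1, 3, 1, 0, 1, 2}; /* arbitrary symmetric 3x3, column-major */
      double s[3], work[64], udummy, vtdummy;

      /* "N","N": singular values only; U and VT are not referenced. */
      dgesvd_("N", "N", &n, &n, a, &n, s, &udummy, &one, &vtdummy, &one,
              work, &lwork, &info);
      printf("dgesvd_ info = %d, sigma = %g %g %g\n", info, s[0], s[1], s[2]);
      return info;
    }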



On 6/13/20 5:10 AM, Mark Adams wrote:
> That is odd. Are these problems symmetric positive definite?
>
> Eigen estimates are a pain in practice but I've never seen this. Hypre 
> has (better) smoothers that don't need this and the AMG algorithm does 
> not need them either. I think ML does pretty much the same thing as me.
>
> If SPD then you definitely want '-pc_gamg_esteig_ksp_type cg'. CG
> converges faster and is more robust. Even if not strictly SPD, as with
> large deformation or plasticity, CG is probably better. You can run
> with -info and grep on GAMG to see what (high) eigen estimates it came
> up with for each level. They should be > 1 and < 4, or so.
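>
> For example, something like (an illustrative launch line, your driver
> and options may differ):
>
>     ./feap -ksp_type cg -pc_type gamg -info 2>&1 | grep GAMG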
>
> I wonder if your LAPACK is funky. You might try a fresh build (delete 
> the "arch" directory) and download LAPACK, but this problem will go 
> away with CG.
>
> Note, if you use Chebyshev smoothing then it needs eigen estimates
> also. This is usually where eigen estimate problems come up. If you
> use a jacobi PC in the smoother, GAMG will use the eigen estimate from
> the smoothed aggregation smoothing (what is failing for you here).
> -pc_gamg_use_sa_esteig <T,F> will force it to use, or not use, it. For
> instance, if you use sor/ilu/asm then the eigen estimates are probably
> fine to reuse, but GAMG does not reuse them by default. If the Chebyshev
> eigen estimates are low then the solver fails. There is a safety factor
> that increases the eigen estimate in Chebyshev to avoid this failure.
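>
> For example, something along these lines (illustrative, not from this
> run):
>
>     -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi
>     -pc_gamg_use_sa_esteig true -pc_gamg_esteig_ksp_type cg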
>
> And, -pc_gamg_esteig_ksp_max_it X sets the number of iterations in the 
> estimator. You can increase this and you should see your eigen 
> estimates go up and converge. You should see that CG converges much 
> faster than GMRES (the default).
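>
> For example (illustrative): -pc_gamg_esteig_ksp_max_it 20, then watch the
> reported estimates in the -info output go up and settle as you increase it.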
>
>
>
>
> On Sat, Jun 13, 2020 at 2:32 AM Sanjay Govindjee <s_g at berkeley.edu> wrote:
>
>     I have a FEA problem that I am trying to solve with GAMG.  The
>     problem solves
>     just fine with direct solvers (mumps, superlu) and iterative
>     solvers (gmres, ml, hypre-boomer) etc.
>
>     However, with GAMG I am getting a divide by zero that I am having
>     trouble tracking down.  Below are the gdb stack trace and the
>     source lines going up the stack.
>
>     When I run under valgrind, the problem runs fine (and gets the correct
>     answer). Valgrind reports nothing of note (just lots of indirectly lost
>     blocks related to PMP_INIT).
>
>     I'm only running on one processor.
>
>     Any suggestions on where to start to trace the problem?
>
>     -sanjay
>
>         #0  0x00007fb262dc5be1 in ieeeck_ () from /lib64/liblapack.so.3
>         #1  0x00007fb262dc5332 in ilaenv_ () from /lib64/liblapack.so.3
>         #2  0x00007fb262dbbcef in dlasq2_ () from /lib64/liblapack.so.3
>         #3  0x00007fb262dbb78c in dlasq1_ () from /lib64/liblapack.so.3
>         #4  0x00007fb262da1e2e in dbdsqr_ () from /lib64/liblapack.so.3
>         #5  0x00007fb262960110 in dgesvd_ () from /lib64/liblapack.so.3
>         #6  0x00007fb264e74b66 in
>         KSPComputeExtremeSingularValues_GMRES (ksp=0x1816560,
>         emax=0x7ffc5010e7c8, emin=0x7ffc5010e7d0) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/impls/gmres/gmreig.c:32
>         #7  0x00007fb264dfe69a in KSPComputeExtremeSingularValues
>         (ksp=0x1816560, emax=0x7ffc5010e7c8, emin=0x7ffc5010e7d0) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/itfunc.c:64
>         #8  0x00007fb264b44a1f in PCGAMGOptProlongator_AGG
>         (pc=0x12f3d30, Amat=0x11a2630, a_P=0x7ffc5010ebe0) at
>         /home/sg/petsc-3.13.2/src/ksp/pc/impls/gamg/agg.c:1145
>         #9  0x00007fb264b248a1 in PCSetUp_GAMG (pc=0x12f3d30) at
>         /home/sg/petsc-3.13.2/src/ksp/pc/impls/gamg/gamg.c:557
>         #10 0x00007fb264d8535b in PCSetUp (pc=0x12f3d30) at
>         /home/sg/petsc-3.13.2/src/ksp/pc/interface/precon.c:898
>         #11 0x00007fb264e01a93 in KSPSetUp (ksp=0x128dd80) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/itfunc.c:376
>         #12 0x00007fb264e057af in KSPSolve_Private (ksp=0x128dd80,
>         b=0x1259f30, x=0x125d910) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/itfunc.c:633
>         #13 0x00007fb264e086b9 in KSPSolve (ksp=0x128dd80,
>         b=0x1259f30, x=0x125d910) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/itfunc.c:853
>         #14 0x00007fb264e46216 in kspsolve_ (ksp=0x832670
>         <__pfeapc_MOD_kspsol>, b=0x832698 <__pfeapc_MOD_rhs>,
>         x=0x8326a0 <__pfeapc_MOD_sol>, __ierr=0x7ffc5010f358)
>             at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/ftn-auto/itfuncf.c:266
>         #15 0x000000000043298d in usolve (flags=..., b=...) at
>         usolve.F:313
>         #16 0x000000000044afba in psolve (stype=-3, b=..., fp=...,
>         factor=.TRUE., solve=.TRUE., cfr=.FALSE., prnt=.TRUE.) at
>         psolve.f:212
>         #17 0x00000000006b7393 in pmacr1 (lct=..., ct=..., j=3,
>         _lct=_lct at entry=15) at pmacr1.f:578
>         #18 0x00000000005c247b in pmacr (initf=.FALSE.) at pmacr.f:578
>         #19 0x000000000044ff20 in pcontr () at pcontr.f:1307
>         #20 0x0000000000404d9b in feap () at feap86.f:162
>         #21 main (argc=<optimized out>, argv=<optimized out>) at
>         feap86.f:168
>         #22 0x00007fb261aaef43 in __libc_start_main () from
>         /lib64/libc.so.6
>         #23 0x0000000000404dde in _start ()
>
>         (gdb) list
>         1       <built-in>: No such file or directory.
>         (gdb) up
>         #1  0x00007fb262dc5332 in ilaenv_ () from /lib64/liblapack.so.3
>         (gdb) up
>         #2  0x00007fb262dbbcef in dlasq2_ () from /lib64/liblapack.so.3
>         (gdb) up
>         #3  0x00007fb262dbb78c in dlasq1_ () from /lib64/liblapack.so.3
>         (gdb) up
>         #4  0x00007fb262da1e2e in dbdsqr_ () from /lib64/liblapack.so.3
>         (gdb) up
>         #5  0x00007fb262960110 in dgesvd_ () from /lib64/liblapack.so.3
>         (gdb) up
>         #6  0x00007fb264e74b66 in
>         KSPComputeExtremeSingularValues_GMRES (ksp=0x1816560,
>         emax=0x7ffc5010e7c8, emin=0x7ffc5010e7d0) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/impls/gmres/gmreig.c:32
>         32          PetscStackCallBLAS("LAPACKgesvd",LAPACKgesvd_("N","N",&bn,&bn,R,&bN,realpart,&sdummy,&idummy,&sdummy,&idummy,work,&lwork,&lierr));
>         (gdb) up
>         #7  0x00007fb264dfe69a in KSPComputeExtremeSingularValues
>         (ksp=0x1816560, emax=0x7ffc5010e7c8, emin=0x7ffc5010e7d0) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/itfunc.c:64
>         64          ierr = (*ksp->ops->computeextremesingularvalues)(ksp,emax,emin);CHKERRQ(ierr);
>         (gdb) up
>         #8  0x00007fb264b44a1f in PCGAMGOptProlongator_AGG
>         (pc=0x12f3d30, Amat=0x11a2630, a_P=0x7ffc5010ebe0) at
>         /home/sg/petsc-3.13.2/src/ksp/pc/impls/gamg/agg.c:1145
>         1145          ierr = KSPComputeExtremeSingularValues(eksp, &emax, &emin);CHKERRQ(ierr);
>         (gdb) up
>         #9  0x00007fb264b248a1 in PCSetUp_GAMG (pc=0x12f3d30) at
>         /home/sg/petsc-3.13.2/src/ksp/pc/impls/gamg/gamg.c:557
>         557               ierr = pc_gamg->ops->optprolongator(pc, Aarr[level], &Prol11);CHKERRQ(ierr);
>         (gdb) up
>         #10 0x00007fb264d8535b in PCSetUp (pc=0x12f3d30) at
>         /home/sg/petsc-3.13.2/src/ksp/pc/interface/precon.c:898
>         898         ierr = (*pc->ops->setup)(pc);CHKERRQ(ierr);
>         (gdb) up
>         #11 0x00007fb264e01a93 in KSPSetUp (ksp=0x128dd80) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/itfunc.c:376
>         376       ierr = PCSetUp(ksp->pc);CHKERRQ(ierr);
>         (gdb) up
>         #12 0x00007fb264e057af in KSPSolve_Private (ksp=0x128dd80,
>         b=0x1259f30, x=0x125d910) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/itfunc.c:633
>         633       ierr = KSPSetUp(ksp);CHKERRQ(ierr);
>         (gdb) up
>         #13 0x00007fb264e086b9 in KSPSolve (ksp=0x128dd80,
>         b=0x1259f30, x=0x125d910) at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/itfunc.c:853
>         853       ierr = KSPSolve_Private(ksp,b,x);CHKERRQ(ierr);
>         (gdb) up
>         #14 0x00007fb264e46216 in kspsolve_ (ksp=0x832670
>         <__pfeapc_MOD_kspsol>, b=0x832698 <__pfeapc_MOD_rhs>,
>         x=0x8326a0 <__pfeapc_MOD_sol>, __ierr=0x7ffc5010f358)
>             at
>         /home/sg/petsc-3.13.2/src/ksp/ksp/interface/ftn-auto/itfuncf.c:266
>         266     *__ierr = KSPSolve(
>         (gdb) up
>         #15 0x000000000043298d in usolve (flags=..., b=...) at
>         usolve.F:313
>         313               call KSPSolve         (kspsol, rhs, sol, ierr)
>
>
>


