[petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve

Mark Adams mfadams at lbl.gov
Sat Jan 7 21:57:10 CST 2017


This error seems to be coming from the computation of the extreme
eigenvalues of the matrix, which is used for smoothing in smoothed
aggregation.

Are you getting good solutions with hypre? This error looks like it might
just be the first place where a messed-up matrix fails inside GAMG.
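
A quick way to check, as a sketch (adjust the executable name and process
count to your run):

  mpiexec -n 2 ./ucmsMR -ksp_type gcr -pc_type hypre -ksp_monitor_true_residual -ksp_converged_reason

If the true residual stalls or the solve diverges there, the assembled
matrix is the likely suspect rather than anything specific to GAMG.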

On Sat, Jan 7, 2017 at 10:23 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Sat, Jan 7, 2017 at 7:38 PM, Manuel Valera <mvalera at mail.sdsu.edu>
> wrote:
>
>> OK, great. I tried those command-line args, and this is the result:
>>
>> When I use -pc_type gamg:
>>
>> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>
>> [1]PETSC ERROR: Petsc has generated inconsistent data
>>
>> [1]PETSC ERROR: Have un-symmetric graph (apparently). Use
>> '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold
>> -1.0' if the matrix is structurally symmetric.
>>
>> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>>
>> [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown
>>
>> [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera
>> Sat Jan  7 17:35:05 2017
>>
>> [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
>> --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5
>> --download-netcdf --download-hypre --download-metis --download-parmetis
>> --download-trillinos
>>
>> [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
>>
>> [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
>>
>> [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c
>>
>> [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c
>>
>> [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
>>
>> application called MPI_Abort(comm=0x84000002, 77) - process 1
>>
>>
>> When I use -pc_type gamg and -pc_gamg_sym_graph true:
>>
>
> Do everything Barry said.
>
> However, I would like to track down this error. It could be a bug in our
> code, but it appears to happen in the call to LAPACK, so it could also be
> a problem with that library on your machine. Could you run this case in
> the debugger and give the stack trace?
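>
> One way to do that, as a sketch (assuming gdb and X11 are available on
> the machine), is PETSc's built-in option:
>
>   mpiexec -n 2 ./ucmsMR -pc_type gamg -pc_gamg_sym_graph true -start_in_debugger
>
> which attaches a debugger to each rank in its own xterm; typing "bt" at
> the gdb prompt after the FPE fires prints the stack trace we need.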
>
>   Thanks,
>
>      Matt
>
>
>>  ------------------------------------------------------------------------
>>
>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero
>>
>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>>
>> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>
>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
>>
>> [1]PETSC ERROR: ------------------------------------------------------------------------
>>
>> [1]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
>>
>> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
>>
>> [1]PETSC ERROR:       INSTEAD the line number of the start of the function
>>
>> [1]PETSC ERROR:       is given.
>>
>> [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c
>>
>> [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c
>>
>> [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
>>
>> [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
>>
>> [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c
>>
>> [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c
>>
>> [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
>>
>> [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c
>>
>> [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c
>>
>> [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c
>>
>> [0]PETSC ERROR: [0] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c
>>
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>
>>
>> When I use -pc_type hypre, -ksp_view actually shows something different:
>>
>>
>> KSP Object: 2 MPI processes
>>
>>   type: gcr
>>
>>     GCR: restart = 30
>>
>>     GCR: restarts performed = 37
>>
>>   maximum iterations=10000, initial guess is zero
>>
>>   tolerances:  relative=1e-14, absolute=1e-50, divergence=10000.
>>
>>   right preconditioning
>>
>>   using UNPRECONDITIONED norm type for convergence test
>>
>> PC Object: 2 MPI processes
>>
>>   type: hypre
>>
>>     HYPRE BoomerAMG preconditioning
>>
>>     HYPRE BoomerAMG: Cycle type V
>>
>>     HYPRE BoomerAMG: Maximum number of levels 25
>>
>>     HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1
>>
>>     HYPRE BoomerAMG: Convergence tolerance PER hypre call 0.
>>
>>     HYPRE BoomerAMG: Threshold for strong coupling 0.25
>>
>>     HYPRE BoomerAMG: Interpolation truncation factor 0.
>>
>>     HYPRE BoomerAMG: Interpolation: max elements per row 0
>>
>>     HYPRE BoomerAMG: Number of levels of aggressive coarsening 0
>>
>>     HYPRE BoomerAMG: Number of paths for aggressive coarsening 1
>>
>>     HYPRE BoomerAMG: Maximum row sums 0.9
>>
>>     HYPRE BoomerAMG: Sweeps down         1
>>
>>     HYPRE BoomerAMG: Sweeps up           1
>>
>>     HYPRE BoomerAMG: Sweeps on coarse    1
>>
>>     HYPRE BoomerAMG: Relax down          symmetric-SOR/Jacobi
>>
>>     HYPRE BoomerAMG: Relax up            symmetric-SOR/Jacobi
>>
>>     HYPRE BoomerAMG: Relax on coarse     Gaussian-elimination
>>
>>     HYPRE BoomerAMG: Relax weight  (all)      1.
>>
>>     HYPRE BoomerAMG: Outer relax weight (all) 1.
>>
>>     HYPRE BoomerAMG: Using CF-relaxation
>>
>>     HYPRE BoomerAMG: Not using more complex smoothers.
>>
>>     HYPRE BoomerAMG: Measure type        local
>>
>>     HYPRE BoomerAMG: Coarsen type        Falgout
>>
>>     HYPRE BoomerAMG: Interpolation type  classical
>>
>>     HYPRE BoomerAMG: Using nodal coarsening (with HYPRE_BOOMERAMGSetNodal() 1
>>
>>     HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1
>>
>>   linear system matrix = precond matrix:
>>
>>   Mat Object:   2 MPI processes
>>
>>     type: mpiaij
>>
>>     rows=200000, cols=200000
>>
>>     total: nonzeros=3373340, allocated nonzeros=3373340
>>
>>     total number of mallocs used during MatSetValues calls =0
>>
>>       not using I-node (on process 0) routines
>>
>>
>>
>> But the timing is still terrible.
>>
>>
>>
>>
>>
>> On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown <jed at jedbrown.org> wrote:
>>
>>> Manuel Valera <mvalera at mail.sdsu.edu> writes:
>>>
>>> > Awesome, Matt and Jed.
>>> >
>>> > GCR is used because the matrix is not invertible and because it was
>>> > the algorithm that the previous library used.
>>> >
>>> > The preconditioner I'm aiming to use is multigrid. I thought I had
>>> > configured the hypre BoomerAMG solver for this, but I agree that it
>>> > doesn't show up in the log anywhere; how can I be sure it is being
>>> > used? I sent the -ksp_view log earlier in this thread.
>>>
>>> Did you run with -pc_type hypre?
>>>
>>> > I had a problem with the matrix block sizes, so I couldn't get the
>>> > PETSc native multigrid solver to work.
>>>
>>> What block sizes?  If the only variable is pressure, the block size
>>> would be 1 (default).
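>>>
>>> (A sketch, with A as a placeholder name for your assembled matrix: a
>>> scalar pressure solve wants the default,
>>>
>>>   ierr = MatSetBlockSize(A, 1);CHKERRQ(ierr);  /* one unknown per grid point */
>>>
>>> and a larger block size is only meaningful for multi-component
>>> problems, where it tells the solver how many dofs sit on each point.)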
>>>
>>> > This is a nonhydrostatic pressure solver; it is an elliptic problem,
>>> > so multigrid is a must.
>>>
>>> Yes, multigrid should work well.
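>>>
>>> (For a scalar elliptic problem a reasonable starting point, as a rough
>>> sketch rather than a tuned setup, is
>>>
>>>   -ksp_type gcr -pc_type gamg -pc_gamg_sym_graph true
>>>
>>> or -pc_type hypre as you have; with a working multigrid setup the
>>> iteration count should stay roughly constant as the mesh is refined.)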
>>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>



-- 
The secret to doing good research is always to be a little underemployed.
You waste years by not being able to waste hours. -- Amos Tversky