[petsc-dev] The errors when running petsc/src/snes/examples/tutorials/ex19.c in PETSc

Peter Brune prbrune at gmail.com
Tue May 7 13:14:41 CDT 2013


I can't reproduce this.  Have you pulled master recently?  Those line
numbers don't correspond to anything, and that file hasn't changed much
in a while.

- Peter
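
[For reference, updating to the latest master and rebuilding, as Peter suggests, typically looks like the sketch below. The PETSC_DIR path and the exact make targets are assumptions; adjust them to your own installation and configure setup.]

```shell
# Update the PETSc source tree to the latest master and rebuild.
# PETSC_DIR is an assumption here; set it to your checkout,
# e.g. export PETSC_DIR=/Users/liul/soft/petsc
cd "$PETSC_DIR"
git checkout master
git pull                  # bring in the latest commits on master
make all                  # rebuild the PETSc libraries

# Rebuild the tutorial example and rerun the failing case.
cd src/snes/examples/tutorials
make ex19
mpirun -n 2 ./ex19 -da_refine 2 -da_overlap 2 -snes_monitor_short \
    -snes_type aspin -grashof 4e4 -lidvelocity 100 -snes_view
```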


On Tue, May 7, 2013 at 11:31 AM, Lulu Liu <lulu.liu at kaust.edu.sa> wrote:

> mpirun -n 2  ./ex19 -da_refine 2 -da_overlap 2 -snes_monitor_short
> -snes_type aspin  -grashof 4e4 -lidvelocity 100   -snes_view
>
>  0 SNES Function norm 5381.22
>   1 SNES Function norm 3104.04
>   2 SNES Function norm 777.712
>   3 SNES Function norm 45.9649
>   4 SNES Function norm 0.195115
>   5 SNES Function norm 6.28208e-06
> SNES Object: 2 MPI processes
>   type: newtonls
>   maximum iterations=50, maximum function evaluations=10000
>   tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
>   total number of linear solver iterations=47
>   total number of function evaluations=0
>   SNESLineSearch Object:   2 MPI processes
>     type: bt
>       interpolation: cubic
>       alpha=1.000000e-04
>     maxstep=1.000000e+08, minlambda=1.000000e-12
>     tolerances: relative=1.000000e-08, absolute=1.000000e-15,
> lambda=1.000000e-08
>     maximum iterations=40
>   SNES Object:  (npc_)   2 MPI processes
>     type: nasm
>       Nonlinear Additive Schwarz: total subdomain blocks = 2
>       Nonlinear Additive Schwarz: restriction/interpolation type - BASIC
>       Nonlinear Additive Schwarz: subSNES iterations: 114022506 subKSP
> iterations: 564006000
>       [0] number of local blocks = 1
>       [1] number of local blocks = 1
>       Local SNES objects:
>       SNES Object:      (npc_sub_)       1 MPI processes
>         type: newtonls
>         maximum iterations=50, maximum function evaluations=10000
>         tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
>         total number of linear solver iterations=1
>         total number of function evaluations=3
>         SNESLineSearch Object:        (npc_sub_)         1 MPI processes
>           type: bt
>             interpolation: cubic
>             alpha=1.000000e-04
>           maxstep=1.000000e+08, minlambda=1.000000e-12
>           tolerances: relative=1.000000e-08, absolute=1.000000e-15,
> lambda=1.000000e-08
>           maximum iterations=40
>         KSP Object:        (npc_sub_)         1 MPI processes
>           type: preonly
>           maximum iterations=10000, initial guess is zero
>           tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>           left preconditioning
>           using NONE norm type for convergence test
>         PC Object:        (npc_sub_)         1 MPI processes
>           type: lu
>             LU: out-of-place factorization
>             tolerance for zero pivot 2.22045e-14
>             matrix ordering: nd
>             factor fill ratio given 5, needed 2.95194
>               Factored matrix follows:
>                 Matrix Object:                 1 MPI processes
>                   type: seqaij
>                   rows=468, cols=468, bs=4
>                   package used to perform factorization: petsc
>                   total: nonzeros=25552, allocated nonzeros=25552
>                   total number of mallocs used during MatSetValues calls =0
>                     using I-node routines: found 117 nodes, limit used is 5
>           linear system matrix = precond matrix:
>           Matrix Object:           1 MPI processes
>             type: seqaij
>             rows=468, cols=468, bs=4
>             total: nonzeros=8656, allocated nonzeros=8656
>             total number of mallocs used during MatSetValues calls =0
>               using I-node routines: found 117 nodes, limit used is 5
> [1]PETSC ERROR: PetscCommDuplicate() line 183 in
> /Users/liul/soft/petsc/src/sys/objects/tagm.c
> [1]PETSC ERROR: PetscSynchronizedFlush() line 450 in
> /Users/liul/soft/petsc/src/sys/fileio/mprint.c
> [1]PETSC ERROR: PetscViewerFlush_ASCII() line 135 in
> /Users/liul/soft/petsc/src/sys/classes/viewer/impls/ascii/filev.c
> [1]PETSC ERROR: PetscViewerFlush() line 30 in
> /Users/liul/soft/petsc/src/sys/classes/viewer/interface/flush.c
> [1]PETSC ERROR: SNESView_NASM() line 240 in
> /Users/liul/soft/petsc/src/snes/impls/nasm/nasm.c
> [1]PETSC ERROR: SNESView() line 251 in
> /Users/liul/soft/petsc/src/snes/interface/snes.c
> [1]PETSC ERROR: SNESView() line 336 in
> /Users/liul/soft/petsc/src/snes/interface/snes.c
> [1]PETSC ERROR: SNESSolve() line 3804 in
> /Users/liul/soft/petsc/src/snes/interface/snes.c
> [1]PETSC ERROR: main() line 157 in src/snes/examples/tutorials/ex19.c
> application called MPI_Abort(MPI_COMM_WORLD, 872626446) - process 1
> [cli_1]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 872626446) - process 1
> --
> Best wishes,
> Lulu Liu
> Applied Mathematics and Computational Science
> King Abdullah University of Science and Technology
> Tel:+966-0544701599
>
