[petsc-users] problem with -ts_type gl

Dominik Szczerba dominik at itis.ethz.ch
Wed Nov 9 07:57:05 CST 2011


Please run in a debugger...
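For example, since the crash already happens with a single process, you could use the option that the error message itself suggests (same command line as in your message, without valgrind; adjust the other options to match your run):

    mpiexec -n 1 ./hoac blasius <your other options> -ts_type gl -start_in_debugger gdb

and type 'bt' in gdb when it stops at the SEGV; that should give the exact line inside TSGLChooseNextScheme where the jump to address 0x0 happens.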

On Wed, Nov 9, 2011 at 2:51 PM, Konstantinos Kontzialis
<ckontzialis at lycos.com> wrote:
> Dear all,
>
> I run my code implicitly for the boundary layer over a flat plate with:
>
> mpiexec -n 1 valgrind ./hoac blasius -snes_mf_operator -llf_flux -n_out 10
> -end_time 50 -implicit -pc_type bjacobi -sub_pc_type ilu
> -sub_pc_factor_mat_ordering_type rcm -gl -ksp_type fgmres
> -sub_pc_factor_levels 2 -snes_monitor -snes_converged_reason
> -ksp_converged_reason -ts_view -ksp_pc_side right -sub_pc_factor_levels 4
> -ksp_gmres_restart 500 -dt 1.0e-3 -snes_ksp_ew -ts_type gl
>
> and I got:
>
>
>
> Approximation order = 0
> # DOF = 9600
> # nodes in mesh = 1281
> # elements in mesh = 1200
> Navier-Stokes solution
> Using LLF flux
>
>
> Linear solve converged due to CONVERGED_RTOL iterations 1
>
>
> Timestep   0: dt = 0.001, T = 0, Res[rho] = 2.06015e-10, Res[rhou] =
> 23.9721, Res[rhov] = 0.00322747, Res[E] = 0.00680121, CFL = 199.999
>     0 SNES Function norm 1.660837592895e+03
>     Linear solve converged due to CONVERGED_RTOL iterations 1
>     1 SNES Function norm 4.452765833604e+01
>     Linear solve converged due to CONVERGED_RTOL iterations 2
>     2 SNES Function norm 2.226427753100e+00
>     Linear solve converged due to CONVERGED_RTOL iterations 3
>     3 SNES Function norm 1.378423579772e-02
>     Linear solve converged due to CONVERGED_RTOL iterations 5
>     4 SNES Function norm 2.392525805814e-06
>   Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE
>     0 SNES Function norm 2.697720824971e+03
>     Linear solve converged due to CONVERGED_RTOL iterations 1
>     1 SNES Function norm 6.544960638431e+01
>     Linear solve converged due to CONVERGED_RTOL iterations 2
>     2 SNES Function norm 3.215472486217e+00
>     Linear solve converged due to CONVERGED_RTOL iterations 3
>     3 SNES Function norm 1.962042780514e-02
>     Linear solve converged due to CONVERGED_RTOL iterations 5
>     4 SNES Function norm 3.666057800237e-06
>   Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE
> ==2767== Jump to the invalid address stated on the next line
> ==2767==    at 0x0: ???
> ==2767==    by 0x585DB5E: TSGLChooseNextScheme (gl.c:795)
> ==2767==    by 0x585F12A: TSSolve_GL (gl.c:948)
> ==2767==    by 0x5881DD3: TSSolve (ts.c:1848)
> ==2767==    by 0x4272CC: implicit_time (implicit_time.c:77)
> ==2767==    by 0x4267B3: main (hoac.c:1175)
> ==2767==  Address 0x0 is not stack'd, malloc'd or (recently) free'd
> ==2767==
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find
> memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] TSGLAdaptChoose line 232
> /home/kontzialis/petsc-3.2-p5/src/ts/impls/implicit/gl/gladapt.c
> [0]PETSC ERROR: [0] TSGLChooseNextScheme line 774
> /home/kontzialis/petsc-3.2-p5/src/ts/impls/implicit/gl/gl.c
> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 5, Sat Oct 29 13:45:54
> CDT 2011
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: ./hoac on a linux-gnu named PlusSodaL by kontzialis Wed Nov
> 9 13:11:45 2011
> [0]PETSC ERROR: Libraries linked from
> /home/kontzialis/petsc-3.2-p5/linux-gnu-c-debug/lib
> [0]PETSC ERROR: Configure run at Sat Nov  5 20:58:12 2011
> [0]PETSC ERROR: Configure options --with-debugging=1
> ---with-mpi-dir=/usr/lib64/mpich2/bin --with-shared-libraries
> --with-shared-libraries --with-large-file-io=1 --with-precision=double
> --with-blacs=1 --download-blacs=yes --download-f-blas-lapack=yes
> --with-plapack=1 --download-plapack=yes --with-scalapack=1
> --download-scalapack=yes --with-superlu=1 --download-superlu=yes
> --with-superlu_dist=1 --download-superlu_dist=yes --with-ml=1
> --download-ml=yes --with-umfpack=1 --download-umfpack=yes --with-mpi=1
> --download-mpich=1 --with-sundials=1 --download-sundials=1 --with-parmetis=1
> --download-parmetis=1 --with-hypre=1 --download-hypre=1
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown
> file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> ==2767==
> ==2767== HEAP SUMMARY:
> ==2767==     in use at exit: 10,732,979 bytes in 19,630 blocks
> ==2767==   total heap usage: 53,585 allocs, 33,955 frees, 2,757,452,468
> bytes allocated
> ==2767==
> ==2767== LEAK SUMMARY:
> ==2767==    definitely lost: 1,114 bytes in 24 blocks
> ==2767==    indirectly lost: 24 bytes in 3 blocks
> ==2767==      possibly lost: 0 bytes in 0 blocks
> ==2767==    still reachable: 10,731,841 bytes in 19,603 blocks
> ==2767==         suppressed: 0 bytes in 0 blocks
> ==2767== Rerun with --leak-check=full to see details of leaked memory
>
> What am I doing wrong?
>
> Thank you,
>
> Kostas
>