[petsc-dev] Floating Point Error in VecSet

Matthew Knepley knepley at gmail.com
Tue Jul 22 10:20:13 CDT 2014


On Tue, Jul 22, 2014 at 10:09 AM, John Mousel <john.mousel at gmail.com> wrote:

> I'm getting a floating point error in VecSet. I see a similar error
> reported in a nightly log here:
>
>
> http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2014/06/18/examples_master_arch-opensolaris-cmplx-pkgs-dbg_n-gage.log
>
> I've attached the error message below. Any suggestions?
>

rvector.c:581

  if (val > PETSC_MAX_REAL/x->map->N) {

It looks like someone has a moral objection to 0-size vectors: when the
vector's global length x->map->N is zero, that division is a divide by zero,
which raises SIGFPE on builds that trap floating-point exceptions. If no one
fixes this before I get back next week, I will do it.
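
A minimal standalone sketch of the guard (the norm_would_overflow helper
and the test harness here are illustrative, not the actual PETSc code):

  #include <float.h>
  #include <math.h>
  #include <stdio.h>

  /* The overflow test from rvector.c:581 with an N > 0 guard added, so an
   * empty vector short-circuits before the division; DBL_MAX / N is then
   * never evaluated for N == 0, and no SIGFPE can fire even when
   * floating-point trapping is enabled. */
  static int norm_would_overflow(double val, long N)
  {
    return N > 0 && fabs(val) > DBL_MAX / (double)N;
  }

  int main(void)
  {
    printf("%d\n", norm_would_overflow(1.0, 0));     /* 0: empty vector, no division */
    printf("%d\n", norm_would_overflow(DBL_MAX, 2)); /* 1: 2*DBL_MAX would overflow */
    return 0;
  }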

  Thanks,

     Matt


> Thanks,
>
> John
>
>
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point
> Exception, probably divide by zero
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
> to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] VecSet line 568
> /Users/jmousel/SOFT/petsc/src/vec/vec/interface/rvector.c
> [0]PETSC ERROR: [0] VecCreate_Seq line 34
> /Users/jmousel/SOFT/petsc/src/vec/vec/impls/seq/bvec3.c
> [0]PETSC ERROR: [0] VecSetType line 38
> /Users/jmousel/SOFT/petsc/src/vec/vec/interface/vecreg.c
> [0]PETSC ERROR: [0] VecCreateSeq line 36
> /Users/jmousel/SOFT/petsc/src/vec/vec/impls/seq/vseqcr.c
> [0]PETSC ERROR: [0] MatSetUpMultiply_MPIAIJ line 26
> /Users/jmousel/SOFT/petsc/src/mat/impls/aij/mpi/mmaij.c
> [0]PETSC ERROR: [0] MatAssemblyEnd_MPIAIJ line 665
> /Users/jmousel/SOFT/petsc/src/mat/impls/aij/mpi/mpiaij.c
> [0]PETSC ERROR: [0] MatAssemblyEnd line 4892
> /Users/jmousel/SOFT/petsc/src/mat/interface/matrix.c
>
> [95]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [95]PETSC ERROR: Signal received
> [95]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [95]PETSC ERROR: Petsc Development GIT revision: v3.5-35-g0af79b0  GIT
> Date: 2014-07-09 19:23:10 -0500
> [95]PETSC ERROR: /nfsscratch/Users/jmousel/BAV2/BAV_EXE on a gnu-debug
> named compute-1-17.local by jmousel Tue Jul 22 09:51:50 2014
> [95]PETSC ERROR: Configure options --with-cmake-dir=/opt/cmake/bin/cmake
> --download-metis --download-parmetis --download-fblaslapack --download-hdf5
> --download-mpich --download-hypre --download-ml
> [95]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 95
> [cli_95]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 95
>
>
> ===================================================================================
> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> =   PID 28216 RUNNING AT compute-3-73.local
> =   EXIT CODE: 59
> =   CLEANING UP REMAINING PROCESSES
> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener