[petsc-dev] petsc_infinity and vecset
Barry Smith
bsmith at mcs.anl.gov
Fri Jun 6 11:12:47 CDT 2014
On Jun 6, 2014, at 10:36 AM, Jason Sarich <sarich at mcs.anl.gov> wrote:
> There's a floating point exception showing up in the nightly builds, only for some bound-constrained TAO Fortran examples using arch-opensolaris-misc_n-gage
> > [0]PETSC ERROR: ------------------------------------------------------------------------
> > [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero
> > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> > [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
> > [0]PETSC ERROR: to get more information on the crash.
> > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [0]PETSC ERROR: Signal received
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-6010-g43e530f GIT Date: 2014-06-05 17:04:47 -0500
> > [0]PETSC ERROR: ./plate2f on a arch-opensolaris-pkgs-opt named n-gage by petsc Fri Jun 6 01:17:49 2014
> > [0]PETSC ERROR: Configure options --with-debugger=/bin/true --with-debugging=0 --download-mpich=1 --with-c2html=0 --download-cmake=1 --download-metis=1 --download-parmetis=1 --download-triangle=1 --download-superlu=1 --download-superlu_dist=1 --download-fblaslapack=1 --download-scalapack=1 --download-mumps=1 --download-parms=1 --download-sundials=1 --download-hypre=1 --download-suitesparse=1 --download-chaco=1 --download-spai=1 --with-no-output -PETSC_ARCH=arch-opensolaris-pkgs-opt -PETSC_DIR=/export/home/petsc/petsc.clone
> > [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
> > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> > [cli_0]: aborting job:
> > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> >
> > ===================================================================================
> > = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> > = EXIT CODE: 59
> > = CLEANING UP REMAINING PROCESSES
> > = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> > ===================================================================================
> /export/home/petsc/petsc.clone/src/tao/bound/examples/tutorials
> Possible problem with plate2f_1 stdout, diffs above
>
> In setting up the blmvm algorithm, I want to create a vector of infinity values using
> VecSet(XL,PETSC_INFINITY), where PETSC_INFINITY is defined in petscmath.h as PETSC_MAX_REAL/4.0, in this case 4.49423e+307.
>
> I think this should be fine, but VecSet automagically computes the norms of this vector, which involves the product XL->map->N * PETSC_INFINITY; that product overflows and causes the floating point exception.
>
> Is there a better way to prevent this than testing if (PetscAbsScalar(alpha) > PETSC_MAX_REAL/N) then NORM = PETSC_INFINITY?
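For reference, the overflow Jason describes can be reproduced outside PETSc: PETSC_INFINITY is PETSC_MAX_REAL/4.0, so multiplying it by any vector length greater than 4 exceeds the largest representable double, producing Inf, and a build that traps floating-point exceptions turns that into SIGFPE. A minimal plain-C illustration (N here is only a stand-in for XL->map->N):

    #include <stdio.h>
    #include <float.h>

    int main(void)
    {
      double petsc_infinity = DBL_MAX/4.0; /* same value as PETSC_INFINITY = PETSC_MAX_REAL/4.0 */
      double N              = 100.0;       /* stands in for the global length XL->map->N */

      /* the cached 1-norm is N*|alpha|; for alpha = PETSC_INFINITY this overflows to Inf,
         which a trapping FPU configuration reports as a floating point exception */
      double norm1 = N*petsc_infinity;
      printf("N*PETSC_INFINITY = %g\n", norm1); /* prints "inf" when FPE trapping is off */
      return 0;
    }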
Jason,
   If (PetscAbsScalar(alpha) > PETSC_MAX_REAL/N), I would simply skip setting the 1- and 2-norms and only set the max norm. Don't set the 1- and 2-norms to PETSC_INFINITY.
Barry
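A rough sketch of that guard inside VecSet(), assuming the norms are cached through PETSc's composed-data mechanism (PetscObjectComposedDataSetReal with the internal NormIds array); this is only an outline of the idea, not the actual rvector.c code:

    PetscReal val = PetscAbsScalar(alpha);
    PetscInt  N   = x->map->N;

    if (N > 0 && val > PETSC_MAX_REAL/(PetscReal)N) {
      /* N*val (1-norm) and sqrt(N)*val (2-norm) would overflow: cache only the
         max norm and let a later VecNorm() compute the others if requested */
      ierr = PetscObjectComposedDataSetReal((PetscObject)x,NormIds[NORM_INFINITY],val);CHKERRQ(ierr);
    } else {
      ierr = PetscObjectComposedDataSetReal((PetscObject)x,NormIds[NORM_1],N*val);CHKERRQ(ierr);
      ierr = PetscObjectComposedDataSetReal((PetscObject)x,NormIds[NORM_2],PetscSqrtReal((PetscReal)N)*val);CHKERRQ(ierr);
      ierr = PetscObjectComposedDataSetReal((PetscObject)x,NormIds[NORM_INFINITY],val);CHKERRQ(ierr);
    }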
>
> This error is reproducible with the following code.
> Again, this only seems to break Fortran programs with the arch-opensolaris-misc_n-gage build
>
> program inftest
>       implicit none
> #include "finclude/petscsys.h"
> #include "finclude/petscvec.h"
>
>       PetscErrorCode ierr
>       Vec            x
>
>       call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>
> !     VecSet caches the vector norms; with 100 entries of PETSC_INFINITY the
> !     cached 1-norm is 100*PETSC_INFINITY, which overflows and trips the FPE
>       call VecCreateSeq(MPI_COMM_SELF,100,x,ierr)
>       call VecSet(x,PETSC_INFINITY,ierr)
>
>       call VecDestroy(x,ierr)
>       call PetscFinalize(ierr)
>       end program
>