<div dir="ltr">There's a floating point exception error showing up in the nightly builds, only for some bounded TAO Fortran examples using <a href="http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2014/06/06/examples_next_arch-opensolaris-pkgs-opt_n-gage.log">arch-opensolaris-misc_n-gage</a> <div>
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">> [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero<br>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>> [0]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC</a> ERROR: or try <a href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors<br>
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run <br>> [0]PETSC ERROR: to get more information on the crash.<br>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
> [0]PETSC ERROR: Signal received<br>> [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-6010-g43e530f GIT Date: 2014-06-05 17:04:47 -0500<br>
> [0]PETSC ERROR: ./plate2f on a arch-opensolaris-pkgs-opt named n-gage by petsc Fri Jun 6 01:17:49 2014<br>> [0]PETSC ERROR: Configure options --with-debugger=/bin/true --with-debugging=0 --download-mpich=1 --with-c2html=0 --download-cmake=1 --download-metis=1 --download-parmetis=1 --download-triangle=1 --download-superlu=1 --download-superlu_dist=1 --download-fblaslapack=1 --download-scalapack=1 --download-mumps=1 --download-parms=1 --download-sundials=1 --download-hypre=1 --download-suitesparse=1 --download-chaco=1 --download-spai=1 --with-no-output -PETSC_ARCH=arch-opensolaris-pkgs-opt -PETSC_DIR=/export/home/petsc/petsc.clone<br>
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file<br>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0<br>> [cli_0]: aborting job:<br>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0<br>
> <br>> ===================================================================================<br>> = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES<br>> = EXIT CODE: 59<br>> = CLEANING UP REMAINING PROCESSES<br>
> = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES<br>> ===================================================================================<br>/export/home/petsc/petsc.clone/src/tao/bound/examples/tutorials<br>Possible problem with plate2f_1 stdout, diffs above </blockquote>
In setting up the blmvm algorithm, I want to create a vector of infinity values using VecSet(XL,PETSC_INFINITY), where PETSC_INFINITY is defined in petscmath.h as PETSC_MAX_REAL/4.0, in this case 4.49423e+307.
I think this should be fine, but VecSet automagically computes the norms of this vector, which involves the product XL->map->N * PETSC_INFINITY; that product overflows and causes the floating point exception.
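For what it's worth, the overflow is easy to see outside of PETSc. The following plain-C stand-in (DBL_MAX/4.0 playing the role of PETSC_INFINITY, N = 100 as in the reproduction below) prints Inf, and on a build that traps floating-point overflow the multiplication would raise SIGFPE instead:

/* Plain-C stand-in for the overflow described above (no PETSc required).
   DBL_MAX/4.0 mimics PETSC_INFINITY = PETSC_MAX_REAL/4.0 ~ 4.49423e+307. */
#include <stdio.h>
#include <float.h>

int main(void)
{
  double petsc_infinity = DBL_MAX / 4.0; /* stand-in for PETSC_INFINITY     */
  int    N              = 100;           /* vector length in the reproducer */

  /* This is the kind of product the cached 1-norm computation performs;
     it overflows to Inf, or traps as SIGFPE where overflow is trapped.     */
  double norm1 = (double)N * petsc_infinity;

  printf("N * PETSC_INFINITY = %g\n", norm1);
  return 0;
}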
Is there a better way to prevent this than testing if (PetscAbsScalar(alpha) > PETSC_MAX_REAL/N) then NORM = PETSC_INFINITY? (A sketch of that guard is below, after the reproduction.)

This error is reproducible with the following code.
Again, this only seems to break Fortran programs with arch-opensolaris-misc_n-gage:

      program inftest
      implicit none
#include "finclude/petscsys.h"
#include "finclude/petscvec.h"

      PetscErrorCode ierr
      Vec x

      call PetscInitialize(PETSC_NULL_CHARACTER,ierr)

      call VecCreateSeq(MPI_COMM_SELF,100,x,ierr)
      call VecSet(x,PETSC_INFINITY,ierr)

      call PetscFinalize(ierr)
      end program
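For concreteness, here is roughly the guard I have in mind, applied to the cached 1-norm of a vector filled with a constant alpha. ConstantVecNorm1 is a made-up helper name for illustration, not existing PETSc code:

/* Hypothetical sketch of the guard suggested above; ConstantVecNorm1 is a
   made-up helper name, not part of PETSc.                                  */
#include <petscsys.h>

PetscReal ConstantVecNorm1(PetscScalar alpha, PetscInt N)
{
  PetscReal a = PetscAbsScalar(alpha);

  /* If N*|alpha| would exceed PETSC_MAX_REAL, report the norm as infinite
     instead of performing the overflowing multiplication.                  */
  if (N > 0 && a > PETSC_MAX_REAL/(PetscReal)N) return PETSC_INFINITY;
  return (PetscReal)N*a;
}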