<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Tue, Jul 22, 2014 at 10:09 AM, John Mousel <span dir="ltr"><<a href="mailto:john.mousel@gmail.com" target="_blank">john.mousel@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr">I'm getting a floating point error in VecSet. I see a similar error reported in a nightly log here:<br>
<br><a href="http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2014/06/18/examples_master_arch-opensolaris-cmplx-pkgs-dbg_n-gage.log" target="_blank">http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2014/06/18/examples_master_arch-opensolaris-cmplx-pkgs-dbg_n-gage.log</a><br>
>
> I've attached the error message below. Any suggestions?

rvector.c:581

    if (val > PETSC_MAX_REAL/x->map->N) {

For a 0-size vector, x->map->N is 0, so this overflow check divides by zero, which is the FPE the run below is trapping. It looks like someone has a moral objection to 0-size vectors. If no one fixes this before I get back next week, I will do it.
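For illustration, here is a minimal stand-alone C sketch of the guard I have in mind. This is not the actual rvector.c code: DBL_MAX stands in for PETSC_MAX_REAL, the parameter N for the global vector length x->map->N, and check_set_value is a made-up name.

    #include <float.h>
    #include <stdio.h>

    /* Stand-ins for the PETSc pieces involved: DBL_MAX for PETSC_MAX_REAL,
       N for the global vector length x->map->N; check_set_value is a
       hypothetical helper, not a PETSc function. */
    static int check_set_value(double val, long N)
    {
      /* The current code divides by N unconditionally, so N == 0 is a
         floating-point divide by zero; in a build that traps FP exceptions
         (as the debug run below apparently did) that raises SIGFPE.
         Testing N first makes a 0-size vector trivially valid. */
      if (N > 0 && val > DBL_MAX / (double)N) return -1; /* value too large */
      return 0;
    }

    int main(void)
    {
      printf("N=0:      %s\n", check_set_value(1.0, 0)     ? "rejected" : "ok");
      printf("N=10:     %s\n", check_set_value(1.0, 10)    ? "rejected" : "ok");
      printf("huge val: %s\n", check_set_value(DBL_MAX, 2) ? "rejected" : "ok");
      return 0;
    }

The same N > 0 test (or an early return for zero-length vectors) in VecSet would make the check safe without changing its behavior for nonempty vectors.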

  Thanks,

     Matt
> Thanks,
>
> John
>
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] VecSet line 568 /Users/jmousel/SOFT/petsc/src/vec/vec/interface/rvector.c
> [0]PETSC ERROR: [0] VecCreate_Seq line 34 /Users/jmousel/SOFT/petsc/src/vec/vec/impls/seq/bvec3.c
> [0]PETSC ERROR: [0] VecSetType line 38 /Users/jmousel/SOFT/petsc/src/vec/vec/interface/vecreg.c
> [0]PETSC ERROR: [0] VecCreateSeq line 36 /Users/jmousel/SOFT/petsc/src/vec/vec/impls/seq/vseqcr.c
> [0]PETSC ERROR: [0] MatSetUpMultiply_MPIAIJ line 26 /Users/jmousel/SOFT/petsc/src/mat/impls/aij/mpi/mmaij.c
> [0]PETSC ERROR: [0] MatAssemblyEnd_MPIAIJ line 665 /Users/jmousel/SOFT/petsc/src/mat/impls/aij/mpi/mpiaij.c
> [0]PETSC ERROR: [0] MatAssemblyEnd line 4892 /Users/jmousel/SOFT/petsc/src/mat/interface/matrix.c
>
> [95]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [95]PETSC ERROR: Signal received
> [95]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [95]PETSC ERROR: Petsc Development GIT revision: v3.5-35-g0af79b0  GIT Date: 2014-07-09 19:23:10 -0500
> [95]PETSC ERROR: /nfsscratch/Users/jmousel/BAV2/BAV_EXE on a gnu-debug named compute-1-17.local by jmousel Tue Jul 22 09:51:50 2014
> [95]PETSC ERROR: Configure options --with-cmake-dir=/opt/cmake/bin/cmake --download-metis --download-parmetis --download-fblaslapack --download-hdf5 --download-mpich --download-hypre --download-ml
> [95]PETSC ERROR: #1 User provided function() line 0 in unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 95
> [cli_95]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 95
>
> ===================================================================================
> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> =   PID 28216 RUNNING AT compute-3-73.local
> =   EXIT CODE: 59
> =   CLEANING UP REMAINING PROCESSES
> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener