<html><head><meta http-equiv="Content-Type" content="text/html; charset=us-ascii"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class="">Thanks for identifying this, Mark. <div class=""><br class=""></div><div class="">If I compile the debug version of PETSc, will it also build a debug version of MUMPS? <br class=""><div><br class=""><blockquote type="cite" class=""><div class="">On Apr 8, 2019, at 12:58 PM, Mark Adams <<a href="mailto:mfadams@lbl.gov" class="">mfadams@lbl.gov</a>> wrote:</div><br class="Apple-interchange-newline"><div class=""><div dir="ltr" class="">This looks like an error in MUMPS:<br class=""><div class=""><br class=""></div><div class=""><pre style="white-space: pre-wrap;" class=""> IF ( IROW_GRID .NE. root%MYROW .OR.
& JCOL_GRID .NE. root%MYCOL ) THEN
WRITE(*,*) MYID,':INTERNAL Error: recvd root arrowhead '</pre></div></div><br class=""><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Apr 8, 2019 at 1:37 PM Smith, Barry F. via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" class="">petsc-users@mcs.anl.gov</a>> wrote:<br class=""></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"> Difficult to tell what is going on. <br class="">
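On the debug-build question above: when MUMPS is installed through PETSc's --download-mumps option, re-running configure with debugging enabled should rebuild MUMPS (and its ScaLAPACK dependency) with debug flags as well. A minimal sketch, assuming a --download-mumps install (the PETSC_ARCH name is illustrative):<br class="">
<pre style="white-space: pre-wrap;" class=""># Sketch: reconfigure PETSc with debugging enabled; the --download-*
# packages are rebuilt from source under the new debug PETSC_ARCH.
./configure PETSC_ARCH=arch-linux-debug --with-debugging=1 \
    --download-mumps --download-scalapack
make PETSC_ARCH=arch-linux-debug all</pre>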
<br class="">
The message "User provided function() line 0 in unknown file" indicates the crash took place OUTSIDE of PETSc code, and the error message "INTERNAL Error: recvd root arrowhead" is definitely not coming from PETSc. <br class="">
<br class="">
Yes, debug with the debug version and also try valgrind.<br class="">
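A typical valgrind invocation under Open MPI, following the PETSc FAQ, is sketched below (the rank count and executable name here are just placeholders for this run):<br class="">
<pre style="white-space: pre-wrap;" class=""># Sketch: run every MPI rank under valgrind's memcheck tool.
# -malloc off disables PETSc's own malloc wrapper, which otherwise
# produces false positives; -q keeps valgrind's output brief.
mpiexec -n 8 valgrind --tool=memcheck -q --num-callers=20 \
    ./structural_example_5 -malloc off</pre>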
<br class="">
Barry<br class="">
<br class="">
<br class="">
> On Apr 8, 2019, at 12:12 PM, Manav Bhatia via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank" class="">petsc-users@mcs.anl.gov</a>> wrote:<br class="">
> <br class="">
> <br class="">
> Hi,<br class="">
> <br class="">
> I am running a nonlinear simulation with mesh refinement on libMesh. The code runs without issues on a Mac (it can run for days), but crashes on Linux (CentOS 6). On Linux I am using PETSc 3.11 with OpenMPI 3.1.3 and GCC 8.2. <br class="">
> <br class="">
> I tried the -on_error_attach_debugger option, but it only gave me the message below. Does this message mean something to more experienced eyes? <br class="">
> <br class="">
> I am going to build a debug version of PETSc to figure out what is going wrong. I will get and share more detailed logs in a bit. <br class="">
> <br class="">
> Regards,<br class="">
> Manav<br class="">
> <br class="">
> [8]PETSC ERROR: ------------------------------------------------------------------------<br class="">
> [8]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range<br class="">
> [8]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br class="">
> [8]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" rel="noreferrer" target="_blank" class="">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br class="">
> [8]PETSC ERROR: or try <a href="http://valgrind.org/" rel="noreferrer" target="_blank" class="">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors<br class="">
> [8]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run <br class="">
> [8]PETSC ERROR: to get more information on the crash.<br class="">
> [8]PETSC ERROR: User provided function() line 0 in unknown file <br class="">
> PETSC: Attaching gdb to /cavs/projects/brg_codes/users/bhatia/mast/mast_topology/opt/examples/structural/example_5/structural_example_5 of pid 2108 on display localhost:10.0 on machine <a href="http://warhawk1.hpc.msstate.edu/" rel="noreferrer" target="_blank" class="">Warhawk1.HPC.MsState.Edu</a><br class="">
> PETSC: Attaching gdb to /cavs/projects/brg_codes/users/bhatia/mast/mast_topology/opt/examples/structural/example_5/structural_example_5 of pid 2112 on display localhost:10.0 on machine <a href="http://warhawk1.hpc.msstate.edu/" rel="noreferrer" target="_blank" class="">Warhawk1.HPC.MsState.Edu</a><br class="">
> 0 :INTERNAL Error: recvd root arrowhead <br class="">
> 0 :not belonging to me. IARR,JARR= 67525 67525<br class="">
> 0 :IROW_GRID,JCOL_GRID= 0 4<br class="">
> 0 :MYROW, MYCOL= 0 0<br class="">
> 0 :IPOSROOT,JPOSROOT= 92264688 92264688<br class="">
> --------------------------------------------------------------------------<br class="">
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD<br class="">
> with errorcode -99.<br class="">
> <br class="">
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.<br class="">
> You may or may not see output from other processes, depending on<br class="">
> exactly when Open MPI kills them.<br class="">
> --------------------------------------------------------------------------<br class="">
> <br class="">
<br class="">
</blockquote></div>
</div></blockquote></div><br class=""></div></body></html>