<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Tue, May 27, 2014 at 11:19 PM, Adrian Croucher <span dir="ltr"><<a href="mailto:a.croucher@auckland.ac.nz" target="_blank">a.croucher@auckland.ac.nz</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">hi<br>
<br>
The TS ex11 example (from 'next' branch) runs OK for me in serial, but crashes when I run it in parallel (for any number of processors > 1).<br>
<br>
Output from serial and parallel runs is below. Any clues as to what has gone wrong?<br></blockquote><div><br></div><div>Yes, I put an overaggressive check in. I will push the fix tonight.</div><div><br></div><div> Thanks,</div>
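For context, the check that trips here (DMCreateDefaultSF, frame #1 in the trace below) compares each point's constrained dofs against its dofs. In a global PetscSection, a point owned by another rank stores its dof count encoded as -(dof+1), which is why the message reports "0 constraints > -2 dof" only in parallel. Below is a minimal sketch of that kind of per-point guard; it is illustrative only, not the actual dm.c source, and the helper name CheckSectionConstraints and the dof >= 0 skip of remotely owned points are assumptions about what the fix looks like.

#include <petsc.h>

/* Illustrative sketch (assumed, not the actual PETSc source): a per-point
   sanity check that can misfire in parallel.  In a *global* PetscSection a
   point owned by another rank stores its dof count as -(dof+1), so an
   unconditional "constraints > dof" test trips on it.  Restricting the test
   to locally owned points (dof >= 0) avoids the spurious error. */
static PetscErrorCode CheckSectionConstraints(PetscSection section)
{
  PetscInt       pStart, pEnd, p;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = PetscSectionGetChart(section, &pStart, &pEnd);CHKERRQ(ierr);
  for (p = pStart; p < pEnd; ++p) {
    PetscInt dof, cdof;

    ierr = PetscSectionGetDof(section, p, &dof);CHKERRQ(ierr);            /* negative for remotely owned points */
    ierr = PetscSectionGetConstraintDof(section, p, &cdof);CHKERRQ(ierr); /* constrained dofs on this point */
    if (dof >= 0 && cdof > dof) SETERRQ3(PETSC_COMM_SELF, PETSC_ERR_ARG_OUTOFRANGE, "Point %d has %d constraints > %d dof", (int)p, (int)cdof, (int)dof);
  }
  PetscFunctionReturn(0);
}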

  Thanks,

     Matt

> Cheers, Adrian
>
> ---------------------------------------------------------------------------
>
> 1) Serial:
>
> acro018@des108:~/software/PETSc/code/src/ts/examples/tutorials$ mpirun -np 1 ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/sevenside-quad.exo -advect_sol_type BUMP
> 0 time 0 |x| 0.2425
> 1 time 0.3311 |x| 0.1401
> 2 time 0.6623 |x| 0.07485
> 3 time 0.9934 |x| 0.0455
> 4 time 1.325 |x| 0.0316
> 5 time 1.656 |x| 0.02646
> 6 time 1.987 |x| 0.02069
> 7 time 2.318 |x| 0.01547
> CONVERGED_TIME at time 2.31791 after 7 steps
>
> 2) Same thing in parallel:
>
> acro018@des108:~/software/PETSc/code/src/ts/examples/tutorials$ mpirun -np 2 ex11 -f ${PETSC_DIR}/share/petsc/datafiles/meshes/sevenside-quad.exo -advect_sol_type BUMP
> 0 time 0 |x| 0.2425
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Point 0 has 0 constraints > -2 dof
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-5803-gabde81b GIT Date: 2014-05-26 14:52:58 -0500
> [0]PETSC ERROR: ex11 on a linux-gnu-c-opt named des108 by acro018 Wed May 28 16:14:26 2014
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Argument out of range
> [1]PETSC ERROR: Point 3 has 0 constraints > -2 dof
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: Petsc Development GIT revision: v3.4.4-5803-gabde81b GIT Date: 2014-05-26 14:52:58 -0500
> [1]PETSC ERROR: ex11 on a linux-gnu-c-opt named des108 by acro018 Wed May 28 16:14:26 2014
> [1]PETSC ERROR: Configure options --download-netcdf --download-exodusii --with-hdf5-dir=/usr/lib --with-numpy-dir=/usr/lib/pymodules/python2.7/numpy --with-scientificpython-dir=/usr/lib/python2.7/dist-packages/scipy --with-petsc4py-dir=/usr/local/lib/python2.7/dist-packages/petsc4py --download-triangle --with-fortran-interfaces=1 --download-ptscotch --download-chaco --download-ctetgen
> [1]PETSC ERROR: #1 DMCreateDefaultSF() line 3066 in /home/acro018/software/PETSc/code/src/dm/interface/dm.c
> [1]PETSC ERROR: #2 DMGetDefaultSF() line 2986 in /home/acro018/software/PETSc/code/src/dm/interface/dm.c
> [1]PETSC ERROR: #3 DMGlobalToLocalBegin() line 1646 in /home/acro018/software/PETSc/code/src/dm/interface/dm.c
> [1]PETSC ERROR: #4 VecView_Plex() line 118 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plex.c
> [1]PETSC ERROR: #5 VecView() line 601 in /home/acro018/software/PETSc/code/src/vec/vec/interface/vector.c
> [1]PETSC ERROR: #6 MonitorVTK() line 1395 in /home/acro018/software/PETSc/code/src/ts/examples/tutorials/ex11.c
> [1]PETSC ERROR: [0]PETSC ERROR: Configure options --download-netcdf --download-exodusii --with-hdf5-dir=/usr/lib --with-numpy-dir=/usr/lib/pymodules/python2.7/numpy --with-scientificpython-dir=/usr/lib/python2.7/dist-packages/scipy --with-petsc4py-dir=/usr/local/lib/python2.7/dist-packages/petsc4py --download-triangle --with-fortran-interfaces=1 --download-ptscotch --download-chaco --download-ctetgen
> [0]PETSC ERROR: #1 DMCreateDefaultSF() line 3066 in /home/acro018/software/PETSc/code/src/dm/interface/dm.c
> [0]PETSC ERROR: #2 DMGetDefaultSF() line 2986 in /home/acro018/software/PETSc/code/src/dm/interface/dm.c
> [0]PETSC ERROR: #7 TSMonitor() line 2810 in /home/acro018/software/PETSc/code/src/ts/interface/ts.c
> [1]PETSC ERROR: #8 TSSolve() line 2752 in /home/acro018/software/PETSc/code/src/ts/interface/ts.c
> [1]PETSC ERROR: #9 main() line 1541 in /home/acro018/software/PETSc/code/src/ts/examples/tutorials/ex11.c
> [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> #3 DMGlobalToLocalBegin() line 1646 in /home/acro018/software/PETSc/code/src/dm/interface/dm.c
> [0]PETSC ERROR: #4 VecView_Plex() line 118 in /home/acro018/software/PETSc/code/src/dm/impls/plex/plex.c
> [0]PETSC ERROR: #5 VecView() line 601 in /home/acro018/software/PETSc/code/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: #6 MonitorVTK() line 1395 in /home/acro018/software/PETSc/code/src/ts/examples/tutorials/ex11.c
> [0]PETSC ERROR: #7 TSMonitor() line 2810 in /home/acro018/software/PETSc/code/src/ts/interface/ts.c
> [0]PETSC ERROR: #8 TSSolve() line 2752 in /home/acro018/software/PETSc/code/src/ts/interface/ts.c
> [0]PETSC ERROR: #9 main() line 1541 in /home/acro018/software/PETSc/code/src/ts/examples/tutorials/ex11.c
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
> with errorcode 63.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> mpirun has exited due to process rank 1 with PID 606 on
> node des108 exiting without calling "finalize". This may
> have caused other processes in the application to be
> terminated by signals sent by mpirun (as reported here).
> --------------------------------------------------------------------------
> [des108:00604] 1 more process has sent help message help-mpi-api.txt / mpi-abort
> [des108:00604] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
>
> --
> Dr Adrian Croucher
> Senior Research Fellow
> Department of Engineering Science
> University of Auckland, New Zealand
> email: a.croucher@auckland.ac.nz
> tel: +64 (0)9 923 4611

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener