The message you're seeing is the result of a segmentation fault in one of your processes (the exit code 11 here corresponds to signal 11, SIGSEGV). This is usually caused by a bug in the application. The best way to diagnose it is to rerun the application with core files enabled, then open the core file in a debugger to see where the segmentation fault occurred. E.g.,

    ulimit -c unlimited
    mpiexec ...

Then look for a file called core.XXX (where XXX is the pid of the failed process) and open it in a debugger, e.g.:

    gdb executable core.XXX

In gdb, give the command

    bt

to see the backtrace and find where the error occurred.

If you're running this on a Mac, the core file will be located in /cores, and if there are multiple core files in there already, you can find the one you're looking for by its creation time.
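If the backtrace shows only raw addresses, it usually helps to rebuild the application with debugging symbols before reproducing the crash. A minimal sketch, assuming a C++ code built with mpicxx; yoo.cpp is a hypothetical source file name, while ./machine.list and ./yoo.out are taken from your odengmorun script:

    # -g keeps debug symbols so gdb can show file and line numbers;
    # -O0 avoids optimized-out variables and frames in the backtrace
    mpicxx -g -O0 yoo.cpp -o yoo.out
    ulimit -c unlimited
    mpiexec -f ./machine.list -n 8 ./yoo.out
    # after the crash, on the node that produced the core file:
    gdb ./yoo.out core.XXX

With symbols present, the bt command in gdb will show the source file and line of the failing frame instead of bare addresses.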
-d

On May 18, 2012, at 10:02 PM, 유경완 wrote:

> Hi, thanks for reading this mail.
>
> First of all, I really appreciate the mpich2 program, because I have used it very productively for clustering. Really, thanks for it.
>
> But I have a small problem using it, so I want to ask something. Sorry to bother you.
>
> The problem is that when I upgraded the cluster computers, I also upgraded mpich2 from version 1.2 to 1.4.1p1. The installation finished and mpiexec worked well with 1.4.1p1.
>
> Then I tested compiling with mpicxx, and it seemed to work with no errors.
>
> But when I ran mpiexec with the freshly compiled files, errors like this appeared:
>
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
> [root@octofous2 yookw]# ./odengmorun 8
> =====================================================================================
> = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> = EXIT CODE: 11
> = CLEANING UP REMAINING PROCESSES
> = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> =====================================================================================
> [proxy:0:1@n002] HYD_pmcd_pmip_control_cmd_cb (/home/octofous2/libraries/mpich2-1.4.1p1/src/pm/hydra/pm/pmiserv/pmip_cb.c:928): assert (!closed) failed
> [proxy:0:1@n002] HYDT_dmxu_poll_wait_for_event (/home/octofous2/libraries/mpich2-1.4.1p1/src/pm/hydra/tools/demux/demux_poll.c:77): callback returned error status
> [proxy:0:1@n002] main (/home/octofous2/libraries/mpich2-1.4.1p1/src/pm/hydra/pm/pmiserv/pmip.c:226): demux engine error waiting for event
> [mpiexec@octofous2.psl] HYDT_bscu_wait_for_completion (/home/octofous2/libraries/mpich2-1.4.1p1/src/pm/hydra/tools/bootstrap/utils/bscu_wait.c:70): one of the processes terminated badly; aborting
> [mpiexec@octofous2.psl] HYDT_bsci_wait_for_completion (/home/octofous2/libraries/mpich2-1.4.1p1/src/pm/hydra/tools/bootstrap/src/bsci_wait.c:23): launcher returned error waiting for completion
> [mpiexec@octofous2.psl] HYD_pmci_wait_for_completion (/home/octofous2/libraries/mpich2-1.4.1p1/src/pm/hydra/pm/pmiserv/pmiserv_pmci.c:191): launcher returned error waiting for completion
> [mpiexec@octofous2.psl] main (/home/octofous2/libraries/mpich2-1.4.1p1/src/pm/hydra/ui/mpich/mpiexec.c:405): process manager error waiting for completion
> 8 cpus
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
> where odengmorun is:
>
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
> #!/bin/bash
>
> mpiexec -f ./machine.list -n $1 ./yoo.out
> echo $1 cpus
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
> This is weird to me, because when I compiled the same code on the old computers, which had mpich2 version 1.2, mpiexec worked.
>
> So the only changes are the version and maybe some configure options. (Sorry, but the cluster administrator changed and I took over, so I don't know the old computers' configure options. This time the new computer was configured with
> --with-pm=hydra:gforker:smpd --enable-fast=O3 -prefix=/home/octofous2/mpich2-install )
>
> Sorry to ask like this, but can you tell me of any change in compiling between versions 1.2 and 1.4.1p1 that could be a clue to this problem?
>
> Thanks for reading.
>
> Best regards
>
> _______________________________________________
> mpich-discuss mailing list    mpich-discuss@mcs.anl.gov
> To manage subscription options or unsubscribe:
> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss