OK, I sorted out that problem. Now I'm getting this error:
*** glibc detected *** double free or corruption (!prev): 0x0000000000509660 ***
/usr/lsf62/bin/mvapich_wrapper: line 388: 28571 Aborted (core dumped) $PJL $PJL_OPTS $REMOTE_ENV_VAR $JOB_CMD
Job /usr/lsf62/bin/mvapich_wrapper ./ex1f

TID    HOST_NAME   COMMAND_LINE   STATUS               TERMINATION_TIME
=====  ==========  =============  ===================  ===================
00001  atlas3-c63                 Undefined
00002  atlas3-c63                 Undefined
00002  atlas3-c61  ./ex1f         Signaled (SIGKILL)   05/22/2007 12:23:58
00003  atlas3-c61                 Undefined
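As far as I understand, glibc prints that abort when its heap bookkeeping is corrupted, most often because the same pointer is freed twice or a buffer is overrun before it is freed. Just to illustrate the class of error (this is not taken from ex1f, only a minimal C sketch):

    #include <stdlib.h>

    int main(void)
    {
        double *buf = malloc(100 * sizeof(*buf));
        free(buf);
        /* freeing the same pointer a second time makes glibc abort with
           "double free or corruption" */
        free(buf);
        return 0;
    }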

Thanks
On 5/22/07, Satish Balay <balay@mcs.anl.gov> wrote:
> I don't think this has anything to do with PETSc. You might want to
> check the docs of the MPI library or the batch system you are using.
>
> Satish
>
> On Tue, 22 May 2007, Ben Tay wrote:
>
> > Hi,
> >
> > I tried to compile PETSc and there was no problem. I also used the
> > --with-batch=1 option, since my server uses a job scheduler. There is no
> > shared library, and I'm installing hypre too.
> >
> > However, after submitting the job for ex1f, I got the error message:
> >
> >     Can't read MPIRUN_HOST.
> >
> > So what's the problem?
> >
> > thanks