Can't read MPIRUN_HOST

Satish Balay balay at mcs.anl.gov
Tue May 22 00:11:11 CDT 2007


which ex1f is this?

If it's src/sys/examples/tests/ex1f.F - then it tests PETSc error codes
[i.e. the code deliberately generates an error, so the traceback it prints is the expected output]
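
For illustration, here is a minimal C sketch of a program that deliberately
raises a PETSc error (this is not the actual Fortran source of ex1f.F; the
error code and message are made up, and the macro spellings follow the
current PETSc documentation rather than the release discussed in this
thread, which used CHKERRQ(ierr)-style checks):

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      /* Deliberately raise an error: PETSc prints a traceback and
         aborts, which is exactly the output such a test expects. */
      SETERRQ(PETSC_COMM_SELF, PETSC_ERR_USER, "deliberate test error");
      PetscCall(PetscFinalize());  /* never reached */
      return 0;
    }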

Satish

On Tue, 22 May 2007, Ben Tay wrote:

> It's ex1f. Thanks
> 
> On 5/22/07, Satish Balay <balay at mcs.anl.gov> wrote:
> > 
> > Is this a PETSc example? Which one is it?
> > 
> > Satish
> > 
> > On Tue, 22 May 2007, Ben Tay wrote:
> > 
> > > OK, I sorted out that problem. Now I get this error:
> > >
> > >
> > > *** glibc detected *** double free or corruption (!prev): 0x0000000000509660 ***
> > > /usr/lsf62/bin/mvapich_wrapper: line 388: 28571 Aborted (core dumped) $PJL $PJL_OPTS $REMOTE_ENV_VAR $JOB_CMD
> > > Job  /usr/lsf62/bin/mvapich_wrapper ./ex1f
> > >
> > > TID   HOST_NAME  COMMAND_LINE  STATUS              TERMINATION_TIME
> > > ===== ========== ============  ==================  ===================
> > > 00001 atlas3-c63               Undefined
> > > 00002 atlas3-c63               Undefined
> > > 00002 atlas3-c61 ./ex1f        Signaled (SIGKILL)  05/22/2007 12:23:58
> > > 00003 atlas3-c61               Undefined
> > >
> > >
> > >
> > > Thanks
> > >
> > >
> > >
> > > On 5/22/07, Satish Balay <balay at mcs.anl.gov> wrote:
> > > >
> > > > I don't think this has anything to do with PETSc. You might want to
> > > > check the docs of your MPI library or the batch system you are using.
> > > >
> > > > Satish
> > > >
> > > >
> > > > On Tue, 22 May 2007, Ben Tay wrote:
> > > >
> > > > > Hi,
> > > > >
> > > > > I tried to compile PETSc and there was no problem. I also used the
> > > > > --with-batch=1 option since my server uses a job scheduler. There is
> > > > > no shared library and I'm installing hypre too (a sketch of such a
> > > > > configure line appears after the quoted thread below).
> > > > >
> > > > > However, after submitting the job for ex1f, I got the error message:
> > > > >
> > > > > Can't read MPIRUN_HOST.
> > > > >
> > > > > So what's the problem?
> > > > >
> > > > > thanks
> > > > >
> > > >
> > > >
> > >
> > 
> > 
> 
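
For reference, a sketch of the kind of configure line described in the
quoted message above. The option spellings follow the current PETSc
documentation (older releases used ./config/configure.py and spelled the
shared-library switch --with-shared=0), and compiler/MPI settings are
omitted since they depend on the local installation:

    ./configure --with-batch=1 --with-shared-libraries=0 --download-hypre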



