mpiexec

Satish Balay balay at mcs.anl.gov
Fri Aug 17 10:31:58 CDT 2007


Ideally PETSc should be configured to use the mpicc/mpif90 wrappers, and
then 'make ex1' is the right thing to do. This can be done by specifying
--with-mpi-dir [which is equivalent to --with-cc=MPIDIR/bin/mpicc
--with-fc=MPIDIR/bin/mpif90].
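
For example, a rough sketch of that workflow (the MPICH install path is a
placeholder, and the configure invocation and example directory may differ
between PETSc versions):

  ./config/configure.py --with-mpi-dir=/path/to/mpich
  make all
  cd src/ksp/ksp/examples/tutorials
  make ex1
  mpiexec -n 2 ./ex1

This way ex1 is compiled and linked against the same MPI installation that
provides mpiexec, so the launcher and the executable agree.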

To debug the current problem, you'll have to send us the relevant
logs [configure.log make_log test_log etc..] at
petsc-maint at mcs.anl.gov
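
[Roughly, and depending on the PETSc version, these logs come from the
corresponding build steps; the exact file names and locations may carry a
PETSC_ARCH suffix, so check your PETSc tree:

  ./config/configure.py ...   # configure.log
  make all                    # make_log*
  make test                   # test_log*
]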

Satish


On Fri, 17 Aug 2007, li pan wrote:

> dear all,
> I have a question about MPI. If I want to run the code
> with mpiexec, shall I compile it with mpicc first? 
> For example, I compiled ksp/examples/tutorials/ex1.c
> with just 'make ex1'
> and got the following error when running the code:
> mpiexec -n 2 ./ex1
> [cli_0]: aborting job:
> Fatal error in MPI_Bcast: Other MPI error, error
> stack:
> MPI_Bcast(791)............................:
> MPI_Bcast(buf=0xbfffc84c, count=1, MPI_INT, root=0,
> MPI_COMM_WORLD) failed
> MPIR_Bcast(220)...........................:
> MPIC_Send(48).............................:
> MPIC_Wait(321)............................:
> MPIDI_CH3_Progress_wait(199)..............: an error
> occurred while handling an event returned by
> MPIDU_Sock_Wait()
> MPIDI_CH3I_Progress_handle_sock_event(944): [ch3:sock]
> failed to connnect to remote process
> kvs_e0403_pc13_33149_43_0:1
> MPIDU_Socki_handle_connect(806)...........: connection
> failure (set=0,sock=1,errno=111:(strerror() not
> found))
> rank 0 in job 44  e0403-pc13_33149   caused collective
> abort of all ranks
>   exit status of rank 0: return code 13
> 
> My mpdtrace showed all the hostnames I have.
> 
> best regards
> 
> pan
> 



