MPI Initialisation Problem

Tim Stitt timothy.stitt at ichec.ie
Sat Jun 16 12:32:40 CDT 2007


Hi all,

I am having some difficulty getting my parallel eigensolver to run over 
multiple processes.

When I execute my parallel code on my distributed-memory machine (with more 
than one process) I keep getting the following runtime messages:

mpiexec: Warning: task 0 exited before completing MPI startup.
mpiexec: Warning: task 1 exited oddly---report bug: status 0 done 0.

MPI_Comm_size reports only 1 process even though I launch with mpiexec -n 2 or higher.

The code still runs to completion, but serially.
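
To show exactly what I mean, here is a minimal sketch (not my actual
eigensolver, which is much larger) of the kind of check that keeps
reporting a size of 1:

  /* Minimal sketch: query the communicator size after PetscInitialize.
     With "mpiexec -n 2" I would expect size == 2, but I only ever see 1. */
  #include <petsc.h>

  int main(int argc, char **argv)
  {
    int size, rank;

    PetscInitialize(&argc, &argv, NULL, NULL);
    MPI_Comm_size(PETSC_COMM_WORLD, &size);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    PetscPrintf(PETSC_COMM_SELF, "rank %d of %d\n", rank, size);

    PetscFinalize();
    return 0;
  }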

I have taken the same code and run it on my shared-memory machine with no 
problems: all processes get picked up, so I know it is not a coding 
problem. The sample KSP test codes that come with my PETSc (2.3.3) 
distribution also exhibit this problem.

So there seems to be a problem with the PetscInitialize routine on this 
particular architecture. I have tried both the PathScale and PGI compilers with 
the same result, and as far as I can see no noticeable warnings are 
generated during the configure and make phases. Others use MPI regularly on our 
cluster, so I don't see how it could be an MPI library issue.
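
For what it is worth, a plain-MPI check along these lines (no PETSc at all)
should confirm whether mpiexec and the library agree on the task count
outside of PetscInitialize:

  /* Plain-MPI sketch: if this also reports size == 1 under "mpiexec -n 2",
     the problem is in the MPI launch rather than in PETSc. */
  #include <stdio.h>
  #include <mpi.h>

  int main(int argc, char **argv)
  {
    int size, rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    printf("rank %d of %d\n", rank, size);

    MPI_Finalize();
    return 0;
  }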

Any thoughts on what could be happening gratefully received.

Regards,

Tim.

-- 
Dr. Timothy Stitt <timothy_dot_stitt_at_ichec.ie>
HPC Application Consultant - ICHEC (www.ichec.ie)

Dublin Institute for Advanced Studies
5 Merrion Square - Dublin 2 - Ireland

+353-1-6621333 (tel) / +353-1-6621477 (fax) / +353-874195427 (mobile)
