You can try running some of the MPI samples that come with OpenMPI first, to make sure OpenMPI itself is working all right.
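For example, a minimal rank-check program along these lines (my own sketch, not one of the OpenMPI samples) makes the symptom easy to diagnose: if every process reports "rank 0 of 1", the mpirun you are launching with does not match the MPI library the program was compiled against.

/* rank_check.c - minimal MPI sanity check (illustrative sketch).
 * Build:  mpicc rank_check.c -o rank_check
 * Run:    mpirun -np 4 ./rank_check
 * Expect one line per rank (0..3). If every process prints
 * "rank 0 of 1", the mpirun does not correspond to the MPI
 * library used at compile time. */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, len;
    char name[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's rank */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of ranks */
    MPI_Get_processor_name(name, &len);     /* host the rank runs on */
    printf("rank %d of %d on %s\n", rank, size, name);
    MPI_Finalize();
    return 0;
}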
Thanks,

Xin Qian

On Wed, Jul 8, 2009 at 4:48 PM, Yin Feng <yfeng1@tigers.lsu.edu> wrote:

I tried an OpenMPI build of PETSc and used the mpirun provided by OpenMPI.
But when I checked the load on each node, I found that the master node
takes all the load while the others sit idle.

Do you have any idea about this situation?

Thanks in advance!

Sincerely,
YIN
On Wed, Jul 8, 2009 at 1:26 PM, Satish Balay <balay@mcs.anl.gov> wrote:
> Perhaps you are using the wrong mpiexec or mpirun. You'll have to use
> the corresponding mpiexec from the MPI you've used to build PETSc.
>
> Or, if the MPI has special instructions on usage, you should follow
> those [for example, some clusters require extra options to mpiexec].
>
> Satish
>
> On Wed, 8 Jul 2009, Yin Feng wrote:
>
>> I am a beginner with PETSc.
>> I tried PETSc example 5 (ex5) with 4 nodes.
>> However, it seems every node does exactly the same thing and outputs
>> the same results again and again. Is this a problem with PETSc or
>> with the MPI installation?
>>
>> Thank you in advance!
>>
>> Sincerely,
>> YIN
>>
>
--
QIAN, Xin (http://pubpages.unh.edu/~xqian/)
xqian@unh.edu  chianshin@gmail.com