[petsc-users] Help with mpiexec

George Mathew gmnnus at gmail.com
Sat Oct 13 16:00:27 CDT 2012


On Sat, Oct 13, 2012 at 4:55 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:

> On Sat, Oct 13, 2012 at 3:53 PM, George Mathew <gmnnus at gmail.com> wrote:
>
>> On Sat, Oct 13, 2012 at 3:19 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>
>> > On Sat, Oct 13, 2012 at 3:17 PM, George Mathew <gmnnus at gmail.com> wrote:
>> >
>> >> I have petsc 3.3 and mpich2 version 1.4.1p1 installed.
>> >> When I run the example ex2.c code in the petsc user manual, it runs fine
>> >> on one machine. But if I choose to run on multiple nodes, it gives an
>> >> error.
>> >> The example is the file ${PETSC}/src/ksp/ksp/examples/tutorials/ex2.c.
>> >> I compiled it, and when run using the following command, it works:
>> >> mpiexec -f machinefile -n 1 ./ex2
>> >> But if I run it using the command
>> >> mpiexec -f machinefile -n 2 ./ex2
>> >> I get the following error.
>> >
>> > It is very likely that you are running the code with the wrong mpiexec
>> > (from another MPI installation).
>> > I cannot tell which MPI you used without configure.log.
>> > Can you also try
>>
>> > 1. Run multiple processes on your local machine
>>
>> > 2. Run with -skip_petscrc
>>
>> I have only one MPI installation. It is under /usr/local/bin. The following command works:
>>
>> mpiexec -n 4 ./ex2
>>
>> So, locally, multiple processes must be running fine.
>>
>> How do I run with -skip_petscrc?
>>
>>
> mpiexec -n 4 -f machinefile ./ex2 -skip_petscrc
>
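For reference, with MPICH2's Hydra launcher the machinefile passed to -f is a
plain text file listing one hostname per line, optionally followed by :N to
place N processes on that host. A minimal sketch, with placeholder hostnames:

    node1
    node2:2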

Here is the message when run with -skip_petscrc:

$ mpiexec -f machinefile -n 4 ./ex2 -skip_petscrc
[0]PETSC ERROR: PetscWorldIsSingleHost() line 99 in src/sys/utils/pdisplay.c
[0]PETSC ERROR: PetscSetDisplay() line 125 in src/sys/utils/pdisplay.c
[0]PETSC ERROR: PetscOptionsCheckInitial_Private() line 319 in src/sys/objects/init.c
[0]PETSC ERROR: PetscInitialize() line 761 in src/sys/objects/pinit.c
PETSC ERROR: Logging has not been enabled.
You might have forgotten to call PetscInitialize().
application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
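
Since the traceback shows the failure happening inside PetscInitialize()
itself, at PetscWorldIsSingleHost() in pdisplay.c, one way to separate a PETSc
problem from a basic cross-node MPI problem is to run a plain MPI hello-world
over the same machinefile. The sketch below assumes the same MPICH2 install's
mpicc and mpiexec are used; the file name hello.c is only illustrative:

  #include <stdio.h>
  #include <mpi.h>

  int main(int argc, char **argv)
  {
    int  rank, size, len;
    char host[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this rank's id            */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of ranks     */
    MPI_Get_processor_name(host, &len);     /* hostname the rank runs on */
    printf("rank %d of %d on %s\n", rank, size, host);
    MPI_Finalize();
    return 0;
  }

Compile with "mpicc hello.c -o hello" and run with
"mpiexec -f machinefile -n 2 ./hello"; if that also fails across two nodes,
the problem is in the MPI/host setup rather than in PETSc.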