[petsc-users] Issue when calling multiple processors from PETSc
Matthew Knepley
knepley at gmail.com
Thu Dec 24 10:37:43 CST 2015
On Thu, Dec 24, 2015 at 10:16 AM, Ajit Desai <ajit.ndesai at gmail.com> wrote:
> Hello everyone,
>
> I am a new user of PETSc, trying to get familiar with it for parallel
> computing applications.
> I wrote a simple hello world code using the PETSc environment and found an
> issue with the output. For instance:
>
> When I compile and execute the "PETSc_helloWorld.F90" code with multiple
> processors, it prints the following output.
>
Almost certainly you are using an 'mpiexec' that does not match the MPI
library that you built PETSc with. Each of the four processes then starts as
its own MPI singleton with an MPI_COMM_WORLD of size 1, which is exactly the
output you see. Did you use --download-mpich? Try this
cd $PETSC_DIR
make check
That will test it with the correct mpiexec.
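If you configured with --download-mpich, the matching launcher is installed
inside the PETSc tree, so (assuming PETSC_DIR and PETSC_ARCH are set in your
environment) you can also launch your example with it directly:
  $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -np 4 ./PETSc_helloWorld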
Matt
> MacUser$ mpiexec -np 4 ./PETSc_helloWorld
>
>  Hello from PETSc World, rank: 0 of total 1 processes.
>  Hello from PETSc World, rank: 0 of total 1 processes.
>  Hello from PETSc World, rank: 0 of total 1 processes.
>  Hello from PETSc World, rank: 0 of total 1 processes.
>
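> (The attached PETSc_helloWorld.F90 was scrubbed from the archive; the
> following is only a minimal sketch of such a code, assuming the pre-3.8
> include-file style of the PETSc Fortran bindings, not the exact attachment.)
>
>   program petsc_hello
>     implicit none
> #include <petsc/finclude/petscsys.h>
>     PetscErrorCode :: ierr
>     PetscMPIInt    :: rank, nproc
>
>     ! PetscInitialize calls MPI_Init if MPI has not been initialized yet
>     call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>     call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)
>     call MPI_Comm_size(PETSC_COMM_WORLD, nproc, ierr)
>     write(*,*) 'Hello from PETSc World, rank:', rank, &
>                ' of total', nproc, ' processes.'
>     call PetscFinalize(ierr)
>   end program petsc_hello
>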
> A similar code using plain MPI prints the following output:
>
> MacUser$ mpiexec -np 4 ./a.out
>
>  Hello from MPI World, rank: 0 of total 4 processes.
>  Hello from MPI World, rank: 1 of total 4 processes.
>  Hello from MPI World, rank: 2 of total 4 processes.
>  Hello from MPI World, rank: 3 of total 4 processes.
>
> I am not sure whether this is an issue with the PETSc installation on my
> machine or whether I am missing something here.
> I have attached both codes and the log_summary output of PETSc for your
> convenience.
>
> Thanks & Regards,
>
> Ajit Desai
> --
> Carleton University
> Ottawa, Canada
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener