[petsc-users] Configuration of Hybrid MPI-OpenMP

Jed Brown jed at jedbrown.org
Thu Jan 30 11:30:50 CST 2014

Danyang Su <danyang.su at gmail.com> writes:

> I checked the initialization of PETSc again and found that it does not 
> take effect. The code is as follows.
>          call PetscInitialize(Petsc_Null_Character,ierrcode)
>          call MPI_Comm_rank(Petsc_Comm_World,rank,ierrcode)
>          call MPI_Comm_size(Petsc_Comm_World,nprcs,ierrcode)
> The values of rank and nprcs are always 0 and 1, respectively, no matter 
> how many processors the program is run on.

The most common reason for this is that you have more than one MPI
implementation on your system and they are getting mixed up: the program
is compiled against one MPI but launched with the mpiexec of another, so
every process believes it is rank 0 of a 1-process job.
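One way to check for this (a diagnostic sketch; the wrapper and launcher names below are the usual MPICH/Open MPI ones and may differ on your system) is to confirm that the compiler wrapper used to build the program and the launcher used to run it resolve to the same MPI installation:

```shell
#!/bin/sh
# If the wrapper and the launcher live in different MPI installations,
# MPI_Comm_rank/MPI_Comm_size can return 0/1 under any -n.
command -v mpif90  || echo "mpif90 not on PATH"
command -v mpiexec || echo "mpiexec not on PATH"

# Show the underlying compile line (flag name varies by implementation:
# -show for MPICH, -showme for Open MPI).
mpif90 -show 2>/dev/null || mpif90 -showme 2>/dev/null || true

# The launcher's version string identifies which MPI it belongs to.
mpiexec --version 2>/dev/null || true
```

If the paths disagree, launch with the mpiexec from the same installation PETSc was configured with (for example, the one under that MPI's bin directory) rather than whichever mpiexec happens to be first on PATH.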
