[petsc-users] Configuration of Hybrid MPI-OpenMP

Danyang Su danyang.su at gmail.com
Thu Jan 30 12:59:31 CST 2014


On 30/01/2014 9:30 AM, Jed Brown wrote:
> Danyang Su <danyang.su at gmail.com> writes:
>
>> I double-checked the initialization of PETSc and found that the
>> initialization does not take effect. The code is as follows.
>>
>>           call PetscInitialize(Petsc_Null_Character,ierrcode)
>>           call MPI_Comm_rank(Petsc_Comm_World,rank,ierrcode)
>>           call MPI_Comm_size(Petsc_Comm_World,nprcs,ierrcode)
>>
>> The values of rank and nprcs are always 0 and 1, respectively, no
>> matter how many processors are used to run the program.
> The most common reason for this is that you have more than one MPI
> implementation on your system and they are getting mixed up.
Yes, I have both MPICH2 and Microsoft HPC on the same system. PETSc was 
built against MPICH2. I will uninstall Microsoft HPC to see if that 
fixes it.
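
For reference, a minimal standalone test along these lines can confirm 
which MPI is actually in use (a sketch only; the include path varies by 
PETSc version, e.g. finclude/petscsys.h in older releases versus 
petsc/finclude/petscsys.h in newer ones):

      program mpitest
      implicit none
#include <finclude/petscsys.h>
      PetscErrorCode ierr
      PetscMPIInt rank, nprcs

!     Initialize PETSc (which initializes MPI if not already done)
      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
!     Query rank and size on the world communicator
      call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(PETSC_COMM_WORLD, nprcs, ierr)
      write(*,*) 'rank', rank, 'of', nprcs, 'processes'
      call PetscFinalize(ierr)
      end

If this is compiled with the mpif90 wrapper from the same MPICH2 
installation PETSc was configured with, and launched with that 
installation's mpiexec (e.g. mpiexec -n 4 ./mpitest), each process 
should print a distinct rank. If every process still reports rank 0 of 
1, the launcher belongs to a different MPI implementation than the one 
PETSc is linked against.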

Thanks,

Danyang

