[petsc-users] Configuration of Hybrid MPI-OpenMP

Danyang Su danyang.su at gmail.com
Thu Jan 30 11:27:27 CST 2014


I took a second look at the PETSc initialization and found that it does not 
appear to take effect. The code is as follows.

         call PetscInitialize(Petsc_Null_Character,ierrcode)
         call MPI_Comm_rank(Petsc_Comm_World,rank,ierrcode)
         call MPI_Comm_size(Petsc_Comm_World,nprcs,ierrcode)

The values of rank and nprcs are always 0 and 1, respectively, no matter 
how many processes the program is launched on.
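
A minimal diagnostic sketch of the situation above (assumptions: 3.4-era 
PETSc Fortran includes under finclude/, mpif.h-style MPI bindings; adjust 
include paths for your installation). It checks whether PetscInitialize 
actually brought MPI up; seeing rank 0 / size 1 under "mpirun -np 4" 
often means the PETSc library was built against a different MPI (or no 
MPI at all) than the mpirun launching the job.

      program check_init
      implicit none
#include <finclude/petscsys.h>
      PetscErrorCode ierr
      PetscMPIInt rank, nprcs
      logical initialized

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

!     Ask the MPI library directly whether MPI_Init has been called.
      call MPI_Initialized(initialized, ierr)
      if (.not. initialized) then
         print *, 'MPI was never initialized by PetscInitialize!'
      end if

      call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(PETSC_COMM_WORLD, nprcs, ierr)
      print *, 'rank = ', rank, '  size = ', nprcs

      call PetscFinalize(ierr)
      end program check_init

If MPI_Initialized reports .true. but the size is still 1 on a multi-rank 
launch, the mpirun/mpiexec being used does not match the MPI that PETSc 
was linked against.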

Danyang

On 29/01/2014 6:08 PM, Jed Brown wrote:
> Danyang Su <danyang.su at gmail.com> writes:
>
>> Hi Karli,
>>
>> Configuring PETSc with "--with-threadcomm --with-openmp" does work for
>> hybrid MPI-OpenMP; sorry for the mistake before.
>> The program compiles, but I get a new error while running it.
>>
>> Error: Attempting to use an MPI routine before initializing MPICH
>>
>> This error occurs when calling MPI_SCATTERV. PetscInitialize has already
>> been called, and the MPI_BCAST immediately before the MPI_SCATTERV
>> completes without error.
>>
>> When PETSc is configured without OpenMP, there is no error in this section.
> Are you calling this inside an omp parallel block?  Are you initializing
> MPI with MPI_THREAD_MULTIPLE?  Do you have other threads doing something
> with MPI?
>
> I'm afraid we'll need a reproducible test case if it still doesn't work
> for you.
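
The thread-level questions above can also be answered from inside the 
program. A sketch, assuming mpif.h-style Fortran bindings (whether 
PetscInitialize requests MPI_THREAD_MULTIPLE depends on how PETSc was 
configured, so querying is the reliable check):

      integer provided, ierr

!     Report the thread support level the MPI library actually granted
!     at initialization time.
      call MPI_Query_thread(provided, ierr)
      if (provided .lt. MPI_THREAD_MULTIPLE) then
         print *, 'MPI not initialized with MPI_THREAD_MULTIPLE; ',
     &            'MPI calls from inside omp parallel regions are unsafe'
      end if

Calling MPI routines concurrently from multiple OpenMP threads is only 
permitted when the granted level is MPI_THREAD_MULTIPLE; with a lower 
level, MPI calls such as MPI_SCATTERV must stay outside parallel regions 
or be restricted to one thread.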


