[petsc-users] Running problem with pc_type hypre
Danyang Su
danyang.su at gmail.com
Thu Jun 5 12:58:34 CDT 2014
Hi All,
I recompiled the hypre library with the same compiler and Intel MKL,
and the error is gone.
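(For anyone hitting the same issue: one way to guarantee that hypre is
built with exactly the same compilers, MPI, and MKL as PETSc is to let
PETSc's configure download and build it. A minimal sketch; the compiler
names and MKL path below are placeholders for your own setup:

    ./configure --with-cc=<your-C-compiler> --with-fc=<your-Fortran-compiler> \
                --with-blas-lapack-dir=<path-to-mkl> --download-hypre

PETSc then passes its own compiler and library settings through to the
hypre build.)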
Thanks,
Danyang
On 28/05/2014 4:10 PM, Danyang Su wrote:
> Hi Barry,
>
> I need to check this further. Running the executable on another
> machine results in a missing mkl_intel_thread.dll error. I am not
> sure at present whether the mkl_intel_thread.dll version causes this
> problem.
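>
> (If it turns out to be a deployment problem rather than a version
> mismatch, one common fix is to put the MKL redistributable directory
> on the PATH of the target machine; the path below is hypothetical:
>
>     set PATH=C:\path\to\intel\mkl\redist\intel64;%PATH%
>
> Linking MKL statically would avoid the DLL dependency altogether.)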
>
> Thanks,
>
> Danyang
>
> On 28/05/2014 4:01 PM, Barry Smith wrote:
>> Some possibilities:
>>
>> Are you sure that hypre was compiled with exactly the same MPI
>> as the one used to build PETSc?
>>
>> On May 28, 2014, at 4:57 PM, Danyang Su <danyang.su at gmail.com> wrote:
>>
>>> Hi All,
>>>
>>> I am testing my code under Windows with PETSc v3.4.4.
>>>
>>> When running with option -pc_type hypre using 1 processor, the
>>> program actually uses 6 processors (my computer has 6 processors /
>>> 12 threads)
>> 6 threads? or 6 processes? It should not be possible for it to
>> use more processes than what you start the program with.
>>
>> hypre can be configured to use OpenMP thread parallelism PLUS
>> MPI parallelism. Was it configured/compiled for that? If so, you
>> want to turn that off: reconfigure and recompile hypre so that it
>> does not use OpenMP before linking it to PETSc.
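>>
>> For example, with hypre's autoconf-based configure, something like
>>
>>     ./configure --without-openmp
>>
>> (the exact flag may differ between hypre versions; check
>> ./configure --help). As a quick runtime test you can also cap the
>> thread counts with the standard environment variables
>>
>>     OMP_NUM_THREADS=1
>>     MKL_NUM_THREADS=1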
>>
>> Are you sure you don’t have a bunch of zombie MPI processes
>> running from previous jobs that crashed? They suck up CPU but are not
>> involved in the current MPI run. Reboot the machine to get rid of
>> them all.
>>
>> Barry
>>
>>> and the program crashed after many timesteps. The error information
>>> is as follows:
>>>
>>> job aborted:
>>> [ranks] message
>>>
>>> [0] fatal error
>>> Fatal error in MPI_Comm_create: Internal MPI error!, error stack:
>>> MPI_Comm_create(536).......: MPI_Comm_create(comm=0x84000000,
>>> group=0xc80300f2, new_comm=0x000000001EA6DD30) failed
>>> MPI_Comm_create(524).......:
>>> MPIR_Comm_create_intra(209):
>>> MPIR_Get_contextid(253)....: Too many communicators
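>>>
>>> (My understanding is that an MPI implementation has a finite pool
>>> of communicator context ids, so if communicators are created
>>> repeatedly and never freed, this error eventually appears. A
>>> minimal stand-alone sketch, purely for illustration, that triggers
>>> the same failure:
>>>
>>>     #include <mpi.h>
>>>     int main(int argc, char **argv)
>>>     {
>>>         MPI_Group grp;
>>>         MPI_Init(&argc, &argv);
>>>         MPI_Comm_group(MPI_COMM_WORLD, &grp);
>>>         for (;;) {              /* MPI_Comm_free() is never called */
>>>             MPI_Comm newcomm;
>>>             MPI_Comm_create(MPI_COMM_WORLD, grp, &newcomm);
>>>         }
>>>     }
>>>
>>> So perhaps a communicator is being leaked once per time step.)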
>>>
>>> When running with option -pc_type hypre using 2 or more processors,
>>> the program uses all the threads, seriously overburdening the
>>> system, and it runs very slowly.
>>>
>>> When running without -pc_type hypre, the program works fine.
>>>
>>> Does anybody have the same problem on Windows?
>>>
>>> Thanks and regards,
>>>
>>> Danyang
>