[petsc-users] Running problem with pc_type hypre
Danyang Su
danyang.su at gmail.com
Wed May 28 16:57:54 CDT 2014
Hi All,
I am testing my codes under windows with PETSc V3.4.4.
When running with the option -pc_type hypre on 1 processor, the program
actually uses 6 cores (my computer has 6 cores / 12 threads), and it
crashes after many timesteps. The error information is as follows:
job aborted:
[ranks] message
[0] fatal error
Fatal error in MPI_Comm_create: Internal MPI error!, error stack:
MPI_Comm_create(536).......: MPI_Comm_create(comm=0x84000000,
group=0xc80300f2, new_comm=0x000000001EA6DD30) failed
MPI_Comm_create(524).......:
MPIR_Comm_create_intra(209):
MPIR_Get_contextid(253)....: Too many communicators
When running with the option -pc_type hypre on 2 or more processors, the
program uses all the threads on the machine, seriously overburdening the
system and making the program run very slowly.
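The oversubscription can happen when hypre (or a threaded BLAS such as MKL) was built with OpenMP support, so every MPI rank spawns its own team of threads. A minimal sketch of limiting each rank to one thread via standard environment variables; the executable name "mysolver" is hypothetical, so substitute your own binary and launch command:

```shell
# Pin each MPI rank to a single OpenMP thread so a threaded hypre/BLAS
# does not oversubscribe the cores.
export OMP_NUM_THREADS=1
export MKL_NUM_THREADS=1   # only relevant if a threaded MKL BLAS is linked (assumption)

# Hypothetical launch line -- replace "mysolver" with your program:
echo "mpiexec -n 2 ./mysolver -pc_type hypre"
```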
When running without -pc_type hypre, the program works fine.
Does anybody have the same problem on Windows?
Thanks and regards,
Danyang