[petsc-users] MPI_Testall errors

Jed Brown jed at jedbrown.org
Mon Jul 23 11:32:53 CDT 2018


Do you have a reproducible test case you could give us?

Does the problem go away if you run with -matstash_legacy?
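For reference, one way to try that (the executable name below is just a placeholder, not your application):

    mpiexec -n 192 ./your_app -matstash_legacy

or, equivalently, early in the program (e.g. right after PetscInitialize()):

    ierr = PetscOptionsSetValue(NULL, "-matstash_legacy", NULL);CHKERRQ(ierr);

That option should make the assembly fall back to the older stash communication scheme instead of the PetscCommBuildTwoSidedFReq/Ibarrier path that appears in your trace.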

"Pham, Dung Ngoc" <dnpham at wpi.edu> writes:

> Hello,
>
> I am constructing a large matrix for a generalized eigenvalue problem, of order ~700,000 x 700,000, in MPIAIJ format across multiple nodes. The matrix construction goes smoothly for smaller matrices or when I use fewer cores. However, when I use more cores (~180 or more), I get the following errors.
>
>
> [169]PETSC ERROR: #1 PetscCommBuildTwoSidedFReq_Ibarrier() line 418 in /petsc/petsc-3.9.3/src/sys/utils/mpits.c
> [169]PETSC ERROR: #2 PetscCommBuildTwoSidedFReq() line 570 in /petsc/petsc-3.9.3/src/sys/utils/mpits.c
> [169]PETSC ERROR: #3 MatStashScatterBegin_BTS() line 933 in /petsc/petsc-3.9.3/src/mat/utils/matstash.c
> [169]PETSC ERROR: #4 MatStashScatterBegin_Private() line 461 in /petsc/petsc-3.9.3/src/mat/utils/matstash.c
> [169]PETSC ERROR: #5 MatAssemblyBegin_MPIAIJ() line 683 in /petsc/petsc-3.9.3/src/mat/impls/aij/mpi/mpiaij.c
> [169]PETSC ERROR: #6 MatAssemblyBegin() line 5158 in /petsc/petsc-3.9.3/src/mat/interface/matrix.c
>
> I understand that the errors occur while assembling the matrix, in particular in the subroutine "MPI_Testall", which tests for the completion of the communication requests. What might be the cause of such errors? Any thoughts on a resolution are appreciated. I have attached the configure.log as well.
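If it helps in putting a reproducible test case together, a minimal skeleton of the assembly pattern that exercises this code path could look roughly like the following; the sizes, values, and the choice of off-process rows are placeholders, not your application:

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Mat            A;
      PetscInt       n = 700000, rstart, rend, i, j;
      PetscScalar    one = 1.0;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

      ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
      ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
      ierr = MatSetType(A, MATMPIAIJ);CHKERRQ(ierr);
      ierr = MatSetFromOptions(A);CHKERRQ(ierr);
      ierr = MatSetUp(A);CHKERRQ(ierr);   /* default preallocation; slow but fine for a reproducer */

      ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
      for (i = rstart; i < rend; i++) {
        /* local diagonal entry */
        ierr = MatSetValues(A, 1, &i, 1, &i, &one, ADD_VALUES);CHKERRQ(ierr);
        /* entry in a row most ranks do not own: such values go into the stash
           and are communicated during MatAssemblyBegin(), the step that fails
           in the trace above */
        j    = (i + n/2) % n;
        ierr = MatSetValues(A, 1, &j, 1, &i, &one, ADD_VALUES);CHKERRQ(ierr);
      }
      ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

      ierr = MatDestroy(&A);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }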

