[petsc-users] MPI_Testall errors

Pham, Dung Ngoc dnpham at wpi.edu
Mon Jul 23 10:58:01 CDT 2018



I am constructing a large matrix for a generalized eigenvalue problem, of order ~700,000 x 700,000, in MPIAIJ format across multiple nodes. The matrix construction goes smoothly for smaller matrices or when I use fewer cores. However, when I use more cores (~180 or more), I get the following errors.

[169]PETSC ERROR: #1 PetscCommBuildTwoSidedFReq_Ibarrier() line 418 in /petsc/petsc-3.9.3/src/sys/utils/mpits.c
[169]PETSC ERROR: #2 PetscCommBuildTwoSidedFReq() line 570 in /petsc/petsc-3.9.3/src/sys/utils/mpits.c
[169]PETSC ERROR: #3 MatStashScatterBegin_BTS() line 933 in /petsc/petsc-3.9.3/src/mat/utils/matstash.c
[169]PETSC ERROR: #4 MatStashScatterBegin_Private() line 461 in /petsc/petsc-3.9.3/src/mat/utils/matstash.c
[169]PETSC ERROR: #5 MatAssemblyBegin_MPIAIJ() line 683 in /petsc/petsc-3.9.3/src/mat/impls/aij/mpi/mpiaij.c
[169]PETSC ERROR: #6 MatAssemblyBegin() line 5158 in /petsc/petsc-3.9.3/src/mat/interface/matrix.c

I understand that the errors occur while assembling the matrix, specifically in the subroutine "MPI_Testall", which tests the completion of MPI requests. What might be the cause of such errors? Any thoughts on a resolution are appreciated. I have attached the configure.log as well.
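One diagnostic worth trying (an assumption on my part, not something confirmed for this case): the PetscCommBuildTwoSidedFReq_Ibarrier frame in the trace means the stash exchange is using the MPI_Ibarrier-based protocol, which has exposed bugs in some MPI implementations at larger rank counts. PETSc lets you switch the protocol from the command line, e.g.:

```shell
# Switch the two-sided exchange away from the Ibarrier-based protocol
mpiexec -n 180 ./myapp -build_twosided allreduce

# Or fall back to the older matrix-stash communication path
mpiexec -n 180 ./myapp -matstash_legacy
```

If the error disappears with one of these options, that points at the MPI implementation's nonblocking-collective support rather than the application code.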


D. N. Pham

-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure.log
Type: text/x-log
Size: 7667765 bytes
Desc: configure.log
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20180723/c2bd4d12/attachment-0001.bin>
