<div dir="ltr"><div dir="ltr"><br><br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Aug 18, 2021 at 12:52 PM Feimi Yu <<a href="mailto:yuf2@rpi.edu" target="_blank">yuf2@rpi.edu</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
> Hi,
>
> I was trying to run a simulation with a PETSc-wrapped Hypre
> preconditioner and encountered this problem:
>
> [dcs122:133012] Out of resources: all 4095 communicator IDs have been used.
> [19]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [19]PETSC ERROR: General MPI error
> [19]PETSC ERROR: MPI error 17 MPI_ERR_INTERN: internal error
> [19]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [19]PETSC ERROR: Petsc Release Version 3.15.2, unknown
> [19]PETSC ERROR: ./main on a arch-linux-c-opt named dcs122 by CFSIfmyu Wed Aug 11 19:51:47 2021
> [19]PETSC ERROR: [dcs122:133010] Out of resources: all 4095 communicator IDs have been used.
> [18]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [18]PETSC ERROR: General MPI error
> [18]PETSC ERROR: MPI error 17 MPI_ERR_INTERN: internal error
> [18]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [18]PETSC ERROR: Petsc Release Version 3.15.2, unknown
> [18]PETSC ERROR: ./main on a arch-linux-c-opt named dcs122 by CFSIfmyu Wed Aug 11 19:51:47 2021
> [18]PETSC ERROR: Configure options --download-scalapack --download-mumps --download-hypre --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cudac=0 --with-debugging=0 --with-blaslapack-dir=/gpfs/u/home/CFSI/CFSIfmyu/barn-shared/dcs-rh8/lapack-build/
> [18]PETSC ERROR: #1 MatCreate_HYPRE() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/mat/impls/hypre/mhypre.c:2120
> [18]PETSC ERROR: #2 MatSetType() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/mat/interface/matreg.c:91
> [18]PETSC ERROR: #3 MatConvert_AIJ_HYPRE() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/mat/impls/hypre/mhypre.c:392
> [18]PETSC ERROR: #4 MatConvert() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/mat/interface/matrix.c:4439
> [18]PETSC ERROR: #5 PCSetUp_HYPRE() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/ksp/pc/impls/hypre/hypre.c:240
> [18]PETSC ERROR: #6 PCSetUp() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/ksp/pc/interface/precon.c:1015
> Configure options --download-scalapack --download-mumps --download-hypre --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cudac=0 --with-debugging=0 --with-blaslapack-dir=/gpfs/u/home/CFSI/CFSIfmyu/barn-shared/dcs-rh8/lapack-build/
> [19]PETSC ERROR: #1 MatCreate_HYPRE() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/mat/impls/hypre/mhypre.c:2120
> [19]PETSC ERROR: #2 MatSetType() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/mat/interface/matreg.c:91
> [19]PETSC ERROR: #3 MatConvert_AIJ_HYPRE() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/mat/impls/hypre/mhypre.c:392
> [19]PETSC ERROR: #4 MatConvert() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/mat/interface/matrix.c:4439
> [19]PETSC ERROR: #5 PCSetUp_HYPRE() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/ksp/pc/impls/hypre/hypre.c:240
> [19]PETSC ERROR: #6 PCSetUp() at /gpfs/u/barn/CFSI/shared/dcs-rh8/petsc/src/ksp/pc/interface/precon.c:1015
>
> It seems that MPI_Comm_dup() at petsc/src/mat/impls/hypre/mhypre.c:2120
> caused the problem. Since my problem is time-dependent, MatCreate_HYPRE()
> is called every time the new system matrix is assembled. The above error
> message is reported after ~4095 calls of MatCreate_HYPRE(), which is
> around 455 time steps in my code.
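>
> The exhaustion itself is easy to demonstrate outside my solver. Here is a
> minimal hypothetical sketch of what I assume is the underlying pattern
> (one MPI_Comm_dup() per MatCreate_HYPRE() with no matching free, against
> the 4095-communicator cap reported in the message above):
>
> #include <mpi.h>
> #include <stdio.h>
>
> int main(int argc, char **argv)
> {
>     MPI_Init(&argc, &argv);
>     /* Return error codes instead of aborting, so we can count the dups. */
>     MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
>     for (int i = 0; i < 100000; ++i) {
>         MPI_Comm dup;
>         if (MPI_Comm_dup(MPI_COMM_WORLD, &dup) != MPI_SUCCESS) {
>             printf("MPI_Comm_dup failed after %d duplicates\n", i);
>             break;
>         }
>         /* MPI_Comm_free(&dup) is deliberately omitted to mimic the leak. */
>     }
>     MPI_Finalize();
>     return 0;
> }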
>
> Here is some basic compiler information:

Can you destroy old matrices to free MPI communicators? Otherwise, you run
into a limitation we knew about before.
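For example, here is a minimal standalone sketch of the pattern I mean (the
time loop and dummy assembly are hypothetical, not your code): create the
matrix each step, solve, and destroy everything before the next step, so the
communicator dup'ed inside MatCreate_HYPRE() is returned each iteration.

#include <petscksp.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  for (PetscInt step = 0; step < 5000; ++step) { /* deliberately > 4095 */
    Mat      A;
    Vec      x, b;
    KSP      ksp;
    PetscInt Istart, Iend;

    /* Stand-in for assembling the new system matrix at each time step. */
    ierr = MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                        100, 100, 1, NULL, 0, NULL, &A); CHKERRQ(ierr);
    ierr = MatGetOwnershipRange(A, &Istart, &Iend); CHKERRQ(ierr);
    for (PetscInt i = Istart; i < Iend; ++i) {
      ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES); CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

    ierr = MatCreateVecs(A, &x, &b); CHKERRQ(ierr);
    ierr = VecSet(b, 1.0); CHKERRQ(ierr);

    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); /* run with -pc_type hypre */
    ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);

    /* The destroys are the essential part: releasing the last reference
       to the operator (and the PC's converted hypre copy of it) frees
       the MPI communicator that was duplicated for hypre. */
    ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
    ierr = VecDestroy(&x); CHKERRQ(ierr);
    ierr = VecDestroy(&b); CHKERRQ(ierr);
    ierr = MatDestroy(&A); CHKERRQ(ierr);
  }
  ierr = PetscFinalize();
  return ierr;
}

Whether you recreate the KSP each step or keep one and reset its operators,
the essential part is destroying the previous matrix once it is replaced.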

> IBM Spectrum MPI 10.4.0
> GCC 8.4.1
>
> I've never had this problem with the OpenMPI or MPICH implementations
> before, so I was wondering whether this can be resolved on my end or is
> an implementation-specific problem.
>
> Thanks!
>
> Feimi