<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class="">From <span style="font-family: Menlo; font-size: 14px;" class="">src/sys/objects/ftn-custom/zstart.c petscinitialize_internal</span><br class=""><br class="">PETSC_COMM_WORLD = MPI_COMM_WORLD<div class=""><br class=""></div><div class="">This means that PETSC_COMM_WORLD is not itself a PETSc communicator.</div><div class=""><br class=""></div><div class="">The first matrix creation duplicates PETSC_COMM_WORLD, and that duplicate can then be reused by the other objects.</div><div class="">When you finally destroy the matrix inside the loop, the reference count of this duplicated communicator goes to zero and it is freed.</div><div class="">This is why a new duplicate is created at each step.</div><div class=""><br class=""></div><div class="">However, the C version of PetscInitialize does the same, so I’m not sure why this happens with Fortran and not with C. (Do you leak objects in the C code?)<br class=""><div><br class=""></div><div><br class=""><blockquote type="cite" class=""><div class="">On Nov 1, 2019, at 1:41 PM, Patrick Sanan via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" class="">petsc-users@mcs.anl.gov</a>> wrote:</div><br class="Apple-interchange-newline"><div class=""><div dir="ltr" class=""><b class="">Context:</b> I'm trying to track down an error that (only) arises when running a Fortran 90 code, using PETSc, on a new cluster. The code creates and destroys a linear system (Mat, Vec, and KSP) at each of (many) timesteps. 
The error message from a user looks like this, which leads me to suspect that MPI_Comm_dup() is being called many times and this is eventually a problem for this particular MPI implementation (Open MPI 2.1.0):<br class=""><blockquote style="margin:0 0 0 40px;border:none;padding:0px" class=""><br class="">[lo-a2-058:21425] *** An error occurred in MPI_Comm_dup<br class="">[lo-a2-058:21425] *** reported by process [4222287873,2]<br class="">[lo-a2-058:21425] *** on communicator MPI COMMUNICATOR 65534 DUP FROM 65533<br class="">[lo-a2-058:21425] *** MPI_ERR_INTERN: internal error<br class="">[lo-a2-058:21425] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,<br class="">[lo-a2-058:21425] *** and potentially your MPI job)<br class=""><br class=""></blockquote><b class="">Question: </b>I remember some discussion recently (but can't find the thread) about not calling MPI_Comm_dup() too many times from PetscCommDuplicate(), which would allow one to safely use the (admittedly not optimal) approach used in this application code. Is that a correct understanding and would the fixes made in that context also apply to Fortran? I don't fully understand the details of the MPI techniques used, so thought I'd ask here. <div class=""><br class=""></div><div class="">If I hack a simple build-solve-destroy example to run several loops, I see a notable difference between C and Fortran examples. With the attached ex223.c and ex221f.F90, which just add outer loops (5 iterations) to KSP tutorials examples ex23.c and ex21f.F90, respectively, I see the following. 
Note that in the Fortran case, it appears that communicators are actually duplicated in each loop, but in the C case, this only happens in the first loop:</div><div class=""><br class=""></div><blockquote style="margin:0 0 0 40px;border:none;padding:0px" class=""><div class="">[(arch-maint-extra-opt) tutorials (maint *$%=)]$ ./ex223 -info | grep PetscCommDuplicate</div><div class="">[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 268435455</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 
1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class=""><br class=""></div><div class="">[(arch-maint-extra-opt) tutorials (maint *$%=)]$ ./ex221f -info | grep PetscCommDuplicate</div><div class="">[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 268435455</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 268435455</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 268435455</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using 
internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 268435455</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 268435455</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div><div class="">[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784</div></blockquote><div class=""><br class=""></div><div class=""><br class=""><blockquote style="margin:0 0 0 40px;border:none;padding:0px" class=""><br class=""></blockquote></div></div>
<span id="cid:f_k2fzzyeq0"><ex221f.F90></span><span id="cid:f_k2fzzyfg1"><ex223.c></span></div></blockquote></div><br class=""></div></body></html>