Were they all tested with your ex2f.F90 example?

--Junchao Zhang

On Fri, Jul 11, 2025 at 6:58 AM Klaij, Christiaan <C.Klaij@marin.nl> wrote:

In summary for future reference:
- tested 3 different machines, two at Marin, one at the national HPC
- tested 3 different MPI implementations (Intel MPI, Open MPI and MPICH)
- tested Open MPI in both release and debug builds
- tested 2 different compilers (Intel and GNU), both older and very recent versions
- tested with the most basic config (./configure --with-cxx=0 --with-debugging=0 --download-mpich)

All of these tests either segfault, hang, or error out at the call to PetscLogView.

Chris

________________________________________
From: Klaij, Christiaan <C.Klaij@marin.nl>
Sent: Friday, July 11, 2025 10:10 AM
To: Barry Smith; Junchao Zhang
Cc: PETSc users list
Subject: Re: [petsc-users] problem with nested logging, standalone example

@Matt: no MPI errors indeed. I've tried with MPICH and I get the same hang.
@Barry: the two stack traces aren't exactly the same; see a sample with MPICH below.

If it cannot be reproduced on your side, I'm afraid this is another dead end. Thanks anyway, I really appreciate all your help.

Chris

(gdb) bt
#0  0x000015555033bc2e in MPIDI_POSIX_mpi_release_gather_gather.constprop.0 ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#1  0x000015555033db8a in MPIDI_POSIX_mpi_allreduce_release_gather ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#2  0x000015555033e70f in MPIR_Allreduce ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#3  0x000015555033f22e in PMPI_Allreduce ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#4  0x0000155553f85d69 in MPIU_Allreduce_Count (comm=-2080374782,
    op=1476395020, dtype=1275072547, count=1, outbuf=0x7fffffffac70,
    inbuf=0x7fffffffac60)
    at /home/cklaij/petsc/petsc-3.23.4/src/sys/objects/pinit.c:1839
#5  MPIU_Allreduce_Private (inbuf=inbuf@entry=0x7fffffffac60,
    outbuf=outbuf@entry=0x7fffffffac70, count=count@entry=1,
    dtype=dtype@entry=1275072547, op=op@entry=1476395020, comm=-2080374782)
    at /home/cklaij/petsc/petsc-3.23.4/src/sys/objects/pinit.c:1869
#6  0x0000155553f33dbe in PetscPrintXMLNestedLinePerfResults (
    viewer=viewer@entry=0x458890, name=name@entry=0x155554ef6a0d 'mbps\000',
    value=<optimized out>, minthreshold=minthreshold@entry=0,
    maxthreshold=maxthreshold@entry=0.01,
    minmaxtreshold=minmaxtreshold@entry=1.05)
    at /home/cklaij/petsc/petsc-3.23.4/src/sys/logging/handler/impls/nested/xmlviewer.c:255

(gdb) bt
#0  0x000015554fed3b17 in clock_gettime@GLIBC_2.2.5 () from /lib64/libc.so.6
#1  0x0000155550b0de71 in ofi_gettime_ns ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#2  0x0000155550b0dec9 in ofi_gettime_ms ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#3  0x0000155550b2fab5 in sock_cq_sreadfrom ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#4  0x00001555505ca6f7 in MPIDI_OFI_progress ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#5  0x0000155550591fe9 in progress_test ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#6  0x00001555505924a3 in MPID_Progress_wait ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#7  0x000015555043463e in MPIR_Wait_state ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#8  0x000015555052ec49 in MPIC_Wait ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#9  0x000015555053093e in MPIC_Sendrecv ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#10 0x00001555504bf674 in MPIR_Allreduce_intra_recursive_doubling ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
#11 0x00001555505b61de in MPIDI_OFI_mpi_finalize_hook ()
   from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
________________________________________
From: Barry Smith <bsmith@petsc.dev>
Sent: Thursday, July 10, 2025 11:10 PM
To: Junchao Zhang
Cc: Klaij, Christiaan; PETSc users list
Subject: Re: [petsc-users] problem with nested logging, standalone example

  I cannot reproduce

On Jul 10, 2025, at 3:46 PM, Junchao Zhang <junchao.zhang@gmail.com> wrote:

Adding -mca coll_hcoll_enable 0 didn't change anything at my end. Strange.

--Junchao Zhang

On Thu, Jul 10, 2025 at 3:39 AM Klaij, Christiaan <C.Klaij@marin.nl> wrote:
An additional clue perhaps: with the option OMPI_MCA_coll_hcoll_enable=0, the code does not hang but gives the error below.

Chris

$ mpirun -mca coll_hcoll_enable 0 -n 2 ./ex2f-cklaij-dbg -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always
0 KSP Residual norm 1.11803
1 KSP Residual norm 0.591608
2 KSP Residual norm 0.316228
3 KSP Residual norm < 1.e-11
0 KSP Residual norm 0.707107
1 KSP Residual norm 0.408248
2 KSP Residual norm < 1.e-11
Norm of error < 1.e-12 iterations 3
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: General MPI error
[1]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.22.4, Mar 01, 2025
[1]PETSC ERROR: ./ex2f-cklaij-dbg with 2 MPI process(es) and PETSC_ARCH on login1 by cklaij Thu Jul 10 10:33:33 2025
[1]PETSC ERROR: Configure options: --prefix=/home/cklaij/ReFRESCO/trunk/install/extLibs --with-mpi-dir=/cm/shared/apps/openmpi/gcc/5.0.6-debug --with-x=0 --with-mpe=0 --with-debugging=0 --download-superlu_dist=https://updates.marin.nl/refresco/libs/superlu_dist-8.1.2.tar.gz --with-blaslapack-dir=/cm/shared/apps/oneapi/2024.2.1/mkl/2024.2 --download-parmetis=https://updates.marin.nl/refresco/libs/parmetis-4.0.3-p9.tar.gz --download-metis=https://updates.marin.nl/refresco/libs/metis-5.1.0-p11.tar.gz --with-packages-build-dir=/home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"
[1]PETSC ERROR: #1 PetscLogNestedTreePrintLine() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:289
[1]PETSC ERROR: #2 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:377
[1]PETSC ERROR: #3 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[1]PETSC ERROR: #4 PetscLogNestedTreePrintTop() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:420
[1]PETSC ERROR: #5 PetscLogHandlerView_Nested_XML() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:443
[1]PETSC ERROR: #6 PetscLogHandlerView_Nested() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/lognested.c:405
[1]PETSC ERROR: #7 PetscLogHandlerView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/interface/loghandler.c:342
[1]PETSC ERROR: #8 PetscLogView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/plog.c:2040
[1]PETSC ERROR: #9 ex2f-cklaij-dbg.F90:301
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
Proc: [[55228,1],1]
Errorcode: 98

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
prterun has exited due to process rank 1 with PID 0 on node login1 calling
"abort". This may have caused other processes in the application to be
terminated by signals sent by prterun (as reported here).
--------------------------------------------------------------------------

________________________________________
dr. ir. Christiaan Klaij | senior researcher
Research & Development | CFD Development
T +31 317 49 33 44 | www.marin.nl

From: Klaij, Christiaan <C.Klaij@marin.nl>
Sent: Thursday, July 10, 2025 10:15 AM
To: Junchao Zhang
Cc: PETSc users list
Subject: Re: [petsc-users] problem with nested logging, standalone example

Hi Junchao,

Thanks for testing. I've fixed the error, but unfortunately that doesn't change the behavior: the code still hangs as before, with the same stack trace...

Chris

________________________________________
From: Junchao Zhang <junchao.zhang@gmail.com>
Sent: Tuesday, July 8, 2025 10:58 PM
To: Klaij, Christiaan
Cc: PETSc users list
Subject: Re: [petsc-users] problem with nested logging, standalone example

Hi, Chris,
First, I had to fix an error in your test by adding "PetscCallA(MatSetFromOptions(AA,ierr))" at line 254. Without it the run failed with:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Mat object's type is not set: Argument # 1
...
[0]PETSC ERROR: #1 MatSetValues() at /scratch/jczhang/petsc/src/mat/interface/matrix.c:1503
[0]PETSC ERROR: #2 ex2f.F90:258
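For reference, the usual Fortran assembly order that avoids this "type is not set" error is sketched below: MatSetFromOptions() (or MatSetType()) has to come before the first MatSetValues(). This is only a minimal sketch, not the attached example itself; the name AA mirrors the variable mentioned above, while the communicator and sizes are illustrative.

program mat_setup_sketch
#include <petsc/finclude/petscmat.h>
      use petscmat
      implicit none
      Mat            :: AA
      PetscErrorCode :: ierr
      PetscInt       :: m, row(1), col(1)
      PetscScalar    :: v(1)

      PetscCallA(PetscInitialize(ierr))
      m = 10
      PetscCallA(MatCreate(PETSC_COMM_SELF,AA,ierr))
      PetscCallA(MatSetSizes(AA,PETSC_DECIDE,PETSC_DECIDE,m,m,ierr))
      ! the call that was missing: sets the matrix type (default or -mat_type)
      ! before any values are inserted
      PetscCallA(MatSetFromOptions(AA,ierr))
      PetscCallA(MatSetUp(AA,ierr))
      row(1) = 0; col(1) = 0; v(1) = 1.0
      PetscCallA(MatSetValues(AA,1,row,1,col,v,INSERT_VALUES,ierr))
      PetscCallA(MatAssemblyBegin(AA,MAT_FINAL_ASSEMBLY,ierr))
      PetscCallA(MatAssemblyEnd(AA,MAT_FINAL_ASSEMBLY,ierr))
      PetscCallA(MatDestroy(AA,ierr))
      PetscCallA(PetscFinalize(ierr))
end program mat_setup_sketch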

Then I could run the test without problems:
mpirun -n 2 ./ex2f -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always
0 KSP Residual norm 1.11803
1 KSP Residual norm 0.591608
2 KSP Residual norm 0.316228
3 KSP Residual norm < 1.e-11
0 KSP Residual norm 0.707107
1 KSP Residual norm 0.408248
2 KSP Residual norm < 1.e-11
Norm of error < 1.e-12 iterations 3

I used petsc-3.22.4, gcc-11.3, openmpi-5.0.6 and configured with
./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-openmpi --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"

Could you fix the error and retry?

--Junchao Zhang

On Sun, Jul 6, 2025 at 12:57 PM Klaij, Christiaan via petsc-users <petsc-users@mcs.anl.gov> wrote:
Attached is a standalone example of the issue described in the
earlier thread "problem with nested logging". The issue appeared
somewhere between petsc 3.19.4 and 3.23.4.

The example is a variation of ../ksp/tutorials/ex2f.F90, where
I've added the nested log viewer with one event as well as the
solution of a small system on rank zero.
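For readers without the attachment at hand, the nested-logging additions referred to here typically look like the sketch below. This is a minimal sketch rather than the attached example itself: it assumes the nested handler is started with PetscLogNestedBegin() and the log is viewed in XML format, and the class name, event name and output file are placeholders.

program nested_log_sketch
#include <petsc/finclude/petscsys.h>
      use petscsys
      implicit none
      PetscErrorCode :: ierr
      PetscClassId   :: classid
      PetscLogEvent  :: userevent
      PetscViewer    :: viewer

      PetscCallA(PetscInitialize(ierr))
      ! start the nested log handler before the code to be profiled
      PetscCallA(PetscLogNestedBegin(ierr))
      PetscCallA(PetscClassIdRegister('User',classid,ierr))
      PetscCallA(PetscLogEventRegister('UserSolve',classid,userevent,ierr))

      PetscCallA(PetscLogEventBegin(userevent,ierr))
      ! ... the work to be profiled, e.g. the KSP solves of ex2f.F90 ...
      PetscCallA(PetscLogEventEnd(userevent,ierr))

      ! write the nested log as XML; PetscLogView is collective over the
      ! viewer's communicator, so all ranks must reach this call
      PetscCallA(PetscViewerASCIIOpen(PETSC_COMM_WORLD,'profile.xml',viewer,ierr))
      PetscCallA(PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_XML,ierr))
      PetscCallA(PetscLogView(viewer,ierr))
      PetscCallA(PetscViewerPopFormat(viewer,ierr))
      PetscCallA(PetscViewerDestroy(viewer,ierr))
      PetscCallA(PetscFinalize(ierr))
end program nested_log_sketch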

When running on multiple procs the example hangs during
PetscLogView with the backtrace below. The configure.log is also
attached in the hope that you can replicate the issue.

Chris

#0 0x000015554c84ea9e in mca_pml_ucx_recv (buf=0x7fffffff9e30, count=1,
datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, src=1, tag=-12,
comm=0x7f1e30, mpi_status=0x0) at pml_ucx.c:700
#1 0x000015554c65baff in ompi_coll_base_allreduce_intra_recursivedoubling (
sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,
dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,
op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630)
at base/coll_base_allreduce.c:247
#2 0x000015554c6a7e40 in ompi_coll_tuned_allreduce_intra_do_this (
sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,
dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,
op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630,
algorithm=3, faninout=0, segsize=0) at coll_tuned_allreduce_decision.c:142
#3 0x000015554c6a054f in ompi_coll_tuned_allreduce_intra_dec_fixed (
sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,
dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,
op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630)
at coll_tuned_decision_fixed.c:216
#4 0x000015554c68e160 in mca_coll_hcoll_allreduce (sbuf=0x7fffffff9e20,
rbuf=0x7fffffff9e30, count=1, dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,
op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaecb80)
at coll_hcoll_ops.c:217
#5 0x000015554c59811a in PMPI_Allreduce (sendbuf=0x7fffffff9e20,
recvbuf=0x7fffffff9e30, count=1, datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30) at allreduce.c:123
#6 0x0000155553eabede in MPIU_Allreduce_Private () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#7 0x0000155553e50d08 in PetscPrintXMLNestedLinePerfResults () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#8 0x0000155553e5123e in PetscLogNestedTreePrintLine () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#9 0x0000155553e51f3a in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#10 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#11 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#12 0x0000155553e52142 in PetscLogNestedTreePrintTop () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#13 0x0000155553e5257b in PetscLogHandlerView_Nested_XML () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#14 0x0000155553e4e5a0 in PetscLogHandlerView_Nested () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#15 0x0000155553e56232 in PetscLogHandlerView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#16 0x0000155553e588c3 in PetscLogView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#17 0x0000155553e40eb5 in petsclogview_ () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
#18 0x0000000000402c8b in MAIN__ ()
#19 0x00000000004023df in main ()
dr. ir. Christiaan Klaij | senior researcher
Research & Development | CFD Development
T +31 317 49 33 44 | www.marin.nl