<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
</head>
<body>
<div dir="ltr" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
Thank you for the quick fix — it works well on my end.</div>
<div dir="ltr" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
<br>
</div>
<div dir="ltr" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
Best wishes,</div>
<div dir="ltr" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
Zongze</div>
<div dir="ltr" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
<br>
</div>
<div id="mail-editor-reference-message-container">
<div dir="ltr" class="ms-outlook-mobile-reference-message skipProofing"></div>
<div class="ms-outlook-mobile-reference-message skipProofing" style="text-align: left; padding: 3pt 0in 0in; border-width: 1pt medium medium; border-style: solid none none; border-color: rgb(181, 196, 223) currentcolor currentcolor; font-family: Aptos; font-size: 12pt; color: black;">
<b>From: </b>Junchao Zhang <junchao.zhang@gmail.com><br>
<b>Date: </b>Thursday, July 24, 2025 at 02:55<br>
<b>To: </b>Barry Smith <bsmith@petsc.dev><br>
<b>Cc: </b>Zongze Yang <yangzongze@gmail.com>, Klaij, Christiaan <C.Klaij@marin.nl>, PETSc users list <petsc-users@mcs.anl.gov><br>
<b>Subject: </b>Re: [petsc-users] problem with nested logging, standalone example<br>
<br>
</div>
<div dir="ltr" class="ms-outlook-mobile-reference-message skipProofing">I think I have a fix at
<a href="https://urldefense.us/v3/__https://gitlab.com/petsc/petsc/-/merge_requests/8583__;!!G_uCfscf7eWS!fxzaDHQxd3uHn2ASrZmv-IW42m1OeVvMXd0xo20hK2CZsZ_Mp8c7krPPe-rwleQvMo-ZGwDbRXPknH8Iv3wiy85a$" data-outlook-id="035c7344-275d-4a09-bd11-5fb0c05c699b">
https://gitlab.com/petsc/petsc/-/merge_requests/8583</a></div>
<div dir="ltr" class="ms-outlook-mobile-reference-message skipProofing"><br>
</div>
<div dir="ltr" class="ms-outlook-mobile-reference-message skipProofing">Chirs and Zongze, could you try it?</div>
<div dir="ltr" class="ms-outlook-mobile-reference-message skipProofing"><br>
</div>
<div dir="ltr" class="ms-outlook-mobile-reference-message skipProofing">Thanks!</div>
<div dir="ltr" class="ms-outlook-mobile-reference-message skipProofing">--Junchao Zhang</div>
<div dir="ltr" class="ms-outlook-mobile-reference-message skipProofing"><br>
</div>
<div dir="ltr" class="ms-outlook-mobile-reference-message skipProofing"><br>
</div>
<div dir="ltr" class="gmail_attr">On Tue, Jul 22, 2025 at 4:16 PM Barry Smith <<a href="mailto:bsmith@petsc.dev" data-outlook-id="61dbd3bf-86be-49ad-b101-dbc1589e0aaa">bsmith@petsc.dev</a>> wrote:</div>
<blockquote style="margin: 0px 0px 0px 0.8ex; padding-left: 1ex; border-left-width: 1px; border-left-style: solid; border-left-color: rgb(204, 204, 204);">
<div dir="ltr" class="gmail_quote"><br>
</div>
<div class="gmail_quote"> Yippee! (maybe)</div>
<div dir="ltr" class="gmail_quote"><br>
</div>
<blockquote>
<div class="gmail_quote">On Jul 22, 2025, at 4:18 PM, Junchao Zhang <<a href="mailto:junchao.zhang@gmail.com" target="_blank" data-outlook-id="3a5591a8-5591-4a44-96f8-d3ed750d6a1f">junchao.zhang@gmail.com</a>> wrote:</div>
<div dir="ltr" class="gmail_quote"><br>
</div>
<div dir="ltr" class="gmail_quote">With Chris's example, I did reproduce the "MPI_ERR_BUFFER: invalid buffer pointer" on a machine. I am looking into it.<br>
<br>
Thanks.</div>
<div dir="ltr" class="gmail_signature">--Junchao Zhang</div>
<div dir="ltr" class="gmail_quote"><br>
</div>
<div dir="ltr" class="gmail_quote"><br>
</div>
<div dir="ltr" class="gmail_attr">On Tue, Jul 22, 2025 at 9:51 AM Zongze Yang <<a href="mailto:yangzongze@gmail.com" target="_blank" data-outlook-id="9630781c-e1e0-4340-9b8f-3f0e0ce53db4">yangzongze@gmail.com</a>> wrote:</div>
<blockquote style="margin: 0px 0px 0px 0.8ex; padding-left: 1ex; border-left-width: 1px; border-left-style: solid; border-left-color: rgb(204, 204, 204);">
<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
Hi,<br>
I encountered a similar issue with Firedrake when using the <code>-log_view</code> option with XML format on macOS. Below is the error message. The Firedrake code and the shell script used to run it are attached.</div>
<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
<br>
</div>
<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
```</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: General MPI error</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: See <a href="https://urldefense.us/v3/__https://petsc.org/release/faq/__;!!G_uCfscf7eWS!eiv8Wo1VhQz4c2L8MbDoPcg0KZ0loiWlwjI1MR6VEtFfLWTjZNV4UssfSUT-F9tKXb2GjX8Ar-YrWmBGIAY9ujQp$" target="_blank" data-outlook-id="83c9667b-6b6a-4d04-beb1-ec50a2283c90">
https://petsc.org/release/faq/</a> for trouble shooting.</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: PETSc Release Version 3.23.4, unknown</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: test.py with 2 MPI process(es) and PETSC_ARCH arch-firedrake-default on 192.168.10.51 by zzyang Tue Jul 22 22:24:05 2025</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: Configure options: PETSC_ARCH=arch-firedrake-default --COPTFLAGS="-O3 -march=native -mtune=native" --CXXOPTFLAGS="-O3 -march=native -mtune=native" --FOPTFLAGS="-O3 -mtune=native" --with-c2html=0 --with-debugging=0 --with-fortran-bindings=0 --with-shared-libraries=1
--with-strict-petscerrorcode --download-cmake --download-bison --download-fftw --download-mumps-avoid-mpi-in-place --with-hdf5-dir=/opt/homebrew --with-hwloc-dir=/opt/homebrew --download-metis --download-mumps --download-netcdf --download-pnetcdf --download-ptscotch
--download-scalapack --download-suitesparse --download-superlu_dist --download-slepc --with-zlib --download-hpddm --download-libpng --download-ctetgen --download-tetgen --download-triangle --download-mmg --download-parmmg --download-p4est --download-eigen
--download-hypre --download-pragmatic</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #1 PetscLogNestedTreePrintLine() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:289</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #2 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:383</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #3 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #4 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #5 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #6 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #7 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #8 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #9 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #10 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #11 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #12 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #13 PetscLogNestedTreePrintTop() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:420</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #14 PetscLogHandlerView_Nested_XML() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:443</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #15 PetscLogHandlerView_Nested() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/lognested.c:405</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #16 PetscLogHandlerView() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/interface/loghandler.c:342</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #17 PetscLogView() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/plog.c:2043</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #18 PetscLogViewFromOptions() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/plog.c:2084</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
[0]PETSC ERROR: #19 PetscFinalize() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/objects/pinit.c:1552</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
PetscFinalize() failed [error code: 98]</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
--------------------------------------------------------------------------</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
prterun has exited due to process rank 0 with PID 28986 on node 192.168.10.51 exiting</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
improperly. There are three reasons this could occur:</div>
<div dir="ltr" class="gmail_quote" style="line-height: normal; margin: 0px; min-height: 15px; font-family: Menlo; font-size: 13px;">
<br>
</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
1. this process did not call "init" before exiting, but others in the</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
job did. This can cause a job to hang indefinitely while it waits for</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
all processes to call "init". By rule, if one process calls "init",</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
then ALL processes must call "init" prior to termination.</div>
<div dir="ltr" class="gmail_quote" style="line-height: normal; margin: 0px; min-height: 15px; font-family: Menlo; font-size: 13px;">
<br>
</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
2. this process called "init", but exited without calling "finalize".</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
By rule, all processes that call "init" MUST call "finalize" prior to</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
exiting or it will be considered an "abnormal termination"</div>
<div dir="ltr" class="gmail_quote" style="line-height: normal; margin: 0px; min-height: 15px; font-family: Menlo; font-size: 13px;">
<br>
</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
3. this process called "MPI_Abort" or "prte_abort" and the mca</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
parameter prte_create_session_dirs is set to false. In this case, the</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
run-time cannot detect that the abort call was an abnormal</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
termination. Hence, the only error message you will receive is this</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
one.</div>
<div dir="ltr" class="gmail_quote" style="line-height: normal; margin: 0px; min-height: 15px; font-family: Menlo; font-size: 13px;">
<br>
</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
This may have caused other processes in the application to be</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
terminated by signals sent by prterun (as reported here).</div>
<div dir="ltr" class="gmail_quote" style="line-height: normal; margin: 0px; min-height: 15px; font-family: Menlo; font-size: 13px;">
<br>
</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
You can avoid this message by specifying -quiet on the prterun command</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
line.</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
--------------------------------------------------------------------------</div>
<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
```</div>
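<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
<br>
</div>
<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
The run was invoked along these lines (a sketch; the exact script and options are in the attachments, and the output filename here is illustrative):</div>
<div class="gmail_quote" style="line-height: normal; margin: 0px; font-family: Menlo; font-size: 13px;">
mpiexec -n 2 python test.py -log_view :report_performance.xml:ascii_xml</div>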
<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
<br>
</div>
<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
Best wishes,</div>
<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
Zongze</div>
<div dir="ltr" class="gmail_quote" style="font-family: Aptos, Arial, Helvetica, sans-serif; font-size: 12pt;">
<br>
</div>
<div id="m_-3979039196510518239m_-3018690497905492689mail-editor-reference-message-container">
<div style="text-align: left; padding: 3pt 0in 0in; border-width: 1pt medium medium; border-style: solid none none; border-color: rgb(181, 196, 223) currentcolor currentcolor; font-family: Aptos; font-size: 12pt;">
<b>From: </b>petsc-users <<a href="mailto:petsc-users-bounces@mcs.anl.gov" target="_blank" data-outlook-id="d4bd0224-1537-4fde-8356-04052c9a4e21">petsc-users-bounces@mcs.anl.gov</a>> on behalf of Klaij, Christiaan via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank" data-outlook-id="4b1c64e7-9b82-47ed-b115-69f4ea5e3bb6">petsc-users@mcs.anl.gov</a>><br>
<b>Date: </b>Monday, July 14, 2025 at 15:58<br>
<b>To: </b>Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank" data-outlook-id="f8bbba79-9d18-40af-9af2-be1988c6d60c">bsmith@petsc.dev</a>><br>
<b>Cc: </b>PETSc users list <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank" data-outlook-id="038e33f1-e447-4e2c-9a1d-2d7e04a29113">petsc-users@mcs.anl.gov</a>><br>
<b>Subject: </b>Re: [petsc-users] problem with nested logging, standalone example<br>
<br>
</div>
<div style="font-size: 11pt;">@Junchao: yes, all with my ex2f.F90 variation on two or three cores<br>
<br>
@Barry: it's really puzzling that you cannot reproduce. Can you try running it a dozen times in a row? And look at the report_performance.xml file? When it hangs I see some nan's, for instance here in the VecAXPY event:<br>
<br>
<events><br>
<event><br>
<name>VecAXPY</name><br>
<time><br>
<avgvalue>0.00610203</avgvalue><br>
<minvalue>0.</minvalue><br>
<maxvalue>0.0122041</maxvalue><br>
<minloc>1</minloc><br>
<maxloc>0</maxloc><br>
</time><br>
<ncalls><br>
<avgvalue>0.5</avgvalue><br>
<minvalue>0.</minvalue><br>
<maxvalue>1.</maxvalue><br>
<minloc>1</minloc><br>
<maxloc>0</maxloc><br>
</ncalls><br>
</event><br>
<event><br>
<name>self</name><br>
<time><br>
<value>-nan.</value><br>
</time><br>
<br>
This is what I did in my latest attempt on the login node of our Rocky Linux 9 cluster:<br>
1) download petsc-3.23.4.tar.gz from the petsc website<br>
2) ./configure -prefix=~/petsc/install --with-cxx=0 --with-debugging=0 --with-mpi-dir=/cm/shared/apps/mpich/ge/gcc/64/3.4.2<br>
3) adjust my example to this version of petsc (file is attached)<br>
4) make ex2f-cklaij-dbg-v2<br>
5) mpirun -n 2 ./ex2f-cklaij-dbg-v2<br>
<br>
So the exact versions are: petsc-3.23.4, system mpich 3.4.2, system gcc 11.5.0<br>
<br>
________________________________________<br>
From: Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank" data-outlook-id="d76b5e8d-0b89-47dc-b7cf-3b360b58a429">bsmith@petsc.dev</a>><br>
Sent: Friday, July 11, 2025 11:22 PM<br>
To: Klaij, Christiaan<br>
Cc: Junchao Zhang; PETSc users list<br>
Subject: Re: [petsc-users] problem with nested logging, standalone example<br>
<br>
<br>
And yet we cannot reproduce.<br>
<br>
Please tell us the exact PETSc version and MPI implementation versions. And reattach your reproducing example. And exactly how you run it.<br>
<br>
<br>
Can you reproduce it on an "ordinary" machine, say a Mac or Linux laptop.<br>
<br>
Barry<br>
<br>
If I could reproduce the problem, here is how I would debug: I would use -start_in_debugger and then put breakpoints in places that seem problematic. Presumably I would end up with a hang with each MPI process in a "different place", and from that I may be able to determine how that happened.<br>
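<br>
A rough sketch of that workflow (the breakpoint choice is illustrative, taken from the traces below):<br>
<br>
mpirun -n 2 ./ex2f-cklaij-dbg -start_in_debugger<br>
# in each debugger window that opens:<br>
(gdb) break PetscLogNestedTreePrintLine<br>
(gdb) continue<br>
# when the run hangs, interrupt with Ctrl-C and compare backtraces across ranks:<br>
(gdb) bt<br>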
<br>
<br>
<br>
> On Jul 11, 2025, at 7:58 AM, Klaij, Christiaan <<a href="mailto:C.Klaij@marin.nl" target="_blank" data-outlook-id="927141ab-3160-4450-a40d-11f9e6c8480a">C.Klaij@marin.nl</a>> wrote:<br>
><br>
> In summary for future reference:<br>
> - tested 3 different machines, two at Marin, one at the national HPC<br>
> - tested 3 different mpi implementation (intelmpi, openmpi and mpich)<br>
> - tested openmpi in both release and debug<br>
> - tested 2 different compilers (intel and gnu), both older and very recent versions<br>
> - tested with the most basic config (./configure --with-cxx=0 --with-debugging=0 --download-mpich)<br>
><br>
> All of these test either segfault, or hang or error-out at the call to PetscLogView.<br>
><br>
> Chris<br>
><br>
> ________________________________________<br>
> From: Klaij, Christiaan <<a href="mailto:C.Klaij@marin.nl" target="_blank" data-outlook-id="dacd60d6-a603-4f43-aa96-b29ecf5140f7">C.Klaij@marin.nl</a>><br>
> Sent: Friday, July 11, 2025 10:10 AM<br>
> To: Barry Smith; Junchao Zhang<br>
> Cc: PETSc users list<br>
> Subject: Re: [petsc-users] problem with nested logging, standalone example<br>
><br>
> @Matt: no MPI errors indeed. I've tried with MPICH and I get the same hanging.<br>
> @Barry: both stack traces aren't exactly the same, see a sample with MPICH below.<br>
><br>
> If it cannot be reproduced at your side, I'm afraid this is another dead end. Thanks anyway, I really appreciate all your help.<br>
><br>
> Chris<br>
><br>
> (gdb) bt<br>
> #0 0x000015555033bc2e in MPIDI_POSIX_mpi_release_gather_gather.constprop.0 ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #1 0x000015555033db8a in MPIDI_POSIX_mpi_allreduce_release_gather ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #2 0x000015555033e70f in MPIR_Allreduce ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #3 0x000015555033f22e in PMPI_Allreduce ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #4 0x0000155553f85d69 in MPIU_Allreduce_Count (comm=-2080374782,<br>
> op=1476395020, dtype=1275072547, count=1, outbuf=0x7fffffffac70,<br>
> inbuf=0x7fffffffac60)<br>
> at /home/cklaij/petsc/petsc-3.23.4/src/sys/objects/pinit.c:1839<br>
> #5 MPIU_Allreduce_Private (inbuf=inbuf@entry=0x7fffffffac60,<br>
> outbuf=outbuf@entry=0x7fffffffac70, count=count@entry=1,<br>
> dtype=dtype@entry=1275072547, op=op@entry=1476395020, comm=-2080374782)<br>
> at /home/cklaij/petsc/petsc-3.23.4/src/sys/objects/pinit.c:1869<br>
> #6 0x0000155553f33dbe in PetscPrintXMLNestedLinePerfResults (<br>
> viewer=viewer@entry=0x458890, name=name@entry=0x155554ef6a0d 'mbps\000',<br>
> value=<optimized out>, minthreshold=minthreshold@entry=0,<br>
> maxthreshold=maxthreshold@entry=0.01,<br>
> minmaxtreshold=minmaxtreshold@entry=1.05)<br>
> at /home/cklaij/petsc/petsc-3.23.4/src/sys/logging/handler/impls/nested/xmlviewer.c:255<br>
><br>
><br>
> (gdb) bt<br>
> #0 0x000015554fed3b17 in clock_gettime@GLIBC_2.2.5 () from /lib64/libc.so.6<br>
> #1 0x0000155550b0de71 in ofi_gettime_ns ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #2 0x0000155550b0dec9 in ofi_gettime_ms ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #3 0x0000155550b2fab5 in sock_cq_sreadfrom ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #4 0x00001555505ca6f7 in MPIDI_OFI_progress ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #5 0x0000155550591fe9 in progress_test ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #6 0x00001555505924a3 in MPID_Progress_wait ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #7 0x000015555043463e in MPIR_Wait_state ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #8 0x000015555052ec49 in MPIC_Wait ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #9 0x000015555053093e in MPIC_Sendrecv ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #10 0x00001555504bf674 in MPIR_Allreduce_intra_recursive_doubling ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
> #11 0x00001555505b61de in MPIDI_OFI_mpi_finalize_hook ()<br>
> from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12<br>
><br>
> ________________________________________<br>
> From: Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank" data-outlook-id="47eb61b9-82b3-4838-817d-8ec81ee16ac8">bsmith@petsc.dev</a>><br>
> Sent: Thursday, July 10, 2025 11:10 PM<br>
> To: Junchao Zhang<br>
> Cc: Klaij, Christiaan; PETSc users list<br>
> Subject: Re: [petsc-users] problem with nested logging, standalone example<br>
><br>
><br>
> I cannot reproduce<br>
><br>
> On Jul 10, 2025, at 3:46 PM, Junchao Zhang <<a href="mailto:junchao.zhang@gmail.com" target="_blank" data-outlook-id="a8a97985-bc64-4c68-8ea0-c13db590a7bc">junchao.zhang@gmail.com</a>> wrote:<br>
><br>
> Adding -mca coll_hcoll_enable 0 didn't change anything at my end. Strange.<br>
><br>
> --Junchao Zhang<br>
><br>
><br>
> On Thu, Jul 10, 2025 at 3:39 AM Klaij, Christiaan <<a href="mailto:C.Klaij@marin.nl" target="_blank">C.Klaij@marin.nl</a>> wrote:<br>
> An additional clue perhaps: with the option OMPI_MCA_coll_hcoll_enable=0, the code does not hang but gives the error below.<br>
><br>
> Chris<br>
><br>
><br>
> $ mpirun -mca coll_hcoll_enable 0 -n 2 ./ex2f-cklaij-dbg -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always<br>
> 0 KSP Residual norm 1.11803<br>
> 1 KSP Residual norm 0.591608<br>
> 2 KSP Residual norm 0.316228<br>
> 3 KSP Residual norm < 1.e-11<br>
> 0 KSP Residual norm 0.707107<br>
> 1 KSP Residual norm 0.408248<br>
> 2 KSP Residual norm < 1.e-11<br>
> Norm of error < 1.e-12 iterations 3<br>
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
> [1]PETSC ERROR: General MPI error<br>
> [1]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer<br>
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.<br>
> [1]PETSC ERROR: Petsc Release Version 3.22.4, Mar 01, 2025<br>
> [1]PETSC ERROR: ./ex2f-cklaij-dbg with 2 MPI process(es) and PETSC_ARCH on login1 by cklaij Thu Jul 10 10:33:33 2025<br>
> [1]PETSC ERROR: Configure options: --prefix=/home/cklaij/ReFRESCO/trunk/install/extLibs --with-mpi-dir=/cm/shared/apps/openmpi/gcc/5.0.6-debug --with-x=0 --with-mpe=0 --with-debugging=0 --download-superlu_dist=https://updates.marin.nl/refresco/libs/superlu_dist-8.1.2.tar.gz --with-blaslapack-dir=/cm/shared/apps/oneapi/2024.2.1/mkl/2024.2 --download-parmetis=https://updates.marin.nl/refresco/libs/parmetis-4.0.3-p9.tar.gz --download-metis=https://updates.marin.nl/refresco/libs/metis-5.1.0-p11.tar.gz --with-packages-build-dir=/home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"<br>
> [1]PETSC ERROR: #1 PetscLogNestedTreePrintLine() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:289<br>
> [1]PETSC ERROR: #2 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:377<br>
> [1]PETSC ERROR: #3 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:384<br>
> [1]PETSC ERROR: #4 PetscLogNestedTreePrintTop() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:420<br>
> [1]PETSC ERROR: #5 PetscLogHandlerView_Nested_XML() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:443<br>
> [1]PETSC ERROR: #6 PetscLogHandlerView_Nested() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/lognested.c:405<br>
> [1]PETSC ERROR: #7 PetscLogHandlerView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/interface/loghandler.c:342<br>
> [1]PETSC ERROR: #8 PetscLogView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/plog.c:2040<br>
> [1]PETSC ERROR: #9 ex2f-cklaij-dbg.F90:301<br>
> --------------------------------------------------------------------------<br>
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF<br>
> Proc: [[55228,1],1]<br>
> Errorcode: 98<br>
><br>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.<br>
> You may or may not see output from other processes, depending on<br>
> exactly when Open MPI kills them.<br>
> --------------------------------------------------------------------------<br>
> --------------------------------------------------------------------------<br>
> prterun has exited due to process rank 1 with PID 0 on node login1 calling<br>
> "abort". This may have caused other processes in the application to be<br>
> terminated by signals sent by prterun (as reported here).<br>
> --------------------------------------------------------------------------<br>
><br>
> ________________________________________<br>
> dr. ir. Christiaan Klaij | senior researcher<br>
> Research & Development | CFD Development<br>
> T +31 317 49 33 44 | https://www.marin.nl<br>
><br>
><br>
> From: Klaij, Christiaan <<a href="mailto:C.Klaij@marin.nl" target="_blank" data-outlook-id="c42104b9-a66e-443a-8429-01db8e61ee5c">C.Klaij@marin.nl</a><mailto:<a href="mailto:C.Klaij@marin.nl" target="_blank" data-outlook-id="05fa2b5c-3e16-4b6f-8405-1bd23f40f571">C.Klaij@marin.nl</a>>><br>
> Sent: Thursday, July 10, 2025 10:15 AM<br>
> To: Junchao Zhang<br>
> Cc: PETSc users list<br>
> Subject: Re: [petsc-users] problem with nested logging, standalone example<br>
><br>
> Hi Junchao,<br>
><br>
> Thanks for testing. I've fixed the error but unfortunately that doesn't change the behavior, the code still hangs as before, with the same stack trace...<br>
><br>
> Chris<br>
><br>
> ________________________________________<br>
> From: Junchao Zhang <<a href="mailto:junchao.zhang@gmail.com" target="_blank" data-outlook-id="9d710425-31b8-4c53-b09b-e905de0e726f">junchao.zhang@gmail.com</a><mailto:<a href="mailto:junchao.zhang@gmail.com" target="_blank" data-outlook-id="619acac9-cbf1-47e9-a6b8-8cabd0bd322e">junchao.zhang@gmail.com</a>>><br>
> Sent: Tuesday, July 8, 2025 10:58 PM<br>
> To: Klaij, Christiaan<br>
> Cc: PETSc users list<br>
> Subject: Re: [petsc-users] problem with nested logging, standalone example<br>
><br>
> Hi, Chris,<br>
> First, I had to fix an error in your test by adding "PetscCallA(MatSetFromOptions(AA,ierr))" at line 254.<br>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
> [0]PETSC ERROR: Object is in wrong state<br>
> [0]PETSC ERROR: Mat object's type is not set: Argument # 1<br>
> ...<br>
> [0]PETSC ERROR: #1 MatSetValues() at /scratch/jczhang/petsc/src/mat/interface/matrix.c:1503<br>
> [0]PETSC ERROR: #2 ex2f.F90:258<br>
><br>
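> For reference, the usual creation sequence in Fortran is (a minimal sketch; AA, m, n follow the example, the rest is the standard PETSc pattern):<br>
><br>
>   PetscCallA(MatCreate(PETSC_COMM_WORLD,AA,ierr))<br>
>   PetscCallA(MatSetSizes(AA,PETSC_DECIDE,PETSC_DECIDE,m,n,ierr))<br>
>   PetscCallA(MatSetFromOptions(AA,ierr))  ! sets the Mat type, must come before MatSetValues<br>
>   PetscCallA(MatSetUp(AA,ierr))<br>
><br>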
> Then I could run the test without problems:<br>
> mpirun -n 2 ./ex2f -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always<br>
> 0 KSP Residual norm 1.11803<br>
> 1 KSP Residual norm 0.591608<br>
> 2 KSP Residual norm 0.316228<br>
> 3 KSP Residual norm < 1.e-11<br>
> 0 KSP Residual norm 0.707107<br>
> 1 KSP Residual norm 0.408248<br>
> 2 KSP Residual norm < 1.e-11<br>
> Norm of error < 1.e-12 iterations 3<br>
><br>
> I used petsc-3.22.4, gcc-11.3, openmpi-5.0.6 and configured with<br>
> ./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-openmpi --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11
-Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall
-funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"<br>
><br>
> Could you fix the error and retry?<br>
><br>
> --Junchao Zhang<br>
><br>
><br>
> On Sun, Jul 6, 2025 at 12:57 PM Klaij, Christiaan via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:<br>
> Attached is a standalone example of the issue described in the<br>
> earlier thread "problem with nested logging". The issue appeared<br>
> somewhere between petsc 3.19.4 and 3.23.4.<br>
><br>
> The example is a variation of ../ksp/tutorials/ex2f.F90, where<br>
> I've added the nested log viewer with one event as well as the<br>
> solution of a small system on rank zero.<br>
><br>
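> In outline, those additions look like this (a sketch; event and file names are illustrative, and the calls follow the standard PETSc nested-logging API):<br>
><br>
>   PetscLogEvent event<br>
>   PetscViewer viewer<br>
>   PetscCallA(PetscLogNestedBegin(ierr))<br>
>   PetscCallA(PetscLogEventRegister('MyEvent',0,event,ierr))<br>
>   PetscCallA(PetscLogEventBegin(event,ierr))<br>
>   ! ... solve the small system on rank zero ...<br>
>   PetscCallA(PetscLogEventEnd(event,ierr))<br>
>   PetscCallA(PetscViewerASCIIOpen(PETSC_COMM_WORLD,'report_performance.xml',viewer,ierr))<br>
>   PetscCallA(PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_XML,ierr))<br>
>   PetscCallA(PetscLogView(viewer,ierr))<br>
><br>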
> When running on multiple procs the example hangs during<br>
> PetscLogView with the backtrace below. The configure.log is also<br>
> attached in the hope that you can replicate the issue.<br>
><br>
> Chris<br>
><br>
><br>
> #0 0x000015554c84ea9e in mca_pml_ucx_recv (buf=0x7fffffff9e30, count=1,<br>
> datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, src=1, tag=-12,<br>
> comm=0x7f1e30, mpi_status=0x0) at pml_ucx.c:700<br>
> #1 0x000015554c65baff in ompi_coll_base_allreduce_intra_recursivedoubling (<br>
> sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,<br>
> dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,<br>
> op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630)<br>
> at base/coll_base_allreduce.c:247<br>
> #2 0x000015554c6a7e40 in ompi_coll_tuned_allreduce_intra_do_this (<br>
> sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,<br>
> dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,<br>
> op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630,<br>
> algorithm=3, faninout=0, segsize=0) at coll_tuned_allreduce_decision.c:142<br>
> #3 0x000015554c6a054f in ompi_coll_tuned_allreduce_intra_dec_fixed (<br>
> sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,<br>
> dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,<br>
> op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630)<br>
> at coll_tuned_decision_fixed.c:216<br>
> #4 0x000015554c68e160 in mca_coll_hcoll_allreduce (sbuf=0x7fffffff9e20,<br>
> rbuf=0x7fffffff9e30, count=1, dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,<br>
> op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaecb80)<br>
> at coll_hcoll_ops.c:217<br>
> #5 0x000015554c59811a in PMPI_Allreduce (sendbuf=0x7fffffff9e20,<br>
> recvbuf=0x7fffffff9e30, count=1, datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30) at allreduce.c:123<br>
> #6 0x0000155553eabede in MPIU_Allreduce_Private () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #7 0x0000155553e50d08 in PetscPrintXMLNestedLinePerfResults () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #8 0x0000155553e5123e in PetscLogNestedTreePrintLine () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #9 0x0000155553e51f3a in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #10 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #11 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #12 0x0000155553e52142 in PetscLogNestedTreePrintTop () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #13 0x0000155553e5257b in PetscLogHandlerView_Nested_XML () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #14 0x0000155553e4e5a0 in PetscLogHandlerView_Nested () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #15 0x0000155553e56232 in PetscLogHandlerView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #16 0x0000155553e588c3 in PetscLogView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #17 0x0000155553e40eb5 in petsclogview_ () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
> #18 0x0000000000402c8b in MAIN__ ()<br>
> #19 0x00000000004023df in main ()<br>
> dr. ir. Christiaan Klaij | senior researcher<br>
> Research & Development | CFD Development<br>
> T +31 317 49 33 44 | https://www.marin.nl<br>
><br>
><br>
<br>
</div>
</div>
</blockquote>
</blockquote>
<div dir="ltr" class="gmail_quote"><br>
</div>
</blockquote>
</div>
</body>
</html>