[petsc-users] problem with nested logging

Klaij, Christiaan C.Klaij at marin.nl
Fri Jul 4 05:55:29 CDT 2025


(Yes, Valgrind was attempted but did not show any obvious
problems.)

However, we are making some progress. The random segmentation
faults are observed with Intel MPI 2012.13 and Open MPI
4.1.5. With Open MPI 5.0.6, however, the randomness is gone and we
get systematic hanging of the code during PetscLogView. In fact,
we can now make the code hang by adding a single collective
PetscLogEvent. The next step is to replicate this behaviour in a
standalone example for you to investigate.

Chris

_____
dr. ir. Christiaan Klaij | senior researcher
Research & Development | CFD Development
T +31 317 49 33 44 | http://www.marin.nl
___________________________________
From: Barry Smith <bsmith at petsc.dev>
Sent: Tuesday, July 1, 2025 4:54 PM
To: Klaij, Christiaan
Cc: PETSc users list
Subject: Re: [petsc-users] problem with nested logging


   It's probably already been done, but if not, you should run under Valgrind. "Random" crashes are usually an indication of memory corruption.



> On Jul 1, 2025, at 4:16 AM, Klaij, Christiaan via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
> It's been a while, in the meantime we did upgrade the OS and the
> compilers but the problem still persists.
>
> Can it be that one must call PetscLogEventRegister and
> PetscLogEventBegin/End on all procs? Currently we don't do so;
> think of some small system being built and solved. Such an event is
> registered on all procs, but may take place on a single proc or a
> subset, with the begin/end only on that proc or subset.
>
> Chris
>
> ________________________________________
> From: Klaij, Christiaan <C.Klaij at marin.nl>
> Sent: Thursday, May 1, 2025 3:06 PM
> To: Stefano Zampini
> Cc: Randall Mackie; PETSc users list; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
> If I deactivate all the calls to PetscLogEventBegin/End in the
> simulation code, the error does not show up.
>
> But since there are more than 2500 events, it's impossible to go
> through them one by one, especially since the error shows up at
> random and requires a number of cases and repetitions.
>
> Unfortunately, I'm running out of time and budget. I will retry
> once we get Rocky Linux 9 and the latest Intel compilers.
>
> Chris
>
> ________________________________________
> From: Stefano Zampini <stefano.zampini at gmail.com>
> Sent: Thursday, May 1, 2025 10:57 AM
> To: Klaij, Christiaan
> Cc: Randall Mackie; PETSc users list; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
>
>
> On Thu, May 1, 2025 at 11:38 Klaij, Christiaan <C.Klaij at marin.nl> wrote:
> The checks seem to be in place: I do get a PETSC ERROR when I add a log event on rank 0 as you suggested.
>
>
> Ok, the broken logic may be in LogView then. You can try deactivating some logging by class and see how the error evolves, perhaps using PetscLogClassSetActiveAll. Or, if feasible, comment out some parts of the simulation code.
>
> Another thought: in between the log events pairs, I also have calls to PetscLogFlops, perhaps that plays a role somehow.
>
> It shouldn't
>
> Chris
>
> ________________________________________
> From: Klaij, Christiaan <C.Klaij at marin.nl>
> Sent: Thursday, May 1, 2025 10:23 AM
> To: Stefano Zampini
> Cc: Randall Mackie; PETSc users list; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
> Was the rewriting by Toby done somewhere between petsc 3.19.4 (no problem) and 3.23.4 (problem)?
>
> Chris
>
> ________________________________________
> From: Stefano Zampini <stefano.zampini at gmail.com>
> Sent: Thursday, May 1, 2025 9:12 AM
> To: Klaij, Christiaan
> Cc: Randall Mackie; PETSc users list; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
> If I look at the code for PetscLogHandlerEventBegin_Default, there are checks in place to see if the event is called collectively (see below).
> Can you make sure the nested logging system has the same checks?
> It should, but double-check, since the code was largely rewritten by Toby some time ago. Checking should be as easy as writing a code that calls a collective event on a single process; a debug version of petsc should then complain:
>
> if (rank == 0)
>   LogEventBegin() <- this should call MPIU_Allreduce, but the other processes will not, thus the hang
>
>
> If it does not complain, then the error must come from how the logic in LogView works, and from how it traverses the various events (my guess: they are processed in a different order on different processes). Without a reproducer, it is hard to understand what's going on.
>
> static PetscErrorCode PetscLogHandlerEventBegin_Default(PetscLogHandler h, PetscLogEvent event, PetscObject o1, PetscObject o2, PetscObject o3, PetscObject o4)
> {
>  PetscLogHandler_Default def             = (PetscLogHandler_Default)h->data;
>  PetscEventPerfInfo     *event_perf_info = NULL;
>  PetscLogEventInfo       event_info;
>  PetscLogDouble          time;
>  PetscLogState           state;
>  PetscLogStage           stage;
>
>  PetscFunctionBegin;
>  PetscCall(PetscLogHandlerGetState(h, &state));
>  if (PetscDefined(USE_DEBUG)) {
>    PetscCall(PetscLogStateEventGetInfo(state, event, &event_info));
>    if (PetscUnlikely(o1)) PetscValidHeader(o1, 3);
>    if (PetscUnlikely(o2)) PetscValidHeader(o2, 4);
>    if (PetscUnlikely(o3)) PetscValidHeader(o3, 5);
>    if (PetscUnlikely(o4)) PetscValidHeader(o4, 6);
>    if (event_info.collective && o1) {
>      PetscInt64 b1[2], b2[2];
>
>      b1[0] = -o1->cidx;
>      b1[1] = o1->cidx;
>      PetscCallMPI(MPIU_Allreduce(b1, b2, 2, MPIU_INT64, MPI_MAX, PetscObjectComm(o1)));
>      PetscCheck(-b2[0] == b2[1], PETSC_COMM_SELF, PETSC_ERR_PLIB, "Collective event %s not called collectively %" PetscInt64_FMT " != %" PetscInt64_FMT, event_info.name, -b2[0], b2[1]);
>    }
>  }
>  /* Synchronization */
>  PetscCall(PetscLogHandlerEventSync_Default(h, event, PetscObjectComm(o1)));
>
>
>
>
> On Thu, May 1, 2025 at 09:56 Klaij, Christiaan <C.Klaij at marin.nl> wrote:
> I've tried with -log_sync, no complaints whatsoever, but the error is still there...
>
> Chris
>
> ________________________________________
> From: Stefano Zampini <stefano.zampini at gmail.com>
> Sent: Tuesday, April 29, 2025 6:12 PM
> To: Randall Mackie
> Cc: Klaij, Christiaan; PETSc users list; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
> Can you try using -log_sync? This should check every entry/exit point of logged events and complain if something is not called collectively.
>
> Stefano
>
> On Tue, Apr 29, 2025, 18:21 Randall Mackie <rlmackie862 at gmail.com> wrote:
> ah okay, I missed that this was found using openmpi.
>
> then it’s probably not the same issue we had.
>
> I can’t remember in which version it was fixed (I’m away from my work computer)… I do know that in our case openmpi and the latest Intel oneAPI work fine.
>
> Randy
>
> On Apr 29, 2025, at 8:58 AM, Klaij, Christiaan <C.Klaij at marin.nl> wrote:
>
> Well, the error below only shows up thanks to openmpi and the gnu compilers.
> With the intel mpi and compilers it just hangs (tried oneapi 2023.1.0). In which version was that bug fixed?
>
> Chris
>
> ________________________________________
>
> From: Randall Mackie <rlmackie862 at gmail.com>
> Sent: Tuesday, April 29, 2025 3:33 PM
> To: Klaij, Christiaan
> Cc: Matthew Knepley; petsc-users at mcs.anl.gov; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
> We had a similar issue last year that we eventually tracked down to a bug in Intel MPI AllReduce, which was around the same version you are using.
>
> Can you try a different MPI or the latest Intel One API and see if your error clears?
>
> Randy
>
> On Tue, Apr 29, 2025 at 8:17 AM Klaij, Christiaan via petsc-users <petsc-users at mcs.anl.gov> wrote:
> I don't think so, we have tracing in place to detect mismatches. But as soon as I switch the tracing on, the error disappears... Same if I add a counter or print statements before and after EventBegin/End. Looks like a memory corruption problem, maybe nothing to do with petsc despite the error message.
>
> Chris
>
> ________________________________________
> From: Matthew Knepley <knepley at gmail.com>
> Sent: Tuesday, April 29, 2025 1:50 PM
> To: Klaij, Christiaan
> Cc: Junchao Zhang; petsc-users at mcs.anl.gov; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
> On Tue, Apr 29, 2025 at 6:50 AM Klaij, Christiaan <C.Klaij at marin.nl> wrote:
> Here's a slightly better error message, obtained with --with-debugging=1
>
> Is it possible that you have a mismatched EventBegin()/EventEnd() in your code? That could be why we cannot reproduce it here.
>
> Thanks,
>
> Matt
>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Petsc has generated inconsistent data
> [0]PETSC ERROR: MPIU_Allreduce() called in different locations (code lines) on different processors
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.22.4, Mar 01, 2025
> [0]PETSC ERROR: ./refresco with 2 MPI process(es) and PETSC_ARCH on marclus3login2 by cklaij Tue Apr 29 12:43:54 2025
> [0]PETSC ERROR: Configure options: --prefix=/home/cklaij/ReFRESCO/trunk/install/extLibs --with-mpi-dir=/cm/shared/apps/openmpi/gcc/4.0.2 --with-x=0 --with-mpe=0 --with-debugging=1 --download-superlu_dist=https://updates.marin.nl/refresco/libs/superlu_dist-8.1.2.tar.gz --with-blaslapack-dir=/cm/shared/apps/intel/oneapi/mkl/2021.4.0 --download-parmetis=https://updates.marin.nl/refresco/libs/parmetis-4.0.3-p9.tar.gz --download-metis=https://updates.marin.nl/refresco/libs/metis-5.1.0-p11.tar.gz --with-packages-build-dir=/home/cklaij/ReFRESCO/trunk/build-libs/superbuild --with-ssl=0 --with-shared-libraries=1
> [0]PETSC ERROR: #1 PetscLogNestedTreePrintLine() at /home/cklaij/ReFRESCO/trunk/build-libs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:289
> [0]PETSC ERROR: #2 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-libs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:379
> [0]PETSC ERROR: #3 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-libs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:384
> [0]PETSC ERROR: #4 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-libs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:384
> [0]PETSC ERROR: #5 PetscLogNestedTreePrintTop() at /home/cklaij/ReFRESCO/trunk/build-libs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:420
> [0]PETSC ERROR: #6 PetscLogHandlerView_Nested_XML() at /home/cklaij/ReFRESCO/trunk/build-libs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:443
> [0]PETSC ERROR: #7 PetscLogHandlerView_Nested() at /home/cklaij/ReFRESCO/trunk/build-libs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/lognested.c:405
> [0]PETSC ERROR: #8 PetscLogHandlerView() at /home/cklaij/ReFRESCO/trunk/build-libs/superbuild/petsc/src/src/sys/logging/handler/interface/loghandler.c:342
> [0]PETSC ERROR: #9 PetscLogView() at /home/cklaij/ReFRESCO/trunk/build-libs/superbuild/petsc/src/src/sys/logging/plog.c:2040
> [0]PETSC ERROR: #10 /home/cklaij/ReFRESCO/trunk/Code/src/petsc_include_impl.F90:130
>
> ________________________________________
>
> From: Klaij, Christiaan <C.Klaij at marin.nl>
> Sent: Monday, April 28, 2025 3:53 PM
> To: Matthew Knepley
> Cc: Junchao Zhang; petsc-users at mcs.anl.gov; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
> Bisecting would be quite hard: it's not just the petsc version that changed, but also other libs, compilers, even OS components.
>
> Chris
>
> ________________________________________
> From: Matthew Knepley <knepley at gmail.com>
> Sent: Monday, April 28, 2025 3:06 PM
> To: Klaij, Christiaan
> Cc: Junchao Zhang; petsc-users at mcs.anl.gov; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
> On Mon, Apr 28, 2025 at 8:45 AM Klaij, Christiaan via petsc-users <petsc-users at mcs.anl.gov> wrote:
> I've tried adding a nested log viewer to src/snes/tutorials/ex70.c,
> but it does not replicate the problem and works fine.
>
> Perhaps it is related to Fortran, since the manual page of
> PetscLogNestedBegin says "No Fortran support" (why? we've been
> using it from Fortran ever since).
>
> Therefore I've tried adding it to src/snes/ex5f90.F90 and that
> also works fine. It seems I cannot replicate the problem in a
> small example, unfortunately.
>
> We cannot replicate it here. Is there a chance you could bisect to see what change is responsible?
>
> Thanks,
>
> Matt
>
> Chris
>
> ________________________________________
> From: Junchao Zhang <junchao.zhang at gmail.com>
> Sent: Saturday, April 26, 2025 3:51 PM
> To: Klaij, Christiaan
> Cc: petsc-users at mcs.anl.gov; Isaac, Toby
> Subject: Re: [petsc-users] problem with nested logging
>
> Toby (Cc'ed) might know it. Or could you provide an example?
>
> --Junchao Zhang
>
>
> On Fri, Apr 25, 2025 at 3:31 AM Klaij, Christiaan via petsc-users <petsc-users at mcs.anl.gov> wrote:
> We recently upgraded from 3.19.4 to 3.22.4 but face the problem below with the nested logging. Any ideas?
>
> Chris
>
>
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: General MPI error
> [1]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.22.4, Mar 01, 2025
> [1]PETSC ERROR: refresco with 2 MPI process(es) and PETSC_ARCH on marclus3login2 by jwindt Fri Apr 25 08:52:30 2025
> [1]PETSC ERROR: Configure options: --prefix=/home/jwindt/cmake_builds/refresco/install-libs-gnu --with-mpi-dir=/cm/shared/apps/openmpi/gcc/4.0.2 --with-x=0 --with-mpe=0 --with-debugging=0 --download-superlu_dist=https://updates.marin.nl/refresco/libs/superlu_dist-8.1.2.tar.gz --with-blaslapack-dir=/cm/shared/apps/intel/oneapi/mkl/2021.4.0 --download-parmetis=https://updates.marin.nl/refresco/libs/parmetis-4.0.3-p9.tar.gz --download-metis=https://updates.marin.nl/refresco/libs/metis-5.1.0-p11.tar.gz --with-packages-build-dir=/home/jwindt/cmake_builds/refresco/build-libs-gnu/superbuild --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG" COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG" FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"
> [1]PETSC ERROR: #1 PetscLogNestedTreePrint() at /home/jwindt/cmake_builds/refresco/build-libs-gnu/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:330
> [1]PETSC ERROR: #2 PetscLogNestedTreePrint() at /home/jwindt/cmake_builds/refresco/build-libs-gnu/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:384
> [1]PETSC ERROR: #3 PetscLogNestedTreePrint() at /home/jwindt/cmake_builds/refresco/build-libs-gnu/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:384
> [1]PETSC ERROR: #4 PetscLogNestedTreePrintTop() at /home/jwindt/cmake_builds/refresco/build-libs-gnu/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:420
> [1]PETSC ERROR: #5 PetscLogHandlerView_Nested_XML() at /home/jwindt/cmake_builds/refresco/build-libs-gnu/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:443
> [1]PETSC ERROR: #6 PetscLogHandlerView_Nested() at /home/jwindt/cmake_builds/refresco/build-libs-gnu/superbuild/petsc/src/src/sys/logging/handler/impls/nested/lognested.c:405
> [1]PETSC ERROR: #7 PetscLogHandlerView() at /home/jwindt/cmake_builds/refresco/build-libs-gnu/superbuild/petsc/src/src/sys/logging/handler/interface/loghandler.c:342
> [1]PETSC ERROR: #8 PetscLogView() at /home/jwindt/cmake_builds/refresco/build-libs-gnu/superbuild/petsc/src/src/sys/logging/plog.c:2040
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
> with errorcode 98.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> dr. ir. Christiaan Klaij
> | senior researcher | Research & Development | CFD Development
> T +31 317 49 33 44 | C.Klaij at marin.nl | http://www.marin.nl
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
>
>
>
>
> --
> Stefano
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20250704/f72c0189/attachment-0001.html>

