<div dir="ltr"><div dir="ltr">On Thu, Jul 10, 2025 at 8:46 AM Klaij, Christiaan <<a href="mailto:C.Klaij@marin.nl">C.Klaij@marin.nl</a>> wrote:</div><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hi Matt,<br>
<br>
Attached is the output of valgrind:<br>
<br>
$ mpirun -mca coll_hcoll_enable 0 -n 2 valgrind --track-origins=yes ./ex2f-cklaij-dbg -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always > out 2>&1<br></blockquote><div><br></div><div>Hmm, so no MPI error when running with valgrind? It looks like Junchao and I cannot reproduce here. It is puzzling. Would you be able to try MPICH instead?</div>
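<div><br></div><div>For example (a sketch; keep whatever other options you normally use, just drop --with-mpi-dir and let PETSc download and build its own MPICH):</div><div><br></div><div>./configure --download-mpich --with-ssl=0 --with-shared-libraries=1 [your other options]<br></div><div><br></div><div>  Thanks,</div><div><br></div><div>     Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">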
Chris<br>
<br>
<br>
________________________________________<br>
From: Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>><br>
Sent: Thursday, July 10, 2025 1:37 PM<br>
To: Klaij, Christiaan<br>
Cc: Junchao Zhang; PETSc users list<br>
Subject: Re: [petsc-users] problem with nested logging, standalone example<br>
<br>
On Thu, Jul 10, 2025 at 4:39 AM Klaij, Christiaan via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:<br>
An additional clue perhaps: with the option OMPI_MCA_coll_hcoll_enable=0, the code does not hang but gives the error below.<br>
<br>
The error on its face should be impossible. On line 289, we pass pointers to two variables on the stack. This would seem to indicate more general memory corruption.<br>
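<br>
For reference, the call at that line reduces to this pattern (a sketch, not the literal PETSc source; the stack trace further down shows the same MPI_2DOUBLE_PRECISION/MPI_MAXLOC combination):<br>
<br>
! Fortran sketch of a maxloc reduction on two stack variables<br>
double precision :: inbuf(2), outbuf(2)<br>
integer :: rank, ierr<br>
call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)<br>
inbuf(1) = 1.0d0        ! value to reduce (placeholder)<br>
inbuf(2) = dble(rank)   ! rank that owns it<br>
call MPI_Allreduce(inbuf, outbuf, 1, MPI_2DOUBLE_PRECISION, MPI_MAXLOC, MPI_COMM_WORLD, ierr)<br>
<br>
A valid stack address should never produce MPI_ERR_BUFFER here, which is why this looks like corruption elsewhere.<br>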
<br>
I know we asked before, but have you run under Address Sanitizer or Valgrind?<br>
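<br>
For Address Sanitizer, a sketch with GCC (exact flags may vary by compiler) would be to reconfigure with<br>
<br>
./configure --with-debugging=1 CFLAGS="-g -fsanitize=address" FCFLAGS="-g -fsanitize=address" LDFLAGS="-fsanitize=address" [your other options]<br>
<br>
and rerun the failing case.<br>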
<br>
  Thanks,<br>
<br>
     Matt<br>
<br>
Chris<br>
<br>
<br>
$ mpirun -mca coll_hcoll_enable 0 -n 2 ./ex2f-cklaij-dbg -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always<br>
0 KSP Residual norm 1.11803<br>
1 KSP Residual norm 0.591608<br>
2 KSP Residual norm 0.316228<br>
3 KSP Residual norm < 1.e-11<br>
0 KSP Residual norm 0.707107<br>
1 KSP Residual norm 0.408248<br>
2 KSP Residual norm < 1.e-11<br>
Norm of error < 1.e-12 iterations 3<br>
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
[1]PETSC ERROR: General MPI error<br>
[1]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer<br>
[1]PETSC ERROR: See <a href="https://petsc.org/release/faq/" rel="noreferrer" target="_blank">https://petsc.org/release/faq/</a> for trouble shooting.<br>
[1]PETSC ERROR: Petsc Release Version 3.22.4, Mar 01, 2025<br>
[1]PETSC ERROR: ./ex2f-cklaij-dbg with 2 MPI process(es) and PETSC_ARCH on login1 by cklaij Thu Jul 10 10:33:33 2025<br>
[1]PETSC ERROR: Configure options: --prefix=/home/cklaij/ReFRESCO/trunk/install/extLibs --with-mpi-dir=/cm/shared/apps/openmpi/gcc/5.0.6-debug --with-x=0 --with-mpe=0 --with-debugging=0 --download-superlu_dist=<a href="https://updates.marin.nl/refresco/libs/superlu_dist-8.1.2.tar.gz" rel="noreferrer" target="_blank">https://updates.marin.nl/refresco/libs/superlu_dist-8.1.2.tar.gz</a> --with-blaslapack-dir=/cm/shared/apps/oneapi/2024.2.1/mkl/2024.2 --download-parmetis=<a href="https://updates.marin.nl/refresco/libs/parmetis-4.0.3-p9.tar.gz" rel="noreferrer" target="_blank">https://updates.marin.nl/refresco/libs/parmetis-4.0.3-p9.tar.gz</a> --download-metis=<a href="https://updates.marin.nl/refresco/libs/metis-5.1.0-p11.tar.gz" rel="noreferrer" target="_blank">https://updates.marin.nl/refresco/libs/metis-5.1.0-p11.tar.gz</a> --with-packages-build-dir=/home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"<br>
[1]PETSC ERROR: #1 PetscLogNestedTreePrintLine() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:289<br>
[1]PETSC ERROR: #2 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:377<br>
[1]PETSC ERROR: #3 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:384<br>
[1]PETSC ERROR: #4 PetscLogNestedTreePrintTop() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:420<br>
[1]PETSC ERROR: #5 PetscLogHandlerView_Nested_XML() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:443<br>
[1]PETSC ERROR: #6 PetscLogHandlerView_Nested() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/lognested.c:405<br>
[1]PETSC ERROR: #7 PetscLogHandlerView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/interface/loghandler.c:342<br>
[1]PETSC ERROR: #8 PetscLogView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/plog.c:2040<br>
[1]PETSC ERROR: #9 ex2f-cklaij-dbg.F90:301<br>
--------------------------------------------------------------------------<br>
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF<br>
Proc: [[55228,1],1]<br>
Errorcode: 98<br>
<br>
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.<br>
You may or may not see output from other processes, depending on<br>
exactly when Open MPI kills them.<br>
--------------------------------------------------------------------------<br>
--------------------------------------------------------------------------<br>
prterun has exited due to process rank 1 with PID 0 on node login1 calling<br>
"abort". This may have caused other processes in the application to be<br>
terminated by signals sent by prterun (as reported here).<br>
--------------------------------------------------------------------------<br>
<br>
________________________________________<br>
dr. ir. Christiaan Klaij | senior researcher<br>
Research & Development | CFD Development<br>
T +31 317 49 33 44 | <a href="https://www.marin.nl/" rel="noreferrer" target="_blank">www.marin.nl</a><br>
<br>
<br>
From: Klaij, Christiaan <<a href="mailto:C.Klaij@marin.nl" target="_blank">C.Klaij@marin.nl</a>><br>
Sent: Thursday, July 10, 2025 10:15 AM<br>
To: Junchao Zhang<br>
Cc: PETSc users list<br>
Subject: Re: [petsc-users] problem with nested logging, standalone example<br>
<br>
Hi Junchao,<br>
<br>
Thanks for testing. I've fixed the error but unfortunately that doesn't change the behavior, the code still hangs as before, with the same stack trace...<br>
<br>
Chris<br>
<br>
________________________________________<br>
From: Junchao Zhang <<a href="mailto:junchao.zhang@gmail.com" target="_blank">junchao.zhang@gmail.com</a>><br>
Sent: Tuesday, July 8, 2025 10:58 PM<br>
To: Klaij, Christiaan<br>
Cc: PETSc users list<br>
Subject: Re: [petsc-users] problem with nested logging, standalone example<br>
<br>
Hi, Chris,<br>
First, I had to fix an error in your test by adding "PetscCallA(MatSetFromOptions(AA,ierr))" at line 254, since without it the run failed with:<br>
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
[0]PETSC ERROR: Object is in wrong state<br>
[0]PETSC ERROR: Mat object's type is not set: Argument # 1<br>
...<br>
[0]PETSC ERROR: #1 MatSetValues() at /scratch/jczhang/petsc/src/mat/interface/matrix.c:1503<br>
[0]PETSC ERROR: #2 ex2f.F90:258<br>
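<br>
With that line added, the matrix setup reads like this (a sketch of the usual creation sequence; the PETSC_COMM_SELF communicator and size n are assumptions about your example):<br>
<br>
PetscCallA(MatCreate(PETSC_COMM_SELF,AA,ierr))<br>
PetscCallA(MatSetSizes(AA,PETSC_DECIDE,PETSC_DECIDE,n,n,ierr))<br>
PetscCallA(MatSetFromOptions(AA,ierr))  ! the added line 254: gives AA a type<br>
PetscCallA(MatSetUp(AA,ierr))<br>
! ... the MatSetValues() calls at line 258 now see a typed matrix ...<br>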
<br>
Then I could run the test without problems:<br>
mpirun -n 2 ./ex2f -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always<br>
0 KSP Residual norm 1.11803<br>
1 KSP Residual norm 0.591608<br>
2 KSP Residual norm 0.316228<br>
3 KSP Residual norm < 1.e-11<br>
0 KSP Residual norm 0.707107<br>
1 KSP Residual norm 0.408248<br>
2 KSP Residual norm < 1.e-11<br>
Norm of error < 1.e-12 iterations 3<br>
<br>
I used petsc-3.22.4, gcc-11.3, openmpi-5.0.6 and configured with<br>
./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-openmpi --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"<br>
<br>
Could you fix the error and retry?<br>
<br>
--Junchao Zhang<br>
<br>
<br>
On Sun, Jul 6, 2025 at 12:57 PM Klaij, Christiaan via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:<br>
Attached is a standalone example of the issue described in the<br>
earlier thread "problem with nested logging". The issue appeared<br>
somewhere between petsc 3.19.4 and 3.23.4.<br>
<br>
The example is a variation of ../ksp/tutorials/ex2f.F90, where<br>
I've added the nested log viewer with one event as well as the<br>
solution of a small system on rank zero.<br>
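<br>
In outline, the added logging looks like this (a sketch, not the literal code; the event name, the classid 0, and the XML viewer setup are illustrative):<br>
<br>
PetscLogEvent :: event<br>
PetscViewer :: viewer<br>
PetscCallA(PetscLogNestedBegin(ierr))<br>
PetscCallA(PetscLogEventRegister('MyEvent',0,event,ierr))<br>
PetscCallA(PetscLogEventBegin(event,ierr))<br>
! ... solve the small system on rank zero ...<br>
PetscCallA(PetscLogEventEnd(event,ierr))<br>
PetscCallA(PetscViewerASCIIOpen(PETSC_COMM_WORLD,'log.xml',viewer,ierr))<br>
PetscCallA(PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_XML,ierr))<br>
PetscCallA(PetscLogView(viewer,ierr))<br>
PetscCallA(PetscViewerDestroy(viewer,ierr))<br>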
<br>
When running on multiple procs the example hangs during<br>
PetscLogView with the backtrace below. The configure.log is also<br>
attached in the hope that you can replicate the issue.<br>
<br>
Chris<br>
<br>
<br>
#0 0x000015554c84ea9e in mca_pml_ucx_recv (buf=0x7fffffff9e30, count=1,<br>
datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, src=1, tag=-12,<br>
comm=0x7f1e30, mpi_status=0x0) at pml_ucx.c:700<br>
#1 0x000015554c65baff in ompi_coll_base_allreduce_intra_recursivedoubling (<br>
sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,<br>
dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,<br>
op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630)<br>
at base/coll_base_allreduce.c:247<br>
#2 0x000015554c6a7e40 in ompi_coll_tuned_allreduce_intra_do_this (<br>
sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,<br>
dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,<br>
op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630,<br>
algorithm=3, faninout=0, segsize=0) at coll_tuned_allreduce_decision.c:142<br>
#3 0x000015554c6a054f in ompi_coll_tuned_allreduce_intra_dec_fixed (<br>
sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,<br>
dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,<br>
op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630)<br>
at coll_tuned_decision_fixed.c:216<br>
#4 0x000015554c68e160 in mca_coll_hcoll_allreduce (sbuf=0x7fffffff9e20,<br>
rbuf=0x7fffffff9e30, count=1, dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,<br>
op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaecb80)<br>
at coll_hcoll_ops.c:217<br>
#5 0x000015554c59811a in PMPI_Allreduce (sendbuf=0x7fffffff9e20,<br>
recvbuf=0x7fffffff9e30, count=1, datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30) at allreduce.c:123<br>
#6 0x0000155553eabede in MPIU_Allreduce_Private () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#7 0x0000155553e50d08 in PetscPrintXMLNestedLinePerfResults () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#8 0x0000155553e5123e in PetscLogNestedTreePrintLine () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#9 0x0000155553e51f3a in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#10 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#11 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#12 0x0000155553e52142 in PetscLogNestedTreePrintTop () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#13 0x0000155553e5257b in PetscLogHandlerView_Nested_XML () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#14 0x0000155553e4e5a0 in PetscLogHandlerView_Nested () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#15 0x0000155553e56232 in PetscLogHandlerView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#16 0x0000155553e588c3 in PetscLogView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#17 0x0000155553e40eb5 in petsclogview_ () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22<br>
#18 0x0000000000402c8b in MAIN__ ()<br>
#19 0x00000000004023df in main ()<br>
dr. ir. Christiaan Klaij | senior researcher<br>
Research & Development | CFD Development<br>
T +31 317 49 33 44 | <a href="https://www.marin.nl/" rel="noreferrer" target="_blank">www.marin.nl</a><br>
<br>
<br>
<br>
--<br>
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener<br>
<br>
<a href="https://www.cse.buffalo.edu/~knepley/" rel="noreferrer" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br>
</blockquote></div><div><br clear="all"></div><div><br></div><span class="gmail_signature_prefix">-- </span><br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="https://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>