Hi Junchao,

Your fix works here too, not only for ex2f but more importantly also for our simulation code. What a relief, thanks a lot!

Out of curiosity, do you understand why it failed to reproduce initially?

Chris

________________________________________
dr. ir. Christiaan Klaij | senior researcher
Research & Development | CFD Development
T +31 317 49 33 44 | www.marin.nl
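(Background on why such failures can be hard to reproduce: the stack traces further down in this thread show one rank blocked in an MPI_Allreduce issued from the nested XML viewer, PetscPrintXMLNestedLinePerfResults, while the other rank is already inside MPI_Finalize. The toy program below is purely illustrative, not PETSc code and not necessarily the mechanism addressed by the merge request; it only shows why a rank-dependent number of collective calls tends to hang with one MPI stack and can surface as an MPI error with another, which makes the symptom strongly machine- and timing-dependent.)

```c
/* Illustration only: ranks that disagree on the number of collective calls.
 * Rank 0 issues one more MPI_Allreduce than the others. Depending on the MPI
 * implementation and timing this either hangs (rank 0 waits forever for
 * partners that never call) or the stray call gets matched against whatever
 * collective the other ranks enter next (e.g. inside MPI_Finalize) and may
 * then fail with an MPI error instead. */
#include <mpi.h>

int main(int argc, char **argv)
{
  int rank, one = 1, sum = 0;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  /* All ranks participate in this reduction: fine. */
  MPI_Allreduce(&one, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

  /* Only rank 0 issues a second reduction: the call counts now differ
   * between ranks, which is erroneous MPI usage with no defined outcome. */
  if (rank == 0) MPI_Allreduce(&one, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

  MPI_Finalize(); /* in practice neither rank may get here */
  return 0;
}
```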
style="vertical-align:top;font-family:Calibri,Arial,sans-serif;"><a href="https://urldefense.us/v3/__https://www.marin.nl/__;!!G_uCfscf7eWS!chUVMpNpdFW_9GdVtoUeUbibFyK85qjxNrQXMnMmnMd8EM89uGGMImRDqorLBEfZlQvUW9oFNeAY2-BEgsYkoAY$" target="_blank" id="LPlnk689713" style="text-decoration:none;color:#000001;">www.marin.nl</a></td></tr></table></td></tr></table></td></tr><tr style="font-size:0;"><td align="left" style="vertical-align:top;"><table cellpadding="0" cellspacing="0" border="0" style="border-collapse:collapse;font-size:0;"><tr style="font-size:0;"><td align="left" style="padding:5px 0 0;vertical-align:top;"><table cellpadding="0" cellspacing="0" border="0" style="border-collapse:collapse;font-size:0;"><tr style="font-size:0;"><td align="left" style="vertical-align:top;"><table cellpadding="0" cellspacing="0" border="0" style="border-collapse:collapse;font-size:0;line-height:normal;"><tr style="font-size:0;"><td align="left" style="padding:0 3px 3px 0;vertical-align:top;"><a href="https://urldefense.us/v3/__https://www.facebook.com/marin.wageningen__;!!G_uCfscf7eWS!chUVMpNpdFW_9GdVtoUeUbibFyK85qjxNrQXMnMmnMd8EM89uGGMImRDqorLBEfZlQvUW9oFNeAY2-BEGHDL1A0$" target="_blank" id="LPlnk689713" style="text-decoration:none;"><img src="cid:image203317.png@BA2B2CC7.864514E7" width="15" height="15" border="0" title="Facebook" alt="Facebook" style="width:15px;min-width:15px;max-width:15px;height:15px;min-height:15px;max-height:15px;font-size:12px;" /></a></td></tr></table></td><td align="left" style="vertical-align:top;"><table cellpadding="0" cellspacing="0" border="0" style="border-collapse:collapse;font-size:0;line-height:normal;"><tr style="font-size:0;"><td align="left" style="padding:0 3px 3px 0;vertical-align:top;"><a href="https://urldefense.us/v3/__https://www.linkedin.com/company/marin__;!!G_uCfscf7eWS!chUVMpNpdFW_9GdVtoUeUbibFyK85qjxNrQXMnMmnMd8EM89uGGMImRDqorLBEfZlQvUW9oFNeAY2-BEWqrDabw$" target="_blank" id="LPlnk689713" style="text-decoration:none;"><img src="cid:image449198.png@C88E11A6.19284C23" width="15" height="15" border="0" title="LinkedIn" alt="LinkedIn" style="width:15px;min-width:15px;max-width:15px;height:15px;min-height:15px;max-height:15px;font-size:12px;" /></a></td></tr></table></td><td align="left" style="vertical-align:top;"><table cellpadding="0" cellspacing="0" border="0" style="border-collapse:collapse;font-size:0;line-height:normal;"><tr style="font-size:0;"><td align="left" style="padding:0 3px 3px 0;vertical-align:top;"><a href="https://urldefense.us/v3/__https://www.youtube.com/marinmultimedia__;!!G_uCfscf7eWS!chUVMpNpdFW_9GdVtoUeUbibFyK85qjxNrQXMnMmnMd8EM89uGGMImRDqorLBEfZlQvUW9oFNeAY2-BETUdmUOU$" target="_blank" id="LPlnk689713" style="text-decoration:none;"><img src="cid:image038368.png@069C1F1F.DDC43AE1" width="15" height="15" border="0" title="YouTube" alt="YouTube" style="width:15px;min-width:15px;max-width:15px;height:15px;min-height:15px;max-height:15px;font-size:12px;" /></a></td></tr></table></td></tr></table></td></tr></table></td></tr></table><span style="font-family:remialcxesans;"> <span style="font-family:'template-zjzHWwipEfCqpwAiSIGong';"> </span><span style="font-family:'zone-1';"> </span><span style="font-family:'zones-AQ';"> </span></span></div><br />From: Zongze Yang <yangzongze@gmail.com><br />Sent: Thursday, July 24, 2025 8:21 AM<br />To: Junchao Zhang; Barry Smith<br />Cc: Klaij, Christiaan; PETSc users list<br />Subject: Re: [petsc-users] problem with nested logging, standalone example<br /><br />You don't often get email from 
Thank you for the quick fix — it works well on my end.

Best wishes,
Zongze

From: Junchao Zhang <junchao.zhang@gmail.com>
Date: Thursday, July 24, 2025 at 02:55
To: Barry Smith <bsmith@petsc.dev>
Cc: Zongze Yang <yangzongze@gmail.com>, Klaij, Christiaan <C.Klaij@marin.nl>, PETSc users list <petsc-users@mcs.anl.gov>
Subject: Re: [petsc-users] problem with nested logging, standalone example

I think I have a fix at https://gitlab.com/petsc/petsc/-/merge_requests/8583

Chris and Zongze, could you try it?

Thanks!
--Junchao Zhang


On Tue, Jul 22, 2025 at 4:16 PM Barry Smith <bsmith@petsc.dev> wrote:

  Yippee! (maybe)

On Jul 22, 2025, at 4:18 PM, Junchao Zhang <junchao.zhang@gmail.com> wrote:

With Chris's example, I did reproduce the "MPI_ERR_BUFFER: invalid buffer pointer" on a machine. I am looking into it.

Thanks.
--Junchao Zhang


On Tue, Jul 22, 2025 at 9:51 AM Zongze Yang <yangzongze@gmail.com> wrote:
Hi,
I encountered a similar issue with Firedrake when using the -log_view option with XML format on macOS. Below is the error message.
The Firedrake code and the shell script used to run it are attached.

```
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: General MPI error
[0]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: PETSc Release Version 3.23.4, unknown
[0]PETSC ERROR: test.py with 2 MPI process(es) and PETSC_ARCH arch-firedrake-default on 192.168.10.51 by zzyang Tue Jul 22 22:24:05 2025
[0]PETSC ERROR: Configure options: PETSC_ARCH=arch-firedrake-default --COPTFLAGS="-O3 -march=native -mtune=native" --CXXOPTFLAGS="-O3 -march=native -mtune=native" --FOPTFLAGS="-O3 -mtune=native" --with-c2html=0 --with-debugging=0 --with-fortran-bindings=0 --with-shared-libraries=1 --with-strict-petscerrorcode --download-cmake --download-bison --download-fftw --download-mumps-avoid-mpi-in-place --with-hdf5-dir=/opt/homebrew --with-hwloc-dir=/opt/homebrew --download-metis --download-mumps --download-netcdf --download-pnetcdf --download-ptscotch --download-scalapack --download-suitesparse --download-superlu_dist --download-slepc --with-zlib --download-hpddm --download-libpng --download-ctetgen --download-tetgen --download-triangle --download-mmg --download-parmmg --download-p4est --download-eigen --download-hypre --download-pragmatic
[0]PETSC ERROR: #1 PetscLogNestedTreePrintLine() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:289
[0]PETSC ERROR: #2 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:383
[0]PETSC ERROR: #3 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #4 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #5 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #6 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #7 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #8 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #9 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #10 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #11 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #12 PetscLogNestedTreePrint() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:384
[0]PETSC ERROR: #13 PetscLogNestedTreePrintTop() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:420
[0]PETSC ERROR: #14 PetscLogHandlerView_Nested_XML() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/xmlviewer.c:443
[0]PETSC ERROR: #15 PetscLogHandlerView_Nested() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/impls/nested/lognested.c:405
[0]PETSC ERROR: #16 PetscLogHandlerView() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/handler/interface/loghandler.c:342
[0]PETSC ERROR: #17 PetscLogView() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/plog.c:2043
[0]PETSC ERROR: #18 PetscLogViewFromOptions() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/logging/plog.c:2084
[0]PETSC ERROR: #19 PetscFinalize() at /Users/zzyang/opt/firedrake/firedrake-pip/petsc/src/sys/objects/pinit.c:1552
PetscFinalize() failed [error code: 98]
--------------------------------------------------------------------------
prterun has exited due to process rank 0 with PID 28986 on node 192.168.10.51 exiting
improperly. There are three reasons this could occur:

1. this process did not call "init" before exiting, but others in the
job did. This can cause a job to hang indefinitely while it waits for
all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

3. this process called "MPI_Abort" or "prte_abort" and the mca
parameter prte_create_session_dirs is set to false. In this case, the
run-time cannot detect that the abort call was an abnormal
termination. Hence, the only error message you will receive is this
one.

This may have caused other processes in the application to be
terminated by signals sent by prterun (as reported here).

You can avoid this message by specifying -quiet on the prterun command
line.
--------------------------------------------------------------------------
```

Best wishes,
Zongze

From: petsc-users <petsc-users-bounces@mcs.anl.gov> on behalf of Klaij, Christiaan via petsc-users <petsc-users@mcs.anl.gov>
Date: Monday, July 14, 2025 at 15:58
To: Barry Smith <bsmith@petsc.dev>
Cc: PETSc users list <petsc-users@mcs.anl.gov>
Subject: Re: [petsc-users] problem with nested logging, standalone example

@Junchao: yes, all with my ex2f.F90 variation on two or three cores

@Barry: it's really puzzling that you cannot reproduce. Can you try running it a dozen times in a row? And look at the report_performance.xml file?
When it hangs I see some nan's, for instance here in the VecAXPY event:

               <events>
                    <event>
                        <name>VecAXPY</name>
                        <time>
                            <avgvalue>0.00610203</avgvalue>
                            <minvalue>0.</minvalue>
                            <maxvalue>0.0122041</maxvalue>
                            <minloc>1</minloc>
                            <maxloc>0</maxloc>
                        </time>
                        <ncalls>
                            <avgvalue>0.5</avgvalue>
                            <minvalue>0.</minvalue>
                            <maxvalue>1.</maxvalue>
                            <minloc>1</minloc>
                            <maxloc>0</maxloc>
                        </ncalls>
                    </event>
                    <event>
                        <name>self</name>
                        <time>
                            <value>-nan.</value>
                        </time>

This is what I did in my latest attempt on the login node of our Rocky Linux 9 cluster:
1) download petsc-3.23.4.tar.gz from the petsc website
2) ./configure -prefix=~/petsc/install --with-cxx=0 --with-debugging=0 --with-mpi-dir=/cm/shared/apps/mpich/ge/gcc/64/3.4.2
3) adjust my example to this version of petsc (file is attached)
4) make ex2f-cklaij-dbg-v2
5) mpirun -n 2 ./ex2f-cklaij-dbg-v2

So the exact versions are: petsc-3.23.4, system mpich 3.4.2, system gcc 11.5.0

________________________________________
From: Barry Smith <bsmith@petsc.dev>
Sent: Friday, July 11, 2025 11:22 PM
To: Klaij, Christiaan
Cc: Junchao Zhang; PETSc users list
Subject: Re: [petsc-users] problem with nested logging, standalone example


  And yet we cannot reproduce.

  Please tell us the exact PETSc version and MPI implementation versions. And reattach your reproducing example. And exactly how you run it.

  Can you reproduce it on an "ordinary" machine, say a Mac or Linux laptop?

  Barry

  If I could reproduce the problem, here is how I would debug: I would use -start_in_debugger and then put breakpoints in the places that seem problematic.
Presumably I would end up with a hang with each MPI process in a "different place" and from that I may be able to determine how that happened.



> On Jul 11, 2025, at 7:58 AM, Klaij, Christiaan <C.Klaij@marin.nl> wrote:
>
> In summary for future reference:
> - tested 3 different machines, two at Marin, one at the national HPC
> - tested 3 different mpi implementations (intelmpi, openmpi and mpich)
> - tested openmpi in both release and debug
> - tested 2 different compilers (intel and gnu), both older and very recent versions
> - tested with the most basic config (./configure --with-cxx=0 --with-debugging=0 --download-mpich)
>
> All of these tests either segfault, hang, or error out at the call to PetscLogView.
>
> Chris
>
> ________________________________________
> From: Klaij, Christiaan <C.Klaij@marin.nl>
> Sent: Friday, July 11, 2025 10:10 AM
> To: Barry Smith; Junchao Zhang
> Cc: PETSc users list
> Subject: Re: [petsc-users] problem with nested logging, standalone example
>
> @Matt: no MPI errors indeed. I've tried with MPICH and I get the same hanging.
> @Barry: both stack traces aren't exactly the same, see a sample with MPICH below.
>
> If it cannot be reproduced at your side, I'm afraid this is another dead end. Thanks anyway, I really appreciate all your help.
>
> Chris
>
> (gdb) bt
> #0  0x000015555033bc2e in MPIDI_POSIX_mpi_release_gather_gather.constprop.0 ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #1  0x000015555033db8a in MPIDI_POSIX_mpi_allreduce_release_gather ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #2  0x000015555033e70f in MPIR_Allreduce ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #3  0x000015555033f22e in PMPI_Allreduce ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #4  0x0000155553f85d69 in MPIU_Allreduce_Count (comm=-2080374782,
>    op=1476395020, dtype=1275072547, count=1, outbuf=0x7fffffffac70,
>    inbuf=0x7fffffffac60)
>    at /home/cklaij/petsc/petsc-3.23.4/src/sys/objects/pinit.c:1839
> #5  MPIU_Allreduce_Private (inbuf=inbuf@entry=0x7fffffffac60,
>    outbuf=outbuf@entry=0x7fffffffac70, count=count@entry=1,
>    dtype=dtype@entry=1275072547, op=op@entry=1476395020, comm=-2080374782)
>    at /home/cklaij/petsc/petsc-3.23.4/src/sys/objects/pinit.c:1869
> #6  0x0000155553f33dbe in PetscPrintXMLNestedLinePerfResults (
>    viewer=viewer@entry=0x458890, name=name@entry=0x155554ef6a0d 'mbps\000',
>    value=<optimized out>, minthreshold=minthreshold@entry=0,
>    maxthreshold=maxthreshold@entry=0.01,
>    minmaxtreshold=minmaxtreshold@entry=1.05)
>    at /home/cklaij/petsc/petsc-3.23.4/src/sys/logging/handler/impls/nested/xmlviewer.c:255
>
>
> (gdb) bt
> #0  0x000015554fed3b17 in clock_gettime@GLIBC_2.2.5 () from /lib64/libc.so.6
> #1  0x0000155550b0de71 in ofi_gettime_ns ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #2  0x0000155550b0dec9 in ofi_gettime_ms ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #3  0x0000155550b2fab5 in sock_cq_sreadfrom ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #4  0x00001555505ca6f7 in MPIDI_OFI_progress ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #5  0x0000155550591fe9 in progress_test ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #6  0x00001555505924a3 in MPID_Progress_wait ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #7  0x000015555043463e in MPIR_Wait_state ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #8  0x000015555052ec49 in MPIC_Wait ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #9  0x000015555053093e in MPIC_Sendrecv ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #10 0x00001555504bf674 in MPIR_Allreduce_intra_recursive_doubling ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #11 0x00001555505b61de in MPIDI_OFI_mpi_finalize_hook ()
>    from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
>
> ________________________________________
> From: Barry Smith <bsmith@petsc.dev>
> Sent: Thursday, July 10, 2025 11:10 PM
> To: Junchao Zhang
> Cc: Klaij, Christiaan; PETSc users list
> Subject: Re: [petsc-users] problem with nested logging, standalone example
>
>
>   I cannot reproduce.
>
> On Jul 10, 2025, at 3:46 PM, Junchao Zhang <junchao.zhang@gmail.com> wrote:
>
> Adding -mca coll_hcoll_enable 0 didn't change anything at my end. Strange.
>
> --Junchao Zhang
>
>
> On Thu, Jul 10, 2025 at 3:39 AM Klaij, Christiaan <C.Klaij@marin.nl> wrote:
> An additional clue perhaps: with the option OMPI_MCA_coll_hcoll_enable=0, the code does not hang but gives the error below.
>
> Chris
>
>
> $ mpirun -mca coll_hcoll_enable 0 -n 2 ./ex2f-cklaij-dbg -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always
> 0 KSP Residual norm 1.11803
> 1 KSP Residual norm 0.591608
> 2 KSP Residual norm 0.316228
> 3 KSP Residual norm < 1.e-11
> 0 KSP Residual norm 0.707107
> 1 KSP Residual norm 0.408248
> 2 KSP Residual norm < 1.e-11
> Norm of error < 1.e-12 iterations 3
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: General MPI error
> [1]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.22.4, Mar 01, 2025
> [1]PETSC ERROR: ./ex2f-cklaij-dbg with 2 MPI process(es) and PETSC_ARCH on login1 by cklaij Thu Jul 10 10:33:33 2025
> [1]PETSC ERROR: Configure options: --prefix=/home/cklaij/ReFRESCO/trunk/install/extLibs --with-mpi-dir=/cm/shared/apps/openmpi/gcc/5.0.6-debug --with-x=0 --with-mpe=0 --with-debugging=0 --download-superlu_dist=https://updates.marin.nl/refresco/libs/superlu_dist-8.1.2.tar.gz --with-blaslapack-dir=/cm/shared/apps/oneapi/2024.2.1/mkl/2024.2 --download-parmetis=https://updates.marin.nl/refresco/libs/parmetis-4.0.3-p9.tar.gz --download-metis=https://updates.marin.nl/refresco/libs/metis-5.1.0-p11.tar.gz --with-packages-build-dir=/home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"
> [1]PETSC ERROR: #1 PetscLogNestedTreePrintLine() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:289
> [1]PETSC ERROR: #2 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:377
> [1]PETSC ERROR: #3 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:384
> [1]PETSC ERROR: #4 PetscLogNestedTreePrintTop() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:420
> [1]PETSC ERROR: #5 PetscLogHandlerView_Nested_XML() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:443
> [1]PETSC ERROR: #6 PetscLogHandlerView_Nested() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/lognested.c:405
> [1]PETSC ERROR: #7 PetscLogHandlerView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/interface/loghandler.c:342
> [1]PETSC ERROR: #8 PetscLogView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/plog.c:2040
> [1]PETSC ERROR: #9 ex2f-cklaij-dbg.F90:301
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
> Proc: [[55228,1],1]
> Errorcode: 98
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> prterun has exited due to process rank 1 with PID 0 on node login1 calling
> "abort". This may have caused other processes in the application to be
> terminated by signals sent by prterun (as reported here).
> --------------------------------------------------------------------------
>
> ________________________________________
> From: Klaij, Christiaan <C.Klaij@marin.nl>
> Sent: Thursday, July 10, 2025 10:15 AM
> To: Junchao Zhang
> Cc: PETSc users list
> Subject: Re: [petsc-users] problem with nested logging, standalone example
>
> Hi Junchao,
>
> Thanks for testing.
> I've fixed the error but unfortunately that doesn't change the behavior, the code still hangs as before, with the same stack trace...
>
> Chris
>
> ________________________________________
> From: Junchao Zhang <junchao.zhang@gmail.com>
> Sent: Tuesday, July 8, 2025 10:58 PM
> To: Klaij, Christiaan
> Cc: PETSc users list
> Subject: Re: [petsc-users] problem with nested logging, standalone example
>
> Hi, Chris,
> First, I had to fix an error in your test by adding "PetscCallA(MatSetFromOptions(AA,ierr))" at line 254.
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: Mat object's type is not set: Argument # 1
> ...
> [0]PETSC ERROR: #1 MatSetValues() at /scratch/jczhang/petsc/src/mat/interface/matrix.c:1503
> [0]PETSC ERROR: #2 ex2f.F90:258
>
> Then I could run the test without problems:
> mpirun -n 2 ./ex2f -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always
> 0 KSP Residual norm 1.11803
> 1 KSP Residual norm 0.591608
> 2 KSP Residual norm 0.316228
> 3 KSP Residual norm < 1.e-11
> 0 KSP Residual norm 0.707107
> 1 KSP Residual norm 0.408248
> 2 KSP Residual norm < 1.e-11
> Norm of error < 1.e-12 iterations 3
>
> I used petsc-3.22.4, gcc-11.3, openmpi-5.0.6 and configured with
> ./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-openmpi --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"
>
> Could you fix the error and retry?
>
> --Junchao Zhang
>
>
> On Sun, Jul 6, 2025 at 12:57 PM Klaij, Christiaan via petsc-users <petsc-users@mcs.anl.gov> wrote:
> Attached is a standalone example of the issue described in the
> earlier thread "problem with nested logging". The issue appeared
> somewhere between petsc 3.19.4 and 3.23.4.
>
> The example is a variation of ../ksp/tutorials/ex2f.F90, where
> I've added the nested log viewer with one event as well as the
> solution of a small system on rank zero.
>
> When running on multiple procs the example hangs during
> PetscLogView with the backtrace below.
> The configure.log is also
> attached in the hope that you can replicate the issue.
>
> Chris
>
>
> #0 0x000015554c84ea9e in mca_pml_ucx_recv (buf=0x7fffffff9e30, count=1,
> datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, src=1, tag=-12,
> comm=0x7f1e30, mpi_status=0x0) at pml_ucx.c:700
> #1 0x000015554c65baff in ompi_coll_base_allreduce_intra_recursivedoubling (
> sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,
> dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,
> op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630)
> at base/coll_base_allreduce.c:247
> #2 0x000015554c6a7e40 in ompi_coll_tuned_allreduce_intra_do_this (
> sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,
> dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,
> op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630,
> algorithm=3, faninout=0, segsize=0) at coll_tuned_allreduce_decision.c:142
> #3 0x000015554c6a054f in ompi_coll_tuned_allreduce_intra_dec_fixed (
> sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1,
> dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,
> op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630)
> at coll_tuned_decision_fixed.c:216
> #4 0x000015554c68e160 in mca_coll_hcoll_allreduce (sbuf=0x7fffffff9e20,
> rbuf=0x7fffffff9e30, count=1, dtype=0x15554c9ef900 <ompi_mpi_2dblprec>,
> op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaecb80)
> at coll_hcoll_ops.c:217
> #5 0x000015554c59811a in PMPI_Allreduce (sendbuf=0x7fffffff9e20,
> recvbuf=0x7fffffff9e30, count=1, datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30) at allreduce.c:123
> #6 0x0000155553eabede in MPIU_Allreduce_Private () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #7 0x0000155553e50d08 in PetscPrintXMLNestedLinePerfResults () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #8 0x0000155553e5123e in PetscLogNestedTreePrintLine () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #9 0x0000155553e51f3a in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #10 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #11 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #12 0x0000155553e52142 in PetscLogNestedTreePrintTop () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #13 0x0000155553e5257b in PetscLogHandlerView_Nested_XML () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #14 0x0000155553e4e5a0 in PetscLogHandlerView_Nested () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #15 0x0000155553e56232 in PetscLogHandlerView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #16 0x0000155553e588c3 in PetscLogView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #17 0x0000155553e40eb5 in petsclogview_ () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #18 0x0000000000402c8b in MAIN__ ()
> #19 0x00000000004023df in main ()
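For readers who do not have the attachments: Chris's reproducer itself is not included above, but the pattern he describes (enable nested logging, register one user event, and write the report through an XML-format viewer with PetscLogView) looks roughly like the C sketch below. This is only an approximation of the attached Fortran code: the class and event names are made up here, and the file name report_performance.xml is taken from the discussion earlier in the thread.

```c
/* Rough sketch of the logging pattern discussed in this thread; not the
 * attached ex2f.F90 variant itself. */
#include <petscsys.h>
#include <petscviewer.h>

int main(int argc, char **argv)
{
  PetscClassId  classid;
  PetscLogEvent my_event;
  PetscViewer   viewer;

  PetscFunctionBeginUser;
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(PetscLogNestedBegin()); /* turn on the nested log handler */

  PetscCall(PetscClassIdRegister("MyClass", &classid));
  PetscCall(PetscLogEventRegister("MyEvent", classid, &my_event));

  PetscCall(PetscLogEventBegin(my_event, 0, 0, 0, 0));
  /* ... work to be timed; in Chris's variant this includes a solve on
   * PETSC_COMM_WORLD plus a small solve on rank 0 only, so the set of
   * logged events differs between ranks ... */
  PetscCall(PetscLogEventEnd(my_event, 0, 0, 0, 0));

  /* Write the nested performance report in XML, i.e. the
   * report_performance.xml file mentioned above. */
  PetscCall(PetscViewerASCIIOpen(PETSC_COMM_WORLD, "report_performance.xml", &viewer));
  PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_ASCII_XML));
  PetscCall(PetscLogView(viewer));
  PetscCall(PetscViewerPopFormat(viewer));
  PetscCall(PetscViewerDestroy(&viewer));

  PetscCall(PetscFinalize());
  return 0;
}
```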