Please take a look at the notes at https://petsc.org/release/manualpages/Sys/PetscShmgetAllocateArray/

For some reason your program is not able to access or use Unix shared memory. Check whether shared memory segments from a previous run are still allocated (so none is available for a new run), or whether the system limits are too low to provide enough shared memory; some commands for checking this on Linux are sketched below, after your quoted message.

   Barry

> On Oct 23, 2024, at 8:23 AM, Praveen C <cpraveen@gmail.com> wrote:
>
> Dear all
>
> I am not able to run the boussinesq example from geoclaw using petsc@3.22.0
>
> https://github.com/clawpack/geoclaw/tree/3303883f46572c58130d161986b8a87a57ca7816/examples/bouss
>
> It runs with petsc@3.21.6
>
> The error I get is given below. After printing it, the code does not progress.
>
> I use the following petsc options:
>
> # set minimum number of matrix rows per MPI rank (default is 10000)
> -mpi_linear_solve_minimum_count_per_rank 5000
>
> # Krylov linear solver:
> -mpi_linear_solver_server
> -mpi_linear_solver_server_view
> -ksp_type gmres
> -ksp_max_it 200
> -ksp_reuse_preconditioner
> -ksp_rtol 1.e-9
>
> # preconditioner:
> -pc_type gamg
>
> I installed petsc and the other dependencies for clawpack using miniforge.
>
> Thanks
> pc
>
> ==> Use Bouss. in water deeper than    1.0000000000000000
> Using a PETSc solver
> Using Bouss equations from the start
> rnode allocated...
> node allocated...
> listOfGrids allocated...
> Storage allocated...
> bndList allocated...
> Gridding level   1 at t =  0.000000E+00:     4 grids with       10000 cells
>   Setting initial dt to    2.9999999999999999E-002
>  max threads set to            6
>    Done reading data, starting computation ...
> Total zeta at initial time:    39269.907650665169
> GEOCLAW: Frame    0 output files done at time t =  0.000000D+00
>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Petsc has generated inconsistent data
> [0]PETSC ERROR: Unable to locate PCMPI allocated shared address 0x55e6d750ae20
> [0]PETSC ERROR: WARNING! There are unused option(s) set!
> Could be the program crashed before usage or a spelling mistake, etc!
> [0]PETSC ERROR:   Option left: name:-ksp_max_it value: 200 source: file
> [0]PETSC ERROR:   Option left: name:-ksp_reuse_preconditioner (no value) source: file
> [0]PETSC ERROR:   Option left: name:-ksp_rtol value: 1.e-9 source: file
> [0]PETSC ERROR:   Option left: name:-ksp_type value: gmres source: file
> [0]PETSC ERROR:   Option left: name:-mpi_linear_solve_minimum_count_per_rank value: 5000 source: file
> [0]PETSC ERROR:   Option left: name:-mpi_linear_solver_server_view (no value) source: file
> [0]PETSC ERROR:   Option left: name:-pc_type value: gamg source: file
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.22.0, Sep 28, 2024
> [0]PETSC ERROR: /home/praveen/bouss/radial_flat/xgeoclaw with 6 MPI process(es) and PETSC_ARCH  on euler by praveen Thu Oct 17 21:49:54 2024
> [0]PETSC ERROR: Configure options: AR=${PREFIX}/bin/x86_64-conda-linux-gnu-ar CC=mpicc CXX=mpicxx FC=mpifort CFLAGS="-march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -ffunction-sections -pipe -isystem /opt/miniforge/envs/claw/include  " CPPFLAGS="-DNDEBUG -D_FORTIFY_SOURCE=2 -O2 -isystem /opt/miniforge/envs/claw/include" CXXFLAGS="-fvisibility-inlines-hidden -fmessage-length=0 -march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -ffunction-sections -pipe -isystem /opt/miniforge/envs/claw/include  " FFLAGS="-march=nocona -mtune=haswell -ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 -ffunction-sections -pipe -isystem /opt/miniforge/envs/claw/include   -Wl,--no-as-needed" LDFLAGS="-pthread -fopenmp -Wl,-O2 -Wl,--sort-common -Wl,--as-needed -Wl,-z,relro -Wl,-z,now -Wl,--disable-new-dtags -Wl,--gc-sections -Wl,--allow-shlib-undefined -Wl,-rpath,/opt/miniforge/envs/claw/lib -Wl,-rpath-link,/opt/miniforge/envs/claw/lib -L/opt/miniforge/envs/claw/lib -Wl,-rpath-link,/opt/miniforge/envs/claw/lib" LIBS="-Wl,-rpath,/opt/miniforge/envs/claw/lib -lmpi_mpifh -lgfortran" --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-clib-autodetect=0 --with-cxxlib-autodetect=0 --with-fortranlib-autodetect=0 --with-debugging=0 --with-blas-lib=libblas.so --with-lapack-lib=liblapack.so --with-yaml=1 --with-hdf5=1 --with-fftw=1 --with-hwloc=0 --with-hypre=1 --with-metis=1 --with-mpi=1 --with-mumps=1 --with-parmetis=1 --with-pthread=1 --with-ptscotch=1 --with-shared-libraries --with-ssl=0 --with-scalapack=1 --with-superlu=1 --with-superlu_dist=1 --with-superlu_dist-include=/opt/miniforge/envs/claw/include/superlu-dist --with-superlu_dist-lib=-lsuperlu_dist --with-suitesparse=1 --with-suitesparse-dir=/opt/miniforge/envs/claw --with-x=0 --with-scalar-type=real   --with-cuda=0 --prefix=/opt/miniforge/envs/claw
> [0]PETSC ERROR: #1 PetscShmgetMapAddresses() at /home/conda/feedstock_root/build_artifacts/petsc_1728030599661/work/src/sys/utils/server.c:114
> [0]PETSC ERROR: #2 PCMPISetMat() at /home/conda/feedstock_root/build_artifacts/petsc_1728030599661/work/src/ksp/pc/impls/mpi/pcmpi.c:269
> [0]PETSC ERROR: #3 PCSetUp_MPI() at /home/conda/feedstock_root/build_artifacts/petsc_1728030599661/work/src/ksp/pc/impls/mpi/pcmpi.c:853
> [0]PETSC ERROR: #4 PCSetUp() at /home/conda/feedstock_root/build_artifacts/petsc_1728030599661/work/src/ksp/pc/interface/precon.c:1071
> [0]PETSC ERROR: #5 KSPSetUp() at /home/conda/feedstock_root/build_artifacts/petsc_1728030599661/work/src/ksp/ksp/interface/itfunc.c:415
> [0]PETSC ERROR: #6 KSPSolve_Private() at /home/conda/feedstock_root/build_artifacts/petsc_1728030599661/work/src/ksp/ksp/interface/itfunc.c:826
> [0]PETSC ERROR: #7 KSPSolve() at /home/conda/feedstock_root/build_artifacts/petsc_1728030599661/work/src/ksp/ksp/interface/itfunc.c:1075
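P.S. A rough sketch of what I would check on a Linux system; ipcs, ipcrm, and sysctl are standard system tools, though the exact kernel parameter names below may differ on your distribution:

   # list the System V shared memory segments currently allocated;
   # stale segments left behind by a crashed run show up here and can
   # be removed with: ipcrm -m <shmid>
   ipcs -m

   # show the system-wide shared memory limits
   # (max segment size, total pages, max number of segments)
   ipcs -lm

   # the same limits as kernel parameters; root can raise them with sysctl -w
   sysctl kernel.shmmax kernel.shmall kernel.shmmni

If segments owned by your user are still listed even though no solver server is running, remove them and try the run again; if the limits are small, they may need to be raised before the -mpi_linear_solver_server run can allocate its shared arrays.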