<div dir="ltr">Thanks for the update. Let's assume it is a bug in MPI :)<br clear="all"><div><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr">--Junchao Zhang</div></div></div><br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Aug 13, 2020 at 11:15 AM Chris Hewson <<a href="mailto:chris@resfrac.com">chris@resfrac.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Just as an update to this, I can confirm that using the mpich version (3.3.2) downloaded with the petsc download solved this issue on my end.<br clear="all"><div><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><b><br></b></div><div dir="ltr"><b>Chris Hewson</b><div>Senior Reservoir Simulation Engineer</div><div>ResFrac</div><div>+1.587.575.9792</div></div></div></div></div></div></div></div><br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Jul 23, 2020 at 5:58 PM Junchao Zhang <<a href="mailto:junchao.zhang@gmail.com" target="_blank">junchao.zhang@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><div>On Mon, Jul 20, 2020 at 7:05 AM Barry Smith <<a href="mailto:bsmith@petsc.dev" target="_blank">bsmith@petsc.dev</a>> wrote:<br></div></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><br></div> Is there a comprehensive MPI test suite (perhaps from MPICH)? Is there any way to run this full test suite under the problematic MPI and see if it detects any problems? <div><br></div><div> Is so, could someone add it to the FAQ in the debugging section?</div></div></blockquote><div>MPICH does have a test suite. It is at the subdir test/mpi of downloaded <a href="http://www.mpich.org/static/downloads/3.3.2/mpich-3.3.2.tar.gz" target="_blank">mpich</a>. It annoyed me since it is not user-friendly. It might be helpful in catching bugs at very small scale. But say if I want to test allreduce on 1024 ranks on 100 doubles, I have to hack the test suite.</div><div>Anyway, the instructions are here.<br></div></div><blockquote style="margin:0px 0px 0px 40px;border:none;padding:0px"><div class="gmail_quote"><div>For the purpose of petsc, under test/mpi one can configure it with</div></div><div class="gmail_quote"><div>$./configure CC=mpicc CXX=mpicxx FC=mpifort --enable-strictmpi --enable-threads=funneled --enable-fortran=f77,f90 --enable-fast --disable-spawn --disable-cxx --disable-ft-tests // It is weird I disabled cxx but I had to set CXX!</div></div><div class="gmail_quote"><div>$make -k -j8 // -k is to keep going and ignore compilation errors, e.g., when building tests for MPICH extensions not in MPI standard, but your MPI is OpenMPI.</div></div><div class="gmail_quote"><div>$ // edit testlist, remove lines mpi_t, rma, f77, impls. Those are sub-dirs containing tests for MPI routines Petsc does not rely on. 
</div></div><div class="gmail_quote"><div>$ make testings or directly './runtests -tests=testlist'</div></div><div class="gmail_quote"><div><br></div></div><div class="gmail_quote"><div>On a batch system, </div></div><div class="gmail_quote"><div>$export MPITEST_BATCHDIR=`pwd`/btest // specify a batch dir, say btest, </div></div><div class="gmail_quote"><div>$./runtests -batch -mpiexec=mpirun -np=1024 -tests=testlist // Use 1024 ranks if a test does no specify the number of processes.</div></div><div class="gmail_quote"><div>$ // It copies test binaries to the batch dir and generates a script runtests.batch there. Edit the script to fit your batch system and then submit a job and wait for its finish. </div></div><div class="gmail_quote"><div>$ cd btest && ../checktests --ignorebogus</div></div></blockquote><div class="gmail_quote"><div><br></div><div>PS: Fande, changing an MPI fixed your problem does not necessarily mean the old MPI has bugs. It is complicated. It could be a petsc bug. You need to provide us a code to reproduce your error. It does not matter if the code is big. </div><div><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div><br></div><div> Thanks</div><div><br></div><div> Barry</div><div><br><div><br><blockquote type="cite"><div>On Jul 20, 2020, at 12:16 AM, Fande Kong <<a href="mailto:fdkong.jd@gmail.com" target="_blank">fdkong.jd@gmail.com</a>> wrote:</div><br><div><div dir="ltr"><div dir="ltr">Trace could look like this:<div><br></div><div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: Argument out of range</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: key 45226154 is greater than largest key allowed 740521</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: See </span><a href="https://www.mcs.anl.gov/petsc/documentation/faq.html" style="background-color:transparent;margin-top:0pt;margin-bottom:0pt;color:rgb(74,110,224)" target="_blank"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">https://www.mcs.anl.gov/petsc/documentation/faq.html</span></a><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt"> for trouble shooting.</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: Petsc Release Version 3.13.3, unknown </span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: ../../griffin-opt on a arch-moose named r6i5n18 by wangy2 Sun Jul 19 17:14:28 2020</span></div><div 
style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: Configure options --download-hypre=1 --with-debugging=no --with-shared-libraries=1 --download-fblaslapack=1 --download-metis=1 --download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1 --download-mumps=1 --download-scalapack=1 --download-slepc=1 --with-mpi=1 --with-cxx-dialect=C++11 --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices --download-mumps=0</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #1 PetscTableFind() line 132 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/include/petscctable.h</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #2 MatSetUpMultiply_MPIAIJ() line 33 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/mat/impls/aij/mpi/mmaij.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #3 MatAssemblyEnd_MPIAIJ() line 876 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/mat/impls/aij/mpi/mpiaij.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #4 MatAssemblyEnd() line 5347 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/mat/interface/matrix.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #5 MatPtAPNumeric_MPIAIJ_MPIXAIJ_allatonce() line 901 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/mat/impls/aij/mpi/mpiptap.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #6 MatPtAPNumeric_MPIAIJ_MPIMAIJ_allatonce() line 3180 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/mat/impls/maij/maij.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #7 MatProductNumeric_PtAP() line 704 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/mat/interface/matproduct.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #8 MatProductNumeric() line 759 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/mat/interface/matproduct.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #9 MatPtAP() line 9199 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/mat/interface/matrix.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span 
style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #10 MatGalerkin() line 10236 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/mat/interface/matrix.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #11 PCSetUp_MG() line 745 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/ksp/pc/impls/mg/mg.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #12 PCSetUp_HMG() line 220 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/ksp/pc/impls/hmg/hmg.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #13 PCSetUp() line 898 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/ksp/pc/interface/precon.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #14 KSPSetUp() line 376 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/ksp/ksp/interface/itfunc.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #15 KSPSolve_Private() line 633 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/ksp/ksp/interface/itfunc.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #16 KSPSolve() line 853 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/ksp/ksp/interface/itfunc.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #17 SNESSolve_NEWTONLS() line 225 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/snes/impls/ls/ls.c</span></div><div style="color:rgb(14,16,26);background-color:transparent;margin-top:0pt;margin-bottom:0pt"><span style="background-color:transparent;margin-top:0pt;margin-bottom:0pt">[640]PETSC ERROR: #18 SNESSolve() line 4519 in /home/wangy2/trunk/sawtooth/griffin/moose/petsc/src/snes/interface/snes.c</span></div></div></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sun, Jul 19, 2020 at 11:13 PM Fande Kong <<a href="mailto:fdkong.jd@gmail.com" target="_blank">fdkong.jd@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><div>I am not entirely sure what is happening, but we encountered similar issues recently. It was not reproducible. It might occur at different stages, and errors could be weird other than "ctable stuff." Our code was Valgrind clean since every PR in moose needs to go through rigorous Valgrind checks before it reaches the devel branch. The errors happened when we used mvapich.</div><div><br></div><div>We changed to use HPE-MPT (a vendor stalled MPI), then everything was smooth. May you try a different MPI? 
It is better to try one provided by the system.

We have not gotten to the bottom of this problem yet, but we at least know it is somehow MPI-related.

Thanks,

Fande,

On Sun, Jul 19, 2020 at 3:28 PM Chris Hewson <chris@resfrac.com> wrote:

Hi,

I am hitting a bug in PETSc that produces the error message:

[7]PETSC ERROR: PetscTableFind() line 132 in /home/chewson/petsc-3.13.2/include/petscctable.h key 7556 is greater than largest key allowed 5693

This is with petsc-3.13.2, compiled and run with mpich, with -O3, debugging turned off, and tuned to the Haswell architecture. The error occurs either before or during a KSPBCGS solve/setup or during a MUMPS factorization solve (I haven't been able to replicate the issue with the same set of instructions, etc.).

This is a terrible way to ask a question, I know, and not very helpful from your side, but this is all I have from a user's run, and I can't reproduce it on my end (either with the optimized build or with debugging turned on). It happens only after the code has run for quite some time, and only rarely.

More than likely I am using a static variable (the code is written in C++) that I'm not updating when the matrix size changes, or something silly like that, but any help or guidance on this would be appreciated.

Chris Hewson
Senior Reservoir Simulation Engineer
ResFrac
+1.587.575.9792
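To picture the failure mode described in the last paragraph above (a size cached in a static variable and never refreshed when the matrix changes), here is a hypothetical sketch; it is not ResFrac or MOOSE code, and every name in it is invented. With a debug PETSc build the bad column index is caught right at MatSetValues(); with --with-debugging=no it may only surface later, for example during assembly, which is at least consistent with (though not proof of) the traces in this thread.

/* stale_size.c: hypothetical illustration only -- not ResFrac/MOOSE code.
 * A global size is cached the first time it is needed and never refreshed,
 * then reused after the matrix is re-created with fewer columns.
 * Run with more than one rank. */
#include <petscmat.h>

static PetscInt cached_N = 0; /* cached "for speed", never invalidated: the suspected bug */

static PetscErrorCode InsertIntoLastColumn(Mat A, PetscInt row)
{
  PetscErrorCode ierr;
  PetscScalar    one = 1.0;
  PetscInt       col;

  PetscFunctionBeginUser;
  if (!cached_N) { ierr = MatGetSize(A, NULL, &cached_N);CHKERRQ(ierr); } /* set only once */
  col  = cached_N - 1; /* out of range once A has fewer columns than the first matrix */
  ierr = MatSetValues(A, 1, &row, 1, &col, &one, INSERT_VALUES);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  Mat            A;
  PetscInt       sizes[2] = {20000, 10000}, rstart, i; /* arbitrary: first large, then smaller */

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  for (i = 0; i < 2; i++) {
    ierr = MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, sizes[i], sizes[i],
                        5, NULL, 5, NULL, &A);CHKERRQ(ierr);
    ierr = MatGetOwnershipRange(A, &rstart, NULL);CHKERRQ(ierr);
    ierr = InsertIntoLastColumn(A, rstart);CHKERRQ(ierr); /* second pass uses the stale size */
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); /* optimized builds may only fail here */
    ierr = MatDestroy(&A);CHKERRQ(ierr);
  }
  ierr = PetscFinalize();
  return ierr;
}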