On Sat, Mar 30, 2019 at 4:31 PM Kokron, Daniel S. (ARC-606.2)[InuTeq, LLC] via petsc-users <petsc-users@mcs.anl.gov> wrote:

> Last time I built PETSc on Pleiades it was version 3.8.3.  Using the same
> build procedure with the same compilers and MPI libraries with 3.11 does
> not work.  Is there a way to enable more verbose diagnostics during the
> configure phase so I can figure out what executable was being run and how
> it was compiled?

This is not the right option:

  --with-mpi-exec=mpiexec

it is

  --with-mpiexec=mpiexec

  Thanks,

      Matt
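
As a sketch (untested here), the corrected invocation differs only in that
option name; everything else stays as in your command below:

  ./configure ... --with-mpiexec=mpiexec ...

Since configure evidently accepted and ignored the misspelled
--with-mpi-exec, it presumably fell back to its default launcher when
running its test executables.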
> PBS r147i6n10 24> ./configure --prefix=/nobackupp8/XXX/Projects/CHEM/BoA_Case/Codes-2018.3.222/binaries/petsc-3.11+
>   --with-debugging=0 --with-shared-libraries=1 --with-cc=mpicc --with-fc=mpif90 --with-cxx=mpicxx
>   --with-blas-lapack-dir=$MKLROOT/lib/intel64 --with-scalapack-include=$MKLROOT/include
>   --with-scalapack-lib="$MKLROOT/lib/intel64/libmkl_scalapack_lp64.so $MKLROOT/lib/intel64/libmkl_blacs_sgimpt_lp64.so"
>   --with-cpp=/usr/bin/cpp --with-gnu-compilers=0 --with-vendor-compilers=intel
>   -COPTFLAGS="-g -O3 -xCORE-AVX2 -diag-disable=cpu-dispatch"
>   -CXXOPTFLAGS="-g -O3 -xCORE-AVX2 -diag-disable=cpu-dispatch"
>   -FOPTFLAGS="-g -O3 -xCORE-AVX2 -diag-disable=cpu-dispatch"
>   --with-mpi=true --with-mpi-exec=mpiexec --with-mpi-compilers=1 --with-precision=double
>   --with-scalar-type=real --with-x=0 --with-x11=0 --with-memalign=32
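
On the verbose-diagnostics question: everything configure compiles, links,
and runs is recorded in configure.log at the top of the PETSc tree, with
each command on a line beginning "Executing:", so the exact executable and
flags behind a failing test can be recovered with, for example:

  grep "Executing:" configure.log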

> I get this, which usually means that an executable was linked with libmpi
> but was not launched with mpiexec.
>
> TESTING: configureMPITypes from config.packages.MPI(/nobackupp8/dkokron/Projects/CHEM/BoA_Case/Codes-2018.3.222/petsc/config/BuildSystem/config/packages/MPI.py:283)
> CMPT ERROR: mpiexec_mpt must be used to launch all MPI applications
> CMPT ERROR: mpiexec_mpt must be used to launch all MPI applications
> CMPT ERROR: mpiexec_mpt must be used to launch all MPI applications
>
> If I let it continue, configure reports that MPI is empty.
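
The CMPT errors come from SGI/HPE MPT (the libmkl_blacs_sgimpt library in
the link line points the same way): the configureMPITypes test program was
executed directly rather than under MPT's launcher. If the renamed option
still trips this because plain mpiexec is not MPT's launcher on Pleiades,
the analogous (untested) setting would be

  --with-mpiexec=mpiexec_mpt

or configuring --with-batch if MPI programs cannot be launched at all from
the shell configure runs in.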

> make:
> BLAS/LAPACK: -Wl,-rpath,/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64
>   -L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64
>   -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread
> MPI:
> cmake:
> pthread:
> scalapack:
>
> Daniel Kokron
> Redline Performance Solutions
> SciCon/APP group

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/