Satish,
Thanks.
Santiago

On Sat, Apr 25, 2020 at 12:05 PM Satish Balay <balay@mcs.anl.gov> wrote:
I have no idea what's different with the Ubuntu MUMPS package.

You can try debugging it as the error message suggests.

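For example (a rough sketch, not tested here - adjust the path to wherever ex19 was built, and use the mpiexec that matches your MPI):

$ cd src/snes/examples/tutorials            # where the test build places ex19
$ mpiexec -n 2 valgrind --leak-check=no ./ex19   # look for invalid reads/writes
$ mpiexec -n 2 ./ex19 -start_in_debugger         # opens a debugger for each rank
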
My suggestion is to stick with petsc-3.13 and --download-mumps.

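Roughly something like the following (untested; the remaining options are just carried over from your configure line):

$ ./configure --with-cc=mpicc --with-fc=mpif90 --with-cxx=mpicxx \
    --prefix=/home/santiago/usr/local --with-shared-libraries \
    --download-fblaslapack --download-scalapack --download-mumps \
    --with-debugging=0
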
You can use Ubuntu packages for software whose API is more consistent across versions - like BLAS/LAPACK or MPICH/Open MPI.

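For example, something like this should give you system BLAS/LAPACK and MPI that configure can pick up on its own (package names may vary slightly between Ubuntu releases):

$ sudo apt install libblas-dev liblapack-dev libopenmpi-dev openmpi-bin

With those installed, --download-fblaslapack should not be necessary.
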
Satish

On Sat, 25 Apr 2020, san.temporal@gmail.com wrote:

> Hi,
>
> I am trying to compile PETSc, using the precompiled Mumps in Ubuntu. The
> available Mumps version is 5.1.2, so I use PETSc 3.11 (for 3.13 I would
> require mumps 5.2.1, not available as a precompiled package).
>
> The packages are:
>
> $ dpkg -l | grep mumps
> ii  libmumps-5.1.2:amd64           5.1.2-4  amd64  Direct linear systems solver - parallel shared libraries
> ii  libmumps-dev:amd64             5.1.2-4  amd64  Direct linear systems solver - parallel development files
> ii  libmumps-ptscotch-5.1.2:amd64  5.1.2-4  amd64  Direct linear systems solver - PTScotch-version shared libraries
> ii  libmumps-ptscotch-dev:amd64    5.1.2-4  amd64  Direct linear systems solver - PTScotch-version development files
> ii  libmumps-scotch-5.1.2:amd64    5.1.2-4  amd64  Direct linear systems solver - Scotch-version shared libraries
> ii  libmumps-scotch-dev:amd64      5.1.2-4  amd64  Direct linear systems solver - Scotch-version development files
> ii  libmumps-seq-5.1.2:amd64       5.1.2-4  amd64  Direct linear systems solver - non-parallel shared libraries
> ii  libmumps-seq-dev:amd64         5.1.2-4  amd64  Direct linear systems solver - non-parallel development files
>
> So I configure with
>
> $ ./configure --with-cc=mpicc --with-fc=mpif90 -with-cxx=mpicxx \
>     --prefix=/home/santiago/usr/local --with-make-np=10 --with-shared-libraries \
>     --with-packages-download-dir=/home/santiago/Documents/installers/petsc \
>     --download-fblaslapack --with-mumps --with-scalapack --with-debugging=0 \
>     COPTFLAGS='-O -O3 -march=native -mtune=native' \
>     FOPTFLAGS='-O -O3 -march=native -mtune=native' \
>     CXXOPTFLAGS='-O -O3 -march=native -mtune=native' --force
>
> works fine. But
>
> $ make PETSC_DIR=/home/santiago/usr/local PETSC_ARCH="" test
> Running test examples to verify correct installation
> Using PETSC_DIR=/home/santiago/usr/local and PETSC_ARCH=
> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.11.4, Sep, 28, 2019
> [0]PETSC ERROR: ./ex19 on a named isaiasPrecision-7820 by santiago Sat Apr 25 09:52:01 2020
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
> [1]PETSC ERROR: to get more information on the crash.
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Signal received
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.11.4, Sep, 28, 2019
> [1]PETSC ERROR: ./ex19 on a named isaiasPrecision-7820 by santiago Sat Apr 25 09:52:01 2020
> [1]PETSC ERROR: Configure options --with-cc=mpicc --with-fc=mpif90 -with-cxx=mpicxx --prefix=/home/santiago/usr/local --with-make-np=10 --with-shared-libraries --with-packages-download-dir=/home/santiago/Documents/installers/petsc --download-fblaslapack --with-mumps --with-scalapack --with-debugging=0 COPTFLAGS="-O -O3 -march=native -mtune=native" FOPTFLAGS="-O -O3 -march=native -mtune=native" CXXOPTFLAGS="-O -O3 -march=native -mtune=native" --force
> [1]PETSC ERROR: #1 User provided function() line 0 in unknown file
> [0]PETSC ERROR: Configure options --with-cc=mpicc --with-fc=mpif90 -with-cxx=mpicxx --prefix=/home/santiago/usr/local --with-make-np=10 --with-shared-libraries --with-packages-download-dir=/home/santiago/Documents/installers/petsc --download-fblaslapack --with-mumps --with-scalapack --with-debugging=0 COPTFLAGS="-O -O3 -march=native -mtune=native" FOPTFLAGS="-O -O3 -march=native -mtune=native" CXXOPTFLAGS="-O -O3 -march=native -mtune=native" --force
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
>
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
> with errorcode 59.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> [isaiasPrecision-7820:09935] 1 more process has sent help message help-mpi-api.txt / mpi-abort
> [isaiasPrecision-7820:09935] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
>
> fails. As mentioned in
> https://www.mcs.anl.gov/petsc/documentation/faq.html#PetscOptionsInsertFile
> (even if that is not the same error), I checked
>
> $ ping `hostname`
>
> and it works fine.
> As a reference, a PETSc version compiled with ... --download-mumps
> --download-scalapack ... works fine.
>
> How can I compile and successfully check PETSc, using the precompiled Mumps
> in Ubuntu?
>
> Thanks