[petsc-users] Configure with precompiled mumps and other packages

san.temporal at gmail.com
Sat Apr 25 08:04:56 CDT 2020


Hi,

I am trying to compile PETSc using the precompiled MUMPS packages in Ubuntu. The
available MUMPS version is 5.1.2, so I use PETSc 3.11 (PETSc 3.13 would require
MUMPS 5.2.1, which is not available as a precompiled package).
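
For what it's worth, the MUMPS version a given PETSc release expects for
--download-mumps should be visible in PETSc's configure module for MUMPS, e.g.
(run from the top of the PETSc source tree; exact contents vary by release):

    $ grep -iE 'version|\.tar' config/BuildSystem/config/packages/MUMPS.py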

The packages are:

    $ dpkg -l | grep mumps
    ii  libmumps-5.1.2:amd64           5.1.2-4  amd64  Direct linear systems solver - parallel shared libraries
    ii  libmumps-dev:amd64             5.1.2-4  amd64  Direct linear systems solver - parallel development files
    ii  libmumps-ptscotch-5.1.2:amd64  5.1.2-4  amd64  Direct linear systems solver - PTScotch-version shared libraries
    ii  libmumps-ptscotch-dev:amd64    5.1.2-4  amd64  Direct linear systems solver - PTScotch-version development files
    ii  libmumps-scotch-5.1.2:amd64    5.1.2-4  amd64  Direct linear systems solver - Scotch-version shared libraries
    ii  libmumps-scotch-dev:amd64      5.1.2-4  amd64  Direct linear systems solver - Scotch-version development files
    ii  libmumps-seq-5.1.2:amd64       5.1.2-4  amd64  Direct linear systems solver - non-parallel shared libraries
    ii  libmumps-seq-dev:amd64         5.1.2-4  amd64  Direct linear systems solver - non-parallel development files
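
In case the exact paths matter: the headers and shared libraries shipped by the
-dev packages can be listed with dpkg -L, e.g. (package names as above, the grep
pattern is just for convenience):

    $ dpkg -L libmumps-dev | grep -E '\.(h|so)$'
    $ dpkg -L libmumps-ptscotch-dev | grep -E '\.(h|so)$'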

So I configure with

    $ ./configure --with-cc=mpicc --with-fc=mpif90 -with-cxx=mpicxx \
        --prefix=/home/santiago/usr/local --with-make-np=10 --with-shared-libraries \
        --with-packages-download-dir=/home/santiago/Documents/installers/petsc \
        --download-fblaslapack --with-mumps --with-scalapack --with-debugging=0 \
        COPTFLAGS='-O -O3 -march=native -mtune=native' \
        FOPTFLAGS='-O -O3 -march=native -mtune=native' \
        CXXOPTFLAGS='-O -O3 -march=native -mtune=native' \
        --force

This works fine. But

    $ make PETSC_DIR=/home/santiago/usr/local PETSC_ARCH="" test
    Running test examples to verify correct installation
    Using PETSC_DIR=/home/santiago/usr/local and PETSC_ARCH=
    C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
    Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
    See http://www.mcs.anl.gov/petsc/documentation/faq.html
    lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
    [0]PETSC ERROR: ------------------------------------------------------------------------
    [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
    [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
    [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
    [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
    [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
    [0]PETSC ERROR: to get more information on the crash.
    [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
    [0]PETSC ERROR: Signal received
    [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
    [0]PETSC ERROR: Petsc Release Version 3.11.4, Sep, 28, 2019
    [0]PETSC ERROR: ./ex19 on a  named isaiasPrecision-7820 by santiago Sat Apr 25 09:52:01 2020
    [1]PETSC ERROR: ------------------------------------------------------------------------
    [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
    [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
    [1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
    [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
    [1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
    [1]PETSC ERROR: to get more information on the crash.
    [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
    [1]PETSC ERROR: Signal received
    [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
    [1]PETSC ERROR: Petsc Release Version 3.11.4, Sep, 28, 2019
    [1]PETSC ERROR: ./ex19 on a  named isaiasPrecision-7820 by santiago Sat Apr 25 09:52:01 2020
    [1]PETSC ERROR: Configure options --with-cc=mpicc --with-fc=mpif90 -with-cxx=mpicxx --prefix=/home/santiago/usr/local --with-make-np=10 --with-shared-libraries --with-packages-download-dir=/home/santiago/Documents/installers/petsc --download-fblaslapack --with-mumps --with-scalapack --with-debugging=0 COPTFLAGS="-O -O3 -march=native -mtune=native" FOPTFLAGS="-O -O3 -march=native -mtune=native" CXXOPTFLAGS="-O -O3 -march=native -mtune=native" --force
    [1]PETSC ERROR: #1 User provided function() line 0 in  unknown file
    [0]PETSC ERROR: Configure options --with-cc=mpicc --with-fc=mpif90 -with-cxx=mpicxx --prefix=/home/santiago/usr/local --with-make-np=10 --with-shared-libraries --with-packages-download-dir=/home/santiago/Documents/installers/petsc --download-fblaslapack --with-mumps --with-scalapack --with-debugging=0 COPTFLAGS="-O -O3 -march=native -mtune=native" FOPTFLAGS="-O -O3 -march=native -mtune=native" CXXOPTFLAGS="-O -O3 -march=native -mtune=native" --force
    [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file

    --------------------------------------------------------------------------
    MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
    with errorcode 59.

    NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
    You may or may not see output from other processes, depending on
    exactly when Open MPI kills them.

    --------------------------------------------------------------------------
    [isaiasPrecision-7820:09935] 1 more process has sent help message help-mpi-api.txt / mpi-abort
    [isaiasPrecision-7820:09935] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

fails. As suggested in
https://www.mcs.anl.gov/petsc/documentation/faq.html#PetscOptionsInsertFile
(even though it is not the same error), I checked

    $ ping `hostname`

It works fine.
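
Following the valgrind suggestion in the error output above, I suppose the next
diagnostic step would be to run the failing two-process case under valgrind,
along these lines (untested sketch; run from the PETSc source tree, with the
same PETSC_DIR/PETSC_ARCH as in the test above):

    $ cd src/snes/examples/tutorials
    $ make PETSC_DIR=/home/santiago/usr/local PETSC_ARCH="" ex19
    $ mpiexec -n 2 valgrind --leak-check=full ./ex19
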
For reference, a PETSc build configured with ... --download-mumps
--download-scalapack ... works fine.

How can I compile and successfully test PETSc using the precompiled MUMPS
packages in Ubuntu?
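
In case it is relevant, my assumption is that configure may need to be pointed
explicitly at the system MUMPS, along these lines (using configure's generic
--with-PACKAGE-include/--with-PACKAGE-lib options; the library list and the
multiarch path below are my guess from the libmumps-dev package, not verified):

    # same configure line as above, but with explicit paths for the Ubuntu MUMPS
    $ ./configure --with-cc=mpicc --with-fc=mpif90 -with-cxx=mpicxx \
        --prefix=/home/santiago/usr/local --with-shared-libraries --with-debugging=0 \
        --download-fblaslapack --with-scalapack \
        --with-mumps-include=/usr/include \
        --with-mumps-lib="-L/usr/lib/x86_64-linux-gnu -ldmumps -lmumps_common -lpord"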

Thanks