[petsc-users] Configuration process of Petsc hanging
Stefano Zampini
stefano.zampini at gmail.com
Fri May 31 14:22:32 CDT 2019
It should be --with-batch=1
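For example, the retry could look like this (a sketch only; "<existing options>" stands for the configure options quoted further down in this thread, and only the spelling of the batch option differs from the earlier suggestion):

    ./configure <existing options> --with-batch=1 --known-64-bit-blas-indices=0 --known-mpi-shared-libraries=0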
> On May 31, 2019, at 10:21 PM, Ma, Xiao via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
> Hi Satish,
>
> I have added these configure options
> --batch=1 --known-64-bit-blas-indices=0 -known-mpi-shared-libraries=0
>
> It is still hanging
>
> Best,
> Xiao
> From: Balay, Satish <balay at mcs.anl.gov>
> Sent: Friday, May 31, 2019 12:35
> To: Ma, Xiao
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Configuration process of Petsc hanging
>
> PETSc configure is attempting to run some MPI binaries - and that is hanging with this MPI.
>
> You can retry with the options:
>
> --batch=1 --known-64-bit-blas-indices=0 -known-mpi-shared-libraries=0
>
> [and follow instructions provided by configure]
>
> Satish
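For reference, a rough sketch of the batch-mode workflow that PETSc configure usually walks you through once --with-batch=1 is in effect (the file names below are illustrative, they depend on PETSC_ARCH, and "srun" is only an assumption; configure prints the exact commands to run):

    # 1) configure stops after generating a small test executable in PETSC_DIR,
    #    e.g. ./conftest-arch-pylith
    # 2) run that executable on one compute node via the batch system (not the
    #    login node), using whatever launcher the site provides:
    srun -n 1 ./conftest-arch-pylith
    # 3) this writes a reconfigure script, e.g. reconfigure-arch-pylith.py;
    #    run it from the login node to finish the configuration:
    ./reconfigure-arch-pylith.py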
>
>
>
> On Fri, 31 May 2019, Ma, Xiao via petsc-users wrote:
>
> > Hi,
> >
> > I am trying to install PyLith, an earthquake simulator that uses the PETSc library. I am building it on the PSC Bridges cluster, and during the PETSc build step the configuration hangs at
> >
> > TESTING: configureMPITypes from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:247)
> >
> >
> > I am not sure if this has to do with the configuration of the MPI installation I am using.
> >
> > Any help would be deeply appreciated.
> >
> > I am attaching the configure options here:
> >
> >
> > Saving to: ‘petsc-pylith-2.2.1.tgz’
> >
> > 100%[===========================================================>] 10,415,016 37.3MB/s in 0.3s
> >
> > 2019-05-31 14:03:13 (37.3 MB/s) - ‘petsc-pylith-2.2.1.tgz’ saved [10415016/10415016]
> >
> > FINISHED --2019-05-31 14:03:13--
> > Total wall clock time: 1.1s
> > Downloaded: 1 files, 9.9M in 0.3s (37.3 MB/s)
> > /usr/bin/tar -zxf petsc-pylith-2.2.1.tgz
> > cd petsc-pylith && \
> > ./configure --prefix=/home/xm12345/pylith \
> > --with-c2html=0 --with-x=0 \
> > --with-clanguage=C \
> > --with-mpicompilers=1 \
> > --with-shared-libraries=1 --with-64-bit-points=1 --with-large-file-io=1 \
> > --download-chaco=1 --download-ml=1 --download-f2cblaslapack=1 --with-hdf5=1 --with-debugging=0 --with-fc=0 CPPFLAGS="-I/home/xm12345/pylith/include -I/home/xm12345/pylith/include " LDFLAGS="-L/home/xm12345/pylith/lib -L/home/xm12345/pylith/lib64 -L/home/xm12345/pylith/lib -L/home/xm12345/pylith/lib64 " CFLAGS="-g -O2" CXXFLAGS="-g -O2 -DMPICH_IGNORE_CXX_SEEK" FCFLAGS="" \
> > PETSC_DIR=/home/xm12345/build/pylith/petsc-pylith PETSC_ARCH=arch-pylith && \
> > make -f gmakefile -j2 PETSC_DIR=/home/xm12345/build/pylith/petsc-pylith PETSC_ARCH=arch-pylith && \
> > make PETSC_DIR=/home/xm12345/build/pylith/petsc-pylith install && \
> > make PETSC_DIR=/home/xm12345/build/pylith/petsc-pylith test && \
> > touch ../installed_petsc
> > ===============================================================================
> > Configuring PETSc to compile on your system
> > ===============================================================================
> > ===============================================================================
> > ***** WARNING: MAKEFLAGS (set to w) found in environment variables - ignoring
> > use ./configure MAKEFLAGS=$MAKEFLAGS if you really want to use that value ******
> > ===============================================================================
> > ===============================================================================
> > WARNING! Compiling PETSc with no debugging, this should only be done for
> > timing and production runs. All development should be done when
> > configured using --with-debugging=1
> > ===============================================================================
> > TESTING: configureMPITypes from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:247)
> >
> >
> >