[petsc-users] configure cannot find a c preprocessor
Santiago Andres Triana
repepo at gmail.com
Thu Dec 21 02:28:04 CST 2017
There is no mpich install on this cluster... I will talk to the sysadmins
to see if this is feasible...
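In the meantime, one way to try Barry's suggestion of filtering the message would be a small grep over mpicc's stderr. A sketch (the warning text matched here is taken from configure.log; the wrapper idea itself is an assumption on my part, not something I have tested on this cluster):

```shell
# The stderr lines from configure.log that confused the preprocessor test,
# plus one real error to show it would survive the filter:
noisy='gcc: warning: /usr/lib64/libcpuset.so.1: linker input file unused because linking not done
gcc: warning: /usr/lib64/libbitmask.so.1: linker input file unused because linking not done
gcc: error: a real diagnostic would survive'

# Drop only the spurious warnings; anything else passes through:
printf '%s\n' "$noisy" | grep -v 'linker input file unused'
# prints: gcc: error: a real diagnostic would survive
```

A wrapper script could apply the same filter to the real compiler's stderr, e.g. `{ mpicc "$@" 2>&1 1>&3 | grep -v 'linker input file unused' >&2; } 3>&1`, and be handed to configure via --with-cc (hypothetical; the fd shuffling keeps stdout intact but does not preserve mpicc's exit status).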
In other news, configure succeeded after turning off C++; however, make failed (logs attached):
...
------------------------------------------
Using mpiexec: /opt/sgi/mpt/mpt-2.12/bin/mpirun
==========================================
Building PETSc using GNU Make with 32 build threads
==========================================
gmake[2]: Entering directory `/space/hpc-home/trianas/petsc-3.8.3'
Use "/usr/bin/gmake V=1" to see verbose compile lines, "/usr/bin/gmake V=0" to suppress.
CLINKER /space/hpc-home/trianas/petsc-3.8.3/arch-linux2-c-debug/lib/libpetsc.so.3.8.3
/sw/sdev/binutils/x86_64/2.22/bin/ld: cannot find -lcpuset.so
/sw/sdev/binutils/x86_64/2.22/bin/ld: cannot find -lbitmask.so
collect2: error: ld returned 1 exit status
gmake[2]: *** [/space/hpc-home/trianas/petsc-3.8.3/arch-linux2-c-debug/lib/libpetsc.so.3.8.3] Error 1
gmake[2]: Leaving directory `/space/hpc-home/trianas/petsc-3.8.3'
gmake[1]: *** [gnumake] Error 2
gmake[1]: Leaving directory `/space/hpc-home/trianas/petsc-3.8.3'
**************************ERROR*************************************
Error during compile, check arch-linux2-c-debug/lib/petsc/conf/make.log
Send it and arch-linux2-c-debug/lib/petsc/conf/configure.log to petsc-maint at mcs.anl.gov
********************************************************************
There seems to be a problem with the libcpuset.so and libbitmask.so libraries: the link line asks for -lcpuset.so and -lbitmask.so, which do not exist on this system.
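For what it's worth, ld turns `-lfoo` into a search for libfoo.so or libfoo.a, so a request for `-lcpuset.so` (i.e. libcpuset.so.so) can never succeed; it looks like the .so.1 paths got mangled into bogus `-l` flags somewhere. If the sysadmins cannot install the unversioned development symlinks, one possible workaround is to create them in a private directory (just a sketch; the directory name is made up, and I'm only assuming the versioned libraries are in /usr/lib64 as shown by mpicc -show):

```shell
# Create unversioned symlinks so plain -lcpuset / -lbitmask can resolve.
# Note: ln -sf succeeds even if the targets are absent; linking would
# then still fail, so check that the .so.1 files really exist first.
libdir="$HOME/petsc-compat-libs"   # hypothetical scratch location
mkdir -p "$libdir"
ln -sf /usr/lib64/libcpuset.so.1  "$libdir/libcpuset.so"
ln -sf /usr/lib64/libbitmask.so.1 "$libdir/libbitmask.so"
```

after which one could try reconfiguring with LIBS="-L$HOME/petsc-compat-libs -lcpuset -lbitmask" instead of the full .so.1 paths, so no mangled -l flags are generated.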
On Thu, Dec 21, 2017 at 5:10 AM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
>
>
> > On Dec 20, 2017, at 5:52 PM, Matthew Knepley <knepley at gmail.com> wrote:
> >
> > On Wed, Dec 20, 2017 at 6:31 PM, Santiago Andres Triana <repepo at gmail.com> wrote:
> > I got a different error now... hope it's a good sign!
> >
> > hpca-login:~/petsc-3.8.3> ./configure --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --with-mpi-include=/opt/sgi/mpt/mpt-2.12/include --with-mpi-lib="-L/opt/sgi/mpt/mpt-2.12/lib -lmpi -lpthread" LIBS="/usr/lib64/libcpuset.so.1 /usr/lib64/libbitmask.so.1"
> > ===============================================================================
> > Configuring PETSc to compile on your system
> > ===============================================================================
> > TESTING: CxxMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:351)
> > *******************************************************************************
> > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> > -------------------------------------------------------------------------------
> > C++ error! MPI_Finalize() could not be located!
> > *******************************************************************************
> >
> > It looks like there is crazy C++ stuff in SGI MPT. I can see two choices:
> >
> > a) Turn off C++: --with-cxx=0
> >
> > b) Find out what crazy C++ library MPT has and stick it in --with-mpi-lib
>
> 3) Filter the error message as previously discussed (this time for C++); then one does not have to "find out what crazy..." since the mpicxx already knows it.
>
> Barry
>
> >
> > No amount of MPI optimization is worth this pain. Does your machine have an MPICH install?
> >
> > Thanks,
> >
> > Matt
> >
> > On Thu, Dec 21, 2017 at 12:21 AM, Satish Balay <balay at mcs.anl.gov> wrote:
> > Hm configure is misbehaving with /usr/lib64/libcpuset.so.1 notation. Try:
> >
> > ./configure --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --with-mpi-include=/opt/sgi/mpt/mpt-2.12/include --with-mpi-lib="-L/opt/sgi/mpt/mpt-2.12/lib -lmpi -lpthread" LIBS="/usr/lib64/libcpuset.so.1 /usr/lib64/libbitmask.so.1"
> >
> > Satish
> >
> >
> > On Wed, 20 Dec 2017, Santiago Andres Triana wrote:
> >
> > > Thanks Satish,
> > >
> > > unfortunately that did not work; configure.log attached. Here's the output:
> > >
> > > hpca-login:~/petsc-3.8.3> ./configure --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --with-mpi-include=/opt/sgi/mpt/mpt-2.12/include --with-mpi-lib="-L/opt/sgi/mpt/mpt-2.12/lib -lmpi -lpthread /usr/lib64/libcpuset.so.1 /usr/lib64/libbitmask.so.1"
> > > ===============================================================================
> > > Configuring PETSc to compile on your system
> > > ===============================================================================
> > > TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:158)
> > >
> > > *******************************************************************************
> > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> > > -------------------------------------------------------------------------------
> > > --with-mpi-lib=['-L/opt/sgi/mpt/mpt-2.12/lib', '-lmpi', '-lpthread', '/usr/lib64/libcpuset.so.1', '/usr/lib64/libbitmask.so.1'] and --with-mpi-include=['/opt/sgi/mpt/mpt-2.12/include'] did not work
> > > *******************************************************************************
> > >
> > > On Thu, Dec 21, 2017 at 12:07 AM, Satish Balay <balay at mcs.anl.gov> wrote:
> > >
> > > > It's a strange compiler.
> > > >
> > > > You can try:
> > > >
> > > > ./configure --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --with-mpi-include=/opt/sgi/mpt/mpt-2.12/include --with-mpi-lib="-L/opt/sgi/mpt/mpt-2.12/lib -lmpi -lpthread /usr/lib64/libcpuset.so.1 /usr/lib64/libbitmask.so.1"
> > > >
> > > > Satish
> > > >
> > > > On Wed, 20 Dec 2017, Santiago Andres Triana wrote:
> > > >
> > > > > This is what I get:
> > > > >
> > > > > hpca-login:~> mpicc -show
> > > > > gcc -I/opt/sgi/mpt/mpt-2.12/include -L/opt/sgi/mpt/mpt-2.12/lib -lmpi -lpthread /usr/lib64/libcpuset.so.1 /usr/lib64/libbitmask.so.1
> > > > >
> > > > > On Wed, Dec 20, 2017 at 11:59 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> > > > >
> > > > > > >>>
> > > > > > Executing: mpicc -E -I/dev/shm/pbs.3111462.hpc-pbs/petsc-fdYfuH/config.setCompilers /dev/shm/pbs.3111462.hpc-pbs/petsc-fdYfuH/config.setCompilers/conftest.c
> > > > > > stderr:
> > > > > > gcc: warning: /usr/lib64/libcpuset.so.1: linker input file unused because linking not done
> > > > > > gcc: warning: /usr/lib64/libbitmask.so.1: linker input file unused because linking not done
> > > > > > <<<<
> > > > > >
> > > > > > Looks like your mpicc is printing this verbose thing on stderr [why is it doing a link check during preprocessing?], thus confusing PETSc configure.
> > > > > >
> > > > > > Workaround is to fix this compiler not to print such messages, or use different compilers.
> > > > > >
> > > > > > What do you have for:
> > > > > >
> > > > > > mpicc -show
> > > > > >
> > > > > >
> > > > > > Satish
> > > > > >
> > > > > > On Wed, 20 Dec 2017, Santiago Andres Triana wrote:
> > > > > >
> > > > > > > Dear petsc-users,
> > > > > > >
> > > > > > > I'm trying to install PETSc on a cluster using SGI's MPT. The mpicc compiler is in the search path. The configure command is:
> > > > > > >
> > > > > > > ./configure --with-scalar-type=complex --with-mumps=1 --download-mumps --download-parmetis --download-metis --download-scalapack
> > > > > > >
> > > > > > > However, this leads to an error (configure.log attached):
> > > > > > >
> > > > > > > ===============================================================================
> > > > > > > Configuring PETSc to compile on your system
> > > > > > > ===============================================================================
> > > > > > > TESTING: checkCPreprocessor from config.setCompilers(config/BuildSystem/config/setCompilers.py:599)
> > > > > > >
> > > > > > > *******************************************************************************
> > > > > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> > > > > > > -------------------------------------------------------------------------------
> > > > > > > Cannot find a C preprocessor
> > > > > > > *******************************************************************************
> > > > > > >
> > > > > > > The configure.log says something about cpp32; here's the excerpt:
> > > > > > >
> > > > > > > Possible ERROR while running preprocessor: exit code 256
> > > > > > > stderr:
> > > > > > > gcc: error: cpp32: No such file or directory
> > > > > > >
> > > > > > >
> > > > > > > Any ideas of what is going wrong? Any help or comments are highly appreciated. Thanks in advance!
> > > > > > >
> > > > > > > Andres
> > > > > > >
> > > > > >
> > > > > >
> > > > >
> > > >
> > > >
> > >
> >
> >
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/
>
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: make.log
Type: application/octet-stream
Size: 16026 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20171221/7d339e83/attachment-0002.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure.log
Type: application/octet-stream
Size: 5478794 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20171221/7d339e83/attachment-0003.obj>