[petsc-users] configure fails with batch+scalapack
Smith, Barry F.
bsmith at mcs.anl.gov
Sun Dec 17 16:06:22 CST 2017
It helps if we have configure.log.
But if the non-batch run already installed ScaLAPACK, why not just configure a second time with the batch option? It should automatically use the ScaLAPACK that was already successfully installed. Send configure.log if it fails.
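A minimal sketch of that two-step workflow, reusing the options from Andres' original command (the MKL path is site-specific and assumed unchanged):

```shell
# Step 1: configure WITHOUT --with-batch on a node where MPI test programs
# can run; this downloads and builds ScaLAPACK, MUMPS, METIS, and ParMETIS.
./configure --known-mpi-shared-libraries=1 --with-scalar-type=complex \
    --download-mumps --download-parmetis --download-metis \
    --with-blaslapack-dir=/sw/sdev/intel/psxe2015u3/composer_xe_2015.3.187/mkl \
    --download-scalapack

# Step 2: re-run the same command WITH --with-batch; configure should find
# the ScaLAPACK already installed in the PETSc arch directory instead of
# trying (and failing) to test for it under the batch system.
./configure --known-mpi-shared-libraries=1 --with-scalar-type=complex \
    --download-mumps --download-parmetis --download-metis \
    --with-blaslapack-dir=/sw/sdev/intel/psxe2015u3/composer_xe_2015.3.187/mkl \
    --download-scalapack --with-batch
```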
Barry
> On Dec 17, 2017, at 2:29 PM, Santiago Andres Triana <repepo at gmail.com> wrote:
>
> Dear petsc-users,
>
> I'm trying to install petsc in a cluster that uses a job manager. This is the configure command I use:
>
> ./configure --known-mpi-shared-libraries=1 --with-scalar-type=complex --with-mumps=1 --download-mumps --download-parmetis --with-blaslapack-dir=/sw/sdev/intel/psxe2015u3/composer_xe_2015.3.187/mkl --download-metis --with-scalapack=1 --download-scalapack --with-batch
>
> This fails when including the option --with-batch together with --download-scalapack:
>
> ===============================================================================
> Configuring PETSc to compile on your system
> ===============================================================================
> TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:158) *******************************************************************************
> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
> -------------------------------------------------------------------------------
> Unable to find scalapack in default locations!
> Perhaps you can specify with --with-scalapack-dir=<directory>
> If you do not want scalapack, then give --with-scalapack=0
> You might also consider using --download-scalapack instead
> *******************************************************************************
>
>
> However, if I omit the --with-batch option, the configure script succeeds (it downloads and compiles scalapack; the install fails later, at the make stage, because of the job manager). Any help or suggestion is highly appreciated. Thanks in advance!
>
> Andres