[petsc-dev] PETSc configuration with Cray MPI
Matthew Knepley
knepley at gmail.com
Thu Mar 24 21:50:36 CDT 2016
You might also look at
https://bitbucket.org/petsc/petsc/src/01a9465dffce7b62bec37ab3fd4574a4a21c26c9/config/examples/arch-cray-xt5-opt.py?at=master&fileviewer=file-view-default
https://bitbucket.org/petsc/petsc/src/01a9465dffce7b62bec37ab3fd4574a4a21c26c9/config/examples/arch-cray-xt6-pkgs-opt.py?at=master&fileviewer=file-view-default
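Those example scripts rely on the Cray compiler wrappers (cc, CC, ftn), which
already link against Cray MPICH, so configure never needs --with-mpi-dir. A
minimal sketch in that style (the option set below is illustrative, drawn from
the linked examples, not a tested recipe for your machine):

    ./configure --with-cc=cc --with-cxx=CC --with-fc=ftn \
      --with-batch --with-shared-libraries=0 \
      --with-clib-autodetect=0 --with-cxxlib-autodetect=0 \
      --with-fortranlib-autodetect=0

The wrappers pick up whichever PrgEnv compiler module is loaded and add the MPI
include and link flags themselves, which is why those examples never pass
mpicc/mpicxx/mpif90.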
Matt
On Thu, Mar 24, 2016 at 9:26 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> Send configure.log to petsc-maint at mcs.anl.gov; this mailing list
> cannot handle such large files.
>
>
> > On Mar 24, 2016, at 8:30 PM, Hector E Barrios Molano <hectorb at utexas.edu> wrote:
> >
> > Hi Experts!
> >
> > I'm trying to configure PETSc on a system with Cray MPICH and got the
> > following configure message:
> >
> >
> > ===============================================================================
> >              Configuring PETSc to compile on your system
> > ===============================================================================
> > TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146)
> > *******************************************************************************
> >   UNABLE to CONFIGURE with GIVEN OPTIONS  (see configure.log for details):
> > -------------------------------------------------------------------------------
> > Unable to find mpi in default locations!
> > Perhaps you can specify with --with-mpi-dir=<directory>
> > If you do not want MPI, then give --with-mpi=0
> > You might also consider using --download-mpich instead
> > *******************************************************************************
> >
> > Initially I was using the option --with-mpi-dir, but the configure script
> > reported that it did not work.
> >
> >
> > This is the configure command that I am using:
> >
> > ./configure
> >   --prefix=/work/03341/hector/lonestar/installed/petsc3.6-intel-debug
> >   --PETSC_DIR=/work/03341/hector/lonestar/sources/petsc-master
> >   --PETSC_ARCH=haswell-debug --with-batch
> >   --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> >   --with-parmetis-dir=/work/03341/hector/lonestar/installed/parmetis/
> >   --with-metis-dir=/work/03341/hector/lonestar/installed/parmetis/
> >   --download-ptscotch --download-hypre
> >   --with-blas-lapack-lib=[/work/03341/hector/lonestar/mkl_static/libmkl_intel_lp64.a,/work/03341/hector/lonestar/mkl_static/libmkl_core.a,/work/03341/hector/lonestar/mkl_static/libmkl_intel_thread.a]
> >   --with-scalapack-include=/opt/apps/intel/16.0.1.150/compilers_and_libraries_2016.1.150/linux/mkl/include/
> >   --with-scalapack-lib=[/work/03341/hector/lonestar/mkl_static/libmkl_scalapack_lp64.a,/work/03341/hector/lonestar/mkl_static/libmkl_blacs_intelmpi_lp64.a]
> >   --with-valgrind=1 --with-valgrind-dir=/work/03341/hector/lonestar/installed/
> >   --with-shared-libraries=0 --with-fortran-interfaces=1
> >   --FC_LINKER_FLAGS="-openmp -openmp-link static"
> >   --FFLAGS="-openmp -openmp-link static"
> >   --LIBS="-Wl,--start-group /work/03341/hector/lonestar/mkl_static/libmkl_intel_lp64.a /work/03341/hector/lonestar/mkl_static/libmkl_core.a /work/03341/hector/lonestar/mkl_static/libmkl_intel_thread.a -Wl,--end-group -ldl -lpthread -lm"
> >
> >
> > Do you have any idea on how to solve the problem?
> >
> > Thanks for your help!
> >
> > Best regards,
> >
> > Hector
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener