[petsc-users] Configuring PETSc with OpenMPI

Balay, Satish balay at mcs.anl.gov
Mon Jan 7 10:39:41 CST 2019


Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions PETSC_ARCH=linux-cumulus-hpc --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --with-mpi-dir=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/ --download-parmetis --download-metis --download-ptscotch --download-superlu_dist --download-mumps --with-scalar-type=complex --with-debugging=no --download-scalapack --download-superlu --download-fblaslapack=1 --download-cmake

'--with-cc=gcc --with-fc=gfortran --with-cxx=g++' prevents the use of mpicc etc., so remove these options when using --with-mpi-dir.

Or use:

--with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc etc.
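A minimal sketch of the two corrected invocations, assuming the PETSc source directory and the OpenMPI install path shown above (the exact wrapper names, e.g. mpicxx and mpif90, may differ by OpenMPI build; note that PETSc's --with-mpi-dir generally expects the MPI installation root, not its bin/ subdirectory):

```shell
# Option 1: let configure find the MPI wrapper compilers itself.
# Drop --with-cc/--with-cxx/--with-fc and point --with-mpi-dir at the
# install root (not .../bin/).
./configure PETSC_ARCH=linux-cumulus-hpc \
  --with-mpi-dir=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0 \
  --with-scalar-type=complex --with-debugging=no \
  --download-metis --download-parmetis --download-ptscotch \
  --download-superlu --download-superlu_dist --download-mumps \
  --download-scalapack --download-fblaslapack=1 --download-cmake

# Option 2: name the wrapper compilers explicitly instead of --with-mpi-dir.
./configure PETSC_ARCH=linux-cumulus-hpc \
  --with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc \
  --with-cxx=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicxx \
  --with-fc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpif90 \
  --with-scalar-type=complex --with-debugging=no \
  --download-metis --download-parmetis --download-ptscotch \
  --download-superlu --download-superlu_dist --download-mumps \
  --download-scalapack --download-fblaslapack=1 --download-cmake
```

Either way, the point is the same: do not mix plain gcc/g++/gfortran with --with-mpi-dir, or configure will build PETSc without the MPI wrappers.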

Satish


On Mon, 7 Jan 2019, Sal Am via petsc-users wrote:

> Added the log file.
> 
> From OpenMPI:
> 
> > The only special configuration that you need to build PETSc is to ensure
> > that Open MPI's wrapper compilers (i.e., mpicc and mpif77) are in your
> > $PATH before running the PETSc configure.py script.
> >
> > PETSc should then automatically find Open MPI's wrapper compilers and
> > correctly build itself using Open MPI.
> >
> The OpenMPI dir is on my PATH, which contains mpicc and mpif77.
> 
> This is on a HPC, if that matters.
> 
