[petsc-users] Configuring PETSc with OpenMPI
Sal Am
tempohoper at gmail.com
Tue Jan 8 15:04:23 CST 2019
Thank you, that worked flawlessly!
On Tue, Jan 8, 2019, 14:07 Balay, Satish <balay at mcs.anl.gov>
wrote:
> I should have clarified: when using
> --with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc, remove
> the --with-mpi-dir option.
>
> i.e., try:
>
> ./configure PETSC_ARCH=linux-cumulus-hpc
> --with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc
> --with-fc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpifort
> --with-cxx=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicxx
> --download-parmetis --download-metis --download-ptscotch
> --download-superlu_dist --download-mumps --with-scalar-type=complex
> --with-debugging=no --download-scalapack --download-superlu
> --download-fblaslapack=1 --download-cmake
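>
> (A quick sanity check before re-running configure, assuming the depot
> path above is valid: Open MPI's wrappers accept --showme, which prints
> the underlying compiler command line without compiling anything, e.g.
>
>   /usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc --showme)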
>
> Satish
>
> On Tue, 8 Jan 2019, Sal Am via petsc-users wrote:
>
> > Thanks Satish for the quick response,
> >
> > I tried both of the above: first removing the options --with-cc etc.,
> > and then explicitly specifying the path, e.g.
> > --with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc.
> > Neither worked; the error is still the same, telling me "did not
> > work". I have attached the log file.
> >
> > Thanks
> >
> > On Mon, Jan 7, 2019 at 4:39 PM Balay, Satish <balay at mcs.anl.gov> wrote:
> >
> > > Configure Options: --configModules=PETSc.Configure
> > > --optionsModule=config.compilerOptions PETSC_ARCH=linux-cumulus-hpc
> > > --with-cc=gcc --with-fc=gfortran --with-cxx=g++
> > > --with-mpi-dir=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/
> > > --download-parmetis --download-metis --download-ptscotch
> > > --download-superlu_dist --download-mumps --with-scalar-type=complex
> > > --with-debugging=no --download-scalapack --download-superlu
> > > --download-fblaslapack=1 --download-cmake
> > >
> > > '--with-cc=gcc --with-fc=gfortran --with-cxx=g++' prevents usage of
> > > mpicc etc., so remove these options when using --with-mpi-dir.
> > >
> > > Or use:
> > >
> > > --with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc etc..
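> > >
> > > (--with-mpi-dir expects the MPI installation root, under which
> > > configure looks for bin/mpicc itself, so if that option is kept it
> > > would presumably need the trailing bin/ removed, e.g.:
> > >
> > >   --with-mpi-dir=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0)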
> > >
> > > Satish
> > >
> > >
> > > On Mon, 7 Jan 2019, Sal Am via petsc-users wrote:
> > >
> > > > Added the log file.
> > > >
> > > > From OpenMPI:
> > > >
> > > > > The only special configuration that you need to build PETSc is
> > > > > to ensure that Open MPI's wrapper compilers (i.e., mpicc and
> > > > > mpif77) are in your $PATH before running the PETSc configure.py
> > > > > script.
> > > > >
> > > > > PETSc should then automatically find Open MPI's wrapper compilers
> > > > > and correctly build itself using Open MPI.
> > > > >
> > > > The OpenMPI dir is on my PATH, which contains mpicc and mpif77.
> > > >
> > > > This is on an HPC, if that matters.
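> > > >
> > > > (To confirm which wrappers the PATH actually resolves to,
> > > > something like
> > > >
> > > >   which mpicc mpif77
> > > >   mpicc --showme
> > > >
> > > > should report binaries under the openmpi-3.1.1 depot install.)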
> > > >