<div dir="auto">Thank you that worked flawlessly!</div><br><div class="gmail_quote"><div dir="ltr">Am Di., 8. Jan. 2019, 14:07 hat Balay, Satish <<a href="mailto:balay@mcs.anl.gov">balay@mcs.anl.gov</a>> geschrieben:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Should have clarified: when using --with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc - removing -with-mpi-dir option.<br>

i.e., try:

./configure PETSC_ARCH=linux-cumulus-hpc --with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc --with-fc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpifort --with-cxx=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicxx --download-parmetis --download-metis --download-ptscotch --download-superlu_dist --download-mumps --with-scalar-type=complex --with-debugging=no --download-scalapack --download-superlu --download-fblaslapack=1 --download-cmake
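
If configure still cannot use them, a minimal sanity check (just a suggestion, assuming the same depot install as above) is to ask the Open MPI wrappers what they actually invoke:

/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc --showme    # prints the underlying compile command the wrapper would run
/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpifort --showme  # same check for the Fortran wrapper

Each should print a command line using the gcc-7.3.0 compilers; if it does not, the problem is with the wrapper itself rather than the PETSc options.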

Satish

On Tue, 8 Jan 2019, Sal Am via petsc-users wrote:

> Thanks Satish for the quick response,
>
> I tried both of the above: first removing the options --with-cc etc., and
> then explicitly specifying the path, e.g.
> --with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc etc.
> Neither worked; the error is still the same, telling me "did not work". I
> have attached the log file.
>
> Thanks
>
> On Mon, Jan 7, 2019 at 4:39 PM Balay, Satish <balay@mcs.anl.gov> wrote:
>
> > Configure Options: --configModules=PETSc.Configure
> > --optionsModule=config.compilerOptions PETSC_ARCH=linux-cumulus-hpc
> > --with-cc=gcc --with-fc=gfortran --with-cxx=g++
> > --with-mpi-dir=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/
> > --download-parmetis --download-metis --download-ptscotch
> > --download-superlu_dist --download-mumps --with-scalar-type=complex
> > --with-debugging=no --download-scalapack --download-superlu
> > --download-fblaslapack=1 --download-cmake
> >
> > '--with-cc=gcc --with-fc=gfortran --with-cxx=g++' prevents usage of mpicc
> > etc., so remove these options when using --with-mpi-dir.
> >
> > Or use:
> >
> > --with-cc=/usr/local/depot/openmpi-3.1.1-gcc-7.3.0/bin/mpicc etc.
> >
> > Satish
> >
> >
> > On Mon, 7 Jan 2019, Sal Am via petsc-users wrote:
> >
> > > Added the log file.
> > >
> > > From OpenMPI:
> > >
> > > > The only special configuration that you need to build PETSc is to ensure
> > > > that Open MPI's wrapper compilers (i.e., mpicc and mpif77) are in your
> > > > $PATH before running the PETSc configure.py script.
> > > >
> > > > PETSc should then automatically find Open MPI's wrapper compilers and
> > > > correctly build itself using Open MPI.
> > > >
> > > The OpenMPI dir, which contains mpicc and mpif77, is on my PATH.
> > >
> > > This is on an HPC system, if that matters.
> > >
> >
> >
>