ML with OpenMPI
Lisandro Dalcin
dalcinl at gmail.com
Sat Mar 22 08:19:31 CDT 2008
Give it a try. When using MPICH2, PETSc just passes
"--with-mpi=PATH_TO_MPI" and ML gets it right. Perhaps ML has some
trouble with OpenMPI; I've never tried it. If you built OpenMPI yourself
with shared libraries, do not forget to set LD_LIBRARY_PATH to point
to the directory with the OpenMPI libs. Otherwise, some of ML's configure
tests could fail, and MPI is then assumed to be absent.
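
To make that concrete, here is a minimal sketch (in Python, since
PETSc's configure is Python-driven) of prepending the OpenMPI lib
directory to LD_LIBRARY_PATH before configure runs; the
'/opt/openmpi/lib' path is just a placeholder for wherever you
installed OpenMPI:

    import os

    # Placeholder for your OpenMPI install's lib directory.
    openmpi_lib = '/opt/openmpi/lib'

    # Prepend it to LD_LIBRARY_PATH so the test executables that ML's
    # configure builds and runs can find the shared OpenMPI libraries.
    current = os.environ.get('LD_LIBRARY_PATH', '')
    if openmpi_lib not in current.split(':'):
        os.environ['LD_LIBRARY_PATH'] = ':'.join(
            p for p in (openmpi_lib, current) if p)

(The same effect can be had by exporting LD_LIBRARY_PATH in your
shell before running configure.)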
On 3/21/08, Jed Brown <jed at 59a2.org> wrote:
> On Fri 2008-03-21 19:31, Lisandro Dalcin wrote:
> > Mmm... I believe this is a configuration issue... if ML_MPI were
> > defined, then ML_USR_COMM should be MPI_Comm. But the problem is
> > perhaps on the ML side, not the PETSc side.
> >
> > "ml_common.h" #define ML_MPI if macro HAVE_MPI is defined. In turn
> > HAVE_MPI is at "ml_config.h", and that file is surelly generated by ML
> > configure script. For some reason ML's configure failed to found MPI
> > with the command line stuff PETSc pass to it. Look at the
> > 'config.log' file inside the "ml-5.0" dir to find what happened.
>
>
> From the ML docs, it looks like ML's configure expects --with-mpi-compilers if it
> is building with MPI. I modified config/PETSc/packages/ml.py to include this
> option and I'll let you know if it fixes the problem.
>
>
> Jed
>
>
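
For reference, a rough sketch of the kind of change Jed describes
above: appending '--with-mpi-compilers' to the argument list that
PETSc hands to ML's configure. The function and variable names below
are illustrative only; the real code lives in
config/PETSc/packages/ml.py and is structured differently:

    def ml_configure_args(install_dir, mpi_dir):
        # Arguments PETSc already passes to ML's configure.
        args = ['--prefix=' + install_dir,
                '--with-mpi=' + mpi_dir]
        # Per the ML docs, ask configure to use the MPI compiler
        # wrappers (mpicc/mpicxx) instead of probing for the MPI
        # libraries on its own.
        args.append('--with-mpi-compilers')
        return args

    print(' '.join(ml_configure_args('externalpackages/ml-5.0',
                                     '/opt/openmpi')))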
--
Lisandro Dalcín
---------------
Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
PTLC - Güemes 3450, (3000) Santa Fe, Argentina
Tel/Fax: +54-(0)342-451.1594