ML with OpenMPI

Jed Brown jed at 59A2.org
Fri Mar 21 18:01:01 CDT 2008


On Fri 2008-03-21 19:31, Lisandro Dalcin wrote:
> Mmm... I believe this is a configuration issue... if ML_MPI were
> defined, then ML_USR_COMM should be MPI_Comm. But the problem is
> perhaps on the ML side, not the PETSc side.
> 
> "ml_common.h" #defines ML_MPI if the macro HAVE_MPI is defined. In turn,
> HAVE_MPI is set in "ml_config.h", and that file is surely generated by ML's
> configure script. For some reason ML's configure failed to find MPI
> with the command-line options PETSc passes to it.  Look at the
> 'config.log' file inside the "ml-5.0" dir to find out what happened.

From the ML docs, it looks like ML's configure expects --with-mpi-compilers when
building with MPI.  I modified config/PETSc/packages/ml.py to include this
option and I'll let you know if it fixes the problem.
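For reference, a standalone invocation of ML's configure with that flag might
look like the sketch below.  The paths and extra flags are illustrative only;
the one piece taken from the thread is --with-mpi-compilers.

```shell
# Illustrative sketch: pass --with-mpi-compilers so ML's configure picks up
# the MPI wrapper compilers instead of probing (and possibly failing to find)
# MPI itself.  Other flags shown are placeholders, not verified options.
./configure --with-mpi-compilers
```

Checking config.log in the ML build directory afterwards, as suggested above,
confirms whether MPI detection actually succeeded.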

Jed