[petsc-dev] Sequential external packages and MPI

Smith, Barry F. bsmith at mcs.anl.gov
Thu Aug 22 01:12:47 CDT 2019



> On Aug 22, 2019, at 12:31 AM, Pierre Jolivet via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
> 
> Hello,
> PETSc is linking “sequential” libraries with MPI libraries.
> $ otool -L libmetis.dylib
> 	/usr/local/opt/mpich/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.7.0)
> $ otool -L libfftw3.dylib
> 	/usr/local/opt/mpich/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.7.0)

   Yes this is because we use one set of compilers to build everything in a single configure.

> Is there any way to avoid this, by using a “sequential” compiler and/or linker?

   This is not impossible, but it would make the already convoluted logic of BuildSystem even worse (for example, we would need to run tests on the sequential compilers to make sure they work, and make sure they are compatible with the MPI compilers...).

> I’m asking because we use PETSc libraries to compile both parallel and sequential wrappers.

  What are "wrappers"? 

> Our Metis wrapper is marked as a sequential one, but since you are linking libmetis with MPI, this is problematic for some configurations.

  What is your workflow? Are you using --prefix to compile particular combinations of external packages and put them in the prefix directory? Then you can make PETSc builds and just use --with-xxx-dir=/prefixlocation to pick them up when configuring PETSc.

  With this model you can, for example, use the sequential compilers for the sequential libraries and the MPI compilers for the MPI libraries:

  ./configure --download-metis --with-mpi=0  --prefix=/home/bsmith/myprebuilts

   ./configure --download-parmetis --with-metis-dir=/home/bsmith/myprebuilts --prefix=/home/bsmith/myprebuilts

   Then, for a sequential PETSc build that uses metis:

  ./configure --with-metis-dir=/home/bsmith/myprebuilts --with-mpi=0 ...

  And for a parallel PETSc build:

  ./configure --with-metis-dir=/home/bsmith/myprebuilts --with-parmetis-dir=/home/bsmith/myprebuilts
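   After the first (sequential) build you can verify that libmetis no longer pulls in MPI, the same way the problem was spotted above. A minimal sketch, assuming the hypothetical prefix location from the commands (otool -L on macOS as in your report; ldd is the Linux equivalent):

```shell
#!/bin/sh
# check_mpi_free: scan a shared library's dynamic dependencies for
# "libmpi". The library path below is hypothetical, matching the
# example --prefix location used in the configure commands.
check_mpi_free() {
  # $1: path to the shared library to inspect
  if otool -L "$1" 2>/dev/null | grep -q libmpi; then
    echo "$1: linked against MPI"
    return 1
  fi
  echo "$1: MPI-free"
}

check_mpi_free /home/bsmith/myprebuilts/lib/libmetis.dylib
```

   If the check still reports an MPI link, the sequential build picked up the MPI compiler wrappers and the two configure runs should be re-checked.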

  Barry


> 
> Thanks,
> Pierre
