[petsc-dev] Sequential external packages and MPI

Balay, Satish balay at mcs.anl.gov
Thu Aug 22 07:26:09 CDT 2019


But they pick default compilers [preferring gcc] - which might not be compatible with the MPI compilers.

This is preferred because - especially in a cross-compile environment -
these tools are run on the front-end node, while PETSc applications are
run on the back-end [compute] nodes.

Also we have --download-sowing-cc= etc. options if you want to change this default.
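
For example [the compiler path here is just an illustration - use whatever suits your site]:

$ ./configure --download-sowing --download-sowing-cc=/usr/bin/gcc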


Satish

On Thu, 22 Aug 2019, Smith, Barry F. via petsc-dev wrote:

> 
>   cmake, make, sowing, and other utilities that are not libraries linked into the overall simulation are not compiled with mpicxx etc. 
> 
> > On Aug 22, 2019, at 1:03 AM, Pierre Jolivet via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
> > 
> > 
> > 
> >> On 22 Aug 2019, at 7:42 AM, Balay, Satish <balay at mcs.anl.gov> wrote:
> >> 
> >> On Thu, 22 Aug 2019, Pierre Jolivet via petsc-dev wrote:
> >> 
> >>> Hello,
> >>> PETSc is linking “sequential” libraries with MPI libraries.
> >>> $ otool -L libmetis.dylib
> >>> 	/usr/local/opt/mpich/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.7.0)
> >>> $ otool -L libfftw3.dylib
> >>> 	/usr/local/opt/mpich/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.7.0)
> >> 
> >> This will occur if one uses MPI compilers to build PETSc.
> > 
> > Why, though?
> > If MPICXX_SHOW != “Unavailable”, is it mandatory to force CXX=MPICXX in Metis CMake?
> > Wouldn’t it be possible to just extract the compiler binary name and use that as CXX?
> > I understand you don’t want to either overcomplicate things or fix something that is not broken — for you — so I’m just making sure that it would be OK if I patch this like that locally.
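> > For instance, with MPICH on my machine the wrapper already exposes this [output abridged and installation-specific, but roughly]:
> > 
> > $ mpicxx -show
> > g++ -I/usr/local/opt/mpich/include [...] -lmpicxx -lmpi
> > 
> > where the first token (g++) is the underlying compiler that could be reused as CXX.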
> > 
> >>> Is there any way to avoid this, by using a “sequential” compiler and/or linker?
> >> 
> >> Yes - you can build these (sequential) packages/petsc with --with-mpi=0 [and without mpi compilers]
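> >> 
> >> For example [compiler names here are illustrative - use whatever matches your toolchain]:
> >> 
> >> $ ./configure --with-mpi=0 --with-cc=gcc --with-cxx=g++ --download-metis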
> >> 
> >>> I’m asking because we use PETSc libraries to compile both parallel and sequential wrappers.
> >>> Our Metis wrapper is marked as a sequential one, but since you are linking libmetis with MPI, this is problematic for some configurations.
> >> 
> >> Not sure what you mean by 'wrappers' here - esp. 'Metis wrapper'.
> >> It's just a library.
> > 
> > It’s just another dynamic library compiled on top of libmetis that is then dynamically loaded by a DSL, which may or may not be launched with MPIRUN.
> > 
> >> If you are using petsc build tools to install these packages for a
> >> different use [other than the petsc usage specified by configure] -
> >> use different petsc builds as indicated above for different packages -
> >> as needed.
> > 
> > Having to configure + build PETSc with both real and complex numbers is already long enough.
> > That would mean a 3rd build, but why not.
> > Are there any guarantees that the CXX used with --with-mpi=0 will be the same as the underlying compiler of MPICXX? (I’m thinking of an incompatible libc++ that would make it impossible to link, into the same library, Metis and a PETSc compiled later with --with-mpi=1.)
> 
>   This can be tricky. If you are using --download-mpich then you know they are compatible, since the MPI is built with the sequential compilers; but if you are only given mpicc you have to be careful how you pull out the parts. You can start with -show, which works for some of them. OpenMPI has several options to get info about exactly what it uses for flags etc.
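> 
>   For example [output and supported flags depend on the MPI installation]:
> 
>   $ mpicc -show             # MPICH: prints the full underlying compile command
>   $ mpicc --showme:command  # Open MPI: prints just the underlying compiler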
> 
>   Instead of hacking PETSc, I would recommend having simple scripts that use PETSc configure in the way I described in my previous email.
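> 
>   A sketch of such a script [untested; the arch names and package list are placeholders]:
> 
>   #!/bin/sh
>   # MPI build - for the parallel wrappers
>   ./configure PETSC_ARCH=arch-mpi --download-metis && make PETSC_ARCH=arch-mpi all
>   # sequential build - for the sequential wrappers
>   ./configure PETSC_ARCH=arch-seq --with-mpi=0 --with-cc=gcc --with-cxx=g++ --download-metis && make PETSC_ARCH=arch-seq all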
> 
>    Barry
> 
> > 
> > Thanks,
> > Pierre
> > 
> >> BTW: the current petsc configure/builder builds only parallel fftw [it does not support building sequential fftw, but I guess this could be added].
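> >> 
> >> [In the meantime, fftw's own autoconf build is sequential by default - MPI support is only added with --enable-mpi - so a manual build outside of petsc would look something like:
> >> 
> >> $ ./configure --prefix=$HOME/soft/fftw-seq && make && make install
> >> ]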
> >> 
> >> Satish
> > 
> 
> 

