[petsc-dev] Sequential external packages and MPI
Pierre Jolivet
pierre.jolivet at enseeiht.fr
Thu Aug 22 01:03:08 CDT 2019
> On 22 Aug 2019, at 7:42 AM, Balay, Satish <balay at mcs.anl.gov> wrote:
>
> On Thu, 22 Aug 2019, Pierre Jolivet via petsc-dev wrote:
>
>> Hello,
>> PETSc is linking “sequential” libraries against MPI libraries:
>> $ otool -L libmetis.dylib
>> /usr/local/opt/mpich/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.7.0)
>> $ otool -L libfftw3.dylib
>> /usr/local/opt/mpich/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.7.0)
>
> This will occur if one uses MPI compilers to build PETSc.
Why, though?
If MPICXX_SHOW != “Unavailable”, is it mandatory to force CXX=MPICXX in the Metis CMake build?
Wouldn’t it be possible to just extract the underlying compiler binary name and use that as CXX?
I understand you don’t want to overcomplicate things or fix something that is not broken (for you), so I’m just making sure it would be OK if I patch this locally along those lines.
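For what it’s worth, a minimal sketch of the extraction I have in mind, assuming an MPICH-style wrapper that understands -show (Open MPI uses --showme:command instead); the output is abbreviated and the cmake line is hypothetical:

$ mpicxx -show                             # prints the underlying compile/link line
clang++ -I/usr/local/opt/mpich/include ... -lmpi
$ CXX=$(mpicxx -show | awk '{print $1}')   # first token = underlying compiler
$ cmake -DCMAKE_CXX_COMPILER="$CXX" /path/to/metis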
>> Is there any way to avoid this, by using a “sequential” compiler and/or linker?
>
> Yes - you can build these (sequential) packages/petsc with --with-mpi=0 [and without MPI compilers]
>
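(For the record, a sketch of what such a build could look like; the compiler names are placeholders, and --download-metis is the usual way to have configure build Metis itself:)

$ ./configure PETSC_ARCH=arch-seq --with-mpi=0 \
      --with-cc=gcc --with-cxx=g++ --with-fc=gfortran \
      --download-metis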
>> I’m asking because we use PETSc libraries to compile both parallel and sequential wrappers.
>> Our Metis wrapper is marked as a sequential one, but since you are linking libmetis with MPI, this is problematic for some configurations.
>
> Not sure what you mean by 'wrappers' here - esp. 'Metis wrapper'. It's
> just a library.
It’s just another dynamic library, compiled on top of libmetis, that is then dynamically loaded by a DSL, which may or may not be launched with mpirun.
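For concreteness, a rough sketch of the setup (file names hypothetical):

$ cc -shared -fPIC -o metis_wrapper.dylib wrapper.c -lmetis
# libmetis records libmpi.12.dylib, so when the DSL dlopen()s
# metis_wrapper.dylib, MPI is pulled into the process even though it
# was started without mpirun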
> If you are using petsc build tools to install these packages for a
> different use [other than the petsc usage specified by configure] -
> use different petsc builds as indicated above for different packages -
> as needed.
Having to configure + build PETSc with both real and complex numbers is already long enough.
That would mean a third build, but why not.
Are there any guarantees that the CXX used with --with-mpi=0 will be the same as the compiler underlying MPICXX? (I’m thinking of incompatible C++ runtimes, e.g. libc++ vs. libstdc++, that would make it impossible to link, into the same library, a Metis built that way and a PETSc later compiled with --with-mpi=1.)
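To make the three-build setup concrete, something along these lines (arch names are placeholders; real scalars are the configure default), plus a quick otool check that the sequential and parallel builds agree on the C++ runtime:

$ ./configure PETSC_ARCH=arch-real
$ ./configure PETSC_ARCH=arch-complex --with-scalar-type=complex
$ ./configure PETSC_ARCH=arch-seq --with-mpi=0 --download-metis
$ otool -L arch-seq/lib/libmetis.dylib | grep c++
$ otool -L arch-real/lib/libpetsc.dylib | grep c++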
Thanks,
Pierre
> BTW: Current petsc configure/builder builds only parallel fftw. [It does not support building sequential fftw, but I guess this could be added.]
>
> Satish