[petsc-dev] Multiple MPICH/fblaslapack Installs For Multiple Arches

Jed Brown jed at jedbrown.org
Sun Mar 22 16:25:49 CDT 2020


Jacob Faibussowitsch <jacob.fai at gmail.com> writes:

> Hello all,
>
> As part of development, I have several arch folders lying around in my PETSC_DIR, namely a 32-bit OSX, a 64-bit OSX, a 32-bit linux with valgrind, a 64-bit linux with valgrind, and a 32-bit arch up to date with current master. All of these use --download-mpich --download-fblaslapack and hence have their own copy of each (so that’s 5 copies of each, plus other duplicated packages, I’m sure). At this stage, even getting the bare minimum of these arches ready for dev work after a rebase/git pull takes ages, as package versions or configure settings change, forcing a rebuild of the same packages multiple times.
>
> My question(s):
> What petsc ./configure options control how each library is configured
> with respect to petsc? That is, can my 64-bit arches use my 32-bit
> MPICH/fblaslapack and vice versa?

Yes on all points regarding MPI and BLAS/Lapack.  I recommend installing
a current MPICH and/or Open MPI system-wide, preferably hooked up to
ccache (see replies to this thread:
https://lists.mcs.anl.gov/pipermail/petsc-dev/2020-January/025505.html),
along with a system-wide BLAS/Lapack.  It's the other packages that are
more likely to depend on the int/scalar configuration, but even many of
those (HDF5, SuiteSparse, etc.) aren't built specially for PETSc.
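
For concreteness, here is a minimal sketch of two arches sharing one
system-wide MPI and BLAS/Lapack instead of rebuilding them per arch.
The install prefix /usr/local and the arch names are illustrative
assumptions, not from this thread; adjust to your setup:

    # Optimized arch pointing at the shared system installs
    # (paths below are assumed, not prescribed)
    ./configure PETSC_ARCH=arch-opt --with-debugging=0 \
      --with-mpi-dir=/usr/local \
      --with-blaslapack-dir=/usr/local

    # Debug arch with 64-bit indices, reusing the very same installs;
    # only PETSc itself is rebuilt, not MPI or BLAS/Lapack
    ./configure PETSC_ARCH=arch-64idx-dbg --with-debugging=1 \
      --with-64-bit-indices \
      --with-mpi-dir=/usr/local \
      --with-blaslapack-dir=/usr/local

Since --with-debugging only affects how PETSc itself is compiled,
toggling it does not force a rebuild of an external MPI or BLAS/Lapack.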

> Does this change when I have --with-debug on or off? If so, what other
> packages have a similar ability? Is there anywhere in ./configure
> --help where this kind of information would be documented?
>
> I suspect that this hasn’t been fully explored since it’s primarily a developer “problem” and not one the average user will run into or care about (since they usually aren’t building petsc multiple times). I’m sure everyone has their own ways of tackling this problem; I’d love to hear them.
>
> Best regards,
>
> Jacob Faibussowitsch
> (Jacob Fai - booss - oh - vitch)
> Cell: (312) 694-3391

