[petsc-dev] Multiple MPICH/fblaslapack Installs For Multiple Arches

Jed Brown jed at jedbrown.org
Sun Mar 22 16:40:19 CDT 2020


Jacob Faibussowitsch <jacob.fai at gmail.com> writes:

>> Yes on all points regarding MPI and BLAS/Lapack.  I recommend installing
>> a current MPICH and/or Open MPI system-wide, preferably hooked up to
>> ccache (see replies to this thread:
>> https://lists.mcs.anl.gov/pipermail/petsc-dev/2020-January/025505.html),
>> as well as BLAS/Lapack system-wide.  It's the other packages that are
>> more likely to depend on int/scalar configuration, but even many of
>> those (HDF5, SuiteSparse, etc.) aren't built specially for PETSc.
>
> Is the home-brew MPICH, openblas, lapack sufficient here? Or is it recommended to build all three from source?

It looks current.  OSX (or Xcode?) ships with a good BLAS/Lapack, so you
shouldn't need to install anything, but OpenBLAS or BLIS (faster) from
Homebrew should be fine.
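A minimal sketch of what this looks like in practice: install MPICH and OpenBLAS once via Homebrew, then share them across multiple PETSC_ARCH configurations so only the int/scalar-dependent packages get rebuilt per arch. The arch names here are arbitrary examples, and the Homebrew prefixes come from `brew --prefix` since they differ between Intel and Apple Silicon machines:

```shell
# Install MPI and BLAS/Lapack once, system-wide
brew install mpich openblas

# OpenBLAS is keg-only in Homebrew, so pass its prefix explicitly
OPENBLAS_DIR=$(brew --prefix openblas)
MPICH_DIR=$(brew --prefix mpich)

# One PETSC_ARCH per int/scalar configuration; MPI and BLAS/Lapack
# are shared between them
./configure PETSC_ARCH=arch-darwin-double-int32 \
    --with-mpi-dir="$MPICH_DIR" \
    --with-blaslapack-dir="$OPENBLAS_DIR"

./configure PETSC_ARCH=arch-darwin-complex-int64 \
    --with-mpi-dir="$MPICH_DIR" \
    --with-blaslapack-dir="$OPENBLAS_DIR" \
    --with-scalar-type=complex --with-64-bit-indices
```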
