[petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

Matthew Knepley knepley at gmail.com
Wed Mar 27 08:30:03 CDT 2019


On Wed, Mar 27, 2019 at 8:55 AM Victor Eijkhout via petsc-dev <
petsc-dev at mcs.anl.gov> wrote:

> On Mar 27, 2019, at 7:29 AM, Mark Adams <mfadams at lbl.gov> wrote:
>
> How should he configure for this? Remove "--download-fblaslapack=1" and add
> ....
>
>
> 1. If using gcc
>
> module load mkl
>
> with either compiler:
>
> export BLAS_LAPACK_LOAD=--with-blas-lapack-dir=${MKLROOT}
>
> 2.  We define MPICH_HOME for you.
>
> With Intel MPI:
>
>   export PETSC_MPICH_HOME="${MPICH_HOME}/intel64"
>   export mpi="--with-mpi-compilers=1 --with-mpi-include=${TACC_IMPI_INC}
> --with-mpi-lib=${TACC_IMPI_LIB}/release_mt/libmpi.so"
>
> with mvapich:
>
>   export PETSC_MPICH_HOME="${MPICH_HOME}"
>   export mpi="--with-mpi-compilers=1 --with-mpi-dir=${PETSC_MPICH_HOME}"
>
> (looks like a little redundancy in my script)
>
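Putting those pieces together, a single configure invocation for the gcc + Intel MPI case might look like the sketch below. This assumes a TACC-style module environment where `MKLROOT`, `TACC_IMPI_INC`, and `TACC_IMPI_LIB` are defined by the loaded modules; the exact paths are site-specific.

```shell
#!/bin/sh
# Sketch: PETSc configure with MKL BLAS/LAPACK and Intel MPI.
# Assumes site modules define MKLROOT, TACC_IMPI_INC, and TACC_IMPI_LIB.
module load mkl

./configure \
  --with-blas-lapack-dir="${MKLROOT}" \
  --with-mpi-compilers=1 \
  --with-mpi-include="${TACC_IMPI_INC}" \
  --with-mpi-lib="${TACC_IMPI_LIB}/release_mt/libmpi.so"
```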

I think Satish now prefers

  --with-cc=${MPICH_HOME}/mpicc --with-cxx=${MPICH_HOME}/mpicxx
--with-fc=${MPICH_HOME}/mpif90
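In that style the MPI options collapse to pointing configure at the compiler wrappers directly, roughly as sketched below. `MPICH_HOME` is assumed to be set by the site modules; note that some MPICH installations place the wrappers under `${MPICH_HOME}/bin` instead.

```shell
#!/bin/sh
# Sketch: configure PETSc via the MPI compiler wrappers rather than
# separate include/lib flags. Adjust the path if your install uses bin/.
./configure \
  --with-cc="${MPICH_HOME}/mpicc" \
  --with-cxx="${MPICH_HOME}/mpicxx" \
  --with-fc="${MPICH_HOME}/mpif90"
```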

  Thanks,

    Matt


> Victor.
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
