[petsc-dev] ccache tips?
Jed Brown
jed at jedbrown.org
Fri Jan 17 10:41:11 CST 2020
To use system MPI, I made a directory with one-line scripts
$ rg . ~/usr/ccache/mpich/
/home/jed/usr/ccache/mpich/bin/mpiexec
1:#!/bin/sh
3:/opt/mpich/bin/mpiexec "$@"
/home/jed/usr/ccache/mpich/bin/mpicc
1:#!/bin/dash
3:ccache /opt/mpich/bin/mpicc "$@"
/home/jed/usr/ccache/mpich/bin/mpicxx
1:#!/bin/dash
3:ccache /opt/mpich/bin/mpicxx "$@"
ln -s /opt/mpich/include ~/usr/ccache/mpich/include
mkdir ~/usr/ccache/mpich/lib # Configure wants (or wanted) this directory to exist, but it can be empty
Then --with-mpi-dir=$HOME/usr/ccache/mpich is all you need. Works with
all packages and I haven't touched it in nearly a decade.
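
If you want to script the whole thing, here is a minimal sketch of the setup above (assumes MPICH in /opt/mpich as in my scripts; I happen to use dash in the shebangs, plain sh is fine too):

# wrapper tree: bin/ holds the one-line scripts, lib/ just needs to exist
mkdir -p ~/usr/ccache/mpich/bin ~/usr/ccache/mpich/lib
ln -s /opt/mpich/include ~/usr/ccache/mpich/include

# compiler wrappers go through ccache
for tool in mpicc mpicxx; do
  printf '#!/bin/sh\nccache /opt/mpich/bin/%s "$@"\n' "$tool" > ~/usr/ccache/mpich/bin/"$tool"
  chmod +x ~/usr/ccache/mpich/bin/"$tool"
done

# mpiexec just forwards; no ccache needed
printf '#!/bin/sh\n/opt/mpich/bin/mpiexec "$@"\n' > ~/usr/ccache/mpich/bin/mpiexec
chmod +x ~/usr/ccache/mpich/bin/mpiexec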
Matthew Knepley <knepley at gmail.com> writes:
> I configured MPI with CC="ccache gcc"
>
> knepley/feature-dm-remove-hybrid *$:/PETSc3/petsc/petsc-dev$ /PETSc3/petsc/bin/mpicc -show
> /Users/knepley/MacSoftware/bin/ccache gcc -Qunused-arguments -fstack-protector -Qunused-arguments -g3 -Wl,-flat_namespace -I/PETSc3/petsc/include -L/PETSc3/petsc/lib -lpmpich -lmpich -lopa -lmpl -lpthread
>
> and then used --with-mpi-dir. This is the way to go if you do multiple
> ARCHes. Have not had a problem on OSX or Linux. Works with all packages.
>
> Matt
>
> On Fri, Jan 17, 2020 at 11:29 AM Patrick Sanan <patrick.sanan at gmail.com>
> wrote:
>
>> I'm shamefully not using ccache. How do I do it? Is it as simple as
>> ./configure --with-cc="ccache gcc" --with-cxx="ccache g++"? Works on OS X
>> and various Linuxes? Any known issue with external packages or otherwise?
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
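
For the record, a sketch of the flow Matt describes above (the prefix path and MPICH version here are placeholders; the PETSc options are the usual ones):

# build MPICH itself with ccache in front of the compilers
cd mpich-3.x.y
./configure --prefix=$HOME/soft/mpich-ccache CC="ccache gcc" CXX="ccache g++"
make && make install

# then point each PETSc arch at that install
cd $PETSC_DIR
./configure PETSC_ARCH=arch-ccache-debug --with-mpi-dir=$HOME/soft/mpich-ccache
./configure PETSC_ARCH=arch-ccache-opt --with-mpi-dir=$HOME/soft/mpich-ccache --with-debugging=0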