On Thu, Oct 13, 2011 at 23:01, Kevin.Buckley@ecs.vuw.ac.nz wrote:

> Hi there,
>
> Some time ago I built a PETSc (3.0.0-p9) on the BG/L down at NZ's
> Univ of Canterbury, for a researcher here at VUW who wanted to
> run the PISM ice-sheet modelling code on top of it, and in the
> process discovered that the version installed by the sys admins
> at the facility was buggy, so the researcher carried on compiling
> later PISMs against my 3.0.0-p9.
>
> UoC are in the process of upgrading to a BG/P and the researcher
> has asked me to see if I can cobble things together. He is keen to
> run with what he had before and is under "time pressure" to get
> the remaining results needed for a paper, so he is hoping I can
> install something before the facility's sys admins install
> a system-wide version.
>
> Notwithstanding the obvious jump in PETSc release from our
> 3.0 series to the current 3.2, I notice that the petsc-bgl-tools
> wrapper package (not surprisingly, I guess, given the "bgl" bit)
> only provided for the IBM 7.0 and 8.0 compiler suites, so have
> you guys tried out PETSc on a BG/P yet?

BG/P has been out for a while, so of course people have been running
PETSc on it for years. You can look at

config/examples/arch-bgp-ibm-opt.py

in the source tree. Here is a configuration that we used for some
benchmarks on Shaheen last year:

Configure options: --with-x=0 --with-is-color-value-type=short
  --with-debugging=1 --with-fortran-kernels=1
  --with-mpi-dir=/bgsys/drivers/ppcfloor/comm --with-batch=1
  --known-mpi-shared-libraries=1 --known-memcmp-ok
  --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2
  --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-size_t=4
  --known-sizeof-long-long=8 --known-sizeof-float=4
  --known-sizeof-double=8 --known-bits-per-byte=8
  --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4
  --known-mpi-long-double=1 --known-level1-dcache-assoc=0
  --known-level1-dcache-linesize=32 --known-level1-dcache-size=32768
  --download-hypre=1 --with-shared=0
  --prefix=/opt/share/ksl/petsc/dev-dec9-hypre/ppc450d-bgp_xlc_hypre_fast
  --with-clanguage=c
  --COPTFLAGS=" -O3 -qhot" --CXXOPTFLAGS=" -O3 -qhot"
  --FOPTFLAGS=" -O3 -qhot"
  --LIBS=" -L/bgsys/ibm_essl/sles10/prod/opt/ibmmath/lib
    -L/opt/ibmcmp/xlsmp/bg/1.7/lib -L/opt/ibmcmp/xlmass/bg/4.4/bglib
    -L/opt/ibmcmp/xlf/bg/11.1/bglib
    -L/bgsys/ibm_essl/sles10/prod/opt/ibmmath/lib
    -L/opt/ibmcmp/xlsmp/bg/1.7/lib -L/opt/ibmcmp/xlmass/bg/4.4/bglib
    -L/opt/ibmcmp/xlf/bg/11.1/bglib -lesslbg -lxlf90_r -lxlopt -lxlsmp
    -lxl -lxlfmath -lesslbg -lxlf90_r -lxlopt -lxlsmp -lxl -lxlfmath -O3"
  --CC=/bgsys/drivers/ppcfloor/comm/xl/bin/mpixlc_r
  --CXX=/bgsys/drivers/ppcfloor/comm/xl/bin/mpixlcxx_r
  --FC=/bgsys/drivers/ppcfloor/comm/xl/bin/mpixlf90_r
  --with-debugging=0 PETSC_ARCH=bgp-xlc-hypre-fast
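If it helps, those options can be dropped into a configure script in the
same style as the config/examples/arch-*.py files, which makes them easy
to re-run and tweak. A minimal sketch, assuming the 3.2 source tree
layout and keeping the Shaheen-specific compiler paths (replace them
with whatever the Canterbury machine uses); only a representative subset
of the --known-* options from the line above is repeated here:

#!/usr/bin/env python
# Sketch of a config/examples-style configure script built from the
# Shaheen options above; paths, versions, and the prefix are
# site-specific and need adjusting for your BG/P.
configure_options = [
  '--with-batch=1',   # cross-compiling: the front end cannot run compute-node binaries
  '--with-mpi-dir=/bgsys/drivers/ppcfloor/comm',
  '--CC=/bgsys/drivers/ppcfloor/comm/xl/bin/mpixlc_r',
  '--CXX=/bgsys/drivers/ppcfloor/comm/xl/bin/mpixlcxx_r',
  '--FC=/bgsys/drivers/ppcfloor/comm/xl/bin/mpixlf90_r',
  '--COPTFLAGS=-O3 -qhot',
  '--CXXOPTFLAGS=-O3 -qhot',
  '--FOPTFLAGS=-O3 -qhot',
  '--with-debugging=0',
  '--with-fortran-kernels=1',
  '--with-is-color-value-type=short',
  '--with-x=0',
  '--with-shared=0',
  '--download-hypre=1',
  # The --known-* entries tell batch-mode configure what it cannot
  # probe on the compute nodes itself; values as in the line above.
  '--known-mpi-shared-libraries=1',
  '--known-sizeof-void-p=4',
  '--known-sizeof-long=4',
  '--known-sizeof-size_t=4',
  # ... remaining --known-sizeof-* and --known-level1-dcache-* options ...
  'PETSC_ARCH=bgp-xlc-hypre-fast',
]

if __name__ == '__main__':
  import sys, os
  sys.path.insert(0, os.path.abspath('config'))
  import configure
  configure.petsc_configure(configure_options)

Because of --with-batch=1, configure stops partway through and asks you
to run the conftest executable it generates through the batch system and
then run the resulting reconfigure script to finish; that two-step dance
is the same on BG/P as it was on BG/L.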