[petsc-users] FW: [alcf-support #195845] PETSc compilation

Barry Smith bsmith at mcs.anl.gov
Thu Jan 23 13:35:32 CST 2014


  Nothing for us to do. 

> ponga is not a member of MM-MEDE
> Projects available: ATPESC2013
> For assistance, contact support at alcf.anl.gov
> Filter /soft/cobalt/scripts/clusterbank-account failed
> 
> 
> So, I wonder if you can add me to the project MM-MEDE, of which I am a
> member (and also the PI).


On Jan 23, 2014, at 12:53 PM, Tsien, Victor <vtsien at alcf.anl.gov> wrote:

> petsc group,
> 
> Can you help?  
> 
> Thanks,
> 
> Victor
> ________________________________________
> From: Mauri Ponga [mponga at caltech.edu]
> Sent: Wednesday, January 22, 2014 5:46 PM
> To: support at alcf.anl.gov
> Subject: [alcf-support #195845] PETSc compilation
> 
> User info for mponga at caltech.edu
> =================================
> Username:  ponga
> Full Name: Mauricio Ponga
> Projects:  ATPESC2013
>             ('*' denotes INCITE projects)
> =================================
> 
> 
> Hi,
> 
> I am trying to compile my program on Vesta, and to build it I need
> PETSc. However, I am using an old version (petsc-3.0.0-p8), which I am
> trying to configure and compile under my user account. I used the
> following options:
> 
> ./configure --prefix=/soft/libraries/petsc/3.3-p2/xl-opt --with-batch=1
> --with-blacs-include=/soft/libraries/alcf/current/xl/SCALAPACK/
> --with-blacs-lib=/soft/libraries/alcf/current/xl/SCALAPACK/lib/libscalapack.a
> --with-blas-lapack-lib="-L/soft/libraries/alcf/current/xl/LAPACK/lib
> -llapack -L/soft/libraries/alcf/current/xl/BLAS/lib -lblas"
> --with-cc=mpixlc_r --with-cxx=mpixlcxx_r --with-debugging=0
> --with-fc="mpixlf77_r -qnosave" --with-fortran-kernels=1
> --with-is-color-value-type=short
> --with-scalapack-include=/soft/libraries/alcf/current/xl/SCALAPACK/
> --with-scalapack-lib=/soft/libraries/alcf/current/xl/SCALAPACK/lib/libscalapack.a
> --with-shared-libraries=0 --with-x=0 -COPTFLAGS=" -O3 -qhot=level=0
> -qsimd=auto -qmaxmem=-1 -qstrict -qstrict_induction" -CXXOPTFLAGS=" -O3
> -qhot=level=0 -qsimd=auto -qmaxmem=-1 -qstrict -qstrict_induction"
> -FOPTFLAGS=" -O3 -qhot=level=0 -qsimd=auto -qmaxmem=-1 -qstrict
> -qstrict_induction" PETSC_ARCH=arch-bgq-ibm-opt with-mpi-shared=0
> 
> Configuration finished fine, but at the end of the script the following
> message appeared:
> 
> =================================================================================
> 
>    Since your compute nodes require use of a batch system or mpiexec you must:
>  1) Submit ./conftest to 1 processor of your batch system or system you are
>     cross-compiling for; this will generate the file reconfigure.py
>  2) Run ./reconfigure.py (to complete the configure process).
> =================================================================================
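> 
> For step 1, my understanding is that on Vesta this submission goes through
> Cobalt, so I expect a single-node job along the following lines should be
> enough (this is only a sketch; the project name and walltime below are
> placeholders, not the contents of my actual job script):
> 
>   qsub -n 1 -t 10 --mode c1 -A <project> ./conftest
> 
> with ./reconfigure.py run on the login node once that job completes.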
> 
> I tried to run the command ./conftest using jobscript.sh (in
> /home/ponga/petsc-3.0.0-p8), but the following message appeared:
> 
> ponga is not a member of MM-MEDE
> Projects available: ATPESC2013
> For assistance, contact support at alcf.anl.gov
> Filter /soft/cobalt/scripts/clusterbank-account failed
> 
> 
> So, I wonder if you can add me to the project MM-MEDE, of which I am a
> member (and also the PI). I also wonder whether I am following the right
> steps to configure PETSc under my user account.
> 
> Thank you.
> 
> Mauricio Ponga
> 


