[petsc-dev] titan
Mark Adams
mfadams at lbl.gov
Sat Jul 7 07:09:01 CDT 2018
On Fri, Jul 6, 2018 at 7:30 PM Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
>
> Mark,
>
> We should get your arch-titan-opt64idx-pgi.py into the master branch
> of PETSc in config/examples if there is not something equivalent there
> already.
>
I do not recall doing anything special for Titan; aprun and the compiler
wrappers are about all I see. Maybe '--known-mpi-shared-libraries=1' was
added; otherwise it looks like a cloned config file. I did comment out cmake
after learning that I could load the module instead. Maybe make a note of
that in the file, as I did here:
#!/usr/bin/env python
if __name__ == '__main__':
  import sys
  import os
  sys.path.insert(0, os.path.abspath('config'))
  import configure
  configure_options = [
    '--COPTFLAGS=-fast',
    '--CXXOPTFLAGS=-fast',
    '--FOPTFLAGS=-fast',
    '--with-ssl=0',
    '--with-batch=0',
    '--with-mpiexec=aprun',
    '--download-fblaslapack',
    # '--download-superlu',
    '--with-openmp',
    '--prefix=/ccs/proj/env003/petscv3.9-titan-opt64-pgi',
    '--download-hypre',
    '--download-metis',
    '--with-hwloc=0',
    '--download-parmetis',
    # '--download-cmake',  # use: module load cmake
    '--with-cc=cc',
    # '--with-clib-autodetect=0',
    '--with-cxx=CC',
    # '--with-cxxlib-autodetect=0',
    '--with-fc=ftn',
    # '--with-fortranlib-autodetect=0',
    '--with-shared-libraries=0',
    '--known-mpi-shared-libraries=1',
    '--with-x=0',
    '--with-64-bit-indices=1',
    '--with-debugging=0',
    'PETSC_ARCH=arch-titan-opt64-pgi',
  ]
  configure.petsc_configure(configure_options)
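The cmake note above suggests a workflow along these lines (a sketch only; the module name and the step of invoking the configure script directly are assumptions, not taken from the thread):

```shell
# Hypothetical Titan workflow: load cmake from the module system instead
# of letting configure download it, then run the configure script and build.
module load cmake
./arch-titan-opt64idx-pgi.py
make PETSC_DIR=$PWD PETSC_ARCH=arch-titan-opt64-pgi all
```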
>
> Also is the .py file different for summit? I would expect so, we
> should get that into config/examples also.
>
And the same for summit; this one still downloads cmake, which I assume
could be removed if you load the cmake module:
#!/usr/bin/env python
if __name__ == '__main__':
  import sys
  import os
  sys.path.insert(0, os.path.abspath('config'))
  import configure
  configure_options = [
    '--COPTFLAGS=-g',
    '--CXXOPTFLAGS=-g',
    '--FOPTFLAGS=-g',
    '--with-ssl=0',
    '--with-batch=0',
    # '--with-mpiexec=aprun',
    # '--download-superlu',
    # '--with-threadcomm',
    # '--with-pthreadclasses',
    # '--with-openmp',
    '--prefix=/ccs/proj/env003/petscv3.7-dbg64-summit-pgi',
    # '--download-hypre',
    '--download-metis',
    '--with-hwloc=0',
    '--download-parmetis',
    '--download-cmake',
    '--with-cc=mpicc',
    # '--with-fc=0',
    # '--with-clib-autodetect=0',
    # '--with-cxx=mpiCC',
    # '--with-cxxlib-autodetect=0',
    '--with-fc=mpif90',
    # '--with-fortranlib-autodetect=0',
    '--with-shared-libraries=0',
    '--known-mpi-shared-libraries=1',
    '--with-x=0',
    '--with-64-bit-indices',
    '--with-debugging=1',
    'PETSC_ARCH=arch-summit-dbg64-pgi',
  ]
  configure.petsc_configure(configure_options)
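Mark's suggestion for Summit could be sketched the same way (hypothetical; the module name and script filename are assumptions, and the edit to the script is described rather than scripted):

```shell
# Hypothetical Summit workflow: load the cmake module, comment out the
# '--download-cmake' option in the script by hand, then re-run configure.
module load cmake
./arch-summit-dbg64-pgi.py
```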
>
> Thanks
>
> Barry
>
>
> > On Jul 6, 2018, at 6:19 PM, Mark Adams <mfadams at lbl.gov> wrote:
> >
> > Todd, I've been building PETSc on Titan and SUMMIT for users. cray-petsc
> has been hosed for a while. I put my most recent configure file in:
> >
> > /ccs/proj/env003/arch-titan-opt64idx-pgi.py
> >
> > This is a public space. My builds are here also. You run with, for
> instance,
> >
> > make PETSC_DIR=/ccs/proj/env003/petscv3.9-opt64-pgi PETSC_ARCH=""
> >
> > There are several other builds in there that you can use. I now see that
> I don't have the machine name in the directory name, yuck. But this and a
> v3.7 version are recent and work (on Titan).
> >
> > Mark
> >
> >
> > On Fri, Jul 6, 2018 at 6:47 PM Smith, Barry F. <bsmith at mcs.anl.gov>
> wrote:
> >
> > I know Mark Adams has tried recently, with limited success.
> >
> > As always the big problem is facilities removing accounts, such as
> Satish's, so testing gets difficult.
> >
> > But yes, we want to support Titan so have users send
> configure.log/make.log to petsc-maint at mcs.anl.gov
> >
> > Barry
> >
> >
> > > On Jul 6, 2018, at 12:45 PM, Munson, Todd <tmunson at mcs.anl.gov> wrote:
> > >
> > >
> > > Barry,
> > >
> > > Have you tested petsc compilation on titan? Is there a config example
> > > for that machine?
> > >
> > > Apparently titan died and when they brought it back up, the petsc
> install
> > > was fracked, so some ORNL users are trying to compile petsc locally
> > > rather than wait for the facilities people.
> > >
> > > Thanks, Todd
> > >
> >
>
>