[petsc-users] PETSC on Cray Hazelhen
Vaz, Guilherme
G.Vaz at marin.nl
Mon Jun 19 07:57:55 CDT 2017
Guys,
I got it working...
After a lot of trial and error, based on Matthew's and Stefano's suggestions and on the Cray $PETSC_DIR/include/petscconfiginfo.h, these are the "best" configure options that worked for me (a sketch of how I invoke them follows the list):
CONFOPTS="--prefix=$PETSC_INSTALL_DIR \
--known-has-attribute-aligned=1 --known-mpi-int64_t=0 --known-bits-per-byte=8 \
--known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-level1-dcache-assoc=4 \
--known-level1-dcache-linesize=64 --known-level1-dcache-size=16384 --known-memcmp-ok=1 \
--known-mpi-c-double-complex=1 --known-mpi-long-double=1 \
--known-mpi-shared-libraries=0 \
--known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-sizeof-char=1 --known-sizeof-double=8 --known-sizeof-float=4 \
--known-sizeof-int=4 --known-sizeof-long-long=8 --known-sizeof-long=8 --known-sizeof-short=2 --known-sizeof-size_t=8 --known-sizeof-void-p=8 \
--with-ar=ar \
--with-batch=1 \
--with-cc=/opt/cray/craype/2.5.10/bin/cc \
--with-clib-autodetect=0 \
--with-cxx=0 \
--with-cxxlib-autodetect=0 \
--with-debugging=0 \
--with-dependencies=0 \
--with-fc=/opt/cray/craype/2.5.10/bin/ftn \
--with-fortran-datatypes=0 --with-fortran-interfaces=0 --with-fortranlib-autodetect=0 \
--CFLAGS=-O3 \
--FFLAGS=-O3 -lstdc++ \
--LDFLAGS=-dynamic \
--FOPTFLAGS=-O3 \
--COPTFLAGS=-O3 \
--with-ranlib=ranlib \
--with-scalar-type=real \
--with-shared-ld=ar \
--with-etags=0 \
--with-x=0 \
--with-ssl=0 \
--with-shared-libraries=0 \
--with-mpi-lib="" --with-mpi-include="" \
--download-superlu_dist=$SOURCE_DIR/$SUPERLU_SOURCE_FILE \
--download-parmetis=$SOURCE_DIR/$PARMETIS_SOURCE_FILE \
--download-metis=$SOURCE_DIR/$METIS_SOURCE_FILE \
--with-external-packages-dir=$INSTALL_DIR "
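For completeness, this is roughly how I drive configure with these options; the PETSC_ARCH name and paths here are just an example, so treat it as a sketch:
  cd $SOURCE_DIR/petsc-3.7.5
  ./configure $CONFOPTS PETSC_ARCH=arch-cray-opt
  # because of --with-batch=1, configure stops and asks you to run a small
  # conftest binary on a compute node and then re-run the generated
  # reconfigure script (see further down in the thread), after which:
  make PETSC_DIR=$PWD PETSC_ARCH=arch-cray-opt all
  make PETSC_DIR=$PWD PETSC_ARCH=arch-cray-opt install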
Thanks for the help. And I hope this may help some Cray newbies like me.
Guilherme V.
________________________________
From: Stefano Zampini <stefano.zampini at gmail.com>
Sent: Tuesday, June 13, 2017 4:10 PM
To: Vaz, Guilherme
Cc: Matthew Knepley; PETSc
Subject: Re: [petsc-users] PETSC on Cray Hazelhen
Cray machines can be used with shared libraries; it's not like the earlier versions of BG/Q.
Yes, you need almost all of this. If you run configure with the option --with-batch=1, it will then generate something like the configuration I sent you.
--with-shared-libraries is a PETSc configure option, i.e. it will create libpetsc.so
--LDFLAGS=-dynamic is used to link a PETSc executable dynamically
--known-mpi-shared-libraries=0 will use a statically linked MPI
You can remove the first two options listed above if you would like a static version of PETSc.
You may want to refine the options for optimized builds, i.e. add your favorite COPTFLAGS and remove --with-debugging=1.
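For example, an optimized variant would replace the debug flags with something like (adjust to taste):
  '--with-debugging=0',
  '--COPTFLAGS=-O3',
  '--CXXOPTFLAGS=-O3',
  '--FOPTFLAGS=-O3',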
Another thing you can do: load any of the PETSc modules on the XC40, and then look at the file
$PETSC_DIR/include/petscconfiginfo.h
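For example (the PETSc module name is system-dependent; on Cray systems it is typically something like cray-petsc):
  module load cray-petsc        # module name may differ on your system
  cat $PETSC_DIR/include/petscconfiginfo.h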
On Jun 13, 2017, at 4:01 PM, Vaz, Guilherme <G.Vaz at marin.nl<mailto:G.Vaz at marin.nl>> wrote:
Stefano/Mathew,
Do I need all this :-)?
And I don't want a debug version, just an optimized release version. I understand the -g -O0 flags are for debugging, but for the rest I am not sure which options are meant for a debug build and which for a release build. Sorry...
I am also kind of confused about the shared libraries issue,
'--known-mpi-shared-libraries=0',
'--with-shared-libraries=1',
on the static vs dynamic linking (I thought on the XC40 we had to compile everything statically),
--LDFLAGS=-dynamic
and on the FFLAGS:
--FFLAGS=-mkl=sequential -g -O0 -lstdc++
Is this to be used with Intel MKL libraries?
Thanks for the help, you both.
Guilherme V.
dr. ir. Guilherme Vaz | CFD Research Coordinator / Senior CFD Researcher | Research & Development
MARIN | T +31 317 49 33 25 | M +31 621 13 11 97 | G.Vaz at marin.nl<mailto:G.Vaz at marin.nl> | www.marin.nl<http://www.marin.nl/>
________________________________
From: Stefano Zampini <stefano.zampini at gmail.com<mailto:stefano.zampini at gmail.com>>
Sent: Tuesday, June 13, 2017 3:42 PM
To: Vaz, Guilherme
Cc: Matthew Knepley; PETSc
Subject: Re: [petsc-users] PETSC on Cray Hazelhen
Guilherme,
here is my debug configuration (with shared libraries) in PETSc on a XC40
'--CFLAGS=-mkl=sequential -g -O0 ',
'--CXXFLAGS=-mkl=sequential -g -O0 ',
'--FFLAGS=-mkl=sequential -g -O0 -lstdc++',
'--LDFLAGS=-dynamic',
'--download-metis-cmake-arguments=-DCMAKE_C_COMPILER_FORCED=1',
'--download-metis=1',
'--download-parmetis-cmake-arguments=-DCMAKE_C_COMPILER_FORCED=1',
'--download-parmetis=1',
'--known-bits-per-byte=8',
'--known-has-attribute-aligned=1',
'--known-level1-dcache-assoc=8',
'--known-level1-dcache-linesize=64',
'--known-level1-dcache-size=32768',
'--known-memcmp-ok=1',
'--known-mpi-c-double-complex=1',
'--known-mpi-int64_t=1',
'--known-mpi-long-double=1',
'--known-mpi-shared-libraries=0',
'--known-sdot-returns-double=0',
'--known-sizeof-MPI_Comm=4',
'--known-sizeof-MPI_Fint=4',
'--known-sizeof-char=1',
'--known-sizeof-double=8',
'--known-sizeof-float=4',
'--known-sizeof-int=4',
'--known-sizeof-long-long=8',
'--known-sizeof-long=8',
'--known-sizeof-short=2',
'--known-sizeof-size_t=8',
'--known-sizeof-void-p=8',
'--known-snrm2-returns-double=0',
'--with-ar=ar',
'--with-batch=1',
'--with-cc=/opt/cray/craype/2.4.2/bin/cc',
'--with-clib-autodetect=0',
'--with-cmake=/home/zampins/local/bin/cmake',
'--with-cxx=/opt/cray/craype/2.4.2/bin/CC',
'--with-cxxlib-autodetect=0',
'--with-debugging=1',
'--with-dependencies=0',
'--with-etags=0',
'--with-fc=/opt/cray/craype/2.4.2/bin/ftn',
'--with-fortran-datatypes=0',
'--with-fortran-interfaces=0',
'--with-fortranlib-autodetect=0',
'--with-pthread=0',
'--with-ranlib=ranlib',
'--with-scalar-type=real',
'--with-shared-ld=ar',
'--with-shared-libraries=1',
'PETSC_ARCH=arch-intel-debug',
2017-06-13 15:34 GMT+02:00 Vaz, Guilherme <G.Vaz at marin.nl<mailto:G.Vaz at marin.nl>>:
Dear Matthew,
Thanks. It went further, but now I get:
TESTING: configureMPIEXEC from config.packages.MPI(/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/packages/MPI.py:143)
*******************************************************************************
UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
-------------------------------------------------------------------------------
Must give a default value for known-mpi-shared-libraries since executables cannot be run
*******************************************************************************
Last lines from the log:
File "./config/configure.py", line 405, in petsc_configure
framework.configure(out = sys.stdout)
File "/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/framework.py", line 1090, in configure
self.processChildren()
File "/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/framework.py", line 1079, in processChildren
self.serialEvaluation(self.childGraph)
File "/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/framework.py", line 1060, in serialEvaluation
child.configure()
File "/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/package.py", line 791, in configure
self.executeTest(self.checkSharedLibrary)
File "/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/base.py", line 126, in executeTest
ret = test(*args,**kargs)
File "/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/packages/MPI.py", line 135, in checkSharedLibrary
self.shared = self.libraries.checkShared('#include <mpi.h>\n','MPI_Init','MPI_Initialized','MPI_Finalize',checkLink = self.checkPackageLink,libraries = self.lib, defaultArg = 'known-mpi-shared-libraries', executor = self.mpiexec)
File "/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/libraries.py", line 471, in checkShared
if self.checkRun(defaultIncludes, body, defaultArg = defaultArg, executor = executor):
File "/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/base.py", line 628, in checkRun
(output, returnCode) = self.outputRun(includes, body, cleanup, defaultArg, executor)
File "/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/base.py", line 598, in outputRun
raise ConfigureSetupError('Must give a default value for '+defaultOutputArg+' since executables cannot be run')
Any ideas? Is it something related to
--with-shared-libraries=0 \
--with-batch=1 \
I set the first because it was in the Cray example, and the second because aprun (Cray's mpiexec) is not available on the frontend.
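(For reference, the --with-batch round trip on a machine like this looks roughly as follows; the exact submission command is site-dependent and the generated file names are what PETSc 3.7 produces, so treat this as a sketch:)
  ./configure $CONFOPTS --with-batch=1
  # configure stops and produces a small test executable, e.g. conftest-$PETSC_ARCH;
  # run it on a compute node, e.g. from a batch job:
  aprun -n 1 ./conftest-$PETSC_ARCH
  # that run writes reconfigure-$PETSC_ARCH.py; execute it back on the login node:
  ./reconfigure-$PETSC_ARCH.py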
Thanks,
Guilherme V.
dr. ir. Guilherme Vaz | CFD Research Coordinator / Senior CFD Researcher | Research & Development
MARIN | T +31 317 49 33 25 | M +31 621 13 11 97 | G.Vaz at marin.nl<mailto:G.Vaz at marin.nl> | www.marin.nl<http://www.marin.nl/>
________________________________
From: Matthew Knepley <knepley at gmail.com<mailto:knepley at gmail.com>>
Sent: Tuesday, June 13, 2017 2:34 PM
To: Vaz, Guilherme
Cc: PETSc
Subject: Re: [petsc-users] PETSC on Cray Hazelhen
On Tue, Jun 13, 2017 at 3:48 AM, Vaz, Guilherme <G.Vaz at marin.nl<mailto:G.Vaz at marin.nl>> wrote:
Dear all,
I am trying to install PETSC on a Cray XC40 system (Hazelhen) with the usual Cray wrappers for Intel compilers, with some chosen external packages and MKL libraries.
I read some threads in the mailing list about this, and I tried the petsc-3.7.5/config/examples/arch-cray-xt6-pkgs-opt.py configuration options. After trying this (please ignore my own environment variables),
CONFOPTS="--prefix=$PETSC_INSTALL_DIR \
--with-cc=cc \
--with-cxx=CC \
--with-fc=ftn \
--with-clib-autodetect=0 \
--with-cxxlib-autodetect=0 \
--with-fortranlib-autodetect=0 \
--COPTFLAGS=-fast -mp \
--CXXOPTFLAGS=-fast -mp \
--FOPTFLAGS=-fast -mp \
--with-shared-libraries=0 \
--with-batch=1 \
--with-x=0 \
--with-mpe=0 \
--with-debugging=0 \
--download-superlu_dist=$SOURCE_DIR/$SUPERLU_SOURCE_FILE \
--with-blas-lapack-dir=$BLASDIR \
--download-parmetis=$SOURCE_DIR/$PARMETIS_SOURCE_FILE \
--download-metis=$SOURCE_DIR/$METIS_SOURCE_FILE \
--with-external-packages-dir=$INSTALL_DIR \
--with-ssl=0 "
I get the following error:
TESTING: checkFortranLinkingCxx from config.compilers(/zhome/academic/HLRS/pri/iprguvaz/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.5/config/BuildSystem/config/compilers.py:1097)
*******************************************************************************
UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
-------------------------------------------------------------------------------
Fortran could not successfully link C++ objects
*******************************************************************************
Does it ring a bell? Any tips?
You turned off autodetection, so configure will not find libstdc++. That library either has to be put in LIBS, or I would recommend
--with-cxx=0
since nothing you have there requires C++.
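For example, either of these configure fragments should work (sketches, not full command lines):
  # keep the C++ compiler and name the missing runtime library explicitly
  ./configure ... --with-cxx=CC --LIBS=-lstdc++
  # or drop C++ support entirely
  ./configure ... --with-cxx=0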
Thanks,
Matt
Thanks,
Guilherme V.
dr. ir. Guilherme Vaz | CFD Research Coordinator / Senior CFD Researcher | Research & Development
MARIN | T +31 317 49 33 25<tel:+31%20317%20493%20325> | M +31 621 13 11 97 | G.Vaz at marin.nl<mailto:G.Vaz at marin.nl> | www.marin.nl<http://www.marin.nl/>
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
http://www.caam.rice.edu/~mk51/
--
Stefano
dr. ir. Guilherme Vaz | CFD Research Coordinator / Senior CFD Researcher | Research & Development
MARIN | T +31 317 49 33 25 | M +31 621 13 11 97 | G.Vaz at marin.nl<mailto:G.Vaz at marin.nl> | www.marin.nl<http://www.marin.nl>