[petsc-dev] Error running on Titan with GPUs & GNU

Mark Adams mfadams at lbl.gov
Wed Oct 31 05:34:25 CDT 2018


On Wed, Oct 31, 2018 at 5:05 AM Karl Rupp <rupp at iue.tuwien.ac.at> wrote:

> Hi Mark,
>
> please comment or remove lines 83 and 84 in
>   config/BuildSystem/config/packages/cuda.py
>
> Is there a compiler newer than GCC 4.3 available?
>

You mean 6.3?

06:33  ~$ module avail gcc

----------------------------- /opt/modulefiles -----------------------------
gcc/4.8.1   gcc/4.9.3   gcc/6.1.0   gcc/6.3.0(default)   gcc/7.2.0
gcc/4.8.2   gcc/5.3.0   gcc/6.2.0   gcc/7.1.0            gcc/7.3.0
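Karl's suggestion above (commenting out lines 83 and 84 of `config/BuildSystem/config/packages/cuda.py`) can be scripted with `sed`. A hedged sketch — the line numbers come from Karl's note and should be verified against the actual checkout before editing; the demo below runs on a scratch file so nothing real is touched:

```shell
#!/bin/sh
# Sketch: comment out a line range in place with sed.
# For the real edit the target would be
#   config/BuildSystem/config/packages/cuda.py, lines 83-84
# (verify the numbers in your PETSc checkout first), i.e.:
#   sed -i '83,84 s/^/#/' config/BuildSystem/config/packages/cuda.py
# after which one would re-run ./configure --with-cuda=1 (not --with-cudac=1,
# per Barry's correction below).
f=$(mktemp)
seq 81 85 | sed 's/^/line /' > "$f"   # stand-in file with lines 81..85
sed -i '3,4 s/^/#/' "$f"              # lines 3-4 here play the role of 83-84
cat "$f"
rm -f "$f"
```

On the scratch file this prefixes `#` to exactly the two targeted lines and leaves the rest untouched, which is the effect Karl is asking for on `cuda.py`.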



>
> Best regards,
> Karli
>
>
>
> On 10/31/18 8:15 AM, Mark Adams via petsc-dev wrote:
> > After loading a cuda module ...
> >
> > On Wed, Oct 31, 2018 at 2:58 AM Mark Adams <mfadams at lbl.gov
> > <mailto:mfadams at lbl.gov>> wrote:
> >
> >     I get an error with --with-cuda=1
> >
> >     On Tue, Oct 30, 2018 at 4:44 PM Smith, Barry F. <bsmith at mcs.anl.gov
> >     <mailto:bsmith at mcs.anl.gov>> wrote:
> >
> >         --with-cudac=1 should be --with-cuda=1
> >
> >
> >
> >          > On Oct 30, 2018, at 12:35 PM, Smith, Barry F. via petsc-dev
> >         <petsc-dev at mcs.anl.gov <mailto:petsc-dev at mcs.anl.gov>> wrote:
> >          >
> >          >
> >          >
> >          >> On Oct 29, 2018, at 8:09 PM, Mark Adams <mfadams at lbl.gov
> >         <mailto:mfadams at lbl.gov>> wrote:
> >          >>
> >          >> And a debug build seems to work:
> >          >
> >          >    Well ok.
> >          >
> >          >    Are there newer versions of the GNU compiler for this
> >         system? Are there any other compilers on the system that would
> >         likely be less buggy? IBM compilers? If this simple code
> >         generates a gross error with optimization, who's to say how
> >         many more subtle bugs may be induced in the library by the
> >         buggy optimizer (there may be none, but IMHO probability says
> >         there will be others).
> >          >
> >          >    Is there any chance that valgrind runs on this machine?
> >         You could run the optimized version through it and see what
> >         it says.
> >          >
> >          >   Barry
> >          >
> >          >>
> >          >> 21:04 1 master= /lustre/atlas/proj-shared/geo127/petsc$ make
> >
>  PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda
> >         PETSC_ARCH="" test
> >          >> Running test examples to verify correct installation
> >          >> Using
> >
>  PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda
> >         and PETSC_ARCH=
> >          >> *******************Error detected during compile or
> >         link!*******************
> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >          >>
> >
>  /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials
> >         ex19
> >          >>
> >
>  *********************************************************************************
> >          >> cc -o ex19.o -c -g
> >
>  -I/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/include
>   `pwd`/ex19.c
> >          >> cc -g  -o ex19 ex19.o
> >
>  -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib
> >
>  -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib
> >
>  -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib
> >         -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl
> >          >>
> >
>  /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o):
> >         In function `PetscDLOpen':
> >          >>
> >
>  /lustre/atlas1/geo127/proj-shared/petsc/src/sys/dll/dlimpl.c:108: warning:
> >         Using 'dlopen' in statically linked applications requires at
> >         runtime the shared libraries from the glibc version used for
> linking
> >          >>
> >
>  /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(send.o):
> >         In function `PetscOpenSocket':
> >          >>
> >
>  /lustre/atlas1/geo127/proj-shared/petsc/src/sys/classes/viewer/impls/socket/send.c:108:
> >         warning: Using 'gethostbyname' in statically linked applications
> >         requires at runtime the shared libraries from the glibc version
> >         used for linking
> >          >> true ex19
> >          >> rm ex19.o
> >          >> Possible error running C/C++
> >         src/snes/examples/tutorials/ex19 with 1 MPI process
> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >          >> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> >          >> Number of SNES iterations = 2
> >          >> Application 19081049 resources: utime ~1s, stime ~1s, Rss
> >         ~17112, inblocks ~36504, outblocks ~111043
> >          >> Possible error running C/C++
> >         src/snes/examples/tutorials/ex19 with 2 MPI processes
> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >          >> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> >          >> Number of SNES iterations = 2
> >          >> Application 19081050 resources: utime ~1s, stime ~1s, Rss
> >         ~19816, inblocks ~36527, outblocks ~111043
> >          >> 5a6
> >          >>> Application 19081051 resources: utime ~1s, stime ~0s, Rss
> >         ~13864, inblocks ~36527, outblocks ~111043
> >          >>
> >
>  /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials
> >          >> Possible problem with ex19_hypre, diffs above
> >          >> =========================================
> >          >> *******************Error detected during compile or
> >         link!*******************
> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >          >>
> >
>  /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials
> >         ex5f
> >          >> *********************************************************
> >          >> ftn -c -g
> >
>  -I/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/include
> >            -o ex5f.o ex5f.F90
> >          >> ftn -g   -o ex5f ex5f.o
> >
>  -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib
> >
>  -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib
> >
>  -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib
> >         -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl
> >          >>
> >
>  /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o):
> >         In function `PetscDLOpen':
> >          >>
> >
>  /lustre/atlas1/geo127/proj-shared/petsc/src/sys/dll/dlimpl.c:108: warning:
> >         Using 'dlopen' in statically linked applications requires at
> >         runtime the shared libraries from the glibc version used for
> linking
> >          >>
> >
>  /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(send.o):
> >         In function `PetscOpenSocket':
> >          >>
> >
>  /lustre/atlas1/geo127/proj-shared/petsc/src/sys/classes/viewer/impls/socket/send.c:108:
> >         warning: Using 'gethostbyname' in statically linked applications
> >         requires at runtime the shared libraries from the glibc version
> >         used for linking
> >          >> rm ex5f.o
> >          >> Possible error running Fortran example
> >         src/snes/examples/tutorials/ex5f with 1 MPI process
> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >          >> Number of SNES iterations =     4
> >          >> Application 19081055 resources: utime ~1s, stime ~0s, Rss
> >         ~12760, inblocks ~36800, outblocks ~111983
> >          >> Completed test examples
> >          >> 21:06 master= /lustre/atlas/proj-shared/geo127/petsc$
> >          >
> >
>
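Barry's valgrind suggestion above can be sketched as a small wrapper script. The binary path and valgrind flags here are illustrative assumptions, not from the thread (and on a Cray like Titan the run would normally go through `aprun` on a compute node):

```shell
#!/bin/sh
# Sketch: run the optimized build of ex19 under valgrind's memcheck and
# make memory errors fail the run, so optimized vs. debug behavior can be
# compared. EX19 and the flags are illustrative.
EX19=${EX19:-./ex19}     # built in src/snes/examples/tutorials
if command -v valgrind >/dev/null 2>&1 && [ -x "$EX19" ]; then
    valgrind --track-origins=yes --error-exitcode=99 "$EX19"
else
    echo "skipping: valgrind or $EX19 not available"
fi
```

`--track-origins=yes` points uninitialized-value errors back to where the memory was allocated, which is usually what distinguishes a real bug from an optimizer-induced one.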