[petsc-dev] Error running on Titan with GPUs & GNU
Mark Adams
mfadams at lbl.gov
Wed Oct 31 01:58:37 CDT 2018
I get an error with --with-cuda=1
On Tue, Oct 30, 2018 at 4:44 PM Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
> --with-cudac=1 should be --with-cuda=1
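>
> A minimal configure sketch with the corrected flag (the extra options below are assumptions inferred from this build's arch name and linked packages, not taken from the thread):
>
>   ./configure --with-cuda=1 --with-64-bit-indices \
>     --download-hypre --download-metis --download-parmetis
>
> Note that --with-cudac=<compiler> is a different option that names the CUDA compiler (e.g. --with-cudac=nvcc), so setting it to 1 does not enable the CUDA back end.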
>
>
>
> > On Oct 30, 2018, at 12:35 PM, Smith, Barry F. via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
> >
> >
> >
> >> On Oct 29, 2018, at 8:09 PM, Mark Adams <mfadams at lbl.gov> wrote:
> >>
> >> And a debug build seems to work:
> >
> > Well ok.
> >
> > Are there newer versions of the GNU compiler for this system? Are there any other compilers on the system that would likely be less buggy? IBM compilers? If this simple code generates a gross error with optimization, who's to say how many more subtle bugs the buggy optimizer may induce in the library (there may be none, but IMHO probability says there will be others).
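> >
> > A sketch of how one might hunt for an alternative toolchain on a Cray system like this (module names and versions are hypothetical):
> >
> >   module avail gcc                     # list the GCC versions installed
> >   module swap gcc gcc/7.3.0            # try a newer GCC, then reconfigure and rebuild
> >   module avail PrgEnv                  # see which vendor environments exist
> >   module swap PrgEnv-gnu PrgEnv-cray   # switch the whole toolchain over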
> >
> > Is there any chance that valgrind runs on this machine? You could run the optimized version through it and see what it says.
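> >
> > For example (the valgrind flags are standard Memcheck options; launching under aprun is an assumption about this Cray system):
> >
> >   aprun -n 1 valgrind --track-origins=yes --leak-check=full ./ex19
> >
> > Memcheck would flag the uninitialized reads or invalid accesses that a miscompilation tends to produce.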
> >
> > Barry
> >
> >>
> >> 21:04 1 master= /lustre/atlas/proj-shared/geo127/petsc$ make PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda PETSC_ARCH="" test
> >> Running test examples to verify correct installation
> >> Using PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda and PETSC_ARCH=
> >> *******************Error detected during compile or link!*******************
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials ex19
> >> *********************************************************************************
> >> cc -o ex19.o -c -g -I/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/include `pwd`/ex19.c
> >> cc -g -o ex19 ex19.o -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl
> >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o): In function `PetscDLOpen':
> >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/dll/dlimpl.c:108: warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(send.o): In function `PetscOpenSocket':
> >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/classes/viewer/impls/socket/send.c:108: warning: Using 'gethostbyname' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> >> true ex19
> >> rm ex19.o
> >> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> >> Number of SNES iterations = 2
> >> Application 19081049 resources: utime ~1s, stime ~1s, Rss ~17112, inblocks ~36504, outblocks ~111043
> >> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> >> Number of SNES iterations = 2
> >> Application 19081050 resources: utime ~1s, stime ~1s, Rss ~19816, inblocks ~36527, outblocks ~111043
> >> 5a6
> >>> Application 19081051 resources: utime ~1s, stime ~0s, Rss ~13864, inblocks ~36527, outblocks ~111043
> >> /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials
> >> Possible problem with ex19_hypre, diffs above
> >> =========================================
> >> *******************Error detected during compile or link!*******************
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials ex5f
> >> *********************************************************
> >> ftn -c -g -I/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/include -o ex5f.o ex5f.F90
> >> ftn -g -o ex5f ex5f.o -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl
> >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o): In function `PetscDLOpen':
> >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/dll/dlimpl.c:108: warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(send.o): In function `PetscOpenSocket':
> >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/classes/viewer/impls/socket/send.c:108: warning: Using 'gethostbyname' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> >> rm ex5f.o
> >> Possible error running Fortran example src/snes/examples/tutorials/ex5f with 1 MPI process
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> Number of SNES iterations = 4
> >> Application 19081055 resources: utime ~1s, stime ~0s, Rss ~12760, inblocks ~36800, outblocks ~111983
> >> Completed test examples
> >> 21:06 master= /lustre/atlas/proj-shared/geo127/petsc$
> >
>
>
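P.S. on the ex19_hypre diff above: "5a6" is diff(1) append notation; it says line 6 of the actual output, the Cray "Application ... resources" accounting line, has no counterpart in the expected output, so the numerical results of the hypre run matched. A sketch of rerunning just that case by hand (runex19_hypre follows the old makefile test naming; the exact target and options may vary by PETSc version):

  cd src/snes/examples/tutorials
  make PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda PETSC_ARCH="" runex19_hypre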
[Attachment: configure.log (3546811 bytes): <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20181031/04a2e532/attachment-0001.obj>]