<div dir="ltr"><div dir="ltr">It looks like configure is not finding the correct cc. It does not seem hard to find. <div><br></div><div><div>06:37 master= /lustre/atlas/proj-shared/geo127/petsc$ cc --version</div><div>gcc (GCC) 6.3.0 20161221 (Cray Inc.)</div><div>Copyright (C) 2016 Free Software Foundation, Inc.</div><div>This is free software; see the source for copying conditions. There is NO</div><div>warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.</div><div><br></div><div>06:37 master= /lustre/atlas/proj-shared/geo127/petsc$ which cc</div><div>/opt/cray/craype/2.5.13/bin/cc</div><div>06:38 master= /lustre/atlas/proj-shared/geo127/petsc$ which gcc</div><div>/opt/gcc/6.3.0/bin/gcc</div><div><br></div></div></div></div><br><div class="gmail_quote"><div dir="ltr">On Wed, Oct 31, 2018 at 6:34 AM Mark Adams <<a href="mailto:mfadams@lbl.gov">mfadams@lbl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><br><br><div class="gmail_quote"><div dir="ltr">On Wed, Oct 31, 2018 at 5:05 AM Karl Rupp <<a href="mailto:rupp@iue.tuwien.ac.at" target="_blank">rupp@iue.tuwien.ac.at</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hi Mark,<br>

please comment or remove lines 83 and 84 in
config/BuildSystem/config/packages/cuda.py
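
One quick way to do that, assuming GNU sed (editing the two lines by hand works just as well):

  sed -i '83,84 s/^/# /' config/BuildSystem/config/packages/cuda.py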

Is there a compiler newer than GCC 4.3 available?

You mean 6.3?

06:33 ~$ module avail gcc

----------------------------------------------------- /opt/modulefiles -----------------------------------------------------
gcc/4.8.1    gcc/4.9.3    gcc/6.1.0    gcc/6.3.0(default)    gcc/7.2.0
gcc/4.8.2    gcc/5.3.0    gcc/6.2.0    gcc/7.1.0             gcc/7.3.0
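
Presumably swapping to the newest one and re-running configure would look something like this (untested sketch; the configure line only shows pointing explicitly at the Cray compiler wrappers, with the remaining configure options omitted):

  module swap gcc/6.3.0 gcc/7.3.0
  cc --version
  ./configure --with-cc=cc --with-cxx=CC --with-fc=ftn --with-cuda=1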

Best regards,
Karli

On 10/31/18 8:15 AM, Mark Adams via petsc-dev wrote:
> After loading a cuda module ...
>
> On Wed, Oct 31, 2018 at 2:58 AM Mark Adams <mfadams@lbl.gov> wrote:
>
> I get an error with --with-cuda=1
>
> On Tue, Oct 30, 2018 at 4:44 PM Smith, Barry F. <bsmith@mcs.anl.gov> wrote:
>
> --with-cudac=1 should be --with-cuda=1
>
>
>
> > On Oct 30, 2018, at 12:35 PM, Smith, Barry F. via petsc-dev <petsc-dev@mcs.anl.gov> wrote:
> >
> >
> >
> >> On Oct 29, 2018, at 8:09 PM, Mark Adams <mfadams@lbl.gov> wrote:
> >>
> >> And a debug build seems to work:
> >
> > Well ok.
> >
> > Are there newer versions of the Gnu compiler for this system? Are there
> > any other compilers on the system that would likely be less buggy? IBM
> > compilers? If this simple code generates a gross error with optimization
> > who's to say how many more subtle bugs may be induced in the library by
> > the buggy optimizer (there may be none but IMHO probability says there
> > will be others).
> >
> > Is there any chance that valgrind runs on this machine; you could run
> > the optimized version through it and see what it says.
> >
> > Barry
> >
> >>
> >> 21:04 1 master= /lustre/atlas/proj-shared/geo127/petsc$ make PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda PETSC_ARCH="" test
> >> Running test examples to verify correct installation
> >> Using PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda and PETSC_ARCH=
> >> *******************Error detected during compile or link!*******************
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials ex19
> >> *********************************************************************************
> >> cc -o ex19.o -c -g -I/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/include `pwd`/ex19.c
> >> cc -g -o ex19 ex19.o -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl
> >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o): In function `PetscDLOpen':
> >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/dll/dlimpl.c:108: warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(send.o): In function `PetscOpenSocket':
> >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/classes/viewer/impls/socket/send.c:108: warning: Using 'gethostbyname' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> >> true ex19
> >> rm ex19.o
> >> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> >> Number of SNES iterations = 2
> >> Application 19081049 resources: utime ~1s, stime ~1s, Rss ~17112, inblocks ~36504, outblocks ~111043
> >> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> >> Number of SNES iterations = 2
> >> Application 19081050 resources: utime ~1s, stime ~1s, Rss ~19816, inblocks ~36527, outblocks ~111043
> >> 5a6
> >>> Application 19081051 resources: utime ~1s, stime ~0s, Rss ~13864, inblocks ~36527, outblocks ~111043
> >> /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials
> >> Possible problem with ex19_hypre, diffs above
> >> =========================================
> >> *******************Error detected during compile or link!*******************
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials ex5f
> >> *********************************************************
> >> ftn -c -g -I/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/include -o ex5f.o ex5f.F90
> >> ftn -g -o ex5f ex5f.o -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl
> >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o): In function `PetscDLOpen':
> >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/dll/dlimpl.c:108: warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(send.o): In function `PetscOpenSocket':
> >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/classes/viewer/impls/socket/send.c:108: warning: Using 'gethostbyname' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> >> rm ex5f.o
> >> Possible error running Fortran example src/snes/examples/tutorials/ex5f with 1 MPI process
> >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> >> Number of SNES iterations = 4
> >> Application 19081055 resources: utime ~1s, stime ~0s, Rss ~12760, inblocks ~36800, outblocks ~111983
> >> Completed test examples
> >> 21:06 master= /lustre/atlas/proj-shared/geo127/petsc$
> >
>
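
On Barry's valgrind suggestion above: if valgrind is available on the compute nodes it would have to go through aprun, something like the following (untested sketch; ./ex19 is just the tutorials executable built by the test run above):

  aprun -n 1 valgrind -q --tool=memcheck ./ex19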