On Mon, Oct 29, 2018 at 6:55 PM Smith, Barry F. <bsmith@mcs.anl.gov> wrote:
>    Here is the code
>
>   PetscStackCallBLAS("LAPACKgeev",LAPACKgeev_("N","N",&bn,R,&bN,realpart,imagpart,&sdummy,&idummy,&sdummy,&idummy,work,&lwork,&lierr));
>   if (lierr) SETERRQ1(PETSC_COMM_SELF,PETSC_ERR_LIB,"Error in LAPACK routine %d",(int)lierr);
>
>    What is unfathomable is that it prints (int)lierr as 0, in which case the if () test should not have been satisfied.
>
>    Do a ./configure with debugging turned on; this could be an optimizing-compiler error.

Configuring debug now.

Note, I was able to run ex56 (ksp), which does not use GMRES. This error came from a GMRES method, so it may be an isolated problem.
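For what it's worth, here is a minimal sketch of one mechanism that can produce exactly this symptom, i.e. the if (lierr) branch being taken while the (int) cast used for %d prints 0. It assumes, purely hypothetically and not confirmed for this build, that the LAPACK routine writes a 32-bit INTEGER status while the caller's lierr is 64 bits wide, so stale upper bits survive the call. Barry's optimizing-compiler theory is another possibility; this only illustrates the integer-width class of mismatch.

  #include <stdio.h>
  #include <string.h>
  #include <stdint.h>

  /* Hypothetical illustration only: a "LAPACK-like" routine that writes a
     32-bit status into storage the caller treats as a 64-bit integer.
     Only the low 4 bytes are updated; any stale upper bits survive. */
  static void fake_geev_info32(void *info)
  {
    int32_t ok = 0;                   /* the routine itself succeeded */
    memcpy(info, &ok, sizeof(ok));    /* writes 4 of the caller's 8 bytes */
  }

  int main(void)
  {
    int64_t lierr = (int64_t)1 << 32; /* stale garbage in the upper 32 bits */
    fake_geev_info32(&lierr);
    /* On a little-endian machine lierr is now 0x100000000: nonzero, so the
       error branch fires, yet the (int) cast used for printing gives 0. */
    if (lierr) printf("Error in LAPACK routine %d\n", (int)lierr);
    return 0;
  }

If something like this were going on, the outcome would also depend on whatever stale data happens to sit in the upper bytes, which could explain why some examples pass and others abort.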
>
>    Barry
>
> > On Oct 29, 2018, at 3:56 PM, Mark Adams via petsc-dev <petsc-dev@mcs.anl.gov> wrote:
> >
> > I get this error running the tests using GPUs; it is an error in an LAPACK routine.
> >
> > 16:50 master= /lustre/atlas/proj-shared/geo127/petsc$ make PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda PETSC_ARCH="" test
> > Running test examples to verify correct installation
> > Using PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda and PETSC_ARCH=
> > *******************Error detected during compile or link!*******************
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials ex19
> > *********************************************************************************
> > cc -o ex19.o -c -O   -I/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/include    `pwd`/ex19.c
> > cc -O  -o ex19 ex19.o  -L/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib -L/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib -lpetsc -lHYPRE -lparmetis -lmetis -ldl
> > /lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o): In function `PetscDLOpen':
> > dlimpl.c:(.text+0x3b): warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> > /lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib/libpetsc.a(send.o): In function `PetscOpenSocket':
> > send.c:(.text+0x3be): warning: Using 'gethostbyname' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
> > true ex19
> > rm ex19.o
> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> > Number of SNES iterations = 2
> > Application 19079964 resources: utime ~1s, stime ~1s, Rss ~29412, inblocks ~37563, outblocks ~131654
> > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> > [1]PETSC ERROR: [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > --------------------- Error Message --------------------------------------------------------------
> > [1]PETSC ERROR: [0]PETSC ERROR: Error in external library
> > Error in external library
> > [1]PETSC ERROR: [0]PETSC ERROR: Error in LAPACK routine 0
> > Error in LAPACK routine 0
> > [1]PETSC ERROR: [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [1]PETSC ERROR: [0]PETSC ERROR: Petsc Development GIT revision: v3.10.2-461-g0ed19bb123  GIT Date: 2018-10-29 13:43:53 +0100
> > Petsc Development GIT revision: v3.10.2-461-g0ed19bb123  GIT Date: 2018-10-29 13:43:53 +0100
> > [1]PETSC ERROR: [0]PETSC ERROR: ./ex19 on a  named nid16438 by adams Mon Oct 29 16:52:05 2018
> > ./ex19 on a  named nid16438 by adams Mon Oct 29 16:52:05 2018
> > [1]PETSC ERROR: [0]PETSC ERROR: Configure options --with-cudac=1 --with-batch=0 --prefix=/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda --download-hypre --download-metis --download-parmetis --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-fc=ftn --with-fortranlib-autodetect=0 --with-shared-libraries=0 --known-mpi-shared-libraries=1 --with-mpiexec=aprun --with-x=0 --with-64-bit-indices --with-debugging=0 PETSC_ARCH=arch-titan-opt64idx-gnu-cuda PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc
> > Configure options --with-cudac=1 --with-batch=0 --prefix=/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda --download-hypre --download-metis --download-parmetis --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-fc=ftn --with-fortranlib-autodetect=0 --with-shared-libraries=0 --known-mpi-shared-libraries=1 --with-mpiexec=aprun --with-x=0 --with-64-bit-indices --with-debugging=0 PETSC_ARCH=arch-titan-opt64idx-gnu-cuda PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc
> > [1]PETSC ERROR: [0]PETSC ERROR: #1 KSPComputeEigenvalues_GMRES() line 144 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/gmreig.c
> > #1 KSPComputeEigenvalues_GMRES() line 144 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/gmreig.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #2 KSPComputeEigenvalues() line 132 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c
> > #2 KSPComputeEigenvalues() line 132 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #3 KSPChebyshevComputeExtremeEigenvalues_Private() line 288 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/cheby/cheby.c
> > #3 KSPChebyshevComputeExtremeEigenvalues_Private() line 288 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/cheby/cheby.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #4 KSPSolve_Chebyshev() line 390 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/cheby/cheby.c
> > #4 KSPSolve_Chebyshev() line 390 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/cheby/cheby.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #5 KSPSolve() line 780 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c
> > #5 KSPSolve() line 780 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #6 PCMGMCycle_Private() line 20 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/impls/mg/mg.c
> > #6 PCMGMCycle_Private() line 20 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/impls/mg/mg.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #7 PCApply_MG() line 377 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/impls/mg/mg.c
> > #7 PCApply_MG() line 377 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/impls/mg/mg.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #8 PCApply() line 462 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/interface/precon.c
> > #8 PCApply() line 462 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/interface/precon.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #9 KSP_PCApply() line 281 in /lustre/atlas/proj-shared/geo127/petsc/include/petsc/private/kspimpl.h
> > #9 KSP_PCApply() line 281 in /lustre/atlas/proj-shared/geo127/petsc/include/petsc/private/kspimpl.h
> > [1]PETSC ERROR: [0]PETSC ERROR: #10 KSPFGMRESCycle() line 166 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> > #10 KSPFGMRESCycle() line 166 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #11 KSPSolve_FGMRES() line 291 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> > #11 KSPSolve_FGMRES() line 291 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #12 KSPSolve() line 780 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c
> > #12 KSPSolve() line 780 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #13 SNESSolve_NEWTONLS() line 224 in /lustre/atlas1/geo127/proj-shared/petsc/src/snes/impls/ls/ls.c
> > #13 SNESSolve_NEWTONLS() line 224 in /lustre/atlas1/geo127/proj-shared/petsc/src/snes/impls/ls/ls.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #14 SNESSolve() line 4396 in /lustre/atlas1/geo127/proj-shared/petsc/src/snes/interface/snes.c
> > #14 SNESSolve() line 4396 in /lustre/atlas1/geo127/proj-shared/petsc/src/snes/interface/snes.c
> > [1]PETSC ERROR: [0]PETSC ERROR: #15 main() line 161 in /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials/ex19.c
> > #15 main() line 161 in /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials/ex19.c
> > [1]PETSC ERROR: [0]PETSC ERROR: PETSc Option Table entries:
> > PETSc Option Table entries:
> > [1]PETSC ERROR: [0]PETSC ERROR: -da_refine 3
> > -da_refine 3
> > [1]PETSC ERROR: [0]PETSC ERROR: -ksp_type fgmres
> > -ksp_type fgmres
> > [1]PETSC ERROR: [0]PETSC ERROR: -pc_type mg
> > -pc_type mg
> > [1]PETSC ERROR: [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> > ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> > Rank 0 [Mon Oct 29 16:52:05 2018] [c7-7c1s4n2] application called MPI_Abort(MPI_COMM_WORLD, 76) - process 0
> > Rank 1 [Mon Oct 29 16:52:05 2018] [c7-7c1s4n2] application called MPI_Abort(MPI_COMM_WORLD, 76) - process 1
> > _pmiu_daemon(SIGCHLD): [NID 16438] [c7-7c1s4n2] [Mon Oct 29 16:52:05 2018] PE RANK 1 exit signal Aborted
> > Application 19079965 exit codes: 134
> > Application 19079965 resources: utime ~1s, stime ~1s, Rss ~29412, inblocks ~37571, outblocks ~131660
> > 5a6
> > > Application 19079968 resources: utime ~1s, stime ~1s, Rss ~29412, inblocks ~37586, outblocks ~131654
> >
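As an aside on the "Error in LAPACK routine 0" lines in the log above: a small standalone program that calls the same geev path directly (jobvl = jobvr = "N", as in gmreig.c) can help separate a broken system LAPACK from a problem in the calling code or build. This is only a hedged sketch, not something from the thread; it assumes the Fortran symbol dgeev_ with 32-bit INTEGER arguments and must be linked against the same LAPACK/libsci the failing build used, with the integer width adjusted if that library uses 64-bit indices.

  /* Standalone check of dgeev with jobvl = jobvr = "N", independent of PETSc.
     Assumes the common Fortran convention with 32-bit INTEGERs; change
     blasint (and the link line) to match the LAPACK actually in use. */
  #include <stdio.h>

  typedef int blasint;              /* assumption: 32-bit Fortran INTEGER */

  extern void dgeev_(const char *jobvl, const char *jobvr, blasint *n, double *a,
                     blasint *lda, double *wr, double *wi, double *vl, blasint *ldvl,
                     double *vr, blasint *ldvr, double *work, blasint *lwork,
                     blasint *info);

  int main(void)
  {
    double  a[4] = {4.0, 1.0, 2.0, 3.0};   /* column-major 2x2 matrix */
    double  wr[2], wi[2], work[32], vdummy;
    blasint n = 2, lda = 2, ldv = 1, lwork = 32, info = -1;

    dgeev_("N", "N", &n, a, &lda, wr, wi, &vdummy, &ldv, &vdummy, &ldv,
           work, &lwork, &info);
    printf("info = %d  eigenvalues = %g%+gi, %g%+gi\n",
           (int)info, wr[0], wi[0], wr[1], wi[1]);
    return info != 0;
  }

A nonzero or garbage info here would point at the library or the calling convention; info = 0 with sensible eigenvalues (5 and 2 for this matrix) would put the suspicion back on the PETSc build itself, e.g. the compiler-optimization issue Barry raised.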