<div dir="ltr"><div dir="ltr">Still getting this error with downloaded lapack. I sent the logs on the other thread.<div><br></div><div><br></div><div><div>18:02 master= /lustre/atlas/proj-shared/geo127/petsc$ make PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda PETSC_ARCH="" test</div><div>Running test examples to verify correct installation</div><div>Using PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda and PETSC_ARCH=</div><div>*******************Error detected during compile or link!*******************</div><div>See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a></div><div>/lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials ex19</div><div>*********************************************************************************</div><div>cc -o ex19.o -c -O   -I/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/include    `pwd`/ex19.c</div><div>cc -O  -o ex19 ex19.o  -L/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib -L/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl</div><div>/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o): In function `PetscDLOpen':</div><div>dlimpl.c:(.text+0x3b): warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking</div><div>/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib/libpetsc.a(send.o): In function `PetscOpenSocket':</div><div>send.c:(.text+0x3be): warning: Using 'gethostbyname' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking</div><div>true ex19</div><div>rm ex19.o</div><div>Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process</div><div>See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a></div><div>lid velocity = 0.0016, prandtl # = 1., grashof # = 1.</div><div>Number of SNES iterations = 2</div><div>Application 19080270 resources: utime ~0s, stime ~1s, Rss ~72056, inblocks ~19397, outblocks ~51049</div><div>Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes</div><div>See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a></div><div>lid velocity = 0.0016, prandtl # = 1., grashof # = 1.</div><div>[1]PETSC ERROR: [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------</div><div>--------------------- Error Message --------------------------------------------------------------</div><div>[1]PETSC ERROR: [0]PETSC ERROR: Error in external library</div><div>Error in external library</div><div>[1]PETSC ERROR: [0]PETSC ERROR: Error in LAPACK routine 0</div><div>Error in LAPACK routine 0</div><div>[1]PETSC ERROR: [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.</div><div>See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.</div><div>[1]PETSC ERROR: [0]PETSC ERROR: Petsc Development GIT revision: 
v3.10.2-461-g0ed19bb123  GIT Date: 2018-10-29 13:43:53 +0100</div><div>Petsc Development GIT revision: v3.10.2-461-g0ed19bb123  GIT Date: 2018-10-29 13:43:53 +0100</div><div>[1]PETSC ERROR: [0]PETSC ERROR: ./ex19 on a  named nid08331 by adams Mon Oct 29 18:07:59 2018</div><div>./ex19 on a  named nid08331 by adams Mon Oct 29 18:07:59 2018</div><div>[1]PETSC ERROR: [0]PETSC ERROR: Configure options --with-cudac=1 --with-batch=0 --prefix=/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda --download-hypre --download-metis --download-parmetis --download-fblaslapack --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-fc=ftn --with-fortranlib-autodetect=0 --with-shared-libraries=0 --known-mpi-shared-libraries=1 --with-mpiexec=aprun --with-x=0 --with-64-bit-indices --with-debugging=0 PETSC_ARCH=arch-titan-opt64idx-gnu-cuda PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc</div><div>Configure options --with-cudac=1 --with-batch=0 --prefix=/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda --download-hypre --download-metis --download-parmetis --download-fblaslapack --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-fc=ftn --with-fortranlib-autodetect=0 --with-shared-libraries=0 --known-mpi-shared-libraries=1 --with-mpiexec=aprun --with-x=0 --with-64-bit-indices --with-debugging=0 PETSC_ARCH=arch-titan-opt64idx-gnu-cuda PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc</div><div>[1]PETSC ERROR: [0]PETSC ERROR: #1 KSPComputeEigenvalues_GMRES() line 144 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/gmreig.c</div><div>#1 KSPComputeEigenvalues_GMRES() line 144 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/gmreig.c</div><div>[1]PETSC ERROR: #2 KSPComputeEigenvalues() line 132 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #2 KSPComputeEigenvalues() line 132 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c</div><div>#3 KSPChebyshevComputeExtremeEigenvalues_Private() line 288 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/cheby/cheby.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #3 KSPChebyshevComputeExtremeEigenvalues_Private() line 288 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/cheby/cheby.c</div><div>#4 KSPSolve_Chebyshev() line 390 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/cheby/cheby.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #4 KSPSolve_Chebyshev() line 390 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/cheby/cheby.c</div><div>#5 KSPSolve() line 780 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #5 KSPSolve() line 780 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c</div><div>#6 PCMGMCycle_Private() line 20 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/impls/mg/mg.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #6 PCMGMCycle_Private() line 20 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/impls/mg/mg.c</div><div>#7 PCApply_MG() line 377 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/impls/mg/mg.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #7 PCApply_MG() line 377 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/impls/mg/mg.c</div><div>#8 PCApply() line 462 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/interface/precon.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #8 PCApply() line 462 in 
/lustre/atlas1/geo127/proj-shared/petsc/src/ksp/pc/interface/precon.c</div><div>#9 KSP_PCApply() line 281 in /lustre/atlas/proj-shared/geo127/petsc/include/petsc/private/kspimpl.h</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #9 KSP_PCApply() line 281 in /lustre/atlas/proj-shared/geo127/petsc/include/petsc/private/kspimpl.h</div><div>#10 KSPFGMRESCycle() line 166 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #10 KSPFGMRESCycle() line 166 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c</div><div>#11 KSPSolve_FGMRES() line 291 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #11 KSPSolve_FGMRES() line 291 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/impls/gmres/fgmres/fgmres.c</div><div>#12 KSPSolve() line 780 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #12 KSPSolve() line 780 in /lustre/atlas1/geo127/proj-shared/petsc/src/ksp/ksp/interface/itfunc.c</div><div>#13 SNESSolve_NEWTONLS() line 224 in /lustre/atlas1/geo127/proj-shared/petsc/src/snes/impls/ls/ls.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #13 SNESSolve_NEWTONLS() line 224 in /lustre/atlas1/geo127/proj-shared/petsc/src/snes/impls/ls/ls.c</div><div>#14 SNESSolve() line 4396 in /lustre/atlas1/geo127/proj-shared/petsc/src/snes/interface/snes.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #14 SNESSolve() line 4396 in /lustre/atlas1/geo127/proj-shared/petsc/src/snes/interface/snes.c</div><div>#15 main() line 161 in /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials/ex19.c</div><div>[0]PETSC ERROR: [1]PETSC ERROR: #15 main() line 161 in /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials/ex19.c</div><div>PETSc Option Table entries:</div><div>[0]PETSC ERROR: [1]PETSC ERROR: PETSc Option Table entries:</div><div>-da_refine 3</div><div>[0]PETSC ERROR: [1]PETSC ERROR: -da_refine 3</div><div>-ksp_type fgmres</div><div>[0]PETSC ERROR: [1]PETSC ERROR: -ksp_type fgmres</div><div>-pc_type mg</div><div>[0]PETSC ERROR: -pc_type mg</div><div>[1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------</div><div>[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------</div><div>Rank 0 [Mon Oct 29 18:07:59 2018] [c20-3c1s5n3] application called MPI_Abort(MPI_COMM_WORLD, 76) - process 0</div><div>Rank 1 [Mon Oct 29 18:07:59 2018] [c20-3c1s5n3] application called MPI_Abort(MPI_COMM_WORLD, 76) - process 1</div><div>_pmiu_daemon(SIGCHLD): [NID 08331] [c20-3c1s5n3] [Mon Oct 29 18:07:59 2018] PE RANK 1 exit signal Aborted</div><div>Application 19080271 exit codes: 134</div><div>Application 19080271 resources: utime ~0s, stime ~1s, Rss ~72056, inblocks ~19405, outblocks ~51055</div><div>5a6</div><div>> Application 19080272 resources: utime ~1s, stime ~1s, Rss ~72056, inblocks ~19420, outblocks ~51049</div><div>/lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials</div><div>Possible problem with ex19_hypre, diffs above</div><div>=========================================</div><div>*******************Error detected during compile or link!*******************</div><div>See <a 
href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a></div><div>/lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials ex5f</div><div>*********************************************************</div><div>ftn -c -O    -I/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/include    -o ex5f.o ex5f.F90</div><div>ftn -O   -o ex5f ex5f.o  -L/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib -L/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl</div><div>/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o): In function `PetscDLOpen':</div><div>dlimpl.c:(.text+0x3b): warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking</div><div>/lustre/atlas/proj-shared/geo127/petsc_titan_opt64idx_gnu_cuda/lib/libpetsc.a(send.o): In function `PetscOpenSocket':</div><div>send.c:(.text+0x3be): warning: Using 'gethostbyname' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking</div><div>rm ex5f.o</div><div>Possible error running Fortran example src/snes/examples/tutorials/ex5f with 1 MPI process</div></div></div></div>