<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Thu, Mar 15, 2018 at 5:06 PM, Manuel Valera <span dir="ltr"><<a href="mailto:mvalera-w@mail.sdsu.edu" target="_blank">mvalera-w@mail.sdsu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Ok Matthew and everyone, i finally made my small example work, im sending the ksp_view and log_view for feedback / check up if it seems correct.</div></blockquote><div><br></div><div>It looks fine. Of course the time is dominated by kernel launch latency since its small.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div>What i would like to do next is explore how fast this setup solves my linear systems, for it i could fire up the whole big model or i could extract the binaries and solve with this standalone, i know how do extract a matrix binary, is there an analog for a vector?</div></div></blockquote><div><br></div><div>I do not see the advantage in involving disk.</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div> i would like to test my system for scaling and then look at multi-node scaling as well as multi-gpu, any ideas would be welcome,</div><div><br></div><div>Thanks,</div><div><br></div><div><br></div><div>.-.-.-.-.-.- ksp_view and log_view .-.-.-.-.-.-</div><div><br></div><div><div>[valera@node50 alone]$ mpirun -n 1 ./linsolve -vec_type seqcuda -mat_type seqaijcusparse -ksp_view -log_view</div><div> laplacian.petsc !</div><div> TrivSoln loaded, size: 125 / 125</div><div>KSP Object: 1 MPI processes</div><div> type: gcr</div><div> restart = 30 </div><div> restarts performed = 1 </div><div> maximum iterations=10000, initial guess is zero</div><div> tolerances: relative=1e-11, absolute=1e-50, 
divergence=10000.</div><div> right preconditioning</div><div> using UNPRECONDITIONED norm type for convergence test</div><div>PC Object: 1 MPI processes</div><div> type: ilu</div><div> out-of-place factorization</div><div> 0 levels of fill</div><div> tolerance for zero pivot 2.22045e-14</div><div> matrix ordering: natural</div><div> factor fill ratio given 1., needed 1.</div><div> Factored matrix follows:</div><div> Mat Object: 1 MPI processes</div><div> type: seqaij</div><div> rows=125, cols=125</div><div> package used to perform factorization: petsc</div><div> total: nonzeros=1685, allocated nonzeros=1685</div><div> total number of mallocs used during MatSetValues calls =0</div><div> not using I-node routines</div><div> linear system matrix = precond matrix:</div><div> Mat Object: 1 MPI processes</div><div> type: seqaijcusparse</div><div> rows=125, cols=125</div><div> total: nonzeros=1685, allocated nonzeros=1685</div><div> total number of mallocs used during MatSetValues calls =0</div><div> not using I-node routines</div><div> soln maxval: 1.0000000000007194 </div><div> soln minval: 0.99999999999484046 </div><div> Norm: 1.3586823299239453E-011</div><div> Its: 4</div><div>******************************<wbr>******************************<wbr>******************************<wbr>******************************</div><div>*** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document ***</div><div>******************************<wbr>******************************<wbr>******************************<wbr>******************************</div><div><br></div><div>------------------------------<wbr>---------------- PETSc Performance Summary: ------------------------------<wbr>----------------</div></div><div><br></div><div><div>./linsolve on a named node50 with 1 processor, by valera Thu Mar 15 14:04:47 2018</div><div>Using Petsc Development GIT revision: v3.8.3-2027-g045eeab GIT Date: 2018-03-12 13:30:25 -0500</div><div><br></div><div> Max Max/Min Avg Total </div><div>Time (sec): 7.459e-01 1.00000 7.459e-01</div><div>Objects: 7.500e+01 1.00000 7.500e+01</div><div>Flop: 6.024e+04 1.00000 6.024e+04 6.024e+04</div><div>Flop/sec: 8.076e+04 1.00000 8.076e+04 8.076e+04</div><div>MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00</div><div>MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00</div><div>MPI Reductions: 0.000e+00 0.00000</div><div><br></div><div>Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)</div><div> e.g., VecAXPY() for real vectors of length N --> 2N flop</div><div> and VecAXPY() for complex vectors of length N --> 8N flop</div><div><br></div><div>Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions --</div><div> Avg %Total Avg %Total counts %Total Avg %Total counts %Total </div><div> 0: Main Stage: 7.4589e-01 100.0% 6.0240e+04 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% </div><div><br></div><div>------------------------------<wbr>------------------------------<wbr>------------------------------<wbr>------------------------------</div><div>See the 'Profiling' chapter of the users' manual for details on interpreting output.</div><div>Phase summary info:</div><div> Count: number of times phase was executed</div><div> Time and Flop: Max - maximum over all processors</div><div> 
Ratio - ratio of maximum to minimum over all processors</div><div> Mess: number of messages sent</div><div> Avg. len: average message length (bytes)</div><div> Reduct: number of global reductions</div><div> Global: entire computation</div><div> Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().</div><div> %T - percent time in this phase %F - percent flop in this phase</div><div> %M - percent messages in this phase %L - percent message lengths in this phase</div><div> %R - percent reductions in this phase</div><div> Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors)</div><div>------------------------------<wbr>------------------------------<wbr>------------------------------<wbr>------------------------------</div><div>Event Count Time (sec) Flop --- Global --- --- Stage --- Total</div><div> Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s</div><div>------------------------------<wbr>------------------------------<wbr>------------------------------<wbr>------------------------------</div><div><br></div><div>--- Event Stage 0: Main Stage</div><div><br></div><div>VecDotNorm2 4 1.0 2.0170e-04 1.0 3.99e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 7 0 0 0 0 7 0 0 0 20</div><div>VecMDot 3 1.0 1.4329e-04 1.0 1.49e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 10</div></div><div>VecNorm 6 1.0 3.7265e-04 1.0 1.49e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 4<br></div><div><div>VecScale 8 1.0 6.8903e-05 1.0 1.00e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 15</div><div>VecSet 74 1.0 6.0844e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>VecAXPY 9 1.0 9.3699e-05 1.0 2.25e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 4 0 0 0 0 4 0 0 0 24</div><div>VecAYPX 1 1.0 2.8685e-01 1.0 2.50e+02 1.0 0.0e+00 0.0e+00 0.0e+00 38 0 0 0 0 38 0 0 0 0 0</div><div>VecMAXPY 6 1.0 1.2016e-04 1.0 6.00e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 10 0 0 0 0 10 0 0 0 
50</div><div>VecAssemblyBegin 3 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>VecAssemblyEnd 3 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>VecCUDACopyTo 5 1.0 2.8849e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>VecCUDACopyFrom 9 1.0 9.7275e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatMult 6 1.0 3.6573e-04 1.0 1.95e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 32 0 0 0 0 32 0 0 0 53</div><div>MatSolve 4 1.0 1.1349e-04 1.0 1.30e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 22 0 0 0 0 22 0 0 0 114</div><div>MatLUFactorNum 1 1.0 2.9087e-05 1.0 1.13e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 19 0 0 0 0 19 0 0 0 389</div><div>MatILUFactorSym 1 1.0 2.7180e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatAssemblyBegin 1 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatAssemblyEnd 1 1.0 4.5514e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatGetRowIJ 1 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatGetOrdering 1 1.0 8.5831e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatLoad 1 1.0 2.4009e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatView 2 1.0 1.6499e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatCUSPARSECopyTo 2 1.0 4.4584e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>PCSetUp 1 1.0 1.8096e-04 1.0 1.13e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 19 0 0 0 0 19 0 0 0 63</div><div>PCApply 4 1.0 1.1659e-04 1.0 1.30e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 22 0 0 0 0 22 0 0 0 111</div><div>KSPSetUp 1 1.0 2.2879e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>KSPSolve 1 1.0 2.8820e-01 1.0 4.52e+04 1.0 0.0e+00 0.0e+00 0.0e+00 39 75 0 0 0 39 75 0 0 0 
0</div><div>------------------------------<wbr>------------------------------<wbr>------------------------------<wbr>------------------------------</div></div><div><br></div><div><br></div><div><div>Memory usage is given in bytes:</div><div><br></div><div>Object Type Creations Destructions Memory Descendants' Mem.</div><div>Reports information only for process 0.</div><div><br></div><div>--- Event Stage 0: Main Stage</div><div><br></div><div> Vector 64 63 104280 0.</div><div> Matrix 2 2 51368 0.</div><div> Viewer 3 1 848 0.</div><div> Preconditioner 2 1 1016 0.</div><div> Krylov Solver 1 1 1248 0.</div><div> Index Set 3 3 3900 0.</div><div>==============================<wbr>==============================<wbr>==============================<wbr>==============================</div><div>Average time to get PetscTime(): 9.53674e-08</div><div>#PETSc Option Table entries:</div><div>-ksp_view</div><div>-log_view</div><div>-mat_type seqaijcusparse</div><div>-matload_block_size 1</div><div>-vec_type seqcuda</div><div>#End of PETSc Option Table entries</div><div>Compiled without FORTRAN kernels</div><div>Compiled with full precision matrices (default)</div><div>sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4</div><div>Configure options: --prefix=/usr/local/petsc.cod/<wbr>petsc-install --with-mpi-dir=/usr/lib64/<wbr>openmpi --with-blaslapack-dir=/usr/<wbr>lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2 FOPTFLAGS=-O2 --with-shared-libraries=1 --with-debugging=0 --with-cuda=1 --with-cuda-arch=sm_60</div><div>------------------------------<wbr>-----------</div><div>Libraries compiled on Wed Mar 14 14:53:02 2018 on node50 </div><div>Machine characteristics: Linux-3.10.0-693.17.1.el7.x86_<wbr>64-x86_64-with-centos-7.2.<wbr>1511-Core</div><div>Using PETSc directory: /usr/local/petsc.cod/petsc-<wbr>install</div><div>Using PETSc arch: </div><div>------------------------------<wbr>-----------</div><div><br></div><div>Using C compiler: 
/usr/lib64/openmpi/bin/mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -O2 </div><div>Using Fortran compiler: /usr/lib64/openmpi/bin/mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -O2 </div><div>------------------------------<wbr>-----------</div><div><br></div><div>Using include paths: -I/usr/local/petsc.cod/petsc-<wbr>install/include -I/usr/local/petsc.cod/petsc-<wbr>install//include -I/usr/local/cuda/include -I/usr/lib64/openmpi/include</div><div>------------------------------<wbr>-----------</div><div><br></div><div>Using C linker: /usr/lib64/openmpi/bin/mpicc</div><div>Using Fortran linker: /usr/lib64/openmpi/bin/mpif90</div><div>Using libraries: -Wl,-rpath,/usr/local/petsc.<wbr>cod/petsc-install/lib -L/usr/local/petsc.cod/petsc-<wbr>install/lib -lpetsc -Wl,-rpath,/usr/lib64 -L/usr/lib64 -Wl,-rpath,/usr/local/cuda/<wbr>lib64 -L/usr/local/cuda/lib64 -Wl,-rpath,/usr/lib64/openmpi/<wbr>lib -L/usr/lib64/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_<wbr>64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-<wbr>linux/4.8.5 -llapack -lblas -lm -lcufft -lcublas -lcudart -lcusparse -lX11 -lstdc++ -ldl -lmpi_usempi -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lstdc++ -ldl</div><div>------------------------------<wbr>-----------</div></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Mar 14, 2018 at 4:23 PM, Matthew Knepley <span dir="ltr"><<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><span>On Thu, Mar 15, 2018 at 8:18 AM, Manuel Valera <span dir="ltr"><<a href="mailto:mvalera-w@mail.sdsu.edu" target="_blank">mvalera-w@mail.sdsu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 
.8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Ok so, I went back and erased the old libpetsc.so.3; I think it was the one causing problems. I had --with-shared-libraries=0 and the installation complained that the file was missing, so I reinstalled with --with-shared-libraries=1 and it is finally recognizing my system installation with only CUDA, but now it gives a signal 11 (SEGV) violation error:</div></blockquote><div><br></div></span><div>But ex19 runs?</div><div><br></div><div>And this crash is without CUDA? Just run valgrind on it.</div><div><br></div><div> Matt</div><div><div class="m_-7147497785134130018h5"><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div><div>[valera@node50 alone]$ ./linsolve </div><div> laplacian.petsc !</div><div> TrivSoln loaded, size: 125 / 125</div><div>[0]PETSC ERROR: ------------------------------<wbr>------------------------------<wbr>------------</div><div>[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range</div><div>[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger</div><div>[0]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html#valgrind</a></div><div>[0]PETSC ERROR: or try <a href="http://valgrind.org" target="_blank">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors</div><div>[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run </div><div>[0]PETSC ERROR: to get more information on the crash.</div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Signal received</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" 
target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-2027-g045eeab GIT Date: 2018-03-12 13:30:25 -0500</div><div>[0]PETSC ERROR: ./linsolve on a named node50 by valera Wed Mar 14 16:17:34 2018</div><div>[0]PETSC ERROR: Configure options --prefix=/usr/local/petsc.cod/<wbr>petsc-install --with-mpi-dir=/usr/lib64/open<wbr>mpi --with-blaslapack-dir=/usr/lib<wbr>64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2 FOPTFLAGS=-O2 --with-shared-libraries=1 --with-debugging=0 --with-cuda=1 --with-cuda-arch=sm_60</div><div>[0]PETSC ERROR: #1 User provided function() line 0 in unknown file</div><div>------------------------------<wbr>------------------------------<wbr>--------------</div><div>MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD </div><div>with errorcode 59.</div><div><br></div><div>NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.</div><div>You may or may not see output from other processes, depending on</div><div>exactly when Open MPI kills them.</div><div>------------------------------<wbr>------------------------------<wbr>--------------</div><div>[valera@node50 alone]$ </div></div><div><br></div><div><br></div><div><br></div><div><br></div><div>Thanks,</div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Mar 14, 2018 at 1:52 PM, Matthew Knepley <span dir="ltr"><<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><span>On Thu, Mar 15, 2018 at 4:01 AM, Manuel Valera <span dir="ltr"><<a href="mailto:mvalera-w@mail.sdsu.edu" target="_blank">mvalera-w@mail.sdsu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 
0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Ok well, it turns out $PETSC_DIR points to the testpetsc directory, and it makes, installs, and tests without problems (only a problem on ex5f), but trying to reconfigure in the valera/petsc directory asks me to change the $PETSC_DIR variable,</div></blockquote><div><br></div></span><div>Yes, you must set PETSC_DIR to point to the installation you want to use.</div><div><br></div><div> Matt</div><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725h5"><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div>Meanwhile the system installation still points to the valera/petsc/cuda build.</div><div><br></div><div>Should I just delete the petsc installation folder and start over?</div><div><br></div><div>Thanks, </div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Mar 14, 2018 at 11:36 AM, Matthew Knepley <span dir="ltr"><<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><span>On Thu, Mar 15, 2018 at 3:25 AM, Manuel Valera <span dir="ltr"><<a href="mailto:mvalera-w@mail.sdsu.edu" target="_blank">mvalera-w@mail.sdsu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Yeah, that worked:<div><br></div><div><div>[valera@node50 tutorials]$ ./ex19 -dm_vec_type seqcuda -dm_mat_type seqaijcusparse</div><div>lid velocity = 0.0625, prandtl # = 1., grashof # = 1.</div><div>Number of SNES iterations = 2</div><div>[valera@node50 tutorials]$ </div></div><div><br></div><div>How do I make sure the other program refers to this installation? 
Using the same arguments there I get:</div><div><br></div><div><div>[valera@node50 alone]$ ./linsolve -vec_type seqcuda -mat_type seqaijcusparse</div><div> laplacian.petsc !</div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Unknown type. Check for miss-spelling or missing package: <a href="http://www.mcs.anl.gov/petsc/documentation/installation.html#external" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/installation.html<wbr>#external</a></div><div>[0]PETSC ERROR: Unknown vector type: seqcuda</div></div></div></blockquote><div><br></div></span><div>This PETSc has not been configured with CUDA. It is located in /home/valera/petsc. The other one you used is located in /home/valera/testpetsc. It does not make much sense to me that it does not understand CUDA, since it says the configure arguments had --with-cuda=1. There must have been a build problem. Rebuild:</div><div><br></div><div> cd $PETSC_DIR</div><div> make all</div><div><br></div><div>If you still have a problem, reconfigure:</div><div><br></div><div> cd $PETSC_DIR</div><div> ./cuda/lib/petsc/conf/reconfigure-cuda.py</div><div> make all</div><div><br></div><div>If that still fails, then something very bizarre is happening on your machine and we will have to exchange more mail.</div><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473h5"><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a GIT Date: 2018-02-28 10:19:08 -0600</div><div>[0]PETSC ERROR: ./linsolve on a cuda named node50 by valera Wed Mar 14 11:25:11 2018</div><div>[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/c<wbr>usp --with-vienacl=1 --download-fblaslapack=1 --download-hypre</div><div>[0]PETSC ERROR: #1 VecSetType() line 42 in /home/valera/petsc/src/vec/vec<wbr>/interface/vecreg.c</div><div>[0]PETSC ERROR: #2 VecSetTypeFromOptions_Private(<wbr>) line 1241 in /home/valera/petsc/src/vec/vec<wbr>/interface/vector.c</div><div>[0]PETSC ERROR: #3 VecSetFromOptions() line 
1276 in /home/valera/petsc/src/vec/vec<wbr>/interface/vector.c</div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Object is in wrong state</div><div>[0]PETSC ERROR: Vec object's type is not set: Argument # 1</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a GIT Date: 2018-02-28 10:19:08 -0600</div><div>[0]PETSC ERROR: ./linsolve on a cuda named node50 by valera Wed Mar 14 11:25:11 2018</div><div>[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/c<wbr>usp --with-vienacl=1 --download-fblaslapack=1 --download-hypre</div><div>[0]PETSC ERROR: #4 VecDuplicate() line 375 in /home/valera/petsc/src/vec/vec<wbr>/interface/vector.c</div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Object is in wrong state</div><div>[0]PETSC ERROR: Vec object's type is not set: Argument # 1</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a GIT Date: 2018-02-28 10:19:08 -0600</div><div>[0]PETSC ERROR: ./linsolve on a cuda named node50 by valera Wed Mar 14 11:25:11 2018</div><div>[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 
--FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/c<wbr>usp --with-vienacl=1 --download-fblaslapack=1 --download-hypre</div><div>[0]PETSC ERROR: #5 VecDuplicate() line 375 in /home/valera/petsc/src/vec/vec<wbr>/interface/vector.c</div><div>[0]PETSC ERROR: #6 User provided function() line 0 in User file</div><div>[valera@node50 alone]$ </div></div><div><br></div><div>I made sure there is a call to Vec/MatSetFromOptions() there; I am loading the matrix from a PETSc binary in this case.</div><div><br></div><div>Thanks, </div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Mar 14, 2018 at 11:22 AM, Matthew Knepley <span dir="ltr"><<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><span>On Thu, Mar 15, 2018 at 3:19 AM, Manuel Valera <span dir="ltr"><<a href="mailto:mvalera-w@mail.sdsu.edu" target="_blank">mvalera-w@mail.sdsu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Yes, this is the system installation that is being correctly linked (the linear solver and the model are not linking the correct installation; I don't know why yet). I configured with only CUDA this time because of the message Karl Rupp posted on my installation thread, where he says only one type of library will work at a time. Anyway, this is what I got:<div><br></div><div><div>[valera@node50 tutorials]$ ./ex19 -dm_vec_type seqcuda -dm_mat_type seqaijcusp</div><div>lid velocity = 0.0625, prandtl # = 1., grashof # = 1.</div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: 
Unknown type. Check for miss-spelling or missing package: <a href="http://www.mcs.anl.gov/petsc/documentation/installation.html#external" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/installation.html<wbr>#external</a></div><div>[0]PETSC ERROR: Unknown Mat type given: seqaijcusp</div></div></div></blockquote><div><br></div></span><div>It is telling you the problem. Use</div><div><br></div><div> -dm_mat_type seqaijcusparse</div><div><br></div><div> Matt</div><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850h5"><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-2027-g045eeab GIT Date: 2018-03-12 13:30:25 -0500</div><div>[0]PETSC ERROR: ./ex19 on a named node50 by valera Wed Mar 14 11:17:25 2018</div><div>[0]PETSC ERROR: Configure options --prefix=/usr/local/petsc.cod/<wbr>petsc-install --with-mpi-dir=/usr/lib64/open<wbr>mpi --with-blaslapack-dir=/usr/lib<wbr>64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2 FOPTFLAGS=-O2 --with-shared-libraries=0 --download-hypre --with-debugging=0 --with-cuda=1 --with-cuda-arch=sm_60</div><div>[0]PETSC ERROR: #1 MatSetType() line 61 in /home/valera/testpetsc/src/mat<wbr>/interface/matreg.c</div><div>[0]PETSC ERROR: #2 DMCreateMatrix_DA() line 693 in /home/valera/testpetsc/src/dm/<wbr>impls/da/fdda.c</div><div>[0]PETSC ERROR: #3 DMCreateMatrix() line 1199 in /home/valera/testpetsc/src/dm/<wbr>interface/dm.c</div><div>[0]PETSC ERROR: #4 SNESSetUpMatrices() line 646 in /home/valera/testpetsc/src/sne<wbr>s/interface/snes.c</div><div>[0]PETSC ERROR: #5 SNESSetUp_NEWTONLS() 
line 296 in /home/valera/testpetsc/src/sne<wbr>s/impls/ls/ls.c</div><div>[0]PETSC ERROR: #6 SNESSetUp() line 2795 in /home/valera/testpetsc/src/sne<wbr>s/interface/snes.c</div><div>[0]PETSC ERROR: #7 SNESSolve() line 4187 in /home/valera/testpetsc/src/sne<wbr>s/interface/snes.c</div><div>[0]PETSC ERROR: #8 main() line 161 in /home/valera/testpetsc/src/sne<wbr>s/examples/tutorials/ex19.c</div><div>[0]PETSC ERROR: PETSc Option Table entries:</div><div>[0]PETSC ERROR: -dm_mat_type seqaijcusp</div><div>[0]PETSC ERROR: -dm_vec_type seqcuda</div><div>[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov-------<wbr>---</div><div>------------------------------<wbr>------------------------------<wbr>--------------</div><div>MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD </div><div>with errorcode 86.</div><div><br></div><div>NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.</div><div>You may or may not see output from other processes, depending on</div><div>exactly when Open MPI kills them.</div><div>------------------------------<wbr>------------------------------<wbr>--------------</div></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Mar 14, 2018 at 11:16 AM, Matthew Knepley <span dir="ltr"><<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Thu, Mar 15, 2018 at 3:12 AM, Manuel Valera <span dir="ltr"><<a href="mailto:mvalera-w@mail.sdsu.edu" target="_blank">mvalera-w@mail.sdsu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Thanks, got this error:</div></blockquote><div><br></div><div>Did you not 
configure with CUSP? It looks like you have CUDA, so use</div><div><br></div><div> -dm_vec_type seqcuda</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451h5"><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div>[valera@node50 testpetsc]$ cd src/snes/examples/tutorials/</div><div>[valera@node50 tutorials]$ PETSC_ARCH="" make ex19</div><div>/usr/lib64/openmpi/bin/mpicc -o ex19.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -O2 -I/home/valera/testpetsc/incl<wbr>ude -I/home/valera/testpetsc/arch-<wbr>linux2-c-opt/include -I/usr/local/petsc.cod/petsc-i<wbr>nstall/include -I/usr/local/cuda/include -I/usr/lib64/openmpi/include `pwd`/ex19.c</div><div>/usr/lib64/openmpi/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -O2 -o ex19 ex19.o -L/home/valera/testpetsc/arch-<wbr>linux2-c-opt/lib -Wl,-rpath,/usr/local/<a href="http://petsc.co" target="_blank">petsc.co</a><wbr>d/petsc-install/lib -L/usr/local/petsc.cod/petsc-i<wbr>nstall/lib -Wl,-rpath,/usr/lib64 -L/usr/lib64 -Wl,-rpath,/usr/local/cuda/lib<wbr>64 -L/usr/local/cuda/lib64 -L/usr/lib64/openmpi/lib -L/usr/lib/gcc/x86_64-redhat-l<wbr>inux/4.8.5 -Wl,-rpath,/usr/lib64/openmpi/<wbr>lib -lpetsc -lHYPRE -llapack -lblas -lm -lcufft -lcublas -lcudart -lcusparse -lX11 -lstdc++ -ldl -lmpi_usempi -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lstdc++ -ldl</div><div>/usr/bin/rm -f ex19.o</div><div>[valera@node50 tutorials]$ ./ex19 -dm_vec_type seqcusp -dm_mat_type seqaijcusp</div><div>lid velocity = 0.0625, prandtl # = 1., grashof # = 
1.</div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Unknown type. Check for miss-spelling or missing package: <a href="http://www.mcs.anl.gov/petsc/documentation/installation.html#external" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/installation.html<wbr>#external</a></div><div>[0]PETSC ERROR: Unknown vector type: seqcusp</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-2027-g045eeab GIT Date: 2018-03-12 13:30:25 -0500</div><div>[0]PETSC ERROR: ./ex19 on a named node50 by valera Wed Mar 14 11:12:11 2018</div><div>[0]PETSC ERROR: Configure options --prefix=/usr/local/petsc.cod/<wbr>petsc-install --with-mpi-dir=/usr/lib64/open<wbr>mpi --with-blaslapack-dir=/usr/lib<wbr>64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2 FOPTFLAGS=-O2 --with-shared-libraries=0 --download-hypre --with-debugging=0 --with-cuda=1 --with-cuda-arch=sm_60</div><div>[0]PETSC ERROR: #1 VecSetType() line 42 in /home/valera/testpetsc/src/vec<wbr>/vec/interface/vecreg.c</div><div>[0]PETSC ERROR: #2 DMCreateGlobalVector_DA() line 39 in /home/valera/testpetsc/src/dm/<wbr>impls/da/dadist.c</div><div>[0]PETSC ERROR: #3 DMCreateGlobalVector() line 865 in /home/valera/testpetsc/src/dm/<wbr>interface/dm.c</div><div>[0]PETSC ERROR: #4 main() line 158 in /home/valera/testpetsc/src/sne<wbr>s/examples/tutorials/ex19.c</div><div>[0]PETSC ERROR: PETSc Option Table entries:</div><div>[0]PETSC ERROR: -dm_mat_type seqaijcusp</div><div>[0]PETSC ERROR: -dm_vec_type seqcusp</div><div>[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to 
petsc-maint@mcs.anl.gov-------<wbr>---</div><div>------------------------------<wbr>------------------------------<wbr>--------------</div><div>MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD </div><div>with errorcode 86.</div><div><br></div><div>NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.</div><div>You may or may not see output from other processes, depending on</div><div>exactly when Open MPI kills them.</div><div>------------------------------<wbr>------------------------------<wbr>--------------</div><div>[valera@node50 tutorials]$ </div></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Mar 14, 2018 at 11:10 AM, Matthew Knepley <span dir="ltr"><<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><span>On Thu, Mar 15, 2018 at 2:46 AM, Manuel Valera <span dir="ltr"><<a href="mailto:mvalera-w@mail.sdsu.edu" target="_blank">mvalera-w@mail.sdsu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">OK, let's try that. If I go to /home/valera/testpetsc/arch-linux2-c-opt/tests/src/snes/examples/tutorials there is runex19.sh and a lot of other ex19 variants, but if I run that I get:</div></blockquote><div><br></div></span><div><div>knepley/feature-plex-functiona<wbr>ls *$:/PETSc3/petsc/petsc-dev$ pushd src/snes/examples/tutorials/</div><div>knepley/feature-plex-functiona<wbr>ls *$:/PETSc3/petsc/petsc-dev/src<wbr>/snes/examples/tutorials$ PETSC_ARCH=arch-master-debug make ex19<br></div><div>knepley/feature-plex-functiona<wbr>ls *$:/PETSc3/petsc/petsc-dev/src<wbr>/snes/examples/tutorials$ ./ex19 -dm_vec_type seqcusp -dm_mat_type 
seqaijcusp<br></div></div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193h5"><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div>[valera@node50 tutorials]$ ./runex19.sh</div><div>not ok snes_tutorials-ex19_1</div><div>#<span style="white-space:pre-wrap"> </span>------------------------------<wbr>------------------------------<wbr>--------------</div><div>#<span style="white-space:pre-wrap"> </span>mpiexec was unable to launch the specified application as it could not access</div><div>#<span style="white-space:pre-wrap"> </span>or execute an executable:</div><div>#<span style="white-space:pre-wrap"> </span></div><div>#<span style="white-space:pre-wrap"> </span>Executable: ../ex19</div><div>#<span style="white-space:pre-wrap"> </span>Node: node50</div><div>#<span style="white-space:pre-wrap"> </span></div><div>#<span style="white-space:pre-wrap"> </span>while attempting to start process rank 0.</div><div>#<span style="white-space:pre-wrap"> </span>------------------------------<wbr>------------------------------<wbr>--------------</div><div>#<span style="white-space:pre-wrap"> </span>2 total processes failed to start</div><div>ok snes_tutorials-ex19_1 # SKIP Command failed so no diff</div></div><div><br></div><div>is this the one i should be running ?</div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Mar 14, 2018 at 10:39 AM, Matthew Knepley <span dir="ltr"><<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid 
rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><span>On Thu, Mar 15, 2018 at 2:27 AM, Manuel Valera <span dir="ltr"><<a href="mailto:mvalera-w@mail.sdsu.edu" target="_blank">mvalera-w@mail.sdsu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">OK, thanks Matt. I made a smaller case with only the linear solver and a 25x25 matrix; the error I get in this case is:</div></blockquote><div><br></div></span><div>Ah, it appears that not all parts of your problem are taking the type options. If you want the</div><div>linear algebra objects to change type, you need to have</div><div><br></div><div> VecSetFromOptions() and MatSetFromOptions()</div><div><br></div><div>called after you create them, but before sizes are set and data is entered. However, it should</div><div>not be possible to have a seq Vec with the seqcusp AXPY routine set. 
Something else is wrong...</div><div>Did you try a PETSc example, such as SNES ex19, with this?</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-m_6014072403565971686h5"><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div>[valera@node50 alone]$ mpirun -n 1 ./linsolve -vec_type cusp -mat_type aijcusparse</div><div> laplacian.petsc !</div><div> TrivSoln loaded, size: 125 / 125</div><div> RHS loaded, size: 125 / 125</div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Null argument, when expecting valid pointer</div><div>[0]PETSC ERROR: Null Pointer: Parameter # 4</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a GIT Date: 2018-02-28 10:19:08 -0600</div><div>[0]PETSC ERROR: ./linsolve on a cuda named node50 by valera Wed Mar 14 10:24:35 2018</div><div>[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/c<wbr>usp --with-vienacl=1 --download-fblaslapack=1 --download-hypre</div><div>[0]PETSC ERROR: #1 VecSetValues() line 851 in /home/valera/petsc/src/vec/vec<wbr>/interface/rvector.c</div><div>[0]PETSC ERROR: 
--------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Invalid argument</div><div>[0]PETSC ERROR: Object (seq) is not seqcusp or mpicusp</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a GIT Date: 2018-02-28 10:19:08 -0600</div><div>[0]PETSC ERROR: ./linsolve on a cuda named node50 by valera Wed Mar 14 10:24:35 2018</div><div>[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/c<wbr>usp --with-vienacl=1 --download-fblaslapack=1 --download-hypre</div><div>[0]PETSC ERROR: #2 VecCUSPGetArrayRead() line 1792 in /home/valera/petsc/src/vec/vec<wbr>/impls/seq/seqcusp/<a href="http://veccusp2.cu" target="_blank">veccusp2.cu</a></div><div>[0]PETSC ERROR: #3 VecAXPY_SeqCUSP() line 314 in /home/valera/petsc/src/vec/vec<wbr>/impls/seq/seqcusp/<a href="http://veccusp2.cu" target="_blank">veccusp2.cu</a></div><div>[0]PETSC ERROR: #4 VecAXPY() line 612 in /home/valera/petsc/src/vec/vec<wbr>/interface/rvector.c</div><div>[0]PETSC ERROR: #5 KSPSolve_GCR_cycle() line 60 in /home/valera/petsc/src/ksp/ksp<wbr>/impls/gcr/gcr.c</div><div>[0]PETSC ERROR: #6 KSPSolve_GCR() line 114 in /home/valera/petsc/src/ksp/ksp<wbr>/impls/gcr/gcr.c</div><div>[0]PETSC ERROR: #7 KSPSolve() line 669 in /home/valera/petsc/src/ksp/ksp<wbr>/interface/itfunc.c</div><div> soln maxval: 0.0000000000000000 </div><div> soln minval: 0.0000000000000000 </div><div> Norm: 11.180339887498949 </div><div> Its: 0</div><div>WARNING! There are options you set that were not used!</div><div>WARNING! 
could be spelling mistake, etc!</div><div>Option left: name:-mat_type value: aijcusparse</div><div>[valera@node50 alone]$ </div></div><div><br></div><div><br></div><div>I also see the configure options are not correct, so I guess it is still linking a different PETSc installation. Maybe we can try to make it work as it is; I will let you know if I am able to link the correct PETSc installation here.</div><div><br></div><div>Best,</div><div><br></div><div><br></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Sun, Mar 11, 2018 at 9:00 AM, Matthew Knepley <span dir="ltr"><<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><span>On Fri, Mar 9, 2018 at 3:05 AM, Manuel Valera <span dir="ltr"><<a href="mailto:mvalera-w@mail.sdsu.edu" target="_blank">mvalera-w@mail.sdsu.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Hello all,<div><br></div><div>I am working on porting a linear solver to GPUs for timing purposes. So far I've been able to compile and run the CUSP libraries and compile PETSc to be used with CUSP and ViennaCL. After the initial runs I noticed some errors; they are different for different flags, and I would appreciate any help interpreting them.</div><div><br></div><div>The only elements in this program that use PETSc are the Laplacian matrix (sparse), the RHS and X vectors, and a scatter PETSc object, so I would say it's safe to pass the command-line arguments for the Mat/VecSetType()s instead of changing the source code.</div><div><br></div><div>If I use <b>-vec_type cuda -mat_type aijcusparse</b> or <b>-vec_type viennacl -mat_type aijviennacl </b>I get the 
following:</div></div></blockquote><div><br></div></span><div>These systems do not properly propagate errors. My only advice is to run a smaller problem and see.</div><span><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div>[0]PETSC ERROR: ------------------------------<wbr>------------------------------<wbr>------------</div><div>[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range</div><div>[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger</div><div>[0]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html#valgrind</a></div><div>[0]PETSC ERROR: or try <a href="http://valgrind.org" target="_blank">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors</div><div>[0]PETSC ERROR: likely location of problem given in stack below</div><div>[0]PETSC ERROR: --------------------- Stack Frames ------------------------------<wbr>------</div><div>[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,</div><div>[0]PETSC ERROR: INSTEAD the line number of the start of the function</div><div>[0]PETSC ERROR: is given.</div><div>[0]PETSC ERROR: [0] VecSetValues line 847 /home/valera/petsc/src/vec/vec<wbr>/interface/rvector.c</div><div>[0]PETSC ERROR: [0] VecSetType line 36 /home/valera/petsc/src/vec/vec<wbr>/interface/vecreg.c</div><div>[0]PETSC ERROR: [0] VecSetTypeFromOptions_Private line 1230 /home/valera/petsc/src/vec/vec<wbr>/interface/vector.c</div><div>[0]PETSC ERROR: [0] VecSetFromOptions line 1271 /home/valera/petsc/src/vec/vec<wbr>/interface/vector.c</div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Signal 
received</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a GIT Date: 2018-02-28 10:19:08 -0600</div><div>[0]PETSC ERROR: ./gcmSeamount on a cuda named node50 by valera Thu Mar 8 09:50:51 2018</div><div>[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/c<wbr>usp --with-vienacl=1 --download-fblaslapack=1 --download-hypre</div><div>[0]PETSC ERROR: #5 User provided function() line 0 in unknown file</div><div>------------------------------<wbr>------------------------------<wbr>--------------</div></div><div><br></div><div>This seems to be a memory access out of range; maybe my vector is too big for my CUDA system? How do I assess that?</div><div><br></div><div><br></div><div>Next, if I use <b>-vec_type cusp -mat_type aijcusparse </b>I get something different and more interesting:</div></div></blockquote><div><br></div></span><div>We need to see the entire error message, since it has the stack.</div><div><br></div><div>This seems like a logic error, but could definitely be on our end. 
Here is how I think about these:</div><div><br></div><div> 1) We have nightly test solves, so at least some solver configuration works</div><div><br></div><div> 2) Some vector which is marked read-only (happens for input to solvers), but someone is trying to update it.</div><div> The stack will tell me where this is happening.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-m_6014072403565971686m_-8817072490006586813m_-3705405426505396343h5"><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Object is in wrong state</div><div>[0]PETSC ERROR: Vec is locked read only, argument # 3</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a GIT Date: 2018-02-28 10:19:08 -0600</div><div>[0]PETSC ERROR: ./gcmSeamount on a cuda named node50 by valera Thu Mar 8 10:02:19 2018</div><div>[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/c<wbr>usp --with-vienacl=1 --download-fblaslapack=1 --download-hypre</div><div>[0]PETSC ERROR: #48 KSPSolve() line 615 in 
/home/valera/petsc/src/ksp/ksp<wbr>/interface/itfunc.c</div><div> PETSC_SOLVER_ONLY 6.8672990892082453E-005 s</div><div>[0]PETSC ERROR: --------------------- Error Message ------------------------------<wbr>------------------------------<wbr>--</div><div>[0]PETSC ERROR: Invalid argument</div><div>[0]PETSC ERROR: Object (seq) is not seqcusp or mpicusp</div><div>[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">http://www.mcs.anl.gov/petsc/d<wbr>ocumentation/faq.html</a> for trouble shooting.</div><div>[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a GIT Date: 2018-02-28 10:19:08 -0600</div><div>[0]PETSC ERROR: ./gcmSeamount on a cuda named node50 by valera Thu Mar 8 10:02:19 2018</div><div>[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/c<wbr>usp --with-vienacl=1 --download-fblaslapack=1 --download-hypre</div><div>[0]PETSC ERROR: #49 VecCUSPGetArrayReadWrite() line 1718 in /home/valera/petsc/src/vec/vec<wbr>/impls/seq/seqcusp/<a href="http://veccusp2.cu" target="_blank">veccusp2.cu</a></div><div>[0]PETSC ERROR: #50 VecScatterCUSP_StoS() line 269 in /home/valera/petsc/src/vec/vec<wbr>/impls/seq/seqcusp/<a href="http://vecscattercusp.cu" target="_blank">vecscatterc<wbr>usp.cu</a></div></div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div><div>And it yields a "solution" to the system and also a log at the end:</div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div><div><div>./gcmSeamount on a cuda named node50 with 1 processor, by valera Thu Mar 8 10:02:24 2018</div><div>Using Petsc Development GIT revision: v3.8.3-1817-g96b6f8a GIT Date: 2018-02-28 10:19:08 -0600</div><div><br></div><div> Max Max/Min Avg Total </div><div>Time 
(sec): 4.573e+00 1.00000 4.573e+00</div><div>Objects: 8.100e+01 1.00000 8.100e+01</div><div>Flop: 3.492e+07 1.00000 3.492e+07 3.492e+07</div><div>Flop/sec: 7.637e+06 1.00000 7.637e+06 7.637e+06</div><div>Memory: 2.157e+08 1.00000 2.157e+08</div><div>MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00</div><div>MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00</div><div>MPI Reductions: 0.000e+00 0.00000</div><div><br></div><div>Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)</div><div> e.g., VecAXPY() for real vectors of length N --> 2N flop</div><div> and VecAXPY() for complex vectors of length N --> 8N flop</div><div><br></div><div>Summary of Stages: ----- Time ------ ----- Flop ----- --- Messages --- -- Message Lengths -- -- Reductions --</div><div> Avg %Total Avg %Total counts %Total Avg %Total counts %Total </div><div> 0: Main Stage: 4.5729e+00 100.0% 3.4924e+07 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% </div><div><br></div><div>------------------------------------------------------------------------------------------------------------------------</div><div>See the 'Profiling' chapter of the users' manual for details on interpreting output.</div><div>Phase summary info:</div><div> Count: number of times phase was executed</div><div> Time and Flop: Max - maximum over all processors</div><div> Ratio - ratio of maximum to minimum over all processors</div><div> Mess: number of messages sent</div><div> Avg. len: average message length (bytes)</div><div> Reduct: number of global reductions</div><div> Global: entire computation</div><div> Stage: stages of a computation. 
Set stages with PetscLogStagePush() and PetscLogStagePop().</div><div> %T - percent time in this phase %F - percent flop in this phase</div><div> %M - percent messages in this phase %L - percent message lengths in this phase</div><div> %R - percent reductions in this phase</div><div> Total Mflop/s: 10e-6 * (sum of flop over all processors)/(max time over all processors)</div><div>------------------------------------------------------------------------------------------------------------------------</div><div><br></div><div><br></div><div> ##########################################################</div><div> # #</div><div> # WARNING!!! #</div><div> # #</div><div> # This code was compiled with a debugging option, #</div><div> # To get timing results run ./configure #</div><div> # using --with-debugging=no, the performance will #</div><div> # be generally two or three times faster. #</div><div> # #</div><div> ##########################################################</div><div><br></div><div><br></div><div>Event Count Time (sec) Flop --- Global --- --- Stage --- Total</div><div> Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s</div><div>------------------------------------------------------------------------------------------------------------------------</div><div><br></div><div>--- Event Stage 0: Main Stage</div><div><br></div><div>MatLUFactorNum 1 1.0 4.9502e-02 1.0 3.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1100 0 0 0 1100 0 0 0 706</div><div>MatILUFactorSym 1 1.0 1.9642e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatAssemblyBegin 2 1.0 6.9141e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatAssemblyEnd 2 1.0 2.6612e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 6 0 0 0 0 6 0 0 0 0 0</div><div>MatGetRowIJ 1 1.0 5.0068e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatGetOrdering 1 1.0 1.7186e-02 1.0 
0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>MatLoad 1 1.0 1.1575e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 3 0 0 0 0 3 0 0 0 0 0</div><div>MatView 1 1.0 8.0877e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 2 0 0 0 0 0</div><div>MatCUSPCopyTo 1 1.0 2.4664e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0</div><div>VecSet 68 1.0 5.1665e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0</div><div>VecAssemblyBegin 17 1.0 5.2691e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>VecAssemblyEnd 17 1.0 4.3631e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>VecScatterBegin 15 1.0 1.5345e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>VecCUSPCopyFrom 1 1.0 1.1199e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0</div><div>KSPSetUp 1 1.0 5.1929e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0</div><div>PCSetUp 2 1.0 8.6590e-02 1.0 3.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2100 0 0 0 2100 0 0 0 403</div><div>------------------------------------------------------------------------------------------------------------------------</div><div><br></div><div>Memory usage is given in bytes:</div><div><br></div><div>Object Type Creations Destructions Memory Descendants' Mem.</div><div>Reports information only for process 0.</div><div><br></div><div>--- Event Stage 0: Main Stage</div><div><br></div><div> Matrix 3 1 52856972 0.</div><div> Matrix Null Space 1 1 608 0.</div><div> Vector 66 3 3414600 0.</div><div> Vector Scatter 1 1 680 0.</div><div> Viewer 3 2 1680 0.</div><div> Krylov Solver 1 0 0 0.</div><div> Preconditioner 2 1 864 0.</div><div> Index Set 4 1 800 0.</div><div>========================================================================================================================</div><div>Average time to get PetscTime(): 9.53674e-08</div><div>#PETSc Option 
Table entries:</div><div>-ksp_view</div><div>-log_view</div><div>-mat_type aijcusparse</div><div>-matload_block_size 1</div><div>-vec_type cusp</div><div>#End of PETSc Option Table entries</div><div>Compiled without FORTRAN kernels</div><div>Compiled with full precision matrices (default)</div><div>sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4</div><div>Configure options: PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/cusp --with-vienacl=1 --download-fblaslapack=1 --download-hypre</div><div>-----------------------------------------</div><div>Libraries compiled on Mon Mar 5 16:37:18 2018 on node50 </div><div>Machine characteristics: Linux-3.10.0-693.17.1.el7.x86_64-x86_64-with-centos-7.2.1511-Core</div><div>Using PETSc directory: /home/valera/petsc</div><div>Using PETSc arch: cuda</div><div>-----------------------------------------</div><div><br></div><div>Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector -fvisibility=hidden -O3 </div><div>Using Fortran compiler: mpifort -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -O3 </div><div>-----------------------------------------</div><div><br></div><div>Using include paths: -I/home/valera/petsc/cuda/include -I/home/valera/petsc/include -I/home/valera/petsc/include -I/home/valera/petsc/cuda/include -I/home/valera/cusp/ -I/usr/local/cuda/include</div><div>-----------------------------------------</div><div><br></div><div>Using C linker: mpicc</div><div>Using Fortran linker: mpifort</div><div>Using libraries: -Wl,-rpath,/home/valera/petsc/cuda/lib -L/home/valera/petsc/cuda/lib -lpetsc -Wl,-rpath,/home/valera/petsc/cuda/lib 
-L/home/valera/petsc/cuda/lib -Wl,-rpath,/usr/local/cuda/lib64 -L/usr/local/cuda/lib64 -Wl,-rpath,/usr/lib64/openmpi/lib -L/usr/lib64/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -lHYPRE -lflapack -lfblas -lm -lcufft -lcublas -lcudart -lcusparse -lX11 -lstdc++ -ldl -lmpi_usempi -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lstdc++ -ldl</div><div>-----------------------------------------</div></div><div><br></div><div>Thanks for your help,</div><div><br></div><div>Manuel </div><div><br></div></div>
</blockquote></div></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-m_6014072403565971686m_-8817072490006586813m_-3705405426505396343HOEnZb"><font color="#888888"><br><br clear="all"><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-HOEnZb"><font color="#888888"><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-m_6014072403565971686m_-8817072490006586813HOEnZb"><font color="#888888"><div><br></div>-- <br><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-m_6014072403565971686m_-8817072490006586813m_-3705405426505396343m_6575046406500791398gmail_signature"><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.caam.rice.edu/~mk51/" target="_blank">https://www.cse.buffalo.edu/~k<wbr>nepley/</a><br></div></div></div></div></div>
</font></span></font></span></font></span></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-HOEnZb"><font color="#888888">
</font></span></blockquote></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-HOEnZb"><font color="#888888"><br></font></span></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-HOEnZb"><font color="#888888">
</font></span></blockquote></div></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-HOEnZb"><font color="#888888"><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193m_6827561345868940943gmail-m_6014072403565971686h5"><br><br clear="all"><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496HOEnZb"><font color="#888888">
</font></span></div></div></font></span></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496HOEnZb"><font color="#888888">
</font></span></blockquote></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496HOEnZb"><font color="#888888"><br></font></span></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496HOEnZb"><font color="#888888">
</font></span></blockquote></div></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496HOEnZb"><font color="#888888"><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451m_-7369477022532066496m_9175290598307314193h5"><br><br clear="all"><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-HOEnZb"><font color="#888888">
</font></span></div></div></font></span></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-HOEnZb"><font color="#888888">
</font></span></blockquote></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-HOEnZb"><font color="#888888"><br></font></span></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-HOEnZb"><font color="#888888">
</font></span></blockquote></div></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-HOEnZb"><font color="#888888"><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850m_-3628341682252417953gmail-m_4474698985491234451h5"><br><br clear="all"><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287HOEnZb"><font color="#888888">
</font></span></div></div></font></span></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287HOEnZb"><font color="#888888">
</font></span></blockquote></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287HOEnZb"><font color="#888888"><br></font></span></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287HOEnZb"><font color="#888888">
</font></span></blockquote></div></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287HOEnZb"><font color="#888888"><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473m_-8358939555169834287m_-2879601938011470850h5"><br><br clear="all"><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415HOEnZb"><font color="#888888">
</font></span></div></div></font></span></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415HOEnZb"><font color="#888888">
</font></span></blockquote></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415HOEnZb"><font color="#888888"><br></font></span></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415HOEnZb"><font color="#888888">
</font></span></blockquote></div></div></div><span class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415HOEnZb"><font color="#888888"><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725m_-911696043303800415m_3773707719788978473h5"><br><br clear="all"><span class="m_-7147497785134130018m_1260164640617907927HOEnZb"><font color="#888888">
</font></span></div></div></font></span></div></div><span class="m_-7147497785134130018m_1260164640617907927HOEnZb"><font color="#888888">
</font></span></blockquote></div><span class="m_-7147497785134130018m_1260164640617907927HOEnZb"><font color="#888888"><br></font></span></div><span class="m_-7147497785134130018m_1260164640617907927HOEnZb"><font color="#888888">
</font></span></blockquote></div></div></div><span class="m_-7147497785134130018m_1260164640617907927HOEnZb"><font color="#888888"><div><div class="m_-7147497785134130018m_1260164640617907927m_-56753979551532725h5"><br><br clear="all"><span class="HOEnZb"><font color="#888888">
</font></span></div></div></font></span></div></div><span class="HOEnZb"><font color="#888888">
</font></span></blockquote></div><span class="HOEnZb"><font color="#888888"><br></font></span></div><span class="HOEnZb"><font color="#888888">
</font></span></blockquote></div></div></div><span class="HOEnZb"><font color="#888888"><div><div class="m_-7147497785134130018h5"><br><br clear="all">
</div></div></font></span></div></div>
</blockquote></div><br></div>
</blockquote></div><br><br clear="all"><div><br></div>-- <br><div class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.caam.rice.edu/~mk51/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div>
</div></div>