Eugene,<div><br></div><div>Based on</div><div><br></div><div>Configure options --prefix=/home/kukushkinav<br>--with-blas-lapack-dir=/opt/intel/composerxe-2011.0.084/mkl<br>--with-mpi-dir=/opt/intel/impi/4.0.1.007/intel64/bin --with-cuda=1<br>
--with-cusp=1 --with-thrust=1<br>--with-thrust-dir=/home/kukushkinav/include<br>--with-cusp-dir=/home/kukushkinav/include</div><div><br></div><div>It looks like you may not be setting the configure flag --with-cuda-arch=XXX. Without it, PETSc falls back to the nvcc default, which was sm_10 the last time I checked. The problem is that sm_10 does not support double precision, which could explain the NaNs you are seeing. For example, I use --with-cuda-arch=sm_13, which corresponds to NVIDIA compute capability 1.3.</div>
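<div><br></div><div>For reference, the configure invocation you quoted with the architecture flag appended might look like the sketch below (sm_13 is an assumption — pick the value that matches your GPUs' actual compute capability):<br><br>./configure --prefix=/home/kukushkinav \<br>&nbsp;&nbsp;--with-blas-lapack-dir=/opt/intel/composerxe-2011.0.084/mkl \<br>&nbsp;&nbsp;--with-mpi-dir=/opt/intel/impi/4.0.1.007/intel64/bin \<br>&nbsp;&nbsp;--with-cuda=1 --with-cusp=1 --with-thrust=1 \<br>&nbsp;&nbsp;--with-thrust-dir=/home/kukushkinav/include \<br>&nbsp;&nbsp;--with-cusp-dir=/home/kukushkinav/include \<br>&nbsp;&nbsp;--with-cuda-arch=sm_13</div>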
<div><br></div><div>Cheers,</div><div><br></div><div>Victor</div><div>---<br>Victor L. Minden<br><br>Tufts University<br>School of Engineering<br>Class of 2012<br>
<br><br><div class="gmail_quote">On Fri, Apr 22, 2011 at 12:22 PM, Евгений Козлов <span dir="ltr"><<a href="mailto:neoveneficus@gmail.com">neoveneficus@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Hello,<br>
<br>
I am interested in using PETSc for iteratively solving sparse linear<br>
systems on multi-GPU systems.<br>
<br>
First of all, I compiled PETSc-dev and tried to run some examples<br>
that run on the GPU.<br>
<br>
I found src/snes/examples/tutorials/<a href="http://ex47cu.cu" target="_blank">ex47cu.cu</a>. It compiled and ran.<br>
<br>
Output of the original program src/snes/examples/tutorials/<a href="http://ex47cu.cu" target="_blank">ex47cu.cu</a>:<br>
<br>
[0]PETSC ERROR: --------------------- Error Message<br>
------------------------------------<br>
[0]PETSC ERROR: Floating point exception!<br>
[0]PETSC ERROR: User provided compute function generated a Not-a-Number!<br>
[0]PETSC ERROR:<br>
------------------------------------------------------------------------<br>
[0]PETSC ERROR: Petsc Development HG revision:<br>
d3e10315d68b1dd5481adb2889c7d354880da362 HG Date: Wed Apr 20 21:03:56<br>
2011 -0500<br>
[0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
[0]PETSC ERROR: See docs/index.html for manual pages.<br>
[0]PETSC ERROR:<br>
------------------------------------------------------------------------<br>
[0]PETSC ERROR: ex47cu on a arch-linu named cn03 by kukushkinav Fri<br>
Apr 22 18:33:32 2011<br>
[0]PETSC ERROR: Libraries linked from /home/kukushkinav/lib<br>
[0]PETSC ERROR: Configure run at Thu Apr 21 19:18:22 2011<br>
[0]PETSC ERROR: Configure options --prefix=/home/kukushkinav<br>
--with-blas-lapack-dir=/opt/intel/composerxe-2011.0.084/mkl<br>
--with-mpi-dir=/opt/intel/impi/<a href="http://4.0.1.007/intel64/bin" target="_blank">4.0.1.007/intel64/bin</a> --with-cuda=1<br>
--with-cusp=1 --with-thrust=1<br>
--with-thrust-dir=/home/kukushkinav/include<br>
--with-cusp-dir=/home/kukushkinav/include<br>
[0]PETSC ERROR:<br>
------------------------------------------------------------------------<br>
[0]PETSC ERROR: SNESSolve_LS() line 167 in src/snes/impls/ls/ls.c<br>
[0]PETSC ERROR: SNESSolve() line 2407 in src/snes/interface/snes.c<br>
[0]PETSC ERROR: main() line 38 in src/snes/examples/tutorials/<a href="http://ex47cu.cu" target="_blank">ex47cu.cu</a><br>
application called MPI_Abort(MPI_COMM_WORLD, 72) - process 0<br>
<br>
Then, to find the problematic place in the source, I changed<br>
the function in struct ApplyStencil to<br>
<br>
void operator()(Tuple t) { thrust::get<0>(t) = 1; }<br>
<br>
Result:<br>
[0]PETSC ERROR: --------------------- Error Message<br>
------------------------------------<br>
[0]PETSC ERROR: Floating point exception!<br>
[0]PETSC ERROR: Infinite or not-a-number generated in mdot, entry 0!<br>
[0]PETSC ERROR:<br>
------------------------------------------------------------------------<br>
[0]PETSC ERROR: Petsc Development HG revision:<br>
d3e10315d68b1dd5481adb2889c7d354880da362 HG Date: Wed Apr 20 21:03:56<br>
2011 -0500<br>
[0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
[0]PETSC ERROR: See docs/index.html for manual pages.<br>
[0]PETSC ERROR:<br>
------------------------------------------------------------------------<br>
[0]PETSC ERROR: ex47cu on a arch-linu named cn11 by kukushkinav Fri<br>
Apr 22 18:58:04 2011<br>
[0]PETSC ERROR: Libraries linked from /home/kukushkinav/lib<br>
[0]PETSC ERROR: Configure run at Thu Apr 21 19:18:22 2011<br>
[0]PETSC ERROR: Configure options --prefix=/home/kukushkinav<br>
--with-blas-lapack-dir=/opt/intel/composerxe-2011.0.084/mkl<br>
--with-mpi-dir=/opt/intel/impi/<a href="http://4.0.1.007/intel64/bin" target="_blank">4.0.1.007/intel64/bin</a> --with-cuda=1<br>
--with-cusp=1 --with-thrust=1<br>
--with-thrust-dir=/home/kukushkinav/include<br>
--with-cusp-dir=/home/kukushkinav/include<br>
[0]PETSC ERROR:<br>
------------------------------------------------------------------------<br>
[0]PETSC ERROR: VecMDot() line 1146 in src/vec/vec/interface/rvector.c<br>
[0]PETSC ERROR: KSPGMRESClassicalGramSchmidtOrthogonalization() line<br>
66 in src/ksp/ksp/impls/gmres/borthog2.c<br>
[0]PETSC ERROR: GMREScycle() line 161 in src/ksp/ksp/impls/gmres/gmres.c<br>
[0]PETSC ERROR: KSPSolve_GMRES() line 244 in src/ksp/ksp/impls/gmres/gmres.c<br>
[0]PETSC ERROR: KSPSolve() line 426 in src/ksp/ksp/interface/itfunc.c<br>
[0]PETSC ERROR: SNES_KSPSolve() line 3107 in src/snes/interface/snes.c<br>
[0]PETSC ERROR: SNESSolve_LS() line 190 in src/snes/impls/ls/ls.c<br>
[0]PETSC ERROR: SNESSolve() line 2407 in src/snes/interface/snes.c<br>
[0]PETSC ERROR: main() line 38 in src/snes/examples/tutorials/<a href="http://ex47cu.cu" target="_blank">ex47cu.cu</a><br>
<br>
RedHat 5.5, Cuda 3.2<br>
<br>
Question 1: Is this my problem or a bug in the algorithm?<br>
<br>
Question 2: Where can I find a simple document or example that describes how<br>
to solve sparse linear systems on multi-GPU systems using PETSc?<br>
<font color="#888888"><br>
<br>
--<br>
Best regards,<br>
Eugene<br>
</font></blockquote></div><br></div>