Dear Barry,

I just found your discussion about this problem. Are there plans to add
support for ghosted vectors on GPUs soon? Thanks a lot. (A minimal sketch
of the failing pattern is appended below the quoted thread.)

Best,
Yujie

On Wed, Feb 8, 2012 at 2:58 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
   The "ghosted" forms of vectors are not currently supported with GPUs.

   Barry

On Feb 8, 2012, at 2:53 PM, recrusader wrote:

> Dear PETSc developers,
>
> I have an FEM code. It works very well in CPU computation.
> Now, when I add '-vec_type seqcusp -mat_type aijcusp' while running the
> code, I get the following errors. My question is whether there is an
> example demonstrating how to revise the code for the conversion from
> CPU to GPU.
>
> In addition, are 'seqcusp' and 'seqaijcusp' used when the Vec and Mat
> are stored on one GPU card?
>
> Thank you very much.
>
> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Invalid argument!
> [0]PETSC ERROR: Vector type seqcusp does not have local representation!
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 3, Fri Sep 30
> 10:28:33 CDT 2011
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR:
> /work/01820/ylu/libmesh_svn01232012/examples/myproj_sp1/myproj-opt on
> a westmere- named c300-205.ls4.tacc.utexas.edu by ylu Wed Feb  8
> 14:34:46 2012
> [0]PETSC ERROR: Libraries linked from
> /opt/apps/intel11_1/mvapich2_1_6/petsc/3.2/westmere-cuda/lib
> [0]PETSC ERROR: Configure run at Fri Dec 16 11:27:43 2011
> [0]PETSC ERROR: Configure options --with-x=0 -with-pic
> --with-external-packages-dir=/var/tmp/petsc-3.2-buildroot//opt/apps/intel11_1/mvapich2_1_6/petsc/3.2/externalpackages-cuda
> --with-mpi-compilers=1 --with-mpi-dir=/opt/apps/intel11_1/mvapich2/1.6
> --with-scalar-type=real --with-dynamic-loading=0
> --with-shared-libraries=0 --with-chaco=1
> --download-chaco=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/Chaco-2.2.tar.gz
> --with-spai=1 --download-spai=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/spai_3.0-mar-06.tar.gz
> --with-hypre=1 --download-hypre=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/hypre-2.6.0b.tar.gz
> --with-mumps=1 --download-mumps=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/MUMPS_4.9.2.tar.gz
> --with-scalapack=1
> --download-scalapack=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/scalapack.tgz
> --with-blacs=1 --download-blacs=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/blacs-dev.tar.gz
> --with-spooles=1
> --download-spooles=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/spooles-2.2-dec-2008.tar.gz
> --with-superlu=1
> --download-superlu=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/SuperLU_4.1-December_20_2010.tar.gz
> --with-superlu_dist=1
> --download-superlu_dist=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/SuperLU_DIST_2.5-December_21_2010.tar.gz
> --with-parmetis=1
> --download-parmetis=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/ParMetis-dev-p3.tar.gz
> --with-debugging=no
> --with-blas-lapack-dir=/opt/apps/intel/11.1/mkl/lib/em64t
> --with-mpiexec=mpirun_rsh --with-cuda=1
> --with-cuda-dir=/opt/apps/cuda/4.0/cuda/
> --with-cusp-dir=/opt/apps/cuda/4.0/cuda/
> --with-thrust-dir=/opt/apps/cuda/4.0/cuda/ --COPTFLAGS=-xW
> --CXXOPTFLAGS=-xW --FOPTFLAGS=-xW
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: VecGhostGetLocalForm() line 82 in
> src/vec/vec/impls/mpi/commonmpvec.c
> [0]PETSC ERROR: zero() line 974 in
> "unknowndirectory/"/work/01820/ylu/libmesh_svn01232012/include/numerics/petsc_vector.h
> application called MPI_Abort(comm=0x84000002, 62) - process 0
>
> Best,
> Yujie
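
P.S. For the archives: the pattern that trips this is the standard
ghosted-vector usage. Below is a minimal sketch (untested; it assumes the
PETSc 3.2 calling sequence, and the sizes and ghost indices are made up
for illustration, intended for 2 MPI ranks). The trace above suggests that
libMesh's zero() performs essentially the zero-the-local-form step shown
here, and that once the vector is switched to a cusp type,
VecGhostGetLocalForm() fails because those types carry no ghosted local
representation.

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            v, vlocal;
  PetscInt       nlocal = 4, ghosts[1];
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  /* each rank ghosts the first entry owned by the other rank (run with 2 ranks) */
  ghosts[0] = (rank == 0) ? nlocal : 0;

  ierr = VecCreateGhost(PETSC_COMM_WORLD, nlocal, PETSC_DECIDE, 1, ghosts, &v);CHKERRQ(ierr);

  /* This is the call that aborts in the trace above once the vector type
     is switched to seqcusp/mpicusp: those types have no local form. */
  ierr = VecGhostGetLocalForm(v, &vlocal);CHKERRQ(ierr);
  ierr = VecSet(vlocal, 0.0);CHKERRQ(ierr);   /* zeros owned and ghost entries */
  ierr = VecGhostRestoreLocalForm(v, &vlocal);CHKERRQ(ierr);

  ierr = VecDestroy(&v);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}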