[petsc-users] code changes from CPU to GPU

recrusader recrusader at gmail.com
Wed Feb 8 20:56:34 CST 2012


Dear Barry,

I just found your discussion about this problem.
Do you plan to add the relevant functions for it soon?
Thanks a lot.

Best,
Yujie

On Wed, Feb 8, 2012 at 2:58 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>   The "ghosted" form of vectors is not currently supported with GPUs.
>
>   Barry
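>
>   For reference, the failing pattern reduces to something like the
> following (a minimal sketch, not code from your application; the sizes
> are made up). VecGhostGetLocalForm() is the call that aborts, because
> the CUSP vector types carry no separate local representation:
>
>     #include <petscvec.h>
>
>     int main(int argc, char **argv)
>     {
>       Vec            v, vlocal;
>       PetscErrorCode ierr;
>
>       ierr = PetscInitialize(&argc, &argv, 0, 0);CHKERRQ(ierr);
>       ierr = VecCreate(PETSC_COMM_SELF, &v);CHKERRQ(ierr);
>       ierr = VecSetSizes(v, 10, PETSC_DECIDE);CHKERRQ(ierr);
>       /* the GPU vector type selected by -vec_type seqcusp */
>       ierr = VecSetType(v, "seqcusp");CHKERRQ(ierr);
>       /* aborts with "Vector type seqcusp does not have local
>          representation!"; for a ghosted vector created with
>          VecCreateGhost() this would return the local form */
>       ierr = VecGhostGetLocalForm(v, &vlocal);CHKERRQ(ierr);
>       ierr = VecGhostRestoreLocalForm(v, &vlocal);CHKERRQ(ierr);
>       ierr = VecDestroy(&v);CHKERRQ(ierr);
>       ierr = PetscFinalize();
>       return 0;
>     }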
>
> On Feb 8, 2012, at 2:53 PM, recrusader wrote:
>
> > Dear PETSc developers,
> >
> > I have an FEM code. It works very well for CPU computation.
> > When I add '-vec_type seqcusp -mat_type aijcusp' while running the
> > code, I get the errors below. My question is whether there is an
> > example that demonstrates how to revise a code for the conversion
> > from CPU to GPU (a sketch of the usual pattern follows the error log
> > below).
> >
> > In addition, are 'seqcusp' and 'seqaijcusp' the right types to use
> > when the Vec and Mat are stored on a single GPU card?
> >
> > Thank you very much.
> >
> > [0]PETSC ERROR: --------------------- Error Message
> > ------------------------------------
> > [0]PETSC ERROR: Invalid argument!
> > [0]PETSC ERROR: Vector type seqcusp does not have local representation!
> > [0]PETSC ERROR:
> > ------------------------------------------------------------------------
> > [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 3, Fri Sep 30
> > 10:28:33 CDT 2011
> > [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> > [0]PETSC ERROR: See docs/index.html for manual pages.
> > [0]PETSC ERROR:
> > ------------------------------------------------------------------------
> > [0]PETSC ERROR:
> > /work/01820/ylu/libmesh_svn01232012/examples/myproj_sp1/myproj-opt on
> > a westmere- named c300-205.ls4.tacc.utexas.edu by ylu Wed Feb  8
> > 14:34:46 2012
> > [0]PETSC ERROR: Libraries linked from
> > /opt/apps/intel11_1/mvapich2_1_6/petsc/3.2/westmere-cuda/lib
> > [0]PETSC ERROR: Configure run at Fri Dec 16 11:27:43 2011
> > [0]PETSC ERROR: Configure options --with-x=0 -with-pic
> > --with-external-packages-dir=/var/tmp/petsc-3.2-buildroot//opt/apps/intel11_1/mvapich2_1_6/petsc/3.2/externalpackages-cuda
> > --with-mpi-compilers=1 --with-mpi-dir=/opt/apps/intel11_1/mvapich2/1.6
> > --with-scalar-type=real --with-dynamic-loading=0
> > --with-shared-libraries=0 --with-chaco=1
> > --download-chaco=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/Chaco-2.2.tar.gz
> > --with-spai=1
> > --download-spai=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/spai_3.0-mar-06.tar.gz
> > --with-hypre=1
> > --download-hypre=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/hypre-2.6.0b.tar.gz
> > --with-mumps=1
> > --download-mumps=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/MUMPS_4.9.2.tar.gz
> > --with-scalapack=1
> > --download-scalapack=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/scalapack.tgz
> > --with-blacs=1
> > --download-blacs=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/blacs-dev.tar.gz
> > --with-spooles=1
> > --download-spooles=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/spooles-2.2-dec-2008.tar.gz
> > --with-superlu=1
> > --download-superlu=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/SuperLU_4.1-December_20_2010.tar.gz
> > --with-superlu_dist=1
> > --download-superlu_dist=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/SuperLU_DIST_2.5-December_21_2010.tar.gz
> > --with-parmetis=1
> > --download-parmetis=/home1/0000/build/rpms/SOURCES/petsc-externalpackages/ParMetis-dev-p3.tar.gz
> > --with-debugging=no
> > --with-blas-lapack-dir=/opt/apps/intel/11.1/mkl/lib/em64t
> > --with-mpiexec=mpirun_rsh --with-cuda=1
> > --with-cuda-dir=/opt/apps/cuda/4.0/cuda/
> > --with-cusp-dir=/opt/apps/cuda/4.0/cuda/
> > --with-thrust-dir=/opt/apps/cuda/4.0/cuda/ --COPTFLAGS=-xW
> > --CXXOPTFLAGS=-xW --FOPTFLAGS=-xW
> > [0]PETSC ERROR:
> > ------------------------------------------------------------------------
> > [0]PETSC ERROR: VecGhostGetLocalForm() line 82 in
> > src/vec/vec/impls/mpi/commonmpvec.c
> > [0]PETSC ERROR: zero() line 974 in
> > "unknowndirectory/"/work/01820/ylu/libmesh_svn01232012/include/numerics/petsc_vector.h
> > application called MPI_Abort(comm=0x84000002, 62) - process 0
> >
> > Best,
> > Yujie
>
>
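>
>    As for converting a code from CPU to GPU: wherever the vectors and
> matrices are created with VecSetFromOptions()/MatSetFromOptions(), the
> -vec_type seqcusp and -mat_type aijcusp options select the GPU types at
> runtime, so those objects need no source changes ('seqcusp' and
> 'seqaijcusp' are the uniprocessor types, i.e. one MPI process with the
> data mirrored on one GPU). A minimal sketch of that pattern (the sizes
> are made up):
>
>     #include <petscmat.h>
>
>     int main(int argc, char **argv)
>     {
>       Vec            x;
>       Mat            A;
>       PetscErrorCode ierr;
>
>       ierr = PetscInitialize(&argc, &argv, 0, 0);CHKERRQ(ierr);
>       /* create with the default types; -vec_type and -mat_type on
>          the command line override them at runtime */
>       ierr = VecCreate(PETSC_COMM_WORLD, &x);CHKERRQ(ierr);
>       ierr = VecSetSizes(x, PETSC_DECIDE, 100);CHKERRQ(ierr);
>       ierr = VecSetFromOptions(x);CHKERRQ(ierr);
>
>       ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>       ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 100, 100);CHKERRQ(ierr);
>       ierr = MatSetFromOptions(A);CHKERRQ(ierr);
>
>       /* ... assemble and solve as usual ... */
>
>       ierr = VecDestroy(&x);CHKERRQ(ierr);
>       ierr = MatDestroy(&A);CHKERRQ(ierr);
>       ierr = PetscFinalize();
>       return 0;
>     }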