[petsc-dev] MatPinToCPU

Mark Adams mfadams at lbl.gov
Sat Jul 27 10:28:17 CDT 2019


I'm not sure what to do here. The problem is that pinned-to-CPU vectors are
still calling VecCUDACopyFromGPU here.

Should I set x->valid_GPU_array to something else, like
PETSC_OFFLOAD_CPU, in PinToCPU so this block of code is not executed?

PetscErrorCode VecGetArray(Vec x,PetscScalar **a)
{
  PetscErrorCode ierr;
#if defined(PETSC_HAVE_VIENNACL)
  PetscBool      is_viennacltype = PETSC_FALSE;
#endif

  PetscFunctionBegin;
  PetscValidHeaderSpecific(x,VEC_CLASSID,1);
  ierr = VecSetErrorIfLocked(x,1);CHKERRQ(ierr);
  if (x->petscnative) {
#if defined(PETSC_HAVE_VIENNACL) || defined(PETSC_HAVE_CUDA)
    if (x->valid_GPU_array == PETSC_OFFLOAD_GPU) {
#if defined(PETSC_HAVE_VIENNACL)
      ierr = PetscObjectTypeCompareAny((PetscObject)x,&is_viennacltype,VECSEQVIENNACL,VECMPIVIENNACL,VECVIENNACL,"");CHKERRQ(ierr);
      if (is_viennacltype) {
        ierr = VecViennaCLCopyFromGPU(x);CHKERRQ(ierr);
      } else
#endif
      {
#if defined(PETSC_HAVE_CUDA)
        ierr = VecCUDACopyFromGPU(x);CHKERRQ(ierr);
#endif
      }
    } else if (x->valid_GPU_array == PETSC_OFFLOAD_UNALLOCATED) {
#if defined(PETSC_HAVE_VIENNACL)
      ierr = PetscObjectTypeCompareAny((PetscObject)x,&is_viennacltype,VECSEQVIENNACL,VECMPIVIENNACL,VECVIENNACL,"");CHKERRQ(ierr);
      if (is_viennacltype) {
        ierr = VecViennaCLAllocateCheckHost(x);CHKERRQ(ierr);
      } else
#endif
      {
#if defined(PETSC_HAVE_CUDA)
        ierr = VecCUDAAllocateCheckHost(x);CHKERRQ(ierr);
#endif
      }
    }
#endif
    *a = *((PetscScalar**)x->data);
  } else {
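
Concretely, here is the kind of thing I have in mind for the CUDA vector pin
routine. This is just a sketch to make the question concrete; the function
name is my guess and I have not checked what the existing Vec pinning code
looks like:

static PetscErrorCode VecPinToCPU_SeqCUDA(Vec v,PetscBool pin)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  if (pin) {
    /* bring the data down once, then mark the host copy as the valid one so
       the PETSC_OFFLOAD_GPU branch in VecGetArray() above is never taken */
    if (v->valid_GPU_array == PETSC_OFFLOAD_GPU) {
      ierr = VecCUDACopyFromGPU(v);CHKERRQ(ierr);
    }
    v->valid_GPU_array = PETSC_OFFLOAD_CPU;
  }
  PetscFunctionReturn(0);
}

Presumably it would also swap the v->ops entries back to the plain Seq
versions, the same way the ViennaCL Mat routine quoted below does for the
Mat ops.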


On Tue, Jul 23, 2019 at 9:18 PM Smith, Barry F. <bsmith at mcs.anl.gov> wrote:

>
>  Yes, it needs to be able to switch back and forth between the CPU and GPU
> methods, so you need to move the setting of the methods, which currently
> happens directly in the create method, into it. See how
> MatConvert_SeqAIJ_SeqAIJViennaCL() calls
> ierr = MatPinToCPU_SeqAIJViennaCL(A,PETSC_FALSE);CHKERRQ(ierr); to set the
> methods for the GPU initially.
>
>   Barry
>
>
> > On Jul 23, 2019, at 7:32 PM, Mark Adams <mfadams at lbl.gov> wrote:
> >
> >
> >   What are the symptoms of it not working? Does it appear to be still
> copying the matrices to the GPU? Then running the functions on the GPU?
> >
> >
> > The object is dispatching the CUDA mat-vec etc.
> >
> >   I suspect the pinning is incompletely done for CUDA (and MPIOpenCL)
> matrices.
> >
> >
> > Yes, git grep MatPinToCPU shows stuff for ViennaCL but not CUDA.
> >
> > I guess I can add something like this below. Do we need to set the
> device methods? They are already set when this method is set, right?
> >
> > We need the equivalent of
> >
> > static PetscErrorCode MatPinToCPU_SeqAIJViennaCL(Mat A,PetscBool flg)
> > {
> >   PetscFunctionBegin;
> >   A->pinnedtocpu = flg;
> >   if (flg) {
> >     A->ops->mult           = MatMult_SeqAIJ;
> >     A->ops->multadd        = MatMultAdd_SeqAIJ;
> >     A->ops->assemblyend    = MatAssemblyEnd_SeqAIJ;
> >     A->ops->duplicate      = MatDuplicate_SeqAIJ;
> >   } else {
> >     A->ops->mult           = MatMult_SeqAIJViennaCL;
> >     A->ops->multadd        = MatMultAdd_SeqAIJViennaCL;
> >     A->ops->assemblyend    = MatAssemblyEnd_SeqAIJViennaCL;
> >     A->ops->destroy        = MatDestroy_SeqAIJViennaCL;
> >     A->ops->duplicate      = MatDuplicate_SeqAIJViennaCL;
> >   }
> >   PetscFunctionReturn(0);
> > }
> >
> > for MPIViennaCL and for the Seq/MPI AIJ CUSPARSE matrices, but it doesn't
> > look like it has been written yet.
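
For the CUSPARSE side I am picturing something along these lines (again only
a sketch; I have not checked that the MatXxx_SeqAIJCUSPARSE names below match
the routines that actually exist in aijcusparse, so treat them as guesses):

static PetscErrorCode MatPinToCPU_SeqAIJCUSPARSE(Mat A,PetscBool flg)
{
  PetscFunctionBegin;
  A->pinnedtocpu = flg;
  if (flg) {
    /* dispatch everything through the plain SeqAIJ host kernels */
    A->ops->mult        = MatMult_SeqAIJ;
    A->ops->multadd     = MatMultAdd_SeqAIJ;
    A->ops->assemblyend = MatAssemblyEnd_SeqAIJ;
    A->ops->duplicate   = MatDuplicate_SeqAIJ;
  } else {
    /* restore the CUSPARSE kernels */
    A->ops->mult        = MatMult_SeqAIJCUSPARSE;
    A->ops->multadd     = MatMultAdd_SeqAIJCUSPARSE;
    A->ops->assemblyend = MatAssemblyEnd_SeqAIJCUSPARSE;
    A->ops->destroy     = MatDestroy_SeqAIJCUSPARSE;
  }
  PetscFunctionReturn(0);
}

and then, following Barry's suggestion above, the create/convert routine would
call MatPinToCPU_SeqAIJCUSPARSE(A,PETSC_FALSE) to set the GPU methods
initially, the way MatConvert_SeqAIJ_SeqAIJViennaCL() does.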
> >
> >
> > >
> > > It does not seem to work. It does not look like CUDA has a
> MatCreateVecs. Should I add one and copy this flag over?
> >
> >    We do need this function. But I don't see how it relates to pinning.
> When the matrix is pinned to the CPU we want it to create CPU vectors, which
> I assume it does.
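
For the missing MatCreateVecs, something like this is what I was thinking of
(a sketch only; the routine name, and the assumption that VecPinToCPU is the
right way to carry the flag over to the created vectors, are my guesses):

static PetscErrorCode MatCreateVecs_SeqAIJCUSPARSE(Mat A,Vec *right,Vec *left)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  if (right) {
    ierr = VecCreate(PetscObjectComm((PetscObject)A),right);CHKERRQ(ierr);
    ierr = VecSetSizes(*right,A->cmap->n,PETSC_DETERMINE);CHKERRQ(ierr);
    ierr = VecSetType(*right,VECCUDA);CHKERRQ(ierr);
    ierr = VecPinToCPU(*right,A->pinnedtocpu);CHKERRQ(ierr); /* inherit the pin flag */
  }
  if (left) {
    ierr = VecCreate(PetscObjectComm((PetscObject)A),left);CHKERRQ(ierr);
    ierr = VecSetSizes(*left,A->rmap->n,PETSC_DETERMINE);CHKERRQ(ierr);
    ierr = VecSetType(*left,VECCUDA);CHKERRQ(ierr);
    ierr = VecPinToCPU(*left,A->pinnedtocpu);CHKERRQ(ierr);  /* inherit the pin flag */
  }
  PetscFunctionReturn(0);
}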
> >
> >
> > >
> > > Mark
> >
>
>