[petsc-users] Questions about setting values for GPU based matrices
Matthew Knepley
knepley at gmail.com
Fri Oct 28 11:32:00 CDT 2011
On Fri, Oct 28, 2011 at 10:24 AM, Fredrik Heffer Valdmanis <
fredva at ifi.uio.no> wrote:
> Hi,
>
> I am working on integrating the new GPU based vectors and matrices into
> FEniCS. Now, I'm looking at the possibility for getting some speedup during
> finite element assembly, specifically when inserting the local element
> matrix into the global element matrix. In that regard, I have a few
> questions I hope you can help me out with:
>
> - When calling MatSetValues with a MATSEQAIJCUSP matrix as parameter, what
> exactly is it that happens? As far as I can see, MatSetValues is not
> implemented for GPU based matrices, neither is the mat->ops->setvalues set
> to point at any function for this Mat type.
>
Yes, MatSetValues always operates on the CPU side. It would not make sense
to do individual operations on the GPU.
I have written a batched assembly routine for element matrices that are all
the same size:
http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Mat/MatSetValuesBatch.html
> - Is it such that matrices are assembled in their entirety on the CPU, and
> then copied over to the GPU (after calling MatAssemblyBegin)? Or are values
> copied over to the GPU each time you call MatSetValues?
>
MatSetValuesBatch assembles the matrix on the GPU and then copies it back to
the CPU. The only time you would not want that copy is when you are running
in serial and never touch the matrix on the CPU afterwards, so I left the
copy in.
> - Can we expect to see any speedup from using MatSetValuesBatch over
> MatSetValues, or is the batch version simply a utility function? This
> question goes for both CPU- and GPU-based matrices.
>
CPU: no
GPU: yes, the speedup I see is roughly the ratio of GPU to CPU memory
bandwidth
Matt
> Thanks,
>
> Fredrik V
>
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener