[petsc-users] Regarding the status of VecSetValues(Blocked) for GPU vectors

Mark Adams mfadams at lbl.gov
Thu Mar 17 19:19:37 CDT 2022


LocalToGlobal is a DM thing.
Sajid, do you use DM?
If you need to add off-processor entries, then DM could give you a local
vector, as Matt said, that you can add off-processor values to, and then
you could use the CPU communication in DM.
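A rough sketch of that pattern (the function name and the constant fill
here are just illustrative, and this assumes a recent PETSc with the
PetscCall() error-checking macros):

/* Illustrative sketch: assemble into a DM local vector, which has room
   for ghost (off-process) entries, then let DMLocalToGlobal() do the
   communication. With CUDA/Kokkos vector types the scatter inside
   DMLocalToGlobal() goes through VecScatter/PetscSF and can stay on
   device. */
#include <petscdm.h>

PetscErrorCode AssembleViaLocalVector(DM dm, Vec g)
{
  Vec          l;
  PetscScalar *a;
  PetscInt     n;

  PetscFunctionBeginUser;
  PetscCall(DMGetLocalVector(dm, &l));
  PetscCall(VecZeroEntries(l));
  PetscCall(VecGetLocalSize(l, &n));

  /* Host access shown for brevity; with a device vector one could
     instead use VecGetArrayAndMemType() to get a device pointer and
     fill it from a kernel before restoring. */
  PetscCall(VecGetArray(l, &a));
  for (PetscInt i = 0; i < n; i++) a[i] = 1.0; /* owned + ghost entries */
  PetscCall(VecRestoreArray(l, &a));

  /* ADD_VALUES sums contributions made to the same point on different
     ranks into the global vector g */
  PetscCall(DMLocalToGlobal(dm, l, ADD_VALUES, g));
  PetscCall(DMRestoreLocalVector(dm, &l));
  PetscFunctionReturn(0);
}

With a GPU vector type and GPU-aware MPI, the LocalToGlobal
communication can avoid copying back to the host, as Matt notes below.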

On Thu, Mar 17, 2022 at 7:19 PM Matthew Knepley <knepley at gmail.com> wrote:

> On Thu, Mar 17, 2022 at 4:46 PM Sajid Ali Syed <sasyed at fnal.gov> wrote:
>
>> Hi PETSc-developers,
>>
>> Is it possible to use VecSetValues with distributed-memory CUDA & Kokkos
>> vectors from the device, i.e. can I call VecSetValues with GPU memory
>> pointers and expect PETSc to figure out how to stash it on the device until
>> I call VecAssemblyBegin (at which point PETSc could use GPU-aware MPI to
>> populate off-process values) ?
>>
>> If this is not currently supported, is supporting this on the roadmap?
>> Thanks in advance!
>>
>
> VecSetValues() will fall back to the CPU vector, so I do not think this
> will work on device.
>
> Usually, our assembly computes all values and puts them in a "local"
> vector, which you can access explicitly as Mark said. Then
> we call LocalToGlobal() to communicate the values, which does work
> directly on device using specialized code in VecScatter/PetscSF.
>
> What are you trying to do?
>
>   Thanks,
>
>       Matt
>
>
>> Thank You,
>> Sajid Ali (he/him) | Research Associate
>> Scientific Computing Division
>> Fermi National Accelerator Laboratory
>> s-sajid-ali.github.io
>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>