[petsc-users] 32-bit vs 64-bit GPU support
Rohan Yadav
rohany at alumni.cmu.edu
Fri Aug 11 14:31:15 CDT 2023
> We do not currently have any code for using 64 bit integer sizes on
> the GPUs.
Thank you, just wanted confirmation.
> Given the current memory available on GPUs, is 64 bit integer support
> needed? I think even a single vector of length 2^31 will use up most of
> the GPU's memory? Are there practical, not synthetic, situations that
> require 64 bit integer support on GPUs immediately? For example, is the
> length of the entire parallel vector across all GPUs limited to 32 bits?
With modern GPU memory sizes, for example an A100 with 80GB of memory, a
vector of length 2^31 is not that much memory -- one could conceivably run a
CG solve with local vectors of length > 2^31.
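
(Back-of-the-envelope, assuming 8-byte PetscScalars and ignoring the index
arrays:

    2^31 entries x 8 bytes/entry = 2^34 bytes = 16 GiB

which is only around a fifth of an 80GB A100, so even a few CG work vectors
of that length would still fit on a single device.)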
Thanks Junchao, I might look into that. However, I am currently not trying
to solve such a large problem -- these questions just came from wondering
why the cuSPARSE kernel that PETSc was calling was running faster than mine.
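
For reference, here is a minimal sketch (not PETSc's actual code) of an SpMV
through cuSPARSE's generic API with everything kept at 32-bit indices, which
is roughly the path a default 32-bit-PetscInt build ends up on. The function
name and device pointers are placeholders, and CUSPARSE_SPMV_ALG_DEFAULT
assumes a reasonably recent CUDA toolkit:

    #include <cuda_runtime.h>
    #include <cusparse.h>

    /* Sketch: y = A*x for a CSR matrix already resident on the GPU; error
       checking omitted for brevity.  The index arrays are plain 32-bit ints,
       matching a default (32-bit PetscInt) build.  Names are placeholders. */
    void spmv_csr_32bit(cusparseHandle_t handle,
                        int nrows, int ncols, int nnz,
                        int *d_rowptr, int *d_colind, double *d_val,
                        double *d_x, double *d_y)
    {
      cusparseSpMatDescr_t A;
      cusparseDnVecDescr_t x, y;
      double alpha = 1.0, beta = 0.0;
      size_t bufsize = 0;
      void *dbuf = NULL;

      /* Describe the matrix with 32-bit row offsets and column indices */
      cusparseCreateCsr(&A, nrows, ncols, nnz, d_rowptr, d_colind, d_val,
                        CUSPARSE_INDEX_32I, CUSPARSE_INDEX_32I,
                        CUSPARSE_INDEX_BASE_ZERO, CUDA_R_64F);
      cusparseCreateDnVec(&x, ncols, d_x, CUDA_R_64F);
      cusparseCreateDnVec(&y, nrows, d_y, CUDA_R_64F);

      /* Query and allocate the workspace, then run the multiply */
      cusparseSpMV_bufferSize(handle, CUSPARSE_OPERATION_NON_TRANSPOSE,
                              &alpha, A, x, &beta, y, CUDA_R_64F,
                              CUSPARSE_SPMV_ALG_DEFAULT, &bufsize);
      cudaMalloc(&dbuf, bufsize);
      cusparseSpMV(handle, CUSPARSE_OPERATION_NON_TRANSPOSE,
                   &alpha, A, x, &beta, y, CUDA_R_64F,
                   CUSPARSE_SPMV_ALG_DEFAULT, dbuf);

      cudaFree(dbuf);
      cusparseDestroySpMat(A);
      cusparseDestroyDnVec(x);
      cusparseDestroyDnVec(y);
    }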
Rohan