[petsc-dev] memory usage with GPUs

Barry Smith bsmith at mcs.anl.gov
Fri Jul 29 09:46:24 CDT 2016


  Do you know if Nvidia provides an API to determine how much GPU memory is currently allocated? If it does, then that could easily be hooked up in the same style as the PetscMemory...() routines, for example PetscGPUMemory...(). If they don't provide an API, then obviously there is no way to get that data.

   Barry
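
For reference, the CUDA runtime does expose cudaMemGetInfo(), which reports the free and total memory on the current device (device-wide, so it also counts allocations made outside PETSc). A hypothetical PetscGPUMemoryGetCurrentUsage(), sketched here in the style of the existing PetscMemory...() routines, might look roughly like this:

#include <petscsys.h>
#include <cuda_runtime.h>

/* Sketch only: the routine name is hypothetical, and cudaMemGetInfo() reports
   device-wide usage, not just memory allocated by this process or by PETSc. */
PetscErrorCode PetscGPUMemoryGetCurrentUsage(PetscLogDouble *mem)
{
  size_t free_bytes, total_bytes;

  PetscFunctionBegin;
  if (cudaMemGetInfo(&free_bytes, &total_bytes) != cudaSuccess) {
    SETERRQ(PETSC_COMM_SELF, PETSC_ERR_LIB, "cudaMemGetInfo() failed");
  }
  *mem = (PetscLogDouble)(total_bytes - free_bytes);
  PetscFunctionReturn(0);
}

Per-process accounting would instead require wrapping cudaMalloc()/cudaFree() inside PETSc, which is closer to the separate logging of GPU allocations discussed below.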

> On Jul 27, 2016, at 8:18 PM, Mark Adams <mfadams at lbl.gov> wrote:
> 
> Note, I wanted to use this for a PETSc code that uses GPUs, but not through PETSc.
> 
> So my case is a little odd and I did not expect support, but I just wanted to check.
> 
> On Wednesday, July 27, 2016, Karl Rupp <rupp at iue.tuwien.ac.at> wrote:
> 
>    We (meaning someone) should add logging of GPU allocations separately.
> 
> Agreed. I'll do it (though not immediately, as there are other things with higher priority to make GPU support in PETSc more useful).
> 
> Best regards,
> Karli
> 
> 
> 
> On Jul 27, 2016, at 1:41 PM, Matthew Knepley <knepley at gmail.com> wrote:
> 
> No, although since we mirror storage on the host, you could get an idea if you broke the GPU vecs out into a separate stage.
> 
>     Matt
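
As a rough illustration of that suggestion (the stage name, vector size, and VECCUDA type below are illustrative; older builds use VECCUSP or VECVIENNACL), the GPU vectors could be created inside their own logging stage so that the per-stage object/memory summary from -log_view reports them separately:

#include <petscvec.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscLogStage  gpu_stage;
  Vec            v;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  ierr = PetscLogStageRegister("GPU vectors", &gpu_stage);CHKERRQ(ierr);
  ierr = PetscLogStagePush(gpu_stage);CHKERRQ(ierr);

  /* Vectors created here are attributed to the "GPU vectors" stage */
  ierr = VecCreate(PETSC_COMM_WORLD, &v);CHKERRQ(ierr);
  ierr = VecSetSizes(v, PETSC_DECIDE, 1000000);CHKERRQ(ierr);
  ierr = VecSetType(v, VECCUDA);CHKERRQ(ierr); /* assumed type; VECCUSP/VECVIENNACL in other builds */
  ierr = VecSetUp(v);CHKERRQ(ierr);

  ierr = PetscLogStagePop();CHKERRQ(ierr);

  ierr = VecDestroy(&v);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Since the storage is mirrored, the memory reported for that stage is the host-side copy of the GPU vectors, which only gives an estimate of the corresponding device usage.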
> 
> On Wed, Jul 27, 2016 at 9:05 AM, Mark Adams <mfadams at lbl.gov> wrote:
> Would/could PETSc memory usage methods (PetscMemoryGetMaximumUsage) pick up GPU memory usage?
> 
> 
> 
> 
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 
> 



