[petsc-users] CPU utilization during GPU solver

David Fuentes fuentesdt at gmail.com
Sat Nov 17 09:50:38 CST 2012


Hi,

I'm using petsc 3.3p4
I'm trying to run a nonlinear SNES solver on the GPU with GMRES and a Jacobi PC,
using the VECSEQCUSP and MATSEQAIJCUSP datatypes for the rhs vector and Jacobian
matrix, respectively.
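
For reference, the relevant setup looks roughly like the sketch below (FormFunction,
FormJacobian, and the problem size n are placeholders for my actual code):

#include <petscsnes.h>

/* placeholders for the actual residual/Jacobian callbacks */
extern PetscErrorCode FormFunction(SNES, Vec, Vec, void *);
extern PetscErrorCode FormJacobian(SNES, Vec, Mat *, Mat *, MatStructure *, void *);

  SNES     snes;
  KSP      ksp;
  PC       pc;
  Vec      x, r;
  Mat      J;
  PetscInt n = 100;                 /* placeholder problem size */

  VecCreate(PETSC_COMM_SELF, &x);
  VecSetSizes(x, PETSC_DECIDE, n);
  VecSetType(x, VECSEQCUSP);        /* GPU vector type */
  VecDuplicate(x, &r);

  MatCreate(PETSC_COMM_SELF, &J);
  MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetType(J, MATSEQAIJCUSP);     /* GPU matrix type; preallocation/assembly done in FormJacobian */

  SNESCreate(PETSC_COMM_SELF, &snes);
  SNESSetFunction(snes, r, FormFunction, PETSC_NULL);
  SNESSetJacobian(snes, J, J, FormJacobian, PETSC_NULL);

  SNESGetKSP(snes, &ksp);
  KSPSetType(ksp, KSPGMRES);        /* GMRES Krylov solver */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCJACOBI);          /* Jacobi preconditioner */

  SNESSetFromOptions(snes);
  SNESSolve(snes, PETSC_NULL, x);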
When running top I still see significant CPU utilization (800-900 %CPU)
during the solve, possibly from some multithreaded operations?

Is this expected?
I was thinking that since I pass everything into the solver as CUSP
datatypes, all linear algebra operations would run on the GPU device from
there, so I wasn't expecting to see such CPU utilization during the solve.
Or do I perhaps have an error in my code somewhere?

Thanks,
David