[petsc-users] CPU utilization during GPU solver

Matthew Knepley knepley at gmail.com
Sat Nov 17 11:02:22 CST 2012

On Sat, Nov 17, 2012 at 10:50 AM, David Fuentes <fuentesdt at gmail.com> wrote:
> Hi,
> I'm using petsc 3.3p4
> I'm trying to run a nonlinear SNES solver on the GPU with GMRES and a Jacobi PC,
> using the VECSEQCUSP and MATSEQAIJCUSP datatypes for the rhs and Jacobian matrix,
> respectively.
> When running top I still see significant CPU utilization (800-900 %CPU)
> during the solve, possibly from some multithreaded operations. Is this
> expected? I was thinking that since I pass everything into the solver as
> CUSP datatypes, all linear algebra operations would run on the GPU device
> from there, and I wasn't expecting to see such CPU utilization during the
> solve. Do I perhaps have an error in my code somewhere?
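[Editor's note: for readers unfamiliar with the setup being described, this is
roughly how the CUSP types are selected in PETSc 3.3 - a minimal sketch, not
the poster's actual code; the size 100 and the serial communicator are
arbitrary assumptions, and building it requires a PETSc configured with
--with-cusp.]

```c
#include <petscsnes.h>   /* pulls in the Vec/Mat/KSP/SNES interfaces */

int main(int argc, char **argv)
{
  Vec x;   /* rhs/solution vector, to be stored on the GPU */
  Mat J;   /* Jacobian, to be stored on the GPU in CUSP format */

  PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);

  VecCreate(PETSC_COMM_SELF, &x);
  VecSetSizes(x, PETSC_DECIDE, 100);   /* arbitrary example size */
  VecSetType(x, VECSEQCUSP);           /* GPU vector type */

  MatCreate(PETSC_COMM_SELF, &J);
  MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, 100, 100);
  MatSetType(J, MATSEQAIJCUSP);        /* GPU matrix type */

  /* ... assemble J, create the SNES, and solve ... */

  VecDestroy(&x);
  MatDestroy(&J);
  PetscFinalize();
  return 0;
}
```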

We cannot answer performance questions without -log_summary output.
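[Editor's note: -log_summary is a runtime option, passed when rerunning the
program; a sketch, where ./app is a hypothetical executable name standing in
for the poster's SNES program.]

```shell
# Rerun the solve with profiling enabled; -log_summary prints a table of
# time and flops per operation, which shows where the CPU time is going.
./app -snes_monitor \
      -ksp_type gmres -pc_type jacobi \
      -log_summary
```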


> Thanks,
> David

What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener
