[petsc-users] Function evaluation slowness ?
Timothée Nicolas
timothee.nicolas at gmail.com
Tue Aug 25 00:45:33 CDT 2015
Hi,
I am testing PETSc on the supercomputer where I used to run my explicit MHD
code. For my tests I use 256 processes on a problem of size 128*128*640 =
10485760, that is, 40960 grid points per process, and 8 degrees of freedom
(or physical fields). The explicit code was using Runge-Kutta 4 for the
time scheme, which means 4 function evaluations per time step (plus one
operation to combine everything, but let's neglect that one).
I could thus easily determine that the typical time required for a function
evaluation was of the order of 50 ms.
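(As a side note, one way to time the residual on the PETSc side as well is to wrap the
callback in a registered log event, so the log summary (-log_summary) reports its count
and total time. This is only a sketch with placeholder names, the physics is elided:)

#include <petscsnes.h>

static PetscLogEvent RHS_EVENT;

/* Residual callback registered with SNESSetFunction(); only the timing
   wrapper is shown, the MHD right-hand side itself is elided. */
static PetscErrorCode FormFunction(SNES snes, Vec x, Vec f, void *ctx)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PetscLogEventBegin(RHS_EVENT, 0, 0, 0, 0);CHKERRQ(ierr);
  /* ... compute the residual F(x) into f here ... */
  ierr = PetscLogEventEnd(RHS_EVENT, 0, 0, 0, 0);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* during setup: PetscLogEventRegister("MHD RHS", SNES_CLASSID, &RHS_EVENT); */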
Now with the implicit Newton-Krylov solver written with PETSc, in its present
state where I have not yet implemented any Jacobian or preconditioner
(so I run with -snes_mf), I measure a typical time between two time steps of
5 to 20 seconds, and the number of function evaluations per time step reported
by SNESGetNumberFunctionEvals is 17 (I am speaking of one particular case, of course).
This means a time per function evaluation of about 0.5 to 1 second, that
is, 10 to 20 times slower than in the explicit code.
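(For completeness, one way to extract these counters and the timing after each solve,
just as a sketch with placeholder variable names:)

PetscInt       nfuncs, newton_its, lin_its;
PetscLogDouble t0, t1;   /* PetscTime() is declared in petsctime.h */

ierr = PetscTime(&t0);CHKERRQ(ierr);
ierr = SNESSolve(snes, NULL, x);CHKERRQ(ierr);
ierr = PetscTime(&t1);CHKERRQ(ierr);

ierr = SNESGetNumberFunctionEvals(snes, &nfuncs);CHKERRQ(ierr);
ierr = SNESGetIterationNumber(snes, &newton_its);CHKERRQ(ierr);
ierr = SNESGetLinearSolveIterations(snes, &lin_its);CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD,
                   "%g s, %D Newton its, %D Krylov its, %D function evals -> %g s per eval\n",
                   t1 - t0, newton_its, lin_its, nfuncs, (t1 - t0) / nfuncs);CHKERRQ(ierr);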
So I have some questions about this.
1. First, does SNESGetNumberFunctionEvals take into account the function
evaluations required to apply the Jacobian when -snes_mf is used, as well
as those required by the GMRES (Krylov) method? If it does, I would
intuitively expect a number larger than 17, which could explain the
increase in time.
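(The rough bookkeeping I have in mind, assuming each matrix-free Jacobian application
inside GMRES costs one extra residual evaluation; I have not checked this against the
source, so it is only a guess:

  evals per time step ≈ (Newton iterations)        [one residual each]
                      + (total Krylov iterations)  [matrix-free MatMults]
                      + (line search evaluations, if any)

which is why I would intuitively expect a number well above the bare Newton count.)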
2. In any case, I thought that, all things considered, the function
evaluation would be the most time-consuming part of a Newton-Krylov solver;
am I completely wrong about that? Is a factor of 10 to 20 to be expected?
I realize of course that preconditioning should make all this smoother, in
particular allowing larger time steps, but here I am just concerned about
the sheer function evaluation time.
Best regards
Timothee NICOLAS