[petsc-users] [EXT]Re: Is using PETSc appropriate for this problem
Barry Smith
bsmith at petsc.dev
Thu Sep 17 19:19:58 CDT 2020
Yes, sorry, I meant to write non-linear smoother. What Matt is saying is that, just as non-linear Gauss-Seidel has a linear counterpart in linear Gauss-Seidel, there is also a non-linear Jacobi. As in the linear world, non-linear Gauss-Seidel generally converges faster than non-linear Jacobi (often much faster), but in the context of the full approximation scheme (nonlinear multigrid) the smoother details are only part of the convergence picture, so one can possibly use Jacobi instead of Gauss-Seidel as the smoother. PETSc has SNESFAS for implementing the full approximation scheme, as well as accelerators for it such as SNESQN, SNESANDERSON, and SNESNGMRES, which are used much the way linear Krylov methods are used to accelerate linear multigrid, linear Jacobi, or linear Gauss-Seidel.
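For reference, composing these solvers is largely a matter of run-time options. A sketch (here `./app` stands in for your application; the exact option prefixes depend on how the solvers are nested, so check the SNES/SNESFAS manual pages):

```shell
# Nonlinear multigrid (FAS) with a nonlinear Gauss-Seidel smoother on the levels:
./app -snes_type fas -fas_levels_snes_type ngs -fas_coarse_snes_type newtonls

# The same FAS cycle used as a nonlinear preconditioner, accelerated by
# nonlinear GMRES (analogous to Krylov acceleration of linear multigrid):
./app -snes_type ngmres -npc_snes_type fas -npc_fas_levels_snes_type ngs
```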
Because Jacobi (linear or nonlinear) does all its updates simultaneously, they can be "batched" as Jed notes, or can run very efficiently (if coded well) on GPUs. The fact that its convergence rate may be somewhat worse than Gauss-Seidel's may be outweighed by the fact that it can utilize the hardware much more efficiently.
I would suggest writing your non-linear Gauss-Seidel with SNES and timing it in the context of your entire application/simulation; you can always go back and customize the code for each problem size by writing little naked Newton code directly if you need the improvement in speed.
Barry
We actually struggle with this ourselves in PETSc: writing efficient patch-based smoothers is hard because of the overhead of the standard SNES/KSP solvers, which were not coded specifically for very small problems.
> On Sep 17, 2020, at 4:05 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
> On Thu, Sep 17, 2020 at 2:54 PM Alexander B Prescott <alexprescott at email.arizona.edu <mailto:alexprescott at email.arizona.edu>> wrote:
> Thank you all for your input. Matt is right, I cannot batch as this formulation must be done sequentially.
>
> >> Sounds a bit like a non-smoother (Gauss-Seidel type), speculating based on these few words.
>
> Barry, it is similar to a Gauss-Seidel solver in that solution updates from previous solves are used in the most recent Newton solve, though I'm not exactly sure what you mean by "non-smoother".
>
> He means a nonlinear smoother. You iterate over your domain solving small nonlinear problems in order to get closer to the solution
> of the big nonlinear problem. Depending on what you are doing, it might be possible to decouple these, which would likely be much
> more efficient.
>
> Thanks,
>
> Matt
>
> Best,
> Alexander
>
>
>
> On Thu, Sep 17, 2020 at 6:06 AM Matthew Knepley <knepley at gmail.com <mailto:knepley at gmail.com>> wrote:
>
> On Thu, Sep 17, 2020 at 12:23 AM Jed Brown <jed at jedbrown.org <mailto:jed at jedbrown.org>> wrote:
> Alexander B Prescott <alexprescott at email.arizona.edu <mailto:alexprescott at email.arizona.edu>> writes:
>
> >> Are the problems of varying nonlinearity, that is will some converge
> >> with say a couple of Newton iterations while others require more, say 8 or
> >> more Newton steps?
> >>
> > The nonlinearity should be pretty similar, the problem setup is the same at
> > every node but the global domain needs to be traversed in a specific order.
>
>
> It sounds like you may have a Newton solver now for each individual problem? If so, could you make a histogram of the number of iterations necessary to solve? Does it have a long tail, or does every problem take 3 or 4 iterations (for example)?
>
> If there is no long tail, then you can batch. If there is a long tail, you really want a solver that does one problem at a time, or a more dynamic system that checks which have completed and shrinks the active problem down. (That complexity has a development and execution time cost.)
>
> He cannot batch if the solves are sequential, as he says above.
>
> Matt
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>
>
>
> --
> Alexander Prescott
> alexprescott at email.arizona.edu <mailto:alexprescott at email.arizona.edu>
> PhD Candidate, The University of Arizona
> Department of Geosciences
> 1040 E. 4th Street
> Tucson, AZ, 85721
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>