[petsc-users] SNES: approximating the Jacobian with computed residuals?

Barry Smith bsmith at mcs.anl.gov
Tue Apr 22 12:50:27 CDT 2014


On Apr 22, 2014, at 11:47 AM, Peter Brune <prbrune at gmail.com> wrote:

> -snes_view would also be useful for basic sanity checks.

   and -snes_monitor ?


> 
> 
> On Tue, Apr 22, 2014 at 11:43 AM, Peter Brune <prbrune at gmail.com> wrote:
> 
> 
> 
> On Tue, Apr 22, 2014 at 10:56 AM, Fischer, Greg A. <fischega at westinghouse.com> wrote:
>  
> 
> From: Peter Brune [mailto:prbrune at gmail.com] 
> Sent: Tuesday, April 22, 2014 10:16 AM
> To: Fischer, Greg A.
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] SNES: approximating the Jacobian with computed residuals?
> 
>  
> 
> On Tue, Apr 22, 2014 at 8:48 AM, Fischer, Greg A. <fischega at westinghouse.com> wrote:
> 
> Hello PETSc-users,
> 
> I'm using the SNES component with the NGMRES method in my application. I'm using a matrix-free context for the Jacobian and the MatMFFDComputeJacobian() function in my FormJacobian routine. My understanding is that this effectively approximates the Jacobian using the equation at the bottom of Page 103 in the PETSc User's Manual. This works, but the expense of computing two function evaluations in each SNES iteration nearly wipes out the performance improvements over Picard iteration.
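> 
> For concreteness, my setup is roughly the following (a trimmed sketch; error checking omitted, and the exact SNESSetJacobian calling sequence depends on the PETSc version):
> 
>   Mat J;
>   MatCreateSNESMF(snes, &J);      /* matrix-free (MFFD) Jacobian context tied to the SNES residual */
>   SNESSetJacobian(snes, J, J, MatMFFDComputeJacobian, NULL);
> 
> so the Jacobian is never formed explicitly; its action is approximated by differencing the residual, J(u)*a ~= (F(u + h*a) - F(u))/h.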
> 
>  
> 
> Try -snes_type anderson.  It's less stable than NGMRES, but requires one function evaluation per iteration.  The manual is out of date.  I guess it's time to fix that.  It's interesting that the cost of matrix assembly and a linear solve is around the same as that of a function evaluation.  Output from -log_summary would help in the diagnosis.
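> 
> For example, something along the lines of
> 
>   ./app -snes_type anderson -snes_converged_reason -log_summary
> 
> (with ./app standing in for your executable) would show both the convergence behavior and where the time actually goes.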
> 
>  
> 
> I tried the -snes_type anderson option, and it seems to require even more function evaluations than the Picard iterations. I've attached the -log_summary output. This seems strange, because I can use the NLKAIN code (http://nlkain.sourceforge.net/) to fairly good effect, and I've read that it's related to Anderson mixing. Would it be useful to adjust the parameters?
> 
> 
> If I recall correctly, NLKAIN is yet another improvement on Anderson mixing.  Our NGMRES is the method described in O/W and is built largely around being nonlinearly preconditionable with something strong like FAS.  What is the perceived difference in convergence (what does -snes_monitor say)?  Any number of tolerances may differ between the two methods, and it's hard to judge without knowing much, much more.  Seeing what happens when you change the parameters is of course important if you're looking at performance.
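> 
> If you want to try the nonlinearly preconditioned variant, it is selected with options along the lines of
> 
>   -snes_type ngmres -npc_snes_type fas
> 
> though the exact option prefix for the nonlinear preconditioner depends on the PETSc version, so check -help.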
> 
> By Picard, you mean simple fixed-point iteration, right?  What constitutes a Picard iteration is a longstanding argument on this list and therefore requires clarification, unfortunately. :)  This (without linesearch) can be duplicated in PETSc with -snes_type nrichardson -snes_linesearch_type basic.  For a typical problem one must damp this with -snes_linesearch_damping <damping parameter>.  That is what the linesearch is there to avoid, but the linesearch costs extra function evaluations.
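> 
> A concrete starting point would be
> 
>   -snes_type nrichardson -snes_linesearch_type basic -snes_linesearch_damping 0.5
> 
> where 0.5 is only a placeholder; an appropriate damping parameter is problem dependent.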
>  
> 
>  
> 
> I've also attached -log_summary output for NGMRES. Does anything jump out as being amiss?
> 
> 
>       ##########################################################
>       #                                                        #
>       #                          WARNING!!!                    #
>       #                                                        #
>       #   This code was compiled with a debugging option,      #
>       #   To get timing results run ./configure                #
>       #   using --with-debugging=no, the performance will      #
>       #   be generally two or three times faster.              #
>       #                                                        #
>       ##########################################################
>  
> Timing comparisons aren't reasonable with debugging on.
> 
> 
> Based on my (limited) understanding of the Oosterlee/Washio SIAM paper ("Krylov Subspace Acceleration of Nonlinear Multigrid..."), they seem to suggest that it is possible to approximate the Jacobian with a series of previously computed residuals (eq. 2.14), rather than with additional function evaluations at each iteration. Is this correct? If so, could someone point me to a reference that demonstrates how to do this with PETSc?
> 
>  
> 
> What indication do you have that the Jacobian is calculated at all in the NGMRES method?  The two function evaluations are related to computing the quantities labeled F(u_M) and F(u_A) in O/W.  We already use the Jacobian approximation for the minimization problem (2.14).
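> 
> Roughly, in O/W's notation (paraphrasing from memory), the accelerated iterate is
> 
>   u_A = u_M + sum_i alpha_i (u_i - u_M),
> 
> with the alpha_i chosen to minimize || F(u_M) + sum_i alpha_i (F(u_i) - F(u_M)) ||.  That minimization reuses the stored residuals F(u_i) from previous iterates and needs no extra function evaluations or Jacobian applications of its own.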
> 
>  
> 
> - Peter
> 
>  
> 
> Thanks for the clarification.
> 
>  
> 
> -Greg
> 
>  
> 
> 
> Or perhaps a better question to ask is: are there other ways of reducing the computational burden associated with estimating the Jacobian?
> 
> Thanks,
> Greg