[petsc-users] Implementing the Sherman Morisson formula (low rank update) in petsc4py and FEniCS?
Matthew Knepley
knepley at gmail.com
Wed Feb 5 07:35:27 CST 2020
Perhaps Barry is right that you want Picard, but suppose you really want
Newton.
"This problem can be solved efficiently using the Sherman-Morrison formula"
Well, maybe. The main assumption here is that inverting K is cheap. I see
two things you can do in a straightforward way:
1) Use MatCreateLRC()
https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateLRC.html
to create the Jacobian
and solve using an iterative method. If you pass just K as the
preconditioning matrix, you can use common PCs (a rough petsc4py sketch
follows these two options).
2) We only implemented MatMult() for LRC, but you could stick your SMW
code in for MatSolve_LRC if you really want to factor K. We would
of course help you do this.
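A rough, untested petsc4py sketch of option 1 is below. It assumes a petsc4py
recent enough to expose Mat.createLRC (the binding for MatCreateLRC); the
helper name build_lrc_operator is just for illustration.

    from petsc4py import PETSc

    def build_lrc_operator(K, a, k):
        # K: assembled sparse PETSc Mat (the Jacobian), a: PETSc Vec, k: scalar.
        # Returns B = K - k a a^T as a MATLRC matrix; a Krylov method only needs
        # MatMult of B.
        n_local, n_global = a.getSizes()

        # U is a dense (n x 1) matrix whose single column is a
        U = PETSc.Mat().createDense(((n_local, n_global), 1), comm=K.comm)
        U.setUp()
        istart, iend = a.getOwnershipRange()
        U.setValues(list(range(istart, iend)), [0], a.getArray())
        U.assemble()

        # c holds the single weight -k, so B = K + U diag(c) U^T = K - k a a^T
        # (check the MatCreateLRC man page for the exact layout expected of c)
        c = PETSc.Vec().createSeq(1, comm=PETSc.COMM_SELF)
        c.set(-k)

        return PETSc.Mat().createLRC(K, U, c, U)

    # Usage sketch: Krylov solve with B as the operator and the sparse K as the
    # preconditioning matrix, so common PCs (ILU, AMG, ...) can be used.
    # B = build_lrc_operator(K, a, k)
    # ksp = PETSc.KSP().create(comm=K.comm)
    # ksp.setOperators(B, K)
    # ksp.setFromOptions()
    # ksp.solve(rhs, du)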
Thanks,
Matt
On Wed, Feb 5, 2020 at 1:36 AM Smith, Barry F. via petsc-users <
petsc-users at mcs.anl.gov> wrote:
>
> I am not sure of everything in your email but it sounds like you want
> to use a "Picard" iteration to solve [K(u) - k a a^T] Δu = -F(u). That is, solve
>
> A(u^{n}) (u^{n+1} - u^{n}) = F(u^{n}) - A(u^{n}) u^{n}, where A(u) = K(u)
> - k a a^T
>
> PETSc provides code to do this with SNESSetPicard() (see the manual pages).
> I don't know if petsc4py has bindings for this.
>
> Adding missing python bindings is not terribly difficult and you may be
> able to do it yourself if this is the approach you want.
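If SNESSetPicard turns out not to be wrapped, the same iteration can be set up
with setFunction()/setJacobian(), which petsc4py does have: hand SNES the
matrix A(u) in place of the true Jacobian, so a full step solves
A(u^{n}) (u^{n+1} - u^{n}) = -residual(u^{n}). A minimal serial toy (the system
A(x) x = b with A(x) = diag(2 + x_i^2) is invented purely for illustration):

    from petsc4py import PETSc

    n = 4
    b = PETSc.Vec().createSeq(n)
    b.set(1.0)

    def residual(snes, x, r):
        # F(x) = A(x) x - b  with  A(x) = diag(2 + x_i^2)
        xa = x.getArray(readonly=True)
        r.setArray((2.0 + xa**2) * xa - b.getArray(readonly=True))

    def picard_mat(snes, x, J, P):
        # return A(x) instead of the true Jacobian of F -> Picard iteration
        xa = x.getArray(readonly=True)
        P.zeroEntries()
        for i in range(n):
            P.setValue(i, i, 2.0 + xa[i]**2)
        P.assemble()

    A = PETSc.Mat().createAIJ([n, n], nnz=1, comm=PETSc.COMM_SELF)
    for i in range(n):
        A.setValue(i, i, 1.0)   # insert the diagonal once and assemble;
    A.assemble()                # later calls just overwrite these entries

    snes = PETSc.SNES().create(comm=PETSc.COMM_SELF)
    snes.setFunction(residual, PETSc.Vec().createSeq(n))
    snes.setJacobian(picard_mat, A, A)
    snes.setFromOptions()       # the default line search may damp the step

    x = PETSc.Vec().createSeq(n)
    x.set(0.0)
    snes.solve(None, x)         # each full step solves A(x^n) x^{n+1} = b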
>
> Barry
>
>
>
> > On Feb 4, 2020, at 5:07 PM, Olek Niewiarowski <aan2 at princeton.edu>
> wrote:
> >
> > Hello,
> > I am a FEniCS user but new to petsc4py. I am trying to modify the KSP
> solver through the SNES object to implement the Sherman-Morrison formula
> (e.g. http://fourier.eng.hmc.edu/e176/lectures/algebra/node6.html).
> I am solving a nonlinear system of the form [K(u) - k a a^T] Δu = -F(u).
> Here the Jacobian matrix K is modified by the term k a a^T, where k is a
> scalar and a is a vector. Notably, K is sparse, while the term k a a^T
> results in a dense matrix. This problem can be solved efficiently using
> the Sherman-Morrison formula:
> >
> > [K - k a a^T]^{-1} = K^{-1} - (k K^{-1} a a^T K^{-1}) / (1 + k a^T K^{-1} a)
> > I have managed to implement this successfully at the linear solve level
> (by modifying the KSP solver) inside a custom Newton solver in Python by
> following an incomplete tutorial at
> https://www.firedrakeproject.org/petsc-interface.html#defining-a-preconditioner :
> > while (norm(delU) > alpha):  # while not converged
> >
> >     self.update_F()  # call to method to update r.h.s form
> >     self.update_K()  # call to update the jacobian form
> >     K = assemble(self.K)  # assemble the jacobian matrix
> >     F = assemble(self.F)  # assemble the r.h.s vector
> >     a = assemble(self.a_form)  # assemble the a_form (see Sherman Morrison formula)
> >
> >     for bc in self.mem.bc:  # apply boundary conditions
> >         bc.apply(K, F)
> >         bc.apply(K, a)
> >
> >     B = PETSc.Mat().create()
> >
> >     # Assemble the bilinear form that defines A and get the concrete
> >     # PETSc matrix
> >     A = as_backend_type(K).mat()  # get the PETSc objects for K and a
> >     u = as_backend_type(a).vec()
> >
> >     # Build the matrix "context"  # see firedrake docs
> >     Bctx = MatrixFreeB(A, u, u, self.k)
> >
> >     # Set up B
> >     # B is the same size as A
> >     B.setSizes(*A.getSizes())
> >     B.setType(B.Type.PYTHON)
> >     B.setPythonContext(Bctx)
> >     B.setUp()
> >
> >     ksp = PETSc.KSP().create()  # create the KSP linear solver object
> >     ksp.setOperators(B)
> >     ksp.setUp()
> >     pc = ksp.pc
> >     pc.setType(pc.Type.PYTHON)
> >     pc.setPythonContext(MatrixFreePC())
> >     ksp.setFromOptions()
> >
> >     solution = delU  # the incremental displacement at this iteration
> >
> >     b = as_backend_type(-F).vec()
> >     delu = solution.vector().vec()
> >
> >     ksp.solve(b, delu)
> >
> >     self.mem.u.vector().axpy(0.25, self.delU.vector())  # poor man's linesearch
> >     counter += 1
> > Here is the corresponding petsc4py code adapted from the firedrake docs:
> >
> > class MatrixFreePC(object):
> >
> >     def setUp(self, pc):
> >         B, P = pc.getOperators()
> >         # extract the MatrixFreeB object from B
> >         ctx = B.getPythonContext()
> >         self.A = ctx.A
> >         self.u = ctx.u
> >         self.v = ctx.v
> >         self.k = ctx.k
> >         # Here we build the PC object that uses the concrete,
> >         # assembled matrix A. We will use this to apply the action
> >         # of A^{-1}
> >         self.pc = PETSc.PC().create()
> >         self.pc.setOptionsPrefix("mf_")
> >         self.pc.setOperators(self.A)
> >         self.pc.setFromOptions()
> >         # Since u and v do not change, we can build the denominator
> >         # and the action of A^{-1} on u only once, in the setup
> >         # phase.
> >         tmp = self.A.createVecLeft()
> >         self.pc.apply(self.u, tmp)
> >         self._Ainvu = tmp
> >         self._denom = 1 + self.k*self.v.dot(self._Ainvu)
> >
> >     def apply(self, pc, x, y):
> >         # y <- A^{-1}x
> >         self.pc.apply(x, y)
> >         # alpha <- (v^T A^{-1} x) / (1 + v^T A^{-1} u)
> >         alpha = (self.k*self.v.dot(y)) / self._denom
> >         # y <- y - alpha * A^{-1}u
> >         y.axpy(-alpha, self._Ainvu)
> >
> >
> > class MatrixFreeB(object):
> >
> >     def __init__(self, A, u, v, k):
> >         self.A = A
> >         self.u = u
> >         self.v = v
> >         self.k = k
> >
> >     def mult(self, mat, x, y):
> >         # y <- A x
> >         self.A.mult(x, y)
> >
> >         # alpha <- v^T x
> >         alpha = self.v.dot(x)
> >
> >         # y <- y + alpha*u
> >         y.axpy(alpha, self.u)
> > However, this approach is not efficient as it requires many iterations
> due to the Newton step being fixed, so I would like to implement it using
> SNES and use line search. Unfortunately, I have not been able to find any
> documentation/tutorial on how to do so. Provided I have the FEniCS forms
> for F, K, and a, I'd like to do something along the lines of:
> > solver = PETScSNESSolver() # the FEniCS SNES wrapper
> > snes = solver.snes() # the petsc4py SNES object
> > ## ??
> > ksp = snes.getKSP()
> > # set ksp option similar to above
> > solver.solve()
> >
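(A possible shape for the "## ??" block, using only calls that already appear
above; note that this only re-wires the KSP preconditioner and does not by
itself make SNES build B = K - k a a^T as its Jacobian, which is what the
MatCreateLRC / MatSolve_LRC suggestions at the top of this message address.)

    solver = PETScSNESSolver()            # the FEniCS SNES wrapper
    snes = solver.snes()                  # the petsc4py SNES object

    ksp = snes.getKSP()
    pc = ksp.getPC()
    pc.setType(PETSc.PC.Type.PYTHON)      # same Python PC as in the loop above
    pc.setPythonContext(MatrixFreePC())   # MatrixFreePC.setUp still expects the
                                          # operator's Python context (MatrixFreeB)
    ksp.setFromOptions()

    solver.solve()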
> > I would be very grateful if anyone could help or point me to a
> reference or demo that does something similar (or maybe a completely
> different way of solving the problem!).
> > Many thanks in advance!
> > Alex
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/