[petsc-users] Multigrid with PML

Mark Adams mfadams at lbl.gov
Fri Jul 15 03:46:47 CDT 2016


On Thu, Jul 14, 2016 at 9:10 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>    This is a very difficult problem. I am not surprised that GAMG performs
> poorly, I would be surprised if it performed well at all.
>
>    I think you need to do some googling of   "helmholtz PML linear system
> solve" to find what other people have used. The first hit I got was this
> http://www.math.tau.ac.il/services/phd/dissertations/Singer_Ido.pdf and
> every iterative method he tried ended up requiring MANY iterations with
> refinement. This is 14 years old so there will be better suggestions out
> there. One that caught my eye was
> http://www.sciencedirect.com/science/article/pii/S0022247X11005063
>
>
>   Barry
>
> Just looking at the matrix makes it clear to me that conventional
> iterative methods are not going to work well: many of the diagonal entries
> are zero, and even in rows with a diagonal entry, it is much smaller in
> magnitude than the off-diagonal entries.
>

Indefinite Helmholtz is hard unless the shift is small. These zero
diagonals must come from the PML.

First get rid of the PML and see if you can solve anything to your
satisfaction.

I have a paper on this, using AMG, and I tried to be inclusive, but I did
miss a potentially useful method: adding a complex shift to damp the
system. You can Google something like 'complex shift helmholtz damp'. If
you are shifting deep (high-frequency Helmholtz), then use direct solvers.
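The complex-shift idea can be sketched outside PETSc. Below is a minimal
SciPy illustration (not PETSc code, and not from this thread): a 1-D
Helmholtz matrix is preconditioned by an exactly factored complex-shifted
copy of itself. In practice the shifted operator would be approximated by a
multigrid cycle (e.g. GAMG) rather than factored; the wavenumber, grid size,
and the shift 1 + 0.5i are illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200                       # interior grid points (illustrative)
h = 1.0 / (n + 1)
k = 20.0                      # wavenumber (illustrative)

# 1-D Laplacian with Dirichlet BCs, second-order finite differences
Lap = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2

# Indefinite Helmholtz operator and its complex-shifted (damped) copy
A = (Lap - k**2 * sp.identity(n)).astype(complex).tocsc()
P = (Lap - (1 + 0.5j) * k**2 * sp.identity(n)).tocsc()

# Stand-in for a multigrid cycle applied to P: factor it exactly
Plu = spla.splu(P)
M = spla.LinearOperator(A.shape, matvec=Plu.solve, dtype=complex)

b = np.ones(n, dtype=complex)
iters = []
x, info = spla.gmres(A, b, M=M, callback=lambda r: iters.append(r))

rel_res = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print("converged:", info == 0, "iterations:", len(iters),
      "relative residual:", rel_res)
```

With the shift set to zero, P equals A and the preconditioner is exact, so
GMRES converges immediately; as the damping grows, P becomes easier for
multigrid to handle but a weaker preconditioner for A. That trade-off is
exactly what the shifted-Laplacian literature studies.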


>
> > On Jul 13, 2016, at 2:30 PM, Safin, Artur <aks084000 at utdallas.edu>
> wrote:
> >
> > Dear PETSc community,
> >
> > I am working on solving a Helmholtz problem with PML. The issue is that
> I am finding it very hard to deal with the resulting matrix system; I can
> get the correct solution for coarse meshes, but it takes roughly 2-4 times
> as long to converge for each successively refined mesh. I've noticed that
> without PML, I do not have problems with convergence speed.
> >
> > I am using the GMRES solver with GAMG as the preconditioner (with a
> block-Jacobi preconditioner for the multigrid solves). I have also tried
> assembling a separate preconditioning matrix with the complex shift
> 1+0.5i, but that does not seem to improve the results. Currently I am
> running with
> >
> >    -ksp_type fgmres \
> >    -pc_type gamg \
> >    -mg_levels_pc_type bjacobi \
> >    -pc_mg_type full \
> >    -ksp_gmres_restart 150 \
> >
> > Can anyone suggest some way of speeding up the convergence? Any help
> would be appreciated. I am attaching the output from -ksp_view.
> >
> > Best,
> >
> > Artur
> >
> > <kspview>
>
>
