[petsc-users] Multigrid with PML

Barry Smith bsmith at mcs.anl.gov
Wed Jul 13 20:10:52 CDT 2016


  Can you run with the additional option -ksp_view_mat binary and email the resulting file, which will be called binaryoutput, to petsc-maint at mcs.anl.gov?

   Barry
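
For reference, a minimal sketch of such a run, assuming a hypothetical application binary named ./helmholtz (the solver options are taken from the original message below; only -ksp_view_mat binary is the addition Barry requests):

```shell
# ./helmholtz is a placeholder for your own PETSc application binary.
# The first five options reproduce the solver configuration from the
# original message; -ksp_view_mat binary additionally dumps the assembled
# operator to a file named "binaryoutput" in the current directory.
mpiexec -n 4 ./helmholtz \
    -ksp_type fgmres \
    -pc_type gamg \
    -mg_levels_pc_type bjacobi \
    -pc_mg_type full \
    -ksp_gmres_restart 150 \
    -ksp_view_mat binary
```

The resulting binaryoutput file is what would then be emailed to petsc-maint.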

> On Jul 13, 2016, at 2:30 PM, Safin, Artur <aks084000 at utdallas.edu> wrote:
> 
> Dear PETSc community,
> 
> I am working on solving a Helmholtz problem with PML. The issue is that I am finding it very hard to deal with the resulting matrix system; I can get the correct solution for coarse meshes, but it takes roughly 2-4 times as long to converge for each successively refined mesh. I've noticed that without PML, I do not have problems with convergence speed.
> 
> I am using the GMRES solver with GAMG as the preconditioner (with a block-Jacobi preconditioner for the multigrid level solves). I have also tried assembling a separate preconditioning matrix with the complex shift 1+0.5i, but that does not seem to improve the results. Currently I am running with
> 
>    -ksp_type fgmres \
>    -pc_type gamg \
>    -mg_levels_pc_type bjacobi \
>    -pc_mg_type full \
>    -ksp_gmres_restart 150 \
> 
> Can anyone suggest some way of speeding up the convergence? Any help would be appreciated. I am attaching the output from kspview.
> 
> Best,
> 
> Artur
> 
> <kspview>
