[petsc-users] Preconditioner for Helmholtz-like problem

Barry Smith bsmith at petsc.dev
Sat Sep 19 00:41:51 CDT 2020


  These are small enough that sparse direct solvers are likely the best use of your time and, in general, the most efficient option.

  PETSc supports three parallel direct solvers: SuperLU_DIST, MUMPS, and PaStiX. I recommend configuring PETSc with all three and then comparing them on problems of interest to you.

   --download-superlu_dist --download-mumps --download-pastix --download-scalapack (used by MUMPS) --download-metis --download-parmetis --download-ptscotch 
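
  Once configured, the three packages can be compared at run time without recompiling. A minimal sketch, assuming your application calls KSPSetFromOptions() (the executable name ./your_app is a placeholder):

    ./your_app -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps        -log_view
    ./your_app -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type superlu_dist -log_view
    ./your_app -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type pastix       -log_view

  Here -ksp_type preonly -pc_type lu applies a full LU factorization as the solver, -pc_factor_mat_solver_type selects the external package, and -log_view reports timing and memory so the factorization and solve phases can be compared. The STRUMPACK solver suggested below can be tried the same way with -pc_factor_mat_solver_type strumpack.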

  Barry


> On Sep 18, 2020, at 11:28 PM, Alexey Kozlov <Alexey.V.Kozlov.2 at nd.edu> wrote:
> 
> Thanks for the tips! My matrix is complex and unsymmetric. My typical test case has on the order of one million equations. I use a 2nd-order finite-difference scheme with a 19-point stencil, so a typical case needs several GB of RAM.
> 
> On Fri, Sep 18, 2020 at 11:52 PM Jed Brown <jed at jedbrown.org> wrote:
> Unfortunately, those are hard problems in which the "good" methods are technical and hard to make black-box.  There are "sweeping" methods that solve on 2D "slabs" with PML boundary conditions, H-matrix based methods, and fancy multigrid methods.  Attempting to solve with STRUMPACK is probably the easiest thing to try (--download-strumpack).
> 
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATSOLVERSSTRUMPACK.html
> 
> Is the matrix complex symmetric?
> 
> Note that you can use a direct solver (MUMPS, STRUMPACK, etc.) for a 3D problem like this if you have enough memory.  I'm assuming the memory or time is unacceptable and you want an iterative method with much lower setup costs.
> 
> Alexey Kozlov <Alexey.V.Kozlov.2 at nd.edu> writes:
> 
> > Dear all,
> >
> > I am solving a convected wave equation in the frequency domain. The equation
> > is a 3D Helmholtz equation with added first-order and mixed derivatives, and
> > with complex coefficients. The discretized PDE results in a sparse linear
> > system (about 10^6 equations) that is solved in PETSc. I am having difficulty
> > getting the code to converge at high frequencies, on skewed grids, and at
> > high Mach numbers, and I suspect the preconditioner may be to blame. I am
> > currently using the ILU preconditioner with 2 or 3 levels of fill, together
> > with the BCGS or GMRES solvers. I suspect the state of the art has moved on
> > and there are better preconditioners for Helmholtz-like problems. Could you
> > please advise me on a better preconditioner?
> >
> > Thanks,
> > Alexey
> >
> > -- 
> > Alexey V. Kozlov
> >
> > Research Scientist
> > Department of Aerospace and Mechanical Engineering
> > University of Notre Dame
> >
> > 117 Hessert Center
> > Notre Dame, IN 46556-5684
> > Phone: (574) 631-4335
> > Fax: (574) 631-8355
> > Email: akozlov at nd.edu
> 
> 
> -- 
> Alexey V. Kozlov
> 
> Research Scientist
> Department of Aerospace and Mechanical Engineering
> University of Notre Dame
> 
> 117 Hessert Center
> Notre Dame, IN 46556-5684
> Phone: (574) 631-4335
> Fax: (574) 631-8355
> Email: akozlov at nd.edu
