[petsc-users] Preconditioner for Helmholtz-like problem
Pierre Jolivet
pierre at joliv.et
Sat Oct 17 01:52:43 CDT 2020
> On 17 Oct 2020, at 5:47 AM, Alexey Kozlov <Alexey.V.Kozlov.2 at nd.edu> wrote:
>
> Thank you for your advice! My linear system seems to be very stiff, so I have decided to concentrate on the direct solvers. I have very good results with MUMPS. Due to a lack of time I haven’t yet gotten a good result with SuperLU_DIST and haven’t compiled PETSc with PaStiX, but I have a feeling that MUMPS is the best. I ran a sequential test case with the built-in PETSc LU (-pc_type lu -ksp_type preonly) and with MUMPS (-pc_type lu -ksp_type preonly -pc_factor_mat_solver_type mumps) at default settings and found that MUMPS was about 50 times faster than the built-in LU and used about 3 times less RAM. Do you have any idea why that could be?
>
> My test case has about 100,000 complex equations with about 3,000,000 non-zeros. PETSc was compiled with the following options: ./configure --with-blaslapack-dir=/opt/crc/i/intel/19.0/mkl --enable-g --with-valgrind-dir=/opt/crc/v/valgrind/3.14/ompi --with-scalar-type=complex --with-clanguage=c --with-openmp --with-debugging=0 COPTFLAGS='-mkl=parallel -O2 -mavx -axCORE-AVX2 -no-prec-div -fp-model fast=2' FOPTFLAGS='-mkl=parallel -O2 -mavx -axCORE-AVX2 -no-prec-div -fp-model fast=2' CXXOPTFLAGS='-mkl=parallel -O2 -mavx -axCORE-AVX2 -no-prec-div -fp-model fast=2' --download-superlu_dist --download-mumps --download-scalapack --download-metis --download-cmake --download-parmetis --download-ptscotch.
>
> Running MUMPS in parallel using MPI also gave me a significant performance gain (about 10 times on a single cluster node).
>
> Could you please advise me whether I can adjust some options for the direct solvers to improve performance?
Your problem may be too small for this to pay off, but if you stick with MUMPS, it may be worth playing around with the block low-rank (BLR) options.
Here are some references: http://mumps.enseeiht.fr/doc/Thesis_TheoMary.pdf#page=191 and http://mumps.enseeiht.fr/doc/ud_2017/Shantsev_Talk.pdf
The relevant options in PETSc are -mat_mumps_icntl_35, -mat_mumps_icntl_36, and -mat_mumps_cntl_7.
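For instance, starting from the options you already use, a BLR run might look roughly like this (the ICNTL/CNTL values below are just a starting point; see the MUMPS users' guide for the exact meaning of each setting in your MUMPS version):

  -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps \
    -mat_mumps_icntl_35 1 -mat_mumps_cntl_7 1e-6

CNTL(7) is the low-rank dropping threshold, so it trades compression (memory and flops) against accuracy; if you loosen it, consider replacing -ksp_type preonly with a few GMRES iterations to recover the lost accuracy.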
Thanks,
Pierre
> Should I try MUMPS in OpenMP mode?
>
> On Sat, Sep 19, 2020 at 7:40 AM Mark Adams <mfadams at lbl.gov> wrote:
> As Jed said, high frequency is hard. AMG, as-is, can be adapted with the right parameters (https://link.springer.com/article/10.1007/s00466-006-0047-8).
> AMG for convection: use Richardson/SOR rather than Chebyshev smoothers, and in smoothed aggregation (GAMG) don't smooth the prolongator (-pc_gamg_agg_nsmooths 0).
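> In option form, that is roughly the following (a sketch; the level-smoother options apply to all multigrid levels and may need tuning for your problem):
>
>   -pc_type gamg -pc_gamg_agg_nsmooths 0 \
>     -mg_levels_ksp_type richardson -mg_levels_pc_type sor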
> Mark
>
> On Sat, Sep 19, 2020 at 2:11 AM Alexey Kozlov <Alexey.V.Kozlov.2 at nd.edu> wrote:
> Thanks a lot! I'll check them out.
>
> On Sat, Sep 19, 2020 at 1:41 AM Barry Smith <bsmith at petsc.dev> wrote:
>
> These are small enough that sparse direct solvers are likely the best use of your time, and generally the most efficient option.
>
> PETSc supports three parallel direct solvers: SuperLU_DIST, MUMPS, and PaStiX. I recommend configuring PETSc for all three and then comparing them on problems of interest to you.
>
> --download-superlu_dist --download-mumps --download-pastix --download-scalapack (used by MUMPS) --download-metis --download-parmetis --download-ptscotch
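> Once built, you can switch among them at runtime without code changes, e.g. (a sketch of the relevant options):
>
>   -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps
>   -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type superlu_dist
>   -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type pastix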
>
> Barry
>
>
>> On Sep 18, 2020, at 11:28 PM, Alexey Kozlov <Alexey.V.Kozlov.2 at nd.edu> wrote:
>>
>> Thanks for the tips! My matrix is complex and unsymmetric. My typical test case has on the order of one million equations. I use a 2nd-order finite-difference scheme with a 19-point stencil, so my typical test case uses several GB of RAM.
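>>
>> (Roughly: 10^6 rows x 19 nonzeros x 16 bytes per complex double is about 0.3 GB for the matrix values alone, before index arrays, Krylov workspace, and any factorization fill.)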
>>
>> On Fri, Sep 18, 2020 at 11:52 PM Jed Brown <jed at jedbrown.org> wrote:
>> Unfortunately, those are hard problems for which the "good" methods are technical and hard to make black-box. There are "sweeping" methods that solve on 2D "slabs" with PML boundary conditions, H-matrix-based methods, and fancy multigrid methods. STRUMPACK is probably the easiest thing to try (--download-strumpack).
>>
>> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATSOLVERSSTRUMPACK.html
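>>
>> Runtime selection would look something like this once PETSc is configured with --download-strumpack (the compression-specific options vary between PETSc/STRUMPACK versions, so check the manual page above):
>>
>>   -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type strumpack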
>>
>> Is the matrix complex symmetric?
>>
>> Note that you can use a direct solver (MUMPS, STRUMPACK, etc.) for a 3D problem like this if you have enough memory. I'm assuming the memory or time is unacceptable and you want an iterative method with much lower setup costs.
>>
>> Alexey Kozlov <Alexey.V.Kozlov.2 at nd.edu> writes:
>>
>> > Dear all,
>> >
>> > I am solving a convected wave equation in the frequency domain. This
>> > equation is a 3D Helmholtz equation with added first-order and mixed
>> > derivatives, and with complex coefficients. The discretized PDE results
>> > in a sparse linear system (about 10^6 equations) which is solved in
>> > PETSc. I am having difficulty getting the code to converge at high
>> > frequency, on skewed grids, and at high Mach number, and I suspect it
>> > may be due to the preconditioner I use. I am currently using the ILU
>> > preconditioner with 2 or 3 levels of fill, and the BCGS or GMRES
>> > solvers. I suspect the state of the art has evolved and there are
>> > better preconditioners for Helmholtz-like problems. Could you please
>> > advise me on a better preconditioner?
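>> >
>> > In PETSc options form, my current setup is roughly
>> >
>> >   -ksp_type gmres -pc_type ilu -pc_factor_levels 2
>> >
>> > (or -ksp_type bcgs, and -pc_factor_levels 3).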
>> >
>> > Thanks,
>> > Alexey
>> >
>> > --
>> > Alexey V. Kozlov
>> >
>> > Research Scientist
>> > Department of Aerospace and Mechanical Engineering
>> > University of Notre Dame
>> >
>> > 117 Hessert Center
>> > Notre Dame, IN 46556-5684
>> > Phone: (574) 631-4335
>> > Fax: (574) 631-8355
>> > Email: akozlov at nd.edu