[petsc-users] GMRES with matrix-free method and preconditioning matrix for higher performance.

Choi Kyungjun kyungjun.choi92 at gmail.com
Wed Aug 31 07:34:35 CDT 2016


2016-08-31 21:23 GMT+09:00 Matthew Knepley <knepley at gmail.com>:

> On Wed, Aug 31, 2016 at 7:22 AM, Choi Kyungjun <kyungjun.choi92 at gmail.com>
> wrote:
>
>> Thank you very much again Matt.
>>
>> Just another simple question.
>>
>> 2016-08-31 20:00 GMT+09:00 Matthew Knepley <knepley at gmail.com>:
>>
>>> On Wed, Aug 31, 2016 at 5:46 AM, Choi Kyungjun <
>>> kyungjun.choi92 at gmail.com> wrote:
>>>
>>>> Thanks Matt.
>>>>
>>>> I really appreciate your help every time.
>>>>
>>>>
>>>> I think I forgot mentioning code info again.
>>>>
>>>> 1)
>>>> I'm working on a 2-D/3-D compressible Euler equation solver, which is
>>>> a completely hyperbolic system.
>>>>
>>>
>>> Okay, then MG is out. I would start by using a sparse direct solver,
>>> like SuperLU. Then for parallelism
>>> you could use ASM, and as the subsolver use SuperLU, so something like
>>>
>>>   -ksp_type gmres -pc_type asm -sub_pc_type superlu
>>>
>>> You could get more sophisticated by
>>>
>>>   - Trying to have blocks bigger than 1 process and using SuperLU_dist
>>>
>>>   - Splitting up the fields using PCFIELDSPLIT. There are indications
>>>     that solving one of the fields first can really help convergence.
>>>     I am thinking of the work of David Keyes and LuLu Liu on MSM
>>>     methods. A possible option set is sketched below.
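>>>
>>> For instance (untested, and assuming the unknowns are interlaced with a
>>> block size of 4, as for 2-D Euler), such a fieldsplit run might look like
>>>
>>>   -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_block_size 4
>>>       -pc_fieldsplit_type multiplicative -fieldsplit_0_pc_type lu
>>>
>>> where -fieldsplit_0_pc_type lu factors the first field exactly.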
>>>
>>
>> For the above part,
>>
>> It's not compatible with the -snes_mf command line option, is it?
>>
>
> No. I think MF is not a useful idea unless you have a preconditioning
> matrix.
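>
> If you do assemble an approximate matrix, the usual pattern (a rough
> Fortran sketch; preA and FormPreJacobian are placeholder names) is to
> keep the matrix-free action for the operator and hand the assembled
> matrix to the preconditioner:
>
>   ! preA holds only an approximate (e.g. first-order) Jacobian
>   call SNESSetJacobian(Mixt%snes, preA, preA, FormPreJacobian, userctx, ier)
>
> Then, running with -snes_mf_operator, the action J*v is computed by
> differencing FormFunction, while preA is what ASM/SuperLU factors.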
>
>   Thanks,
>
>      Matt
>


But in order to use the KSP context,

I have to make my system matrix, don't I?

Then what's the difference between having a preconditioning matrix preA
and making the system matrix A?


Because the -snes_mf option required no system matrix and just computed
the residual, which felt very convenient.


If it is necessary to make my system matrix to use KSP GMRES, as you
recommended above, then I'll try.


Thank you very much.

Kyungjun.



>
>
>> I applied command line options like below:
>> -snes_mf -ksp_type gmres -pc_type asm -sub_pc_type superlu
>> -snes_view -snes_monitor -ksp_monitor -snes_converged_reason
>> -ksp_converged_reason
>>
>>
>> and my code flow is like this:
>>
>> - call SNESCreate(PETSC_COMM_WORLD, Mixt%snes, ier)
>> - call SNESSetFunction(Mixt%snes, Mixt%r, FormFunction, userctx, ier)
>> - call SNESSetFromOptions(Mixt%snes, ier)
>> - call SNESGetKSP(Mixt%snes, ksp, ier)
>> - call KSPGetPC(ksp, pc, ier)
>>
>> - call KSPSetFromOptions(ksp, ier)
>> - call PCSetFromOptions(pc, ier)
>>
>>>
>>>
>>>> 2)
>>>> And I'm trying to implement an implicit time scheme for convergence
>>>> of my steady-state problem.
>>>>
>>>> I used the LU-SGS implicit scheme before, but these days GMRES-based
>>>> implicit schemes are popular for their quadratic convergence
>>>> characteristics.
>>>>
>>>
>>> I am not sure I understand here. Nothing about GMRES is quadratic.
>>> However, Newton's method can be quadratic
>>> if your initial guess is good, and GMRES could be part of a solver for
>>> that.
>>>
>>>
>>>> An implicit time scheme essentially amounts to a matrix inversion, so
>>>> I turned to the PETSc library for GMRES, as it is one of the greatest
>>>> mathematical libraries.
>>>>
>>>>
>>>> 3)
>>>> As I'm using different numerical convective flux schemes (e.g., Roe's
>>>> FDS, AUSM, etc.), it would be really time-consuming to derive the
>>>> Jacobian matrix for each scheme.
>>>>
>>>
>>> Yes. However, the preconditioner matrix only needs to be approximate. I
>>> think you should derive one for the easiest flux scheme and
>>> always use that. The important thing is to couple the unknowns which
>>> influence each other, rather than the precise method of influence.
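>>>
>>> As a concrete shape for that (just an untested skeleton; the face loop
>>> and the 4x4 blocks dFdUL/dFdUR are assumptions for 2-D Euler), the
>>> assembly of the approximate matrix might look like
>>>
>>>   ! add one Jacobian block per face, coupling cells iL and iR
>>>   ! (0-based global block indices)
>>>   do iface = 1, nfaces
>>>     call MatSetValuesBlocked(preA, 1, iL, 1, iL, dFdUL, ADD_VALUES, ier)
>>>     call MatSetValuesBlocked(preA, 1, iL, 1, iR, dFdUR, ADD_VALUES, ier)
>>>   end do
>>>   call MatAssemblyBegin(preA, MAT_FINAL_ASSEMBLY, ier)
>>>   call MatAssemblyEnd(preA, MAT_FINAL_ASSEMBLY, ier)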
>>>
>>>   Thanks,
>>>
>>>       Matt
>>>
>>>
>>>> So I was fascinated by the matrix-free method (I didn't completely
>>>> understand it back then), and I implemented GMRES with no
>>>> preconditioning matrix with your help.
>>>>
>>>> After that, I wanted to ask you about any ways to accelerate my
>>>> GMRES solve.
>>>>
>>>> I will first try applying the Chebyshev preconditioner as you
>>>> mentioned (even if its performance wouldn't be that good).
>>>>
>>>> In order to construct a user-provided preconditioning matrix, could
>>>> you point me to any similar examples?
>>>>
>>>>
>>>> Thanks again.
>>>>
>>>> Your best,
>>>>
>>>> Kyungjun.
>>>>
>>>>
>>>> 2016-08-31 19:07 GMT+09:00 Matthew Knepley <knepley at gmail.com>:
>>>>
>>>>> On Wed, Aug 31, 2016 at 3:49 AM, Choi Kyungjun <
>>>>> kyungjun.choi92 at gmail.com> wrote:
>>>>>
>>>>>> Dear PETSc team,
>>>>>>
>>>>>> I am using the PETSc library in my CFD flow code.
>>>>>>
>>>>>> Thanks to Matt, I got what I wanted last week.
>>>>>>
>>>>>> It was GMRES with the matrix-free method and no preconditioning
>>>>>> matrix; the command line options are below.
>>>>>>
>>>>>> -snes_mf    -pc_type none    -..monitor   -..converged_reason
>>>>>>
>>>>>> The solve worked, but performed very poorly.
>>>>>>
>>>>>>
>>>>>> I learned that the efficiency of Krylov-subspace methods depends
>>>>>> strongly on a good preconditioner.
>>>>>>
>>>>>> And in the PETSc manual, the matrix-free method is allowed only with
>>>>>> no preconditioning, a user-provided preconditioner matrix, or a
>>>>>> user-provided preconditioner shell.
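>>>>>>
>>>>>> (For the shell route, a minimal sketch would be something like
>>>>>>
>>>>>>   call PCSetType(pc, PCSHELL, ier)
>>>>>>   call PCShellSetApply(pc, MyPCApply, ier)
>>>>>>
>>>>>> where MyPCApply(pc, x, y, ier) is a user routine, named here just for
>>>>>> illustration, that applies the preconditioner to x and returns y.)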
>>>>>>
>>>>>>
>>>>>> Here are my questions.
>>>>>>
>>>>>> 1) To improve the solver performance using GMRES, is there any way
>>>>>> to use snes_mf without a preconditioning matrix?
>>>>>>
>>>>>
>>>>> Not really. The CHEBY preconditioner will work without an explicit
>>>>> matrix; however, it's not great by itself.
>>>>>
>>>>>
>>>>>> 2) For a user-provided preconditioner matrix, I saw some example
>>>>>> codes that provide an approximate Jacobian matrix as the
>>>>>> preconditioner matrix. But this means that I should derive an
>>>>>> approximate Jacobian matrix for my system, am I right?
>>>>>>
>>>>>
>>>>> Yes.
>>>>>
>>>>>
>>>>>> 3) I'd like to know the fastest way to solve with the GMRES
>>>>>> method. Could you tell me, or point me to any other examples?
>>>>>>
>>>>>
>>>>> 1) The solve depends greatly on the physics/matrix you are using.
>>>>> Without knowing that, we can't say anything. For example, is
>>>>> the system elliptic? If so, then using Multigrid (MG) is generally a
>>>>> good idea.
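>>>>>
>>>>> For reference (assuming an assembled matrix; this will not work
>>>>> matrix-free), an algebraic multigrid run would be something like
>>>>>
>>>>>   -ksp_type gmres -pc_type gamg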
>>>>>
>>>>> 2) In general, I think it's a mistake to think of GMRES or any KSP
>>>>> as a solver. We should think of them as accelerators for solvers, as
>>>>> they were originally intended. For example, MG is a good solver for
>>>>> elliptic CFD equations, as long as you somehow deal with
>>>>> incompressibility. Then you can use GMRES to clean up some things
>>>>> you miss when implementing your MG solver.
>>>>>
>>>>> 3) The best thing to do in this case is to look at the literature,
>>>>> which is voluminous, and find the solver you want to implement.
>>>>> PETSc really speeds up the actual implementation and testing.
>>>>>
>>>>>   Thanks,
>>>>>
>>>>>      Matt
>>>>>
>>>>>
>>>>>> Thank you very much for your help.
>>>>>>
>>>>>> Sincerely,
>>>>>>
>>>>>> Kyungjun.
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>

