[petsc-dev] Right-preconditioned GMRES
Pierre Jolivet
pierre.jolivet at enseeiht.fr
Thu Nov 7 04:24:55 CST 2019
> On 7 Nov 2019, at 5:32 AM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
>
>
> Some idiot logged what they did, but not why they did it.
>
> commit bf108f309acab50613e150419c680842cf4b8a05 (HEAD)
> Author: Barry Smith <bsmith at mcs.anl.gov>
> Date: Thu Mar 18 20:40:53 2004 -0600
>
> bk-changeset-1.2063.1.1
> barrysmith at barry-smiths-computer.local|ChangeSet|20040319024053|12244
> ChangeSet
> 1.2063.1.1 04/03/18 20:40:53 barrysmith at barry-smiths-computer.local +5 -0
> if matrix is symmetric try to use preconditioner options for symmetric matrices
>
>
> Here is my guess as to this guy's reasoning 15 years ago: if the user knows their problem is SPD and thus switches to CG, they will get garbage using the default ASM type; they will be confused and unhappy with the result, not realizing that it is due to an inappropriate preconditioner default when using CG. The penalty is, of course, that someone using GMRES will get slower convergence than they should, as you point out.
>
> Today I think we could do better. We could introduce the concept of a "symmetric" preconditioner, PCIsSymmetric()/PCIsSymmetricKnown(), and then CG and all other KSPs that require symmetric preconditioners could query this information and error out immediately if the PC indicates it is NOT symmetric.
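> A rough sketch of what that could look like (nothing below exists; the names are just the ones proposed here, and the KSPSetUp_CG() hook is hypothetical):
>
>    PetscErrorCode PCIsSymmetricKnown(PC pc, PetscBool *set, PetscBool *flg);
>
>    /* hypothetical check at the top of KSPSetUp_CG() */
>    PetscBool      set, sym;
>    PetscErrorCode ierr;
>
>    ierr = PCIsSymmetricKnown(ksp->pc, &set, &sym);CHKERRQ(ierr);
>    if (set && !sym) SETERRQ(PetscObjectComm((PetscObject)ksp), PETSC_ERR_ARG_WRONG, "CG requires a symmetric preconditioner");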
Don’t forget about PCIsHermitian() and PCIsHermitianKnown().
But what you suggest goes along with what Matt answered earlier.
All of this is OK with me.
Though, in the meantime, I’ll force the type to RESTRICT in my PC (https://gitlab.com/petsc/petsc/commit/7e2068805d975b7ad588c59b62e2c0b3e60cd4af#2c7d367ac831f3b0c5fb767c0eb16c1ea7ae7fe0_720_722), because I don’t think it’s OK to go from convergence in 9 iterations:
[…]
9 KSP Residual norm 3.632028841798e-05
[…]
PC Object: (pc_hpddm_levels_1_) 4 MPI processes
type: asm
total subdomain blocks = 4, user-defined overlap
restriction/interpolation type - RESTRICT
[…]
To convergence in 26 iterations:
[…]
26 KSP Residual norm 1.548760754380e-04
[…]
PC Object: (pc_hpddm_levels_1_) 4 MPI processes
type: asm
total subdomain blocks = 4, user-defined overlap
restriction/interpolation type - BASIC
[…]
For a Helmholtz equation with only 4 subdomains. But that’s just my opinion, of course.
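For anyone who wants to force this from user code rather than from a plugin, here is a minimal sketch with a plain PCASM (my commit does the equivalent inside PCHPDDM’s setup):

   ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
   ierr = PCSetType(pc, PCASM);CHKERRQ(ierr);
   ierr = PCASMSetType(pc, PC_ASM_RESTRICT);CHKERRQ(ierr); /* RAS; also marks the type as user-set, so the symmetric-Pmat default no longer applies */

Or simply -pc_asm_type restrict on the command line, which sets the same user-set flag.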
Thanks,
Pierre
> Or we could have PCSetSymmetric(), which turns on whatever PC-type-specific options are needed to make the preconditioner symmetric, and the symmetry-requiring KSPs could turn on this option. One has to be careful, though, because if the user specifically set RAS then it should not be overruled and changed without their knowledge. Note that asm.c has the guard if (!osm->type_set) { so if the user set RAS it will not be changed.
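> A hypothetical ASM implementation of that setter would need the same guard; a sketch (PCSetSymmetric_ASM() is a made-up name, PC_ASM is the private struct in asm.c):
>
>    static PetscErrorCode PCSetSymmetric_ASM(PC pc)
>    {
>      PC_ASM *osm = (PC_ASM*)pc->data;
>
>      PetscFunctionBegin;
>      if (!osm->type_set) osm->type = PC_ASM_BASIC; /* only flip RAS -> ASM if the user never chose a type */
>      PetscFunctionReturn(0);
>    }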
>
> I am not sure that there is a perfect solution that satisfies all use cases, but I agree the current behavior is questionable and could be replaced with better behavior that still prevents the tragedy of ASM failing with CG due to the defaults.
>
> Barry
>
>
>
>
>> On Nov 6, 2019, at 12:12 PM, Pierre Jolivet <pierre.jolivet at enseeiht.fr> wrote:
>>
>> I need to figure this out myself first.
>> In the meantime, here is another (PCASM related) question for you: why is PCASMType switched to BASIC when using a symmetric Pmat? (I guess I don’t have to tell you about the performance of RAS vs. ASM)
>> To me, that would make sense if the default KSP was also switched to CG instead of GMRES if the Pmat and Amat were symmetric, but I don’t think this is the case.
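>> For context, the switch happens in PCSetUp_ASM(); from memory the logic is roughly the following (a sketch, not the exact source):
>>
>>    PetscBool symset, flg;
>>
>>    if (!osm->type_set) { /* user never called PCASMSetType() or passed -pc_asm_type */
>>      ierr = MatIsSymmetricKnown(pc->pmat, &symset, &flg);CHKERRQ(ierr);
>>      if (symset && flg) osm->type = PC_ASM_BASIC; /* RAS -> ASM when the Pmat is known symmetric */
>>    }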
>>
>> Thanks,
>> Pierre
>>
>>> On 24 Oct 2019, at 5:40 PM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
>>>
>>>
>>> Send the code and exact instructions to run a "good" and a "bad" ASM
>>>
>>> Barry
>>>
>>>
>>>> On Oct 14, 2019, at 10:44 AM, Pierre Jolivet <pierre.jolivet at enseeiht.fr> wrote:
>>>>
>>>> Here are the three logs.
>>>> FGMRES also gives a wrong first iterate.
>>>> I think Mark was right in the sense that the problem is _most likely_ in my RHS.
>>>> But I need to figure out why I only get this problem with right-preconditioned KSPs with restrict or none.
>>>>
>>>> Thanks,
>>>> Pierre
>>>>
>>>>
>>>>
>>>>> On 13 Oct 2019, at 8:16 PM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
>>>>>
>>>>>
>>>>> Is this one process with one subdomain? (And hence no meaningful overlap, since there is nothing to overlap?) And you expect to get the "exact" answer in one iteration?
>>>>>
>>>>> Please run the right-preconditioned GMRES with -pc_asm_type [restrict, basic, and none] -ksp_monitor_true_residual and send the output for the three cases.
>>>>>
>>>>> For kicks you can also try FGMRES (which always uses right preconditioning) to see if the same problem appears.
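>>>>> E.g., something along the lines of -ksp_type fgmres -pc_type asm -sub_pc_type lu -ksp_max_it 1 -ksp_monitor_true_residual; no -ksp_pc_side is needed since FGMRES is always right-preconditioned.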
>>>>>
>>>>> Barry
>>>>>
>>>>>
>>>>>> On Oct 13, 2019, at 2:41 AM, Pierre Jolivet via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
>>>>>>
>>>>>> Hello,
>>>>>> I’m struggling to understand the following weirdness with PCASM with exact subdomain solvers.
>>>>>> I’m dealing with a very simple Poisson problem with Dirichlet + Neumann BCs.
>>>>>> If I use PCBJACOBI + KSPPREONLY or 1 iteration of GMRES either preconditioned on the right or on the left, I get the expected result, cf. attached screenshot.
>>>>>> If I use PCASM + KSPPREONLY or 1 iteration of GMRES preconditioned on the left, I get the expected result as well.
>>>>>> However, with PCASM + 1 iteration of GMRES preconditioned on the right, I don’t get what I should (I believe).
>>>>>> Furthermore, this problem is specific to -pc_asm_type restrict,none (I get the expected result with basic,interpolate).
>>>>>>
>>>>>> Any hint?
>>>>>>
>>>>>> Thanks,
>>>>>> Pierre
>>>>>>
>>>>>> $ -sub_pc_type lu -ksp_max_it 1 -ksp_type gmres -pc_type bjacobi -ksp_pc_side right -> bjacobi_OK
>>>>>> $ -sub_pc_type lu -ksp_max_it 1 -ksp_type gmres -pc_type asm -ksp_pc_side left -> asm_OK
>>>>>> $ -sub_pc_type lu -ksp_max_it 1 -ksp_type gmres -pc_type asm -ksp_pc_side right -> asm_KO
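>>>>>> The programmatic equivalent of these runs, as a sketch with error checking omitted and -sub_pc_type lu left to the options database:
>>>>>>
>>>>>>    Mat A; /* assembled elsewhere */
>>>>>>    Vec b, x;
>>>>>>    KSP ksp;
>>>>>>    PC  pc;
>>>>>>
>>>>>>    KSPCreate(PETSC_COMM_WORLD, &ksp);
>>>>>>    KSPSetOperators(ksp, A, A);
>>>>>>    KSPSetType(ksp, KSPGMRES);
>>>>>>    KSPSetPCSide(ksp, PC_RIGHT);   /* -ksp_pc_side right */
>>>>>>    KSPGetPC(ksp, &pc);
>>>>>>    PCSetType(pc, PCASM);          /* -pc_type asm */
>>>>>>    KSPSetTolerances(ksp, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT, 1); /* -ksp_max_it 1 */
>>>>>>    KSPSetFromOptions(ksp);        /* picks up -pc_asm_type, -sub_pc_type lu, ... */
>>>>>>    KSPSolve(ksp, b, x);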
>>>>>>
>>>>>> <bjacobi_OK.png><asm_OK.png><asm_KO.png>
>>>>>
>>>>
>>>> <dump-basic><dump-none><dump-restrict>
>>>
>>
>