[petsc-users] KSP step not accelerating despite good preconditioning in SNES

Timothée Nicolas timothee.nicolas at gmail.com
Sun Oct 4 11:05:29 CDT 2015


Thank you. It seems to be working like a charm with PCSHELL: I now see the
drastic reductions in KSP iterations! (Incidentally, my convergence
problem was probably mostly due to a mistake in the Schur matrix, but
putting things in the right place also helps!)
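[Editor's sketch: the wiring described above, i.e. attaching a shell preconditioner to the SNES's inner KSP rather than applying it inside FormFunction, might look roughly like this. The names `MyPCApply`, `AttachShellPC`, and `AppCtx` are hypothetical placeholders, not part of the original code; only the PETSc calls themselves are real API.]

```c
/* Sketch only: attaching an approximate-Schur preconditioner to the
 * KSP used by SNES, via PCSHELL.  MyPCApply, AttachShellPC and AppCtx
 * are illustrative names. */
#include <petscsnes.h>

typedef struct {
  Mat S;   /* approximate Schur complement (plus any sub-KSPs needed) */
} AppCtx;

/* y = M^{-1} x : one application of the approximate preconditioner */
static PetscErrorCode MyPCApply(PC pc, Vec x, Vec y)
{
  AppCtx        *ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PCShellGetContext(pc, (void **)&ctx);CHKERRQ(ierr);
  /* ... the two easy KSP solves of the Schur-complement algorithm
     would go here, writing the result into y ... */
  PetscFunctionReturn(0);
}

/* Call after SNESCreate()/SNESSetFunction()/SNESSetJacobian(): */
static PetscErrorCode AttachShellPC(SNES snes, AppCtx *ctx)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = SNESGetKSP(snes, &ksp);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCSHELL);CHKERRQ(ierr);
  ierr = PCShellSetContext(pc, ctx);CHKERRQ(ierr);
  ierr = PCShellSetApply(pc, MyPCApply);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```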

Now I have to switch the KSP solves inside the preconditioner to a
multigrid method; it should in principle be more efficient than the default
KSP options for my specific problem. However, there are almost no examples
using multigrid in the documentation. I found only
ksp/ksp/examples/tutorials/ex42.c and ksp/ksp/examples/tests/ex19.c.
In the first one multigrid is not the default, and trying to use the option
-stokes_pc_type pcmg produced errors like
[0]PETSC ERROR: Unable to find requested PC type pcmg

Regarding the second one, it says at the top that it is the "bad way", so I
am not sure I should follow this example... In what sense is it a bad way?
Is it because there are only two levels, whereas in principle we should have
a hierarchy down to the coarsest grid?

In any case, if anyone has a nice multigrid example on a simple case like
the Laplacian, done the "good way", I would be interested.
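[Editor's note: the error reported above arises because the PC type string is "mg" (PCMG is the C constant), so the option in ex42 would be -stokes_pc_type mg. A typical geometric multigrid invocation on a DM-based Laplacian example such as ksp/ksp/examples/tutorials/ex45.c might look like the following sketch; the number of levels and smoother choices are illustrative, not prescriptive.]

```shell
# Geometric multigrid on the 3D Laplacian tutorial (illustrative options):
./ex45 -da_refine 3 \
       -pc_type mg -pc_mg_levels 4 \
       -mg_levels_ksp_type chebyshev -mg_levels_pc_type sor \
       -mg_coarse_ksp_type preonly -mg_coarse_pc_type lu \
       -ksp_monitor -ksp_view
```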

Best

Timothee


2015-09-28 14:21 GMT+02:00 Timothée Nicolas <timothee.nicolas at gmail.com>:

> Thanks a lot, I expected something like that. I will look in this
> direction.
>
> Best
>
> Timothée
>
> 2015-09-28 21:17 GMT+09:00 Matthew Knepley <knepley at gmail.com>:
>
>> On Mon, Sep 28, 2015 at 2:53 AM, Timothée Nicolas <
>> timothee.nicolas at gmail.com> wrote:
>>
>>> Hi all,
>>>
>>> I have something strange in my application and I don't know what could
>>> cause this. I am trying to do an implicit MHD problem and I thought I
>>> finally figured out the preconditioner step, but somehow I don't get the
>>> expected result, not even close.
>>>
>>> For the preconditioning, I am using an approximate Schur complement,
>>> which requires two relatively easy KSP inversions at each preconditioner
>>> application. I apply this algorithm directly to the result function at the
>>> end of the routine FormFunction. I have checked that the approximation to
>>> the inversion of the Jacobian is good, in the sense that when I multiply
>>> the preconditioned vector by the *total* Jacobian matrix, I indeed
>>> recover almost the initial unpreconditioned vector. Also, I know that my
>>> Jacobian matrix is correct, because (i) I have checked manually that F(X +
>>> dX) ~ F(X) + J * dX and (ii) when I don't use -snes_mf and use the provided
>>> Jacobian matrix the result is pretty much equivalent to using -snes_mf.
>>>
>>> In my understanding, this means that what I effectively feed to SNES at
>>> the end of my FormFunction routine is a good approximation to J^(-1) F. As
>>> a result, I naturally expect that the number of KSP iterations necessary to
>>> achieve one SNES iteration be drastically reduced. However, I observe
>>> virtually no change whatsoever in the number of iterations.
>>>
>>> Any thoughts about what I could be missing? Maybe I forgot to set an
>>> SNES or KSP option somewhere? I can send pieces of code if need be.
>>>
>>
>> It sounds like you are putting this in the wrong place. If you have the
>> action of a good preconditioner for the Jacobian, then you
>> should use a PCSHELL and pass it to the KSP. It does not belong in the
>> FormFunction.
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Best
>>>
>>> Timothee
>>>
>>>
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>