[petsc-users] GMRES stability
Barry Smith
bsmith at mcs.anl.gov
Fri Feb 27 05:30:53 CST 2015
Ok, please provide the rest of the information I asked for.
Barry
> On Feb 27, 2015, at 2:33 AM, Orxan Shibliyev <orxan.shibli at gmail.com> wrote:
>
> No. It does not converge at all; in other words, it diverges.
>
> On Thu, Feb 26, 2015 at 9:36 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>> By stability I assume you mean that GMRES does not converge (or converges too slowly)?
>>
>> The way to improve GMRES convergence is with a preconditioner better suited to your problem. By default PETSc uses GMRES with a block Jacobi preconditioner with one block per process and ILU(0) on each block. For some problems this is fine, but for many problems it will give bad convergence.
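>>
>> For example, you could try a stronger sub-preconditioner or additive Schwarz from the command line (a sketch; "./app" is a placeholder for your executable, and the best choice is problem dependent):

```shell
# The default is equivalent to passing:
#   -ksp_type gmres -pc_type bjacobi -sub_pc_type ilu
# Try SOR inside each block instead of ILU(0):
mpiexec -n 4 ./app -ksp_type gmres -pc_type bjacobi -sub_pc_type sor
# Or additive Schwarz with ILU on each subdomain:
mpiexec -n 4 ./app -ksp_type gmres -pc_type asm -sub_pc_type ilu
```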
>>
>> What do you get for -ksp_view (are you using the default?) Are you running in parallel yet?
>>
>> As a test on one process you can use Gauss-Seidel (GS) in PETSc as the preconditioner and make sure you get similar convergence to your code. For example, -ksp_type richardson -pc_type sor on one process will give you a GS solver.
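>>
>> Concretely, such a run might look like this (a sketch; "./app" stands in for your executable):

```shell
# One process: a Richardson iteration preconditioned by SOR acts as a
# Gauss-Seidel sweep; -ksp_monitor prints the residual norm at each
# iteration so you can compare against your own GS code.
./app -ksp_type richardson -pc_type sor -ksp_monitor -ksp_view
```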
>>
>> Once we know a bit more about your problem we can suggest better preconditioners.
>>
>> Barry
>>
>>
>>> On Feb 26, 2015, at 10:25 PM, Orxan Shibliyev <orxan.shibli at gmail.com> wrote:
>>>
>>> Hi
>>>
>>> I tried to solve Ax=b with my own Gauss-Seidel code and PETSc's GMRES.
>>> With my GS, for a steady-state problem I can set CFL=40, and for the
>>> unsteady case I can set dt=0.1. However, for GMRES I cannot set CFL
>>> higher than 5, nor, for the unsteady case, dt larger than 0.00001. I
>>> need GMRES for parallel computations, so I cannot use GS for this
>>> purpose. Is there a way to improve the stability of GMRES?
>>