[petsc-users] CG fails to converge in parallel
Pierre Jolivet
pierre.jolivet at lip6.fr
Thu Apr 20 01:07:40 CDT 2023
> On 20 Apr 2023, at 7:53 AM, Bojan Niceno <bojan.niceno.scientist at gmail.com> wrote:
>
> Dear all,
>
>
> I am solving a Laplace equation with the finite volume method on an unstructured grid, using a Fortran code I have developed and the PETSc 3.19 library.
>
> I first used the cg solver with the asm preconditioner, which converges nicely when executed sequentially but fails in the MPI parallel version. I believed there must be an error in the way I set up and assemble the parallel matrices for PETSc, but I soon noticed that if I use bicg with asm, everything works fine: the parallel bicg/asm shows almost the same convergence as the sequential version.
KSPCG requires a symmetric PC.
By default, PCASMType is PC_ASM_RESTRICT, which yields a non-symmetric preconditioner.
With a single process, this does not matter, but with more than one process, it does.
If you switch to -pc_asm_type basic, KSPCG should converge.
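If you prefer setting this in the Fortran code rather than on the command line, a minimal sketch could look like the following (the subroutine name is just a placeholder; it assumes the KSP has already been created and the parallel matrix attached to it):

      subroutine set_cg_asm_basic(ksp, ierr)
#include <petsc/finclude/petscksp.h>
      use petscksp
      implicit none
      KSP            ksp
      PC             pc
      PetscErrorCode ierr

      ! CG needs a symmetric preconditioner, so ask ASM for its
      ! symmetric (basic) variant instead of the default restricted one
      call KSPSetType(ksp, KSPCG, ierr)
      call KSPGetPC(ksp, pc, ierr)
      call PCSetType(pc, PCASM, ierr)
      call PCASMSetType(pc, PC_ASM_BASIC, ierr)
      ! Keep this last so command-line options can still override the above
      call KSPSetFromOptions(ksp, ierr)
      end subroutine set_cg_asm_basic

This is equivalent to passing -ksp_type cg -pc_type asm -pc_asm_type basic at run time.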
That being said, for the Laplace equation, there are much faster alternatives than PCASM, e.g., PCGAMG.
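As a rough illustration (the tuning options you may eventually want depend on your discretization), trying it can be as simple as running with

  -ksp_type cg -pc_type gamg -ksp_monitor_true_residual

or, in the code, replacing the two PCASM calls in the sketch above with

      call PCSetType(pc, PCGAMG, ierr)

GAMG's defaults are aimed at SPD problems like this one, so KSPCG can stay as the solver in parallel as well.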
Thanks,
Pierre
> I could carry on with bicg, but I am still a little worried. To my knowledge of Krylov solvers, which is admittedly basic since I am a physicist who merely uses linear algebra, the convergence of bicg should be very similar to that of cg when symmetric systems are solved. When I run my cases sequentially, I see that this is indeed the case. But in parallel, bicg converges and cg fails.
>
> Do you see the above as an anomaly, and if so, could you advise how to search for the cause?
>
>
> Kind regards,
>
>
> Bojan
>