[petsc-users] CG fails to converge in parallel

Bojan Niceno bojan.niceno.scientist at gmail.com
Thu Apr 20 00:53:33 CDT 2023


Dear all,


I am solving a Laplace equation with the finite volume method on an
unstructured grid, using a Fortran code I have developed and the PETSc
3.19 library.

I first used the cg solver with the asm preconditioner, which converges
nicely when executed sequentially, but fails in the MPI-parallel version.
I believed there must be an error in the way I set up and assemble the
parallel matrices for PETSc, but I soon noticed that if I use bicg with
asm, everything works fine: the parallel bicg/asm run shows almost the
same convergence as the sequential version.
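
For concreteness, the setup corresponds roughly to the minimal sketch
below. It is written against PETSc's C API (the Fortran bindings mirror
these calls one-to-one), and a 1-D Laplacian stands in for the actual
finite-volume matrix just to keep the snippet self-contained; all names
are illustrative, not taken from my code.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      x, b;
  KSP      ksp;
  PC       pc;
  PetscInt i, n = 100, Istart, Iend;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Assemble a 1-D Laplacian row by row; each rank owns a contiguous
     block of rows, as in a typical parallel FV assembly. */
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));
  PetscCall(MatGetOwnershipRange(A, &Istart, &Iend));
  for (i = Istart; i < Iend; i++) {
    if (i > 0)     PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
    if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
    PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));

  /* cg + asm; -ksp_type bicg on the command line switches the solver. */
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPCG));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCASM));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}

Running this with, say, "mpirun -np 4 ./solve -ksp_type bicg
-ksp_converged_reason" (the executable name is hypothetical) makes it
easy to compare the two solvers on the same partitioned matrix.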

I could carry on with bicg, but I am still a little worried. To my
knowledge of Krylov solvers, which is admittedly basic since I am a
physicist who merely uses linear algebra, the convergence of bicg should
be very similar to that of cg when symmetric systems are solved. When I
run my cases sequentially, I see that this is indeed the case. But in
parallel, bicg converges and cg fails.
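
One way to test the suspicion about the parallel assembly would be to
check whether the assembled matrix really is symmetric, for instance by
measuring the Frobenius norm of A^T - A, which should be zero up to
rounding for a correctly assembled Laplace operator. A minimal sketch,
again in the C API, where A is a placeholder for the assembled operator:

#include <petscmat.h>

PetscErrorCode CheckSymmetry(Mat A)
{
  Mat       At;
  PetscReal nrm;

  PetscFunctionBeginUser;
  PetscCall(MatTranspose(A, MAT_INITIAL_MATRIX, &At));
  PetscCall(MatAXPY(At, -1.0, A, DIFFERENT_NONZERO_PATTERN)); /* At = A^T - A */
  PetscCall(MatNorm(At, NORM_FROBENIUS, &nrm));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "|| A^T - A ||_F = %g\n", (double)nrm));
  PetscCall(MatDestroy(&At));
  PetscFunctionReturn(PETSC_SUCCESS);
}

MatNorm with NORM_FROBENIUS works for parallel AIJ matrices, so this can
be run on the same partitioned matrix that the solver sees.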

Do you see the above behavior as an anomaly, and if so, could you advise
me on how to search for the cause?


    Kind regards,


    Bojan