[petsc-users] incredibly good performance of scipy lgmres

Stefano Zampini stefano.zampini at gmail.com
Wed Dec 9 10:25:29 CST 2020


Could it be that scipy lgmres is reporting the wrong number of iterations?

I would try to replicate the scipy code first:
https://github.com/scipy/scipy/blob/master/scipy/sparse/linalg/isolve/lgmres.py
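
One quick way to check is to count the actual matrix-vector products rather
than the reported iterations. The sketch below is not from the thread; it
assumes the dense matrix and right-hand side have been saved as numpy arrays
(the file names are placeholders) and a 2020-era scipy. Note that scipy's
lgmres calls its callback once per outer cycle, and each cycle runs up to
inner_m = 30 inner iterations by default, so a reported count of 5 need not
mean fewer matvecs than gmres.

import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres, lgmres

# Placeholder file names -- substitute however the dense system from the
# attachment is actually stored.
A = np.load("system_matrix.npy")
b = np.load("rhs.npy")
n = A.shape[0]

matvec_count = [0]

def counting_matvec(v):
    # every Krylov iteration costs at least one of these
    matvec_count[0] += 1
    return A @ np.asarray(v).ravel()

Aop = LinearOperator((n, n), matvec=counting_matvec, dtype=A.dtype)

for name, solver in (("gmres", gmres), ("lgmres", lgmres)):
    matvec_count[0] = 0
    callback_calls = [0]

    def cb(arg, calls=callback_calls):
        # gmres passes a residual norm, lgmres the current iterate;
        # here we only count the calls
        calls[0] += 1

    x, info = solver(Aop, b, tol=1e-8, callback=cb)
    print(name, "info:", info, "callback calls:", callback_calls[0],
          "matvecs:", matvec_count[0])

If both solvers end up doing a comparable number of matvecs, the "5
iterations" would simply be outer cycles, not a genuinely better method.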

On Wed, Dec 9, 2020, 19:17 Florian Bruckner <e0425375 at gmail.com> wrote:

> Dear PETSc developers,
> I am currently re-implementing our FEM-BEM code using Firedrake.
> The original code is based on FEniCS and uses scipy sparse solvers to
> solve the coupled FEM/BEM system.
>
> For some reason the scipy lgmres method seems to outperform all other
> methods that we have tried. For example, for the stray-field calculation
> of a 10x10x10 unit cube, scipy-lgmres needs 5 iterations (without a
> preconditioner), whereas scipy-gmres needs 167. The new implementation
> uses petsc-gmres and petsc-lgmres, but both need around 170 iterations.
>
> If I understand lgmres correctly, it only improves convergence when gmres
> is restarted. Since it needs only 5 iterations, I think this cannot be
> the reason. Nevertheless, since the method seems to perform very well, it
> would be worth looking at the differences in detail. I provide the dense
> data of the system matrix and right-hand-side vector that I used, as well
> as scripts for the different methods considered.
>
> Any ideas how scipy-lgmres could be that good? It would be nice if
> someone could validate my results (lgmres solves within 5 iterations).
> My next step will be to wrap scipy-lgmres using petsc4py. I know how to
> do it with petsc4py directly, but I am not exactly sure how it works
> with the Firedrake interface.
>
> best wishes
> Florian
>
>
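
Regarding the last point (wrapping scipy-lgmres so it can be driven from
PETSc and, ultimately, Firedrake): one possible route is a petsc4py PC of
type "python" whose apply() performs a scipy-lgmres solve, used under a KSP
of type preonly. The sketch below is serial-only, the class name and the
small dense demo operator (A_petsc, b_petsc, x_petsc) are made up, and it is
not the thread's code, just one way the hook-up could look.

import numpy as np
from petsc4py import PETSc
from scipy.sparse.linalg import LinearOperator, lgmres

class ScipyLGMRES:
    """PC context: one application = one scipy-lgmres solve (serial only)."""

    def setUp(self, pc):
        self.A, _ = pc.getOperators()

    def apply(self, pc, x, y):
        A = self.A
        n = A.getSize()[0]
        xx, yy = A.createVecRight(), A.createVecLeft()

        def matvec(v):
            # wrap the PETSc MatMult so scipy can use it
            xx.setArray(np.asarray(v).ravel())
            A.mult(xx, yy)
            return yy.getArray().copy()

        Aop = LinearOperator((n, n), matvec=matvec)
        sol, info = lgmres(Aop, x.getArray(), tol=1e-8)
        y.setArray(sol)

# Stand-in for the Firedrake-assembled operator: a small dense demo system.
n = 10
A_petsc = PETSc.Mat().createDense([n, n],
                                  array=np.eye(n) + 0.1 * np.random.rand(n, n))
x_petsc = A_petsc.createVecRight()
b_petsc = A_petsc.createVecLeft()
b_petsc.setArray(np.random.rand(n))

ksp = PETSc.KSP().create()
ksp.setOperators(A_petsc)
ksp.setType(PETSc.KSP.Type.PREONLY)   # the "preconditioner" does the whole solve
pc = ksp.getPC()
pc.setType(PETSc.PC.Type.PYTHON)
pc.setPythonContext(ScipyLGMRES())
ksp.solve(b_petsc, x_petsc)

From Firedrake, the usual way to reach such a class is through
solver_parameters, e.g. "pc_type": "python" together with "pc_python_type"
pointing at an importable class; the exact Firedrake-side details are
something the Firedrake developers can confirm.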

