[petsc-dev] Should pARMS converge in parallel?
Jed Brown
jed at 59A2.org
Wed Mar 16 15:37:28 CDT 2011
Jose, thanks for adding pARMS.
I wonder if there is something wrong or if I'm just not using it correctly.
I'm testing with ksp ex2 and it works fine in serial:
$ mpiexec -n 1 ./ex2 -m 50 -n 50 -ksp_converged_reason -pc_type parms
Linear solve converged due to CONVERGED_RTOL iterations 10
Norm of error 9.34118e-05 iterations 10
but it diverges in parallel:
$ mpiexec -n 2 ./ex2 -m 50 -n 50 -ksp_converged_reason -pc_type parms
Linear solve did not converge due to DIVERGED_DTOL iterations 630
Norm of error 3.12589e+06 iterations 630
I can make it "converge" by effectively turning off restarts, but the true algebraic
residual is still large, indicating that the preconditioner is nonlinear:
$ mpiexec -n 2 ./ex2 -m 50 -n 50 -ksp_converged_reason -pc_type parms -ksp_gmres_restart 1000
Linear solve converged due to CONVERGED_RTOL iterations 81
Norm of error 30.4982 iterations 81
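To make the inconsistency concrete, here is a sketch of a check one could add after
KSPSolve() (hypothetical snippet, not part of ex2 itself; it assumes ex2's A, b, x,
ksp objects and the usual ierr/CHKERRQ error handling, and is not tied to a
particular PETSc version):

  Vec       r;
  PetscReal rtrue,rksp;
  ierr = VecDuplicate(b,&r);CHKERRQ(ierr);
  ierr = MatMult(A,x,r);CHKERRQ(ierr);                 /* r = A x */
  ierr = VecAYPX(r,-1.0,b);CHKERRQ(ierr);              /* r = b - A x */
  ierr = VecNorm(r,NORM_2,&rtrue);CHKERRQ(ierr);       /* true residual norm */
  ierr = KSPGetResidualNorm(ksp,&rksp);CHKERRQ(ierr);  /* norm from the GMRES recurrence */
  ierr = PetscPrintf(PETSC_COMM_WORLD,"true ||b-Ax|| %g, KSP residual %g\n",(double)rtrue,(double)rksp);CHKERRQ(ierr);
  ierr = VecDestroy(&r);CHKERRQ(ierr);

The norm KSP reports is the preconditioned residual for left-preconditioned GMRES, so
this is only a rough comparison, but a true residual that stays large while the solve
claims CONVERGED_RTOL is exactly the signature of a preconditioner that changes from
one application to the next.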
Indeed, using FGMRES, which tolerates a variable preconditioner, makes it converge to
the true solution:
$ mpiexec -n 2 ./ex2 -m 50 -n 50 -ksp_converged_reason -pc_type parms -ksp_gmres_restart 1000 -ksp_type fgmres
Linear solve converged due to CONVERGED_RTOL iterations 71
Norm of error 0.000150845 iterations 71
but avoiding restarts remains critical; with a restart length of 30 the iteration
count blows up:
$ mpiexec -n 2 ./ex2 -m 50 -n 50 -ksp_converged_reason -pc_type parms -ksp_gmres_restart 30 -ksp_type fgmres
Linear solve converged due to CONVERGED_RTOL iterations 771
Norm of error 0.00202382 iterations 771
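For reference, the programmatic equivalent of the working options above would look
roughly like the following (a sketch only; it assumes the KSPSetOperators calling
sequence with a MatStructure flag and that pARMS is registered under the type name
"parms", which is what -pc_type parms maps to):

  KSP ksp;
  PC  pc;
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPFGMRES);CHKERRQ(ierr);      /* flexible GMRES tolerates a variable preconditioner */
  ierr = KSPGMRESSetRestart(ksp,1000);CHKERRQ(ierr);   /* effectively no restarts for this problem size */
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,"parms");CHKERRQ(ierr);          /* same as -pc_type parms */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);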
What do you think about making the preconditioner linear by default? Is
there a way to make it more tolerant of restarts, at least for such a simple
problem (2D Laplace, 5-point stencil, uniform grid)?