[petsc-users] Is linear solver performance worse in parallel?
Matthew Knepley
knepley at gmail.com
Thu Jun 29 15:58:57 CDT 2017
On Thu, Jun 29, 2017 at 3:38 PM, Lucas Clemente Vella <lvella at gmail.com>
wrote:
> Hi, I have a problem that is easily solvable with 8 processes (by easily
> I mean with few iterations). Using PCFIELDSPLIT, I get 2 outer iterations
> and 6 inner iterations, reaching a residual norm of 1e-8. The system has
> 786432 unknowns in total, and the solver settings are given by:
>
> PetscOptionsInsertString(NULL,
> "-ksp_type fgmres "                          /* flexible GMRES outer solver */
> "-pc_type fieldsplit "                       /* block preconditioner */
> "-pc_fieldsplit_detect_saddle_point "        /* split fields by the zero diagonal block */
> "-pc_fieldsplit_type schur "                 /* Schur complement formulation */
> "-pc_fieldsplit_schur_fact_type full "       /* full block factorization */
> "-pc_fieldsplit_schur_precondition self "    /* precondition S with S itself */
> "-fieldsplit_0_ksp_type bcgs "               /* BiCGStab on the 0 block */
> "-fieldsplit_0_pc_type hypre "               /* hypre (BoomerAMG) on the 0 block */
> "-fieldsplit_1_ksp_type gmres "              /* GMRES on the Schur block */
> "-fieldsplit_1_pc_type lsc "                 /* least-squares commutator */
> "-fieldsplit_1_lsc_pc_type hypre "           /* hypre inside the LSC solve */
> "-fieldsplit_1_lsc_pc_hypre_boomeramg_cycle_type w");  /* W-cycle AMG */
>
> Problem is, it is slow (compared to less complex systems, solvable simply
> with bcgs+hypre), so to try to speed things up, I ran with 64
> processes, which gives only 12288 unknowns per process. In this setting,
> the inner iteration reaches the maximum of 15 iterations I set, and the
> outer iteration couldn't get the residual norm below 1e2 after 20 iterations.
>
> Is this supposed to happen? Is increasing the number of parallel processes
> supposed to worsen the solver performance? I just want to rule out the
> PETSc and Hypre side if possible, so if I ever experience such behavior
> again, I can be sure my code is wrong...
>
1) For figuring out convergence issues, I would start with a smaller
problem, so you can run lots of them.
2) For any questions about convergence, we need to see the output of
-ksp_view -ksp_monitor_true_residual
-fieldsplit_1_ksp_monitor_true_residual -ksp_converged_reason
(one way to turn these on is sketched after this list).
3) Please start with -fieldsplit_0_pc_type lu so we can just look at the
Schur complement system (see the second sketch below).
4) It sounds like the strength of your Schur complement preconditioner is
not uniform in the size of the problem. Why do you think LSC would be a
good idea? Also, 'self' preconditioning for many equations, like Stokes,
is not uniform in problem size.
5) What are your equations?
6) I would start with -fieldsplit_1_pc_type lu, which will test your PC
matrix, and after that works, change things one at a time.
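For 2), a minimal sketch of one way to turn those diagnostics on, assuming
the options go into the default options database the same way as the
snippet above (the call is the same PetscOptionsInsertString used there;
everything else is illustrative, not the original code):

    /* Sketch: diagnostic options from 2), appended before the solve. */
    PetscOptionsInsertString(NULL,
        "-ksp_view "                                /* print the assembled solver configuration */
        "-ksp_monitor_true_residual "               /* true residual norm of the outer FGMRES */
        "-fieldsplit_1_ksp_monitor_true_residual "  /* true residual norm of the Schur solve */
        "-ksp_converged_reason");                   /* report why each solve stopped */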
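For 3), a sketch of the reduced configuration: the original options with an
exact solve on the A00 block, so any remaining convergence trouble is
isolated in the Schur complement system. Parallel LU needs an external
direct solver; MUMPS and the -..._mat_solver_package option name are
assumptions here, matching PETSc builds of that era:

    /* Sketch: step 3) applied to the original options.  Step 6)'s
       -fieldsplit_1_pc_type lu would be a separate, later experiment,
       changing one thing at a time. */
    PetscOptionsInsertString(NULL,
        "-ksp_type fgmres "
        "-pc_type fieldsplit "
        "-pc_fieldsplit_detect_saddle_point "
        "-pc_fieldsplit_type schur "
        "-pc_fieldsplit_schur_fact_type full "
        "-pc_fieldsplit_schur_precondition self "
        "-fieldsplit_0_pc_type lu "                         /* exact solve on A00 */
        "-fieldsplit_0_pc_factor_mat_solver_package mumps " /* assumed parallel LU backend */
        "-fieldsplit_1_ksp_type gmres "
        "-fieldsplit_1_pc_type lsc "
        "-fieldsplit_1_lsc_pc_type hypre");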
Thanks,
Matt
> --
> Lucas Clemente Vella
> lvella at gmail.com
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
http://www.caam.rice.edu/~mk51/