[petsc-users] PETSc usage issues

Barry Smith bsmith at petsc.dev
Wed Sep 28 13:43:39 CDT 2022


   -pc_type fieldsplit -pc_fieldsplit_detect_saddle_point -pc_fieldsplit_type schur

   Now there are two additional decisions you need to make: how to precondition the A00 block and how to precondition the Schur complement.

   For the A00 block the options are

   -fieldsplit_0_pc_type something    where, depending on your problem, gamg may be a good place to start
   -fieldsplit_0_ksp_type preonly    (likely the default)

   For the Schur complement, start with the default and see how the convergence goes.
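
   If the default is not good enough, the usual knobs are standard PCFIELDSPLIT options (which of them helps is problem dependent): -pc_fieldsplit_schur_precondition selects what is used to build the preconditioner for the Schur complement, and the inner solve for it is controlled with the -fieldsplit_1_ prefix, for example

   -pc_fieldsplit_schur_precondition <self,selfp,a11,user,full>
   -fieldsplit_1_ksp_type gmres
   -fieldsplit_1_pc_type none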

   Use -ksp_view to see all the parts of the preconditioner and -ksp_monitor_true_residual to see how it is converging.
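
   Putting it all together, a run might look like the line below (mpiexec -n 4 and ./app are placeholders for your own launcher and executable):

   mpiexec -n 4 ./app -pc_type fieldsplit -pc_fieldsplit_detect_saddle_point -pc_fieldsplit_type schur -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type gamg -ksp_view -ksp_monitor_true_residual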

   Run with -help | grep fieldsplit to see all possible options, and of course consult https://petsc.org/release/docs/manualpages/PC/PCFIELDSPLIT.html
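
   On the code side nothing special is needed for any of this, since KSPSetFromOptions() picks it all up from the options database; a minimal sketch (the creation and assembly of A, b, and x is omitted and assumed done elsewhere) is just

   #include <petscksp.h>

   int main(int argc, char **argv)
   {
     Mat A;
     Vec b, x;
     KSP ksp;

     PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
     /* ... create and assemble A and b, and create x, here ... */
     PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
     PetscCall(KSPSetOperators(ksp, A, A)); /* A is also used to build the preconditioner */
     PetscCall(KSPSetFromOptions(ksp));     /* picks up -pc_type fieldsplit etc. from the command line */
     PetscCall(KSPSolve(ksp, b, x));
     PetscCall(KSPDestroy(&ksp));
     PetscCall(PetscFinalize());
     return 0;
   }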

  Barry


> On Sep 27, 2022, at 11:47 PM, wangzj997 <wangzj997 at foxmail.com> wrote:
> 
> Dear PETSc development team:
> 
> Currently, I am learning and trying to use PETSc's KSP to solve large-scale sparse linear systems Ax = b, where A is symmetric positive definite and nonsingular. However, the main diagonal of A contains many zero entries, so many preconditioners cannot be used when solving in parallel with MPI, the iteration counts are large, and convergence is slow. How can I address this problem? If it is necessary to make every entry on the main diagonal of A nonzero, does PETSc provide a way to do that?
> 
> I would be grateful if you would reply and answer my question.
> 
> Best Regards.
