[petsc-users] (no subject)

Barry Smith bsmith at mcs.anl.gov
Thu Jul 28 23:11:04 CDT 2011


On Jul 28, 2011, at 10:57 PM, Debao Shao wrote:

> Dear all:
> 
> I'm using a PETSc iterative solver (PCILU & KSPGMRES) in OOQP, but when the matrix to be solved is ill-conditioned, say with condition number ~1e+4, KSPGMRES exceeds its maximum iteration count (default 10000), while the CHOLMOD direct solver handles the same data and returns a correct answer.
> 
> Does this mean an iterative solver (like GMRES) depends more on the matrix being well-conditioned than the CHOLMOD direct solver does?

   Yes, generally iterative solvers take longer on more ill-conditioned matrices. Direct solvers are usually not much affected by ill conditioning until it becomes very large.

> How can I improve things so that KSPGMRES does not fail?

   There are many, many choices in PETSc, which makes it hard to know how to get started on building an efficient solver.

   1) When the matrix is very ill-conditioned or small (say, less than 1000 by 1000), the best bet is a direct solver: -pc_type lu (in parallel you need to install an external solver package such as MUMPS or SuperLU_DIST; see http://www.mcs.anl.gov/petsc/petsc-as/documentation/linearsolvertable.html).
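As a sketch, assuming an application binary (here called ./app, a hypothetical name) that calls KSPSetFromOptions(), a direct solve is selected entirely at runtime; the MUMPS line applies only if PETSc was configured with MUMPS:

```shell
# Sequential direct solve: skip the Krylov iteration, use a full LU factorization
./app -ksp_type preonly -pc_type lu

# Parallel direct solve via MUMPS (requires PETSc configured with --download-mumps)
mpiexec -n 4 ./app -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
```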

    2) In your case a condition number of ~10,000 is low enough that an iterative solver can likely be faster than a direct solver, so it is worth investigating iterative solvers. The first question to ask is whether there is any special structure in the matrix you can take advantage of. Is it symmetric positive definite? Then use -ksp_type cg. Does the matrix arise from an elliptic PDE discretization on a mesh? Then you can likely get multigrid, either geometric or algebraic, to work well (see http://www.mcs.anl.gov/petsc/petsc-as/documentation/linearsolvertable.html). Does the matrix come from a fluids problem? Then there are various tricks we can help you with.
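For the structured cases above, the corresponding runtime options might look like the following (again assuming a hypothetical PETSc-linked binary ./app; the hypre line requires PETSc built with hypre):

```shell
# Symmetric positive definite matrix: conjugate gradients with an
# incomplete Cholesky preconditioner, which preserves symmetry
./app -ksp_type cg -pc_type icc

# Elliptic PDE discretization: algebraic multigrid via hypre's BoomerAMG
./app -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg
```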

   If you know of no particular structure then the fallback is ILU. Since ILU(0) doesn't seem to work, you can try ILU(k) for small integer k to see if that helps: add the option -pc_factor_levels 1 (or 2 or 3) and see how that affects convergence. Always run with -ksp_monitor_true_residual -ksp_converged_reason when testing, to see how the solver is actually working. Since the default GMRES restart is 30, the restart may be hurting your convergence; try a larger restart with -ksp_gmres_restart 100, or use -ksp_type bcgs, which does not require a restart.
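Putting those suggestions together, a few concrete invocations to compare (./app is a hypothetical stand-in for your PETSc-linked program):

```shell
# GMRES with ILU(2) fill, plus convergence diagnostics on every run
./app -ksp_type gmres -pc_type ilu -pc_factor_levels 2 \
      -ksp_monitor_true_residual -ksp_converged_reason

# Same preconditioner, but a larger GMRES restart
./app -ksp_type gmres -ksp_gmres_restart 100 -pc_type ilu -pc_factor_levels 2

# BiCGStab avoids the restart issue entirely
./app -ksp_type bcgs -pc_type ilu -pc_factor_levels 2
```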

   This is enough to get you started.

   Are you interested in the parallel case? That introduces many more difficulties. 

   Barry

> 
> Your answer is appreciated.
> 
> Thanks,
> Debao
> 


