[petsc-users] Question concerning ilu and bcgs

hong at aspiritech.org
Wed Feb 18 09:49:52 CST 2015


Have you tried other solvers, e.g., PETSc's default gmres/ilu, bcgs/ilu, etc.?
The matrix is small; if it is ill-conditioned, then -pc_type lu would work
best.

Hong
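
A minimal sketch of the combinations named above, spelled out as option sets
(the monitoring flags are carried over from the run quoted below, not from
Hong's message, and are only illustrative):

  -ksp_type gmres -pc_type ilu -ksp_monitor_short -ksp_converged_reason
  -ksp_type bcgs -pc_type ilu -ksp_monitor_short -ksp_converged_reason
  -ksp_type preonly -pc_type lu -ksp_converged_reason

The first two use PETSc's built-in ILU(0) factorization instead of hypre's
pilut; the last replaces the iterative solve with a direct sparse LU
factorization, which is still affordable at 62500 unknowns with roughly 4.7e5
nonzeros and is presumably why Hong recommends it if the matrix is
ill-conditioned.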

On Wed, Feb 18, 2015 at 9:34 AM, Sun, Hui <hus003 at ucsd.edu> wrote:

>  With options:
>
>  -pc_type hypre -pc_hypre_type pilut -pc_hypre_pilut_maxiter 1000
> -pc_hypre_pilut_tol 1e-3 -ksp_type bcgs -ksp_rtol 1e-10 -ksp_max_it 10
> -ksp_monitor_short -ksp_converged_reason -ksp_view
>
>  Here is the full output:
>
>   0 KSP Residual norm 1404.62
>   1 KSP Residual norm 88.9068
>   2 KSP Residual norm 64.73
>   3 KSP Residual norm 71.0224
>   4 KSP Residual norm 69.5044
>   5 KSP Residual norm 455.458
>   6 KSP Residual norm 174.876
>   7 KSP Residual norm 183.031
>   8 KSP Residual norm 650.675
>   9 KSP Residual norm 79.2441
>  10 KSP Residual norm 84.1985
>
> Linear solve did not converge due to DIVERGED_ITS iterations 10
>
> KSP Object: 1 MPI processes
>   type: bcgs
>   maximum iterations=10, initial guess is zero
>   tolerances:  relative=1e-10, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
>   type: hypre
>     HYPRE Pilut preconditioning
>     HYPRE Pilut: maximum number of iterations 1000
>     HYPRE Pilut: drop tolerance 0.001
>     HYPRE Pilut: default factor row size
>   linear system matrix = precond matrix:
>   Mat Object:   1 MPI processes
>     type: seqaij
>     rows=62500, cols=62500
>     total: nonzeros=473355, allocated nonzeros=7.8125e+06
>     total number of mallocs used during MatSetValues calls =0
>       not using I-node routines
>
> Time cost: 0.756198,  0.662984,  0.105672
>
>
>
>
>  ------------------------------
> From: Matthew Knepley [knepley at gmail.com]
> Sent: Wednesday, February 18, 2015 3:30 AM
> To: Sun, Hui
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Question concerning ilu and bcgs
>
>    On Wed, Feb 18, 2015 at 12:33 AM, Sun, Hui <hus003 at ucsd.edu> wrote:
>
>>  I have a matrix system Ax = b, where A is of type MatSeqAIJ or MatMPIAIJ
>> depending on the number of cores.
>>
>>  I try to solve this problem with pc_type ilu and ksp_type bcgs, but it
>> does not converge. The options I specify are:
>>
>> -pc_type hypre -pc_hypre_type pilut -pc_hypre_pilut_maxiter 1000
>> -pc_hypre_pilut_tol 1e-3 -ksp_type bcgs -ksp_rtol 1e-10 -ksp_max_it 1000
>> -ksp_monitor_short -ksp_converged_reason
>>
>
>  1) Run with -ksp_view, so we can see exactly what was used
>
>  2) ILUT is unfortunately not a well-defined algorithm, and I believe the
> parallel version makes different decisions than the serial version.
>
>    Thanks,
>
>      Matt
>
>
>>  The first few lines of the output are:
>>
>>   0 KSP Residual norm 1404.62
>>   1 KSP Residual norm 88.9068
>>   2 KSP Residual norm 64.73
>>   3 KSP Residual norm 71.0224
>>   4 KSP Residual norm 69.5044
>>   5 KSP Residual norm 455.458
>>   6 KSP Residual norm 174.876
>>   7 KSP Residual norm 183.031
>>   8 KSP Residual norm 650.675
>>   9 KSP Residual norm 79.2441
>>  10 KSP Residual norm 84.1985
>>
>>  This clearly indicates non-convergence. However, I exported the sparse
>> matrix A and vector b to MATLAB and ran the following commands:
>>
>> [L,U] = ilu(A,struct('type','ilutp','droptol',1e-3));
>>
>> [ux1,fl1,rr1,it1,rv1] = bicgstab(A,b,1e-10,1000,L,U);
>>
>>
>>  And it converges in MATLAB, with flag fl1=0, relative residual
>> rr1=8.2725e-11, and iteration count it1=89.5. I'm wondering how I can
>> figure out what's wrong.
>>
>>
>>  Best,
>>
>> Hui
>>
>
>
>
>  --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
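
A hedged way to narrow down the discrepancy described in the thread above,
staying on the PETSc side (the fill level and flags below are illustrative
guesses, not taken from the thread): run on one process with PETSc's built-in
ILU and the same tolerances used in MATLAB, and print the true
(unpreconditioned) residual, since the -ksp_view output shows the convergence
test using the PRECONDITIONED norm with left preconditioning.

  -ksp_type bcgs -pc_type ilu -pc_factor_levels 1 -ksp_rtol 1e-10
  -ksp_max_it 1000 -ksp_monitor_true_residual -ksp_converged_reason -ksp_view

Note that PETSc's built-in ILU is a level-of-fill ILU(k), not a drop-tolerance
ILUT like MATLAB's 'ilutp' option or hypre's pilut, so the factors will not
match the MATLAB experiment exactly; the point is only to separate the effect
of the preconditioner from that of the Krylov method.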

