[petsc-users] Problems with preconditioners, which one?
Barry Smith
bsmith at mcs.anl.gov
Wed Aug 4 10:29:20 CDT 2010
If you look at the eigenvalues of the matrix closely you will see it has two parts: eigenvalues with real part between .0033 and .01, and eigenvalues between 2 and 12. I suspect these two well-conditioned clusters come from the two parts of the matrix, the rows with zeros on the diagonal and the rest without (this is why I asked whether the problem was a Stokes problem). Perhaps a Schur complement type preconditioner, done with the PCFieldSplit preconditioner, is the way to go; you would eliminate in the Schur complement the rows/columns of the matrix that do not have zeros on the diagonal.
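A rough sketch of the options, reusing Hong's ex10 run (I am assuming -pc_fieldsplit_detect_saddle_point is available in your PETSc version; it puts the zero-diagonal rows into the second split, so the Schur complement is formed on exactly those rows/columns):

  ./ex10 -f0 $D/A_rhs_spiga -ksp_monitor_true_residual \
      -pc_type fieldsplit -pc_fieldsplit_type schur \
      -pc_fieldsplit_detect_saddle_point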
Barry
On Aug 4, 2010, at 10:10 AM, Hong Zhang wrote:
> Filippo,
> Sorry, your matrix is ill-conditioned; gmres+ilu does not converge in the
> true residual (not the preconditioned residual!):
> ./ex10 -f0 $D/A_rhs_spiga -ksp_monitor_true_residual
> 0 KSP preconditioned resid norm 1.011174511507e+01 true resid norm 2.084628109126e-02 ||r(i)||/||b|| 1.000000000000e+00
> 1 KSP preconditioned resid norm 1.498441679989e+00 true resid norm 4.919980218651e-02 ||r(i)||/||b|| 2.360123706052e+00
> 2 KSP preconditioned resid norm 1.657440211742e-01 true resid norm 1.624190902280e-02 ||r(i)||/||b|| 7.791274113447e-01
> 3 KSP preconditioned resid norm 7.568788163764e-02 true resid norm 1.516900048065e-02 ||r(i)||/||b|| 7.276597880572e-01
> 4 KSP preconditioned resid norm 3.158616884464e-02 true resid norm 1.703303336172e-02 ||r(i)||/||b|| 8.170777937395e-01
> 5 KSP preconditioned resid norm 6.169977289081e-08 true resid norm 1.629219484085e-02 ||r(i)||/||b|| 7.815396314348e-01
>
> Using superlu, I get the condition number:
> ./ex10 -f0 $D/A_rhs_spiga -pc_type lu -pc_factor_mat_solver_package superlu -mat_superlu_conditionnumber
>
> Recip. condition number = 1.171784e-04
>
> For such a tiny matrix (27x27), a condition number of 1.e+4 is large.
> Using superlu_dist or mumps LU as the preconditioner might be the only
> option for your application.
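> For example (a sketch, assuming mumps is built with your PETSc install):
> ./ex10 -f0 $D/A_rhs_spiga -pc_type lu -pc_factor_mat_solver_package mumps -ksp_monitor_true_residual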
>
> Hong
>
>
>
> On Wed, Aug 4, 2010 at 9:53 AM, Hong Zhang <hzhang at mcs.anl.gov> wrote:
>> Filippo,
>>
>> Yes, the matrix is well conditioned.
>> Using ~petsc-dev/src/mat/examples/tests/ex78.c, I converted your MATLAB
>> matrix and rhs into petsc binary format. Then I ran it with
>> petsc-dev/src/ksp/ksp/examples/tutorials/ex10.c:
>>
>> ./ex10 -f0 $D/A_rhs_spiga -pc_type lu
>> which gives the same error in the numerical factorization as yours.
>>
>> Then switching to the superlu and mumps LU direct solvers, I get
>> ./ex10 -f0 $D/A_rhs_spiga -pc_type lu -pc_factor_mat_solver_package superlu
>> Number of iterations = 1
>> Residual norm < 1.e-12
>>
>> Using the petsc default solver (gmres+ilu):
>> ./ex10 -f0 $D/A_rhs_spiga -ksp_monitor
>> 0 KSP Residual norm 1.011174511507e+01
>> 1 KSP Residual norm 1.498441679989e+00
>> 2 KSP Residual norm 1.657440211742e-01
>> 3 KSP Residual norm 7.568788163764e-02
>> 4 KSP Residual norm 3.158616884464e-02
>> 5 KSP Residual norm 6.169977289081e-08
>> Number of iterations = 5
>> Residual norm 0.0162922
>>
>> As you see, it converges well.
>> For your info, I attached the binary file A_rhs_spiga here.
>>
>> Hong
>>
>> On Tue, Aug 3, 2010 at 11:09 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>
>>> Matt and Hong,
>>> The eigenvalues are all nonzero, so I don't see how the matrix could be
>>> singular; all the real parts are positive, so it is even a nice matrix. But a
>>> bunch of diagonal entries are identically zero, and this screws up most LU
>>> factorization schemes that do not do pivoting. (MATLAB does pivoting.)
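>>>
>>> As a small illustration of that failure mode: the nonsingular 2x2 matrix
>>> [0 1; 1 0] has a zero in the (1,1) position, so LU factorization without
>>> pivoting breaks down at the very first pivot even though the matrix is
>>> perfectly invertible.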
>>>
>>> 0.0109 + 0.0453i
>>> 0.0109 - 0.0453i
>>> 0.0071 + 0.0131i
>>> 0.0071 - 0.0131i
>>> 0.0072 + 0.0121i
>>> 0.0072 - 0.0121i
>>> 0.0049 + 0.0062i
>>> 0.0049 - 0.0062i
>>> 0.0059 + 0.0012i
>>> 0.0059 - 0.0012i
>>> 0.0034 + 0.0031i
>>> 0.0034 - 0.0031i
>>> 0.0068 + 0.0226i
>>> 0.0068 - 0.0226i
>>> 0.0061 + 0.0061i
>>> 0.0061 - 0.0061i
>>> 0.0033 + 0.0028i
>>> 0.0033 - 0.0028i
>>> 6.0000
>>> 8.0000
>>> 2.0000
>>> 8.0000
>>> 12.0000
>>> 4.0000
>>> 2.0000
>>> 4.0000
>>> 2.0000
>>>
>>> On Aug 3, 2010, at 11:03 PM, Matthew Knepley wrote:
>>>
>>> On Tue, Aug 3, 2010 at 10:54 PM, Filippo Spiga
>>> <filippo.spiga at disco.unimib.it> wrote:
>>>>
>>>> Dear Hong,
>>>>>
>>>>> This confirms that your Jacobian is singular, thus none of the linear
>>>>> solvers will work.
>>>>
>>>> So no preconditioner will help me solve the problem?
>>>
>>> There can exist no unique solution when the matrix is singular, thus you
>>> have a problem with either:
>>> a) the problem definition
>>> b) the creation of your matrix in PETSc
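>>>
>>> One quick way to check b) on a matrix this small (a sketch; assuming the
>>> SVD preconditioner PCSVD is available in your PETSc build) is to run the
>>> solve with
>>>   -pc_type svd -pc_svd_monitor
>>> which reports the singular values of the operator, so you can see whether
>>> any of them are numerically zero.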
>>>
>>>>
>>>> I put some stuff here: http://tinyurl.com/fil-petsc
>>>> - "A_LS.m" is matrix (saved by PETSc)
>>>> - "b_LS-m"
>>>> - the file "eigenvalues_A" contains the eigenvalues of the matrix A,
>>>> computed by MATLAB.
>>>>
>>>> I used "-pc_type lu" and only 1 processor. The result is the same as in my
>>>> previous email (*).
>>>
>>> This shows that your matrix is singular.
>>>
>>>>
>>>> Anyway if I solve the problem using MATLAB I get the right solution. The
>>>> formulation seems correct. To be
>>>
>>> What does this mean? What method in MATLAB? Some methods (like CG) can
>>> iterate on rank-deficient matrices with a compatible rhs and get a solution,
>>> but other Krylov methods will fail. Most preconditioners will fail.
>>>
>>>>
>>>> honest, the eigenvalues don't tell me anything. But I'm a computer
>>>> scientist, not a mathematician. I'm not able to recognize which
>>>> preconditioner I should use or which modifications (scaling all or part of
>>>> the rows? reformulating the system in another way? ...) to make in order to
>>>> solve the problem. From my side, it is not feasible to try all the
>>>> preconditioners, and that is not the right way anyway...
>>>
>>> Actually, I strongly disagree. Preconditioners are very problem-specific,
>>> and it is often impossible to prove which one will work for a certain
>>> problem. There are many well-known results along these lines, such as the
>>> paper of Greenbaum, Strakos, and Ptak on GMRES. Experimentation is essential.
>>>
>>> Matt
>>>
>>>>
>>>> Once again, thanks.
>>>>
>>>> (*)
>>>> [0|23:14:58]: unknown: MatLUFactorNumeric_SeqAIJ() line 668 in src/mat/impls/aij/seq/aijfact.c: Zero pivot row 1 value 0 tolerance 2.77778e-14 * rowsum 0.0277778
>>>>
>>>> --
>>>>
>>>> Filippo SPIGA
>>>>
>>>> «Nobody will drive us out of Cantor's paradise.»
>>>> -- David Hilbert
>>>>
>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their experiments
>>> is infinitely more interesting than any results to which their experiments
>>> lead.
>>> -- Norbert Wiener
>>>
>>>
>>