[petsc-users] Questions on dense and sparse matrix block

Barry Smith bsmith at mcs.anl.gov
Wed Jan 22 20:52:23 CST 2014


   That is unfortunate.  You may need a good preconditioning strategy. Can you tell us more about your problem? Do you have some fluid variables that couple between cells, and then chemistry within each cell?
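
   If the zero pivot comes from a small pivot during factorization rather than a truly singular Jacobian, a shifted factorization can sometimes get past it. A minimal sketch, assuming a KSP object named ksp (PCFactorSetShiftType and PCFactorSetShiftAmount are the standard PETSc routines; whether a shift is appropriate depends on your problem):

      PC pc;
      KSPGetPC(ksp,&pc);
      PCFactorSetShiftType(pc,MAT_SHIFT_NONZERO);   /* perturb small/zero pivots */
      PCFactorSetShiftAmount(pc,PETSC_DECIDE);      /* let PETSc choose the amount */

   Equivalently, on the command line: -pc_factor_shift_type nonzero -pc_factor_shift_amount <amount>.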

  Barry


On Jan 22, 2014, at 8:14 PM, Danyang Su <danyang.su at gmail.com> wrote:

> With the option '-pc_type lu', I got an error message after some steps.
> 
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Detected zero pivot in LU factorization:
> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
> [0]PETSC ERROR: Zero pivot row 5 value 7.63923e-045 tolerance 2.22045e-014!
> 
> Danyang
> 
> On 22/01/2014 5:28 PM, Barry Smith wrote:
>>   With a 1d problem, the sparse format with -pc_type lu is likely the way to go.
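>> 
>>   For example (a sketch; "myapp" is a placeholder for your executable):
>> 
>>      ./myapp -ksp_type preonly -pc_type lu
>> 
>>   -ksp_type preonly applies the LU factorization as a direct solve, with no Krylov iterations.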
>> 
>>   Barry
>> 
>> On Jan 22, 2014, at 7:15 PM, Danyang Su <danyang.su at gmail.com> wrote:
>> 
>>> On 22/01/2014 4:32 PM, Barry Smith wrote:
>>>>   What ODE integrator are you using? Your own? One in PETSc?
>>> I use my own integrator; PETSc is used as the KSP solver. I may need to test more cases, especially 2D cases, to see the behavior of the dense and sparse formats.
>>>>    You could just use -pc_type lu and the sparse format, right? Or is that too slow?
>>>> 
>>>>    Barry
>>>> 
>>>>   Jed and Emil,  do we have any integrators that keep the time-step small due to slow convergence of Newton in TS?
>>>> 
>>>> 
>>>> On Jan 22, 2014, at 5:05 PM, Danyang Su <danyang.su at gmail.com> wrote:
>>>> 
>>>>> On 22/01/2014 1:42 PM, Barry Smith wrote:
>>>>>>    Linear solvers based on ILU will behave very differently when extra zeros are kept in the matrix; they generally converge much faster because the ILU behaves more like a complete LU (especially in 1d).   So first run on 1 process with the dense and sparse formats and -pc_type lu; do they behave differently? (They pretty much shouldn't.)
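>>>>>> 
>>>>>>    One way to check that any difference really comes from the extra stored zeros is to drop them at assembly time, so the dense-block matrix ends up with the same nonzero structure as the sparse one. A minimal sketch, assuming the matrix is named A (MAT_IGNORE_ZERO_ENTRIES is a standard MatOption):
>>>>>> 
>>>>>>       MatSetOption(A,MAT_IGNORE_ZERO_ENTRIES,PETSC_TRUE); /* zeros passed to MatSetValues are not stored */
>>>>>> 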
>>>>> It does not behave differently with the option '-pc_type lu'.
>>>>>>   Next run the sparse format version with -ksp_monitor_true_residual and see how the solver is converging compared to the dense format version.
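>>>>>> 
>>>>>>   For example ("myapp" again a placeholder):
>>>>>> 
>>>>>>      ./myapp -ksp_monitor_true_residual -ksp_converged_reason
>>>>>> 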
>>>>> The convergence is quite different.
>>>>> ---Sparse format
>>>>>  0 KSP preconditioned resid norm 5.242977405897e-004 true resid norm 3.927123490361e-006 ||r(i)||/||b|| 1.000000000000e+000
>>>>>  1 KSP preconditioned resid norm 6.118580743238e-006 true resid norm 2.917502282301e-008 ||r(i)||/||b|| 7.429107562985e-003
>>>>>  2 KSP preconditioned resid norm 9.271130393116e-007 true resid norm 9.649272009380e-009 ||r(i)||/||b|| 2.457083927476e-003
>>>>>  3 KSP preconditioned resid norm 5.812641606714e-009 true resid norm 4.073757311146e-011 ||r(i)||/||b|| 1.037338734355e-005
>>>>>  4 KSP preconditioned resid norm 1.992914862465e-010 true resid norm 1.261636843933e-012 ||r(i)||/||b|| 3.212623303112e-007
>>>>>  5 KSP preconditioned resid norm 1.422122839379e-012 true resid norm 2.393761421284e-014 ||r(i)||/||b|| 6.095457469466e-009
>>>>> ---Dense format
>>>>>  0 KSP preconditioned resid norm 2.948306125658e+000 true resid norm 2.436662454678e-004 ||r(i)||/||b|| 1.000000000000e+000
>>>>>  1 KSP preconditioned resid norm 9.798852520841e-015 true resid norm 1.267168968393e-018 ||r(i)||/||b|| 5.200428832315e-015
>>>>> 
>>>>> Convergence seems difficult for the sparse format, and the number of outer Newton iterations is usually much larger than with the dense format.
>>>>> As a result, the time step cannot be increased.
>>>>> 
>>>>> Thanks,
>>>>> 
>>>>> Danyang
>>>>>>    Barry
>>>>>> 
>>>>>> On Jan 22, 2014, at 2:11 PM, Danyang Su <danyang.su at gmail.com> wrote:
>>>>>> 
>>>>>>> Dear All,
>>>>>>> 
>>>>>>> I have a reactive transport problem that uses block matrices. Each block can be dense, with many zero entries, or sparse, with the zero entries removed. The model has been tested on a 1D reactive transport problem. When the dense blocks are used, it works well and the time step can be increased gradually to reach a maximum time step; but when the sparse blocks are used, the time step remains at a small value. I checked the entries of both the dense-block and sparse-block matrices (A): they have the same nonzero entries. With the same RHS (b), the solutions (x) differ a little, but seem acceptable. The first matrix, in both dense-block and sparse-block form, is attached, exported in Matrix Market exchange format (.mtp).
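>>>>>>> 
>>>>>>> For reference, the two variants might be set up along these lines (a simplified sketch; the block size and counts are placeholders, and the Mat types are an assumption, since they are not stated above). BAIJ stores every bs x bs block densely, zeros included, while AIJ stores only individual nonzero entries:
>>>>>>> 
>>>>>>>      Mat A;
>>>>>>>      PetscInt bs = 10, nb = 100; /* placeholder block size and number of block rows */
>>>>>>>      /* BAIJ: each bs x bs block stored densely, zeros included */
>>>>>>>      MatCreateBAIJ(PETSC_COMM_WORLD,bs,PETSC_DECIDE,PETSC_DECIDE,nb*bs,nb*bs,3,NULL,2,NULL,&A);
>>>>>>>      /* AIJ: only individual nonzero entries stored */
>>>>>>>      /* MatCreateAIJ(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,nb*bs,nb*bs,3*bs,NULL,2*bs,NULL,&A); */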
>>>>>>> 
>>>>>>> I wonder whether this is caused by the outer Newton iteration or by the solver, since this is not a general problem; it only occurs in some cases.
>>>>>>> 
>>>>>>> Has anyone run into this problem before?
>>>>>>> 
>>>>>>> Thanks and regards,
>>>>>>> 
>>>>>>> Danyang
>>>>>>> <a_reactran_rt_1_dense.mtp><a_reactran_rt_1_dense.PNG><b_reactran_rt_1_dense.txt><x_reactran_rt_1_dense.txt><a_reactran_rt_1_sparse.mtp><a_reactran_rt_1_sparse.PNG><b_reactran_rt_1_sparse.txt><x_reactran_rt_1_sparse.txt>
> 


