Some SOR questions

Barry Smith bsmith at mcs.anl.gov
Tue Sep 22 15:40:51 CDT 2009


On Sep 22, 2009, at 3:37 PM, Stephan Kramer wrote:

> Thanks for your answers
>
> Barry Smith wrote:
>> On Sep 22, 2009, at 8:47 AM, Stephan Kramer wrote:
>>> Hi all,
>>>
>>> I have some questions basically about the MatRelax_SeqAIJ routine:
>>>
>>> If I understand correctly there are 2 versions of the SOR routine
>>> depending on whether or not there is a zero guess, so that with a
>>> zero guess in the forward sweep you don't need to multiply the
>>> upper diagonal part U with the part of the x vector that is still
>>> zero. Why then does it look like both versions log the same
>>> number of flops? I would have expected that the full forward
>>> sweep (i.e. no zero guess) takes 2*a->nz flops (i.e. the same as
>>> a matvec) and not a->nz.
>>    You are right. This is an error in our code. The fix will be in
>> the next patch.
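
    To make the flop accounting concrete, here is a rough sketch of a
forward sweep over an m-by-m CSR matrix whose strictly lower part L,
diagonal d, and strictly upper part U are stored separately. The
variable names and storage layout are made up for illustration; this is
not the actual MatRelax_SeqAIJ code.

  /* x_i <- (1-omega)*x_i + (omega/d_i)*(b_i - sum_{j<i} a_ij*x_j
                                              - sum_{j>i} a_ij*x_j)
     With a zero initial guess the U sum vanishes on the first sweep, so
     only the L part (about half the nonzeros) is visited: roughly a->nz
     flops in the multiply-plus-add counting. With a nonzero guess both
     parts are visited: roughly 2*a->nz flops, the same as a matvec. */
  static void sor_forward_sweep(int m, const int *li, const int *lj,
                                const double *la, const int *ui,
                                const int *uj, const double *ua,
                                const double *d, const double *b,
                                double *x, double omega, int zero_guess)
  {
    for (int i = 0; i < m; i++) {
      double sum = b[i];
      for (int k = li[i]; k < li[i+1]; k++) sum -= la[k]*x[lj[k]];   /* L*x */
      if (!zero_guess) {
        for (int k = ui[i]; k < ui[i+1]; k++) sum -= ua[k]*x[uj[k]]; /* U*x */
      }
      x[i] = (1.0 - omega)*x[i] + omega*sum/d[i];
    }
  }
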
>>> Why does the Richardson iteration with SOR not take the zero
>>> guess into account, i.e. why does PCApplyRichardson_SOR not set
>>> SOR_ZERO_INIT_GUESS in the call to MatRelax if the Richardson KSP
>>> has a zero initial guess set?
>>    This appears to be a design limitation. There is no mechanism
>> to pass the information that the initial guess is zero into
>> PCApplyRichardson(). We could add support for this by adding one
>> more argument to PCApplyRichardson() for this information. I don't
>> see a simpler way. If one is running, say, 2 iterations of
>> Richardson, then this would be a measurable improvement in time. If
>> one is running many iterations then the savings are tiny. Perhaps
>> this support should be added.
>
> I'm thinking of the application of KSP Richardson with SOR as a
> smoother in PCMG, in which case the down smoother will have a zero
> initial guess (as it only acts on the residual), and there will
> typically be only 1 or 2 iterations, so the saving would be
> significant. Is there another way I should set this up instead?

    I will add this support to petsc-dev. It should be there by
tomorrow morning. Thanks for pointing this out, and please let me know
if you see other issues like these that I can fix.
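
    For what it is worth, once that flag is passed through, the
smoother setup you describe would look roughly like the sketch below.
It is only an outline, not a complete program: error checking with
CHKERRQ is omitted, and in some versions PCMGGetSmoother() is declared
in a separate petscmg.h header.

  #include <petscksp.h>

  /* Configure level l of an existing PCMG preconditioner to smooth with
     2 iterations of Richardson preconditioned by SOR; the down smoother
     then starts from a zero initial guess, so the first sweep can skip
     the U*x product. */
  static PetscErrorCode ConfigureSORSmoother(PC pcmg, PetscInt l)
  {
    KSP smooth;
    PC  smoothpc;
    PCMGGetSmoother(pcmg, l, &smooth);
    KSPSetType(smooth, KSPRICHARDSON);
    KSPSetTolerances(smooth, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT, 2);
    KSPGetPC(smooth, &smoothpc);
    PCSetType(smoothpc, PCSOR);
    return 0;
  }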

    Barry

>
>>> In parallel, if you specify SOR_LOCAL_FORWARD_SWEEP or
>>> SOR_LOCAL_BACKWARD_SWEEP it calls MatRelax on the local part of
>>> the matrix, mat->A, with its=lits and lits=PETSC_NULL (?).
>>> However, the first line of MatRelax_SeqAIJ then says: its =
>>> its*lits. Is that right?
>>    This is all wrong. It should be passing 1 in, not PETSC_NULL.
>> This was fixed in petsc-dev but not in petsc-3.0.0. I will fix it
>> in petsc-3.0.0 and it will be in the next patch.
>>    Thanks for pointing out the problems.
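
    Schematically, with the fix the local forward sweep ends up as a
call on the on-process diagonal block along the lines of the following
(surrounding variable names made up, error checking omitted):

  /* MatRelax_SeqAIJ runs its*lits sweeps in total, so the local-sweep
     path must pass 1 for lits; PETSC_NULL is 0 and would request zero
     sweeps. */
  MatRelax(mat->A, b, omega, SOR_FORWARD_SWEEP, fshift, lits, 1, x);
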
>>    If you plan to use SOR a lot, you might consider switching to
>> petsc-dev since I have made some improvements there. Also consider
>> the Eisenstat trick preconditioner.
>>    Barry
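
    For completeness: the Eisenstat variant is selected in the usual
way, e.g. (a sketch assuming an existing PC object pc; equivalently
-pc_type eisenstat on the command line):

  PCSetType(pc, PCEISENSTAT);   /* SSOR applied via the Eisenstat trick */
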
>>> Please tell me if I'm totally misunderstanding how the routine
>>> works; thanks for any help.
>>>
>>> Cheers
>>> Stephan
>>>
>>> -- 
>>> Stephan Kramer  <s.kramer at imperial.ac.uk>
>>> Applied Modelling and Computation Group,
>>> Department of Earth Science and Engineering,
>>> Imperial College London
>
