[petsc-dev] Multigrid is confusing

Mark F. Adams mark.adams at columbia.edu
Thu May 24 16:26:09 CDT 2012


On May 24, 2012, at 4:10 PM, Barry Smith wrote:

> 
> On May 24, 2012, at 2:37 PM, Mark F. Adams wrote:
> 
>> 
>> On May 24, 2012, at 3:30 PM, Jed Brown wrote:
>> 
>>> On Thu, May 24, 2012 at 2:16 PM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>>> Is Eisenstat really worth it?
>>> 
>>> Given a memory-centric future, and that Jed, even now, is not seeing a win ... maybe with a regular grid and red/black ordering you can skip a whole pass through the data, but for AIJ (I assume that is what ex5 uses) lexicographic ordering is probably a better model, and it does not look like a big win to me.
>>> 
>>> Maybe Eisenstat should split the matrix storage similar to the factorization kernels?
> 
>   Absolutely.
>> 
>> Is it worth adding and maintaining a new kernel for Eisenstat ...
> 
>   Absolutely. And if it turns out to be too much of a pain to write and maintain such a kernel there is something wrong with our programming model and code development system. The right system should make all the complexity fall away; when the complexity becomes too much of a bear you know you have the wrong system.
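For reference, the trick under discussion can be sketched in a few lines of plain Python (my toy dense illustration with omega = 1, not PETSc code): split A = L + D + U into strict lower, diagonal, and strict upper parts, take the SSOR-style preconditioner M = (D + L) D^{-1} (D + U), and note that since A = (D + L) + U the preconditioned product collapses so that only the upper-triangular half of A is ever multiplied -- which is exactly why storing the triangular halves separately, as the factorization kernels do, would pay off.

```python
# Eisenstat's trick, omega = 1 (illustrative sketch, not PETSc code).
# With A = L + D + U and M = (D + L) D^{-1} (D + U), use A = (D + L) + U:
#   M^{-1} A x = (D + U)^{-1} D (x + (D + L)^{-1} U x)
# so the lower-triangular half of A never appears in a multiply.

def forward_sub(L, d, b):
    """Solve (D + L) y = b with L strictly lower triangular, d the diagonal."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][j] * y[j] for j in range(i))) / d[i]
    return y

def backward_sub(U, d, b):
    """Solve (D + U) y = b with U strictly upper triangular, d the diagonal."""
    n = len(b)
    y = [0.0] * n
    for i in reversed(range(n)):
        y[i] = (b[i] - sum(U[i][j] * y[j] for j in range(i + 1, n))) / d[i]
    return y

def eisenstat_apply(L, d, U, x):
    """Return M^{-1} A x touching only U, two triangular solves, and D."""
    n = len(x)
    ux = [sum(U[i][j] * x[j] for j in range(i + 1, n)) for i in range(n)]
    t = forward_sub(L, d, ux)                     # (D + L)^{-1} U x
    w = [d[i] * (x[i] + t[i]) for i in range(n)]  # D (x + t)
    return backward_sub(U, d, w)                  # (D + U)^{-1} w
```

The naive route does a full matvec with A plus the same two triangular solves, so the trick saves roughly the lower-triangular half of the matvec per application (with omega != 1 an extra diagonal correction term appears, but the structure is the same).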

OK, in that case then I really have to add my parallel Gauss-Seidel to PETSc.  You write the kernels (not hard; they just take an IS of rows to process and a flag for whether to process them in reverse order) for AIJ and the BAIJ formats and I'll convert my code to C.
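A sketch of the kernel interface I mean (illustrative Python, not the eventual C; the names are made up): one relaxation pass over a caller-supplied index set of rows, optionally in reverse, with the IS playing the role of the ordering that the parallel algorithm computes.

```python
# Hypothetical ordered Gauss-Seidel kernel: relax only the rows listed in
# `rows`, in the given order or reversed.  Dense A for clarity; the real
# kernel would walk AIJ/BAIJ row storage.

def gs_sweep(A, b, x, rows, reverse=False):
    """One Gauss-Seidel pass over `rows`, updating x in place.

    Rows not listed in `rows` are left untouched."""
    order = reversed(rows) if reverse else rows
    for i in order:
        s = b[i] - sum(A[i][j] * x[j] for j in range(len(x)) if j != i)
        x[i] = s / A[i][i]
    return x
```

The parallel algorithm would then hand this kernel the interior rows while ghost values are being exchanged, and a forward pass followed by one with reverse=True gives a symmetric sweep.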

Mark 

> 
>   Barry
> 
>> 
>>> 
>>> 
>>> Mark
>>> 
>>> On May 24, 2012, at 2:48 PM, Barry Smith wrote:
>>> 
>>>> 
>>>> On May 24, 2012, at 1:20 PM, Jed Brown wrote:
>>>> 
>>>>> On Wed, May 23, 2012 at 2:52 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>>>>> On Wed, May 23, 2012 at 2:26 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>>> 
>>>>> Note that you could use -pc_type eisenstat perhaps in this case instead. Might save lots of flops?  I've often wondered about doing Mark's favorite Chebyshev smoother with Eisenstat; seems like it should be a good match.
>>>>> 
>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>>> [0]PETSC ERROR: No support for this operation for this object type!
>>>>> [0]PETSC ERROR: Cannot have different mat and pmat!
>>>>> 
>>>>> Also, I'm having trouble getting Eisenstat to be more than very marginally faster than SOR.
>>>> 
>>>>   There is still a bug related to Eisenstat. If you are trying to use it with kspest you are likely hitting the bug.
>>>> 
>>>>   I will fix the bug when I have time.
>>>> 
>>>>  Barry
>>>> 
>>>>> 
>>>>> 
>>>>> I think we should later be getting the eigenvalue estimate by applying the preconditioned operator to a few random vectors, then orthogonalizing. The basic algorithm is to generate a random matrix X (say 5 or 10 columns), compute
>>>>> 
>>>>> Y = (P^{-1} A)^q X
>>>>> 
>>>>> where q is 1 or 2 or 3, then compute
>>>>> 
>>>>> Q R = Y
>>>>> 
>>>>> and compute the largest singular value of the small matrix R. The orthogonalization can be done in one reduction and all the MatMults can be done together. Whenever we manage to implement a MatMMult and PCMApply or whatever (names inspired by VecMDot), this will provide a very low communication way to get the eigenvalue estimates.
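[Inline: the estimator above can be sketched like so. This is a toy Python illustration under my own assumptions: P = I, a diagonal A so the operator application is trivial, and taking the q-th root of the largest singular value of R as the lambda_max estimate -- the note above stops at "largest singular value of R", so that last step is my guess. A real implementation would also do the orthogonalization as a TSQR/CholeskyQR in one reduction rather than modified Gram-Schmidt.]

```python
import random

def op_apply(diag, v):
    """Stand-in for P^{-1} A v: diagonal A, P = I (assumptions for this toy)."""
    return [d * vi for d, vi in zip(diag, v)]

def estimate_lambda_max(diag, k=8, q=3, seed=1):
    n = len(diag)
    rng = random.Random(seed)
    # Y = (P^{-1} A)^q X for a random block X with k columns
    Y = []
    for _ in range(k):
        v = [rng.gauss(0.0, 1.0) for _ in range(n)]
        for _ in range(q):
            v = op_apply(diag, v)
        Y.append(v)
    # Thin QR by modified Gram-Schmidt; only the k x k factor R is kept.
    # (A real implementation would use TSQR/CholeskyQR: one reduction.)
    R = [[0.0] * k for _ in range(k)]
    for j in range(k):
        for i in range(j):
            R[i][j] = sum(a * b for a, b in zip(Y[i], Y[j]))
            Y[j] = [b - R[i][j] * a for a, b in zip(Y[i], Y[j])]
        R[j][j] = sum(b * b for b in Y[j]) ** 0.5
        Y[j] = [b / R[j][j] for b in Y[j]]
    # Largest singular value of R (== largest singular value of Y)
    # via power iteration on R^T R.
    w = [1.0] * k
    for _ in range(100):
        u = [sum(R[i][j] * w[j] for j in range(k)) for i in range(k)]   # R w
        w = [sum(R[i][j] * u[i] for i in range(k)) for j in range(k)]   # R^T u
        nrm = sum(x * x for x in w) ** 0.5
        w = [x / nrm for x in w]
    u = [sum(R[i][j] * w[j] for j in range(k)) for i in range(k)]
    sigma = sum(x * x for x in u) ** 0.5
    return sigma ** (1.0 / q)  # undo the q operator applications (my assumption)
```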
>>>>> 
>>>>> 
>>>>> I want to turn off norms in Chebyshev by default (they are very wasteful), but how should I make -mg_levels_ksp_monitor turn them back on? I'm already tired of typing -mg_levels_ksp_norm_type unpreconditioned.
