[petsc-users] gamg failure with petsc-dev

Stephan Kramer s.kramer at imperial.ac.uk
Tue Apr 1 13:10:49 CDT 2014


On 01/04/14 16:07, Mark Adams wrote:
> Stephan, I have pushed a pull request to fix this, but for now you can just
> use -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi.  This used to
> be the default, but we moved to SOR recently.
> Mark

Ah, that's great news. Thanks a lot for the effort. You're right: the previous defaults should be fine for us, and your fix should hopefully only improve things.

>
>
> On Sat, Mar 29, 2014 at 5:52 PM, Mark Adams <mfadams at lbl.gov> wrote:
>
>> Sorry for getting to this late.  I think you have figured it out basically
>> but there are a few things:
>>
>> 1) You must set the block size of A (bs=2) for the null spaces to work and
>> for aggregation MG to work properly. SA-AMG really does not make sense
>> unless you work at the vertex level, for which we need the block size.

Yes indeed. I've come to realize this now by looking into how smoothed aggregation with a near null space actually works. We currently have our dofs numbered the wrong way around (vertices on the 
inside, velocity component on the outside - which made sense for other equations we solve with the model), so it will take a bit of work, but it might well be worth the effort.
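For readers following along: with block size 2 and an interlaced (vertex-major) ordering, the 2-D elasticity near null space is the three rigid body modes (two translations plus a rotation), which PETSc can build from coordinates via MatNullSpaceCreateRigidBody. The interlaced layout is easy to see in a minimal sketch (pure Python; the function name and coordinates are illustrative, not from the thread):

```python
def rigid_body_modes_2d(coords):
    """Build the three 2-D rigid body modes (x-translation, y-translation,
    rotation about the origin) in interlaced ordering [u0, v0, u1, v1, ...],
    i.e. block size 2 per vertex.  `coords` is a list of (x, y) positions."""
    n = len(coords)
    tx = [0.0] * (2 * n)   # x-translation: u = 1, v = 0 at every vertex
    ty = [0.0] * (2 * n)   # y-translation: u = 0, v = 1 at every vertex
    rot = [0.0] * (2 * n)  # rotation: u = -y, v = x at each vertex
    for i, (x, y) in enumerate(coords):
        tx[2 * i] = 1.0
        ty[2 * i + 1] = 1.0
        rot[2 * i] = -y
        rot[2 * i + 1] = x
    return tx, ty, rot

# Example: three vertices of a unit triangle.
tx, ty, rot = rigid_body_modes_2d([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)])
```

With the dofs numbered the other way around (all u's first, then all v's), the same vectors would have to be permuted, which is why the reordering Stephan describes is needed before the block size can be set.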

Thanks a lot for looking into this
Cheers
Stephan


>>
>> 2) You must be right that the zero column is because the aggregation
>> produced a singleton aggregate, and so the coarse grid is low rank.  This
>> is not catastrophic; it is like a fake BC equation.  The numerics just
>> have to work around it.  Jacobi does this.  I will fix SOR.
>>
>> Mark
>>
>>
>>> Ok, I found out a bit more. The fact that the prolongator has zero
>>> columns appears to arise in petsc 3.4 as well. The only reason it wasn't
>>> flagged before is that the default for the smoother (not the aggregation
>>> smoother but the standard pre- and post-smoothing) changed from jacobi to
>>> sor. I can make the example work with the additional option:
>>>
>>> $ ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode
>>> -elas_mg_levels_1_pc_type jacobi
>>>
>>> Conversely, if in petsc 3.4.4 I change ex49 to include the near nullspace
>>> (the /* constrain near-null space bit */) at the end, it works with jacobi
>>> (the default in 3.4) but breaks with sor, with the same error message as
>>> above. I'm not entirely sure why jacobi doesn't give an error for a zero
>>> on the diagonal, but the zero column also means that the related coarse dof
>>> doesn't actually affect the fine grid solution.
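One way to see why a zero column is harmless to the solution itself: if column j of the prolongator P is zero, then P*xc is independent of the j-th coarse value, so that coarse dof never reaches the fine grid. A toy sketch (pure Python, dense P as a list of rows; purely illustrative):

```python
def prolong(P, xc):
    """Apply a dense prolongator P (list of rows) to a coarse vector xc."""
    return [sum(pij * xj for pij, xj in zip(row, xc)) for row in P]

# 4 fine dofs, 2 coarse dofs; the second column of P is identically zero.
P = [[1.0, 0.0],
     [0.5, 0.0],
     [0.5, 0.0],
     [1.0, 0.0]]

# Changing the coarse value behind the zero column leaves the
# interpolated fine-grid vector unchanged.
same = prolong(P, [2.0, 0.0]) == prolong(P, [2.0, 99.0])  # True
```

The trouble only shows up on the coarse level, where the Galerkin operator P^T A P then has a zero row and column, which is exactly the zero diagonal that SOR trips over while Jacobi (in this PETSc configuration) happens to tolerate.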
>>>
>>> I think (but I might be barking up the wrong tree here) that the zero
>>> columns appear because the aggregation method typically produces a few
>>> small aggregates that are not big enough to support the polynomials of the
>>> near null space (i.e. the polynomials restricted to an aggregate are not
>>> linearly independent). A solution would be to reduce the number of
>>> polynomials for these aggregates (keeping only the linearly independent
>>> ones). Obviously this has the downside that the number of degrees of
>>> freedom per aggregate at the coarse level is no longer constant, making
>>> the bookkeeping more complicated. It would be nice to find a solution,
>>> though, as I've always been taught that jacobi is not a robust smoother
>>> for multigrid.
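The rank argument can be checked directly: restricting the three 2-D rigid body modes to a singleton aggregate (one vertex, two dofs) leaves at most two linearly independent local columns, so orthonormalising the local block of the tentative prolongator must yield a zero column. A small sketch (pure Python, rank counted by modified Gram-Schmidt; illustrative, not the actual SA-AMG code):

```python
def rank(cols, tol=1e-12):
    """Rank of a list of column vectors via modified Gram-Schmidt."""
    basis = []
    for c in cols:
        v = list(c)
        for b in basis:
            d = sum(vi * bi for vi, bi in zip(v, b))
            v = [vi - d * bi for vi, bi in zip(v, b)]
        norm = sum(vi * vi for vi in v) ** 0.5
        if norm > tol:
            basis.append([vi / norm for vi in v])
    return len(basis)

# Rigid body modes restricted to a single vertex at (x, y) = (2.0, 3.0):
# the two local rows of [tx | ty | rot] give columns (1,0), (0,1), (-3,2).
singleton = [[1.0, 0.0], [0.0, 1.0], [-3.0, 2.0]]
r = rank(singleton)  # 2, not 3: the third local column is dependent
```

So a singleton (or two-vertex, collinear) aggregate cannot carry all three candidate coarse basis functions, which matches the variable-dofs-per-aggregate fix Stephan proposes.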
>>>
>>> Cheers
>>> Stephan
>>>
>>>
>>>
>>
>

