[petsc-dev] -pc_type gamg for vector PDE

Alexander Grayver agrayver at gfz-potsdam.de
Fri Jan 20 08:19:11 CST 2012


Jed,

My initial motivation to try gamg was this paper:
W.A. Mulder, A multigrid solver for 3D electromagnetic diffusion,

where it seems to work fine (at least on uniform grids).
I currently use MUMPS and it is very robust. It properly solves systems
with extremely low \omega and \sigma on arbitrarily stretched grids
(which, I guess, would 'kill' any multigrid).
I don't expect iterative solvers to be that robust, but I would like to
solve even simple models with a ~uniform grid on the order of 10^7 unknowns.

So I will look into HYPRE's approach for the Maxwell equations.
Thanks a lot.

On 20.01.2012 13:47, Jed Brown wrote:
> On Fri, Jan 20, 2012 at 05:37, Alexander Grayver 
> <agrayver at gfz-potsdam.de <mailto:agrayver at gfz-potsdam.de>> wrote:
>
>     Well, in my case it is the other way around: I solve this equation
>     for low \omega and possibly very low \sigma. The curl-curl term
>     always dominates.
>
>
> Yes, this is the hard regime.
>
>>     A few things to try:
>>
>>     1) '-pc_gamg_type sa' will provide what should be a better solver
>>     for SPD problems.
>
>     Unfortunately, the convergence with this option is worse for
>     several operators I've tried.
>     Would it help to specify vertex coordinates?
>
>
> SA does not use coordinates except to add rigid body modes to the 
> near-null space, but that near-null space is different and much 
> smaller than your near-null space.
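
For reference, this is roughly how coordinates or a near-null space can be
attached in petsc-dev; a minimal sketch, not specific to this Maxwell
problem, assuming a nodal discretization with 'pc', 'A', 'nloc',
'coordarray' (interlaced x,y,z per vertex) and a coordinate Vec 'coords'
already in scope:

    MatNullSpace   nearnull;
    PetscErrorCode ierr;

    /* Option 1: hand vertex coordinates to the preconditioner;
       GAMG builds the rigid-body modes from them itself.        */
    ierr = PCSetCoordinates(pc, 3, nloc, coordarray);CHKERRQ(ierr);

    /* Option 2: build the rigid-body modes explicitly and attach
       them to the operator as a near-null space.                 */
    ierr = MatNullSpaceCreateRigidBody(coords, &nearnull);CHKERRQ(ierr);
    ierr = MatSetNearNullSpace(A, nearnull);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);

For a curl-curl operator the troublesome near-null space is the space of
gradient fields, not rigid-body modes, so neither call by itself addresses
the difficulty described here.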
>
>     When configuring with hypre I get "Cannot use hypre with complex
>     numbers it is not coded for this capability".
>     I can reformulate my problem in terms of a real matrix:
>
>     If C = A + iB, then Cx = b is equivalent to
>     [A -B; B A] [xr; xi] = [br; bi]
>
>     But I'm not sure whether making the spectrum twice as large would
>     affect the condition number.
>
>
> It's probably not so bad for conditioning, but the methods in 
> BoomerAMG are not intended for Maxwell. Hypre's user's manual (and 
> ML's) have sections on solving Maxwell with their custom interfaces; 
> you would have to read those sections and call them directly (or add 
> that functionality to the PETSc interface; if you want to do this, we 
> can advise and assist).
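
As an aside on the real-equivalent form above: the block system
[A -B; B A] can be set up without copying everything into one monolithic
matrix by using MatNest; a minimal sketch, assuming the real Mats A and B
(real and imaginary parts of C) are already assembled:

    Mat            negB, Creal;
    Mat            blocks[4];
    PetscErrorCode ierr;

    /* negB = -B, leaving B intact */
    ierr = MatDuplicate(B, MAT_COPY_VALUES, &negB);CHKERRQ(ierr);
    ierr = MatScale(negB, -1.0);CHKERRQ(ierr);

    /* blocks in row-major order: [A -B; B A] */
    blocks[0] = A;  blocks[1] = negB;
    blocks[2] = B;  blocks[3] = A;
    ierr = MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &Creal);CHKERRQ(ierr);

A preconditioner that expects a single assembled matrix (BoomerAMG, for
example) would still need the blocks merged or converted into one AIJ
matrix.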
>
>>     3) You say 'staggered' but I just see E here.  Do you have E on
>>     faces?  I forget how staggering works here.  If E is cell
>>     centered then you have a system of 3x3 blocks (with the right
>>     ordering) and GAMG might benefit from setting the block size to
>>     tell it this:
>>
>>     MatSetBlockSize(mat,3);
>
>     I have fields on the edges, but I can formulate another staggering
>     scheme where fields are on the faces. Would it help?
>
>
> Not immediately. I'm not very familiar with the problem, but my 
> understanding is that the edge discretization is probably the one you 
> want anyway.
>
>     My matrix has a 3x3 block structure. For instance, if I have a 100^3
>     grid, then the matrix size is 3*10^6 and each block is of size 10^6.
>     Which block size should I pass to MatSetBlockSize: 10^6 or 3?
>     It is not obvious from its description.
>
>
> This "block size" stuff is for "point blocks" where you have several 
> dofs collocated at points (or modes of a modal basis). You don't have 
> that structure because different variables are at different points, so 
> you can't readily interlace your three macro-blocks to make 3x3 
> point-blocks.
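
To make the distinction concrete, the point-block (interlaced) ordering
that MatSetBlockSize describes looks like this (a sketch; 'mat' and 'ierr'
assumed in scope):

    /* interlaced ordering assumed by MatSetBlockSize(mat,3):
         Ex_0, Ey_0, Ez_0, Ex_1, Ey_1, Ez_1, ...
       i.e. row 3*i+c holds component c of point i            */
    ierr = MatSetBlockSize(mat, 3);CHKERRQ(ierr);

The staggered system here is instead ordered field by field (all Ex rows,
then all Ey, then all Ez), with the components living on different edges,
so there is no 3x3 point-block structure to declare.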
>
>
>
>>     And Jed's answer addresses your 2nd question about null-space.
>>      These solvers will degrade as \omega\mu\sigma gets smaller.
>
>     I do observe this for low \omega and low \sigma (e.g. in the air).
>     I was thinking about how one could project out this null space.
>     Hiptmair, as Jed pointed out, gives some clues. However, it requires
>     some effort to implement, and I wanted to try PETSc's built-in
>     functionality first.
>
>
> If you can afford it, I suggest using a direct solver like MUMPS. 
> Depending on your geometry and on how quickly you need to solve these 
> problems, 10^6 dofs might be manageable.
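
For completeness, MUMPS is selected through PETSc purely with run-time
options; a sketch, assuming PETSc was configured with --download-mumps
(together with ScaLAPACK/BLACS):

    # parallel LU through MUMPS; works with complex scalars directly
    -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps

Since MUMPS handles complex matrices, no real-equivalent reformulation is
needed on this route.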


-- 
Regards,
Alexander
