[petsc-dev] -pc_type gamg for vector PDE
Alexander Grayver
agrayver at gfz-potsdam.de
Fri Jan 20 05:37:39 CST 2012
Mark, Jed,
Thanks a lot for your responses.
> It looks like you have a largish \omega\mu\sigma because, as Jed
> mentioned, one would not expect GAMG to work well for the curl-curl part.
Well, in my case it is the other way around: I solve this equation for
low \omega and possibly very low \sigma, so the curl-curl term always
dominates.
>
> A few things to try:
>
> 1) '-pc_gamg_type sa' will provide what should be a better solver for
> SPD problems.
Unfortunately, convergence with this option is worse for several
operators I have tried.
Would it help to specify the vertex coordinates?
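For concreteness, this is roughly what I would try for passing the
coordinates, if that is the intended mechanism (a sketch only; I am
assuming PCSetCoordinates is the right entry point here, that ksp is
the already-configured solver, and that coords[] holds x,y,z
interlaced for the nloc locally owned points):

  PetscErrorCode ierr;
  PC             pc;
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCGAMG);CHKERRQ(ierr);
  /* dim = 3, nloc local points, coords[] = {x0,y0,z0,x1,y1,z1,...} */
  ierr = PCSetCoordinates(pc,3,nloc,coords);CHKERRQ(ierr);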
>
> 2) Try:
>
> -pc_type hypre ! instead of -pc_type gamg
> -pc_hypre_type boomeramg
> -pc_hypre_boomeramg_strong_threshold 0.6
When configuring with hypre I get "Cannot use hypre with complex numbers
it is not coded for this capability".
I can reformulate my problem in terms of an equivalent real matrix:
if C = A + iB, then Cx = b becomes
[A -B; B A] [xr; xi] = [br; bi]
But I am not sure whether making the spectrum twice as large would
affect my condition number.
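If I go that route, the assembly I have in mind is roughly the
following (a sketch only, assuming a real-scalar build of PETSc and
that the real and imaginary parts A and B are already assembled as
separate Mat objects):

  /* Build the real-equivalent operator [A -B; B A] with MatNest */
  PetscErrorCode ierr;
  Mat            negB, blocks[4], Creal;
  ierr = MatDuplicate(B,MAT_COPY_VALUES,&negB);CHKERRQ(ierr);
  ierr = MatScale(negB,-1.0);CHKERRQ(ierr);
  blocks[0] = A; blocks[1] = negB;   /* first block row:  [A  -B] */
  blocks[2] = B; blocks[3] = A;      /* second block row: [B   A] */
  ierr = MatCreateNest(PETSC_COMM_WORLD,2,NULL,2,NULL,blocks,&Creal);CHKERRQ(ierr);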
>
> 3) You say 'staggard' but I just see E here. Do you have E on faces?
> I forget how staggering works here. If E is cell centered then you
> have a system of 3x3 blocks (with the right ordering) and GAMG might
> benefit from setting the block size to tell it this:
>
> MatSetBlockSize(mat,3);
I have fields on the edges, but I could formulate another staggering
scheme with the fields on the faces. Would that help?
My matrix has a 3x3 block structure. For instance, for a 100^3 grid the
matrix has dimension 3*10^6 and each block has dimension 10^6.
Which block size should I pass to MatSetBlockSize, 10^6 or 3? It is not
obvious from its description.
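To make my question concrete: if the answer is 3, I assume it also
implies interleaving the unknowns as (Ex,Ey,Ez) per grid point rather
than my current field-block ordering, i.e. something like the sketch
below (n is the number of grid points; all of this is an assumption on
my side):

  /* Unknowns ordered (Ex,Ey,Ez) per grid point, n points in total */
  PetscErrorCode ierr;
  Mat            A;
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,3*n,3*n);CHKERRQ(ierr);
  ierr = MatSetBlockSize(A,3);CHKERRQ(ierr);  /* 3x3 point blocks */
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);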
> And Jed's answer addresses your 2nd question about null-space. These
> solvers will degrade as \omega\mu\sigma gets smaller.
I do observe this for low \omega and low \sigma (e.g. in the air). I
have been thinking about how one could project out this null space.
Hiptmair, as Jed pointed out, gives some clues. However, it requires
some effort to implement, and I wanted to try PETSc's built-in
functionality first.
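For reference, the built-in route I was thinking of is KSPSetNullSpace,
roughly as below (a sketch only; build_gradient_vecs is a hypothetical
routine of mine, and I realize the gradient null space of the curl-curl
term is large, so a few vectors may well not be enough):

  /* Attach an (approximate) null space to the solver */
  PetscErrorCode ierr;
  MatNullSpace   nullsp;
  Vec            *vecs;   /* assumed to span the space to project out */
  PetscInt       nv;
  ierr = build_gradient_vecs(&nv,&vecs);CHKERRQ(ierr);  /* hypothetical */
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_FALSE,nv,vecs,&nullsp);CHKERRQ(ierr);
  ierr = KSPSetNullSpace(ksp,nullsp);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);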
>
> Mark
>
> On Jan 19, 2012, at 5:37 PM, Jed Brown wrote:
>
>> On Wed, Jan 18, 2012 at 08:22, Alexander Grayver
>> <agrayver at gfz-potsdam.de> wrote:
>>
>> Hello petsc team,
>>
>> I solve 3D vector Helmholtz equation like following:
>>
>> \nabla \times \nabla \times E + i\omega\mu\sigma E = -J
>>
>>
>> Multigrid methods for curl-curl problems are pretty specialized. ML
>> and Hypre have support for specific discretizations; I don't know if
>> they support an imaginary shift. The PETSc interface to these
>> packages does not currently support their special Maxwell interfaces.
>>
>> The methods are generally based on edge relaxation or auxiliary space
>> preconditioning, see Hiptmair or Arnold, Falk, and Winther for the
>> mathematical background.
>>
>>
>> I use a structured staggered grid and FD. The solution is a vector
>> that consists of three parts, E = {Ex Ey Ez}. The operator is a
>> symmetric matrix with complex numbers on the diagonal.
>> I'm interested in solving this system with iterative techniques.
>> I applied the newly introduced gamg and it gives promising
>> results, but all I did was:
>> -ksp_type tfqmr -pc_type gamg
>>
>> I played with different ksp_type and gamg options listed on the
>> PCGAMG doc page, but nothing improved convergence.
>> Could you please guide me a bit through the usage of this technique?
>> The precise questions are:
>> 1. Do I have to do something to tell PETSc that my equation is a
>> vector equation? Is that important for the gamg PC?
>> 2. Should I take the null space into account using KSPSetNullSpace?
>> It is well known that as \omega or \sigma gets small, the
>> null space of the geometric term (the curl-curl operator) starts to
>> dominate and the system becomes more ill-conditioned.
>> 3. Which options for gamg may improve convergence in my case?
>>
>>
>> Thanks a lot in advance.
>>
>> --
>> Regards,
>> Alexander
>>
>>
>
--
Regards,
Alexander