[petsc-users] DIVERGED_INDEFINITE_PC in algebraic multigrid
Michele Rosso
mrosso at uci.edu
Mon Jun 2 00:46:56 CDT 2014
Mark,
I tried resetting the solver every time: the number of iterations is now
constant during the whole simulation!
I tried GMG instead of AMG as well: it works in this case too, so the
trick was to reset the KSP object each time.
As you predicted, each solve takes longer since the KSP has to be set up
again. I noticed that the time increase is larger than 2x, particularly
for large grids.
I need to optimize the solve now, maybe by resetting only when needed.
Could you help me with that please?
Thanks,
Michele
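
A minimal sketch in C of the "reset only when needed" idea: the iteration
threshold, the helper name, and the PETSc 3.5-style KSPSetOperators calling
sequence are assumptions, not something from this thread, and error checking
is omitted for brevity.

    #include <petscksp.h>

    /* Solve, then rebuild the KSP from scratch only if the solver has
       degraded, i.e. the last solve needed more iterations than a
       user-chosen threshold. */
    PetscErrorCode SolveAndMaybeReset(Mat A, Vec b, Vec x, KSP *ksp,
                                      PetscInt max_its_before_reset)
    {
      PetscInt its;

      KSPSolve(*ksp, b, x);
      KSPGetIterationNumber(*ksp, &its);
      if (its > max_its_before_reset) {
        KSPDestroy(ksp);                     /* discard the stale AMG setup      */
        KSPCreate(PETSC_COMM_WORLD, ksp);    /* rebuild the solver               */
        KSPSetOperators(*ksp, A, A);         /* PETSc 3.5+ calling sequence      */
        KSPSetFromOptions(*ksp);             /* reapply -ksp_type/-pc_type etc.  */
      }
      return 0;
    }

The next call then pays the full setup cost once, and cheap solves resume
until the iteration count creeps up again.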
On 05/28/2014 07:54 PM, Mark Adams wrote:
>
>
>
> On Mon, May 26, 2014 at 12:20 PM, Michele Rosso <mrosso at uci.edu> wrote:
>
> Mark,
>
> thank you for your input and sorry for my late reply: I saw your email
> only now.
> By setting up the solver at each time step, do you mean re-defining the
> KSP context every time?
>
>
> The simplest thing is to just delete the object and create it again.
> There are "reset" methods that do the same thing semantically, but it
> is probably just easier to destroy the KSP object, recreate it, and
> redo your setup code.
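
In code, that amounts to something like the following sketch each time step
(C, PETSc 3.5-style KSPSetOperators; the options shown are only examples):

    KSPDestroy(&ksp);                    /* throw away the old solver and its setup */
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);          /* operator for the current time step      */
    KSPSetFromOptions(ksp);              /* e.g. -ksp_type cg -pc_type gamg         */
    KSPSolve(ksp, b, x);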
>
> Why should this help?
>
>
> AMG setup data is optimized for a particular operator, but "stale" setup
> data often still works well on problems that evolve, at least for a while,
> and it saves a lot of time not to redo the "setup" every time. How often
> you should "refresh" the setup data is problem dependent and the
> application needs to control that. There are some hooks to fine-tune
> how much setup data is recomputed each solve, but for now we are just
> trying to see if redoing the setup every time helps. If this fixes the
> problem then we can think about cost. If it does not fix the problem
> then it is more serious.
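
One such hook, if it is available in your PETSc version (3.5 and later), is
KSPSetReusePreconditioner: keep a single KSP and tell it whether to reuse the
existing setup for the next solve. A sketch, where step and nrefresh are
assumed application variables:

    KSPSetOperators(ksp, A, A);          /* matrix values updated this step       */
    KSPSetReusePreconditioner(ksp, (step % nrefresh) ? PETSC_TRUE : PETSC_FALSE);
    KSPSolve(ksp, b, x);                 /* AMG setup redone only when reuse is
                                            PETSC_FALSE                            */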
>
> I will definitely try that as well as the hypre solution and
> report back.
> Again, thank you.
>
> Michele
>
>
> On 05/22/2014 09:34 AM, Mark Adams wrote:
>> If the solver is degrading as the coefficients change, and I
>> would assume they get nastier, you can try deleting the solver at
>> each time step. This will be about 2x more expensive, because it
>> does the setup for each solve, but it might fix your problem.
>>
>> You also might try:
>>
>> -pc_type hypre
>> -pc_hypre_type boomeramg
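
These options can also be set in code rather than on the command line; a
minimal sketch, assuming PETSc was configured with hypre (e.g. --download-hypre)
and the KSP already exists:

    PC pc;
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCHYPRE);              /* -pc_type hypre           */
    PCHYPRESetType(pc, "boomeramg");     /* -pc_hypre_type boomeramg */
    KSPSetFromOptions(ksp);              /* still allow overrides    */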
>>
>>
>>
>>
>> On Mon, May 19, 2014 at 6:49 PM, Jed Brown <jed at jedbrown.org> wrote:
>>
>> Michele Rosso <mrosso at uci.edu> writes:
>>
>> > Jed,
>> >
>> > thank you very much!
>> > I will try with -mg_levels_ksp_type chebyshev -mg_levels_pc_type sor
>> > and report back.
>> > Yes, I removed the nullspace from both the system matrix and the rhs.
>> > Is there a way to have something similar to Dendy's multigrid or the
>> > deflated conjugate gradient method with PETSc?
>>
>> Dendy's MG needs geometry. The algorithm to produce the interpolation
>> operators is not terribly complicated so it could be done, though DMDA
>> support for cell-centered discretizations is somewhat awkward. "Deflated
>> CG" can mean lots of things so you'll have to be more precise. (Most
>> everything in the "deflation" world has a clear analogue in the MG world,
>> but the deflation community doesn't have a precise language to talk about
>> their methods, so you always have to read the paper carefully to find out
>> if it's completely standard or if there is something new.)
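
For reference, removing the constant null space that Michele mentions above is
typically done along these lines (PETSc 3.5+ signatures; older releases take an
extra argument in MatNullSpaceRemove):

    MatNullSpace nullsp;
    MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp);
    MatSetNullSpace(A, nullsp);          /* lets the Krylov solver handle the null space */
    MatNullSpaceRemove(nullsp, b);       /* make the right-hand side consistent          */
    MatNullSpaceDestroy(&nullsp);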
>>
>>
>
>