[petsc-users] Multiple solves with PCMG fail
Barry Smith
bsmith at mcs.anl.gov
Tue Oct 21 21:27:06 CDT 2014
There are a few issues with your code:

1) If you use KSPSetComputeRHS(), the function you provide is called automatically, immediately before every solve. Thus the vector operations in your code here
ierr = VecSet(ctx.rhs, (float) i * mx); CHKERRQ(ierr);
ierr = VecScale(ctx.rhs, rank); CHKERRQ(ierr);
ierr = KSPSolve(ksp, NULL, NULL); CHKERRQ(ierr);
do nothing: ComputeRHS() overwrites the right-hand side, and each linear solve ends up being performed with a zero right-hand side.
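The usual pattern is to do this work inside the ComputeRHS() callback itself and pass the changing data through the context. A minimal sketch (the AppCtx fields and the step counter are made up for illustration, not taken from your code):

typedef struct {
  PetscInt    step;   /* hypothetical time-step index, set by the caller before each solve */
  PetscInt    mx;
  PetscMPIInt rank;
} AppCtx;

static PetscErrorCode ComputeRHS(KSP ksp, Vec b, void *ctx)
{
  AppCtx         *user = (AppCtx*)ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* This is called automatically right before every KSPSolve(), so build b here */
  ierr = VecSet(b, (PetscScalar)(user->step * user->mx)); CHKERRQ(ierr);
  ierr = VecScale(b, (PetscScalar)user->rank); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Then the time loop only updates the context and solves:

for (i = 0; i < nsteps; i++) {
  user.step = i;
  ierr = KSPSolve(ksp, NULL, NULL); CHKERRQ(ierr);
}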
2) -pc_type mg by itself doesn't give the multigrid enough information to know how many levels to use. In fact it always uses one, hence it is not actually multigrid. You need to pass -pc_mg_levels 2 (or 3, or whatever) to make it use more than one level.
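For example, adapting the run line from your mail (three levels is just an arbitrary choice here):

mpirun -np 4 ./test1 -pc_type mg -pc_mg_levels 3 -pc_mg_galerkin -ksp_view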
Barry
> On Oct 21, 2014, at 1:50 AM, Filippo Leonardi <filippo.leonardi at sam.math.ethz.ch> wrote:
>
> Here,
>
> run this e.g. with
> mpirun -np 4 ./test1 -draw_pause -1 -pc_type mg -pc_mg_galerkin
>
> In a debug build it crashes; in a non-debug build it fails. Using PETSc current.
>
> Run without mg and it works!
>
> Hope it's not me being stupid and doing something terribly wrong!
>
> Best,
> Filippo
>
> On Monday 20 October 2014 07:29:04 Barry Smith wrote:
>> Can you send us the code: to petsc-maint at mcs.anl.gov or
>> petsc-users at mcs.anl.gov ? Or something that reproduces the problem?
>>
>> Barry
>>
>>> On Oct 20, 2014, at 3:31 AM, Filippo Leonardi
>>> <filippo.leonardi at sam.math.ethz.ch> wrote:
>>>
>>> Hi,
>>>
>>> I have a very specific problem that I cannot figure out with PCMG and
>>> multiple solves. I have a linear system that I solve many times, with the
>>> same matrix but a different RHS each time. I can successfully solve the
>>> system with standard techniques, such as the default solver, LU, or PCGAMG.
>>> Even MG works if I destroy the KSP each time or recompute the matrices at
>>> each step. But when I switch to MG without recomputing the matrices each
>>> time, the solver fails. Any idea?
>>>
>>> Here some detail: the setup:
>>> ierr = KSPSetDMActive(ksp, PETSC_FALSE); CHKERRQ(ierr);
>>> ierr = KSPSetDM(ksp, da); CHKERRQ(ierr);
>>> ierr = KSPSetComputeOperators(ksp, ComputeMatrix, ctx); CHKERRQ(ierr);
>>> ierr = KSPSetComputeRHS(ksp, ComputeRHS, ctx); CHKERRQ(ierr);
>>> ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
>>>
>>> Then I solve as usual, for a large number of time steps:
>>> ierr = KSPSolve(ksp, NULL, NULL); CHKERRQ(ierr);
>>> ierr = KSPGetSolution(ksp, &phi);
>>>
>>> The solver converges, and in a reasonable number of iterations:
>>> Linear solve converged due to CONVERGED_RTOL iterations 7
>>> and -ksp_view and -ksp_monitor do not show anything weird.
>>>
>>> - Weirdly enough, with any other solver (for instance cg+bjacobi or gamg)
>>> everything works (so the matrix and RHS are fine).
>>> - But the problem persists with Galerkin matrices (-pc_mg_galerkin), so it
>>> is not a ComputeMatrix problem.
>>> - If I call:
>>> ierr = KSPSetComputeOperators(ksp, ComputeMatrix, this); CHKERRQ(ierr);
>>> between each solve, or destroy the ksp entirely each time, the solution is
>>> also perfect (so it is not a boundary scaling or similar problem); see the
>>> sketch after this list.
>>> - If I run MG with only 2 levels (so just coarse) I also get a fine
>>> result.
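>>>
>>> Roughly, that per-solve workaround loop looks like this (nsteps stands in
>>> for the number of time steps):
>>>
>>> for (int i = 0; i < nsteps; i++) {
>>>   // re-registering the callback marks the operators as needing to be
>>>   // recomputed, so the PC (and MG hierarchy) is rebuilt before the solve
>>>   ierr = KSPSetComputeOperators(ksp, ComputeMatrix, this); CHKERRQ(ierr);
>>>   ierr = KSPSolve(ksp, NULL, NULL); CHKERRQ(ierr);
>>> }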
>>>
>>> The RHS setup is called for every solve, as expected; the matrix setup is
>>> called just once, also as expected.
>>>
>>> The only thing I can think of is that MG does not update some value that
>>> actually needs to be recomputed.
>>>
>>> Any idea?
>>>
>>> The solution is not that different from:
>>> http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex29.c.html
>>>
>>> Best,
>>> Filippo
> <test1.cpp>