[petsc-users] GMRES -> PCMG -> PCASM pre- post- smoother
Lorenzo Alessio Botti
lorenzoalessiobotti at gmail.com
Thu Aug 20 05:29:10 CDT 2015
I tried to achieve this behaviour by getting all the smoothers and setting the same preconditioner on the down and up smoother of each level.
smoothers.resize(nLevels+1);
smoothers_up.resize(nLevels);
for (PetscInt i = 0; i < nLevels; i++)
{
  PCMGGetSmootherDown(M_pc, nLevels-i, &(smoothers[i]));   // down (pre-) smoother on this level
  KSPSetInitialGuessNonzero(smoothers[i], PETSC_TRUE);     // needed for full MG and W-cycles
  PCMGGetSmootherUp(M_pc, nLevels-i, &(smoothers_up[i]));  // up (post-) smoother on this level
}
PCMGSetNumberSmoothDown(M_pc,1);
PCMGSetNumberSmoothUp(M_pc,1);
… set coarse solver options here
for (PetscInt i = 0; i < nLevels; i++)
{
  PC pc;
  KSPSetType(smoothers[i], KSPGMRES);
  KSPGetPC(smoothers[i], &pc);          // PC owned by the down smoother
  KSPSetPCSide(smoothers[i], PC_RIGHT);
  PCSetType(pc, PCASM);
  PCFactorSetPivotInBlocks(pc, PETSC_TRUE);
  PCFactorSetAllowDiagonalFill(pc);
  PCFactorSetReuseFill(pc, PETSC_TRUE);
  PCFactorSetReuseOrdering(pc, PETSC_TRUE);
  KSPSetType(smoothers_up[i], KSPGMRES);
  KSPSetPC(smoothers_up[i], pc);        // share the same ASM PC with the up smoother
  KSPSetPCSide(smoothers_up[i], PC_RIGHT);
  // run a fixed number of smoothing steps: no convergence test, no norm computation
  KSPSetConvergenceTest(smoothers[i], KSPConvergedSkip, NULL, NULL);
  KSPSetConvergenceTest(smoothers_up[i], KSPConvergedSkip, NULL, NULL);
  KSPSetNormType(smoothers[i], KSP_NORM_NONE);
  KSPSetNormType(smoothers_up[i], KSP_NORM_NONE);
}
Is this correct?
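For context, this is roughly how the pieces above fit together (a schematic sketch, not my actual setup; M_ksp and the system matrix A are placeholder names used here only for illustration):

KSP M_ksp;
KSPCreate(PETSC_COMM_WORLD, &M_ksp);
KSPSetOperators(M_ksp, A, A);          // A: system matrix (placeholder)
KSPSetType(M_ksp, KSPGMRES);           // outer GMRES iteration
KSPGetPC(M_ksp, &M_pc);
PCSetType(M_pc, PCMG);                 // multigrid as the preconditioner
PCMGSetLevels(M_pc, nLevels+1, NULL);  // nLevels fine levels plus the coarse one
// ... interpolation/restriction set per level, then the two loops above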
Note moreover that for Full Multigrid and W-cycles to work as expected I need to add the KSPSetInitialGuessNonzero option.
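To be explicit, by Full Multigrid and W-cycles I mean settings along these lines (a minimal sketch; PCMGSetType and PCMGSetCycleType are the relevant PETSc calls):

PCMGSetType(M_pc, PC_MG_FULL);          // full multigrid, or alternatively:
PCMGSetCycleType(M_pc, PC_MG_CYCLE_W);  // W-cycles instead of the default V-cycle

These are set before KSPSolve, together with the KSPSetInitialGuessNonzero calls in the first loop above.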
Bests
Lorenzo
> Message: 4
> Date: Thu, 20 Aug 2015 02:37:13 -0500
> From: Barry Smith <bsmith at mcs.anl.gov>
> To: "Aulisa, Eugenio" <eugenio.aulisa at ttu.edu>
> Cc: "petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] GMRES -> PCMG -> PCASM pre- post- smoother
> Message-ID: <1CF3ABE1-214C-4BBC-B8FF-93416EC26EFC at mcs.anl.gov>
> Content-Type: text/plain; charset="us-ascii"
>
>
> What you describe is not the expected behavior; I would have expected exactly the result that you expected.
>
> Do you perhaps have some PETSc options around that may be changing the post-smoother? On the command line, in a petscrc file, or in the PETSC_OPTIONS environment variable? Can you send us some code that we could run that reproduces the problem?
>
> Barry
>
>> On Aug 19, 2015, at 9:26 PM, Aulisa, Eugenio <eugenio.aulisa at ttu.edu> wrote:
>>
>> Hi,
>>
>> I am solving an iteration of
>>
>> GMRES -> PCMG -> PCASM
>>
>> where I build my particular ASM domain decomposition.
>>
>> In setting the PCMG I would like at each level
>> to use the same pre- and post-smoother
>> and for this reason I am using
>> ...
>> PCMGGetSmoother(pcMG, level, &subksp);
>>
>> to extract and set at each level the ksp object.
>>
>> In setting PCASM then I use
>> ...
>> KSPGetPC(subksp, &subpc);
>> PCSetType(subpc, PCASM);
>> ...
>> and then set my own decomposition
>> ...
>> PCASMSetLocalSubdomains(subpc,_is_loc_idx.size(),&_is_ovl[0],&_is_loc[0]);
>> ...
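>>
>> (Schematically, each pair of index sets is built along these lines; the actual
>> index computation is omitted and the names here are only illustrative:
>>
>> std::vector<PetscInt> ovl_idx, loc_idx; // one subdomain, with and without overlap
>> IS is_ovl, is_loc;
>> ISCreateGeneral(PETSC_COMM_SELF, (PetscInt)ovl_idx.size(), &ovl_idx[0], PETSC_COPY_VALUES, &is_ovl);
>> ISCreateGeneral(PETSC_COMM_SELF, (PetscInt)loc_idx.size(), &loc_idx[0], PETSC_COPY_VALUES, &is_loc);
>>
>> the resulting IS objects are what I collect in _is_ovl and _is_loc.)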
>>
>> Now everything compiles and runs with no memory leaks,
>> but I do not get the expected convergence.
>>
>> When I checked the output of -ksp_view, I saw something that puzzled me:
>> at each level > 0 the pre-smoother uses the ASM domain decomposition
>> that I set, while the post-smoother does not. For example, with 4 processes I get
>>
>>>>>>>>>>>>>>>>>>>>>
>> ...
>> Down solver (pre-smoother) on level 2 -------------------------------
>>   KSP Object: (level-2) 4 MPI processes
>>     type: gmres
>>       GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>>       GMRES: happy breakdown tolerance 1e-30
>>     maximum iterations=1
>>     using preconditioner applied to right hand side for initial guess
>>     tolerances: relative=1e-12, absolute=1e-20, divergence=1e+50
>>     left preconditioning
>>     using nonzero initial guess
>>     using NONE norm type for convergence test
>>   PC Object: (level-2) 4 MPI processes
>>     type: asm
>>       Additive Schwarz: total subdomain blocks = 198, amount of overlap = 0
>>       Additive Schwarz: restriction/interpolation type - RESTRICT
>>       [0] number of local blocks = 52
>>       [1] number of local blocks = 48
>>       [2] number of local blocks = 48
>>       [3] number of local blocks = 50
>>     Local solve info for each block is in the following KSP and PC objects:
>>     - - - - - - - - - - - - - - - - - -
>> ...
>>>>>>>>>>>>>
>>
>>
>> whereas in the post-smoother I have the default ASM decomposition with overlap 1:
>>
>>
>>>>>>>>>>>>>
>> ...
>> Up solver (post-smoother) on level 2 -------------------------------
>>   KSP Object: (level-2) 4 MPI processes
>>     type: gmres
>>       GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>>       GMRES: happy breakdown tolerance 1e-30
>>     maximum iterations=2
>>     tolerances: relative=1e-12, absolute=1e-20, divergence=1e+50
>>     left preconditioning
>>     using nonzero initial guess
>>     using NONE norm type for convergence test
>>   PC Object: (level-2) 4 MPI processes
>>     type: asm
>>       Additive Schwarz: total subdomain blocks = 4, amount of overlap = 1
>>       Additive Schwarz: restriction/interpolation type - RESTRICT
>>     Local solve is same for all blocks, in the following KSP and PC objects:
>>     KSP Object: (level-2sub_) 1 MPI processes
>>       type: preonly
>>       maximum iterations=10000, initial guess is zero
>>       tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>>       left preconditioning
>> ...
>>>>>>>>>>>>>>>
>> %%%%%%%%%%%%%%%%%%%%%%%%
>>
>> So it seems that by using
>>
>> PCMGGetSmoother(pcMG, level, &subksp);
>>
>> I was able to set both the pre- and post-smoothers to be PCASM,
>> but everything I did after that applied only to the
>> pre-smoother, while the post-smoother got the default PCASM options.
>>
>> I know that I can use
>> PCMGGetSmootherDown and PCMGGetSmootherUp, but that would
>> probably double the memory allocation and the computational cost of the ASM.
>>
>> Is there any way I can just use PCMGGetSmoother
>> and use the same PCASM in the pre- and post- smoother?
>>
>> I hope I was clear enough.
>>
>> Thanks a lot for your help,
>> Eugenio
>>
>>