[petsc-users] Questions about PCMG
Mark F. Adams
mark.adams at columbia.edu
Wed Apr 4 13:24:28 CDT 2012
I would expect 4 calls to MatLUFactorSym here. It looks like the coarse grid is not getting refactored in the second SNES solve.
Are you using Galerkin coarse grids? Perhaps you are not setting a new coarse grid operator with KSPSetOperators(), so MG does not bother refactoring it.
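For reference, a rough sketch of what I mean (pc is your PCMG preconditioner; Anew is a placeholder for the updated coarse matrix, and this assumes the KSPSetOperators() signature with the MatStructure flag):

  KSP coarse_ksp;
  PCMGGetCoarseSolve(pc, &coarse_ksp);
  /* flag the coarse operator as changed so MG refactors it */
  KSPSetOperators(coarse_ksp, Anew, Anew, DIFFERENT_NONZERO_PATTERN);

If the nonzero pattern has not changed, passing SAME_NONZERO_PATTERN skips the symbolic factorization and only redoes the numeric one.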
Mark
On Apr 4, 2012, at 1:53 PM, Yuqi Wu wrote:
> Thank you.
>
> Can I ask another question?
>
> In my log summary output, it shows that although there are two SNES iterations and a total of 9 linear iterations, the functions MatLUFactorSym and MatLUFactorNum are only called three times.
>
> MatLUFactorSym 3 1.0 1.4073e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.5e+01 1 0 0 0 2 1 0 0 0 2 0
> MatLUFactorNum 3 1.0 3.2754e+01 1.0 9.16e+09 1.0 0.0e+00 0.0e+00 0.0e+00 31 97 0 0 0 32 97 0 0 0 280
>
> I checked the -info output. It shows that one MatLUFactorSymbolic_SeqAIJ() is called in the down smoother of the first SNES solve, one in the coarse solve of the first SNES solve, and one in the down smoother of the second SNES solve.
>
> Do you have any ideas why there are 9 multigrid iterations, but only 3 MatLUFactorSymbolic calls in the program?
>
> Best
>
> Yuqi
>
>
>
>
> ---- Original message ----
>> Date: Tue, 3 Apr 2012 20:08:27 -0500
>> From: petsc-users-bounces at mcs.anl.gov (on behalf of Barry Smith <bsmith at mcs.anl.gov>)
>> Subject: Re: [petsc-users] Questions about PCMG
>> To: PETSc users list <petsc-users at mcs.anl.gov>
>>
>>
>> There are two linear solves (one for SNES iteration 1 and one for SNES iteration 2), so there are two MGSetUp events on each level. Then there are a total of 9 multigrid iterations (in both linear solves together), hence 9 smooths on level 0 (level 0 means the coarse grid solve). With one smooth down and one smooth up on level 1, that gives 18 total smooths on level 1; 9 computations of the residual on level 1; and 18 MGInterp events, because that logs both the restriction to level 0 and the interpolation back to level 1, and 18 = 9 + 9.
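>>
>> In tally form, with 2 linear solves and 9 V-cycles in total:
>>
>>   MGSetup  Level 0, Level 1:  2 each  (one setup per linear solve, per level)
>>   MGSmooth Level 0:           9       (one coarse solve per cycle)
>>   MGSmooth Level 1:          18       (9 smooth-downs + 9 smooth-ups)
>>   MGResid  Level 1:           9       (one residual computation per cycle)
>>   MGInterp Level 1:          18       (9 restrictions + 9 interpolations)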
>>
>> Barry
>>
>> On Apr 3, 2012, at 7:57 PM, Yuqi Wu wrote:
>>
>>> Hi, Barry,
>>>
>>> Thank you. My program converges in two SNES iterations:
>>> 0 SNES norm 1.014991e+02, 0 KSP its (nan coarse its average), last norm 0.000000e+00
>>> 1 SNES norm 9.925218e-05, 4 KSP its (5.25 coarse its average), last norm 2.268574e-06.
>>> 2 SNES norm 1.397282e-09, 5 KSP its (5.20 coarse its average), last norm 1.312605e-12.
>>>
>>> And -pc_mg_log shows the following output
>>>
>>> MGSetup Level 0 2 1.0 3.4091e-01 2.1 0.00e+00 0.0 3.0e+02 6.0e+04 3.0e+01 1 0 3 11 2 1 0 3 11 2 0
>>> MGSmooth Level 0 9 1.0 1.2126e+01 1.0 9.38e+08 3.2 2.8e+03 1.7e+03 6.4e+02 33 71 28 3 34 35 71 28 3 35 415
>>> MGSetup Level 1 2 1.0 1.3925e-01 2.1 0.00e+00 0.0 1.5e+02 3.1e+04 2.3e+01 0 0 1 3 1 0 0 1 3 1 0
>>> MGSmooth Level 1 18 1.0 5.8493e+00 1.0 3.66e+08 3.1 1.5e+03 2.9e+03 3.6e+02 16 28 15 3 19 17 28 15 3 19 339
>>> MGResid Level 1 9 1.0 1.1826e-01 1.4 1.49e+06 2.4 2.0e+02 2.7e+03 9.0e+00 0 0 2 0 0 0 0 2 0 0 70
>>> MGInterp Level 1 18 1.0 1.2317e-01 1.3 7.74e+05 2.2 3.8e+02 1.1e+03 1.8e+01 0 0 4 0 1 0 0 4 0 1 37
>>>
>>> What do MGSmooth, MGResid, and MGInterp represent?
>>>
>>> Best
>>>
>>> Yuqi
>>>
>>> ---- Original message ----
>>>> Date: Tue, 3 Apr 2012 19:19:23 -0500
>>>> From: petsc-users-bounces at mcs.anl.gov (on behalf of Barry Smith <bsmith at mcs.anl.gov>)
>>>> Subject: Re: [petsc-users] Questions about PCMG
>>>> To: PETSc users list <petsc-users at mcs.anl.gov>
>>>>
>>>>
>>>> -pc_mg_log doesn't have anything to do with DA or DMMG; it is part of the basic PCMG. Are you sure you are calling SNESSetFromOptions()?
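>>>>
>>>> For reference, a typical setup order looks like the following (just a sketch; FormFunction, FormJacobian, r, J, x, and ctx are placeholders):
>>>>
>>>>   SNESCreate(PETSC_COMM_WORLD, &snes);
>>>>   SNESSetFunction(snes, r, FormFunction, &ctx);
>>>>   SNESSetJacobian(snes, J, J, FormJacobian, &ctx);
>>>>   SNESSetFromOptions(snes);  /* picks up -pc_mg_log, -info, etc. */
>>>>   SNESSolve(snes, PETSC_NULL, x);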
>>>>
>>>> Barry
>>>>
>>>> On Apr 3, 2012, at 6:56 PM, Yuqi Wu wrote:
>>>>
>>>>> Hi, Mark,
>>>>>
>>>>> Thank you so much for your suggestion.
>>>>>
>>>>> Problem 1 is resolved by avoiding the call to PCMGSetNumberSmoothUp.
>>>>>
>>>>> But since I am using an unstructured grid in my application, I don't use DA or DMMG, so -pc_mg_log didn't give any level information. I tried running my code with -info on 1 processor, and I found some interesting issues.
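>>>>>
>>>>> For context, a manual PCMG setup on an unstructured grid looks roughly like this (a sketch, not my exact code; P is a placeholder for the interpolation matrix between the two levels):
>>>>>
>>>>>   KSPGetPC(ksp, &pc);
>>>>>   PCSetType(pc, PCMG);
>>>>>   PCMGSetLevels(pc, 2, PETSC_NULL);      /* 2 levels: coarse + fine */
>>>>>   PCMGSetInterpolation(pc, 1, P);        /* interpolation onto level 1 */
>>>>>   PCMGSetGalerkin(pc, PETSC_TRUE);       /* coarse operator = R A P */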
>>>>
>>
>