On Thu, Feb 16, 2012 at 12:54 PM, <span dir="ltr"><<a href="mailto:coco@dmi.unict.it">coco@dmi.unict.it</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Dear list,<br>
<br>
I would like to parallelize a multigrid code with PETSc. I do not want to use the DMMG infrastructure, since it will be replaced in the next PETSc release, so I preferred to use multigrid as a preconditioner. In practice, I use the Richardson iteration with the matrix of the linear system itself as the preconditioner; I therefore expect the Richardson iteration to converge in a single iteration, so that it is effectively the same as solving the whole linear system by multigrid.<br>
</blockquote><div><br></div><div>Your understanding of the Richardson iteration is flawed. You can consult Yousef Saad's book for the standard definition and analysis.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
As a first test, I tried to use a one-grid multigrid (thus not truly multigrid). I just set the coarse solver (with a standard KSPSolve), which should be enough, because the multigrid already starts from the coarsest grid and therefore needs neither the smoother nor the transfer operators.<br>
Unfortunately, the iteration scheme (which should be the Richardson scheme) converges with reason=4 (KSP_CONVERGED_ITS) to a wrong solution.<br>
On the other hand, if I solve the whole problem with a standard KSPSolve (that is, without setting the multigrid as a preconditioner), it converges to the right solution with reason=2.<br></blockquote><div><br></div>
<div>Yes, give Richardson many more iterations, -ksp_max_it.</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
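</blockquote><div>For example, run-time options along these lines (a sketch; check the option names against the manual pages for the PETSc version in use) raise the iteration cap and show what the solver is doing:</div>

```
-ksp_type richardson -pc_type mg -ksp_rtol 1.e-12 -ksp_max_it 10000 -ksp_monitor_true_residual -ksp_converged_reason
```

<div>Equivalently in code, the last argument of KSPSetTolerances() is the maximum iteration count, e.g. KSPSetTolerances(ksp,1.e-12,PETSC_DEFAULT,PETSC_DEFAULT,10000).</div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">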
I thought that the two methods should be exactly the same, and I do not understand why they give different convergence results.<br>
<br>
Here is the relevant code:<br>
<br>
// Set the matrix of the linear system<br>
Mat Mcc;<br>
ierr=MatCreate(PETSC_COMM_WORLD,&Mcc); CHKERRQ(ierr);<br>
ierr=MatSetType(Mcc, MATMPIAIJ); CHKERRQ(ierr);<br>
ierr=MatSetSizes(Mcc,PETSC_DECIDE,PETSC_DECIDE,1000,1000); CHKERRQ(ierr);<br>
ierr=setMatrix(Mcc); CHKERRQ(ierr); // A routine that sets the values of the matrix Mcc<br>
<br>
// Set the KSP solver with the multigrid as a preconditioner<br>
KSP ksp, kspCoarseSolve;<br>
PC pc;<br>
ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);<br>
ierr = KSPSetType(ksp,KSPRICHARDSON);CHKERRQ(ierr);<br>
ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);<br>
ierr = PCSetType(pc,PCMG);CHKERRQ(ierr);<br>
ierr = PCMGSetLevels(pc,1,&PETSC_COMM_WORLD);CHKERRQ(ierr);<br>
ierr = PCMGSetType(pc,PC_MG_MULTIPLICATIVE);CHKERRQ(ierr);<br>
ierr = PCMGGetCoarseSolve(pc,&kspCoarseSolve);CHKERRQ(ierr);<br>
ierr = KSPSetOperators(kspCoarseSolve,Mcc,Mcc,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);<br>
ierr = KSPSetTolerances(kspCoarseSolve,1.e-12,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);<br>
ierr = KSPSetOperators(ksp,Mcc,Mcc,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);<br>
ierr = KSPSetTolerances(ksp,1.e-12,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);<br>
ierr = KSPSolve(ksp,RHS,U);CHKERRQ(ierr);<br>
<br>
// Solve with the standard KSPSolve<br>
KSP ksp1;<br>
ierr = KSPCreate(PETSC_COMM_WORLD,&ksp1);CHKERRQ(ierr);<br>
ierr = KSPSetOperators(ksp1,Mcc,Mcc,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);<br>
ierr = KSPSetTolerances(ksp1,1.e-12/(2*nn123),PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);<br>
ierr = KSPSolve(ksp1,RHS,U1);CHKERRQ(ierr);<br>
<br>
<br>
In the end, the vectors U and U1 are different.<br>
Thank you.<br>
<br>
Best regards,<br>
Armando<br>
<br>
</blockquote></div><br><br clear="all"><div><br></div>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener<br>