[petsc-users] Multigrid as a preconditioner

coco at dmi.unict.it coco at dmi.unict.it
Thu Feb 16 12:54:21 CST 2012


Dear list,

I would like to parallelize a multigrid code with PETSc. I do not want
to use the DMMG infrastructure, since it will be replaced in the next
PETSc release, so I preferred to use multigrid as a preconditioner.
In practice, I use a Richardson iteration whose preconditioner is built
from the same matrix as the linear system, so the Richardson iteration
should converge in a single iteration, and in effect the whole linear
system is solved by the multigrid.
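
For reference, I believe the same configuration can also be selected from
the options database (a sketch, assuming the code calls KSPSetFromOptions):

   -ksp_type richardson -pc_type mg -pc_mg_levels 1 -ksp_rtol 1.e-12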

As a first test, I tried a one-level multigrid (so not a true multigrid).
I only set the coarse solver (a standard KSPSolve), which should be
enough, because the multigrid already starts on the coarsest grid and
therefore needs neither the smoother nor the transfer operators.
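
To check what the one-level PCMG actually builds (levels, smoothers, coarse
solver), the assembled solver can be printed; a minimal sketch, equivalent
to passing -ksp_view on the command line:

   ierr = KSPSetUp(ksp);CHKERRQ(ierr);                          /* build the PC hierarchy */
   ierr = KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); /* print the KSP/PC configuration */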

Unfortunately, the iteration (which should be a Richardson iteration)
stops with reason=4 (KSP_CONVERGED_ITS) at a wrong solution.
On the other hand, if I solve the whole problem with a standard
KSPSolve (that is, without setting the multigrid as a preconditioner),
it converges to the right solution with reason=2 (KSP_CONVERGED_RTOL).
I thought the two approaches should be exactly the same method, and I do
not understand why they give different convergence results.
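
The termination of each solve can be queried after KSPSolve along these
lines (a sketch; the print format is only illustrative):

   KSPConvergedReason reason;
   PetscInt           its;
   PetscReal          rnorm;
   ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);
   ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);
   ierr = KSPGetResidualNorm(ksp,&rnorm);CHKERRQ(ierr);
   ierr = PetscPrintf(PETSC_COMM_WORLD,"reason=%d, iterations=%d, residual norm=%g\n",(int)reason,(int)its,(double)rnorm);CHKERRQ(ierr);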

Here is the relevant code:

// Set the matrix of the linear system
Mat Mcc;
ierr=MatCreate(PETSC_COMM_WORLD,&Mcc); CHKERRQ(ierr);
ierr=MatSetType(Mcc, MATMPIAIJ); CHKERRQ(ierr);
ierr=MatSetSizes(Mcc,PETSC_DECIDE,PETSC_DECIDE,1000,1000); CHKERRQ(ierr);
ierr=setMatrix(Mcc); // A routine that sets the values of the matrix Mcc
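
For context, a generic preallocation/assembly sketch for an MPIAIJ matrix of
this kind (the per-row nonzero counts are placeholders, not what setMatrix
actually uses):

   ierr = MatMPIAIJSetPreallocation(Mcc,5,PETSC_NULL,5,PETSC_NULL);CHKERRQ(ierr); /* placeholder nonzero counts */
   /* ... MatSetValues(...) for the locally owned rows ... */
   ierr = MatAssemblyBegin(Mcc,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
   ierr = MatAssemblyEnd(Mcc,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);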

// Set up the KSP solver with multigrid as the preconditioner
   KSP ksp, kspCoarseSolve;
   PC  pc;
   ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
   ierr = KSPSetType(ksp,KSPRICHARDSON);CHKERRQ(ierr);
   ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
   ierr = PCSetType(pc,PCMG);CHKERRQ(ierr);
   ierr = PCMGSetLevels(pc,1,&PETSC_COMM_WORLD);CHKERRQ(ierr);
   ierr = PCMGSetType(pc,PC_MG_MULTIPLICATIVE);CHKERRQ(ierr);
   ierr = PCMGGetCoarseSolve(pc,&kspCoarseSolve);CHKERRQ(ierr);
   ierr = KSPSetOperators(kspCoarseSolve,Mcc,Mcc,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
   ierr = KSPSetTolerances(kspCoarseSolve,1.e-12,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
   ierr = KSPSetOperators(ksp,Mcc,Mcc,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
   ierr = KSPSetTolerances(ksp,1.e-12,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
   ierr = KSPSolve(ksp,RHS,U);CHKERRQ(ierr);

// Solve with the standard KSPSolve
KSP ksp1;
ierr = KSPCreate(PETSC_COMM_WORLD,&ksp1);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp1,Mcc,Mcc,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
ierr = KSPSetTolerances(ksp1,1.e-12/(2*nn123),PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
ierr = KSPSolve(ksp1,RHS,U1);CHKERRQ(ierr);


In the end, the vectors U and U1 are different.
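
The size of the discrepancy can be quantified with a difference norm, e.g.
(a minimal sketch):

   Vec       diff;
   PetscReal nrm;
   ierr = VecDuplicate(U,&diff);CHKERRQ(ierr);
   ierr = VecCopy(U,diff);CHKERRQ(ierr);
   ierr = VecAXPY(diff,-1.0,U1);CHKERRQ(ierr);   /* diff = U - U1 */
   ierr = VecNorm(diff,NORM_2,&nrm);CHKERRQ(ierr);
   ierr = PetscPrintf(PETSC_COMM_WORLD,"||U - U1|| = %g\n",(double)nrm);CHKERRQ(ierr);
   ierr = VecDestroy(&diff);CHKERRQ(ierr);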
Thank you.

Best regards,
Armando


