[petsc-users] on the performance of MPI PETSc

Sun, Hui hus003 at ucsd.edu
Fri Jun 5 11:51:14 CDT 2015


Thank you Jed. I see. 

I have another question: PETSc uses multigrid as a preconditioner. How do I specify the options so that multigrid becomes the solver itself? Is it by doing:
./ex34 -pc_type mg -pc_mg_type full -ksp_type richardson -ksp_monitor_short -pc_mg_levels 3 -mg_coarse_pc_factor_shift_type nonzero
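
For reference, a minimal C sketch of the equivalent programmatic setup, assuming a KSP that has already been created (e.g. from the DMDA in ex34); the helper name UseMGAsSolver is made up for illustration:

#include <petscksp.h>

/* Sketch: configure an existing KSP so that a 3-level full multigrid
   cycle acts as the solver, roughly equivalent to
   -ksp_type richardson -pc_type mg -pc_mg_type full -pc_mg_levels 3
   on the command line. */
PetscErrorCode UseMGAsSolver(KSP ksp)
{
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPSetType(ksp, KSPRICHARDSON);CHKERRQ(ierr); /* MG cycle is the iteration itself */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCMG);CHKERRQ(ierr);
  ierr = PCMGSetLevels(pc, 3, NULL);CHKERRQ(ierr);
  ierr = PCMGSetType(pc, PC_MG_FULL);CHKERRQ(ierr);
  /* Coarse-grid shift, monitors, etc. can still come from the options database */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  return 0;
}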

Between MG as a preconditioner and MG as a solver, which gives better performance, and why?

Best, 
Hui

________________________________________
From: Jed Brown [jed at jedbrown.org]
Sent: Friday, June 05, 2015 9:45 AM
To: Sun, Hui; Barry Smith
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] on the performance of MPI PETSc

"Sun, Hui" <hus003 at ucsd.edu> writes:
> If I run ex34 using the command:
> ./ex34 -pc_type none -ksp_type gmres -ksp_monitor_short
>
> I get the following output:
>   0 KSP Residual norm 0.0289149
>   1 KSP Residual norm < 1.e-11
> Residual norm 9.14804e-15
> Error norm 0.00020064
> Error norm 5.18301e-05
> Error norm 4.90288e-08
> Time cost: 1.60657 4.67008 0.0784049
>
> From the output, it seems that solving Poisson without a
> preconditioner is actually faster than using multigrid as a
> preconditioner. I think multigrid should do better than that.

You can't beat one iteration with no preconditioner.  The right-hand side
(and thus the solution) is an eigenvector, so GMRES without preconditioning
always converges in one iteration.  The example could not be worse for
testing solvers.
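
A brief sketch of the argument, assuming the right-hand side b satisfies
A b = \lambda b with \lambda \neq 0:

\[
  x = \tfrac{1}{\lambda}\, b \in \mathcal{K}_1(A, b) = \operatorname{span}\{b\},
  \qquad
  A x = \tfrac{1}{\lambda}\, A b = b ,
\]

so the exact solution already lies in the first Krylov space, the
minimal-residual iterate has zero residual, and GMRES converges in a
single iteration (up to rounding), independent of any preconditioner.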

