[petsc-users] Questions about multigrid preconditioner and multigrid level

Smith, Barry F. bsmith at mcs.anl.gov
Fri Sep 20 00:53:47 CDT 2019



> On Sep 20, 2019, at 12:35 AM, Young Hyun Jo <bcjo17 at gmail.com> wrote:
> 
> Thanks for the answer.
> It's really helpful to understand the PETSc library.
> By the way, I just want to ask two more questions not related to the multigrid.
> 
> 1. Is there a method known as the fastest solver for Poisson's equation with the central difference scheme in the PETSc library?
> I want to clarify that I need the fastest method, not the method with the fewest iterations.

   Geometric multigrid. For larger problems it will be significantly faster than anything else.

   But for a fixed, "moderate" problem size it could be something like CG with a relatively simple preconditioner. Determining the sizes for which one method is fastest comes down to experiments: run the problem size you need with the various methods. NEVER run on a different problem size to make a selection for another size; this can lead to bad choices, since the size of the problem has a large effect on which method is fastest.
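
   For concreteness, here is a minimal, untested sketch in the spirit of the KSP tutorial ex45.c: CG with geometric multigrid on a DMDA for a vertex-centered 7-point Laplacian with Dirichlet boundaries. The unit-cube spacing and the constant right-hand side below are placeholders; only the wiring matters here. The candidate methods can then be timed with -log_view, e.g. -pc_type mg -pc_mg_levels 7 versus -ksp_type cg -pc_type jacobi or -ksp_type cg -pc_type gamg.

#include <petscdmda.h>
#include <petscksp.h>

static PetscErrorCode ComputeRHS(KSP ksp, Vec b, void *ctx)
{
  PetscErrorCode ierr;
  PetscFunctionBeginUser;
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);           /* placeholder right-hand side */
  PetscFunctionReturn(0);
}

static PetscErrorCode ComputeMatrix(KSP ksp, Mat J, Mat jac, void *ctx)
{
  DM             da;
  DMDALocalInfo  info;
  PetscInt       i, j, k;
  PetscReal      h;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetDM(ksp, &da);CHKERRQ(ierr);       /* the DM of the current multigrid level */
  ierr = DMDAGetLocalInfo(da, &info);CHKERRQ(ierr);
  h = 1.0 / (info.mx - 1);                       /* uniform spacing on the unit cube assumed */
  for (k = info.zs; k < info.zs + info.zm; k++) {
    for (j = info.ys; j < info.ys + info.ym; j++) {
      for (i = info.xs; i < info.xs + info.xm; i++) {
        MatStencil  row, col[7];
        PetscScalar v[7];
        PetscInt    n = 0;
        row.i = i; row.j = j; row.k = k; row.c = 0;
        if (i == 0 || j == 0 || k == 0 || i == info.mx - 1 || j == info.my - 1 || k == info.mz - 1) {
          col[0] = row; v[0] = 1.0; n = 1;       /* decoupled Dirichlet boundary row */
        } else {
          /* couple only to interior neighbours so the matrix stays symmetric;
             for nonzero boundary data the dropped terms would go into the RHS */
          col[n] = row; v[n] = 6.0 / (h * h); n++;
          if (i - 1 > 0)           { col[n] = row; col[n].i = i - 1; v[n] = -1.0 / (h * h); n++; }
          if (i + 1 < info.mx - 1) { col[n] = row; col[n].i = i + 1; v[n] = -1.0 / (h * h); n++; }
          if (j - 1 > 0)           { col[n] = row; col[n].j = j - 1; v[n] = -1.0 / (h * h); n++; }
          if (j + 1 < info.my - 1) { col[n] = row; col[n].j = j + 1; v[n] = -1.0 / (h * h); n++; }
          if (k - 1 > 0)           { col[n] = row; col[n].k = k - 1; v[n] = -1.0 / (h * h); n++; }
          if (k + 1 < info.mz - 1) { col[n] = row; col[n].k = k + 1; v[n] = -1.0 / (h * h); n++; }
        }
        ierr = MatSetValuesStencil(jac, 1, &row, n, col, v, INSERT_VALUES);CHKERRQ(ierr);
      }
    }
  }
  ierr = MatAssemblyBegin(jac, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(jac, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

int main(int argc, char **argv)
{
  DM             da;
  KSP            ksp;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* 129 points per direction admits the nested hierarchy 129 -> 65 -> 33 -> 17 -> 9 -> 5 -> 3 */
  ierr = DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, 129, 129, 129,
                      PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1, NULL, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetFromOptions(da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetDM(ksp, da);CHKERRQ(ierr);
  ierr = KSPSetComputeRHS(ksp, ComputeRHS, NULL);CHKERRQ(ierr);
  ierr = KSPSetComputeOperators(ksp, ComputeMatrix, NULL);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* e.g. -pc_type mg -pc_mg_levels 7 -ksp_view -log_view */
  ierr = KSPSolve(ksp, NULL, NULL);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}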

> I currently use the PCG (KSPCG) method without any preconditioner, and I'm trying other methods too, but I haven't found a faster one yet.
> 
> 2. What kind of preconditioner is used in 'KSPCG'?

  You can run with -ksp_view to see exactly what solver and parameters are used. 

  By default PETSc uses block Jacobi with ILU(0) on each block. 
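
  For example, something along these lines (the executable name here is just a placeholder) prints the complete solver configuration:

mpiexec -n 8 ./poisson_solver -ksp_type cg -ksp_view

  On more than one process the default corresponds to the options -pc_type bjacobi -sub_pc_type ilu; since a Poisson matrix is symmetric positive definite, -sub_pc_type icc on the blocks is another standard option worth timing.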

> I have written my own solver using PCG with an incomplete Cholesky factorization preconditioner, as described at https://en.wikipedia.org/wiki/Conjugate_gradient_method, and my solver takes more iterations than KSPCG.
> So I'm wondering what the default preconditioner for KSPCG is, and whether it's usually the fastest one.

   We made this one the default because it is fairly good for a range of problems. 

   I can't explain why your incomplete Cholesky should require more iterations than our default. I would expect them to be pretty similar.
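
   If you want to compare your hand-written version directly against PETSc's incomplete Cholesky, a minimal sketch (sequential; in parallel one would normally use -pc_type bjacobi -sub_pc_type icc instead) could look like the following. The function name and the assumption that A, b, and x are already assembled are only for illustration.

#include <petscksp.h>

PetscErrorCode solve_cg_icc(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPCreate(PETSC_COMM_SELF, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);   /* conjugate gradient */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCICC);CHKERRQ(ierr);     /* incomplete Cholesky, ICC(0) by default */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* honor -ksp_view, -ksp_monitor, ... */
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}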

   Barry

> 
> 
> 
> 
> On Fri, Sep 20, 2019 at 2:19 PM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
> 
>    The DMDA structured grid management in PETSc does not provide the needed interpolations for doing non-nested multigrid. The algebraic portions of PETSc's geometric multigrid would work fine for that case if you have your own way to provide the needed interpolation. I'll note that mathematically the interpolation is completely straightforward, but the practical issues of computing such interpolations and managing the non-nested nature of the grids in MPI are nontrivial: not impossible or even particularly difficult, but they require careful thought and coding. The PETSc team doesn't have the resources or the need to develop this ability. I can only suggest sticking to the grid sizes where there is a natural nesting of the mesh points; I don't think the coding effort is worth the benefit.
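> 
>    A rough sketch of what that wiring could look like, assuming you have already built the interpolation matrices yourself (everything below, including the function name and the P[] array, is purely illustrative):
> 
> #include <petscksp.h>
> 
> /* P[l] interpolates from level l-1 (coarser) to level l (finer), 1 <= l < nlevels;
>    building these matrices for non-nested grids is the part PETSc does not do for you */
> PetscErrorCode setup_custom_mg(KSP ksp, PetscInt nlevels, Mat *P)
> {
>   PC             pc;
>   PetscInt       l;
>   PetscErrorCode ierr;
> 
>   PetscFunctionBeginUser;
>   ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>   ierr = PCSetType(pc, PCMG);CHKERRQ(ierr);
>   ierr = PCMGSetLevels(pc, nlevels, NULL);CHKERRQ(ierr);
>   ierr = PCMGSetGalerkin(pc, PC_MG_GALERKIN_BOTH);CHKERRQ(ierr); /* form coarse operators as R A P */
>   for (l = 1; l < nlevels; l++) {
>     ierr = PCMGSetInterpolation(pc, l, P[l]);CHKERRQ(ierr);      /* restriction defaults to P^T */
>   }
>   PetscFunctionReturn(0);
> }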
> 
>    Barry
> 
> 
> > On Sep 19, 2019, at 11:50 PM, Young Hyun Jo <bcjo17 at gmail.com> wrote:
> > 
> > 
> > Oh, I'm sorry. You're right.
> > I use Dirichlet boundary conditions with a central difference scheme.
> > I mentioned 130 grid points, but my actual case is below, and I get these errors:
> > 
> > DM Object: 8 MPI processes
> >   type: da
> > Processor [0] M 127 N 127 P 62 m 2 n 2 p 2 w 1 s 1
> > X range of indices: 0 64, Y range of indices: 0 64, Z range of indices: 0 31
> > Processor [1] M 127 N 127 P 62 m 2 n 2 p 2 w 1 s 1
> > X range of indices: 64 127, Y range of indices: 0 64, Z range of indices: 0 31
> > Processor [2] M 127 N 127 P 62 m 2 n 2 p 2 w 1 s 1
> > X range of indices: 0 64, Y range of indices: 64 127, Z range of indices: 0 31
> > Processor [3] M 127 N 127 P 62 m 2 n 2 p 2 w 1 s 1
> > X range of indices: 64 127, Y range of indices: 64 127, Z range of indices: 0 31
> > Processor [4] M 127 N 127 P 62 m 2 n 2 p 2 w 1 s 1
> > X range of indices: 0 64, Y range of indices: 0 64, Z range of indices: 31 62
> > Processor [5] M 127 N 127 P 62 m 2 n 2 p 2 w 1 s 1
> > X range of indices: 64 127, Y range of indices: 0 64, Z range of indices: 31 62
> > Processor [6] M 127 N 127 P 62 m 2 n 2 p 2 w 1 s 1
> > X range of indices: 0 64, Y range of indices: 64 127, Z range of indices: 31 62
> > Processor [7] M 127 N 127 P 62 m 2 n 2 p 2 w 1 s 1
> > X range of indices: 64 127, Y range of indices: 64 127, Z range of indices: 31 62
> > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [0]PETSC ERROR: Arguments are incompatible
> > [0]PETSC ERROR: Ratio between levels: (mz - 1)/(Mz - 1) must be integer: mz 62 Mz 31
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.11.3, Jun, 26, 2019 
> > [0]PETSC ERROR: ../../../eclipse-workspace/PIC3DXYZ/PIC3DXYZ_MPI/Debug/PIC3DXYZ_MPI on a  named mn0 by bcjo17 Fri Sep 20 13:44:29 2019
> > [0]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --prefix=/home/bcjo17/petsc_mpiicc --with-mpi=1 --with-blaslapack-dir=/home/bcjo17/intel/compilers_and_libraries_2019.3.199/linux/mkl
> > [0]PETSC ERROR: #1 DMCreateInterpolation_DA_3D_Q1() line 773 in /home/bcjo17/Downloads/petsc-3.11.3/src/dm/impls/da/dainterp.c
> > [0]PETSC ERROR: #2 DMCreateInterpolation_DA() line 1039 in /home/bcjo17/Downloads/petsc-3.11.3/src/dm/impls/da/dainterp.c
> > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [1]PETSC ERROR: Arguments are incompatible
> > [1]PETSC ERROR: Ratio between levels: (mz - 1)/(Mz - 1) must be integer: mz 62 Mz 31
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [1]PETSC ERROR: Petsc Release Version 3.11.3, Jun, 26, 2019 
> > [1]PETSC ERROR: ../../../eclipse-workspace/PIC3DXYZ/PIC3DXYZ_MPI/Debug/PIC3DXYZ_MPI on a  named mn0 by bcjo17 Fri Sep 20 13:44:29 2019
> > [1]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --prefix=/home/bcjo17/petsc_mpiicc --with-mpi=1 --with-blaslapack-dir=/home/bcjo17/intel/compilers_and_libraries_2019.3.199/linux/mkl
> > [1]PETSC ERROR: #1 DMCreateInterpolation_DA_3D_Q1() line 773 in /home/bcjo17/Downloads/petsc-3.11.3/src/dm/impls/da/dainterp.c
> > [1]PETSC ERROR: #2 DMCreateInterpolation_DA() line 1039 in /home/bcjo17/Downloads/petsc-3.11.3/src/dm/impls/da/dainterp.c
> > [1]PETSC ERROR: #3 DMCreateInterpolation() line 1114 in /home/bcjo17/Downloads/petsc-3.11.3/src/dm/interface/dm.c
> > [1]PETSC ERROR: #4 PCSetUp_MG() line 684 in /home/bcjo17/Downloads/petsc-3.11.3/src/ksp/pc/impls/mg/mg.c
> > [1]PETSC ERROR: #5 PCSetUp() line 932 in /home/bcjo17/Downloads/petsc-3.11.3/src/ksp/pc/interface/precon.c
> > [1]PETSC ERROR: #6 KSPSetUp() line 391 in /home/bcjo17/Downloads/petsc-3.11.3/src/ksp/ksp/interface/itfunc.c
> > [1]PETSC ERROR: #7 KSPSolve() line 725 in /home/bcjo17/Downloads/petsc-3.11.3/src/ksp/ksp/interface/itfunc.c
> > ... same messages for other processors  ...
> > 
> > The message 'Ratio between levels: (mz - 1)/(Mz - 1) must be integer: mz 62 Mz 31' made me think that there are some restrictions on using multigrid.
> > I agree that people usually use a natural hierarchy for the multigrid method, but I just want to know whether it is possible or not.
> > So, could you please let me know how I can make it possible?
> > 
> > 
> > 
> > On Fri, Sep 20, 2019 at 1:08 PM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
> > 
> > 
> >   You didn't indicate "why" you can't use multiple levels with "130 grids"; is there some error message? Nor do you mention whether you have periodic boundary conditions or whether you are using cell- or vertex-centered unknowns. All of these things affect whether you can coarsen for multigrid or not.
> > 
> >   Consider one simple case: Dirichlet boundary conditions with vertex-centered unknowns. I show the fine grid points as | and the coarse grid points as *:
> > 
> > 
> >    |          |          |          |          |
> > 
> >    *                     *                     *
> > 
> > 
> >    Now consider 4 points, 
> > 
> >    |          |          |          |
> > 
> >    Where am I going to put the coarse points?
> > 
> >    It is possible to do multigrid with non-nested degrees of freedom, like
> > 
> >    |          |          |          |
> >    *              *              *
> > 
> >  but that is really uncommon; nobody does it. People just use grid sizes that have a natural hierarchy of nested coarser grids.
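> > 
> >    For instance, with vertex-centered unknowns and a coarsening ratio of 2, a fine grid with m points coarsens to one with (m - 1)/2 + 1 points, so m must stay odd on every level: 129 -> 65 -> 33 -> 17 -> 9 -> 5 -> 3 gives a natural seven-level hierarchy, while 130 cannot be coarsened even once.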
> > 
> >    Barry
> > 
> > 
> > 
> > > On Sep 19, 2019, at 10:48 PM, Young Hyun Jo via petsc-users <petsc-users at mcs.anl.gov> wrote:
> > > 
> > > 
> > > Hello, I'm Young Hyun Jo, and I study plasma physics and particle-in-cell simulation.
> > > Currently, I'm using PETSc to solve a 3D Poisson's equation in the FDM scheme.
> > > I have one question about the multigrid preconditioner.
> > > When I use PCG (KSPCG) with the multigrid preconditioner (PCMG), I get an error if I don't use an appropriate number of multigrid levels for the grid size.
> > > For example, if I use 129 grid points, I can use 7 multigrid levels.
> > > However, if I use 130 grid points, I can't use more than one multigrid level.
> > > So it seems that the number of grid points needs to be of the form (2^n + 1) to use the multigrid preconditioner.
> > > Is it correct that the multigrid preconditioner has some restrictions on the grid sizes that can be used?
> > > Knowing this would really help me use PETSc properly.
> > > Thanks in advance.
> > > 
> > > Sincerely,
> > > Young Hyun Jo
> > 
> 


