[petsc-users] questions about the multigrid framework
Peter Wang
pengxwang at hotmail.com
Wed Feb 9 10:44:24 CST 2011
Thanks, Matt,
Did you mean that all the Krylov methods alone will get worse as the number of grid points increases, since a finer grid has a smaller grid spacing and more grid points?
Since I am a new user of PETSc, the easiest way for me is still to stay with the KSP solver. However, if that solver cannot satisfy the speed requirement, I am thinking of using an MG method. I don't have any experience with multigrid, though, so could you please give me some suggestions on it?
1. Since I have built the matrix and the vector for the finite difference scheme in the KSP solver, where should I start in order to move to multigrid? I studied the example in src/ksp/ksp/examples/tutorials/ex22f.F. Is it a good prototype on which to base my own code? Is DMMG the best tool for my problem? (See the sketch after these questions for my current understanding.)
2. How many grid levels are needed to obtain a solution close enough to the exact solution for a problem spanning 6 orders of magnitude in length scale?
3. Is the procedure for building the matrix and the RHS vector in the MG method to build them on the finest grid level, with MG then starting the computation from the coarsest level?
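To make questions 1 and 3 concrete, here is my current understanding of the DMMG skeleton, written in C following ex22.c (the C companion of ex22f.F). ComputeRHS and ComputeMatrix below are only trivial stand-ins for my real finite-difference assembly, and the exact callback signatures may differ between PETSc versions, so please correct me if this sketch is off:

#include "petscda.h"
#include "petscksp.h"
#include "petscdmmg.h"

/* Stand-ins for my real assembly routines. As I understand it, DMMG calls
   ComputeMatrix once per level, each time with that level's DA, so I write
   the stencil assembly once and DMMG builds every level from it. */
PetscErrorCode ComputeRHS(DMMG dmmg,Vec b)
{
  PetscErrorCode ierr;
  ierr = VecSet(b,1.0);CHKERRQ(ierr);            /* real RHS assembly goes here */
  return 0;
}

PetscErrorCode ComputeMatrix(DMMG dmmg,Mat J,Mat B)
{
  DA             da = (DA)dmmg->dm;              /* the DA of this level */
  PetscErrorCode ierr;
  PetscInt       i,j,xs,ys,xm,ym;
  MatStencil     row;
  PetscScalar    v = 1.0;                        /* real 5-point stencil goes here */

  ierr = DAGetCorners(da,&xs,&ys,PETSC_NULL,&xm,&ym,PETSC_NULL);CHKERRQ(ierr);
  for (j=ys; j<ys+ym; j++) {
    for (i=xs; i<xs+xm; i++) {
      row.i = i; row.j = j;
      ierr = MatSetValuesStencil(B,1,&row,1,&row,&v,INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = MatAssemblyBegin(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  return 0;
}

int main(int argc,char **argv)
{
  DMMG           *dmmg;
  DA             da;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,(char*)0,(char*)0);CHKERRQ(ierr);

  /* 4 multigrid levels; the coarse 2d grid set below is refined by DMMG
     to build the finer levels */
  ierr = DMMGCreate(PETSC_COMM_WORLD,4,PETSC_NULL,&dmmg);CHKERRQ(ierr);
  ierr = DACreate2d(PETSC_COMM_WORLD,DA_NONPERIODIC,DA_STENCIL_STAR,
                    -5,-5,PETSC_DECIDE,PETSC_DECIDE,1,1,
                    PETSC_NULL,PETSC_NULL,&da);CHKERRQ(ierr);
  ierr = DMMGSetDM(dmmg,(DM)da);CHKERRQ(ierr);
  ierr = DADestroy(da);CHKERRQ(ierr);

  /* Register the per-level assembly callbacks and solve */
  ierr = DMMGSetKSP(dmmg,ComputeRHS,ComputeMatrix);CHKERRQ(ierr);
  ierr = DMMGSolve(dmmg);CHKERRQ(ierr);

  ierr = DMMGDestroy(dmmg);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}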
Thanks for your considerate response.
Date: Wed, 9 Feb 2011 10:00:37 -0600
From: knepley at gmail.com
To: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] questions about the multigrid framework
On Wed, Feb 9, 2011 at 9:58 AM, Peter Wang <pengxwang at hotmail.com> wrote:
Thanks Barry,
I ran the code with -ksp_monitor_true_residual -ksp_converged_reason, and it turns out that the computation had not really converged. After I set a smaller rtol and allowed more iterations, the numerical solution got better. However, the computation converges very slowly with finer grids. For example, with nx=2500 and ny=10000 (lx=2.5e-4, ly=1e-3, and the distribution varies mainly in the y direction),
at IT=72009,  true resid norm 1.638857052871e-01 ||Ae||/||Ax|| 9.159199925235e-07
at IT=400000, true resid norm 1.638852449299e-01 ||Ae||/||Ax|| 9.159174196917e-07,
and it still had not converged.
I am wondering: if the solver were changed, could the convergence speed become faster? Or should I take another approach in order to use finer grids, such as multigrid? Thanks for your help.
If you can get MG to work for your problem, it's optimal. All the Krylov methods alone will get worse with increasing grid size.
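For a structured-grid code, the easiest route is the DMMG examples (e.g. ex22), where the multigrid hierarchy is controlled from the command line, so you can experiment with the number of levels and the monitors without recompiling. Something along these lines (option names as used by the DMMG examples; substitute your own executable):

  ./ex22 -dmmg_nlevels 5 -ksp_monitor_true_residual -ksp_converged_reason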
Matt
> From: bsmith at mcs.anl.gov
> Date: Sun, 6 Feb 2011 21:30:56 -0600
> To: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] questions about the multigrid framework
>
>
> On Feb 6, 2011, at 5:00 PM, Peter Wang wrote:
>
> > Hello, I have some concerns about the multigrid framework in PETSc.
> >
> > We are trying to solve a two-dimensional problem with a large variety in length scales. The length of the computational domain is on the order of 1e3 m and the width is 1 m; nevertheless, there is a tiny object of 1e-3 m in a corner of the domain.
> >
> > As a first attempt, we tried to solve the problem with a large number of uniform or non-uniform grid points. However, the error of the numerical solution increases when the number of grid points is too large. To test the effect of the grid size on the solution, we solved a domain with a regular scale of 1 m by 1 m. We found that an extremely small grid size can lead to a large deviation from the exact solution. For example, the exact solution is a linear distribution in the domain. The numerical solution is linear, like the exact solution, when the grid is nx=1000 by ny=1000. However, with nx=10000 by ny=10000 the numerical solution becomes a nonlinear distribution in which only the boundary matches the exact solution.
>
> Stop right here. 99.9% of the time what you describe should not happen: with a finer grid your solution (for a problem with a known solution, for example) will be more accurate, and it won't suddenly get less accurate with a finer mesh.
>
> Are you running with -ksp_monitor_true_residual -ksp_converged_reason to make sure that it is converging, and using a smaller -ksp_rtol <tol> for more grid points? For example, with 10,000 grid points in each direction and no better idea of what the discretization error is, I would use a tol of 1.e-12.
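> For example (with "yourcode" standing in for your executable):
>
>   ./yourcode -ksp_monitor_true_residual -ksp_converged_reason -ksp_rtol 1.e-12
>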
>
> Barry
>
> We'll deal with the multigrid questions after we've resolved the more basic issues.
>
>
> > The solver I used is a KSP solver in PETSc, which is set by calling:
> > KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr). Is this solver unsuitable for a system with a very small grid size? Or can a problem crossing 6 orders of magnitude in length scale be solved with only a one-level grid system, provided there is enough memory for the large matrix? Since a one-level grid requires less coding work, it would be easier to implement the solver that way.
> >
> > I did some research on the website and found the slides by Barry at
> > http://www.mcs.anl.gov/petsc/petsc-2/documentation/tutorials/Columbia04/DDandMultigrid.pdf
> > It seems that the multigrid framework in PETSc is a possible approach to our problem, and we are thinking of turning to it. However, before we dig into it, there are some issues that confuse us. It would be great if we could get any suggestions from you:
> > 1 Can the multigrid framework handle a problem with a large variety in length scales (up to 6 orders of magnitude)? Is DMMG the best tool for our problem?
> >
> > 2 The coefficient matrix A and the right-hand-side vector b were created for the finite difference scheme of the domain and solved with the KSP solver (call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr)). Is it easy to migrate the created matrix A and vector b to the multigrid framework?
> >
> > 3 How many grid levels are needed to obtain a solution close enough to the exact solution for a problem spanning 6 orders of magnitude in length scale?
> >
>
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener