[petsc-dev] help for use PETSc

Barry Smith bsmith at mcs.anl.gov
Wed Jun 29 10:19:58 CDT 2011


On Jun 29, 2011, at 10:08 AM, Matthew Knepley wrote:

> On Wed, Jun 29, 2011 at 2:48 PM, tuane <tuane at lncc.br> wrote:
> Thank you for your help.
> Well, but if the redundant preconditioner uses a direct solve, what explains:
> 1. the number of iterations varying with the tolerance
> 2. the reduction in iterations when KSPSetInitialGuessNonzero is used
> When I use the jacobi preconditioner, the number of iterations remains high.
> 
> Redundant runs SOME preconditioner redundantly. If you specify nothing, it runs
> the default, which is GMRES/ILU(0). You can always use -ksp_view to see what
> you used.

   No, by default redundant uses LU with no Krylov method on each process.
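
   For example, you can see exactly what the redundant solve is doing with
-ksp_view, and change the inner solver through the redundant_ options prefix
(the particular choices below are only illustrative):

   -ksp_type cg -pc_type redundant -ksp_view
   -redundant_ksp_type gmres -redundant_pc_type ilu  # run GMRES/ILU(0) redundantly instead of the default LU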


> 
>    Matt
>  
> What about the tolerance? When I use the jacobi preconditioner I need a tolerance of 1.0e-14 to obtain the same result that the redundant preconditioner gives with a tolerance of 1.0e-10. I think that tolerance is too small for the jacobi preconditioner. What do you think?
> 
> Thanks again.
> Tuane
> 
> 
> 
> On Tue, 28 Jun 2011 14:00:06 -0500, Jed Brown wrote:
> On Tue, Jun 28, 2011 at 11:55, tuane  wrote:
> 
> cg+redundant-PC gives us the best result, similar to our original
> direct solver. We are not sure what “redundant” is.
> 
> Redundant means that the whole problem is solved redundantly (using a
> direct solver by default) on every process. It only makes sense as a
> coarse level solver.
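> For example, in a multigrid hierarchy it would normally be selected only on
> the coarsest level, something like (a sketch; the level count is a placeholder):
> 
> -pc_type mg -pc_mg_levels 3 -mg_coarse_pc_type redundant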
>   
> 
> Our results are being too dependent on these parameters:
> 
>   1. iterative solver (CG, GMRES, BCGS)
>   2. preconditioners (jacobi, redundant, ILU)
>   3. tolerance
>   4. number of processors
> 
> At this point, you should always run with -ksp_monitor_true_residual
> to make sure that it is really converging.
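> For example (a sketch; the solver, preconditioner, and tolerance are placeholders):
> 
> -ksp_type cg -pc_type jacobi -ksp_rtol 1.0e-10 -ksp_monitor_true_residual -ksp_converged_reason
> 
> The true residual is the one to compare across preconditioners; the
> preconditioned residual that KSP reports by default is scaled differently by
> each preconditioner, which is why different tolerances appear to be needed.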
> 
> We also tried to use algebraic multigrid.  We used the routines:
> call PCSetType (pc, PCMG, ierr)
> 
> This is not algebraic multigrid, it is geometric multigrid. If you
> use DMDA to manage the grid, then you could use geometric multigrid here.
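> For example, with a DMDA attached to the solver, geometric multigrid can be
> selected entirely from the options database (a sketch; the level count and
> smoother are placeholders):
> 
> -pc_type mg -pc_mg_levels 4 -mg_levels_pc_type sor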
> 
> But I think you are using a Raviart-Thomas mixed space in which case
> the default interpolants from DMDA are not going to work for you.
> 
> The simplest thing you can do is to use PCFieldSplit to eliminate the
> fluxes such that the preconditioner can work with the non-mixed
> (H^1-conforming) operator defined in the potential/pressure space.
> 
> The following won't work right now, but it should work soon. I'm
> describing it here for the others on petsc-dev. If you call
> 
> PCFieldSplitSetIS(pc,"u",is_fluxes);
> PCFieldSplitSetIS(pc,"p",is_potential);
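> 
> Here is_fluxes and is_potential are just index sets listing the global rows of
> the flux and potential unknowns, e.g. (a minimal sketch, assuming the fluxes
> are numbered first and the potentials follow, which may not match your ordering):
> 
> ISCreateStride(PETSC_COMM_WORLD,nflux,0,1,&is_fluxes);       /* rows 0 .. nflux-1          */
> ISCreateStride(PETSC_COMM_WORLD,npot,nflux,1,&is_potential); /* rows nflux .. nflux+npot-1 */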
> 
> and
> 
> -pc_type fieldsplit -pc_fieldsplit_type schur
> -fieldsplit_u_pc_type jacobi  # the (u,u) block is diagonal for lowest-order RT spaces
> -fieldsplit_p_pc_type ml      # or other multigrid; uses the SIMPLE approximation of the
>                               # Schur complement, which happens to be exact because the
>                               # (u,u) block is diagonal
> 
> This won't work right now because PCFieldSplit does not actually call
> MatGetSchurComplement() as it is designed. It would simplify fieldsplit.c to
> use MatGetSchurComplement(), but then MatGetSubMatrix() would be
> called twice for certain blocks in the matrix, once inside the Schur
> complement and once directly from fieldsplit.c. This is why I so want
> to make a mode in which the parent retains ownership and the caller
> gets an (intended to be) read-only reference when MatGetSubMatrix() and
> MatGetSchurComplement() are called.
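> 
> Concretely, the duplicated extraction would look roughly like this (a sketch,
> not the actual fieldsplit.c code):
> 
> MatGetSubMatrix(A,is_fluxes,is_fluxes,MAT_INITIAL_MATRIX,&Auu);      /* inside the Schur complement       */
> MatGetSubMatrix(A,is_fluxes,is_fluxes,MAT_INITIAL_MATRIX,&Auu_copy); /* again, directly from fieldsplit.c */
> 
> so the same submatrix is built and stored twice, which is what a read-only
> reference mode would avoid.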
> 
> 
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener



