[petsc-dev] help for using PETSc
Matthew Knepley
knepley at gmail.com
Tue Jun 28 17:05:43 CDT 2011
Jed has given excellent advice. However, your problems sound small. You
should try using a direct solver
like MUMPS (with --download-mumps during configure, and -ksp_type preonly
-pc_type lu -pc_factor_mat_solver_package mumps
during the solve).
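For concreteness, a minimal sketch of such a run (the executable name ./app
and the process count are placeholders, and MUMPS also needs ScaLAPACK, hence
the extra configure option):

  ./configure --download-mumps --download-scalapack
  mpiexec -n 4 ./app -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps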
Matt
On Tue, Jun 28, 2011 at 2:00 PM, Jed Brown <jed at 59a2.org> wrote:
> On Tue, Jun 28, 2011 at 11:55, tuane <tuane at lncc.br> wrote:
>
>> CG + redundant PC gives us the best result, similar to our original direct
>> solver. We are not sure what “redundant” is.
>>
>
> Redundant means that the whole problem is solved redundantly (using a
> direct solver by default) on every process. It only makes sense as a coarse
> level solver.
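> Concretely, its intended use as a coarse-level solver inside multigrid looks
> something like this (a sketch of typical options, not taken from the original
> thread):
>
>   -pc_type mg -pc_mg_levels 3    # geometric multigrid with 3 levels
>   -mg_coarse_pc_type redundant   # every process solves the whole coarse problem directly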
>
>
>> Our results are too dependent on these parameters:
>>
>> 1. iterative solver (CG, GMRES, BCGS)
>> 2. preconditioners (jacobi, redundant, ILU)
>> 3. tolerance
>> 4. number of processors
>>
>
> At this point, you should always run with -ksp_monitor_true_residual to
> make sure that it is really converging.
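> For example (a sketch; ./app and the solver choices are placeholders):
>
>   mpiexec -n 4 ./app -ksp_type cg -pc_type jacobi \
>       -ksp_monitor_true_residual -ksp_converged_reason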
>
>> We also tried to use algebraic multigrid, but without success: an error
>> occurs during program execution. We used the routine:
>> call PCSetType(pc, PCMG, ierr)
>>
>
> This is not algebraic multigrid; it is geometric multigrid. If you use DMDA
> to manage the grid, then you could use geometric multigrid here.
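> Genuine algebraic multigrid means selecting an AMG implementation instead.
> A sketch, assuming PETSc was configured with --download-ml (or
> --download-hypre for BoomerAMG):
>
>   call PCSetType(pc, PCML, ierr)   ! smoothed-aggregation AMG from the ML package
>
> or, equivalently, -pc_type ml (or -pc_type hypre -pc_hypre_type boomeramg) on
> the command line.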
>
> But I think you are using a Raviart-Thomas mixed space, in which case the
> default interpolants from DMDA are not going to work for you.
>
> The simplest thing you can do is to use PCFieldSplit to eliminate the
> fluxes such that the preconditioner can work with the non-mixed
> (H^1-conforming) operator defined in the potential/pressure space.
>
>
> The following won't work right now, but it should work soon. I'm describing
> it here for the others on petsc-dev. If you call
>
> PCFieldSplitSetIS(pc,"u",is_fluxes);
> PCFieldSplitSetIS(pc,"p",is_potential);
>
> and
>
> -pc_type fieldsplit -pc_fieldsplit_type schur
> -fieldsplit_u_pc_type jacobi  # the (u,u) block is diagonal for lowest-order RT spaces
> -fieldsplit_p_pc_type ml      # or other multigrid; uses the SIMPLE approximation of the
>                               # Schur complement, which happens to be exact here because
>                               # the (u,u) block is diagonal
>
>
> This won't work right now because PCFieldSplit does not actually call
> MatGetSchurComplement() as designed. It would simplify fieldsplit.c to use
> MatGetSchurComplement(), but then MatGetSubMatrix() would be called twice
> for certain blocks in the matrix: once inside the Schur complement and once
> directly from fieldsplit.c. This is why I want to add a mode in which the
> parent retains ownership and the caller gets an (intended to be) read-only
> reference when MatGetSubMatrix() and MatGetSchurComplement() are called.
>
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener