[petsc-users] Roles of ksp_type and pc_type

Matthew Knepley knepley at gmail.com
Wed Nov 19 06:19:05 CST 2014


On Mon, Nov 17, 2014 at 4:27 PM, Massimiliano Leoni <
leoni.massimiliano1 at gmail.com> wrote:

> Hi again,
>   I've been experimenting further with fieldsplit, following your slides,
> but
> for some reason -pc_fieldsplit_type multiplicative and schur segfault at
> KSPSolve. Additive seems to work fine.
>

It looks to me like you have memory corruption in a different part of the
code. I would build a debugging version and consider running under valgrind.
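
A typical workflow for that (the configure flag and the valgrind invocation
follow the usual PETSc debugging advice; `ex62` here just stands in for
whatever binary is failing, so adjust names and paths to your build):

```shell
# 1. Rebuild PETSc with debugging enabled so error checking is active:
./configure --with-debugging=1 && make all

# 2. Re-run the failing case under valgrind; "-malloc off" turns off PETSc's
#    own allocator so valgrind can see the raw malloc/free traffic:
valgrind --tool=memcheck -q --num-callers=20 --log-file=valgrind.log.%p \
    ./ex62 -malloc off -pc_type fieldsplit -pc_fieldsplit_type multiplicative
```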

> I could not reproduce ex62 as in your slides because I don't have the header
> file, and the script PetscGenerateFEMQuadrature.py, which I downloaded from
> the github repo, complains that the PETSC.FEM module is missing. So I used my
> code, which is supposed to assemble the same matrix, and then I call
> KSPSetOperators with the Stokes matrix as both the matrix and the
> preconditioner.
>
>

That is no longer necessary. It just runs out of the box. The slides from
SIAM CS&E 2013
show this. Here is a sample run

  cd src/snes/examples/tutorials
  make ex62
  ./ex62 -run_type full -refinement_limit 0.00625 -bc_type dirichlet
-interpolate 1 -vel_petscspace_order 2 -pres_petscspace_order 1 -ksp_type
fgmres -ksp_gmres_restart 100 -ksp_rtol 1.0e-9 -pc_type fieldsplit
-pc_fieldsplit_type schur -pc_fieldsplit_schur_factorization_type full
-fieldsplit_pressure_ksp_rtol 1e-10 -fieldsplit_velocity_ksp_type gmres
-fieldsplit_velocity_pc_type lu -fieldsplit_pressure_pc_type jacobi
-snes_monitor_short -ksp_monitor_short -snes_converged_reason
-ksp_converged_reason -snes_view -show_solution 0

  Thanks,

     Matt

> I use the same options as in slide 115: the "block Jacobi, inexact" version
> works, while the "block Gauss-Seidel, inexact" one segfaults.
>
> I attach backtrace and log files, according to what I was told last time.
>
> Thanks for any help,
> Massimiliano
>
>
>
> On Thursday, 13 November 2014 at 15:07:40, Massimiliano Leoni wrote:
> > On Thursday, 13 November 2014 at 06:39:38, Matthew Knepley wrote:
> > > On Thu, Nov 13, 2014 at 6:24 AM, Massimiliano Leoni <
> > >
> > > This is not exactly right because here you are just using additive
> > > fieldsplit.
> >
> > As far as I understood from the user manual, this should be right because
> > my preconditioner is block diagonal, so I want to use block Jacobi.
> >
> > If I used the other one, P = [[C,O],[B,Mp/nu]] then I would need a
> > multiplicative fieldsplit because this would be lower triangular.
> >
> > Is this correct?
> >
> > > The mass matrix
> > > is a good preconditioner for the Schur complement, not the zero matrix.
> >
> > > Take a look at these slides:
> > What do you mean?
> > I know from theory that the preconditioner I am using is optimal
> [iteration
> > number independent of grid size].
> >
> > I read the slides and the manual; the Schur complement is what I was
> > talking about earlier, so it's not what I want to do now -- even though
> > knowing that it's so easy to implement may be very useful.
> >
> > Anyway, thanks for the slides, they have been enlightening on the power
> of
> > command line options!
> >
> > >   Thanks,
> > >
> > >     Matt
> >
> > Thanks again,
> > Massimiliano
>
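
Since the thread hinges on the difference between additive and multiplicative
fieldsplit, here is a small NumPy sketch (purely illustrative, not PETSc) of
the point above: a block-diagonal preconditioner is applied additively (block
Jacobi, each diagonal block solved independently), while a block
lower-triangular one like P = [[C,0],[B,M]] is applied multiplicatively (block
Gauss-Seidel), i.e. as a forward substitution over the blocks. The matrices
are random stand-ins for the velocity, coupling, and pressure-mass blocks.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 4, 3
C = rng.random((n1, n1)) + n1 * np.eye(n1)  # velocity-block stand-in
B = rng.random((n2, n1))                    # coupling-block stand-in
M = rng.random((n2, n2)) + n2 * np.eye(n2)  # pressure-mass-block stand-in

r = rng.random(n1 + n2)
r1, r2 = r[:n1], r[n1:]

# Additive fieldsplit (block Jacobi): P = diag(C, M),
# so each block is solved independently of the other.
y_add = np.concatenate([np.linalg.solve(C, r1), np.linalg.solve(M, r2)])

# Multiplicative fieldsplit (block Gauss-Seidel): P = [[C, 0], [B, M]] is
# block lower triangular, so applying its inverse is a forward substitution:
y1 = np.linalg.solve(C, r1)           # solve the first block...
y2 = np.linalg.solve(M, r2 - B @ y1)  # ...then feed it into the second
y_mult = np.concatenate([y1, y2])

# Check the forward substitution against a direct solve with the full P.
P = np.block([[C, np.zeros((n1, n2))], [B, M]])
assert np.allclose(P @ y_mult, r)
```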



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener