[petsc-users] Print the KSP for Schur complement

John johnlucassaturday at gmail.com
Mon Mar 12 13:14:58 CDT 2018


I see. This solver is embedded in the definition of the Schur complement *S*
(i.e., MatCreateSchurComplement). I explicitly generate a Schur
approximation matrix *S_hat*, then assign S and S_hat to a linear solver
via KSPSetOperators(*ksp_p*, *S*, *S_hat*). This linear solver *ksp_p*
corresponds to *fieldsplit_p_* in my understanding, and the action of S
on a vector involves a solver like *fieldsplit_p_inner*. I only call the
ksp_p object to solve the Schur equation, which involves multiple calls to
the inner solver to invert A_00. Shall I call KSPView after the KSPSolve
of ksp_p? I hope my understanding of the action of fieldsplit_ is
right.
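
In code, the setup described above looks roughly like this (a sketch only:
error checking is omitted, and the block matrices A, B, C, D, the
right-hand side rhs, and the solution vector sol are assumed names, not
the actual application code):

#include <petscksp.h>

Mat S;       /* implicit Schur complement: D - C * inv(A) * B      */
KSP ksp_p;   /* outer Krylov solver for the Schur equation         */

/* S applies inv(A) matrix-free through an embedded inner KSP */
MatCreateSchurComplement(A, A, B, C, D, &S);

KSPCreate(PETSC_COMM_WORLD, &ksp_p);
KSPSetOperators(ksp_p, S, S_hat);  /* S_hat preconditions the solve */
KSPSetFromOptions(ksp_p);

/* Each application of S inside KSPSolve calls the inner solver on A_00 */
KSPSolve(ksp_p, rhs, sol);

/* View only after the solve, so the inner PC is fully set up */
KSPView(ksp_p, PETSC_VIEWER_STDOUT_WORLD);

/* Or view just the solver embedded in S */
KSP ksp_inner;
MatSchurComplementGetKSP(S, &ksp_inner);
KSPView(ksp_inner, PETSC_VIEWER_STDOUT_WORLD);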

Thanks!

John

2018-03-12 11:02 GMT-07:00 Smith, Barry F. <bsmith at mcs.anl.gov>:

>
>   If you call KSPView() after you have fully constructed and used the
> Schur complement, then it should print all the information without the
> "PC has not been set up so information may be incomplete" message; it
> only has that information once the solve is complete.
>
>    Barry
>
>
> > On Mar 12, 2018, at 12:49 PM, John <johnlucassaturday at gmail.com> wrote:
> >
> > Hi PETSc team:
> >
> > I am trying to analyze my code, which explicitly builds a Schur
> complement.
> >
> > I tried to print the inner solver for the Schur complement by calling
> >
> > MatSchurComplementGetKSP(S, &ksp_s);
> > KSPView(ksp_s, PETSC_VIEWER_STDOUT_WORLD);
> >
> > However, the info printed on screen does not fully display the PC info:
> >
> > PC Object: (pc0_inner_) 48 MPI processes
> >   type: ml
> >   PC has not been set up so information may be incomplete
> >     type is MULTIPLICATIVE, levels=0 cycles=unknown
> >       Cycles per PCApply=0
> >       Using externally compute Galerkin coarse grid matrices
> >   linear system matrix = precond matrix:
> >   Mat Object: 48 MPI processes
> >     type: mpiaij
> >     rows=823875, cols=823875
> >     total: nonzeros=35031501, allocated nonzeros=35031501
> >     total number of mallocs used during MatSetValues calls =0
> >       using I-node (on process 0) routines: found 5615 nodes, limit used is 5
> >
> >
> > When I instead run with PETSc's fieldsplit (using the -fieldsplit_p_inner_
> option prefix), I can get all the inner solver info on screen (see the
> options sketch after this thread).
> >
> > Is there a way to print the solver used inside the definition of the Schur
> complement (the one that inverts the A_00 matrix) in my case?
> >
> > Thanks!
> >
> > John
>
>
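
For reference, a sketch of the options-based route mentioned in the quoted
message (the executable name ./app and the split name p are assumptions;
the option prefixes follow PETSc's PCFieldSplit Schur conventions):

mpiexec -n 48 ./app \
    -pc_type fieldsplit -pc_fieldsplit_type schur \
    -fieldsplit_p_inner_ksp_type gmres \
    -fieldsplit_p_inner_pc_type ml \
    -ksp_view

With -ksp_view, the full nested solver, including the inner A_00 solver, is
printed after the solve completes, which matches Barry's point that the PC
information is only available once the solve has run.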

