[petsc-users] Moving from KSPSetNullSpace to MatSetNullSpace
Matthew Knepley
knepley at gmail.com
Tue Sep 29 12:10:49 CDT 2015
On Tue, Sep 29, 2015 at 12:08 PM, Gil Forsyth <gforsyth at gwu.edu> wrote:
> PETSc is version 3.6.1 -- I just included a log from 3.5.4 to show that
> the behavior seems to have changed between versions. The only difference
> in our code between 3.5.4 and 3.6.1 is the change from KSPSetNullSpace to
> MatSetNullSpace.
>
Mark made some GAMG changes which were later reversed because they had
unintended consequences like this.
I think what Barry means is, "you should get the behavior you expect using
the master branch from PETSc development."
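
For example, a minimal sketch (assuming the usual development repository
location):

  git clone https://bitbucket.org/petsc/petsc petsc
  cd petsc && git checkout master

and then configure and build as usual.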
Thanks,
Matt
> On Tue, Sep 29, 2015 at 1:04 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>> Update your PETSc
>>
>>
>> > On Sep 29, 2015, at 12:00 PM, Gil Forsyth <gforsyth at gwu.edu> wrote:
>> >
>> > Hi Barry,
>> >
>> > We aren't explicitly setting GMRES anywhere in the code and I'm not
>> > sure why it's being used. Running our 3.5.4 code using KSPSetNullSpace
>> > works with:
>> >
>> > $PETIBM_DIR/petibm3.5/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason > kspview3.5.4
>> >
>> > and shows that the coarse grid solver is of type preonly.
>> >
>> > Running the newer version, which uses MatSetNullSpace in its stead, and
>> > adding -poisson_mg_coarse_ksp_type preonly:
>> >
>> > $PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_mg_coarse_ksp_type preonly -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason > kspview3.6.1
>> >
>> > still shows
>> >
>> > KSP Object:(poisson_) 1 MPI processes
>> > type: cg
>> > maximum iterations=10000
>> > tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>> > left preconditioning
>> > using nonzero initial guess
>> > using PRECONDITIONED norm type for convergence test
>> > PC Object:(poisson_) 1 MPI processes
>> > type: gamg
>> > MG: type is MULTIPLICATIVE, levels=3 cycles=v
>> > Cycles per PCApply=1
>> > Using Galerkin computed coarse grid matrices
>> > GAMG specific options
>> > Threshold for dropping small values from graph 0
>> > AGG specific options
>> > Symmetric graph false
>> > Coarse grid solver -- level -------------------------------
>> > KSP Object: (poisson_mg_coarse_) 1 MPI processes
>> > type: gmres
>> > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>> Orthogonalization with no iterative refinement
>> > GMRES: happy breakdown tolerance 1e-30
>> > maximum iterations=1, initial guess is zero
>> > tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>> > left preconditioning
>> > using NONE norm type for convergence test
>> >
>> >
>> > Both logs are attached.
>> >
>> >
>> > On Tue, Sep 29, 2015 at 12:37 PM, Barry Smith <bsmith at mcs.anl.gov>
>> wrote:
>> >
>> >     This can't work. You can't use GMRES inside CG: CG requires the
>> preconditioner to act as a fixed symmetric positive-definite operator,
>> and an inner GMRES solve is neither symmetric nor linear. Try changing
>> to -poisson_mg_coarse_ksp_type preonly
>> >
>> > KSP Object:(poisson_) 1 MPI processes
>> > type: cg
>> >
>> > KSP Object: (poisson_mg_coarse_) 1 MPI processes
>> > type: gmres
>> > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>> Orthogonalization with no iterative refinement
>> > GMRES: happy breakdown tolerance 1e-30
>> > maximum iterations=1, initial guess is zero
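>> >
>> >     If preonly alone does not take effect, a fuller sketch of the
>> > coarse-level overrides (standard PCMG option names, combined here with
>> > the poisson_ prefix) would be
>> >
>> >       -poisson_mg_coarse_ksp_type preonly -poisson_mg_coarse_pc_type lu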
>> >
>> >
>> > > On Sep 29, 2015, at 10:53 AM, Gil Forsyth <gforsyth at gwu.edu> wrote:
>> > >
>> > >
>> > > On Tue, Sep 29, 2015 at 11:42 AM, Matthew Knepley <knepley at gmail.com>
>> wrote:
>> > > On Tue, Sep 29, 2015 at 10:28 AM, Gil Forsyth <gforsyth at gwu.edu>
>> wrote:
>> > > Hi all,
>> > >
>> > > I've been having some trouble with what should be a relatively simple
>> > > update of an immersed boundary CFD solver from PETSc 3.5.4 to 3.6.1.
>> > >
>> > > I'm getting indefinite-PC errors for a simple lid-driven cavity test
>> > > problem, 32x32 at Re 100.
>> > >
>> > > Under PETSc 3.5.4 using KSPSetNullSpace we used the following to set
>> the null space. This is for a 2D Poisson system with no immersed boundary
>> and so the null space is the constant vector.
>> > >
>> > > MatNullSpace nsp;
>> > > ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
>> > > ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
>> > > ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);
>> > >
>> > > Clearly this has to happen in the reverse order, since ksp2 would not
>> be created yet.
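>> > >
>> > > That is, a minimal sketch of the intended 3.5.4 ordering (reusing the
>> > > ksp2, QTBNQ, and nsp names from the snippets in this thread) would be:
>> > >
>> > >   ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);   /* create the solver first */
>> > >   ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
>> > >   MatNullSpace nsp;                                           /* then attach the constant null space */
>> > >   ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
>> > >   ierr = KSPSetNullSpace(ksp2, nsp); CHKERRQ(ierr);
>> > >   ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);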
>> > >
>> > > For questions about solvers, we HAVE to see the complete output of
>> > > -ksp_view so we know what we are dealing with. It's also nice to have
>> > > -ksp_monitor_true_residual and -ksp_converged_reason.
>> > >
>> > > Matt
>> > >
>> > > Yes -- sorry, those are both in inline files and are called in the
>> > > reverse order from how I wrote them out.
>> > >
>> > > I've attached the output of
>> > >
>> > > $PETIBM_DIR/petibm3.6/bin/petibm2d -directory . -poisson_pc_type gamg -poisson_pc_gamg_type agg -poisson_pc_gamg_agg_nsmooths 1 -poisson_ksp_view -poisson_ksp_monitor_true_residual -poisson_ksp_converged_reason > kspview.log
>> > >
>> > >
>> > >
>> > > And then set up the KSP with
>> > > ierr = KSPCreate(PETSC_COMM_WORLD, &ksp2); CHKERRQ(ierr);
>> > > ierr = KSPSetOptionsPrefix(ksp2, "poisson_"); CHKERRQ(ierr);
>> > > ierr = KSPSetOperators(ksp2, QTBNQ, QTBNQ); CHKERRQ(ierr);
>> > > ierr = KSPSetInitialGuessNonzero(ksp2, PETSC_TRUE); CHKERRQ(ierr);
>> > > ierr = KSPSetType(ksp2, KSPCG); CHKERRQ(ierr);
>> > > ierr = KSPSetReusePreconditioner(ksp2, PETSC_TRUE); CHKERRQ(ierr);
>> > > ierr = KSPSetFromOptions(ksp2); CHKERRQ(ierr);
>> > >
>> > > The matrix QTBNQ does not change; only the RHS of the system is
>> > > updated.
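>> > >
>> > > A minimal sketch of the per-timestep solve, with hypothetical vector
>> > > names rhs2 and phi, would be:
>> > >
>> > >   KSPConvergedReason reason;
>> > >   ierr = KSPSolve(ksp2, rhs2, phi); CHKERRQ(ierr);            /* operator and PC are reused */
>> > >   ierr = KSPGetConvergedReason(ksp2, &reason); CHKERRQ(ierr);
>> > >   if (reason < 0) { /* e.g. -8, KSP_DIVERGED_INDEFINITE_PC */ }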
>> > >
>> > > We run this with `-pc_type gamg -pc_gamg_type agg
>> -pc_gamg_agg_nsmooths 1` and everything seems to work as expected.
>> > >
>> > > Under PETSc 3.6.1, we change only the KSPSetNullSpace line, to
>> > >
>> > > ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);
>> > >
>> > > and the same code diverges after 1 timestep and returns a -8
>> KSP_DIVERGED_INDEFINITE_PC
>> > >
>> > > This is weird, especially because if we change nsmooths to 2, it runs
>> > > for 264 timesteps and then returns the same error. But we have explicitly
>> > > set KSPSetReusePreconditioner, so it should be using the same PC, right?
>> > >
>> > > Change nsmooths to 3 and it again diverges after 1 timestep.
>> > >
>> > > Change nsmooths to 4 and it runs to completion.
>> > >
>> > > It seems that either GAMG's behavior has changed, or KSPSetNullSpace
>> > > was doing something implicitly that we now need to do explicitly in
>> > > addition to MatSetNullSpace?
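>> > >
>> > > For reference, a minimal sketch of the full 3.6-style null-space setup
>> > > (same names as above) is:
>> > >
>> > >   MatNullSpace nsp;
>> > >   ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); CHKERRQ(ierr);
>> > >   ierr = MatSetNullSpace(QTBNQ, nsp); CHKERRQ(ierr);   /* attach to the Mat, not the KSP */
>> > >   ierr = MatNullSpaceDestroy(&nsp); CHKERRQ(ierr);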
>> > >
>> > > Thanks,
>> > > Gil Forsyth
>> > >
>> > >
>> > >
>> > > --
>> > > What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> > > -- Norbert Wiener
>> > >
>> > > <kspview.log>
>> >
>> >
>> > <kspview3.5.4><kspview3.6.1>
>>
>>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener