[petsc-dev] telescope does not work with TS

Dave May dave.mayhem23 at gmail.com
Mon Feb 13 14:38:36 CST 2017


On 13 February 2017 at 20:35, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> > On Feb 13, 2017, at 2:25 PM, Dave May <dave.mayhem23 at gmail.com> wrote:
> >
> > Hey,
> >
> >
> > On 13 February 2017 at 19:41, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> >    From the manual page: KSPSetComputeOperators() is not propagated to
> > the sub KSP.
> >
> > Hmm - it does indeed say that.
> > The statement is partially correct.
> > If a DM is not attached to the KSP, the computeoperators method is NOT
> > propagated.
> > If a DM is attached, it is in fact propagated; see telescope_dmda.c : 502.
> > As usual the doc lags behind the implementation :( [my fault]
> >
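> > To make the linear, DM-attached case concrete, here is a minimal sketch in
> > the style of the KSP tutorials (the callback name, grid size and run-time
> > options are illustrative, not taken from any particular example):
> >
> >   static PetscErrorCode ComputeMat(KSP ksp,Mat A,Mat B,void *ctx)
> >   {
> >     DM             dm;
> >     PetscInt       i,j,M,N,xs,ys,xm,ym;
> >     MatStencil     row,col[5];
> >     PetscScalar    v[5];
> >     PetscErrorCode ierr;
> >
> >     /* on the sub-comm KSP this returns the repartitioned DMDA */
> >     ierr = KSPGetDM(ksp,&dm);CHKERRQ(ierr);
> >     ierr = DMDAGetInfo(dm,NULL,&M,&N,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL);CHKERRQ(ierr);
> >     ierr = DMDAGetCorners(dm,&xs,&ys,NULL,&xm,&ym,NULL);CHKERRQ(ierr);
> >     for (j=ys; j<ys+ym; j++) {
> >       for (i=xs; i<xs+xm; i++) {
> >         row.i = i; row.j = j;
> >         if (i==0 || j==0 || i==M-1 || j==N-1) { /* Dirichlet row: identity */
> >           v[0] = 1.0;
> >           ierr = MatSetValuesStencil(B,1,&row,1,&row,v,INSERT_VALUES);CHKERRQ(ierr);
> >         } else {                                /* interior: 5-point Laplacian */
> >           col[0].i = i;   col[0].j = j-1; v[0] = -1.0;
> >           col[1].i = i-1; col[1].j = j;   v[1] = -1.0;
> >           col[2].i = i;   col[2].j = j;   v[2] =  4.0;
> >           col[3].i = i+1; col[3].j = j;   v[3] = -1.0;
> >           col[4].i = i;   col[4].j = j+1; v[4] = -1.0;
> >           ierr = MatSetValuesStencil(B,1,&row,5,col,v,INSERT_VALUES);CHKERRQ(ierr);
> >         }
> >       }
> >     }
> >     ierr = MatAssemblyBegin(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
> >     ierr = MatAssemblyEnd(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
> >     return 0;
> >   }
> >
> >   /* caller: because a DMDA is attached, the callback is propagated to the
> >      sub-comm KSP when run with e.g.
> >        -pc_type telescope -pc_telescope_reduction_factor 2 */
> >   DM  da;
> >   KSP ksp;
> >   Vec x,b;
> >   ierr = DMDACreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,65,65,PETSC_DECIDE,PETSC_DECIDE,1,1,NULL,NULL,&da);CHKERRQ(ierr);
> >   ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
> >   ierr = KSPSetDM(ksp,da);CHKERRQ(ierr);
> >   ierr = KSPSetComputeOperators(ksp,ComputeMat,NULL);CHKERRQ(ierr);
> >   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
> >   ierr = DMCreateGlobalVector(da,&x);CHKERRQ(ierr);
> >   ierr = VecDuplicate(x,&b);CHKERRQ(ierr);
> >   ierr = VecSet(b,1.0);CHKERRQ(ierr);
> >   ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);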
> >
> >    Yet there is some code
> >
> >   sred->ignore_kspcomputeoperators = PETSC_FALSE;
> >
> >   if (sred->ignore_kspcomputeoperators) {
> >     ierr = PetscViewerASCIIPrintf(subviewer,"  Telescope: ignoring KSPComputeOperators\n");CHKERRQ(ierr);
> >   }
> >
> >   ierr = PetscOptionsBool("-pc_telescope_ignore_kspcomputeoperators","Ignore method used to compute A","PCTelescopeSetIgnoreKSPComputeOperators",sred->ignore_kspcomputeoperators,&sred->ignore_kspcomputeoperators,0);CHKERRQ(ierr);
> >
> >   etc
> >
> >    that seems to indicate that Dave May intended to have support for
> > KSPComputeOperators ... but didn't complete it.
> >
> > True - I didn't try this implementation out with any SNES examples.
> > Note that it was tested and, to my knowledge, does work for linear cases.
> > Hopefully someone can confirm or deny this.
> >
> > I should also note that it was already fragile even in the linear-only
> > case. For instance, if the computeoperators function employed a matrix-free
> > method which involved some stored data (e.g. viscosity at quadrature
> > points), then the user would also have to cleverly (e.g. via a hack)
> > communicate said quadrature values from the parent comm to the sub-comm.
> > I never found a clean way to manage this operation.
> >
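> > To illustrate, a hypothetical sketch of the fragile pattern (the context
> > type and its fields are made up for this email, not taken from any example):
> >
> >   typedef struct {
> >     PetscScalar *eta_qp; /* viscosity at quadrature points, laid out for the PARENT-comm partition */
> >     PetscInt     nqp;
> >   } UserCtx;
> >
> >   static PetscErrorCode ComputeOperators_MF(KSP ksp,Mat A,Mat B,void *ctx)
> >   {
> >     UserCtx *user = (UserCtx*)ctx;
> >
> >     /* A matrix-free operator would read user->eta_qp inside its MatMult.
> >        When this callback is propagated to the sub-comm KSP, nothing moves
> >        eta_qp from the parent comm to the sub-comm, so the user has to
> >        scatter those values themselves - this is the hack referred to above. */
> >     return 0;
> >   }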
> >
> >
> >    In this case it is using KSPComputeOperators_SNES(KSP ksp,Mat A,Mat
> > B,void *ctx), which needs a couple of things changed to get it to work:
> >
> > 1) KSPComputeOperators_SNES() needs to work when the subdm is on a smaller
> > set of processes, so the call to SNESComputeJacobian() has to handle
> > running on a subset of processes (which of course it cannot do normally,
> > because the SNES has its full communicator while the DM only lives on a
> > subset of processes).
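> >
> > For context, a rough paraphrase of what KSPComputeOperators_SNES() does
> > (from memory, not verbatim - see src/snes/interface/snes.c):
> >
> >   static PetscErrorCode KSPComputeOperators_SNES(KSP ksp,Mat A,Mat B,void *ctx)
> >   {
> >     SNES           snes = (SNES)ctx; /* the outer SNES, which lives on the FULL comm */
> >     Vec            X;
> >     PetscErrorCode ierr;
> >
> >     ierr = SNESGetSolution(snes,&X);CHKERRQ(ierr);
> >     /* collective on the full communicator: if the telescope sub-KSP calls
> >        this, only the ranks of the sub-comm enter, so the call cannot work */
> >     ierr = SNESComputeJacobian(snes,X,A,B);CHKERRQ(ierr);
> >     return 0;
> >   }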
> >
> > The motivating idea behind adding support for setcomputeoperators was
> > (i) to avoid having to permute an assembled operator defined on the parent
> > comm into the ordering on the sub-comm, and (ii) to permit people to use a
> > matrix-free implementation.
> >
> > The SNES case, at least with an assembled Jacobian, should be possible to
> > make work. We would assemble J on the parent comm, then we can move the
> > operator onto the sub-comm and do what we like. Does that sound reasonable?
>
>   No, what if there are multiple levels on the smaller DM? Always assemble
> on the full comm and somehow permute down? What if there is a telescope
> inside a telescope?
>

You're right... that case also kills my follow-up suggestion.


>
> >
> >
> >
> >
> > 2) The DMRestrictHook_SNESVecSol() has to handle switching to a DM on a
> > smaller set of processes and remapping the "solution" vector to that
> > smaller set.
> >
> > Hmmm - that seems awkward. Sounds like the user will have to do something
> > magical in their hook definition...
>
>    Not really, the shrinking of the solution vector to the smaller comm is
> the same as the shrinking of the right-hand side of the system. So the
> process of shrinking is not difficult; the difficulty is having
> DMRestrictHook_SNESVecSol() know about the smaller DMs.
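>
>    Here is a sketch of that shrink, in the spirit of what PCTelescope
> already does for the right-hand side (variable names are illustrative;
> see telescope.c for the real mechanics):
>
>   Vec               xp,xtmp,xred;  /* xp on the parent comm; xred created earlier on the sub-comm (NULL elsewhere) */
>   IS                isin;          /* indices of xp arranged in the sub-comm ordering */
>   VecScatter        scat;
>   const PetscScalar *a;
>   MPI_Comm          parent_comm;
>   PetscInt          nred,M;        /* nred is nonzero only on ranks that survive on the sub-comm */
>
>   /* xtmp lives on the parent comm, but all entries are owned by surviving ranks */
>   ierr = VecCreateMPI(parent_comm,nred,M,&xtmp);CHKERRQ(ierr);
>   ierr = VecScatterCreate(xp,isin,xtmp,NULL,&scat);CHKERRQ(ierr);
>   ierr = VecScatterBegin(scat,xp,xtmp,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
>   ierr = VecScatterEnd(scat,xp,xtmp,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
>   if (xred) { /* only ranks belonging to the sub-comm */
>     ierr = VecGetArrayRead(xtmp,&a);CHKERRQ(ierr);
>     ierr = VecPlaceArray(xred,a);CHKERRQ(ierr); /* the sub-comm vector borrows the shrunk data */
>     /* ... hand xred to the sub-comm objects ... */
>     ierr = VecResetArray(xred);CHKERRQ(ierr);
>     ierr = VecRestoreArrayRead(xtmp,&a);CHKERRQ(ierr);
>   }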
> >
> >
> >
> >   Dave and Jed,
> >
> >     All in all, it looks awfully complex and fragile to get this working.
> > What do you think?
> >
> >    Barry
> >
> >   Curse whoever designed PETSc originally and didn't take into account in
> > its basic design the need to subcom PETSc objects to smaller
> > communicators :-(
> >
> > LOL.
> >
> >
> > Cheers,
> >   Dave
> >
> >
> >
> >
> > > On Feb 13, 2017, at 10:01 AM, Zhang, Hong <hongzhang at anl.gov> wrote:
> > >
> > > Hi Barry,
> > >
> > > When using telescope, I get an error complaining about a missing
> > > Jacobian. But I do have a Jacobian set by TSSetIJacobian().
> > >
> > > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > [0]PETSC ERROR:
> > > [0]PETSC ERROR: Must call SNESSetJacobian(), DMSNESSetJacobian(), DMDASNESSetJacobianLocal(), etc
> > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > [0]PETSC ERROR: Petsc Development GIT revision: v3.7.4-2764-g0ff3c95  GIT Date: 2017-01-10 15:40:21 -0600
> > > [0]PETSC ERROR: ./ex5adj_ell on a arch-memalign-mkl-avx512-opt named isdp001.cels.anl.gov by hongzh Mon Feb 13 09:56:48 2017
> > > [0]PETSC ERROR: Configure options --COPTFLAGS="-g -O3 -xMIC-AVX512" --CXXOPTFLAGS="-g -O3 -xMIC-AVX512" --FOPTFLAGS="-g -O3 -xMIC-AVX512" --PETSC_ARCH=arch-memalign-mkl-avx512-opt --PETSC_DIR=/homes/hongzh/Projects/petsc --download-cmake --download-revolve=1 --with-blaslapack-dir=/opt/intel/2017-initial/compilers_and_libraries_2017/linux/mkl/lib/intel64 --with-cc=mpiicc --with-cxx=mpiicpc --with-debugging=no --with-fc=mpiifort --with-memalign=64 --with-mpiexec=mpirun
> > > [0]PETSC ERROR: #1 SNESComputeJacobian() line 2242 in /home/hongzh/Projects/petsc/src/snes/interface/snes.c
> > > [0]PETSC ERROR: #2 KSPComputeOperators_SNES() line 569 in /home/hongzh/Projects/petsc/src/snes/interface/snes.c
> > > [0]PETSC ERROR: #3 KSPSetUp() line 327 in /home/hongzh/Projects/petsc/src/ksp/ksp/interface/itfunc.c
> > > [0]PETSC ERROR: #4 KSPSolve() line 599 in /home/hongzh/Projects/petsc/src/ksp/ksp/interface/itfunc.c
> > > [0]PETSC ERROR: #5 PCApply_Telescope_dmda() line 952 in /home/hongzh/Projects/petsc/src/ksp/pc/impls/telescope/telescope_dmda.c
> > > [0]PETSC ERROR: #6 PCApply() line 458 in /home/hongzh/Projects/petsc/src/ksp/pc/interface/precon.c
> > > [0]PETSC ERROR: #7 KSP_PCApply() line 251 in /homes/hongzh/Projects/petsc/include/petsc/private/kspimpl.h
> > > [0]PETSC ERROR: #8 KSPSolve_PREONLY() line 22 in /home/hongzh/Projects/petsc/src/ksp/ksp/impls/preonly/preonly.c
> > > [0]PETSC ERROR: #9 KSPSolve() line 656 in /home/hongzh/Projects/petsc/src/ksp/ksp/interface/itfunc.c
> > > [0]PETSC ERROR: #10 PCMGMCycle_Private() line 19 in /home/hongzh/Projects/petsc/src/ksp/pc/impls/mg/mg.c
> > > [0]PETSC ERROR: #11 PCMGMCycle_Private() line 53 in /home/hongzh/Projects/petsc/src/ksp/pc/impls/mg/mg.c
> > > [0]PETSC ERROR: #12 PCMGMCycle_Private() line 53 in /home/hongzh/Projects/petsc/src/ksp/pc/impls/mg/mg.c
> > > [0]PETSC ERROR: #13 PCMGMCycle_Private() line 53 in /home/hongzh/Projects/petsc/src/ksp/pc/impls/mg/mg.c
> > > [0]PETSC ERROR: #14 PCMGMCycle_Private() line 53 in /home/hongzh/Projects/petsc/src/ksp/pc/impls/mg/mg.c
> > > [0]PETSC ERROR: #15 PCApply_MG() line 331 in /home/hongzh/Projects/petsc/src/ksp/pc/impls/mg/mg.c
> > > [0]PETSC ERROR: #16 PCApply() line 458 in /home/hongzh/Projects/petsc/src/ksp/pc/interface/precon.c
> > > [0]PETSC ERROR: #17 KSP_PCApply() line 251 in /homes/hongzh/Projects/petsc/include/petsc/private/kspimpl.h
> > > [0]PETSC ERROR: #18 KSPInitialResidual() line 67 in /home/hongzh/Projects/petsc/src/ksp/ksp/interface/itres.c
> > > [0]PETSC ERROR: #19 KSPSolve_GMRES() line 233 in /home/hongzh/Projects/petsc/src/ksp/ksp/impls/gmres/gmres.c
> > > [0]PETSC ERROR: #20 KSPSolve() line 656 in /home/hongzh/Projects/petsc/src/ksp/ksp/interface/itfunc.c
> > > [0]PETSC ERROR: #21 SNESSolve_NEWTONLS() line 224 in /home/hongzh/Projects/petsc/src/snes/impls/ls/ls.c
> > > [0]PETSC ERROR: #22 SNESSolve() line 3967 in /home/hongzh/Projects/petsc/src/snes/interface/snes.c
> > > [0]PETSC ERROR: #23 TS_SNESSolve() line 171 in /home/hongzh/Projects/petsc/src/ts/impls/implicit/theta/theta.c
> > > [0]PETSC ERROR: #24 TSStep_Theta() line 211 in /home/hongzh/Projects/petsc/src/ts/impls/implicit/theta/theta.c
> > > [0]PETSC ERROR: #25 TSStep() line 3808 in /home/hongzh/Projects/petsc/src/ts/interface/ts.c
> > > [0]PETSC ERROR: #26 TSSolve() line 4053 in /home/hongzh/Projects/petsc/src/ts/interface/ts.c
> > > [0]PETSC ERROR: #27 main() line 171 in /homes/hongzh/Projects/petsc/src/ts/examples/tutorials/advection-diffusion-reaction/ex5adj_ell.c
> > > [0]PETSC ERROR: PETSc Option Table entries:
> > > [0]PETSC ERROR: -da_refine 4
> > > [0]PETSC ERROR: -dm_mat_type aij
> > > [0]PETSC ERROR: -forwardonly
> > > [0]PETSC ERROR: -ksp_converged_reason
> > > [0]PETSC ERROR: -ksp_view
> > > [0]PETSC ERROR: -mg_coarse_pc_telescope_reduction_factor 2
> > > [0]PETSC ERROR: -mg_coarse_pc_type telescope
> > > [0]PETSC ERROR: -mg_coarse_telescope_pc_type lu
> > > [0]PETSC ERROR: -mg_levels_pc_type jacobi
> > > [0]PETSC ERROR: -pc_type mg
> > > [0]PETSC ERROR: -ts_dt 1
> > > [0]PETSC ERROR: -ts_max_steps 1
> > > [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
> > >
> > > Thanks,
> > > Hong
> >
> >
>
>