[petsc-dev] [Bitbucket] Pull request #62: PCBDDC: bugfix in PCPostSolve_BDDC. (petsc/petsc)

Stefano Zampini s.zampini at cineca.it
Sat Jul 6 03:59:58 CDT 2013


2013/7/6 Jed Brown <jedbrown at mcs.anl.gov>

> [Moving to petsc-dev]
>
> Stefano Zampini <pullrequests-reply at bitbucket.org> writes:
>
> > New comment on pull request:
> >
> >
> https://bitbucket.org/petsc/petsc/pull-request/62/pcbddc-bugfix-in-pcpostsolve_bddc#comment-361980
> >
> > Stefano Zampini said:
> >
> > I'm on a two-day holiday; I will remove the function pointers as soon as
> > I get back home (Monday). Then you can merge a complete branch
> > with a clean history as you like ;-)
>
> Should I just add the following, or did you have more in mind?
>
> diff --git i/src/ksp/pc/impls/bddc/bddc.c w/src/ksp/pc/impls/bddc/bddc.c
> index 14c8718..d56745e 100644
> --- i/src/ksp/pc/impls/bddc/bddc.c
> +++ w/src/ksp/pc/impls/bddc/bddc.c
> @@ -933,7 +933,7 @@ static PetscErrorCode PCBDDCMatFETIDPGetRHS_BDDC(Mat fetidp_mat, Vec standard_rh
>
>    /* change of basis for physical rhs if needed
>       It also changes the rhs in case of dirichlet boundaries */
> -  (*mat_ctx->pc->ops->presolve)(mat_ctx->pc,NULL,standard_rhs,NULL);
> +  ierr = PCPreSolve_BDDC(mat_ctx->pc,NULL,standard_rhs,NULL);CHKERRQ(ierr);
>    /* store vectors for computation of fetidp final solution */
>    ierr = VecScatterBegin(pcis->global_to_D,standard_rhs,mat_ctx->temp_solution_D,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
>    ierr = VecScatterEnd(pcis->global_to_D,standard_rhs,mat_ctx->temp_solution_D,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
> @@ -1044,7 +1044,7 @@ static PetscErrorCode PCBDDCMatFETIDPGetSolution_BDDC(Mat fetidp_mat, Vec fetidp
>    ierr = VecScatterEnd (pcis->global_to_D,pcis->vec1_D,standard_sol,INSERT_VALUES,SCATTER_REVERSE);CHKERRQ(ierr);
>    /* final change of basis if needed
>       Is also sums the dirichlet part removed during RHS assembling */
> -  (*mat_ctx->pc->ops->postsolve)(mat_ctx->pc,NULL,NULL,standard_sol);
> +  ierr = PCPostSolve_BDDC(mat_ctx->pc,NULL,NULL,standard_sol);CHKERRQ(ierr);
>    PetscFunctionReturn(0);
>
>  }
>
Right.


> > Maybe I was not clear regarding the pre- and post-solves: there is no
> > special code (except the if (..) guards) for FETI-DP in
> > PCPreSolve_BDDC and PCPostSolve_BDDC; they are needed by BDDC
> > itself when the change of basis has been requested and/or some rows of
> > the MATIS matrix have been zeroed out to impose Dirichlet
> > conditions. More specifically, the change of basis is completely
> > transparent to the user: the MATIS local matrices are changed during
> > presolve and then restored in postsolve.
>
> I'm not opposed to doing the change of basis in pre/post solve, but I
> don't think anyone liked the fragility induced by -ksp_diagonal_scale,
> which is the same kind of thing.
>
>
The PCBDDC user also has the option of not using the change of basis
(this is now the default).
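
To make this concrete, here is a minimal sketch (not part of the original thread) of how a user drives PCBDDC on a MATIS operator. The point is that KSPSolve() invokes the PC presolve/postsolve hooks, so any requested change of basis and the Dirichlet-row handling are applied to the MATIS local matrices before the Krylov iteration and undone afterwards, with no user intervention. Names follow current PETSc (older releases pass an extra MatStructure flag to KSPSetOperators); the run-time switch for the change of basis is, in current PETSc, -pc_bddc_use_change_of_basis.

#include <petscksp.h>

/* Solve A x = b with BDDC; A must be a matrix of type MATIS. */
PetscErrorCode SolveWithBDDC(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPCreate(PetscObjectComm((PetscObject)A),&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCBDDC);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  /* KSPSolve() calls the PC presolve/postsolve hooks: any requested change
     of basis is applied to the MATIS local matrices here and restored once
     the iteration completes */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}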


> > Regarding FETI-DP: the current code (which is working great, by the
> > way) is built on top of the PCBDDC class and is still in its
> > infancy regarding full integration with PETSc and its
> > classes. PCPreSolve_BDDC and PCPostSolve_BDDC should be exactly where
> > they currently are in the FETI-DP code, since the FETI-DP context uses
> > the BDDC preconditioner (through the private function
> > PCBDDCApplyInterfacePreconditioner) to compute matrix-vector
> > products in Krylov methods.
>
> I think this tracks off into a discussion that I'd like to get others'
> opinions about.  We currently have a very simple PCGalerkin that can
> formulate a problem in an alternative space (PCMG can do something
> similar and more).  We could have an enhanced version called PCFETIDP
> that does this setup for FETI-DP.  I don't consider it acceptable to ask
> the user to do anything special for FETI-DP beyond that which is
> algorithmically critical (MATIS Neumann matrices).
>
>
In my mind, MatFETIDP should be created starting from a MATIS matrix (as
PCBDDC is), but the code should be placed either in ksp/pc or ksp/ksp, since it
uses a PCBDDC internally, in order not to break "--with-single-library" builds.
As a preconditioner for the FETI-DP system, the current default is the
so-called Dirichlet preconditioner, which can be built starting from the MATIS.
Once the Mat and PC objects for FETI-DP are exposed, the user will need
MatFETIDPGetRHS and MatFETIDPGetSolution in order to switch between the
physical space of degrees of freedom and the space of FETI-DP Lagrange
multipliers.
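
For reference, a sketch of that calling sequence with the current (private) interface, loosely following the PCBDDC/FETI-DP tutorial ex59.c; the public wrappers are PCBDDCCreateFETIDPOperators, PCBDDCMatFETIDPGetRHS and PCBDDCMatFETIDPGetSolution. Exact signatures and helper names have changed across PETSc releases, so treat this as illustrative rather than definitive.

#include <petscksp.h>

/* 'pc' is a fully set-up PCBDDC attached to a MATIS operator. */
PetscErrorCode SolveWithFETIDP(PC pc, Vec standard_rhs, Vec standard_sol)
{
  Mat            F;          /* FETI-DP operator on the Lagrange multipliers */
  PC             Fpc;        /* Dirichlet preconditioner for F */
  KSP            fetidp_ksp;
  Vec            lambda_rhs, lambda_sol;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PCBDDCCreateFETIDPOperators(pc,&F,&Fpc);CHKERRQ(ierr);
  ierr = MatCreateVecs(F,&lambda_rhs,&lambda_sol);CHKERRQ(ierr);
  /* map the physical rhs into the space of Lagrange multipliers */
  ierr = PCBDDCMatFETIDPGetRHS(F,standard_rhs,lambda_rhs);CHKERRQ(ierr);
  ierr = KSPCreate(PetscObjectComm((PetscObject)F),&fetidp_ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(fetidp_ksp,F,F);CHKERRQ(ierr);
  ierr = KSPSetType(fetidp_ksp,KSPCG);CHKERRQ(ierr);
  ierr = KSPSetPC(fetidp_ksp,Fpc);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(fetidp_ksp);CHKERRQ(ierr);
  ierr = KSPSolve(fetidp_ksp,lambda_rhs,lambda_sol);CHKERRQ(ierr);
  /* map the multiplier solution back to the physical degrees of freedom */
  ierr = PCBDDCMatFETIDPGetSolution(F,lambda_sol,standard_sol);CHKERRQ(ierr);
  ierr = KSPDestroy(&fetidp_ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&lambda_rhs);CHKERRQ(ierr);
  ierr = VecDestroy(&lambda_sol);CHKERRQ(ierr);
  ierr = MatDestroy(&F);CHKERRQ(ierr);
  ierr = PCDestroy(&Fpc);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}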


> > We should decide how to definitively embed FETI-DP into PETSc; should it
> > be a KSP? What should be the calling sequence for the user to
> > build the FETI-DP matrix? Barry? Maybe we should move this to a
> > separate thread.
>
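
As a purely hypothetical illustration of what a KSP-level interface could look like (nothing below exists in PETSc as of this thread; the type name "fetidp" and its behaviour are assumptions, not an existing API), the user-visible calling sequence might reduce to:

/* Hypothetical: a KSP implementation that builds the FETI-DP operator from
   the MATIS matrix, maps rhs/solution between physical dofs and Lagrange
   multipliers internally, and returns the solution in the physical space. */
ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);   /* A is a MATIS matrix */
ierr = KSPSetType(ksp,"fetidp");CHKERRQ(ierr);   /* hypothetical type name */
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);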



-- 

Ph. D. Stefano Zampini
CINECA
SuperComputing Applications and Innovations Department - SCAI
Via dei Tizii, 6 00185 Roma - ITALY
------------------------------------------------------------------------------------------------------------------------
Email: s.zampini at cineca.it
SkypeID: stefano.zampini
GoogleTalk: stefano.zampini at gmail.com
Tel: +39 06.44486.707
------------------------------------------------------------------------------------------------------------------------