[petsc-users] FETI-DP implementation and call sequence
Alp Kalpalp
alpkalpalp at gmail.com
Mon Oct 27 16:00:35 CDT 2014
Hi,
Are there any simpler examples for the BDDC/FETI-DP algorithms?
Since ex59 uses MATIS etc., it is somewhat hard to follow.
A simpler case, for example an unstructured membrane solution, would be
useful I think.
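
What I have pieced together so far looks roughly like the sketch below
(untested; Nglobal, l2gmap and the assembly part are placeholders for the
application's own data, and some calls may differ between PETSc versions).
Is this the intended call sequence, and is PCBDDCCreateFETIDPOperators (as
used in ex59) the way to get from the BDDC PC to the FETI-DP system?

#include <petscksp.h>

int main(int argc,char **argv)
{
  Mat                    A;
  Vec                    x,b;
  KSP                    ksp;
  PC                     pc;
  ISLocalToGlobalMapping l2gmap = NULL; /* local (subdomain) dof -> global dof */
  PetscInt               Nglobal = 0;   /* global number of dofs (placeholder) */
  PetscErrorCode         ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;

  /* ... create l2gmap with ISLocalToGlobalMappingCreate() from the mesh ... */

  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,Nglobal,Nglobal);CHKERRQ(ierr);
  ierr = MatSetType(A,MATIS);CHKERRQ(ierr);           /* BDDC/FETI-DP need a MATIS operator */
  ierr = MatSetLocalToGlobalMapping(A,l2gmap,l2gmap);CHKERRQ(ierr);

  /* each process adds its unassembled subdomain/element contributions with
     local indices, e.g. via MatSetValuesLocal(A,...,ADD_VALUES) */
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatCreateVecs(A,&x,&b);CHKERRQ(ierr);
  /* ... fill the right-hand side b ... */

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCBDDC);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = ISLocalToGlobalMappingDestroy(&l2gmap);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}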
On Mon, Sep 22, 2014 at 6:39 PM, Stefano Zampini <stefano.zampini at gmail.com>
wrote:
> Sorry for the late reply. I just pushed a fix for the crash; it is in master.
>
> Stefano
> On Fri, 5 Sep 2014, Jed Brown wrote:
>
> > Satish Balay <balay at mcs.anl.gov> writes:
> > > Perhaps the following is the fix [with proper comments, more error
> > > checks?]. But someone more familiar with this code should check this.
> > >
> > > Satish
> > >
> > > --------------
> > > $ git diff |cat
> > > diff --git a/src/ksp/pc/impls/is/pcis.c b/src/ksp/pc/impls/is/pcis.c
> > > index dab5836..0fa0217 100644
> > > --- a/src/ksp/pc/impls/is/pcis.c
> > > +++ b/src/ksp/pc/impls/is/pcis.c
> > > @@ -140,6 +140,8 @@ PetscErrorCode PCISSetUp(PC pc)
> > >    ierr = PetscObjectTypeCompare((PetscObject)pc->pmat,MATIS,&flg);CHKERRQ(ierr);
> > >    if (!flg) SETERRQ(PetscObjectComm((PetscObject)pc),PETSC_ERR_ARG_WRONG,"Preconditioner type of Neumann Neumman requires matrix of type MATIS");
> > > matis = (Mat_IS*)pc->pmat->data;
> > > + PetscObjectReference((PetscObject)pc->pmat);
> > > + pcis->pmat = pc->pmat;
> >
> > Uh, PCISSetUp can be called more than once?
>
> I have no idea..
>
> >
> > And simply destroying the pcis->pmat reference is not enough because
> > that extra reference could significantly increase the peak memory usage.
>
> Currently the object (pc->pmat) is destroyed at the end anyway [perhaps
> during PCDestroy()]. This fix changes the order a bit so that it is
> destroyed only after its last use.
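>
> As a standalone illustration of the reference counting involved (just a
> sketch, not part of the patch): MatDestroy() only frees the object once
> the last reference is gone, so the extra reference taken in PCISSetUp()
> merely delays the actual free.
>
> #include <petscmat.h>
>
> int main(int argc,char **argv)
> {
>   Mat            A,extra;
>   PetscErrorCode ierr;
>
>   ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
>   ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
>
>   ierr = PetscObjectReference((PetscObject)A);CHKERRQ(ierr); /* refcount 1 -> 2 */
>   extra = A;
>
>   ierr = MatDestroy(&A);CHKERRQ(ierr);     /* refcount 2 -> 1, object stays alive */
>   ierr = MatDestroy(&extra);CHKERRQ(ierr); /* refcount 1 -> 0, memory freed here  */
>
>   ierr = PetscFinalize();
>   return ierr;
> }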
>
> > The right solution is to not hold that reference and not hold the info.
> >
> > > pcis->pure_neumann = matis->pure_neumann;
> > >
> > > @@ -378,8 +380,9 @@ PetscErrorCode PCISDestroy(PC pc)
> > > ierr = VecScatterDestroy(&pcis->global_to_B);CHKERRQ(ierr);
> > > ierr = PetscFree(pcis->work_N);CHKERRQ(ierr);
> > > if (pcis->ISLocalToGlobalMappingGetInfoWasCalled) {
> > > -    ierr = ISLocalToGlobalMappingRestoreInfo((ISLocalToGlobalMapping)0,&(pcis->n_neigh),&(pcis->neigh),&(pcis->n_shared),&(pcis->shared));CHKERRQ(ierr);
> > > +    ierr = ISLocalToGlobalMappingRestoreInfo(((Mat_IS*)pcis->pmat->data)->mapping,&(pcis->n_neigh),&(pcis->neigh),&(pcis->n_shared),&(pcis->shared));CHKERRQ(ierr);
> > > }
> >
> > Why not restore the info at the place it is gotten, like we do with
> > every other accessor?
>
> Looks like this info is stashed in 'pcis->n_neigh', 'pcis->neigh', etc. -
> and reused later multiple times [perhaps preventing multiple
> mallocs/frees].
>
> $ git grep -l 'pcis->n_neigh'
> src/ksp/pc/impls/bddc/bddcfetidp.c
> src/ksp/pc/impls/is/nn/nn.c
> src/ksp/pc/impls/is/pcis.c
>
> Or perhaps this info should be stashed in the IS so multiple
> ISLocalToGlobalMappingGetInfo() calls are cheap [but then the malloc'd
> memory will live until IS is destroyed anyway]
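>
> For reference, the get/restore pairing Jed has in mind would look roughly
> like this (hypothetical helper, only to illustrate the scoping; the PCIS
> code instead stashes the arrays in the PC_IS struct):
>
> #include <petscis.h>
>
> static PetscErrorCode CountNeighbors(ISLocalToGlobalMapping mapping,PetscInt *nneigh)
> {
>   PetscInt       n_neigh,*neigh,*n_shared,**shared;
>   PetscErrorCode ierr;
>
>   PetscFunctionBegin;
>   ierr = ISLocalToGlobalMappingGetInfo(mapping,&n_neigh,&neigh,&n_shared,&shared);CHKERRQ(ierr);
>   *nneigh = n_neigh; /* ... use neigh/n_shared/shared only within this scope ... */
>   ierr = ISLocalToGlobalMappingRestoreInfo(mapping,&n_neigh,&neigh,&n_shared,&shared);CHKERRQ(ierr);
>   PetscFunctionReturn(0);
> }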
>
> I guess there are two issues you are touching on: a fix for this crash -
> and code cleanup. My patch gets the examples working.
>
> But I'll defer both issues to Stefano [assuming he is acquainted with the
> above sources].
>
> Satish
>
> >
> > > + ierr = MatDestroy(&pcis->pmat);CHKERRQ(ierr);
> > >    ierr = PetscObjectComposeFunction((PetscObject)pc,"PCISSetUseStiffnessScaling_C",NULL);CHKERRQ(ierr);
> > >    ierr = PetscObjectComposeFunction((PetscObject)pc,"PCISSetSubdomainScalingFactor_C",NULL);CHKERRQ(ierr);
> > >    ierr = PetscObjectComposeFunction((PetscObject)pc,"PCISSetSubdomainDiagonalScaling_C",NULL);CHKERRQ(ierr);
> > > diff --git a/src/ksp/pc/impls/is/pcis.h b/src/ksp/pc/impls/is/pcis.h
> > > index 4a42cf9..736ea8c 100644
> > > --- a/src/ksp/pc/impls/is/pcis.h
> > > +++ b/src/ksp/pc/impls/is/pcis.h
> > > @@ -73,6 +73,7 @@ typedef struct {
> > >    /* We need: */
> > >    /* proc[k].loc_to_glob(proc[k].shared[i][m]) == proc[l].loc_to_glob(proc[l].shared[j][m]) */
> > >    /* for all 0 <= m < proc[k].n_shared[i], or equiv'ly, for all 0 <= m < proc[l].n_shared[j] */
> > > + Mat pmat;
> > > } PC_IS;
> > >
> > > PETSC_EXTERN PetscErrorCode PCISSetUp(PC pc);
> >
> >
>
>