[petsc-users] MG Preconditioning

Gaetan Kenway kenway at utias.utoronto.ca
Sat Sep 1 18:02:39 CDT 2012


Hi Jed

That commit fixes the mg->galerkin issue.

I'm still having issues with the projection/restriction matrices. Are they
supposed to be serial matrices operating on each processor's local unknowns,
even when running in parallel?

Thanks,

Gaetan

On Sat, Sep 1, 2012 at 6:33 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> On Sep 1, 2012, at 5:16 PM, Gaetan Kenway <kenway at utias.utoronto.ca>
> wrote:
>
> > Hi Jed
> >
> > After probing why the following code snippet isn't run:
> >
> >   if (mg->galerkin == 1) {
> >     Mat B;
> >     /* currently only handle case where mat and pmat are the same on coarser levels */
> >         ...
> >         dB   = B;
> >       }
> >     }
> >
> > It is because after I call PCMGSetGalerkin(), mg->galerkin has a value
> of -1, not 1. From what I can tell, the value of PETSC_TRUE is
> architecture/implementation dependent. I can't see where/how mg->galerkin
> would ever obtain a value of 1 so that the code above runs.
> >
> > As for the other issue, after converting my BAIJ matrix to AIJ, running
> MatRARt() works.
> >
> > So we just have to figure out why calling
> >
> >
> >   call PCMGSetGalerkin(pc, PETSC_TRUE, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> > sets mg->galerkin to a value of -1 while the code above expects a
> value of 1.
>
>     This is a mistake in our Fortran interface. Some Fortran compilers
> (for what crazy reason I don't know) use -1 as the boolean value for true.
> Normally this doesn't matter, because we have if (boolvalue) { checks, so
> negative one and one behave the same way.
>
>     But the code   mg->galerkin = (PetscInt)use;   in PCMGSetGalerkin()
> fails, because we then check the numerical value of the result instead of
> just whether it is zero or nonzero.
>
>     I can change this conversion to   mg->galerkin = use ? 1 : 0   so that
> the Fortran value of -1 gets converted to one; this should resolve the
> problem.
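
A small stand-alone illustration (not the actual PETSc source; plain ints stand
in for PetscBool/PetscInt) of why the numeric check misses a Fortran-style true
of -1 under the old cast, but passes under the ternary conversion:

  #include <stdio.h>

  int main(void)
  {
    int use = -1;                    /* what some Fortran compilers pass for .TRUE. */

    int galerkin_old = (int)use;     /* old conversion: stores -1 */
    int galerkin_new = use ? 1 : 0;  /* fixed conversion: stores 1 */

    /* PCSetUp_MG tests the numeric value, not just zero/nonzero */
    printf("old cast:    galerkin == 1 ? %s\n", galerkin_old == 1 ? "yes" : "no"); /* no  */
    printf("ternary fix: galerkin == 1 ? %s\n", galerkin_new == 1 ? "yes" : "no"); /* yes */
    return 0;
  }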
>
>     I have pushed this fix into the petsc-3.3 and petsc-dev repositories.
> If you are using 3.3, I've attached a new src/ksp/pc/impls/mg/mg.c that you
> can drop into that directory; run make in that directory to update your
> library. If you are using petsc-dev, do an hg pull -u and rerun make in
> that directory. [see attached file: mg.c]
>
>     Thanks for determining the cause of this rather nasty bug.
>
>    Barry
>
>
> >
> > Gaetan
> >
> >
> >
> >
> > On Sat, Sep 1, 2012 at 5:34 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> > On Sat, Sep 1, 2012 at 4:20 PM, Gaetan Kenway <kenway at utias.utoronto.ca>
> wrote:
> > Sorry about the last email... I hit send too early.
> >
> > The full calling sequence is:
> >
> >
> >   call KSPCreate(SUMB_COMM_WORLD, ksp, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   useAD = .False.
> >   useTranspose = .True.
> >   usePC = .True.
> >   call setupStateResidualMatrix(drdwpret,useAD,usePC,useTranspose)
> >
> >  ! dRdwT has already been created
> >   call KSPSetOperators(ksp, dRdWT, dRdwpreT, &
> >        DIFFERENT_NONZERO_PATTERN, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   call KSPSetFromOptions(ksp, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   call KSPSetType(ksp, KSPFGMRES, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   call KSPGMRESSetRestart(ksp, adjRestart, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   call KSPGMRESSetCGSRefinementType(ksp, &
> >        KSP_GMRES_CGS_REFINE_IFNEEDED, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   call KSPGetPC(ksp, pc, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   call PCSetType(pc, PCMG, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   call PetscOptionsSetValue('-pc_mg_levels','2',PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >  ! Set command line options
> >   call PCSetFromOptions(pc, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   call PCMGSetGalerkin(pc, PETSC_TRUE, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   ! Create the restriction operator between finest and one below:
> >   call createRestrictOperator(RL1, 1)
> >
> >   call PCMGSetRestriction(pc, 1, RL1, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >   call PCSetUp(pc, PETScIerr)
> >   call EChk(PETScIerr,__FILE__,__LINE__)
> >
> >  The error occurs in PCSetUp(), which calls PCSetUp_MG(); the issue is
> in KSPGetVecs() on line 644 in mg.c. Just to clarify, this isn't a memory
> problem/segfault.
> >
> > The output below is a SEGV.
> >
> > PETSc returns control cleanly and my code exits when the error is
> detected. Therefore a backtrace or valgrind doesn't yield any additional
> information. (I ran both.)
> >
> > You can still set a breakpoint in PetscError (-start_in_debugger /
> -on_error_attach_debugger does this automatically). Then you can examine
> the stack.
> >
> > The terminal output is:
> >
> > [0]PETSC ERROR:
> ------------------------------------------------------------------------
> > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> > [0]PETSC ERROR: Try option -start_in_debugger or
> -on_error_attach_debugger
> > [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
> to find memory corruption errors
> > [0]PETSC ERROR: likely location of problem given in stack below
> > [0]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> >  =================================================================
> > PETSc or MPI Error. Error Code 73. Detected on Proc  0
> > Error at line:   130 in file: setupPETScKsp.F90
> >  =================================================================
> > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> > [0]PETSC ERROR:       INSTEAD the line number of the start of the
> function
> > [0]PETSC ERROR:       is given.
> > [0]PETSC ERROR: [0] MatGetVecs line 8142
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/mat/interface/matrix.c
> > [0]PETSC ERROR: [0] KSPGetVecs line 774
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/ksp/interface/iterativ.c
> > [0]PETSC ERROR: [0] PCSetUp_MG line 508
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/impls/mg/mg.c
> > [0]PETSC ERROR: [0] PCSetUp line 810
> /nfs/mica/home/kenway/Downloads/petsc-3.3/src/ksp/pc/interface/precon.c
> >
> > I had a look at ex42.c, but it uses a DM, which I am not using.
> >
> > It uses the DM for convenience to build the operators, but it does *not*
> tell the KSP about the DM. That is, it does not call KSPSetDM() or
> PCSetDM(). So the code in mg.c does not know about the DM.
> >
> > From the example there is the following:
> >
> >   ierr = PCMGSetLevels(pc,nlevels,PETSC_NULL);CHKERRQ(ierr);
> >   ierr = PCMGSetType(pc,PC_MG_MULTIPLICATIVE);CHKERRQ(ierr);
> >   ierr = PCMGSetGalerkin(pc,PETSC_TRUE);CHKERRQ(ierr);
> >
> >   for (k=1; k<nlevels; k++){
> >     ierr = DMCreateInterpolation(da_list[k-1],da_list[k],&R,PETSC_NULL);CHKERRQ(ierr);
> >     ierr = PCMGSetInterpolation(pc,k,R);CHKERRQ(ierr);
> >     ierr = MatDestroy(&R);CHKERRQ(ierr);
> >   }
> >
> > Is there anything special about "R" in this case?
> >
> > Nothing special, it is just an MAIJ matrix that performs the
> interpolation.
> >
> > As I mentioned before, I know what the issue is: the coarse grid
> operators are not set at all. From what I can tell, they should be set in
> the routine PCSetUp_MG() in mg.c, but that code is never run unless pc->dm
> is set.
> >
> > What code is not run unless pc->dm?
> >
> > I guess I'll just create the operators myself. Even if I modify the code
> to make it enter the if statement at line 577
> >
> > You called PCMGSetGalerkin() so the code below should run, constructing
> the coarse levels. You could try setting a breakpoint in there to see why
> it's not running.
> >
> >
> >   if (mg->galerkin == 1) {
> >     Mat B;
> >     /* currently only handle case where mat and pmat are the same on coarser levels */
> >     ierr = KSPGetOperators(mglevels[n-1]->smoothd,&dA,&dB,&uflag);CHKERRQ(ierr);
> >     if (!pc->setupcalled) {
> >       for (i=n-2; i>-1; i--) {
> >         ierr = MatPtAP(dB,mglevels[i+1]->interpolate,MAT_INITIAL_MATRIX,1.0,&B);CHKERRQ(ierr);
> >         ierr = KSPSetOperators(mglevels[i]->smoothd,B,B,uflag);CHKERRQ(ierr);
> >         if (i != n-2) {ierr = PetscObjectDereference((PetscObject)dB);CHKERRQ(ierr);}
> >         dB   = B;
> >       }
> >       if (n > 1) {ierr = PetscObjectDereference((PetscObject)dB);CHKERRQ(ierr);}
> >     } else {
> >       for (i=n-2; i>-1; i--) {
> >         ierr = KSPGetOperators(mglevels[i]->smoothd,PETSC_NULL,&B,PETSC_NULL);CHKERRQ(ierr);
> >         ierr = MatPtAP(dB,mglevels[i+1]->interpolate,MAT_REUSE_MATRIX,1.0,&B);CHKERRQ(ierr);
> >         ierr = KSPSetOperators(mglevels[i]->smoothd,B,B,uflag);CHKERRQ(ierr);
> >         dB   = B;
> >       }
> >     }
> >
> > which seems to be where it should be done... the MatPtAP will not run.
> >
> > What do you mean "will not run"?
> >
> > I'm using an MPIBAIJ matrix for 'dRdwPreT' and my restriction matrix
> 'RL1' is an MPIAIJ matrix. When I try to run MatRARt() myself with the code:
> >
> >  call MatRARt(dRdwT, RL1, MAT_INITIAL_MATRIX, 1.0, newMat, PETScIerr)
> >  call EChk(PETScIerr,__FILE__,__LINE__)
> >
> > I receive error code 56: /* no support for requested operation */ (this
> is a clean exit from PETSc)
> >
> > MatRARt (or MatPtAP, for that matter) is not implemented for BAIJ with
> an AIJ projection. The AIJ projection will lose the block structure anyway,
> so just use AIJ for both.
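
As a concrete sketch of that workaround (hypothetical names: Afine_baij stands
in for a BAIJ operator like dRdwpreT, and R for an AIJ restriction like RL1),
one could convert the operator to AIJ before forming the Galerkin product:

  #include <petscksp.h>

  /* Sketch only: form the Galerkin coarse operator R A R^T after converting
     the BAIJ fine-grid operator to AIJ, since MatRARt/MatPtAP do not support
     a BAIJ A with an AIJ projection. */
  PetscErrorCode FormCoarseOperator(Mat Afine_baij, Mat R, Mat *Acoarse)
  {
    PetscErrorCode ierr;
    Mat            Afine_aij;

    PetscFunctionBegin;
    ierr = MatConvert(Afine_baij,MATAIJ,MAT_INITIAL_MATRIX,&Afine_aij);CHKERRQ(ierr);
    ierr = MatRARt(Afine_aij,R,MAT_INITIAL_MATRIX,1.0,Acoarse);CHKERRQ(ierr);
    ierr = MatDestroy(&Afine_aij);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }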
> >
> >
> > In conclusion, I have two issues: 1) the coarse grid operators are not
> being set automatically in mg.c, and 2) using MatRARt() with an MPIBAIJ
> matrix and an MPIAIJ matrix is unsupported.
> >
> > Is there a PCMG example that doesn't use a DM to generate the
> restriction/prolongation/matrix information that I can have a look at?
> >
> >
>
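
For reference, a minimal sketch of the non-DM PCMG setup being attempted in
this thread, written against the petsc-3.3-era C API and using only calls that
appear above; A and P are hypothetical stand-ins for an assembled AIJ fine-grid
operator and an AIJ interpolation (fine rows by coarse columns):

  #include <petscksp.h>

  /* Sketch only: 2-level PCMG with a user-provided interpolation and
     Galerkin-generated coarse operator, no DM attached to the KSP/PC. */
  PetscErrorCode SetupTwoLevelMG(MPI_Comm comm, Mat A, Mat P, KSP *outksp)
  {
    PetscErrorCode ierr;
    KSP            ksp;
    PC             pc;

    PetscFunctionBegin;
    ierr = KSPCreate(comm,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);

    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCMG);CHKERRQ(ierr);
    ierr = PCMGSetLevels(pc,2,PETSC_NULL);CHKERRQ(ierr);
    ierr = PCMGSetGalerkin(pc,PETSC_TRUE);CHKERRQ(ierr);  /* coarse operator via PtAP */
    ierr = PCMGSetInterpolation(pc,1,P);CHKERRQ(ierr);    /* level 1 = finest of the two */

    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSetUp(ksp);CHKERRQ(ierr);

    *outksp = ksp;
    PetscFunctionReturn(0);
  }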

