[petsc-users] Freeing Preconditioner Memory

Gaetan Kenway kenway at utias.utoronto.ca
Fri Aug 26 10:17:35 CDT 2011


Hello

I've almost got the PETSc preconditioner memory-freeing function to work. It
now runs the actual freeing part, but segfaults when GMRES is used to solve
the linear system. The relevant section of code that sets up the
ksp/pc/local_pc objects is below (in Fortran), as well as the C function for
freeing the PC memory.

!***************** Fortran Code *********************
  ksp_solver_type = 'gmres'
  ksp_subspace = 50
  asm_overlap = 1
  local_pc_ilu_level = 1
  local_pc_ordering = 'rcm'

  ! Set up the global KSP and get the global PC
  call KSPSetType(ksp, ksp_solver_type, ierr)
  call KSPGMRESSetRestart(ksp, ksp_subspace, ierr)
  call KSPSetPreconditionerSide(ksp, PC_RIGHT, ierr)
  call KSPGetPC(ksp, global_pc, ierr)

  ! Set up global_pc
  call PCSetType(global_pc, 'asm', ierr)
  call PCASMSetOverlap(global_pc, asm_overlap, ierr)
  call PCSetUp(global_pc, ierr)
  call PCASMGetSubKSP(global_pc, nlocal, first, local_ksp, ierr)

  ! Set up local_pc
  call KSPGetPC(local_ksp, local_pc, ierr)
  call PCSetType(local_pc, 'ilu', ierr)
  call PCFactorSetLevels(local_pc, local_pc_ilu_level, ierr)
  call PCFactorSetMatOrderingType(local_pc, local_pc_ordering, ierr)
  call KSPSetType(local_ksp, KSPPREONLY, ierr)

  call KSPSetUp(ksp, ierr)
  call KSPSetUpOnBlocks(ksp, ierr)

  ! Free memory
  call PCASMFreeSpace(global_pc)

! ************* C Code **************************

/* Requires #include <private/pcimpl.h> and the PC_ASM struct
   definition (quoted below) ahead of this function. */
void pcasmfreespace_(PC *inpc)
{
  PC             pc   = *inpc;
  PC_ASM         *osm = (PC_ASM*)pc->data;
  PetscErrorCode ierr;

  if (osm->pmat) {
    if (osm->n_local_true > 0) {
      ierr = MatDestroyMatrices(osm->n_local_true,&osm->pmat);
      CHKERRABORT(PETSC_COMM_SELF,ierr); /* CHKERRQ cannot return from a void function */
    }
    osm->pmat = 0;
  }

  if (pc->pmat) {
    ierr = MatDestroy(pc->pmat);
    CHKERRABORT(PETSC_COMM_SELF,ierr);
    pc->pmat = 0;
  }
}
! *********************************************
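
(As an aside, PETSc's own Fortran stubs normally hand the error code back
through a trailing argument instead of aborting; a sketch of that variant,
with a hypothetical name pcasmfreespace2_ and assuming the same PC_ASM
definition is in scope, would be:

   void pcasmfreespace2_(PC *inpc, PetscErrorCode *ierr)
   {
     PC      pc  = *inpc;
     PC_ASM *osm = (PC_ASM*)pc->data;

     *ierr = 0;
     if (osm->pmat) {
       if (osm->n_local_true > 0) {
         *ierr = MatDestroyMatrices(osm->n_local_true,&osm->pmat);
         if (*ierr) return;
       }
       osm->pmat = 0;
     }
     if (pc->pmat) {
       *ierr = MatDestroy(pc->pmat); /* MatDestroy takes a Mat in this PETSc version */
       if (*ierr) return;
       pc->pmat = 0;
     }
   }

The Fortran side would then pass ierr as the last argument:
call pcasmfreespace2(global_pc, ierr).)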

When I run it, I get an error in PCApplyBAorAB:
[0]PETSC ERROR: [0] MatMult line 1877 src/mat/interface/matrix.c
[0]PETSC ERROR: [0] PCApplyBAorAB line 540 src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] GMREScycle line 132 src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: [0] KSPSolve_GMRES line 227 src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: [0] KSPSolve line 308 src/ksp/ksp/interface/itfunc.c

It appears, then, that the unfactored preconditioner matrix must be kept in
memory. Is this actually the case, or have I done something silly here?
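
(For context, the operator and preconditioner matrices are attached to the
KSP separately, and GMRES only ever applies MatMult with the operator; a
minimal sketch of that distinction, with placeholder names, is:

   #include "petscksp.h"

   /* Sketch: GMRES applies MatMult with the Amat argument (A), while
      ASM/ILU factors the Pmat argument (P); A must therefore stay
      alive for the whole solve even if the space used by P is
      reclaimed after the factorization. */
   PetscErrorCode AttachOperators(KSP ksp, Mat A, Mat P)
   {
     PetscErrorCode ierr;
     ierr = KSPSetOperators(ksp,A,P,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
     return 0;
   }

This is the setup I have in mind when I say only the Pmat side should be
free-able.)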

Thanks,
Gaetan



On Aug 25, 2011, at 1:29 PM, Gaetan Kenway wrote:

> Hello
>
> I've managed to get the C function for freeing preconditioner memory written. The contents of my new 'pcasmfreespace.c' are below:
>
> #include <private/pcimpl.h>     /*I "petscpc.h" I*/
>
> typedef struct {
>   PetscInt   n, n_local, n_local_true;
>   PetscInt   overlap;             /* overlap requested by user */
>   KSP        *ksp;                /* linear solvers for each block */
>   VecScatter *restriction;        /* mapping from global to subregion */
>   VecScatter *localization;       /* mapping from overlapping to non-overlapping subregion */
>   VecScatter *prolongation;       /* mapping from subregion to global */
>   Vec        *x,*y,*y_local;      /* work vectors */
>   IS         *is;                 /* index set that defines each overlapping subdomain */
>   IS         *is_local;           /* index set that defines each non-overlapping subdomain, may be NULL */
>   Mat        *mat,*pmat;          /* mat is not currently used */
>   PCASMType  type;                /* use reduced interpolation, restriction or both */
>   PetscInt   type_set;            /* if user set this value (so won't change it for symmetric problems) */
>   PetscInt   same_local_solves;   /* flag indicating whether all local solvers are same */
>   PetscInt   sort_indices;        /* flag to sort subdomain indices */
> } PC_ASM;
>
> void pcasmfreespace_(PC pc)
> {
>   PC_ASM         *osm = (PC_ASM*)pc->data;
>   PetscErrorCode ierr;
>
>   if (osm->pmat) {
>     if (osm->n_local_true > 0) {
>       ierr = MatDestroyMatrices(osm->n_local_true,&osm->pmat);CHKERRQ(ierr);
>     }
>     osm->pmat = 0;
>   }
>
>   if (pc->pmat) {ierr = MatDestroy(pc->pmat);CHKERRQ(ierr); pc->pmat = 0;}
> }
>
> Note the trailing underscore, since I'm calling it from Fortran. From Fortran, I call it as:
>
> call pcasmfreespace(global_pc)
>
> This calls the function OK, but (according to Valgrind) I have an invalid read on the line containing:
>
> if (osm->pmat) {
>
> I suspect this has something to do with passing the Fortran "pointer" of the PC to C, or something along those lines. Is there anything else special you have to do to pass Fortran PETSc objects to C?
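>
> In case it clarifies what I mean: Fortran passes everything by reference, so my guess is the C side has to receive a pointer and dereference it, along these lines (myfunc_ is just a placeholder name):
>
>   #include <private/pcimpl.h>
>
>   /* Sketch of the calling convention only: the Fortran statement
>      "call myfunc(global_pc)" passes the address of the PC, so the
>      C routine must take PC* and dereference it. */
>   void myfunc_(PC *inpc)
>   {
>     PC pc = *inpc;   /* the actual PC object passed from Fortran */
>     (void)pc;        /* placeholder: real code would operate on pc */
>   }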
>
> Thanks
>
> Gaetan
>
>
> Message: 2
> Date: Sun, 21 Aug 2011 17:22:28 -0500
> From: Barry Smith <bsmith at mcs.anl.gov>
> Subject: Re: [petsc-users] Freeing Preconditioner Matrix Space
> To: PETSc users list <petsc-users at mcs.anl.gov>
> Message-ID: <953121EF-B6AC-42EE-87BE-D4402C121652 at mcs.anl.gov>
> Content-Type: text/plain; charset=us-ascii
>
>
>  You don't need to put that in the PETSc source. Just build it in the same directory where you build your application and link it in like any of your application code. You will need to stick
> #include <private/pcimpl.h>     /*I "petscpc.h" I*/
>
> typedef struct {
>   PetscInt   n, n_local, n_local_true;
>   PetscInt   overlap;             /* overlap requested by user */
>   KSP        *ksp;                /* linear solvers for each block */
>   VecScatter *restriction;        /* mapping from global to subregion */
>   VecScatter *localization;       /* mapping from overlapping to non-overlapping subregion */
>   VecScatter *prolongation;       /* mapping from subregion to global */
>   Vec        *x,*y,*y_local;      /* work vectors */
>   IS         *is;                 /* index set that defines each overlapping subdomain */
>   IS         *is_local;           /* index set that defines each non-overlapping subdomain, may be NULL */
>   Mat        *mat,*pmat;          /* mat is not currently used */
>   PCASMType  type;                /* use reduced interpolation, restriction or both */
>   PetscBool  type_set;            /* if user set this value (so won't change it for symmetric problems) */
>   PetscBool  same_local_solves;   /* flag indicating whether all local solvers are same */
>   PetscBool  sort_indices;        /* flag to sort subdomain indices */
> } PC_ASM;
>
> before the subroutine.
>
>  Barry
>
> On Aug 21, 2011, at 3:24 PM, Gaetan Kenway wrote:
>
> > Hello
> >
> > I am attempting to implement a "hack" that was posted on the list a while back. I'm working with the adjoint linear system solver for a CFD solver. I'm using the ASM (or block Jacobi) preconditioner with ILU(p) on each of the subdomains. I use a different preconditioner matrix (Pmat) than the actual Jacobian. What I want to do is destroy the Pmat memory after the ILU factorization is performed. The hack that was posted is copied below:
> >  PetscErrorCode PCASMFreeSpace(PC pc)
> >  {
> >    PC_ASM         *osm = (PC_ASM*)pc->data;
> >    PetscErrorCode ierr;
> >
> >    if (osm->pmat) {
> >      if (osm->n_local_true > 0) {
> >        ierr = MatDestroyMatrices(osm->n_local_true,&osm->pmat);CHKERRQ(ierr);
> >      }
> >      osm->pmat = 0;
> >    }
> >    if (pc->pmat) {ierr = MatDestroy(pc->pmat);CHKERRQ(ierr); pc->pmat = 0;}
> >
> >    return 0;
> >  }
> >
> > However, I've had no luck actually getting the function compiled into PETSc. There are no errors reported when I type "make" in the asm directory, but when I try to use the function in my application, it can't find the symbol while linking. Where does it go in the asm.c file? Does it use "static PetscErrorCode" or "PetscErrorCode PETSCKSP_DLLEXPORT"? Does it have to be added to the .h include files? What has to be done for it to work with Fortran?
> >
> > Any suggestions would be greatly appreciated. This represents a significant chunk of my application's memory (10%-30%), and as such it's too much to ignore. Also, is there any chance something like this would make it into an actual PETSc release?
> >
> > Gaetan
> >
>
>