Hello

I've almost got the PETSc preconditioner memory-freeing function to work. It now runs the actual freeing part, but segfaults when GMRES is used to solve the linear system. The relevant section of code that sets up the ksp/pc/local_pc objects is below (in Fortran), along with the C function for freeing the PC memory.
!***************** Fortran Code *********************
ksp_solver_type    = 'gmres'
ksp_subspace       = 50
asm_overlap        = 1
local_pc_ilu_level = 1
local_pc_ordering  = 'rcm'

! Set up the global KSP type and get the global PC
call KSPSetType(ksp, ksp_solver_type, ierr)
call KSPGMRESSetRestart(ksp, ksp_subspace, ierr)
call KSPSetPreconditionerSide(ksp, PC_RIGHT, ierr)
call KSPGetPC(ksp, global_pc, ierr)

! Set up global_pc
call PCSetType(global_pc, 'asm', ierr)
call PCASMSetOverlap(global_pc, asm_overlap, ierr)
call PCSetUp(global_pc, ierr)
call PCASMGetSubKSP(global_pc, nlocal, first, local_ksp, ierr)

! Set up local_pc
call KSPGetPC(local_ksp, local_pc, ierr)
call PCSetType(local_pc, 'ilu', ierr)
call PCFactorSetLevels(local_pc, local_pc_ilu_level, ierr)
call PCFactorSetMatOrderingType(local_pc, local_pc_ordering, ierr)
call KSPSetType(local_ksp, KSPPREONLY, ierr)

call KSPSetUp(ksp, ierr)
call KSPSetUpOnBlocks(ksp, ierr)

! Free memory
call PCASMFreeSpace(global_pc)
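For reference (an assumption about the compilers in use, not something stated in the thread): gfortran and ifort by default map a Fortran call to a lower-case C symbol with a trailing underscore, and PETSc objects cross the language boundary by reference. The call above therefore binds to the prototype below, which is why the C function takes a pointer to the PC:

/* Symbol that "call PCASMFreeSpace(global_pc)" binds to, assuming the
   common lowercase-plus-trailing-underscore name mangling; the PC
   arrives from Fortran as a pointer.                                 */
void pcasmfreespace_(PC *inpc);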
! ************* C Code **************************

void pcasmfreespace_(PC *inpc)
{
  PC             pc   = *inpc;
  PC_ASM         *osm = (PC_ASM*)pc->data;
  PetscErrorCode ierr;

  if (osm->pmat) {
    if (osm->n_local_true > 0) {
      ierr = MatDestroyMatrices(osm->n_local_true,&osm->pmat);CHKERRQ(ierr);
    }
    osm->pmat = 0;
  }

  if (pc->pmat) {
    ierr = MatDestroy(pc->pmat);CHKERRQ(ierr);
    pc->pmat = 0;
  }
}
! ***********************************************

When I run it, I get an error in PCApplyBAorAB:

[0]PETSC ERROR: [0] MatMult line 1877 src/mat/interface/matrix.c
[0]PETSC ERROR: [0] PCApplyBAorAB line 540 src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] GMREScycle line 132 src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: [0] KSPSolve_GMRES line 227 src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: [0] KSPSolve line 308 src/ksp/ksp/interface/itfunc.c

It appears, then, that you must keep the unfactored preconditioner matrix in memory. Is this actually the case? Or have I done something silly here?
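A more conservative variant may be worth trying. This is an untested sketch: the name pcasmfreesubmats_ is hypothetical, and it assumes the same PC_ASM typedef (and private/pcimpl.h include) quoted later in this thread. It frees only the overlapping subdomain copies of the preconditioner matrix and leaves the global Pmat alone, in case KSPSolve still references pc->pmat:

/* Untested sketch: destroy only the per-subdomain Pmat copies that
   PCASMSetUp extracted, leaving the global Pmat untouched in case
   the KSP still needs it during KSPSolve.                          */
void pcasmfreesubmats_(PC *inpc)
{
  PC             pc   = *inpc;
  PC_ASM         *osm = (PC_ASM*)pc->data;
  PetscErrorCode ierr;

  if (osm->pmat && osm->n_local_true > 0) {
    ierr = MatDestroyMatrices(osm->n_local_true,&osm->pmat);
    CHKERRABORT(PETSC_COMM_SELF,ierr); /* CHKERRQ cannot return from a void function */
    osm->pmat = 0;
  }
}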
Thanks,
Gaetan

On Aug 25, 2011, at 1:29 PM, Gaetan Kenway wrote:

> Hello
>
> I've managed to get the C function for freeing the preconditioner
> memory written. The contents of my new 'pcasmfreespace.c' is below:
>
> #include <private/pcimpl.h>   /*I "petscpc.h" I*/
>
> typedef struct {
>   PetscInt   n, n_local, n_local_true;
>   PetscInt   overlap;            /* overlap requested by user */
>   KSP        *ksp;               /* linear solvers for each block */
>   VecScatter *restriction;       /* mapping from global to subregion */
>   VecScatter *localization;      /* mapping from overlapping to non-overlapping subregion */
>   VecScatter *prolongation;      /* mapping from subregion to global */
>   Vec        *x,*y,*y_local;     /* work vectors */
>   IS         *is;                /* index set that defines each overlapping subdomain */
>   IS         *is_local;          /* index set that defines each non-overlapping subdomain, may be NULL */
>   Mat        *mat,*pmat;         /* mat is not currently used */
>   PCASMType  type;               /* use reduced interpolation, restriction or both */
>   PetscInt   type_set;           /* if user set this value (so won't change it for symmetric problems) */
>   PetscInt   same_local_solves;  /* flag indicating whether all local solvers are same */
>   PetscInt   sort_indices;       /* flag to sort subdomain indices */
> } PC_ASM;
>
> void pcasmfreespace_(PC pc)
> {
>   PC_ASM *osm = (PC_ASM*)pc->data;
>   PetscErrorCode ierr;
>
>   if (osm->pmat) {
>     if (osm->n_local_true > 0) {
>       ierr = MatDestroyMatrices(osm->n_local_true,&osm->pmat);CHKERRQ(ierr);
>     }
>     osm->pmat = 0;
>   }
>
>   if (pc->pmat) {ierr = MatDestroy(pc->pmat);CHKERRQ(ierr); pc->pmat = 0;}
> }
>
> Note the underscore, as I'm trying to call it from Fortran. When I call it from Fortran, I use:
>
>   call pcasmfreespace(global_pc)
>
> This calls the function OK, but (according to Valgrind) I have an invalid read on the line containing:
>
>   if (osm->pmat) {
>
> I suspect this has something to do with passing the Fortran
> "pointer" of the PC to C, or something along those lines. Is there
> anything else special you have to do to pass the "Fortran" PETSc objects
> to C? (See the sketch after this message.)
>
> Thanks
>
> Gaetan
>
>
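To make the convention in question concrete, a minimal sketch: PETSc objects on the Fortran side are opaque handles passed by reference, so a Fortran-callable C function must take a pointer and dereference it. This is exactly the fix used in the version at the top of this thread:

/* Fortran passes PETSc objects by reference: the C wrapper must take
   PC* and dereference it. Receiving PC by value, as in the quoted
   function above, makes the pc->data access an invalid read.        */
void pcasmfreespace_(PC *inpc)
{
  PC pc = *inpc;   /* the actual PC object */
  /* ... operate on pc as in the quoted function ... */
}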
> Message: 2
> Date: Sun, 21 Aug 2011 17:22:28 -0500
> From: Barry Smith <bsmith@mcs.anl.gov>
> Subject: Re: [petsc-users] Freeing Preconditioner Matrix Space
> To: PETSc users list <petsc-users@mcs.anl.gov>
> Message-ID: <953121EF-B6AC-42EE-87BE-D4402C121652@mcs.anl.gov>
> Content-Type: text/plain; charset=us-ascii
>
>
> You don't need to put that in the PETSc source. Just build it in
> the same directory where you build your application and link it in like any of
> your application code. You will need to stick
>
> #include <private/pcimpl.h>   /*I "petscpc.h" I*/
>
> typedef struct {
>   PetscInt   n, n_local, n_local_true;
>   PetscInt   overlap;            /* overlap requested by user */
>   KSP        *ksp;               /* linear solvers for each block */
>   VecScatter *restriction;       /* mapping from global to subregion */
>   VecScatter *localization;      /* mapping from overlapping to non-overlapping subregion */
>   VecScatter *prolongation;      /* mapping from subregion to global */
>   Vec        *x,*y,*y_local;     /* work vectors */
>   IS         *is;                /* index set that defines each overlapping subdomain */
>   IS         *is_local;          /* index set that defines each non-overlapping subdomain, may be NULL */
>   Mat        *mat,*pmat;         /* mat is not currently used */
>   PCASMType  type;               /* use reduced interpolation, restriction or both */
>   PetscBool  type_set;           /* if user set this value (so won't change it for symmetric problems) */
>   PetscBool  same_local_solves;  /* flag indicating whether all local solvers are same */
>   PetscBool  sort_indices;       /* flag to sort subdomain indices */
> } PC_ASM;
>
> before the subroutine.
>
> Barry
>
> On Aug 21, 2011, at 3:24 PM, Gaetan Kenway wrote:
>
> > Hello
> >
> > I am attempting to implement a "hack" that was posted on the
> > list a while back. I'm working with the adjoint linear system solver for
> > a CFD solver. I'm using the ASM (or block Jacobi) preconditioner with
> > ILU(p) on each of the subdomains. I use a different preconditioner
> > matrix (Pmat) than the actual Jacobian. What I want to do is destroy the
> > Pmat memory after the ILU factorization is performed. The hack that was
> > posted is copied below:
> > PetscErrorCode PCASMFreeSpace(PC pc)
> > {
> >   PC_ASM *osm = (PC_ASM*)pc->data;
> >   PetscErrorCode ierr;
> >
> >   if (osm->pmat) {
> >     if (osm->n_local_true > 0) {
> >       ierr = MatDestroyMatrices(osm->n_local_true,&osm->pmat);CHKERRQ(ierr);
> >     }
> >     osm->pmat = 0;
> >   }
> >   if (pc->pmat) {ierr = MatDestroy(pc->pmat);CHKERRQ(ierr); pc->pmat = 0;}
> >
> >   return 0;
> > }
> >
> >
> > However, I've had no luck actually getting the function
> > compiled into PETSc. There are no errors reported when I type "make" in
> > the asm directory, but when I try to use the function in my application
> > it can't find the symbol while linking. Where does it go in the asm.c
> > file? Does it use "static PetscErrorCode" or "PetscErrorCode
> > PETSCKSP_DLLEXPORT"? Does it have to be added to the .h include files?
> > What has to be done for it to work with Fortran?
> >
> > Any suggestions would be greatly appreciated. This represents
> > a significant chunk of my application's memory (10%-30%), and as such
> > it's too much to ignore. Also, is there any chance something like this
> > would make it into an actual PETSc release?
> >
> > Gaetan
> >
>
>