[petsc-users] Load distributed matrices from directory

Barry Smith bsmith at mcs.anl.gov
Mon Oct 2 07:50:18 CDT 2017


  MPCs?

  If you have a collection of "overlapping matrices" on disk, then you are responsible for providing the matrix-vector product for the operator, which you absolutely need if you are going to use any Krylov-based overlapping Schwarz method. How do you plan to perform the matrix-vector products?
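
  One way (not the only one) to supply such an operator to KSP is a MATSHELL whose multiply routine you implement yourself. Below is a minimal sketch; the context struct UserCtx and the routine names are hypothetical, and the body of the multiply is left to you, since only you know how your overlapping pieces combine into the global operator.

#include <petscmat.h>

/* Hypothetical context holding whatever your matvec needs
   (local overlapped matrix, scatters, work vectors, ...). */
typedef struct {
  Mat        Aloc;   /* this process's overlapped matrix           */
  VecScatter gtol;   /* global <-> overlapped-local scatter        */
  Vec        xl, yl; /* work vectors in the overlapped local space */
} UserCtx;

/* User-provided y = A*x; fill in with your own overlapped-piece logic. */
static PetscErrorCode UserMatMult(Mat A, Vec x, Vec y)
{
  UserCtx        *ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatShellGetContext(A, &ctx);CHKERRQ(ierr);
  /* ... combine the overlapped pieces to form y = A*x ... */
  PetscFunctionReturn(0);
}

/* Register the shell operator with local sizes m x n and global sizes M x N. */
static PetscErrorCode CreateShellOperator(MPI_Comm comm, PetscInt m, PetscInt n,
                                          PetscInt M, PetscInt N, UserCtx *ctx, Mat *A)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatCreateShell(comm, m, n, M, N, (void*)ctx, A);CHKERRQ(ierr);
  ierr = MatShellSetOperation(*A, MATOP_MULT, (void (*)(void))UserMatMult);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}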

   With regard to applying the preconditioner, that is more straightforward. Each process gets its "overlapped" matrix, and all you need to do is provide the VecScatter from a global vector to the "overlapped" vector, do the local MatMult() with your "overlapped" matrix, and then scatter-add back.
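
   In code, that scatter / local-multiply / scatter-add pattern might look like the following sketch; the VecScatter gtol (from the global vector to the overlapped local vector) is assumed to have been built elsewhere, e.g. with VecScatterCreate() from an IS of the overlapped global indices, and the names are illustrative, not an existing PETSc API.

#include <petscmat.h>

/* Sketch: global -> overlapped local, local MatMult, scatter-add back.
   Aloc is this process's overlapped matrix, gtol the global-to-local scatter,
   xl/yl local work vectors; all assumed to be set up elsewhere. */
static PetscErrorCode ApplyOverlapped(Mat Aloc, VecScatter gtol,
                                      Vec xl, Vec yl, Vec xg, Vec yg)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* gather the needed (overlapped) entries of the global vector */
  ierr = VecScatterBegin(gtol, xg, xl, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(gtol, xg, xl, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);

  /* local multiply with the overlapped matrix */
  ierr = MatMult(Aloc, xl, yl);CHKERRQ(ierr);

  /* scatter-add the local result back into the global vector */
  ierr = VecSet(yg, 0.0);CHKERRQ(ierr);
  ierr = VecScatterBegin(gtol, yl, yg, ADD_VALUES, SCATTER_REVERSE);CHKERRQ(ierr);
  ierr = VecScatterEnd(gtol, yl, yg, ADD_VALUES, SCATTER_REVERSE);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}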

  I question whether this entire process is even worth your time. Note: I am not a fan of "custom application code" for a "custom" domain decomposition method (obviously, or I would never have designed PETSc ;). I believe in a general-purpose library that you can then "customize" for your unique problem once you've determined, by profiling, that the customization/optimization is even worth it. For example, I would just start with PCASM; then, if you determine that it is not selecting good subdomains, you can add customization to tell it how you want the subdomains to be defined, etc. Some possibly useful routines for customization are listed below, followed by a small usage sketch:

PETSC_EXTERN PetscErrorCode PCASMSetLocalSubdomains(PC,PetscInt,IS[],IS[]);
PETSC_EXTERN PetscErrorCode PCASMSetTotalSubdomains(PC,PetscInt,IS[],IS[]);
PETSC_EXTERN PetscErrorCode PCASMSetOverlap(PC,PetscInt);
PETSC_EXTERN PetscErrorCode PCASMSetDMSubdomains(PC,PetscBool);
PETSC_EXTERN PetscErrorCode PCASMGetDMSubdomains(PC,PetscBool*);
PETSC_EXTERN PetscErrorCode PCASMSetSortIndices(PC,PetscBool);

PETSC_EXTERN PetscErrorCode PCASMSetType(PC,PCASMType);
PETSC_EXTERN PetscErrorCode PCASMGetType(PC,PCASMType*);
PETSC_EXTERN PetscErrorCode PCASMSetLocalType(PC,PCCompositeType);
PETSC_EXTERN PetscErrorCode PCASMGetLocalType(PC,PCCompositeType*);
PETSC_EXTERN PetscErrorCode PCASMCreateSubdomains(Mat,PetscInt,IS*[]);
PETSC_EXTERN PetscErrorCode PCASMDestroySubdomains(PetscInt,IS[],IS[]);
PETSC_EXTERN PetscErrorCode PCASMCreateSubdomains2D(PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt,PetscInt*,IS**,IS**);
PETSC_EXTERN PetscErrorCode PCASMGetLocalSubdomains(PC,PetscInt*,IS*[],IS*[]);
PETSC_EXTERN PetscErrorCode PCASMGetLocalSubmatrices(PC,PetscInt*,Mat*[]);
PETSC_EXTERN PetscErrorCode PCASMGetSubMatType(PC,MatType*);
PETSC_EXTERN PetscErrorCode PCASMSetSubMatType(PC,MatType);
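
As an illustration of the first of these, here is a hedged sketch of handing PCASM your own subdomain definition instead of letting it choose. The index arrays idx_olap (with overlap) and idx (without overlap) are placeholders you would fill from your own decomposition, and the routine name is made up for this sketch.

#include <petscksp.h>

/* Sketch: one user-defined subdomain per process, given by arrays of global
   indices idx_olap (with overlap, length nolap) and idx (without overlap,
   length nloc), both assumed to come from your own decomposition. */
static PetscErrorCode SetMySubdomains(KSP ksp, PetscInt nolap, const PetscInt idx_olap[],
                                      PetscInt nloc, const PetscInt idx[])
{
  PC             pc;
  IS             is_olap, is_loc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCASM);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_SELF, nolap, idx_olap, PETSC_COPY_VALUES, &is_olap);CHKERRQ(ierr);
  ierr = ISCreateGeneral(PETSC_COMM_SELF, nloc, idx, PETSC_COPY_VALUES, &is_loc);CHKERRQ(ierr);
  /* one subdomain on this process, defined by the two index sets */
  ierr = PCASMSetLocalSubdomains(pc, 1, &is_olap, &is_loc);CHKERRQ(ierr);
  ierr = ISDestroy(&is_olap);CHKERRQ(ierr);
  ierr = ISDestroy(&is_loc);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Before investing in any of this, running with plain -pc_type asm -pc_asm_overlap <k> gives you a cheap baseline to compare against.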

If these are not useful, you could tell us what kind of customization you want within KSP/PCASM, and depending on how generally useful it might be, we could possibly add more hooks for you.

  Barry


> On Oct 2, 2017, at 10:12 AM, Matthieu Vitse <vitse at lmt.ens-cachan.fr> wrote:
> 
> 
>> On 29 Sept 2017, at 17:43, Barry Smith <bsmith at mcs.anl.gov> wrote:
>> 
>>  Or is your matrix-generator code sequential and unable to generate the full matrix, so you want to generate chunks at a time, save them to disk, and then load them? It would be better for you to refactor your code to generate the whole thing in parallel (since you can already generate parts, the refactoring shouldn't be terribly difficult).
> 
> Thanks for your answer. 
> 
> The matrix is already generated in parallel, but we want to keep control of the decomposition, which conflicts with directly using PCASM. That's why we would really like to work only with the distributed matrices. Are there any issues that would prevent me from doing that? Moreover, ASM is only a first step; we would then like to use those matrices for multi-preconditioning our problem and to take MPCs into account (as a consequence, we really need to know the decomposition).
> 
> Thanks, 
> 
>> Matt


