MATMPIAIJ preallocation may not be cleaned by MatDestroy?
xiaoyin ji
sapphire.jxy at gmail.com
Thu Sep 24 12:10:57 CDT 2009
Hi all,
I've finally fixed the problem with the solve time increasing over repeated calls to the KSP solver. The fix was to set the matrix type to MATAIJ instead of MATMPIAIJ. Here is the new code for the matrix construction.
MatCreate(PETSC_COMM_WORLD,&A);
MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,ngridtot,ngridtot);
MatSetType(A,MATAIJ);
MatSetFromOptions(A);
MPI_Comm_size(PETSC_COMM_WORLD,&size);
if (size > 1) {
  // preallocate 7 diagonal and 2 off-diagonal nonzeros per row
  MatMPIAIJSetPreallocation(A,7,PETSC_NULL,2,PETSC_NULL);
} else {
  // preallocate 7 nonzeros per row
  MatSeqAIJSetPreallocation(A,7,PETSC_NULL);
}
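For context, here is a rough sketch of how this construction sits inside the repeated solve. The step loop, nsteps, and the MatSetValues part are placeholders for my application code, and the Destroy calls use the old PETSc 3.0-style argument (the object itself, not a pointer):

Mat A; Vec x,b; KSP ksp; PetscInt step, nsteps; int size;
for (step = 0; step < nsteps; step++) {
  MatCreate(PETSC_COMM_WORLD,&A);
  MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,ngridtot,ngridtot);
  MatSetType(A,MATAIJ);
  MatSetFromOptions(A);
  MPI_Comm_size(PETSC_COMM_WORLD,&size);
  if (size > 1) MatMPIAIJSetPreallocation(A,7,PETSC_NULL,2,PETSC_NULL);
  else          MatSeqAIJSetPreallocation(A,7,PETSC_NULL);
  /* ... insert entries with MatSetValues() ... */
  MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
  MatGetVecs(A,&x,&b);
  /* ... fill the right-hand side b ... */
  KSPCreate(PETSC_COMM_WORLD,&ksp);
  KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp,b,x);
  /* everything is destroyed each step, which is why the growing solve time looked like a leak */
  KSPDestroy(ksp);
  VecDestroy(x);
  VecDestroy(b);
  MatDestroy(A);
}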
The older version is shown below; there the matrix type is set explicitly to either MATMPIAIJ or MATSEQAIJ depending on the communicator size. This code works fine when called from Fortran subroutines, but in C++ it makes the KSP solver get slower and slower, roughly like a parabola, which may indicate a memory leak.
MatCreate(world,&A);
MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,ngridtot,ngridtot);
// set up the sparse matrix in either MATMPIAIJ or MATSEQAIJ format
MPI_Comm_size(world,&size);
if (size > 1) {
  MatSetType(A,MATMPIAIJ);
  MatSetFromOptions(A);
  MatMPIAIJSetPreallocation(A,7,PETSC_NULL,2,PETSC_NULL);
} else {
  MatSetType(A,MATSEQAIJ);
  MatSetFromOptions(A);
  MatSeqAIJSetPreallocation(A,7,PETSC_NULL);
}
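To check whether memory really keeps growing with the old version, a simple thing to add after each solve (just a sketch; the step counter is a placeholder) is to print the current memory usage. Running with the -malloc_dump option should also list any memory PETSc sees as never freed at PetscFinalize.

PetscLogDouble rss, mal;
PetscMemoryGetCurrentUsage(&rss);   /* resident memory of this process */
PetscMallocGetCurrentUsage(&mal);   /* bytes currently allocated through PetscMalloc */
PetscPrintf(PETSC_COMM_WORLD,"step %d: memory %g, PetscMalloc %g\n",step,rss,mal);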
However, I think the PETSc library still has a problem here, since both versions of the code should work from both C++ and Fortran. Thanks.
Best,
Xiaoyin Ji