[petsc-users] Preallocation (dnz, onz arrays) in sparse parallel matrix
Barry Smith
bsmith at mcs.anl.gov
Fri Oct 6 14:57:13 CDT 2017
> On Oct 6, 2017, at 4:08 PM, Thibaut Appel <t.appel17 at imperial.ac.uk> wrote:
>
> Dear PETSc users,
>
> I am trying to assemble a sparse matrix in parallel where my main objective is efficiency and scalability.
>
> Precisely, I am using MatMPIAIJSetPreallocation with the dnz and onz arrays, which give, for each row, the number of nonzero elements in the diagonal and off-diagonal blocks, to allocate exactly the memory needed.
>
> Prior to inserting the elements in the matrix, I do a preliminary loop to determine the arrays dnz and onz on each processor for the rows it owns. Ideally, this loop would look like
>
> for irow = istart to iend-1:  count dnz(irow) and onz(irow)
> But it seems that you cannot call MatGetOwnershipRange(Mat,istart,iend,ierr) before MatMPIAIJSetPreallocation to get istart and iend. Why is that?
> Which approach should be followed to count the nonzero elements on each processor? I saw two conversations in which Barry Smith suggested using MatPreallocateInitialize/Finalize or PetscSplitOwnership, which means you have to determine the rows owned by each processor yourself. Is that not contrary to the "PETSc spirit"?
Use PetscSplitOwnership() to determine the ownership ranges.
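A minimal C sketch of that workflow (the global size N and the tridiagonal counting are assumed purely for illustration; replace the counting with your own stencil/connectivity):

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscInt       N = 1000;          /* global number of rows (assumed)    */
  PetscInt       n = PETSC_DECIDE;  /* local size, filled in by PETSc     */
  PetscInt       istart, iend, irow;
  PetscInt      *dnz, *onz;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* Split N rows across the communicator; n receives this process's share */
  ierr = PetscSplitOwnership(PETSC_COMM_WORLD, &n, &N);CHKERRQ(ierr);

  /* Prefix-sum the local sizes to get this process's row range [istart,iend) */
  ierr = MPI_Scan(&n, &iend, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);CHKERRQ(ierr);
  istart = iend - n;

  ierr = PetscCalloc2(n, &dnz, n, &onz);CHKERRQ(ierr);
  for (irow = istart; irow < iend; irow++) {
    PetscInt r = irow - istart;
    dnz[r] = 1;                     /* the diagonal entry itself           */
    /* neighbor columns inside [istart,iend) go to dnz, others to onz     */
    if (irow > 0)     { if (irow - 1 >= istart) dnz[r]++; else onz[r]++; }
    if (irow < N - 1) { if (irow + 1 <  iend)   dnz[r]++; else onz[r]++; }
  }

  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, n, n, N, N);CHKERRQ(ierr);
  ierr = MatSetType(A, MATMPIAIJ);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A, 0, dnz, 0, onz);CHKERRQ(ierr);

  /* ... MatSetValues() loop over rows [istart,iend), MatAssemblyBegin/End() ... */

  ierr = PetscFree2(dnz, onz);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}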
The reason you cannot call MatGetOwnershipRange() before setting the preallocation is simply the design of the constructor for the matrix class. You are correct that it would be possible to refactor the code to have more steps in the constructor, allowing an earlier call to MatGetOwnershipRange().
We welcome pull requests but are unlikely to make the change ourselves. The reason is that normally one is working with a mesh data structure (often a DM) that provides the ownership ranges based on its decomposition, rather than having the matrix decide on them, and hence one does not normally call MatSetSizes() and let the matrix determine the ownership ranges.
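For example, with a DMDA the DM owns the decomposition, and DMCreateMatrix() hands back a correctly sized and preallocated parallel matrix, so no manual dnz/onz counting is needed. A sketch, with the grid size, dof, and stencil width assumed:

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM             da;
  Mat            A;
  PetscInt       istart, iend;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* 1d distributed grid: 1000 points, 1 dof, stencil width 1 (all assumed) */
  ierr = DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 1000, 1, 1, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);

  /* The DM's decomposition determines the matrix layout and preallocation */
  ierr = DMCreateMatrix(da, &A);CHKERRQ(ierr);

  /* The ownership range is available immediately on the returned matrix */
  ierr = MatGetOwnershipRange(A, &istart, &iend);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}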
Barry
> Thanks for your help and have a nice weekend
>
> Thibaut