[petsc-users] Understanding preallocation for MPI
Stefano Zampini
stefano.zampini at gmail.com
Mon Jul 10 04:20:39 CDT 2017
Look at the function that begins at line 1050, which is where the link should redirect to.
On 10 Jul 2017 at 11:18 AM, "Florian Lindner" <mailinglists at xgm.de> wrote:
Hey Stefano,
On 10.07.2017 at 16:36, Stefano Zampini wrote:
> Florian,
>
> Perhaps you might want to take a look at how this is done for MatIS
>
> https://bitbucket.org/petsc/petsc/src/f9d5775f43f69cbce5a7014a6ce3b24cc0e1214a/src/mat/impls/is/matis.c?at=master&fileviewer=file-view-default#matis.c-1050
What piece of code are you specifically referring to?
Line 655, MatSetValuesLocal_SubMat_IS?
And I should use ISLocalToGlobalMappingApply?
But this, if I understand correctly, would involve such a call for every index.
Right now, my preallocation loop looks like:
PetscInt d_nnz[local_rows], o_nnz[local_rows];
PetscInt mapSize;
ISLocalToGlobalMappingGetSize(ISmapping, &mapSize);
const PetscInt *mapIndizes;
ISLocalToGlobalMappingGetIndices(ISmapping, &mapIndizes); // is called just once

for (int row = row_range_start; row < row_range_end; row++) {
  PetscInt relative_row = row - row_range_start;
  d_nnz[relative_row] = 0;
  o_nnz[relative_row] = 0;

  // Band of nonzeros around the diagonal, clamped to [0, size)
  int colStart = row - nz_ratio/2 * size;
  int colEnd   = row + nz_ratio/2 * size;
  colStart = (colStart < 0) ? 0 : colStart;
  colEnd   = (colEnd > size) ? size : colEnd;

  for (int col = 0; col < global_cols; col++) {
    if (col >= colStart and col < colEnd) {
      int petsc_col = mapIndizes[col]; // should be cheap
      if (petsc_col >= col_range_start and petsc_col < col_range_end) {
        d_nnz[relative_row]++;  // column falls into the diagonal block
      } else {
        o_nnz[relative_row]++;  // column falls into the off-diagonal block
      }
    }
  }
}
ierr = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz); CHKERRQ(ierr);
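
For comparison, here is a minimal sketch of the ISLocalToGlobalMappingApply variant I was asking about above: one call per row that translates the whole column band at once, instead of one lookup per index (untested, just my assumption of how it would be used):

  // Translate the column band [colStart, colEnd) of one row in a single call.
  PetscInt nCols = colEnd - colStart;
  PetscInt cols[nCols], petsc_cols[nCols];
  for (PetscInt i = 0; i < nCols; i++)
    cols[i] = colStart + i;                                       // application ordering
  ISLocalToGlobalMappingApply(ISmapping, nCols, cols, petsc_cols); // -> PETSc ordering
  for (PetscInt i = 0; i < nCols; i++) {
    if (petsc_cols[i] >= col_range_start and petsc_cols[i] < col_range_end)
      d_nnz[relative_row]++;
    else
      o_nnz[relative_row]++;
  }

I don't know whether that would actually be faster than indexing into mapIndizes directly, though.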
Best,
Florian
>
> Stefano
>
> On 10 Jul 2017 at 10:23 AM, "Florian Lindner" <mailinglists at xgm.de> wrote:
>
> Hey,
>
> one more question about preallocation:
>
>     I can determine whether a column index is diagonal or off-diagonal using this code:
>
>     if (col >= col_range_start and col < col_range_end)
>       d_nnz[relative_row]++;
>     else
>       o_nnz[relative_row]++;
>
>
>     My code, however, uses index sets from which an ISLocalToGlobalMapping is created:
>
>     // Create a mapping from the permutation and use it for the columns
>     ierr = ISCreateGeneral(PETSC_COMM_WORLD, local_cols, permutation.data(), PETSC_COPY_VALUES, &ISlocal); CHKERRV(ierr);
>     ierr = ISSetPermutation(ISlocal); CHKERRV(ierr);
>     ierr = ISAllGather(ISlocal, &ISglobal); CHKERRV(ierr);        // Gather the IS from all processors
>     ierr = ISLocalToGlobalMappingCreateIS(ISglobal, &ISmapping); CHKERRV(ierr); // Make it a mapping
>
>     // Create an identity mapping and use that for the rows of A.
>     ierr = ISCreateStride(PETSC_COMM_WORLD, local_rows, row_range_start, 1, &ISidentity); CHKERRV(ierr);
>     ierr = ISSetIdentity(ISidentity); CHKERRV(ierr);
>     ierr = ISAllGather(ISidentity, &ISidentityGlobal); CHKERRV(ierr);
>     ierr = ISLocalToGlobalMappingCreateIS(ISidentityGlobal, &ISidentityMapping); CHKERRV(ierr);
>
>     ierr = MatSetLocalToGlobalMapping(A, ISidentityMapping, ISmapping); CHKERRV(ierr);
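>
>     As far as I understand, with the mapping attached this way the values are later inserted using application indices, roughly like the following sketch (the indices and the value here are just made-up placeholders):
>
>     PetscInt    one_row = 0, one_col = 0;  // placeholder application indices
>     PetscScalar v       = 1.0;             // placeholder value
>     ierr = MatSetValuesLocal(A, 1, &one_row, 1, &one_col, &v, INSERT_VALUES); CHKERRV(ierr);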
>
>     Since the SetPreallocation routines define the diagonal / off-diagonal blocks based on the PETSc ordering, I have to translate each col to a petsc_col.
>
> What is the best / fastest way to do that?
>
> Is that the way to go?
>
>     PetscInt mapSize;
>     ISLocalToGlobalMappingGetSize(ISmapping, &mapSize);
>     const PetscInt *mapIndizes;
>     ISLocalToGlobalMappingGetIndices(ISmapping, &mapIndizes);
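>
>     A minimal sketch of how I imagine the rest would look, under my assumption that mapIndizes[col] is the translated petsc_col (relative_row, d_nnz and o_nnz as in the check above), with the index array handed back afterwards:
>
>     for (int col = 0; col < global_cols; col++) {
>       PetscInt petsc_col = mapIndizes[col];
>       if (petsc_col >= col_range_start and petsc_col < col_range_end)
>         d_nnz[relative_row]++;
>       else
>         o_nnz[relative_row]++;
>     }
>     ISLocalToGlobalMappingRestoreIndices(ISmapping, &mapIndizes);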
>
> Thanks,
> Florian
>
>
>
> On 07.07.2017 at 17:31, Florian Lindner wrote:
> > Hello,
> >
> > I'm having some trouble understanding the preallocation for MPIAIJ matrices, especially when a value is in the off-diagonal vs. the diagonal block.
> >
> > The small example program is at https://pastebin.com/67dXnGm3
> >
> > In general it should be parallel, but right now I just run it in serial.
> >
> > According to my understanding of
> >
> > http://www.mcs.anl.gov/petsc/petsc-3.7/docs/manualpages/Mat/MatMPIAIJSetPreallocation.html
> >
> > an entry is in the diagonal submatrix if its row is in the OwnershipRange and its column is in the OwnershipRangeColumn.
> > That also means that in a serial run, there is only a diagonal submatrix.
> >
> > However, having MAT_NEW_NONZERO_ALLOCATION_ERR set, I get an error when
> > inserting 6 elements in row 2, though I have exactly
> >
> > row 2: o_nnz = 0, d_nnz = 6 (meaning 6 elements allocated in the diagonal submatrix of row 2)
> >
> > Error is:
> >
> > [0]PETSC ERROR: Argument out of range
> > [0]PETSC ERROR: New nonzero at (2,5) caused a malloc
> > Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
> >
> >
> > What is wrong with my understanding?
> >
> > Thanks,
> > Florian
> >
>