[petsc-users] Understanding preallocation for MPI

Barry Smith bsmith at mcs.anl.gov
Mon Jul 10 13:45:42 CDT 2017


  You might consider using http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatPreallocateInitialize.html and http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatPreallocateSetLocal.html#MatPreallocateSetLocal and friends

   These take out some of the busywork of setting the preallocation arrays. They are macros in petscmat.h so even if you don't use them you can see the code they use.
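
   For reference, the usage pattern looks roughly like the following sketch (local_rows, local_cols, the two mappings and the per-row column lists are placeholders for whatever your code already has; the exact macro arguments are in petscmat.h):

     PetscInt *d_nnz, *o_nnz;

     /* allocates and zeroes d_nnz/o_nnz for the local rows; do not malloc/free them yourself */
     ierr = MatPreallocateInitialize(PETSC_COMM_WORLD, local_rows, local_cols, d_nnz, o_nnz);CHKERRQ(ierr);

     for (PetscInt r = 0; r < local_rows; r++) {
       PetscInt row = r;   /* the macros may modify the index arrays they are given, so pass copies */
       ierr = MatPreallocateSetLocal(rowMapping, 1, &row, colMapping, ncols_in_row[r], cols_in_row[r], d_nnz, o_nnz);CHKERRQ(ierr);
     }

     ierr = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr);

     /* frees d_nnz/o_nnz; must be in the same scope as MatPreallocateInitialize */
     ierr = MatPreallocateFinalize(d_nnz, o_nnz);CHKERRQ(ierr);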

   Barry

> On Jul 10, 2017, at 3:22 AM, Florian Lindner <mailinglists at xgm.de> wrote:
> 
> Hey,
> 
> one more question about preallocation:
> 
> I can determine whether a column index belongs to the diagonal or the off-diagonal block using this code:
> 
> if (col >= col_range_start and col < col_range_end)
>    d_nnz[relative_row]++;
> else
>    o_nnz[relative_row]++;
> 
> 
> My code, however, uses index sets from which an ISLocalToGlobalMapping is created:
> 
>  // Create a mapping from the permutation and use it for the cols
>  ierr = ISCreateGeneral(PETSC_COMM_WORLD, local_cols, permutation.data(), PETSC_COPY_VALUES, &ISlocal); CHKERRV(ierr);
>  ierr = ISSetPermutation(ISlocal); CHKERRV(ierr);
>  ierr = ISAllGather(ISlocal, &ISglobal); CHKERRV(ierr); // Gather the IS from all processors
>  ierr = ISLocalToGlobalMappingCreateIS(ISglobal, &ISmapping); CHKERRV(ierr); // Make it a mapping
> 
>  // Create an identity mapping and use that for the rows of A.
>  ierr = ISCreateStride(PETSC_COMM_WORLD, local_rows, row_range_start, 1, &ISidentity); CHKERRV(ierr);
>  ierr = ISSetIdentity(ISidentity); CHKERRV(ierr);
>  ierr = ISAllGather(ISidentity, &ISidentityGlobal); CHKERRV(ierr);
>  ierr = ISLocalToGlobalMappingCreateIS(ISidentityGlobal, &ISidentityMapping); CHKERRV(ierr);
> 
>  ierr = MatSetLocalToGlobalMapping(A, ISidentityMapping, ISmapping); CHKERRV(ierr);
> 
> Since the SetPreallocation routines define the diagonal / off-diagonal blocks based on the PETSc ordering, I have to
> translate the application col to a petsc_col.
> 
> What is the best / fastest way to do that?
> 
> Is this the way to go?
> 
>  PetscInt mapSize;
>  ISLocalToGlobalMappingGetSize(ISmapping, &mapSize);
>  const PetscInt *mapIndizes;
>  ISLocalToGlobalMappingGetIndices(ISmapping, &mapIndizes);
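> 
> ... and then, for every entry, translate the column before counting it, roughly like this (just a sketch of what I have in mind, reusing the names from above):
> 
>   PetscInt petsc_col = mapIndizes[col];   // application col -> PETSc col, if my understanding of the mapping is right
>   if (petsc_col >= col_range_start and petsc_col < col_range_end)
>     d_nnz[relative_row]++;
>   else
>     o_nnz[relative_row]++;
> 
>   // and, once all rows are counted:
>   ISLocalToGlobalMappingRestoreIndices(ISmapping, &mapIndizes);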
> 
> Thanks,
> Florian
> 
> 
> 
> On 07.07.2017 at 17:31, Florian Lindner wrote:
>> Hello,
>> 
>> I'm having some trouble understanding preallocation for MPIAIJ matrices, especially whether a value belongs to the
>> off-diagonal or the diagonal block.
>> 
>> The small example program is at https://pastebin.com/67dXnGm3
>> 
>> In general it should be parallel, but right now I just run it in serial.
>> 
>> According to my understanding of
>> 
>> http://www.mcs.anl.gov/petsc/petsc-3.7/docs/manualpages/Mat/MatMPIAIJSetPreallocation.html
>> 
>> an entry is in the diagonal submatrix if its row is in the OwnershipRange and its column is in the OwnershipRangeColumn.
>> That also means that in a serial run, there is only a diagonal submatrix.
>> 
>> However, with MAT_NEW_NONZERO_ALLOCATION_ERR set, I get an error when
>> 
>> inserting 6 elements in row 2, even though I have exactly
>> 
>> row 2: o_nnz = 0, d_nnz = 6 (meaning 6 elements allocated in the diagonal submatrix of row 2)
>> 
>> Error is:
>> 
>> [0]PETSC ERROR: Argument out of range
>> [0]PETSC ERROR: New nonzero at (2,5) caused a malloc
>> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
>> 
>> 
>> What is wrong with my understanding?
>> 
>> Thanks,
>> Florian
>> 


