[petsc-users] PETSc set off-diagonal matrix pre-allocation

Matthew Knepley knepley at gmail.com
Tue Feb 14 07:26:34 CST 2017


On Tue, Feb 14, 2017 at 6:41 AM, Andrew Ho <andrewh0 at uw.edu> wrote:

> I have a 3x6 matrix, and I want to set it to (just as an example):
>
> 0 0 0 1 1 1
> 0 0 0 1 1 1
> 0 0 0 1 1 1
>
> From what I can tell from the documentation, the MPIAIJ preallocation
> for this matrix is:
>

I believe this is an inconsistency in PETSc for rectangular matrices. We
divide matrices into diagonal and off-diagonal parts mainly to separate
communication from computation. Thus on 1 proc we never divide them, even
if the matrix is rectangular, and therefore we misinterpret your
preallocation.
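
Concretely: with MatSetSizes(A, global_height, PETSC_DECIDE, global_height,
global_width) on a single process, the entire column range is locally
owned, so every entry of your 3x6 matrix falls in the "diagonal" block.
A minimal sketch of a preallocation that avoids the malloc error in this
1-proc case (replacing the corresponding lines in your test code):

    PetscInt d_nnz[] = {3, 3, 3}; /* on 1 proc all columns are locally owned, */
    PetscInt o_nnz[] = {0, 0, 0}; /* so every nonzero counts as "diagonal"    */
    err = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
    CHKERRQ(err);

Alternatively, as the error message suggests, MatSetOption(A,
MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) turns the check off, at the
cost of extra mallocs during assembly.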

Barry, do you think we want to try and change this?

  Matt


> d_nnz = [0, 0, 0]
> o_nnz = [3, 3, 3]
>
> However, when I do this, I get the following error:
>
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: Argument out of range
>> [0]PETSC ERROR: New nonzero at (0,3) caused a malloc
>> Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.7.5, unknown
>> [0]PETSC ERROR: ./ex3 on a arch-linux2-c-opt Tue Feb 14 04:23:56 2017
>> [0]PETSC ERROR: Configure options --with-debugging=0 --COPTFLAGS="-O3 -march=native" --CXXOPTFLAGS="-O3 -march=native" --FOPTFLAGS="-O3 -march=native"
>> [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 582 in petsc/src/mat/impls/aij/mpi/mpiaij.c
>> [0]PETSC ERROR: #2 MatSetValues() line 1190 in petsc/src/mat/interface/matrix.c
>> [0]PETSC ERROR: #3 main() line 36 in ex3.c
>> [0]PETSC ERROR: No PETSc Option Table entries
>> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>
>
> Here's a complete test program that reproduces this:
>
>> #include <petsc.h>
>> #include <mpi.h>
>> int main(int argc, char** argv)
>> {
>>   PetscErrorCode err;
>>   err = PetscInitialize(&argc, &argv, NULL, "help");
>>   CHKERRQ(err);
>>   // create a sparse AIJ matrix distributed across MPI
>>   PetscInt global_width = 6;
>>   PetscInt global_height = 3;
>>   Mat A;
>>   err = MatCreate(MPI_COMM_WORLD, &A);
>>   CHKERRQ(err);
>>   err = MatSetType(A, MATMPIAIJ);
>>   CHKERRQ(err);
>>   // setup pre-allocation for matrix space
>>   {
>>     err = MatSetSizes(A, global_height, PETSC_DECIDE, global_height, global_width);
>>     CHKERRQ(err);
>>     PetscInt d_nnz[] = {0, 0, 0};
>>     PetscInt o_nnz[] = {3, 3, 3};
>>     err = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
>>     CHKERRQ(err);
>>   }
>>   err = MatSetUp(A);
>>   CHKERRQ(err);
>>   // set values inside the matrix
>>   for (PetscInt row = 0; row < global_height; ++row)
>>   {
>>     for (PetscInt col = global_height; col < global_width; ++col)
>>     {
>>       err = MatSetValue(A, row, col, 1, INSERT_VALUES);
>>       CHKERRQ(err);
>>     }
>>   }
>>   err = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>>   CHKERRQ(err);
>>   err = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>>   CHKERRQ(err);
>>   err = MatView(A, PETSC_VIEWER_STDOUT_WORLD);
>>   CHKERRQ(err);
>>   // free memory
>>   err = MatDestroy(&A);
>>   CHKERRQ(err);
>>   // cleanup any internal PETSc data at end of program
>>   err = PetscFinalize();
>>   CHKERRQ(err);
>>   return 0;
>> }
>
>
> Am I misunderstanding what the d_nnz and o_nnz parameters are supposed to
> mean?
> --
> Andrew Ho
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener