[petsc-users] How to efficiently fill in, in parallel, a PETSc matrix from a COO sparse matrix?

Diego Magela Lemos diegomagela at usp.br
Tue Jun 20 06:41:56 CDT 2023


Consider, for instance, the following COO sparse matrix format with
repeated indices:

std::vector<size_t> rows{0, 0, 1, 2, 3, 4};
std::vector<size_t> cols{0, 0, 1, 2, 3, 4};
std::vector<double> values{2, -1, 2, 3, 4, 5};

which represents a 5x5 diagonal matrix A: the two entries at (0, 0) are summed, so A = diag(1, 2, 3, 4, 5).
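
To make the duplicate handling explicit, here is a small standalone sketch (plain C++, no PETSc; the file name is just illustrative) of what these triplets are meant to represent:

// coo_dense_check.cc (hypothetical helper, separate from the PETSc program below)
#include <cstddef>
#include <cstdio>
#include <vector>

int main()
{
    std::vector<std::size_t> rows{0, 0, 1, 2, 3, 4};
    std::vector<std::size_t> cols{0, 0, 1, 2, 3, 4};
    std::vector<double> values{2, -1, 2, 3, 4, 5};

    double A[5][5] = {};                     // dense equivalent of the COO triplets
    for (std::size_t k = 0; k < values.size(); ++k)
        A[rows[k]][cols[k]] += values[k];    // repeated indices are summed

    for (int i = 0; i < 5; ++i) {            // prints diag(1, 2, 3, 4, 5)
        for (int j = 0; j < 5; ++j)
            std::printf("%5.1f", A[i][j]);
        std::printf("\n");
    }
    return 0;
}

Summing the repeated (0, 0) contributions gives the diagonal 1, 2, 3, 4, 5 that I expect from the assembled PETSc matrix.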

So far, the code that I have is:

// fill_in_matrix.cc
static char help[] = "Fill in a parallel COO format sparse matrix.";
#include <petsc.h>
#include <vector>

int main(int argc, char **args)
{
    Mat A;
    PetscInt m = 5, i, Istart, Iend;

    PetscCall(PetscInitialize(&argc, &args, NULL, help));

    PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
    PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, m, m));
    PetscCall(MatSetFromOptions(A));
    PetscCall(MatSetUp(A));
    PetscCall(MatGetOwnershipRange(A, &Istart, &Iend));

    std::vector<PetscInt> II{0, 0, 1, 2, 3, 4};
    std::vector<PetscInt> JJ{0, 0, 1, 2, 3, 4};
    std::vector<PetscScalar> XX{2, -1, 2, 3, 4, 5};

    for (i = Istart; i < Iend; i++)
        PetscCall(MatSetValues(A, 1, &II.at(i), 1, &JJ.at(i), &XX.at(i), ADD_VALUES));

    PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatView(A, PETSC_VIEWER_STDERR_WORLD));

    PetscCall(MatDestroy(&A));
    PetscCall(PetscFinalize());
    return 0;
}

When running it with

petscmpiexec -n 4 ./fill_in_matrix


I get

Mat Object: 4 MPI processes
  type: mpiaij
row 0: (0, 1.)
row 1: (1, 2.)
row 2: (2, 3.)
row 3: (3, 4.)
row 4:


The output is missing the entry in the last row.

What am I missing? Better yet, what would be the best way to fill in
this matrix in parallel?
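
For what it is worth, the PETSc manual pages also describe a dedicated COO assembly interface (MatSetPreallocationCOO / MatSetValuesCOO, available in recent PETSc releases). A minimal sketch of how it might be used here, assuming for simplicity that rank 0 supplies all of the triplets and the other ranks supply none, would be something like:

// coo_interface_sketch.cc (sketch only; argument types may differ between PETSc versions)
static char help[] = "Sketch: COO assembly via MatSetPreallocationCOO/MatSetValuesCOO.";

#include <petsc.h>
#include <vector>

int main(int argc, char **args)
{
    Mat         A;
    PetscInt    m = 5;
    PetscMPIInt rank;

    PetscCall(PetscInitialize(&argc, &args, NULL, help));
    PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

    PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
    PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, m, m));
    PetscCall(MatSetFromOptions(A));

    // Let rank 0 contribute every triplet; PETSc communicates off-process entries.
    std::vector<PetscInt>    II, JJ;
    std::vector<PetscScalar> XX;
    if (rank == 0) {
        II = {0, 0, 1, 2, 3, 4};
        JJ = {0, 0, 1, 2, 3, 4};
        XX = {2, -1, 2, 3, 4, 5};
    }

    PetscCall(MatSetPreallocationCOO(A, (PetscCount)II.size(), II.data(), JJ.data()));
    PetscCall(MatSetValuesCOO(A, XX.data(), ADD_VALUES));  // repeated (0,0) entries are summed

    PetscCall(MatView(A, PETSC_VIEWER_STDOUT_WORLD));
    PetscCall(MatDestroy(&A));
    PetscCall(PetscFinalize());
    return 0;
}

Would that be the recommended way to assemble from COO data in parallel, or is there a better approach?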