[petsc-users] converting scipy sparse CSR matrix to petsc matrix with mpi
John Travers
jtravs at gmail.com
Thu Aug 29 07:39:48 CDT 2013
On 28 Aug 2013, at 14:58, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> John Travers <jtravs at gmail.com> writes:
>
>> Hi,
>>
>> I currently generate PETSc matrices from scipy.sparse CSR format matrices as follows (where A is a scipy sparse CSR matrix):
>>
>> pA = PETSc.Mat().createAIJ(size=A.shape, csr=(A.indptr, A.indices, A.data))
>>
>> This works correctly in sequential runs, but if I run under MPI I get an error, which I presume is caused by all of my MPI processes trying to create the full matrix simultaneously rather than splitting it between them. E.g. for 4 processes I get:
>
> Yeah, the size of the passed CSR part doesn't match the local size of
> the matrix. I think that given a range [rstart,rend), you can pass
>
> csr=(A.indptr[rstart:rend] - A.indptr[rstart],
>      A.indices[A.indptr[rstart]:A.indptr[rend]],
>      A.data[A.indptr[rstart]:A.indptr[rend]])
>
Thanks, this works (except I think it should be A.indptr[rstart:rend+1] in the first line).
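For reference, here is a minimal self-contained sketch of the corrected slicing, using only scipy/numpy. The matrix size, density, and the hard-coded row range are illustrative; under MPI, rstart and rend would come from the PETSc matrix's ownership range rather than being fixed by hand.

```python
import numpy as np
import scipy.sparse as sp

# Small CSR matrix standing in for the globally assembled A.
A = sp.random(8, 8, density=0.4, format='csr', random_state=0)

# Illustrative local row range; on an MPI rank this would be that
# rank's ownership range [rstart, rend).
rstart, rend = 2, 6

# Local CSR triple with the corrected indptr slice (rend+1, not rend),
# shifted so the local row pointer starts at zero.
local_indptr = A.indptr[rstart:rend+1] - A.indptr[rstart]
local_indices = A.indices[A.indptr[rstart]:A.indptr[rend]]
local_data = A.data[A.indptr[rstart]:A.indptr[rend]]

# Sanity check: the triple reproduces rows rstart..rend-1 of A.
local = sp.csr_matrix((local_data, local_indices, local_indptr),
                      shape=(rend - rstart, A.shape[1]))
assert (local != A[rstart:rend]).nnz == 0

# With petsc4py, this triple would then be passed as the csr argument:
#   PETSc.Mat().createAIJ(size=A.shape,
#                         csr=(local_indptr, local_indices, local_data))
```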
> More simply, you can just create the matrix and loop over rows calling
> MatSetValues, but you have to do half of the spec above to set the
> number of nonzeros per row if you want it to be fast.
>
> Do you have to start with redundantly-computed scipy matrices? (That'll
> be a scalability bottleneck. It's usually important to distribute
> computation of the matrix entries unless you're only trying to use a few
> cores.)
Well, not in the long run, but at the moment that code is debugged and working, and I only wanted to try the SLEPc solvers first.