[petsc-users] Problem when getting matrix values owned by other processor

Hong hzhang at mcs.anl.gov
Thu Apr 14 13:01:48 CDT 2016


Yaoyu:
Can you send me the code that calls MatGetSubMatrices()?
I want to check how you created the MPI ISs.
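
For reference, the usual pattern is to build both index sets on
PETSC_COMM_SELF, roughly as in the sketch below (the helper name, sizes,
and indices are placeholders, not your actual ones):

  #include <petscmat.h>

  /* Sketch: pull one small sequential submatrix of the parallel matrix A
     out on each process. */
  static PetscErrorCode GetLocalBlock(Mat A, Mat **submat)
  {
    PetscInt       rows[6] = {0, 1, 2, 3, 4, 5};  /* global row indices    */
    PetscInt       cols[3] = {0, 1, 2};           /* global column indices */
    IS             isrow, iscol;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = ISCreateGeneral(PETSC_COMM_SELF, 6, rows, PETSC_COPY_VALUES, &isrow);CHKERRQ(ierr);
    ierr = ISCreateGeneral(PETSC_COMM_SELF, 3, cols, PETSC_COPY_VALUES, &iscol);CHKERRQ(ierr);
    ierr = ISSort(isrow);CHKERRQ(ierr);
    /* Returns an array of sequential matrices; here just one per process. */
    ierr = MatGetSubMatrices(A, 1, &isrow, &iscol, MAT_INITIAL_MATRIX, submat);CHKERRQ(ierr);
    ierr = ISDestroy(&isrow);CHKERRQ(ierr);
    ierr = ISDestroy(&iscol);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }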

Checking MatGetSubMatrices() for the mpiaij matrix format, it seems that
duplicate IS indices would work. I guess duplicate rows might be OK, but
duplicate columns might run into trouble with our implementation.

By definition, a submatrix of A should not have duplicate rows/columns of A,
should it?

Hong

> Hi Hong,
>
>
> I think what you said about the ISs for MatGetSubMatrices() is right.
>
> Today I ran my program to get a submatrix with MatGetSubMatrices() twice:
> once with ISs created on PETSC_COMM_WORLD and once with ISs created on
> PETSC_COMM_SELF. In each run I extracted a 6x3 submatrix on each processor
> from a 570x3 parallel dense matrix, using 2 processors in total. The
> results are as follows:
>
> ======== submatrix with ISs created with PETSC_COMM_WORLD =========
> Processor 0
> Mat Object: 1 MPI processes
>   type: seqdense
> 2.7074996615625420e+10 6.8399452804377966e+11 -1.2730861976538324e+08
> 2.7074996615625420e+10 6.8399452804377966e+11 -1.2730861976538324e+08
> 2.7359996580000423e+10 8.2079343365253589e+11 6.3654303672968522e+06
> 2.9639996295000458e+10 1.9151846785225828e+12 -1.0184686034752984e+08
> 2.9924996259375465e+10 1.1947308948674746e+12 -3.8181422587255287e+08
> 5.4149993231250839e+10 2.6675786593707410e+13 -4.9650360873402004e+09
>
> Processor 1
> Mat Object: 1 MPI processes
>   type: seqdense
> 5.4149993231250839e+10 1.9183199555758315e+12 5.9880080526417112e+08
> 7.8374990203126236e+10 5.7441876402563477e+12 1.7367770705944958e+09
> 8.0939989882501266e+10 5.7347670250898613e+12 1.7900992494213123e+09
> 8.1509989811251282e+10 5.7751527083651416e+12 1.8027055821637158e+09
> 8.4074989490626328e+10 6.1543675577970762e+12 1.8557943339637902e+09
> 1.0829998646250171e+11 9.5915997778791719e+12 2.9940040263208551e+09
>
> ==== end of submatrix with ISs created with PETSC_COMM_WORLD ======
>
> ======== submatrix with ISs created with PETSC_COMM_SELF =========
> Processor 0
> Mat Object: 1 MPI processes
>   type: seqdense
> 2.7074996615625420e+10 6.8399452804377966e+11 -1.2730861976538324e+08
> 2.7074996615625420e+10 6.8399452804377966e+11 -1.2730861976538324e+08
> 2.7359996580000423e+10 8.2079343365253589e+11 6.3654303672968522e+06
> 2.9639996295000458e+10 1.9151846785225828e+12 -1.0184686034752984e+08
> 2.9924996259375465e+10 1.1947308948674746e+12 -3.8181422587255287e+08
> 5.4149993231250839e+10 2.6675786593707410e+13 -4.9650360873402004e+09
>
> Processor 1
> Mat Object: 1 MPI processes
>   type: seqdense
> 5.4149993231250839e+10 1.9183199555758315e+12 5.9880080526417112e+08
> 7.8374990203126236e+10 5.7441876402563477e+12 1.7367770705944958e+09
> 8.0939989882501266e+10 5.7347670250898613e+12 1.7900992494213123e+09
> 8.1509989811251282e+10 5.7751527083651416e+12 1.8027055821637158e+09
> 8.4074989490626328e+10 6.1543675577970762e+12 1.8557943339637902e+09
> 1.0829998646250171e+11 9.5915997778791719e+12 2.9940040263208551e+09
>
> ==== end of submatrix with ISs created with PETSC_COMM_SELF ======
>
> The results are identical. I agree with you that only sequential ISs
> should be used when creating a submatrix with MatGetSubMatrices(); the
> above results may just be a coincidence.
>
> I have taken your suggestion and am now using only sequential ISs with
> MatGetSubMatrices().
>
> Yet another question: can I use an IS that has duplicate entries in it?
> The documentation of MatGetSubMatrices() says that "The index sets may
> not have duplicate entries", so I assume duplicate entries are not
> allowed in the IS. But I just tried an IS with duplicate entries, and
> the resulting submatrix on each processor seemed to be correct. In fact,
> in the sample submatrices shown above, the row IS for the submatrix
> owned by Processor 0 was specified as [x, x, a, b, c, d], where 'x'
> marks the duplicated entry; I then got two identical rows in the
> submatrix owned by Processor 0.
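>
> For illustration only (the numbers below are placeholders, not the real
> indices), that row IS was built roughly like this:
>
>   PetscInt rows[6] = {7, 7, 12, 30, 41, 56};  /* first entry duplicated */
>   ierr = ISCreateGeneral(PETSC_COMM_SELF, 6, rows, PETSC_COPY_VALUES, &isrow);CHKERRQ(ierr);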
>
> But I think I was doing this incorrectly.
>
> Thanks.
>
> HU Yaoyu
>
> >
> > Yaoyu:
> >
> >  "MatGetSubMatrices() can extract ONLY sequential submatrices
> >    (from both sequential and parallel matrices). Use MatGetSubMatrix()
> >    to extract a parallel submatrix."
> >
> > Using parallel ISs for MatGetSubMatrices() is definitely incorrect,
> > unless you only use one process.
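> >
> > For a parallel submatrix you would instead call MatGetSubMatrix() with
> > index sets created on the communicator of the matrix; a rough sketch
> > (each process lists the rows it wants to own in the submatrix):
> >
> >   IS  isrow, iscol;  /* created with ISCreateGeneral(PETSC_COMM_WORLD, ...) */
> >   Mat Asub;          /* lives on the same communicator as A */
> >   ierr = MatGetSubMatrix(A, isrow, iscol, MAT_INITIAL_MATRIX, &Asub);CHKERRQ(ierr);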
> >
> >>
> >> And further: I checked the submatrices I obtained from
> >> MatGetSubMatrices() with both parallel ISs and sequential ISs, and the
> >> submatrices are identical. Does this mean that I could actually use
> >> parallel ISs (created with PETSC_COMM_WORLD)?
> >>
> >
> > What did you compare with?  I do not understand what submatrices would
> > be obtained with parallel ISs using more than one process.
> >
> > Hong
> >
> >>
> >> > Yaoyu:
> >> > MatGetSubMatrices() returns sequential matrices.
> >> > The ISs must be sequential, created with PETSC_COMM_SELF.
> >> > See petsc/src/mat/examples/tests/ex42.c
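> >> >
> >> > Once you have the sequential submatrix, all of its rows are local, so
> >> > MatGetRow() works on it directly, e.g. (sketch only):
> >> >
> >> >   const PetscInt    *cols;
> >> >   const PetscScalar *vals;
> >> >   PetscInt           ncols;
> >> >   ierr = MatGetRow(submat[0], 0, &ncols, &cols, &vals);CHKERRQ(ierr);
> >> >   /* ... read the row values ... */
> >> >   ierr = MatRestoreRow(submat[0], 0, &ncols, &cols, &vals);CHKERRQ(ierr);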
> >> >
> >> > Check your submatrices.
> >> >
> >> > Hong
> >> >
> >> > Hi everyone,
> >> >>
> >> >> I am trying to get values of a parallel matrix that are owned by
> >> >> other processors.
> >> >>
> >> >> I tried to create a sub-matrix using MatGetSubMatrices() and then
> >> >> called MatGetRow() on the sub-matrix, but MatGetRow() gave me the
> >> >> following error message:
> >> >>
> >> >> ===== Error message begins =====
> >> >>
> >> >> No support for this operation for this object type
> >> >> only local rows
> >> >>
> >> >> ===== Error message ends =====
> >> >>
> >> >> The parallel matrix is a parallel dense matrix. The ISs for
> >> >> MatGetSubMatrices() are created using ISCreateGeneral() and
> >> >> PETSC_COMM_WORLD. The row IS is sorted by ISSort().
> >> >>
> >> >> What mistake did I make when using the above functions? Is there a
> >> >> better way to access matrix values owned by other processors?
> >> >>
> >> >> Thanks!
> >> >>
> >> >> HU Yaoyu
> >> >>