[petsc-dev] API changes in MatIS
Jed Brown
jedbrown at mcs.anl.gov
Wed May 9 13:50:24 CDT 2012
On Wed, May 9, 2012 at 1:29 PM, Stefano Zampini
<stefano.zampini at gmail.com> wrote:
> Finally I have some time to work on BDDC. I'm thinking of restyling some of
> my BDDC code. In particular, when setting up the coarse environment, I want
> to implement a function (mostly reusing some BDDC code I already wrote)
>
> MatISSubassemble(A, IS ISSubdomains, *B, *intercomm, *partial_scatter)
>
> which, given subdomain indices, creates a new MATIS matrix on a subcomm of
> the communicator of A by subassembling. In particular, ISSubdomains should
> come from a call to MatPartitioningApply on the adjacency matrix of the
> subdomains.
>
> The communicator of B is the subcomm of A containing the processes which
> will receive some of the local matrices.
> intercomm will be the communicator which can be used between each
> receiving process and its sending "friends".
> *partial_scatter will be the scatter context associated with the intercomm.
>
> Jed, have you already written something similar for your PA? What's the best
> logic for you? Should I use MatCoarsen?
>
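Spelled out as a PETSc-style prototype, the proposed routine might look like the sketch below. The argument types (Mat, IS, MPI_Comm, VecScatter) are inferred from the description above; this is not a settled or existing PETSc signature.

```c
/* Hypothetical prototype for the routine proposed above. The types are
 * guessed from the description and are not part of any existing PETSc API. */
PetscErrorCode MatISSubassemble(Mat A,             /* MATIS matrix on the original communicator */
                                IS  ISSubdomains,  /* subdomain indices from MatPartitioningApply() */
                                Mat *B,            /* new MATIS matrix on a subcomm of A */
                                MPI_Comm   *intercomm,       /* communicator linking receivers and their senders */
                                VecScatter *partial_scatter  /* scatter context associated with the intercomm */);
```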
MatCoarsen is for graph aggregation/matching, which is different; it seems to
me that subassembly doesn't coarsen at all. If MatGetSubMatrices() could
extract parallel matrices directly, that looks closer to what you're after. I
don't think the implementation should use an intercomm (too many MPI
implementations are buggy, and intercomms require MPI-2, which PETSc tries to
avoid depending on because enough users still have antique systems that don't
support it).
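For reference, a minimal sketch of how MatGetSubMatrices() is called today, with error checking abbreviated; it returns sequential (per-process) submatrices, which is why a variant that extracts parallel matrices on a subcommunicator would be the missing piece:

```c
/* Extract one sequential submatrix per process. The index sets irow/icol
 * describing the wanted rows and columns are assumed to have been created
 * earlier, e.g. with ISCreateGeneral(). */
IS  irow, icol;
Mat *submats;
ierr = MatGetSubMatrices(A, 1, &irow, &icol, MAT_INITIAL_MATRIX, &submats);CHKERRQ(ierr);
/* submats[0] is sequential; a hypothetical parallel variant would instead
 * return a matrix living on a subcommunicator of A's communicator. */
```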
In any case, I think that creating a partition should be separate from
transforming the matrix.
For MatIS, I would be fine with creating a new MatIS using the same local
matrices on a subcommunicator, then MatConvert'ing it to AIJ (that will do
assembly), and discarding the temporary subcomm MatIS.
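Concretely, that path might look like the following sketch. The creation of the subcomm MatIS is schematic (there is no single call for it here); the MatConvert() to AIJ is the step that performs the assembly:

```c
/* Sketch of the suggested workflow: build a MATIS on a subcommunicator that
 * reuses the existing local matrices, convert it to AIJ (which assembles),
 * then discard the temporary MATIS. How 'Asub' is created is left schematic. */
Mat Asub;  /* MATIS on the subcommunicator, sharing the original local matrices */
Mat Aaij;  /* assembled parallel AIJ matrix */
ierr = MatConvert(Asub, MATAIJ, MAT_INITIAL_MATRIX, &Aaij);CHKERRQ(ierr);
ierr = MatDestroy(&Asub);CHKERRQ(ierr);  /* temporary subcomm MatIS no longer needed */
```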