[petsc-dev] API changes in MatIS

Jed Brown jedbrown at mcs.anl.gov
Tue May 8 07:47:52 CDT 2012


On Tue, May 8, 2012 at 3:30 AM, Stefano Zampini
<stefano.zampini at gmail.com>wrote:

> Sorry, I didn't know that. A stupid question, but can you say how many
> users are using MATIS objects?
>

I don't know; probably not a great number, but enough that it has to be
documented.


>
> 2. What was wrong with the default MatGetVecs()?
>>
>>
> There's a conceptual difference: the default MatGetVecs() uses mat->rmap->n
> and mat->cmap->n as the local sizes of the vectors, since it assumes a
> matrix distributed in the standard way (where the mat->rmap->n values sum
> to the number of global rows). A MATIS object is not distributed that way:
> MatMult_IS is not performed directly on the local part of the global
> vector, so there is no real need (apart from PETSc checking the sizes) for
> the notion of mat->rmap->n (or mat->cmap->n) as in the standard
> implementation. What you really need is the global size of the vector;
> indeed, the sizes of the local vectors used to perform MatMult_IS don't
> sum to the global size of the vector.
>

PetscLayout and the matrix size statements don't work as you describe here;
if they do somewhere, it's broken. The matrix sizes are really statements
about the distribution of the _vector_ that the matrix multiplies against
and produces. With *AIJ, that happens to also be the row partition, and the
off-diagonal part is managed in a compatible way, but that is not required.
For dense matrices, it might make sense to store the matrix with an
entirely different distribution (e.g. cyclic). An implementation issue like
the sizes of the local matrices in MATIS should _not_ change the global
matrix size (rmap and cmap).


>
> 3. Options prefixes should inherit the prefix from the parent and should
>> end with "_" so that they aren't squashed together with the suboptions.
>>
>
> Since MatSetOptionsPrefix(is->A,"is") was already there, I thought it was
> right to use the same prefix for the local vectors. Change it as you prefer.
>

Ah, okay. It should be done like this in the future.

http://petsc.cs.iit.edu/petsc/petsc-dev/rev/b70fc058783a


>
>
>> 5. Why is the matrix being forced to be square?
>>
>>
> MATIS was already forced to be square before I began using it.
>

Okay, I just hate adding more places where such unnecessary constraints are
codified.


> Pushed a fix to the problems you highlighted
>
> http://petsc.cs.iit.edu/petsc/petsc-dev/rev/93e67397a066
>
> Calling MatSetBlockSize before MatSetSizes in MatCreateIS solved all my
> problems.
>

Great, thanks.


>
> Since we are speaking about MATIS: in my codes, I have a function (now in
> Fortran, but I can translate it)
>
> MatISOptimize(Mat A,IS *LocalIS,IS *GlobalIS)
>
> which changes the MATIS object, modifying its local-to-global mapping to
> optimize both the local scatters and the global communication costs. It
> also changes the local matrix associated with the MATIS object. The
> function also returns the permutations used (if not PETSC_NULL). Can I add
> it to matis.c? Since it changes the underlying object, what are the
> requirements of PETSc? Should the user be able to insert values in the
> original ordering? Any suggestions?


MatSetValues() is fine since only the global ordering is used.
MatSetValuesLocal() would ideally continue to work with the old local space,
because otherwise the user would have to redistribute their mesh to use the
new ordering (maybe there should be a translation filter).

But how does this function actually work? Can the subdomain boundaries
move, or are subdomains just moved to a different process?


For a different, but related algorithm, I'm considering writing a MatPA (PA
= partially assembled) which would choose some points to assemble
internally. The reason for storing in partially assembled form is that I
will want to solve a number of pinned Neumann problems to determine the
coarse space and I'd rather not have to take unnecessary submatrices. Do
you think this is a bad idea and I should just use MatIS? (I also want
support for multiple subdomains per process and multiple processes per
subdomain, though that could be added to MatIS.)