[petsc-dev] API changes in MatIS

Stefano Zampini stefano.zampini at gmail.com
Tue May 8 08:32:49 CDT 2012


> Okay, I just hate adding more places where such unnecessary constraints
> are codified.
>
>

Me too. I will take a look at where to change the code so the requirement
can be safely removed.

>> Since we are speaking about MATIS: in my codes I have a function (now it
>> is in Fortran, but I can translate it)
>>
>> MatISOptimize(Mat A,IS *LocalIS ,IS *GlobalIS)
>>
>> which changes the MATIS object, replacing its local-to-global mapping to
>> optimize both the local scatters and the global communication costs. It
>> also changes the local matrix associated with the MATIS object. The
>> function also returns the permutations used (if not PETSC_NULL). Can I
>> add it to matis.c? Since it changes the underlying object, what are the
>> requirements of PETSc? Should the user be able to insert values with the
>> original ordering? Any suggestions?
>
>
> MatSetValues() is fine since only the global ordering is used.
> MatSetValuesLocal() would ideally continue to work with the old local space
> because the user has to redistribute their mesh to use the new ordering
> (maybe there should be a translation filter).
>
>
What do you mean by "translation filter"?
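Just to check that I understand: do you mean something like the sketch
below, i.e. keeping the old-to-new local permutation produced by the
optimization and remapping the indices before calling MatSetValuesLocal()?
(Only a guess on my side; the function name and the oldtonew array are
hypothetical.)

#include <petscmat.h>

PetscErrorCode MatISSetValuesLocalTranslated(Mat A, const PetscInt oldtonew[],
                                             PetscInt m, const PetscInt idxm[],
                                             PetscInt n, const PetscInt idxn[],
                                             const PetscScalar v[], InsertMode addv)
{
  PetscInt       i, rows[128], cols[128];
  PetscErrorCode ierr;

  PetscFunctionBegin;
  if (m > 128 || n > 128) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_SUP, "Block too large for this sketch");
  for (i = 0; i < m; i++) rows[i] = oldtonew[idxm[i]];   /* old local row -> new local row */
  for (i = 0; i < n; i++) cols[i] = oldtonew[idxn[i]];   /* old local col -> new local col */
  ierr = MatSetValuesLocal(A, m, rows, n, cols, v, addv);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}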


> But how does this function actually work? Can the subdomain boundaries
> move, or are subdomains just moved to a different process?
>
>
Answering the first question:
The function works by first reordering the local nodes into INTERNAL and
INTERFACE nodes (if the second argument LocalIS is not PETSC_NULL), in
order to be able to use memcpy in the local scatters of PCIS (e.g.
pcis->N_to_B). Then, it reorders the global numbering (if GlobalIS is not
PETSC_NULL) to exploit data locality in the global vector and reduce the
amount of communication when scattering from global to local and vice
versa; in practice, a new numbering of the global nodes is constructed such
that as many subdomain INTERNAL nodes as possible belong to the local part
of the global vector. The remaining INTERNAL nodes (note that if the
parallel problem is well balanced, there will be none) and the INTERFACE
nodes are then placed in the remaining global positions, keeping them
sorted as they were in the previous global numbering. I think the latter
part can be further improved. When I add it to matis.c, maybe you can take
a look at the code and give me your suggestions.
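To make the first (local) step concrete, here is a minimal sketch of how
the permutation is built, assuming the interface nodes have already been
flagged (the is_interface array is just an assumption for the example);
the real code also rebuilds the local-to-global mapping accordingly, which
I leave out here.

#include <petscmat.h>

PetscErrorCode BuildInternalInterfacePermutation(Mat Alocal, const PetscBool is_interface[],
                                                 IS *perm, Mat *Apermuted)
{
  PetscInt       i, n, cnt = 0, *idx;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatGetSize(Alocal, &n, PETSC_NULL);CHKERRQ(ierr);
  ierr = PetscMalloc(n*sizeof(PetscInt), &idx);CHKERRQ(ierr);
  for (i = 0; i < n; i++) if (!is_interface[i]) idx[cnt++] = i;  /* INTERNAL nodes first */
  for (i = 0; i < n; i++) if (is_interface[i])  idx[cnt++] = i;  /* INTERFACE nodes last  */
  ierr = ISCreateGeneral(PETSC_COMM_SELF, n, idx, PETSC_OWN_POINTER, perm);CHKERRQ(ierr);
  ierr = MatPermute(Alocal, *perm, *perm, Apermuted);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

With the INTERFACE nodes contiguous at the end of the local numbering, the
interface part of a local vector is a single trailing block, which is what
allows the N_to_B-type scatters to reduce to a memcpy.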

Regarding the other question:
With MATIS (and in principle any subassembled matrix) you are forced to use
the local matrices assembled by the user, or an assembled group of them (as
you surely know, since you are developing a PA). You cannot safely (in a
theoretically sound way) move subdomain boundaries, since you don't know
(at least in principle) how to reproduce the matrix values on the new local
interface: they are Neumann boundaries for the local PDE. The main question
is: how do I split the assembled values at the nodes which were previously
internal to the subdomain? (I think this is also the main question when
thinking about a MatConvert implementation which converts a matrix obtained
from DMDAGetMatrix to a MATIS.)
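To make the point concrete, this is roughly how a MATIS is filled (a sketch
only: preallocation and boundary conditions are omitted, and all names are
made up). Each process only ever sums element matrices into its own local
Neumann matrix, so the global operator is only known as the sum of the
subdomain contributions, and an already-assembled interface value cannot be
split back into per-subdomain pieces.

#include <petscmat.h>

PetscErrorCode AssembleMATIS(MPI_Comm comm, PetscInt N, ISLocalToGlobalMapping ltog,
                             PetscInt nelem, PetscInt nen, const PetscInt *elemdofs[],
                             const PetscScalar *Ke[], Mat *A)
{
  PetscInt       e;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatCreate(comm, A);CHKERRQ(ierr);
  ierr = MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
  ierr = MatSetType(*A, MATIS);CHKERRQ(ierr);
  ierr = MatSetLocalToGlobalMapping(*A, ltog, ltog);CHKERRQ(ierr);
  for (e = 0; e < nelem; e++) {
    /* element matrices are summed into the LOCAL (Neumann) matrix only */
    ierr = MatSetValuesLocal(*A, nen, elemdofs[e], nen, elemdofs[e], Ke[e], ADD_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}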


>
> For a different, but related algorithm, I'm considering writing a MatPA
> (PA = partially assembled) which would choose some points to assemble
> internally. The reason for storing in partially assembled form is that I
> will want to solve a number of pinned Neumann problems to determine the
> coarse space and I'd rather not have to take unnecessary submatrices. Do
> you think this is a bad idea and I should just use MatIS? (I also want
> support for multiple subdomains per process and multiple processes per
> subdomain, though that could be added to MatIS.)
>

Cool. Are you implementing FETI-DP? I'm planning to implement it in the near
future for isogeometric analysis. Regarding your question, it depends on how
you need to solve these "pinned Neumann problems": do you need to solve
unrelated local problems defined by the local matrices, or a parallel
Neumann problem? In the first case, MatIS is your choice. In the other case,
I think you need to resort to a multigrid on a new MATIS object defined by
the subdomains of interest.
However, I would be glad to help you with the PA if you want.
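For the first case, a minimal sketch of what I have in mind (the pinning of
the local null space, e.g. fixing one dof, is left out and would be needed
for a true "pinned Neumann" solve; names are made up):

#include <petscksp.h>

PetscErrorCode SolveLocalNeumann(Mat A /* a MATIS */, Vec blocal, Vec xlocal)
{
  Mat            Aloc;
  KSP            ksp;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatISGetLocalMat(A, &Aloc);CHKERRQ(ierr);        /* reference to the local Neumann matrix, no copy */
  ierr = KSPCreate(PETSC_COMM_SELF, &ksp);CHKERRQ(ierr);  /* each process solves independently */
  ierr = KSPSetOperators(ksp, Aloc, Aloc, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, blocal, xlocal);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}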

-- 
Stefano

