[petsc-users] Convert mat SEQAIJ to MPIAIJ
Thomas Witkowski
thomas.witkowski at tu-dresden.de
Tue Sep 4 10:28:57 CDT 2012
On 04.09.2012 17:20, Matthew Knepley wrote:
> On Tue, Sep 4, 2012 at 9:52 AM, Thomas Witkowski
> <thomas.witkowski at tu-dresden.de
> <mailto:thomas.witkowski at tu-dresden.de>> wrote:
>
>>
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATNEST.html
>>
> As I wrote in my initial question, each rank contains one and only
> one seqaij matrix, and all of these should be joined into one global
> matrix such that each local matrix becomes the corresponding diagonal
> block of the mpiaij matrix. I think this does not work with nested
> matrices?
>
>
> Why does this not work? I really think you are making this harder than
> it has to be.
>
Hmm, maybe I have an incomplete view of how nested matrices can be used.
To be more specific: in the case of two MPI tasks, each containing one
seqaij matrix, how should MatCreateNest be called? Is this correct:
Mat A;
MatCreateNest(PETSC_COMM_WORLD, 2, PETSC_NULL, 2, PETSC_NULL, V, &A);
and V is defined on rank 0 as
Mat V[2] = {seqMat, PETSC_NULL};
and on rank 1 as
Mat V[2] = {PETSC_NULL, seqMat};
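In case it is easier to answer with the concrete goal in front of you: below
is a rough, untested sketch of the explicit-copy fallback I would otherwise
use for debugging. The names localMat and nLocal are placeholders for the
rank-local seqaij matrix and its number of rows; I assume every rank's block
is square and that the global matrix is just the ranks' blocks stacked as
diagonal blocks in rank order (error checking omitted for brevity).

/* Rough sketch (untested): copy each rank's seqaij block into its
   diagonal position of a global mpiaij matrix.  "localMat" and
   "nLocal" are placeholders for the rank-local matrix and its size. */
Mat               globalMat;
PetscInt          rStart, row, j, ncols, *dnnz;
const PetscInt    *cols;
const PetscScalar *vals;

/* Preallocate the diagonal block from the local row lengths; there are
   no off-diagonal entries. */
PetscMalloc(nLocal * sizeof(PetscInt), &dnnz);
for (row = 0; row < nLocal; row++) {
  MatGetRow(localMat, row, &ncols, PETSC_NULL, PETSC_NULL);
  dnnz[row] = ncols;
  MatRestoreRow(localMat, row, &ncols, PETSC_NULL, PETSC_NULL);
}
MatCreateAIJ(PETSC_COMM_WORLD, nLocal, nLocal, PETSC_DETERMINE, PETSC_DETERMINE,
             0, dnnz, 0, PETSC_NULL, &globalMat);
PetscFree(dnnz);

/* The rank's ownership range gives the offset of its diagonal block. */
MatGetOwnershipRange(globalMat, &rStart, PETSC_NULL);
for (row = 0; row < nLocal; row++) {
  PetscInt grow = rStart + row;
  MatGetRow(localMat, row, &ncols, &cols, &vals);
  for (j = 0; j < ncols; j++) {
    PetscInt gcol = rStart + cols[j];  /* shift local column into the block */
    MatSetValues(globalMat, 1, &grow, 1, &gcol, &vals[j], INSERT_VALUES);
  }
  MatRestoreRow(localMat, row, &ncols, &cols, &vals);
}
MatAssemblyBegin(globalMat, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(globalMat, MAT_FINAL_ASSEMBLY);

If MatCreateNest can provide a view of the same block-diagonal structure
without this copy, that would of course be preferable.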
Thomas
> Matt
>
> Thomas
>
>>
>>> Hong
>>>
>>>>
>>>> Thomas :
>>>>
>>>> In my FETI-DP code, each rank creates a SEQAIJ
>>>> matrix that represents the discretization of the
>>>> interior domain. Just for debugging, I would like
>>>> to join these sequential matrices into one global
>>>> MPIAIJ matrix. This matrix has no off-diagonal nnzs
>>>> and should be stored according to the ranks'
>>>> unknowns, thus first all rows of the first rank
>>>> and so on. What's the most efficient way to do
>>>> this? Is it possible to create this parallel matrix
>>>> just as a view of the sequential ones, so without
>>>> copying the data? Thanks for any advice.
>>>>
>>>>
>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIAIJConcatenateSeqAIJ.html
>>>>
>>>> Note: entries in seqaij matrices are copied into an
>>>> mpiaij matrix without inter-processor communication.
>>>> Use petsc-3.3 for this function.
>>>
>>> The function does not do what I expect. For example, if
>>> we have two mpi tasks and each contains one local square
>>> matrix with n rows, I want to create a global square
>>> matrix with 2n rows. This function creates a non-square
>>> matrix of size 2n x n.
>>>
>>> Thomas
>>>
>>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener