[petsc-users] Convert mat SEQAIJ to MPIAIJ
Thomas Witkowski
thomas.witkowski at tu-dresden.de
Tue Sep 4 11:23:55 CDT 2012
Mat seqMat;
MatCreateSeqAIJ(PETSC_COMM_SELF, 10, 10, 0, PETSC_NULL, &seqMat);
Mat nestMat;
MatCreateNest(PETSC_COMM_WORLD, 1, PETSC_NULL, 1, PETSC_NULL,
&seqMat, &nestMat);
This results in the following error message:
[0]PETSC ERROR: PetscSplitOwnership() line 93 in
/home/thomas/software/petsc-3.3-p0/src/sys/utils/psplit.c Sum of local
lengths 20 does not equal global length 10, my local length 10
likely a call to VecSetSizes() or MatSetSizes() is wrong.
See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
[1]PETSC ERROR: PetscSplitOwnership() line 93 in
/home/thomas/software/petsc-3.3-p0/src/sys/utils/psplit.c Sum of local
lengths 20 does not equal global length 10, my local length 10
likely a call to VecSetSizes() or MatSetSizes() is wrong.
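For debugging purposes, an alternative that avoids MATNEST is to assemble the block-diagonal MPIAIJ matrix by copying the local entries explicitly. Below is a minimal sketch, assuming each rank holds a seqMat with n = 10 rows as in the snippet above; the names mpiMat and d_nnz and the preallocation loop are illustrative, not code from this thread:

Mat               mpiMat;
PetscInt          n = 10, rstart, i, j, ncols, *d_nnz;
const PetscInt    *cols;
const PetscScalar *vals;

/* count nonzeros per local row of seqMat for preallocation */
PetscMalloc(n*sizeof(PetscInt), &d_nnz);
for (i = 0; i < n; i++) {
  MatGetRow(seqMat, i, &ncols, PETSC_NULL, PETSC_NULL);
  d_nnz[i] = ncols;
  MatRestoreRow(seqMat, i, &ncols, PETSC_NULL, PETSC_NULL);
}

/* each rank owns n rows and n columns; only the diagonal block is filled */
MatCreateAIJ(PETSC_COMM_WORLD, n, n, PETSC_DETERMINE, PETSC_DETERMINE,
             0, d_nnz, 0, PETSC_NULL, &mpiMat);
PetscFree(d_nnz);
MatGetOwnershipRange(mpiMat, &rstart, PETSC_NULL);

/* copy each local row, shifting the column indices into the diagonal block */
for (i = 0; i < n; i++) {
  PetscInt row = rstart + i;
  MatGetRow(seqMat, i, &ncols, &cols, &vals);
  for (j = 0; j < ncols; j++) {
    PetscInt col = rstart + cols[j];
    MatSetValues(mpiMat, 1, &row, 1, &col, &vals[j], INSERT_VALUES);
  }
  MatRestoreRow(seqMat, i, &ncols, &cols, &vals);
}
MatAssemblyBegin(mpiMat, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(mpiMat, MAT_FINAL_ASSEMBLY);

With the preallocation in place the copy itself needs no inter-processor communication, since every rank inserts only into rows and columns it owns.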
Thomas
On 04.09.2012 17:36, Matthew Knepley wrote:
> On Tue, Sep 4, 2012 at 10:28 AM, Thomas Witkowski
> <thomas.witkowski at tu-dresden.de> wrote:
>
> On 04.09.2012 17:20, Matthew Knepley wrote:
>> On Tue, Sep 4, 2012 at 9:52 AM, Thomas Witkowski
>> <thomas.witkowski at tu-dresden.de> wrote:
>>
>>>
>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATNEST.html
>>>
>> As I wrote in my initial question, each rank contains one and
>> only one seqaij matrix, and these should all be joined into one
>> global matrix such that each local matrix is the
>> corresponding diagonal block of the mpiaij matrix. I think
>> this does not work with nested matrices?
>>
>>
>> Why does this not work? I really think you are making this harder
>> than it has to be.
>>
> Mh, maybe I have an incomplete view of the ways nested
> matrices can be used.
>
> To be more specific: in the case of two MPI tasks, each
> containing one seqaij matrix, how should MatCreateNest be
> called? Is this correct:
>
> Mat A;
> MatCreateNest(PETSC_COMM_WORLD, 2, PETSC_NULL, 2, PETSC_NULL, V, &A);
>
> ^^^ This should be 1.
>
> Matt
>
> and V is defined on rank 0 as
>
> Mat V[2] = {seqMat, PETSC_NULL} ;
>
> and on rank 1 as
>
> Mat V[2] = {PETSC_NULL, seqMat};
>
>
> Thomas
>> Matt
>>
>> Thomas
>>
>>>
>>>> Hong
>>>>
>>>>>
>>>>> Thomas :
>>>>>
>>>>> In my FETI-DP code, each rank creates a SEQAIJ
>>>>> matrix that represents the discretization of
>>>>> the interior domain. Just for debugging, I
>>>>> would like to join these sequential matrices
>>>>> into one global MPIAIJ matrix. This matrix has
>>>>> no off-diagonal nonzeros and should be stored
>>>>> corresponding to the ranks' unknowns, thus
>>>>> first all rows of the first rank and so on.
>>>>> What's the most efficient way to do this? Is
>>>>> it possible to create this parallel matrix
>>>>> just as a view of the sequential ones, so
>>>>> without copying the data? Thanks for any advice.
>>>>>
>>>>>
>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIAIJConcatenateSeqAIJ.html
>>>>>
>>>>> Note: entries in seqaij matrices are copied into an
>>>>> mpiaij matrix without inter-processor communication.
>>>>> Use petsc-3.3 for this function.
>>>>
>>>> The function does not do what I expect. For
>>>> example, if we have two MPI tasks and each contains
>>>> one local square matrix with n rows, I want to
>>>> create a global square matrix with 2n rows. This
>>>> function creates a non-square matrix of size 2n x n.
>>>>
>>>> Thomas
>>>>
>>>>
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener