[petsc-dev] MatDestroy call in MatMerge

Jed Brown jedbrown at mcs.anl.gov
Sun Feb 26 21:21:56 CST 2012


On Sun, Feb 26, 2012 at 21:10, Hong Zhang <hzhang at mcs.anl.gov> wrote:

> Vaclav:
>
>> I feel that it is quite incorrect that MatMerge calls MatDestroy(&inmat),
>> since this corrupts "garbage collection": inmat is destroyed, but the
>> handle is only nullified inside MatMerge. It remains unchanged outside, so
>> a subsequent MatDestroy call fails. I think there is no reason for this
>> destroy; if I am right, MatMerge does not corrupt inmat, so it can be
>> reused and then destroyed manually.
>>
> MatMerge():
>  Creates a single large PETSc matrix by concatenating local sequential
>                  matrices from each processor.
> It was written for PETSc internal use (a private function), and only works
> for the mpiaij format. Thanks for the suggestion; I'll remove
> MatDestroy(&inmat) from MatMerge().
>

I think it should also be renamed to something like
MatCreateMPIAIJConcatenateSeqAIJ(). It doesn't make semantic sense for
general matrices because those need not be stored using a row partition.
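For reference, the failure mode Vaclav describes looks roughly like this (a
minimal sketch, assuming the MatMerge(comm,inmat,n,scall,&outmat) calling
sequence from the current sources, with error checking abbreviated):

  Mat            inmat,outmat;
  PetscInt       n;                    /* local number of columns */
  PetscErrorCode ierr;
  /* ... set n, create and assemble the local sequential matrix inmat ... */
  ierr = MatMerge(PETSC_COMM_WORLD,inmat,n,MAT_INITIAL_MATRIX,&outmat);CHKERRQ(ierr);
  /* MatMerge() destroys inmat through its own copy of the handle; the
     caller's inmat is passed by value, so it is not set to NULL, and this
     "normal" cleanup then acts on an already freed object: */
  ierr = MatDestroy(&inmat);CHKERRQ(ierr);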


>
>> The second question is: what is the principal difference between
>> MatMerge and MatMerge_SeqsToMPI? I am a bit confused.
>>
> MatMerge_SeqsToMPI - Creates an MPIAIJ matrix by adding sequential
>                  matrices from each processor.
>
>>
>> And finally, I have one note about the MatMerge_SeqsToMPI manual page: it
>> shows the synopsis of MatMerge_SeqsToMPINumeric instead, because that
>> comment is located above the MatMerge_SeqsToMPINumeric function in the
>> source.
>>
> I fixed this in petsc-3.2, and will merge it to petsc-dev.
>
> We may need to rename these two functions for better understanding.
>

MatCreateMPIAIJSumSeqAIJ()?
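
(Roughly, in terms of the local sequential blocks A_0,...,A_{p-1}: the
"Concatenate" variant stacks them by rows into one distributed MPIAIJ matrix,
while the "Sum"/SeqsToMPI variant adds the processes' sequential contributions
entrywise, A = A_0 + A_1 + ... + A_{p-1}; that is my reading of the two
descriptions quoted above.)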