[petsc-users] custom sparse matrix
Barry Smith
bsmith at mcs.anl.gov
Mon Mar 21 10:19:28 CDT 2011
On Mar 21, 2011, at 10:14 AM, Alejandro Marcos Aragón wrote:
> Hi Barry,
>
> Thanks for your answer. I'll look into those functions, but instead of copying, do you know if there is a way to give the address of the arrays?
MatCreateSeqAIJWithArrays() will not copy the arrays; it just uses them. In parallel, MatCreateMPIAIJWithSplitArrays() likewise uses the arrays without copying, but you need to have them in the correct format (in parallel we use a "non-standard" parallel compressed sparse row format that you likely don't use); check the manual page http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-3.0.0/docs/manualpages/Mat/MatCreateMPIAIJWithSplitArrays.html
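
A minimal sketch of the sequential case, assuming your row pointers, column indices, and values are already 0-based PetscInt/PetscScalar arrays (the names ai, aj, av below are placeholders for your own arrays, not PETSc names):

#include <petscmat.h>

/* Wrap an existing 0-based CSR matrix without copying.  PETSc uses the
   arrays in place, so they must remain valid (and must not be freed)
   until the Mat is destroyed.  ai has length m+1 with ai[0] = 0. */
PetscErrorCode WrapCSR(PetscInt m, PetscInt n,
                       PetscInt *ai, PetscInt *aj, PetscScalar *av,
                       Mat *A)
{
  PetscErrorCode ierr;
  ierr = MatCreateSeqAIJWithArrays(PETSC_COMM_SELF, m, n, ai, aj, av, A);CHKERRQ(ierr);
  return 0;
}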
> I want to use the PETSc parallel solver capabilities, but I don't want to copy something that I already have. Now, I'm not familiar with the MatCreateShell() function that you mentioned. Could you expand on this?
http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-3.0.0/docs/manualpages/Mat/MatCreateShell.html
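
If your storage is not plain CSR, a shell matrix is the alternative: you register your own multiply routine and PETSc's Krylov solvers call it. A rough sketch, assuming a hypothetical CSR-like struct (MyCSR, MyMatMult, and CreateShellFromCSR are made-up names, not PETSc API) and a reasonably recent PETSc for VecGetArrayRead():

#include <petscmat.h>

/* Hypothetical container standing in for your own sparse-matrix structure */
typedef struct {
  PetscInt     m, n;           /* rows, columns                  */
  PetscInt    *rowptr, *col;   /* 0-based CSR row pointers, cols */
  PetscScalar *val;            /* nonzero values                 */
} MyCSR;

/* y = A*x computed directly from the custom storage */
static PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
{
  PetscErrorCode     ierr;
  void              *p;
  MyCSR             *csr;
  const PetscScalar *xa;
  PetscScalar       *ya;
  PetscInt           i, k;

  ierr = MatShellGetContext(A, &p);CHKERRQ(ierr);
  csr  = (MyCSR*)p;
  ierr = VecGetArrayRead(x, &xa);CHKERRQ(ierr);
  ierr = VecGetArray(y, &ya);CHKERRQ(ierr);
  for (i = 0; i < csr->m; i++) {
    ya[i] = 0.0;
    for (k = csr->rowptr[i]; k < csr->rowptr[i+1]; k++)
      ya[i] += csr->val[k]*xa[csr->col[k]];
  }
  ierr = VecRestoreArrayRead(x, &xa);CHKERRQ(ierr);
  ierr = VecRestoreArray(y, &ya);CHKERRQ(ierr);
  return 0;
}

/* Create the shell matrix and attach the multiply operation */
static PetscErrorCode CreateShellFromCSR(MyCSR *csr, Mat *A)
{
  PetscErrorCode ierr;
  ierr = MatCreateShell(PETSC_COMM_SELF, csr->m, csr->n, csr->m, csr->n, (void*)csr, A);CHKERRQ(ierr);
  ierr = MatShellSetOperation(*A, MATOP_MULT, (void (*)(void))MyMatMult);CHKERRQ(ierr);
  return 0;
}

Keep in mind that a shell matrix only provides the operations you register, so preconditioners that need direct access to the matrix entries (ILU and the like) will not work with it.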
>
> Thanks again
>
> Alejandro
>
>
> On Mar 21, 2011, at 2:14 PM, Barry Smith wrote:
>
>>
>> On Mar 21, 2011, at 6:43 AM, Alejandro Marcos Aragón wrote:
>>
>>> Hi everyone,
>>>
>>> I'm new to the list and to PETSc. I was wondering if this is possible: in my code I have my own sparse matrix data structure, so instead of creating a sparse matrix from PETSc and copying all the elements, I was wondering if there is a way to specify the storage of the elements. My custom data structure is a three-array compressed row format (or compressed column storage).
>>
>> If it truly is CSR you can use MatCreateSeqAIJWithArrays() or MatCreateMPIAIJWithArrays(). Otherwise you can use MatCreateShell() and provide your own function operations for MatMult() and whatever else you will need.
>>
>> Generally keeping your own sparse matrix data structure is overrated: yes, you may save a little time and a little memory, but then you lose all the flexibility and power of PETSc sparse matrices; why bother with PETSc in that case?
>>
>> Barry
>>
>>>
>>> Thank you,
>>>
>>> Alejandro M. Aragón, Ph.D.
>>
>