[petsc-users] DMCreateMatrix preallocation

Smith, Barry F. bsmith at mcs.anl.gov
Mon Sep 17 18:38:38 CDT 2018


    The key is to understand that most PETSc functionality is supported by two levels of code: the abstract interface (usually defined in src/xxx/interface) and one or more implementations (usually defined in src/xxx/impls/*). The abstract interface functions are named XXXYYYY while the implementation functions are usually named XXXYYYY_ZZZZ. So, for example, MatCreate() is the abstract interface function and MatCreate_SeqAIJ() is one particular implementation.
You can find the interface and implementation functions by simply googling the manual page (for example http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreate.html) and then looking at the bottom of the page for the location:

Location

src/mat/utils/gcreate.c

and the implementations

Implementations

MatCreate_HYPREStruct in src/dm/impls/da/hypre/mhyp.c
MatCreate_HYPRESStruct in src/dm/impls/da/hypre/mhyp.c
MatCreate_MPIAdj in src/mat/impls/adj/mpi/mpiadj.c
...

They are all hyperlinks that you can click on to go exactly to the routine.

   Barry

If you use Emacs or vi/Vim you can also use etags or tags (see the PETSc users manual, pages 235-236) to jump directly to the various functions.



> On Sep 17, 2018, at 6:27 PM, Oleksandr Koshkarov <olk548 at mail.usask.ca> wrote:
> 
> Awesome, thank you. All these functions and this flexibility show the enormous beauty of PETSc's structure. I would like to start reading the PETSc internal code, to be more comfortable using it and to ask fewer stupid questions. However, it is quite intimidating to start. Maybe you have some pointers on where a PETSc user can start reading the code?
> 
> Thank you and best regards,
> 
> Oleksandr
> 
> 
> On 09/17/2018 05:23 PM, Smith, Barry F. wrote:
>> 
>>> On Sep 17, 2018, at 5:54 PM, Oleksandr Koshkarov <olk548 at mail.usask.ca> wrote:
>>> 
>>> Thank you. One more question:
>>> 
>>> Are DMDA matrices with a smaller stencil compatible with state vectors with a larger stencil width? If I precondition a finite volume method where I use stencil width 2 (or more) with a lower order method with stencil width 1, can this still work? (My Jacobian is matrix-free.)
>>    Yes
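[A sketch of why the answer is yes, not from the thread itself: the grid size (128), dof (4), boundary type, and the modern PetscCall() error checking are all assumptions for illustration. Two DMDAs that agree on global size and dof but differ in stencil width produce global vectors with identical parallel layout, so a matrix preallocated from the width-1 DM can precondition a matrix-free operator evaluated with the width-2 DM.]

```c
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM  da2, da1;   /* width-2 DMDA for the residual, width-1 for the PC */
  Mat P;
  Vec x;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Same global size and dof; only the stencil width differs, so the
     global vectors of the two DMs have identical parallel layout. */
  PetscCall(DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_PERIODIC, 128, 4, 2, NULL, &da2));
  PetscCall(DMSetUp(da2));
  PetscCall(DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_PERIODIC, 128, 4, 1, NULL, &da1));
  PetscCall(DMSetUp(da1));

  PetscCall(DMCreateGlobalVector(da2, &x)); /* state vector, width-2 DM */
  PetscCall(DMCreateMatrix(da1, &P));       /* preconditioner, width-1  */
  /* P can be assembled with MatSetValuesStencil() on da1 and handed to
     SNESSetJacobian() alongside a matrix-free Jacobian built on da2.   */

  PetscCall(VecDestroy(&x));
  PetscCall(MatDestroy(&P));
  PetscCall(DMDestroy(&da1));
  PetscCall(DMDestroy(&da2));
  PetscCall(PetscFinalize());
  return 0;
}
```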
>> 
>>> Thank you and best regards,
>>> 
>>> Oleksandr.
>>> 
>>> 
>>> On 09/17/2018 04:44 PM, Smith, Barry F. wrote:
>>>>> On Sep 17, 2018, at 5:36 PM, Oleksandr Koshkarov <olk548 at mail.usask.ca> wrote:
>>>>> 
>>>>> Wow, that is neat, thank you! I think it is exactly what I need. However, can you please clarify some details:
>>>>> 
>>>>> 1) does "ofill" represent all non-diagonal blocks? meaning they have the same pattern (e.g., in 1d I will have 1+2*stencil_width blocks)
>>>>    Yes, it represents all the off-diagonal blocks. This is a weakness in that, depending on the discretization, different off-diagonal blocks may have different nonzero structure. You need to use the union of the nonzero structure of all the off-diagonal blocks.
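[A minimal sketch of the union rule, not from the thread: the grid size, dof = 3, and the particular coupling pattern are made-up assumptions, and modern PetscCall() error checking is assumed. dfill and ofill are dof*dof row-major 0/1 tables; since the single ofill table is used for every off-diagonal block, it must contain a 1 wherever *any* off-diagonal block is nonzero.]

```c
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM       da;
  Mat      A;
  /* dof = 3: hypothetical coupling where components 0 and 1 interact
     within a grid point, component 2 is decoupled ... */
  PetscInt dfill[9] = {1, 1, 0,
                       1, 1, 0,
                       0, 0, 1};  /* coupling within a grid point      */
  /* ... and only component 0 sees components 0 and 1 of its neighbors;
     this table must be the UNION over all off-diagonal blocks.        */
  PetscInt ofill[9] = {1, 1, 0,
                       0, 0, 0,
                       0, 0, 0};

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 32, 3, 1, NULL, &da));
  PetscCall(DMSetUp(da));
  PetscCall(DMDASetBlockFills(da, dfill, ofill));
  PetscCall(DMCreateMatrix(da, &A)); /* preallocated from dfill/ofill  */
  PetscCall(MatDestroy(&A));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}
```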
>>>>> 2) regarding this note: "This only makes sense when you are doing multicomponent problems but using the MPIAIJ matrix format". It sounds like MPIAIJ is the wrong choice for a multicomponent problem. Actually, I do not pick the DMDA matrix format myself, and use the PETSc default (which is probably MPIAIJ).
>>>>    It is MPIAIJ.
>>>> 
>>>>> Maybe I should use something else for a more heavily multicomponent problem? My matrices arise from discretizing a 6D PDE, where I put 3D inside the dof... The discretization is all stencil based (finite volume method).
>>>>    The BAIJ format is only appropriate if the "blocks" are essentially dense (since BAIJ stores them as dense). The BAIJ format is not correct for you; you should be using the MPIAIJ format.
>>>> 
>>>>    Barry
>>>> 
>>>>> p.s. All these functions and this flexibility show the enormous beauty of PETSc's structure. I would like to start reading the PETSc internal code, to be more comfortable using it and to ask smarter questions. However, it is quite intimidating to start. Maybe you have some pointers on how I can start?
>>>>> 
>>>>> Thank you and best regards,
>>>>> 
>>>>> Oleksandr.
>>>>> 
>>>>> 
>>>>> On 09/17/2018 02:37 PM, Smith, Barry F. wrote:
>>>>>>   You can use DMDASetBlockFills() or DMDASetBlockFillsSparse().
>>>>>> 
>>>>>>    If you need any finer-scale control than they offer, you need to copy the PETSc source that does the preallocation for DMDA-generated matrices and customize it exactly for your problem.
>>>>>> 
>>>>>> 
>>>>>>     Barry
>>>>>> 
>>>>>> 
>>>>>>> On Sep 17, 2018, at 3:31 PM, Oleksandr Koshkarov <olk548 at mail.usask.ca> wrote:
>>>>>>> 
>>>>>>> Hello All,
>>>>>>> 
>>>>>>> I have a question about preallocation of the DMDA matrix. As I understand it, PETSc preallocates the number of nonzeros using the stencil width info. However, it seems this will not be efficient for me: I have a large dof (around 2000), where only some of those dofs are coupled, so the dof*dof blocks inside the DMDA matrix would also be sparse. I also use this DMDA matrix as a preconditioner, which has weaker coupling than the original operator, so the stencil is effectively smaller (I precondition a higher order finite volume method with a lower order FV method). Can I manually specify how many nonzeros need to be preallocated for the DMDA matrix?
>>>>>>> 
>>>>>>> p.s. I do not want to use a normal matrix, as I rely on DMDA indexing (MatSetValuesStencil).
>>>>>>> 
>>>>>>> Thank you and best regards,
>>>>>>> 
>>>>>>> Oleksandr.
>>>>>>> 
> 
