[petsc-users] AIJ assembly from AIJs

Dave May dave.mayhem23 at gmail.com
Sat Nov 22 17:39:23 CST 2014


An alternative would be this. Assuming you can map each sub-domain (row, col)
index into a global index for your coupled problem, you could do the following.

Create your matrix of type MATMPIAIJ.
Then, on your sub-domain (MATSEQAIJ) matrix, call
MatSeqAIJGetArray()
MatGetRowIJ()
Then for each row of your sub-domain matrix,
  1) convert your local row index into a global row index
  2) traverse through the CSR array and copy the local column indices into
a temporary array
  3) convert the local indices in the temporary array into global column
indices
  4) call MatSetValues() (with ADD_VALUES) to add the entries from each
sub-domain into the parallel MPIAIJ matrix for the current row. You won't
need to copy the aij values; just offset the pointer returned by
MatSeqAIJGetArray() so that it points to the beginning of the entries for
the current row.

Call the appropriate restore functions
MatRestoreRowIJ()
MatSeqAIJRestoreArray()
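
Putting the steps together, a minimal sketch (untested) might look like the
following. Here subK is your rank-local MATSEQAIJ sub-domain matrix, K is the
already-created (and preallocated) MATMPIAIJ matrix, and l2g[] is a
hypothetical array mapping local sub-domain indices to global indices of the
coupled problem.

  PetscErrorCode  ierr;
  PetscInt        i, k, m, maxrow = 0, *gcols;
  const PetscInt  *ia, *ja;
  PetscScalar     *vals;
  PetscBool       done;

  ierr = MatGetRowIJ(subK, 0, PETSC_FALSE, PETSC_FALSE, &m, &ia, &ja, &done);CHKERRQ(ierr);
  ierr = MatSeqAIJGetArray(subK, &vals);CHKERRQ(ierr);
  for (i = 0; i < m; i++) maxrow = PetscMax(maxrow, ia[i+1] - ia[i]);
  ierr = PetscMalloc1(maxrow, &gcols);CHKERRQ(ierr);
  for (i = 0; i < m; i++) {
    PetscInt nc   = ia[i+1] - ia[i];   /* nonzeros in local row i                 */
    PetscInt grow = l2g[i];            /* 1) local row index -> global row index  */
    for (k = 0; k < nc; k++)           /* 2)+3) copy and translate column indices */
      gcols[k] = l2g[ja[ia[i] + k]];
    /* 4) add this row; vals + ia[i] already points at the row's entries */
    ierr = MatSetValues(K, 1, &grow, nc, gcols, vals + ia[i], ADD_VALUES);CHKERRQ(ierr);
  }
  ierr = PetscFree(gcols);CHKERRQ(ierr);
  ierr = MatSeqAIJRestoreArray(subK, &vals);CHKERRQ(ierr);
  ierr = MatRestoreRowIJ(subK, 0, PETSC_FALSE, PETSC_FALSE, &m, &ia, &ja, &done);CHKERRQ(ierr);
  /* once every rank has added its rows */
  ierr = MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);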



PETSc team, it seems the web pages

http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSeqAIJGetArray.html

http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSeqAIJRestoreArray.html
both contain a typo: the input matrix should be of type MATSEQAIJ.

Cheers,
  Dave



On 23 November 2014 at 00:17, Matthew Knepley <knepley at gmail.com> wrote:

> On Sat, Nov 22, 2014 at 4:30 PM, Alp Kalpalp <alpkalpalp at gmail.com> wrote:
>
>> Hi,
>>
>> I am a newbie in PETSc but I could not find a solution to my problem. Let
>> me describe it to you. I have a SeqAIJ matrix on each processor. These
>> substructure stiffness matrices should be assembled into an MPIAIJ, and they
>> have overlapping dofs (on boundaries). A is an Eigen::SparseMatrix<double> and
>> my code was as follows:
>>
>> ierr = MatCreateSeqAIJWithArrays(PETSC_COMM_SELF,m,n,A.outerIndexPtr(),A.innerIndexPtr(),A.valuePtr(),&subK);CHKERRQ(ierr);
>> ierr = MatCreateMPIAIJSumSeqAIJ(PETSC_COMM_WORLD,subK,PETSC_DECIDE,PETSC_DECIDE,MAT_INITIAL_MATRIX,&K);CHKERRQ(ierr);
>>
>> Suppose I have 2 procs: the first has an 18x18 sparse matrix, the second a
>> 12x12 sparse matrix. The assembled system should be 24x24 for this specific
>> problem. The above code goes into an infinite loop; the debugger reports
>> that there is a deadlock in the program.
>>
>
> This function does not do this job:
>
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIAIJSumSeqAIJ.html
>
> It clearly says "The dimensions of the sequential matrix in each processor
> MUST be the same."
>
>> So, I searched for a workaround and I found:
>>
>> MatCreateMPIAIJWithArrays, but this function seems to fail when the matrices
>> have overlapping parts, so it assembles a 30x30 matrix!
>>
>> I checked MatCreateMPIAIJ, but it seems costly to form d_nnz etc.
>>
>> Finally, I tried the MatCreate/MatSetValues procedure; however, MatSetValues
>> requires a coordinate list of values, not CSR-format arrays!
>>
>> So my question is: how can I set up an MPIAIJ from my (distributed)
>> overlapping SeqAIJ matrices, preferably using my CSR arrays directly?
>>
>
> 1) I think the best option is to put your element matrices directly into
> the MPIAIJ matrix, since each elemMat can be set with
>     a single call to MatSetValues().
>
> 2) If that is too much reorganization, then I would
>
>   a) Use MatCreateMPIAIJWithArrays() for the non-overlapping rows (remove
> the ghost rows)
>
>   b) Call MatSetValues() on each ghost row
>
> 3) If you cannot remove the overlapping rows, I would just call
> MatSetValues() on each row. It is unlikely that
>     the insertion process is a bottleneck in your computation (unless you
> are not preallocating correctly)
>
>   Thanks,
>
>      Matt
>
>
>> Thanks in advance
>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
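
For completeness, here is a minimal sketch (untested) of option 1 from Matt's
reply above: adding one element's dense stiffness matrix into the parallel
matrix with a single MatSetValues() call. The names ne, elemDofs and elemMat
are hypothetical; K is again the already-created (and preallocated) MATMPIAIJ
matrix.

  PetscErrorCode ierr;
  PetscInt       ne = 8;             /* dofs per element (example value)  */
  PetscInt       elemDofs[8];        /* global dof indices of the element */
  PetscScalar    elemMat[8*8];       /* dense element matrix, row-major   */

  /* ... loop over the local elements, filling elemDofs and elemMat ... */
  ierr = MatSetValues(K, ne, elemDofs, ne, elemDofs, elemMat, ADD_VALUES);CHKERRQ(ierr);

  /* once every rank has added all of its elements */
  ierr = MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);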