[petsc-users] Assembling a matrix for a DMComposite vector

Anush Krishnan k.anush at gmail.com
Fri May 2 11:34:33 CDT 2014


On 1 May 2014 18:25, Anton Popov <popov at uni-mainz.de> wrote:

>  On 5/1/14 10:39 PM, Anush Krishnan wrote:
>
> Hi Anton,
>
> On 29 April 2014 05:14, Anton Popov <popov at uni-mainz.de> wrote:
>
>>
>>  You can do the whole thing much more easily (in my opinion).
>> Since you created two DMDAs anyway, just do:
>>
>> - find the first index on every processor using MPI_Scan
>> - create two global vectors (no ghosts)
>> - put the proper global indices into the global vectors
>> - create two local vectors (with ghosts) and set ALL entries to -1 (to
>> have what you need in the boundary ghosts)
>> - call a global-to-local scatter
>>
>> Done!
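>>
>> A minimal sketch of these steps (assuming two 3D DMDAs named davx and
>> davy for the x- and y-velocity unknowns; names are illustrative and
>> error checking is omitted):
>>
>> Vec         gvx, gvy, lvx, lvy;
>> PetscInt    nvx, nvy, nloc, first, i;
>> PetscScalar *ax, *ay;
>>
>> DMCreateGlobalVector(davx, &gvx);
>> DMCreateGlobalVector(davy, &gvy);
>> VecGetLocalSize(gvx, &nvx);
>> VecGetLocalSize(gvy, &nvy);
>>
>> /* first global index owned by this rank in the combined numbering */
>> nloc = nvx + nvy;
>> MPI_Scan(&nloc, &first, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
>> first -= nloc;
>>
>> /* store the global indices as real numbers in the global vectors */
>> VecGetArray(gvx, &ax);
>> for (i = 0; i < nvx; i++) ax[i] = (PetscScalar)(first + i);
>> VecRestoreArray(gvx, &ax);
>> VecGetArray(gvy, &ay);
>> for (i = 0; i < nvy; i++) ay[i] = (PetscScalar)(first + nvx + i);
>> VecRestoreArray(gvy, &ay);
>>
>> /* ghosted local vectors; the -1 entries survive only in the ghost
>>    points outside the physical boundary, which the scatter never fills */
>> DMCreateLocalVector(davx, &lvx);
>> DMCreateLocalVector(davy, &lvy);
>> VecSet(lvx, -1.0);
>> VecSet(lvy, -1.0);
>> DMGlobalToLocalBegin(davx, gvx, INSERT_VALUES, lvx);
>> DMGlobalToLocalEnd(davx, gvx, INSERT_VALUES, lvx);
>> DMGlobalToLocalBegin(davy, gvy, INSERT_VALUES, lvy);
>> DMGlobalToLocalEnd(davy, gvy, INSERT_VALUES, lvy);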
>>
>
>  Won't the vectors contain floating point values? Are you storing your
> indices as real numbers?
>
> YES, exactly. And then I cast them to PetscInt when I compose stencils.
>
> Something like this:
>         idx[0] = (PetscInt) ivx[k][j][i];
>         idx[1] = (PetscInt) ivx[k][j][i+1];
>         idx[2] = (PetscInt) ivy[k][j][i];
>     ... and so on, where ivx, ivy, ... are the index arrays in the
> x, y, ... directions.
>
> Then I insert (actually add) stencils using MatSetValues.
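>
> For example, a row insertion might look like this (a hedged sketch; A,
> row, and vals are illustrative names, and the coefficients are dummies):
>
>         PetscInt    row     = (PetscInt) ivx[k][j][i];
>         PetscScalar vals[3] = {-2.0, 1.0, 1.0}; /* dummy coefficients */
>         MatSetValues(A, 1, &row, 3, idx, vals, ADD_VALUES);
>
> followed, once all rows have been added, by:
>
>         MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>         MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);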
>
> By the way, you can preallocate exactly in parallel with
> MatMPIAIJSetPreallocation. To count the number of entries in the
> diagonal and off-diagonal blocks precisely, use the same mechanism to
> access the global indices, and then compare them with the local row
> range, which is also known:
> - within the range   -> d_nnz[i]++;
> - outside the range -> o_nnz[i]++;
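>
> A minimal counting sketch (assuming A's type and sizes are already set;
> a toy 3-point stencil stands in for the real columns, which would come
> from the cast index arrays above; calloc needs <stdlib.h>):
>
>         PetscInt rstart, rend, N, nrows, i, j, *d_nnz, *o_nnz;
>         MatGetOwnershipRange(A, &rstart, &rend);
>         MatGetSize(A, &N, NULL);
>         nrows = rend - rstart;
>         d_nnz = (PetscInt*) calloc((size_t)nrows, sizeof(PetscInt));
>         o_nnz = (PetscInt*) calloc((size_t)nrows, sizeof(PetscInt));
>         for (i = 0; i < nrows; i++) {
>             PetscInt row     = rstart + i;
>             PetscInt cols[3] = {row - 1, row, row + 1}; /* toy stencil */
>             for (j = 0; j < 3; j++) {
>                 if (cols[j] < 0 || cols[j] >= N) continue; /* off domain */
>                 if (cols[j] >= rstart && cols[j] < rend) d_nnz[i]++;
>                 else                                     o_nnz[i]++;
>             }
>         }
>         MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
>         free(d_nnz); free(o_nnz);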
>

Thanks a lot for the help. I did exactly that and it worked perfectly.

But just to clarify: if I were using 32-bit floats, would I start having
trouble once my matrix size approaches ~16.7 million (2^24) rows, since a
single-precision float's 24-bit significand cannot represent every integer
exactly beyond that?
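
(For reference, a quick standalone check of where single precision stops
representing consecutive integers exactly:)

#include <stdio.h>

int main(void)
{
    /* 2^24 = 16777216: above this, a 32-bit float (24-bit significand)
       can no longer represent every integer exactly */
    float a = 16777216.0f; /* exact */
    float b = 16777217.0f; /* rounds back to 16777216 */
    printf("%.1f %.1f equal=%d\n", a, b, a == b); /* prints 16777216.0 16777216.0 equal=1 */
    return 0;
}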


>
> Anton
>
>
>
>>
>> The advantage is that you can access global indices (including ghosts) in
>> every block using the i-j-k indexing scheme.
>> I personally find this approach quite easy to implement with PETSc.
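>>
>> For instance, with a 3D DMDA (a hedged sketch; davx and lvx as in the
>> sketch above):
>>
>> PetscScalar ***ivx;
>> DMDAVecGetArray(davx, lvx, &ivx);
>> /* ivx[k][j][i] is the global index of the vx unknown at cell (i,j,k),
>>    valid on ghost cells too (-1 outside the physical domain) */
>> DMDAVecRestoreArray(davx, lvx, &ivx);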
>>
>> Anton
>>
>
>  Thank you,
> Anush
>
>
>