[petsc-users] Assembling a matrix for a DMComposite vector
Anton Popov
popov at uni-mainz.de
Thu May 1 17:25:06 CDT 2014
On 5/1/14 10:39 PM, Anush Krishnan wrote:
> Hi Anton,
>
> On 29 April 2014 05:14, Anton Popov <popov at uni-mainz.de> wrote:
>
>
>     You can do the whole thing much easier (in my opinion).
> Since you created two DMDA anyway, just do:
>
> - find first index on every processor using MPI_Scan
> - create two global vectors (no ghosts)
>     - put the proper global indices into the global vectors
> - create two local vectors (with ghosts) and set ALL entries to -1
> (to have what you need in boundary ghosts)
> - call global-to-local scatter
>
> Done!
>
>
> Won't the vectors contain floating point values? Are you storing your
> indices as real numbers?
YES, exactly. And then I cast them to PetscInt when I compose stencils.
Something like this:
idx[0] = (PetscInt) ivx[k][j][i];
idx[1] = (PetscInt) ivx[k][j][i+1];
idx[2] = (PetscInt) ivy[k][j][i];
... and so on, where ivx, ivy, ... are the index arrays in the x, y, ...
directions.
Then I insert (actually add) stencils using MatSetValues.
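To make this concrete, here is a minimal sketch of the recipe quoted above
for a single 3D DMDA field with dof = 1 (say the x-velocity). The function
and variable names (BuildIndexField, davx, ivx_global, ivx_local) are made
up for illustration; for a DMComposite the MPI_Scan should use the total
local size of all fields, and each field then takes its slice of
[start, start + nlocal):

#include <petscdmda.h>

PetscErrorCode BuildIndexField(DM davx, Vec *ivx_local)
{
  Vec            ivx_global;
  PetscScalar ***ivx;
  PetscInt       nlocal, start, i, j, k, xs, ys, zs, xm, ym, zm;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* number of unknowns owned by this process (dof = 1) */
  ierr = DMDAGetCorners(davx, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  nlocal = xm*ym*zm;

  /* first global index on this process: exclusive prefix sum of local sizes */
  ierr = MPI_Scan(&nlocal, &start, 1, MPIU_INT, MPI_SUM,
                  PetscObjectComm((PetscObject)davx));CHKERRQ(ierr);
  start -= nlocal;

  /* global (non-ghosted) vector holding the global indices as real numbers */
  ierr = DMCreateGlobalVector(davx, &ivx_global);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(davx, ivx_global, &ivx);CHKERRQ(ierr);
  for (k = zs; k < zs+zm; k++)
    for (j = ys; j < ys+ym; j++)
      for (i = xs; i < xs+xm; i++)
        ivx[k][j][i] = (PetscScalar)start++;
  ierr = DMDAVecRestoreArray(davx, ivx_global, &ivx);CHKERRQ(ierr);

  /* ghosted local vector, preset to -1 so untouched boundary ghosts stay flagged */
  ierr = DMCreateLocalVector(davx, ivx_local);CHKERRQ(ierr);
  ierr = VecSet(*ivx_local, -1.0);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(davx, ivx_global, INSERT_VALUES, *ivx_local);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(davx, ivx_global, INSERT_VALUES, *ivx_local);CHKERRQ(ierr);
  ierr = VecDestroy(&ivx_global);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Accessing the resulting local vectors with DMDAVecGetArray gives ivx, ivy,
... as used in idx[] above, and one row of the matrix is then added with
something like

  ierr = MatSetValues(A, 1, &row, nstencil, idx, v, ADD_VALUES);CHKERRQ(ierr);

followed by MatAssemblyBegin/MatAssemblyEnd once all rows are in.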
By the way, you can also preallocate exactly in parallel with
MatMPIAIJSetPreallocation. To count the number of entries in the diagonal
and off-diagonal blocks precisely, use the same mechanism to access the
global indices, and then compare them with the local row range, which is
also known:
- within the range -> d_nnz[i]++;
- outside the range -> o_nnz[i]++;
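A minimal sketch of that counting step, again with made-up names
(CountStencilRow, A): cols holds the global column indices of the stencil
for one locally owned row, obtained from the index vectors exactly like
idx[] above, and rstart/rend come from MatGetOwnershipRange(A, &rstart,
&rend):

static void CountStencilRow(PetscInt row, PetscInt rstart, PetscInt rend,
                            PetscInt ncols, const PetscInt cols[],
                            PetscInt d_nnz[], PetscInt o_nnz[])
{
  PetscInt n;
  for (n = 0; n < ncols; n++) {
    if (cols[n] >= rstart && cols[n] < rend) d_nnz[row - rstart]++; /* diagonal block     */
    else                                     o_nnz[row - rstart]++; /* off-diagonal block */
  }
}

Allocate d_nnz and o_nnz of length rend - rstart, zero them, call the
counter for every locally owned row with the same column indices you later
pass to MatSetValues, and finish with
MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz).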
Anton
>
>
>     The advantage is that you can access the global indices (including
>     ghosts) in every block using the i-j-k indexing scheme.
>     I personally find this way quite easy to implement with PETSc.
>
> Anton
>
>
> Thank you,
> Anush