[petsc-users] Reduce DMDAs

Filippo Leonardi filippo.leonardi at sam.math.ethz.ch
Thu Sep 25 02:43:20 CDT 2014


Hi,

Let's say I have some independent DMDAs, created as follows:

    MPI_Comm newcomm;
    int      comm_rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &comm_rank);
    MPI_Comm_split(MPI_COMM_WORLD, comm_rank % 2, 0, &newcomm);

    DM da;
    DMDACreate2d(newcomm, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_NONE,
                 DMDA_STENCIL_BOX, 50, 50, PETSC_DECIDE, PETSC_DECIDE,
                 1, 1, NULL, NULL, &da);

For instance, on 4 processes this gives me 2 DMDAs. Now I want to reduce (in the
MPI sense) the global/local DMDA vectors onto only one of the MPI groups (say
group 0). Is there an elegant way to do that (e.g. with scatters)?

My current implementation would be: get the local array on each process and
reduce it (with MPI_Reduce) to the matching rank of group 0; a sketch of this
approach follows the diagram below. For instance, with 8 processes split into
two groups of 4:

DMDA for group 0:
+------+------+
| 0    | 1    |
|      |      |
+------+------+
| 2    | 3    |
|      |      |
+------+------+
DMDA for group 1:
+------+------+
| 4    | 5    |
|      |      |
+------+------+
| 6    | 7    |
|      |      |
+------+------+

Reduce ranks 0 and 4 to rank 0.
Reduce ranks 1 and 5 to rank 1.
Reduce ranks 2 and 6 to rank 2.
Reduce ranks 3 and 7 to rank 3.
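
Concretely, my plan looks like the following untested sketch (peercomm and
grouprank are placeholder names; I assume both DMDAs are created with identical
arguments, so matching group ranks own the same subdomain):

    /* Sketch: pair the two ranks (one per group) that own the same
       subdomain and MPI_Reduce the raw local arrays between them. */
    Vec          vec;
    PetscInt     n;
    PetscScalar *arr;
    MPI_Comm     peercomm;
    int          grouprank;

    DMCreateGlobalVector(da, &vec);
    /* ... fill vec ... */

    /* One peer communicator per group rank: each contains exactly the
       two ranks that own the same patch. */
    MPI_Comm_rank(newcomm, &grouprank);
    MPI_Comm_split(MPI_COMM_WORLD, grouprank, 0, &peercomm);

    VecGetLocalSize(vec, &n);
    VecGetArray(vec, &arr);
    /* Group membership tested with the same comm_rank % 2 used in the
       split above; MPIU_SCALAR is the MPI datatype for PetscScalar. */
    if (comm_rank % 2 == 0) { /* group 0 member: receives in place */
      MPI_Reduce(MPI_IN_PLACE, arr, n, MPIU_SCALAR, MPI_SUM, 0, peercomm);
    } else {                  /* group 1 member: sends */
      MPI_Reduce(arr, NULL, n, MPIU_SCALAR, MPI_SUM, 0, peercomm);
    }
    VecRestoreArray(vec, &arr);

Using key 0 in the split keeps the peer ranks ordered by world rank, so the
group-0 member is always root 0 of its peer communicator.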

Clearly this implementation is cumbersome. Any ideas?

Best,
Filippo