[petsc-dev] Petsc "make test" have more failures for --with-openmp=1
Jed Brown
jed at jedbrown.org
Fri Mar 19 09:21:09 CDT 2021
Matthew Knepley <knepley at gmail.com> writes:
> On Fri, Mar 19, 2021 at 7:30 AM Lawrence Mitchell <wence at gmx.li> wrote:
>
>> > On 19 Mar 2021, at 03:51, Jed Brown <jed at jedbrown.org> wrote:
>> >
>> > It's a notable weakness of DMPlex that it does not apply such an
>> > ordering of dofs and I've complained to Matt about it many times over the
>> > years, but any blame rests solely with me for not carving out time to
>> > implement it here.
>>
>> I think there is a way to do this with plex (at least we do it), like so:
>>
>> DMPlexGetOrdering(dm, MATORDERINGRCM, NULL, &isperm);
>>
>> you might need to then invert this ordering.
>>
>> Now when you make a section you either say
>>
>> PetscSectionSetPermutation(section, isperm);
>>
>
> You can do that if you only want to permute the dofs, but usually I want to
> reorder everything so that any Section comes out right
>
> DMPlexPermute()
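For the archives, the Section route Lawrence describes spelled out looks roughly like this (just a sketch: the helper name is mine, whether you actually need the ISInvertPermutation depends on which direction you want the numbering to go, and the permutation has to be attached before the section is set up):

#include <petscdmplex.h>

/* Sketch: attach an RCM point ordering to a not-yet-set-up PetscSection.
   The function name is made up for illustration. */
static PetscErrorCode SetSectionRCMPermutation(DM dm, PetscSection section)
{
  IS             perm, iperm;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = DMPlexGetOrdering(dm, MATORDERINGRCM, NULL, &perm);CHKERRQ(ierr); /* RCM ordering of mesh points */
  ierr = ISInvertPermutation(perm, PETSC_DECIDE, &iperm);CHKERRQ(ierr);    /* invert if that is the direction you want */
  ierr = PetscSectionSetPermutation(section, iperm);CHKERRQ(ierr);         /* dof layout will follow this permutation */
  ierr = ISDestroy(&iperm);CHKERRQ(ierr);
  ierr = ISDestroy(&perm);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}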
Last we talked, and in all that's tested, DMPlexPermute can only permute within strata, not across strata. That would incur the "basically point Jacobi" problem for naive OpenMP smoothers highlighted in Allison's paper, and (what I care about more) the memory-locality impact is similar to moving from interlaced fields to segregated fields. For example, instrumenting src/dm/impls/plex/tests/ex10.c to view the permutation:
diff --git i/src/dm/impls/plex/tests/ex10.c w/src/dm/impls/plex/tests/ex10.c
index eed9aa878e..cfe35dc1a6 100644
--- i/src/dm/impls/plex/tests/ex10.c
+++ w/src/dm/impls/plex/tests/ex10.c
@@ -80,6 +80,7 @@ PetscErrorCode TestReordering(DM dm, AppCtx *user)
PetscFunctionBegin;
ierr = DMPlexGetOrdering(dm, order, NULL, &perm);CHKERRQ(ierr);
+ ierr = ISView(perm, NULL);CHKERRQ(ierr);
ierr = DMPlexPermute(dm, perm, &pdm);CHKERRQ(ierr);
ierr = PetscObjectSetOptionsPrefix((PetscObject) pdm, "perm_");CHKERRQ(ierr);
ierr = DMSetFromOptions(pdm);CHKERRQ(ierr);
$ $PETSC_ARCH/tests/dm/impls/plex/tests/ex10 -dim 2 -num_dof 1,1,1 -perm_dm_view
IS Object: 1 MPI processes
type: general
Number of indices in set 33
0 0
1 2
2 1
3 5
4 3
5 7
6 6
7 4
8 8
9 9
10 13
11 10
12 11
13 15
14 12
15 14
16 16
17 17
18 18
19 19
20 22
21 23
22 21
23 20
24 24
25 28
26 29
27 25
28 30
29 31
30 32
31 26
32 27
DM Object: (perm_) 1 MPI processes
type: plex
DM_0x564eba4a45d0_1 in 2 dimensions:
0-cells: 9
1-cells: 16
2-cells: 8
Labels:
celltype: 3 strata with value/size (0 (9), 3 (8), 1 (16))
depth: 3 strata with value/size (0 (9), 1 (16), 2 (8))
marker: 1 strata with value/size (1 (16))
Face Sets: 1 strata with value/size (1 (8))
Field Field_0:
adjacency FEM
Ordering method rcm reduced bandwidth from 51 to 51
Notice how the permutation stays within the vertices {0, ..., 8}, the edges {9, ..., 24}, and the cells {25, ..., 32}. I would like to get rid of that restriction, but you've said it would have significant non-local consequences, so I haven't tried.
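If anyone wants to check that property from code rather than by eyeballing the IS above, a rough version of the check would look like the following (sketch only; the function name is made up, and since a stratum-preserving permutation is stratum-preserving in both directions, it doesn't matter which way the IS maps points):

#include <petscdmplex.h>

/* Sketch: report any point whose image under the ordering leaves that point's
   depth stratum.  Could be dropped into ex10 right after the ISView() call. */
static PetscErrorCode CheckWithinStrata(DM dm, IS perm)
{
  const PetscInt *idx;
  PetscInt        pStart, pEnd, depth, d, p;
  PetscErrorCode  ierr;

  PetscFunctionBegin;
  ierr = DMPlexGetChart(dm, &pStart, &pEnd);CHKERRQ(ierr);
  ierr = DMPlexGetDepth(dm, &depth);CHKERRQ(ierr);
  ierr = ISGetIndices(perm, &idx);CHKERRQ(ierr);
  for (d = 0; d <= depth; ++d) {
    PetscInt sStart, sEnd;

    ierr = DMPlexGetDepthStratum(dm, d, &sStart, &sEnd);CHKERRQ(ierr);
    for (p = sStart; p < sEnd; ++p) {
      if (idx[p - pStart] < sStart || idx[p - pStart] >= sEnd) {
        ierr = PetscPrintf(PETSC_COMM_SELF, "point %D leaves its stratum: -> %D\n", p, idx[p - pStart]);CHKERRQ(ierr);
      }
    }
  }
  ierr = ISRestoreIndices(perm, &idx);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

For the ex10 run above it would print nothing, since every block of the permutation stays inside its stratum.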