[petsc-users] Question regarding DMPlex reordering
Pierre Seize
pierre.seize at onera.fr
Wed Oct 27 08:54:09 CDT 2021
Hi, thanks for the fix. It seems to work fine.
Out of curiosity, I noticed the MatOrderingType argument of DMPlexGetOrdering is
not used. Is this intentional? Is it to match MatGetOrdering?
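For reference, the prototype I am looking at (from the 3.16 source, if I read
it correctly) is:

  PetscErrorCode DMPlexGetOrdering(DM dm, MatOrderingType otype, DMLabel label, IS *perm);

and as far as I can tell, otype does not change the ordering that is computed.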
Pierre
On 27/10/21 15:03, Matthew Knepley wrote:
> On Wed, Oct 27, 2021 at 3:15 AM Pierre Seize
> <pierre.seize at onera.fr> wrote:
>
>
>
> On 26/10/21 22:28, Matthew Knepley wrote:
>> On Tue, Oct 26, 2021 at 10:17 AM Pierre Seize
>> <pierre.seize at onera.fr> wrote:
>>
>> Hi, I had the idea to try to renumber my mesh cells, as I've
>> heard it's better: "neighbouring cells are stored next to one
>> another, and memory accesses are faster".
>>
>> Right now, I load the mesh and then I distribute it over the
>> processes. I thought I'd try to permute the numbering between
>> those two steps (sketched below):
>>
>> DMPlexCreateFromFile
>> DMPlexGetOrdering
>> DMPlexPermute
>> DMPlexDistribute
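>>
>> Roughly, as a sketch (error checking with ierr/CHKERRQ as usual; the
>> file name and overlap are just placeholders for this example):
>>
>>   DM dm, foo_dm;
>>   IS perm;
>>
>>   ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "data/box.msh", PETSC_TRUE, &dm); CHKERRQ(ierr);
>>   ierr = DMPlexGetOrdering(dm, NULL, NULL, &perm); CHKERRQ(ierr);
>>   ierr = DMPlexPermute(dm, perm, &foo_dm); CHKERRQ(ierr);
>>   if (foo_dm) { ierr = DMDestroy(&dm); CHKERRQ(ierr); dm = foo_dm; }
>>   ierr = DMPlexDistribute(dm, 0, NULL, &foo_dm); CHKERRQ(ierr);
>>   if (foo_dm) { ierr = DMDestroy(&dm); CHKERRQ(ierr); dm = foo_dm; }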
>>
>> but that gives me an error when it runs on more than one process:
>>
>> [0]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [0]PETSC ERROR: No support for this operation for this object
>> type
>> [0]PETSC ERROR: Number of dofs for point 0 in the local
>> section should be positive
>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for
>> trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.16.0, unknown
>> [0]PETSC ERROR: ./build/bin/yanss on a named ldmpe202z.onera
>> by pseize Tue Oct 26 16:03:33 2021
>> [0]PETSC ERROR: Configure options --PETSC_ARCH=arch-ld-gcc
>> --download-metis --download-parmetis --prefix=~/.local
>> --with-cgns
>> [0]PETSC ERROR: #1 PetscPartitionerDMPlexPartition() at
>> /stck/pseize/softwares/petsc/src/dm/impls/plex/plexpartition.c:720
>> [0]PETSC ERROR: #2 DMPlexDistribute() at
>> /stck/pseize/softwares/petsc/src/dm/impls/plex/plexdistribute.c:1630
>> [0]PETSC ERROR: #3 MeshLoadFromFile() at src/spatial.c:689
>> [0]PETSC ERROR: #4 main() at src/main.c:22
>> [0]PETSC ERROR: PETSc Option Table entries:
>> [0]PETSC ERROR: -draw_comp 0
>> [0]PETSC ERROR: -mesh data/box.msh
>> [0]PETSC ERROR: -mesh_view draw
>> [0]PETSC ERROR: -riemann anrs
>> [0]PETSC ERROR: -ts_max_steps 100
>> [0]PETSC ERROR: -vec_view_partition
>> [0]PETSC ERROR: ----------------End of Error Message
>> -------send entire error message to petsc-maint at mcs.anl.gov----------
>>
>> I checked: before I tried to reorder the mesh, dm->localSection
>> was NULL when entering DMPlexDistribute. I was able to fix the
>> error by calling DMSetLocalSection(dm, NULL) after DMPlexPermute,
>> but it doesn't seem like the right way to do what I want. Does
>> someone have any advice?
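>>
>> In code, the workaround is roughly this (just a sketch of what I
>> tried, not something I am sure is legitimate):
>>
>>   ierr = DMPlexPermute(dm, perm, &foo_dm); CHKERRQ(ierr);
>>   if (foo_dm) { ierr = DMDestroy(&dm); CHKERRQ(ierr); dm = foo_dm; }
>>   /* drop the local section that appeared as a side effect */
>>   ierr = DMSetLocalSection(dm, NULL); CHKERRQ(ierr);
>>   ierr = DMPlexDistribute(dm, 2, NULL, &foo_dm); CHKERRQ(ierr);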
>>
>> Oh, this is probably me trying to be too clever. If a local
>> section is defined, then I try to use the number of dofs in it to
>> load balance better.
>> There should never be a negative number of dofs in the local
>> section (a global section uses this to indicate a dof owned by
>> another process).
>> So eliminating the local section will definitely fix that error.
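>>
>> If you want to see exactly what the partitioner is looking at, a
>> quick sketch like this (assuming your DM is called dm) prints the
>> dof count of every point in the local section:
>>
>>   PetscSection s;
>>   PetscInt     pStart, pEnd, p;
>>
>>   ierr = DMGetLocalSection(dm, &s); CHKERRQ(ierr);
>>   ierr = PetscSectionGetChart(s, &pStart, &pEnd); CHKERRQ(ierr);
>>   for (p = pStart; p < pEnd; ++p) {
>>     PetscInt dof;
>>
>>     ierr = PetscSectionGetDof(s, p, &dof); CHKERRQ(ierr);
>>     ierr = PetscPrintf(PETSC_COMM_SELF, "point %D: %D dof(s)\n", p, dof); CHKERRQ(ierr);
>>   }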
>>
>> Now the question is how you got a local section. DMPlexPermute()
>> does not create one, so it seems like you had one ahead of time,
>> and that its values were not valid.
>
> DMPlexPermute calls DMGetLocalSection, which creates
> dm->localSection if it is NULL, so before DMPlexPermute my
> dm->localSection is NULL, and after it is set. Because of that I
> enter the if at src/dm/impls/plex/plexpartition.c:707 and then I
> get the error.
> If I have a "wrong" dm->localSection, I think it has to come from
> DMPlexPermute.
>
>> Note that you can probably get rid of some of the loading code using
>>
>> DMCreate(comm, &dm);
>> DMSetType(dm, DMPLEX);
>> DMSetFromOptions(dm);
>> DMViewFromOptions(dm, NULL, "-mesh_view");
>>
>> and use
>>
>> -dm_plex_filename data/box.msh -mesh_view
>
> My loading code is already small, but just to make sure, I wrote
> this minimal example:
>
> #include <petsc.h>
>
> static char help[] = "Minimal reordering example.\n";
>
> int main(int argc, char **argv){
>   PetscErrorCode ierr;
>
>   ierr = PetscInitialize(&argc, &argv, NULL, help); if (ierr) return ierr;
>
>   DM dm, foo_dm;
>   ierr = DMCreate(PETSC_COMM_WORLD, &dm); CHKERRQ(ierr);
>   ierr = DMSetType(dm, DMPLEX); CHKERRQ(ierr);
>   ierr = DMSetFromOptions(dm); CHKERRQ(ierr);
>
>   /* reorder the (still serial) mesh, then swap dm for the permuted DM */
>   IS perm;
>   ierr = DMPlexGetOrdering(dm, NULL, NULL, &perm); CHKERRQ(ierr);
>   ierr = DMPlexPermute(dm, perm, &foo_dm); CHKERRQ(ierr);
>   if (foo_dm) {
>     ierr = DMDestroy(&dm); CHKERRQ(ierr);
>     dm = foo_dm;
>   }
>   /* distribute with an overlap of 2 */
>   ierr = DMPlexDistribute(dm, 2, NULL, &foo_dm); CHKERRQ(ierr);
>   if (foo_dm) {
>     ierr = DMDestroy(&dm); CHKERRQ(ierr);
>     dm = foo_dm;
>   }
>
>   ierr = ISDestroy(&perm); CHKERRQ(ierr);
>   ierr = DMDestroy(&dm); CHKERRQ(ierr);
>   ierr = PetscFinalize();
>   return ierr;
> }
>
> I ran it with mpiexec -n 2 ./build/bin/yanss -dm_plex_filename
> data/box.msh. The mesh is a 2D box from Gmsh, but I get the same
> result with every mesh I've tried. It runs fine with 1 process but
> gives the previous error with more processes.
>
>
> Hi Pierre,
>
> You are right. This is my bug. Here is the fix:
>
> https://gitlab.com/petsc/petsc/-/merge_requests/4504
>
> Is it possible to try this branch?
>
> Thanks,
>
> Matt
>
> Pierre
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/