[petsc-users] Fortran example for PETSc global to natural ordering test

Danyang Su danyang.su at gmail.com
Mon Dec 3 19:32:44 CST 2018


Hi Matt,

Attached is the test example I wrote this afternoon, with source code, 
makefile, data, and screen output. The example reads a 2D mesh from a VTK 
file and then distributes it over all processors. I get the correct global 
order of the local nodes after distribution, but the natural order of the 
local nodes is always zero after calling DMPlexGlobalToNaturalBegin/End(). 
It seems the call does not take effect.
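
For reference, the core call sequence I expect to work is just the scatter 
below (a minimal sketch with my own variable names; vec_global is a global 
Vec created on the distributed DM, and vec_natural holds the result in 
natural ordering):

      call DMCreateGlobalVector(distributedMesh,vec_global,ierr)
      CHKERRQ(ierr)
      call VecDuplicate(vec_global,vec_natural,ierr)
      CHKERRQ(ierr)
      ! scatter from the distributed (PETSc) ordering to natural ordering
      call DMPlexGlobalToNaturalBegin(distributedMesh,vec_global, &
           vec_natural,ierr)
      CHKERRQ(ierr)
      call DMPlexGlobalToNaturalEnd(distributedMesh,vec_global, &
           vec_natural,ierr)
      CHKERRQ(ierr)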

Would you mind helping me check/test this code at your earliest convenience?

Thanks,

Danyang

On 2018-12-03 1:12 p.m., Danyang Su wrote:
>
>
> On 2018-12-03 12:56 p.m., Matthew Knepley wrote:
>> On Mon, Dec 3, 2018 at 3:40 PM Danyang Su <danyang.su at gmail.com> wrote:
>>
>>
>>     On 2018-12-03 12:03 p.m., Matthew Knepley wrote:
>>>     On Mon, Dec 3, 2018 at 2:27 PM Danyang Su <danyang.su at gmail.com> wrote:
>>>
>>>         Hi Matt,
>>>
>>>         Thanks.
>>>
>>>         BTW: DMPlexGetVertexNumbering now works with the latest
>>>         develop version, but the indices are not in natural ordering
>>>         even when DMSetUseNatural is called. That's why I want to use
>>>         PetscSFDistributeSection to check whether I am missing
>>>         anything in the code.
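>>>
>>>         For reference, I query it roughly like this (a sketch;
>>>         is_vertex is my name for the returned index set):
>>>
>>>             call DMPlexGetVertexNumbering(dm,is_vertex,ierr)
>>>             CHKERRQ(ierr)
>>>             ! view the numbering on all ranks for inspection
>>>             call ISView(is_vertex,PETSC_VIEWER_STDOUT_WORLD,ierr)
>>>             CHKERRQ(ierr)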
>>>
>>>     Can you explain that a little more? Maybe you can just push
>>>     forward what you want using the migrationSF.
>>
>>     Hi Matt,
>>
>>     Since I cannot figure out what is wrong or missing in my code, I
>>     followed the old ex26.c example in
>>     src/dm/impls/plex/examples/tests to create the similar code shown
>>     below to test global-to-natural ordering. The code may be ugly,
>>     with unnecessary functions in it. Using DMPlexGetVertexNumbering,
>>     I can get the values, but they are not in natural order; they are
>>     still in the default PETSc order, as if
>>     DMSetUseNatural(dm,PETSC_TRUE,ierr) had not been called.
>>
>> I do not understand what you are doing below. You just need to call
>>
>>     ierr = DMSetUseNatural(dm,PETSC_TRUE);CHKERRQ(ierr);
>>     ierr = DMPlexDistribute(dm,0,&migrationSF,&pdm);CHKERRQ(ierr);
>>     if (pdm) {
>>       ierr = DMPlexSetMigrationSF(pdm,migrationSF);CHKERRQ(ierr);
>>     }
>> and the DMGlobalToNaturalBegin/End() should work.
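>>
>> In Fortran, a rendering of the same sequence would look roughly like
>> this (a sketch, assuming the corresponding Fortran bindings exist; pdm
>> and migrationSF are placeholder names):
>>
>>     call DMSetUseNatural(dm,PETSC_TRUE,ierr)
>>     CHKERRQ(ierr)
>>     call DMPlexDistribute(dm,0,migrationSF,pdm,ierr)
>>     CHKERRQ(ierr)
>>     if (pdm /= PETSC_NULL_DM) then
>>       ! attach the migration SF so the natural ordering can be rebuilt
>>       call DMPlexSetMigrationSF(pdm,migrationSF,ierr)
>>       CHKERRQ(ierr)
>>     end if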
>
> You mean to use DMPlexGlobalToNaturalBegin/End(), right? That's what I 
> tried at first, but without success.
>
> I will create a test example to check further whether I can reproduce 
> the problem.
>
> Thanks,
>
> Danyang
>
>>
>>   Thanks,
>>
>>      Matt
>>
>>               if (rank == 0) then
>>                 call DMPlexCreateFromCellList(Petsc_Comm_World,ndim, &
>>                      num_cells,num_nodes,num_nodes_per_cell,Petsc_False, &
>>                      dmplex_cells,ndim,dmplex_verts,dm,ierr)
>>                 CHKERRQ(ierr)
>>               else
>>                 call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,0,0, &
>>                      num_nodes_per_cell,Petsc_False,dmplex_cells,ndim, &
>>                      dmplex_verts,dm,ierr)
>>                 CHKERRQ(ierr)
>>               end if
>>
>>               if (nprocs > 1) then
>>                 call DMSetUseNatural(dm,PETSC_TRUE,ierr)
>>                 CHKERRQ(ierr)
>>               end if
>>
>>               call DMPlexDistribute(dm,stencil_width,migrationsf, &
>>                    distributedMesh,ierr)
>>               CHKERRQ(ierr)
>>
>>               if (distributedMesh /= PETSC_NULL_DM) then
>>                 call PetscSFCreateInverseSF(migrationsf,migrationsf_inv,ierr)
>>                 CHKERRQ(ierr)
>>
>>                 call DMCreateGlobalToNatural(distributedMesh,migrationsf, &
>>                      migrationsf_inv,ierr)
>>                 CHKERRQ(ierr)
>>
>>                 call DMGetSection(distributedMesh,section,ierr)
>>                 CHKERRQ(ierr)
>>
>>                 call PetscSectionCreate(Petsc_Comm_World,section_seq,ierr)
>>                 CHKERRQ(ierr)
>>
>>                 call PetscSFDistributeSection(migrationsf_inv,section, &
>>                      PETSC_NULL_INTEGER,section_seq,ierr)
>>                 CHKERRQ(ierr)
>>
>>                 call DMPlexCreateGlobalToNaturalSF(distributedMesh, &
>>                      section_seq,migrationsf,sf_natural,ierr)
>>                 CHKERRQ(ierr)
>>
>>                 call DMSetUseNatural(distributedMesh,PETSC_TRUE,ierr)
>>                 CHKERRQ(ierr)
>>
>>                 call PetscSFDestroy(migrationsf,ierr)
>>                 CHKERRQ(ierr)
>>
>>                 call PetscSFDestroy(migrationsf_inv,ierr)
>>                 CHKERRQ(ierr)
>>               end if
>>
>>     Thanks,
>>
>>     Danyang
>>
>>>
>>>       Thanks,
>>>
>>>          Matt
>>>
>>>         Regards,
>>>
>>>         Danyang
>>>
>>>         On 2018-12-03 5:22 a.m., Matthew Knepley wrote:
>>>>         I need to write a custom Fortran stub for this one. I will
>>>>         get it done as soon as possible.
>>>>
>>>>           Thanks,
>>>>
>>>>             Matt
>>>>
>>>>         On Sat, Dec 1, 2018 at 7:16 PM Danyang Su via petsc-users
>>>>         <petsc-users at mcs.anl.gov> wrote:
>>>>
>>>>             Hi All,
>>>>
>>>>             I got a simple build error when using
>>>>             PetscSFDistributeSection in Fortran. It looks like the
>>>>             required header files are included and the parameters
>>>>             are correctly defined. However, when I compile the code,
>>>>             I get the link error undefined reference to
>>>>             `petscsfdistributesection_'. The code is shown below.
>>>>             Did I miss anything here?
>>>>
>>>>             #include <petsc/finclude/petscsys.h>
>>>>             #include <petsc/finclude/petscvec.h>
>>>>             #include <petsc/finclude/petscdm.h>
>>>>             #include <petsc/finclude/petscdmplex.h>
>>>>                    use petscsys
>>>>                    use petscvec
>>>>                    use petscdm
>>>>                    use petscdmplex
>>>>
>>>>                    implicit none
>>>>
>>>>                    PetscSection ::  section, section_seq
>>>>                    PetscSF :: migrationsf_inv, sf_natural
>>>>                    Vec :: vec_global, vec_natural
>>>>                    PetscErrorCode :: ierr
>>>>
>>>>                    ...
>>>>
>>>>                    call PetscSFDistributeSection(migrationsf_inv,section, &
>>>>                         PETSC_NULL_INTEGER,section_seq,ierr)
>>>>                    CHKERRQ(ierr)
>>>>
>>>>             Thanks,
>>>>
>>>>             Danyang
>>>>
>>
>> -- 
>> What most experimenters take for granted before they begin their 
>> experiments is infinitely more interesting than any results to which 
>> their experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
A non-text attachment was scrubbed...
Name: petsc-dmplex-test.tar.gz
Type: application/gzip
Size: 8518 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20181203/24a677a8/attachment-0001.bin>

