[petsc-users] Question on DMPlexCreateFromCellList and DMPlexCreateFromFile
Danyang Su
danyang.su at gmail.com
Thu Feb 15 18:40:21 CST 2018
Hi Matt,
I have a question on DMPlexCreateFromCellList and DMPlexCreateFromFile.
When I use DMPlexCreateFromFile with a Gmsh file as input, it works fine and
each processor gets its own part. However, when I use
DMPlexCreateFromCellList, every processor ends up with the same global mesh.
To my understanding, I should pass the global mesh as input, right?
Otherwise, I should use DMPlexCreateFromCellListParallel instead if the
input is a local mesh.
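Is something like the following rank-0-only pattern the intended usage for
the serial variant, i.e. give the whole mesh to rank 0, empty lists to all
other ranks, and then distribute? This is just a sketch using the same
ndim, num_cells, num_nodes, num_nodes_per_cell, dmplex_cells, dmplex_verts
and rank variables as in the test code below; I have not verified it:

!c sketch: give the entire mesh to rank 0 and nothing to the others,
!c then let DMPlexDistribute spread it over the processors
      if (rank == 0) then
        call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,num_cells,  &
                                      num_nodes,num_nodes_per_cell,     &
                                      Petsc_True,dmplex_cells,ndim,     &
                                      dmplex_verts,dmda_flow%da,ierr)
      else
        call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,0,0,        &
                                      num_nodes_per_cell,Petsc_True,    &
                                      dmplex_cells,ndim,dmplex_verts,   &
                                      dmda_flow%da,ierr)
      end if
      CHKERRQ(ierr)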
Below is the test code I use; the results from method 1 are wrong and those
from method 2 are correct. Would you please help me check whether I did
anything wrong with the DMPlexCreateFromCellList input?
!test with 4 processors, global num_cells = 8268, global num_nodes = 4250
!correct results (method 2, DMPlexCreateFromFile)
check rank 2 istart 2034 iend 3116
check rank 3 istart 2148 iend 3293
check rank 1 istart 2044 iend 3133
check rank 0 istart 2042 iend 3131
!wrong results (method 1, DMPlexCreateFromCellList): istart = 8268 =
!num_cells and iend = 12518 = num_cells + num_nodes on every rank,
!i.e. each processor holds the entire mesh
check rank 0 istart 8268 iend 12518
check rank 1 istart 8268 iend 12518
check rank 2 istart 8268 iend 12518
check rank 3 istart 8268 iend 12518
!c ************* test part *********************
!c method 1: create DMPlex from cell list; result is the same duplicated
!c global mesh over all processors
!c the input parameters num_cells, num_nodes, dmplex_cells,
!c dmplex_verts are all global parameters (global mesh data)
      call DMPlexCreateFromCellList(Petsc_Comm_World,ndim,num_cells,    &
                                    num_nodes,num_nodes_per_cell,       &
                                    Petsc_True,dmplex_cells,ndim,       &
                                    dmplex_verts,dmda_flow%da,ierr)
      CHKERRQ(ierr)
!c method 2: create DMPlex from Gmsh file, for test purposes; this
!c works fine, each processor gets its own part
      call DMPlexCreateFromFile(Petsc_Comm_World,                       &
                                prefix(:l_prfx)//'.msh',0,              &
                                dmda_flow%da,ierr)
      CHKERRQ(ierr)
!c *************end of test part*********************
      distributedMesh = PETSC_NULL_OBJECT
!c distribute mesh over processes
      call DMPlexDistribute(dmda_flow%da,0,PETSC_NULL_OBJECT,           &
                            distributedMesh,ierr)
      CHKERRQ(ierr)
!c destroy original global mesh after distribution
      if (distributedMesh /= PETSC_NULL_OBJECT) then
        call DMDestroy(dmda_flow%da,ierr)
        CHKERRQ(ierr)
!c set the global mesh as distributed mesh
        dmda_flow%da = distributedMesh
      end if
!c get coordinates
      call DMGetCoordinatesLocal(dmda_flow%da,gc,ierr)
      CHKERRQ(ierr)
      call DMGetCoordinateDM(dmda_flow%da,cda,ierr)
      CHKERRQ(ierr)
      call DMGetDefaultSection(cda,cs,ierr)
      CHKERRQ(ierr)
      call PetscSectionGetChart(cs,istart,iend,ierr)
      CHKERRQ(ierr)
#ifdef DEBUG
      if (info_debug > 0) then
        write(*,*) "check rank ",rank," istart ",istart," iend ",iend
      end if
#endif
Thanks and regards,
Danyang
Attachment: stripf.msh (model/mesh, 382393 bytes): <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20180215/bef353a0/attachment-0001.msh>