[petsc-users] Domain decomposition using DMPLEX

Smith, Barry F. bsmith at mcs.anl.gov
Mon Nov 25 23:02:50 CST 2019


"No, I have an unstructured mesh that increases in resolution away from the center of the cuboid. See Figure: 5 in the ArXiv paper https://arxiv.org/pdf/1907.02604.pdf  for a slice through the midplane of the cuboid.  Given this type of mesh, will dmplex do a cuboidal domain decomposition?"

  No, definitely not. Why do you need a cuboidal domain decomposition?

  Barry


> On Nov 25, 2019, at 10:45 PM, Swarnava Ghosh <swarnava89 at gmail.com> wrote:
> 
> Hi Matt,
> 
> 
> https://arxiv.org/pdf/1907.02604.pdf  
> 
> On Mon, Nov 25, 2019 at 7:54 PM Matthew Knepley <knepley at gmail.com> wrote:
> On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh <swarnava89 at gmail.com> wrote:
> Dear PETSc users and developers,
> 
> I am working with DMPlex to distribute a 3D unstructured mesh made of tetrahedra in a cuboidal domain. I have a few queries:
> 1) Is there any way of ensuring load balancing based on the number of vertices per MPI process?
> 
> You can now call DMPlexRebalanceSharedPoints() to try to get a better balance of vertices.
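> A minimal sketch of how such a call might look, assuming an already distributed DMPlex named dm (entityDepth 0 targets vertices; the two flag values here are only illustrative):
> 
>   PetscErrorCode ierr;
>   PetscBool      success;
> 
>   /* entityDepth = 0 asks for the depth-0 points (vertices) to be rebalanced among the
>      processes that share them; success reports whether a new assignment was applied */
>   ierr = DMPlexRebalanceSharedPoints(dm, 0, PETSC_TRUE, PETSC_TRUE, &success);CHKERRQ(ierr);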
>  
>   Thank you for pointing out this function!  
>  
> 2) As the global domain is cuboidal, is the resulting domain decomposition also cuboidal on every MPI process? If not, is there a way to ensure this? For example, in DMDA, the default domain decomposition for a cuboidal domain is cuboidal.
> 
> It sounds like you do not want something that is actually unstructured. Rather, it seems like you want to
> take a DMDA-type thing and split it into tets. You can get a cuboidal decomposition of a hex mesh easily.
> Call DMPlexCreateBoxMesh() with one cell for every process, distribute, and then uniformly refine (a sketch of
> this recipe follows after the options below). This will not quite work for tets, since the mesh partitioner will
> tend to violate that constraint. You could:
> 
> No, I have an unstructured mesh that increases in resolution away from the center of the cuboid. See Figure 5 in the arXiv paper https://arxiv.org/pdf/1907.02604.pdf for a slice through the midplane of the cuboid. Given this type of mesh, will DMPlex do a cuboidal domain decomposition?
> 
> Sincerely,
> SG
>  
>   a) Prescribe the distribution yourself using the Shell partitioner type
> 
> or
> 
>   b) Write a refiner that turns hexes into tets
> 
> We already have a refiner that turns tets into hexes, but we never wrote the other direction because it was not clear
> that it was useful.
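> 
> For concreteness, a minimal sketch of the DMPlexCreateBoxMesh() recipe mentioned above, assuming the circa-3.12 signature (the one-cell-per-rank layout along x and the flag values are only illustrative):
> 
>   DM             dm, dmDist, dmRef;
>   PetscInt       faces[3];
>   PetscMPIInt    size;
>   PetscErrorCode ierr;
> 
>   ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRQ(ierr);
>   faces[0] = size; faces[1] = 1; faces[2] = 1;        /* one hex cell per MPI rank, laid out along x */
>   /* hex (non-simplex) box mesh; NULL lower/upper/periodicity take the defaults */
>   ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 3, PETSC_FALSE, faces, NULL, NULL, NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);
>   ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);  /* with #cells == #ranks, each rank typically gets one cell */
>   if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
>   ierr = DMSetRefinementUniform(dm, PETSC_TRUE);CHKERRQ(ierr);
>   ierr = DMRefine(dm, PETSC_COMM_WORLD, &dmRef);CHKERRQ(ierr);  /* refine each local cell uniformly, so every rank keeps its cuboid */
>   if (dmRef) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmRef;}
> 
> For option a), the partitioner can be replaced by fetching it with DMPlexGetPartitioner(), setting its type to PETSCPARTITIONERSHELL with PetscPartitionerSetType(), and prescribing the cell-to-rank assignment with PetscPartitionerShellSetPartition().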
> 
>   Thanks,
> 
>      Matt
>  
> Sincerely,
> SG
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/


