[petsc-users] Domain decomposition using DMPLEX
Matthew Knepley
knepley at gmail.com
Fri Nov 29 08:44:14 CST 2019
On Thu, Nov 28, 2019 at 9:45 PM Swarnava Ghosh <swarnava89 at gmail.com> wrote:
> Hi Barry,
>
> "Why do you need a cuboidal domain decomposition?"
>
> I gave it some thought. I don't always need a cuboidal decomposition, but
> I would need something that essentially minimizes the surface area of the
> faces of each subdomain. Is there a way to get this? Could you please
> direct me to a reference where I can read about the domain decomposition
> strategies used in PETSc DMPlex?
>
This is the point of graph partitioning, which minimizes the "cut", that is,
the number of links between one partition and another. The ParMetis
manual has this kind of information, along with citations.
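
For example, here is a minimal sketch of steering the distribution through
ParMetis; the helper name and the overlap of 0 are illustrative choices, not
something prescribed in this thread:

#include <petscdmplex.h>

/* Distribute an existing serial Plex using ParMetis, which computes a
   k-way partition of the cell graph that minimizes the edge cut, i.e. the
   number of links between one partition and another. */
static PetscErrorCode DistributeWithParMetis(DM *dm)
{
  DM               dmDist = NULL;
  PetscPartitioner part;
  PetscErrorCode   ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexGetPartitioner(*dm, &part);CHKERRQ(ierr);
  ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);CHKERRQ(ierr);
  /* or select it at runtime with -petscpartitioner_type parmetis */
  ierr = DMPlexDistribute(*dm, 0, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) { /* NULL on one process, where there is nothing to move */
    ierr = DMDestroy(dm);CHKERRQ(ierr);
    *dm  = dmDist;
  }
  PetscFunctionReturn(0);
}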
Thanks,
Matt
> Sincerely,
> Swarnava
>
> On Mon, Nov 25, 2019 at 9:02 PM Smith, Barry F. <bsmith at mcs.anl.gov>
> wrote:
>
>>
>> "No, I have an unstructured mesh that increases in resolution away from
>> the center of the cuboid. See Figure: 5 in the ArXiv paper
>> https://arxiv.org/pdf/1907.02604.pdf for a slice through the midplane
>> of the cuboid. Given this type of mesh, will dmplex do a cuboidal domain
>> decomposition?"
>>
>> No, definitely not. Why do you need a cuboidal domain decomposition?
>>
>> Barry
>>
>>
>> > On Nov 25, 2019, at 10:45 PM, Swarnava Ghosh <swarnava89 at gmail.com>
>> wrote:
>> >
>> > Hi Matt,
>> >
>> >
>> > https://arxiv.org/pdf/1907.02604.pdf
>> >
>> > On Mon, Nov 25, 2019 at 7:54 PM Matthew Knepley <knepley at gmail.com>
>> wrote:
>> > On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh <swarnava89 at gmail.com>
>> wrote:
>> > Dear PETSc users and developers,
>> >
>> > I am working with DMPlex to distribute a 3D unstructured mesh made of
>> tetrahedra in a cuboidal domain. I have a few queries:
>> > 1) Is there any way of ensuring load balancing based on the number of
>> vertices per MPI process?
>> >
>> > You can now call DMPlexRebalanceSharedPoints() to try to get a better
>> balance of vertices.
>> >
>> > Thank you for pointing out this function!
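
For reference, a minimal sketch of that call; entityDepth = 0 targets the
vertices, and the flag values and helper name are illustrative assumptions,
not a recommendation from this thread:

#include <petscdmplex.h>

/* Try to even out the number of shared vertices per rank after the mesh
   has been distributed. With parallel = PETSC_TRUE the repartitioning is
   done with ParMETIS; success reports whether a rebalanced partition was
   actually computed. */
static PetscErrorCode RebalanceVertices(DM dm)
{
  PetscBool      success;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexRebalanceSharedPoints(dm, 0, PETSC_FALSE /* useInitialGuess */,
                                     PETSC_TRUE /* parallel */, &success);CHKERRQ(ierr);
  if (!success) {
    ierr = PetscPrintf(PetscObjectComm((PetscObject)dm),
                       "Vertex rebalancing made no change\n");CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}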
>> >
>> > 2) As the global domain is cuboidal, is the resulting domain
>> decomposition also cuboidal on every MPI process? If not, is there a way to
>> ensure this? For example in DMDA, the default domain decomposition for a
>> cuboidal domain is cuboidal.
>> >
>> > It sounds like you do not want something that is actually unstructured.
>> Rather, it seems like you want to
>> > take a DMDA type thing and split it into tets. You can get a cuboidal
>> decomposition of a hex mesh easily.
>> > Call DMPlexCreateBoxMesh() with one cell for every process, distribute,
>> and then uniformly refine. This
>> > will not quite work for tets since the mesh partitioner will tend to
>> violate that constraint. You could:
>> >
>> > No, I have an unstructured mesh that increases in resolution away from
>> the center of the cuboid. See Figure: 5 in the ArXiv paper
>> https://arxiv.org/pdf/1907.02604.pdf for a slice through the midplane
>> of the cuboid. Given this type of mesh, will dmplex do a cuboidal domain
>> decomposition?
>> >
>> > Sincerely,
>> > SG
>> >
>> > a) Prescribe the distribution yourself using the Shell partitioner
>> type
>> >
>> > or
>> >
>> > b) Write a refiner that turns hexes into tets
>> >
>> > We already have a refiner that turns tets into hexes, but we never
>> wrote the other direction because it was not clear
>> > that it was useful.
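
As an illustration, here is a rough sketch of the hex recipe mentioned above
(DMPlexCreateBoxMesh() with one cell per process, distribute, then uniformly
refine). The process-grid arguments px, py, pz, assumed to multiply to the
number of ranks, and the single refinement pass are my own choices, and the
DMPlexCreateBoxMesh() argument list follows the PETSc release current at the
time of this thread:

#include <petscdmplex.h>

/* Build a hex box mesh with one cell per rank, distribute it, and then
   refine uniformly so every rank keeps a cuboidal subdomain. */
static PetscErrorCode CuboidalHexMesh(MPI_Comm comm, PetscInt px, PetscInt py, PetscInt pz, DM *dm)
{
  const PetscInt faces[3] = {px, py, pz};  /* px*py*pz == number of ranks */
  DM             dmDist = NULL, dmRef = NULL;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexCreateBoxMesh(comm, 3, PETSC_FALSE /* hexes */, faces,
                             NULL, NULL, NULL, PETSC_TRUE, dm);CHKERRQ(ierr);
  ierr = DMPlexDistribute(*dm, 0, NULL, &dmDist);CHKERRQ(ierr);  /* one cell per rank */
  if (dmDist) { ierr = DMDestroy(dm);CHKERRQ(ierr); *dm = dmDist; }
  ierr = DMPlexSetRefinementUniform(*dm, PETSC_TRUE);CHKERRQ(ierr);
  ierr = DMRefine(*dm, comm, &dmRef);CHKERRQ(ierr);  /* refined cells stay on their rank */
  if (dmRef) { ierr = DMDestroy(dm);CHKERRQ(ierr); *dm = dmRef; }
  PetscFunctionReturn(0);
}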
>> >
>> > Thanks,
>> >
>> > Matt
>> >
>> > Sincerely,
>> > SG
>> >
>> >
>> > --
>> > What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> > -- Norbert Wiener
>> >
>> > https://www.cse.buffalo.edu/~knepley/
>>
>>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/