[petsc-users] DMRefine fails
Rongliang Chen
rongliang.chan at gmail.com
Fri Feb 26 01:59:38 CST 2016
Dear Lawrence,
Thank you very much for your suggestions.
I said "totally wrong" because I found that the connectivity and number
of points of the refined mesh is wrong. I viewed the refined mesh with
paraview and found that there are some holes in it (see the attached
figures: Fig1.png for the non-overlapping case and Fig2.png for the
overlapping case).
My package is based on petsc-3.5, which does not have
DMPlexDistributeOverlap. Petsc-3.5 only provides DMPlexEnlargePartition,
and that does not seem able to add an overlap to an already distributed
mesh. So I need to update my package to petsc-3.6 before trying your
suggestions, which will take me some time because the package is fairly
large.
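For reference, the sequence my code currently uses looks roughly like
the sketch below (petsc-3.5 API; the partitioner name, variable names
and error handling are only illustrative, and dm is assumed to already
hold the unrefined mesh):

  DM             distributedMesh, refinedMesh;
  PetscErrorCode ierr;

  /* petsc-3.5 API: the partitioner is given by name and overlap=1
     requests a one-cell-deep overlap during distribution */
  ierr = DMPlexDistribute(dm, "parmetis", 1, NULL, &distributedMesh);CHKERRQ(ierr);
  if (distributedMesh) {
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = distributedMesh;
  }

  /* refining the already-overlapped mesh is where the holes appear */
  ierr = DMRefine(dm, PetscObjectComm((PetscObject)dm), &refinedMesh);CHKERRQ(ierr);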
Best regards,
Rongliang
> Date: Thu, 25 Feb 2016 10:47:22 +0000
> From: Lawrence Mitchell <lawrence.mitchell at imperial.ac.uk>
> To: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] DMRefine fails
>
> On 25/02/16 06:19, Rongliang Chen wrote:
>> Dear All,
>>
>> In my application I need an overlapping partition of the mesh, that
>> is, I set the overlap argument of DMPlexDistribute(*dm, partitioner,
>> overlap, NULL, &distributedMesh) to a nonzero value (overlap=1).
>>
>> At the same time, I want to use DMRefine to refine the distributed
>> mesh. I found that the refined mesh obtained by DMRefine is totally
>> wrong.
> What do you mean by "totally wrong"? Is it just that the overlapped
> region on the coarse mesh is also refined (such that rather than being
> one level deep, it is 2**nrefinements deep)?
>
>> But if I set the overlap in DMPlexDistribute to zero, then the
>> refined mesh is correct. Please let me know if you have any ideas on
>> how to use DMRefine for a distributed mesh with a nonzero overlap.
> I do this as follows:
>
> - Create the initial DM.
>
> - Distribute with zero overlap: DMPlexDistribute(..., overlap=0, ...);
>
> - Refine the distributed DM as many times as you want.
>
> - Grow the overlap on the refined DM:
>
> DMPlexDistributeOverlap(refined_dm, 1, &pointsf, &overlapped_dm);
>
> Now the overlapped_dm is refined and has a 1-deep overlap.
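>
> Sketched in code (petsc-3.6 API; dm is assumed to already hold the
> initial mesh, and the refinement count and variable names below are
> just placeholders), the sequence is roughly:
>
>   DM             dmDist, dmRefined, dmOverlap;
>   PetscSF        pointSF;
>   PetscInt       r, nrefine = 2;          /* placeholder refinement count */
>   PetscErrorCode ierr;
>
>   /* 1. distribute with zero overlap */
>   ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
>   if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
>
>   /* 2. regularly refine the non-overlapped distributed mesh */
>   for (r = 0; r < nrefine; ++r) {
>     ierr = DMRefine(dm, PetscObjectComm((PetscObject)dm), &dmRefined);CHKERRQ(ierr);
>     ierr = DMDestroy(&dm);CHKERRQ(ierr);
>     dm   = dmRefined;
>   }
>
>   /* 3. grow a 1-deep overlap on the refined mesh */
>   ierr = DMPlexDistributeOverlap(dm, 1, &pointSF, &dmOverlap);CHKERRQ(ierr);
>   if (dmOverlap) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmOverlap;}
>
> (The guards simply swap in the new DM when one is produced; for
> example, DMPlexDistribute returns NULL on a single process.)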
>
> The only thing that, AFAIK, doesn't work out of the box with this
> scheme is the automatic computation of multigrid restriction and
> interpolation. The reason for this is that the code currently assumes
> that, given a coarse DM point, it can compute the corresponding fine
> DM cells by using:
>
> for r in range(num_sub_cells):
>     fine_cell = coarse_cell*num_sub_cells + r
>
> There is a TODO comment in plex.c (in DMPlexMatSetClosureRefined):
>
>   for (r = 0, q = 0; r < numSubcells; ++r) {
>     /* TODO Map from coarse to fine cells */
>     ierr = DMPlexGetTransitiveClosure(dmf, point*numSubcells + r,
>                                       PETSC_TRUE, &numFPoints, &fpoints);CHKERRQ(ierr);
>
>
> The two-step refine-then-grow-the-overlap method I outlined above
> destroys this mapping, so the approach currently coded in plex.c
> will not work.
>
> This is for regular refinement. For non-nested refinement, I think
> you're good because the computation of the interpolator spins over the
> fine cells and searches for the coarse cells.
>
> Hope that helps!
>
> Lawrence
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Fig1.png
Type: image/png
Size: 86051 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20160226/78df6615/attachment-0002.png>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Fig2.png
Type: image/png
Size: 115932 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20160226/78df6615/attachment-0003.png>