[petsc-users] Uninterpolating DMLabels?
Matthew Knepley
knepley at gmail.com
Fri Apr 10 11:38:56 CDT 2015
On Fri, Apr 10, 2015 at 1:01 AM, Justin Chang <jchang27 at uh.edu> wrote:
> Matt,
>
> I am not sure I follow what you're saying here. Currently I am working
> with 3D simplex elements, so if the variable numSubcells is what you're
> referring to in CellRefinerGetAffineTransforms_Internal(), there is no
> example for REFINER_SIMPLEX_3D. What would it be in that case, 7?
>
It's 8. There are 8 fine cells for every coarse cell.
> What I am looking for is a function and/or a set of algorithms where, given
> a sieve point m from the original coarse mesh, I can obtain the
> corresponding set of (fine mesh) sieve points. I am assuming c_f \in [C
> c_c, C (c_c + 1) ) describes just that?
>
Yes, C = 8.
> Is there an existing DMPlex related function that implements this
> directly? Because I don't see how CellRefinerGetAffineTransforms_Internal()
> will help.
>
It would just give you C. Yes, I would just write it; it should be one line.
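
Until then, here is a minimal, untested sketch of how the formula above could
be used to push cell-wise data (e.g. the SPE10 permeability values) from a
coarse DMPlex onto its regularly refined counterpart. The function name and
the coarseVals/fineVals arrays are placeholders, and C = 8 is hardcoded for
regular refinement of 3D simplices.

#include <petscdmplex.h>

/* Sketch: copy one scalar per coarse cell onto the C fine cells it generates
   under regular refinement, using c_f in [C*c_c, C*(c_c + 1)).
   Indices are taken relative to the start of each cell stratum. */
static PetscErrorCode CopyCellDataToRefined(DM coarse, DM fine,
                                            const PetscScalar coarseVals[],
                                            PetscScalar fineVals[])
{
  const PetscInt C = 8; /* fine cells per coarse cell for 3D simplices */
  PetscInt       cStart, cEnd, fStart, fEnd, c, j;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = DMPlexGetHeightStratum(coarse, 0, &cStart, &cEnd);CHKERRQ(ierr);
  ierr = DMPlexGetHeightStratum(fine,   0, &fStart, &fEnd);CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {
    for (j = 0; j < C; ++j) {
      const PetscInt cf = C*(c - cStart) + j; /* fine cell, relative index */

      if (fStart + cf >= fEnd) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_OUTOFRANGE, "Fine cell out of range");
      fineVals[cf] = coarseVals[c - cStart];
    }
  }
  PetscFunctionReturn(0);
}

This assumes the fine DM was produced by regular refinement of the coarse DM
(e.g. with -dm_refine 1), so that the fine cells appear in contiguous blocks
of C per coarse cell.
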
Thanks,
Matt
>
> Thanks,
>
>
> On Thu, Apr 9, 2015 at 8:20 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Thu, Apr 9, 2015 at 8:14 PM, Justin Chang <jchang27 at uh.edu> wrote:
>>
>>> Matt, thank you for the clarification. One more question somewhat on a
>>> related note.
>>>
>>> I have auxiliary data corresponding to the permeability of a reservoir
>>> (for instance, the SPE10 benchmark problem). The data is cell-wise and read
>>> from a binary file. I want to refine both the DM and this data. Is there a
>>> way to obtain the mapping from the coarse mesh to the fine mesh? What I
>>> mean is, given a coarse cell, can I somehow get the transitive closure of
>>> the fine cells it created during the refinement process? This would allow
>>> me to copy the value of the coarse cell data onto its children.
>>>
>>
>> If you are doing regular refinement, there is a simple formula:
>>
>> c_f \in [ C c_c, C (c_c +1) )
>>
>> where C is the number of cells created from each cell. You can get C from
>>
>> CellRefinerGetAffineTransforms_Internal()
>>
>> which I could promote to a real interface function.
>>
>> Thanks,
>>
>> Matt
>>
>>
>>> Thanks,
>>>
>>> On Thu, Apr 9, 2015 at 7:52 PM, Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Thu, Apr 9, 2015 at 7:21 PM, Justin Chang <jchang27 at uh.edu> wrote:
>>>>
>>>>> 1) I did what you had suggested and it works very well thanks.
>>>>>
>>>>
>>>> Cool.
>>>>
>>>>
>>>>> 2) In my case, calling uninterpolate does make a difference, just not
>>>>> necessarily in the solver step (I am using Tao's BLMVM). I attached two
>>>>> output files, one containing the timings and summaries with uninterpolating
>>>>> and one without uninterpolating, along with the source code in case you
>>>>> want to verify the "correctness" of my implementation. The biggest
>>>>> difference I see is in setting up the PetscSection: when I uninterpolate
>>>>> the data, this step takes 8.64039 seconds, but when I do not uninterpolate
>>>>> it takes 61.2977 seconds. Is this "significant" enough, or is it
>>>>> unimportant given the fact that this step occurs only once during any
>>>>> simulation?
>>>>>
>>>>
>>>> That is significant. Most of that time is in the operator preallocation
>>>> routine. It was cleaner to first gather the point
>>>> adjacency information, and then push the dof information on top. If I
>>>> aggressively merged the dof data, I could have
>>>> pruned a bunch of the work here and gotten much closer to the first
>>>> time, at the expense of more complicated code.
>>>> I guess this is the right pattern if you know that you will only have
>>>> unknowns on cells and vertices.
>>>>
>>>>
>>>>> 3) Also, about DMPlexDistribute(...). Right now it still seems
>>>>> extremely slow, which is why I am distributing a coarse mesh and having each
>>>>> processor refine its local portion of the mesh. In my current
>>>>> implementation, I have rank 0 distribute the mesh via ParMETIS. Will there
>>>>> eventually be a way to do parallel I/O and then redistribution for load
>>>>> balancing, or does it already exist?
>>>>>
>>>>
>>>> Redistribution for load balancing already exists. The stumbling block
>>>> right now is parallel mesh import. However,
>>>> the group at ICL (Michael Lange and Gerard Gorman) is working on this,
>>>> and we expect to have it working by
>>>> the end of summer. Until then, I recommend exactly what you are doing.
>>>>
>>>> Thanks,
>>>>
>>>> Matt
>>>>
>>>>
>>>>> Thanks,
>>>>>
>>>>> On Tue, Apr 7, 2015 at 8:37 PM, Matthew Knepley <knepley at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> On Tue, Apr 7, 2015 at 7:18 PM, Justin Chang <jchang27 at uh.edu> wrote:
>>>>>>
>>>>>>> Hi all,
>>>>>>>
>>>>>>> So I am interpolating/uninterpolating my DMPlex mesh. This is the
>>>>>>> mesh information for a cell-vertex mesh only:
>>>>>>>
>>>>>>> DM Object: 1 MPI processes
>>>>>>> type: plex
>>>>>>> DM_0x84000000_0 in 3 dimensions:
>>>>>>> 0-cells: 159
>>>>>>> 3-cells: 592
>>>>>>> Labels:
>>>>>>> marker: 1 strata of sizes (100)
>>>>>>> depth: 2 strata of sizes (159, 592)
>>>>>>>
>>>>>>> I am interpolating the mesh with the following lines:
>>>>>>>
>>>>>>> ierr = DMPlexInterpolate(*dm, &idm);CHKERRQ(ierr);
>>>>>>> ierr = DMPlexCopyCoordinates(*dm, idm);CHKERRQ(ierr);
>>>>>>> ierr = DMPlexCopyLabels(*dm, idm);CHKERRQ(ierr);
>>>>>>> ierr = DMPlexGetLabel(idm, "marker", &label);CHKERRQ(ierr);
>>>>>>> ierr = DMPlexMarkBoundaryFaces(idm,label);CHKERRQ(ierr);
>>>>>>> ierr = DMDestroy(dm);CHKERRQ(ierr);
>>>>>>> *dm = idm;
>>>>>>>
>>>>>>> And the resulting mesh:
>>>>>>>
>>>>>>> DM Object: 1 MPI processes
>>>>>>> type: plex
>>>>>>> DM_0x84000000_1 in 3 dimensions:
>>>>>>> 0-cells: 159
>>>>>>> 1-cells: 845
>>>>>>> 2-cells: 1280
>>>>>>> 3-cells: 592
>>>>>>> Labels:
>>>>>>> marker: 1 strata of sizes (292)
>>>>>>> depth: 4 strata of sizes (159, 845, 1280, 592)
>>>>>>>
>>>>>>> The reason I did this is so that, in case I decide to refine the mesh
>>>>>>> with "-dm_refine <number>", the DM will be sure to include the "marker"
>>>>>>> points in the refinement process. Now, if I decide to uninterpolate the
>>>>>>> mesh with the following lines:
>>>>>>>
>>>>>>> ierr = DMPlexUninterpolate(*dm, &idm);CHKERRQ(ierr);
>>>>>>> ierr = DMPlexCopyCoordinates(*dm, idm);CHKERRQ(ierr);
>>>>>>> ierr = DMPlexCopyLabels(*dm, idm);CHKERRQ(ierr);
>>>>>>> ierr = DMDestroy(dm);CHKERRQ(ierr);
>>>>>>> *dm = idm;
>>>>>>>
>>>>>>> I end up with this mesh:
>>>>>>>
>>>>>>> DM Object: 1 MPI processes
>>>>>>> type: plex
>>>>>>> DM_0x84000000_2 in 3 dimensions:
>>>>>>> 0-cells: 159
>>>>>>> 3-cells: 592
>>>>>>> Labels:
>>>>>>> marker: 1 strata of sizes (292)
>>>>>>> depth: 2 strata of sizes (159, 592)
>>>>>>>
>>>>>>> The problem is that although the cells and vertices have gone back
>>>>>>> to the original number, the "marker" label still includes face/edge points.
>>>>>>> This gives me an error once I invoke DMCreateGlobalVector(...).
>>>>>>>
>>>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>>>> --------------------------------------------------------------
>>>>>>> [0]PETSC ERROR: Argument out of range
>>>>>>> [0]PETSC ERROR: Section point 755 should be in [0, 751)
>>>>>>> [0]PETSC ERROR: See
>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>>>>>>> shooting.
>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision:
>>>>>>> v3.5.3-2528-gbee642f GIT Date: 2015-03-29 20:36:38 -0500
>>>>>>> [0]PETSC ERROR: ./cube_with_hole on a arch-linux2-c-debug named
>>>>>>> pacotaco by justin Tue Apr 7 19:09:12 2015
>>>>>>> [0]PETSC ERROR: Configure options --download-chaco
>>>>>>> --download-exodusii --download-fblaslapack --download-hdf5 --download-metis
>>>>>>> --download-mpich --download-mumps --download-netcdf --download-parmetis
>>>>>>> --download-scalapack --download-triangle --with-cc=gcc --with-cmake=cmake
>>>>>>> --with-cxx=g++ --with-debugging=1 --with-fc=gfortran --with-valgrind=1
>>>>>>> PETSC_ARCH=arch-linux2-c-debug
>>>>>>> [0]PETSC ERROR: #1 PetscSectionGetDof() line 499 in
>>>>>>> /home/justin/petsc-dev/src/vec/is/utils/vsectionis.c
>>>>>>> [0]PETSC ERROR: #2 DMPlexGetConeSize() line 841 in
>>>>>>> /home/justin/petsc-dev/src/dm/impls/plex/plex.c
>>>>>>> [0]PETSC ERROR: #3 DMPlexGetTransitiveClosure() line 1342 in
>>>>>>> /home/justin/petsc-dev/src/dm/impls/plex/plex.c
>>>>>>> [0]PETSC ERROR: #4 DMPlexLabelComplete() line 88 in
>>>>>>> /home/justin/petsc-dev/src/dm/impls/plex/plexsubmesh.c
>>>>>>> [0]PETSC ERROR: #5 DMCreateDefaultSection_Plex() line 5605 in
>>>>>>> /home/justin/petsc-dev/src/dm/impls/plex/plex.c
>>>>>>> [0]PETSC ERROR: #6 DMGetDefaultSection() line 3035 in
>>>>>>> /home/justin/petsc-dev/src/dm/interface/dm.c
>>>>>>> [0]PETSC ERROR: #7 DMGetDefaultGlobalSection() line 3266 in
>>>>>>> /home/justin/petsc-dev/src/dm/interface/dm.c
>>>>>>> [0]PETSC ERROR: #8 DMCreateGlobalVector_Section_Private() line 13 in
>>>>>>> /home/justin/petsc-dev/src/dm/interface/dmi.c
>>>>>>> [0]PETSC ERROR: #9 DMCreateGlobalVector_Plex() line 1170 in
>>>>>>> /home/justin/petsc-dev/src/dm/impls/plex/plexcreate.c
>>>>>>> [0]PETSC ERROR: #10 DMCreateGlobalVector() line 698 in
>>>>>>> /home/justin/petsc-dev/src/dm/interface/dm.c
>>>>>>> [0]PETSC ERROR: #11 main() line 363 in
>>>>>>> /home/justin/Dropbox/Research_Topics/Code_PETSc/Nonneg/cube_with_hole.c
>>>>>>>
>>>>>>>
>>>>>>> From this error, it seems it is trying to access sieve points that
>>>>>>> no longer exist. I think it has to do with the "marker" label still
>>>>>>> containing data from the interpolated mesh. Is there a way to "uninterpolate"
>>>>>>> or remove such points? Or is there a better way of approaching this whole
>>>>>>> thing?
>>>>>>>
>>>>>>
>>>>>> 1) It would be easy to write a small function to throw out the points
>>>>>> in the label that do not exist in the DM. You
>>>>>> would just extract each stratumIS and then Clear each invalid
>>>>>> point.
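>>>>>>
>>>>>> A rough, untested sketch of such a function (the name PruneLabelToChart
>>>>>> is just a placeholder): it removes every labeled point that falls outside
>>>>>> the DM's chart, which assumes cells and vertices keep their numbering
>>>>>> across uninterpolation, so that only the stale face/edge points (like 755
>>>>>> in your error, with a chart of [0, 751)) are dropped.
>>>>>>
>>>>>> #include <petscdmplex.h>
>>>>>>
>>>>>> static PetscErrorCode PruneLabelToChart(DM dm, const char name[])
>>>>>> {
>>>>>>   DMLabel        label;
>>>>>>   IS             valueIS;
>>>>>>   const PetscInt *values;
>>>>>>   PetscInt       pStart, pEnd, numValues, v;
>>>>>>   PetscErrorCode ierr;
>>>>>>
>>>>>>   PetscFunctionBegin;
>>>>>>   ierr = DMPlexGetChart(dm, &pStart, &pEnd);CHKERRQ(ierr);
>>>>>>   ierr = DMPlexGetLabel(dm, name, &label);CHKERRQ(ierr);
>>>>>>   ierr = DMLabelGetValueIS(label, &valueIS);CHKERRQ(ierr);
>>>>>>   ierr = ISGetLocalSize(valueIS, &numValues);CHKERRQ(ierr);
>>>>>>   ierr = ISGetIndices(valueIS, &values);CHKERRQ(ierr);
>>>>>>   for (v = 0; v < numValues; ++v) {
>>>>>>     IS             stratumIS;
>>>>>>     const PetscInt *points;
>>>>>>     PetscInt       *copy, n, p;
>>>>>>
>>>>>>     /* Copy the stratum points before clearing, since clearing can
>>>>>>        modify the arrays backing the IS */
>>>>>>     ierr = DMLabelGetStratumIS(label, values[v], &stratumIS);CHKERRQ(ierr);
>>>>>>     ierr = ISGetLocalSize(stratumIS, &n);CHKERRQ(ierr);
>>>>>>     ierr = ISGetIndices(stratumIS, &points);CHKERRQ(ierr);
>>>>>>     ierr = PetscMalloc1(n, &copy);CHKERRQ(ierr);
>>>>>>     ierr = PetscMemcpy(copy, points, n*sizeof(PetscInt));CHKERRQ(ierr);
>>>>>>     ierr = ISRestoreIndices(stratumIS, &points);CHKERRQ(ierr);
>>>>>>     ierr = ISDestroy(&stratumIS);CHKERRQ(ierr);
>>>>>>     for (p = 0; p < n; ++p) {
>>>>>>       if (copy[p] < pStart || copy[p] >= pEnd) {
>>>>>>         ierr = DMLabelClearValue(label, copy[p], values[v]);CHKERRQ(ierr);
>>>>>>       }
>>>>>>     }
>>>>>>     ierr = PetscFree(copy);CHKERRQ(ierr);
>>>>>>   }
>>>>>>   ierr = ISRestoreIndices(valueIS, &values);CHKERRQ(ierr);
>>>>>>   ierr = ISDestroy(&valueIS);CHKERRQ(ierr);
>>>>>>   PetscFunctionReturn(0);
>>>>>> }
>>>>>>
>>>>>> You would call it on the uninterpolated DM, e.g.
>>>>>> PruneLabelToChart(*dm, "marker"), right after the uninterpolation block.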
>>>>>>
>>>>>> 2) However, I do not understand the rationale for the above. You
>>>>>> could just call MarkBoundaryFaces on the final mesh.
>>>>>> If you are really trying to preserve a label across regular
>>>>>> refinement AND you really do not want an interpolated mesh,
>>>>>> then code up 1). I have yet to see a case where the extra memory
>>>>>> for edges and faces makes a difference, but it may exist.
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Matt
>>>>>>
>>>>>>
>>>>>>> Thanks,
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Justin Chang
>>>>>>> PhD Candidate, Civil Engineering - Computational Sciences
>>>>>>> University of Houston, Department of Civil and Environmental
>>>>>>> Engineering
>>>>>>> Houston, TX 77004
>>>>>>> (512) 963-3262
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> What most experimenters take for granted before they begin their
>>>>>> experiments is infinitely more interesting than any results to which their
>>>>>> experiments lead.
>>>>>> -- Norbert Wiener
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Justin Chang
>>>>> PhD Candidate, Civil Engineering - Computational Sciences
>>>>> University of Houston, Department of Civil and Environmental
>>>>> Engineering
>>>>> Houston, TX 77004
>>>>> (512) 963-3262
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>
>>>
>>>
>>> --
>>> Justin Chang
>>> PhD Candidate, Civil Engineering - Computational Sciences
>>> University of Houston, Department of Civil and Environmental Engineering
>>> Houston, TX 77004
>>> (512) 963-3262
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>
>
> --
> Justin Chang
> PhD Candidate, Civil Engineering - Computational Sciences
> University of Houston, Department of Civil and Environmental Engineering
> Houston, TX 77004
> (512) 963-3262
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener