[MOAB-dev] DMMOAB (PETSc 3.5.1)

Vijay S. Mahadevan vijay.m at gmail.com
Tue Aug 19 11:05:28 CDT 2014


mbpart -R 2 file.h5m file2.h5m

Now use file2.h5m in your program.
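
The driver side then looks roughly like this untested sketch (please check
petscdmmoab.h in your 3.5.1 build for the exact DMMoabLoadFromFile signature;
the dimension and file name below are just placeholders):

#include <petscdmmoab.h>

int main(int argc, char **argv)
{
  DM             dm;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* Load the reordered mesh; pass any extra MOAB read options
     through the fourth argument if you need them. */
  ierr = DMMoabLoadFromFile(PETSC_COMM_WORLD, 3 /* dim */, "file2.h5m",
                            "", &dm);CHKERRQ(ierr);
  /* ... create tags/vectors and solve as before ... */
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}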

Vijay
On Aug 19, 2014 11:00 AM, "Gerd Heber" <gheber at hdfgroup.org> wrote:

> 'mbpart -R' doesn't appear to make a difference. Same behavior. G.
>
> -----Original Message-----
> From: Vijay S. Mahadevan [mailto:vijay.m at gmail.com]
> Sent: Tuesday, August 19, 2014 10:48 AM
> To: Gerd Heber
> Cc: MOAB dev
> Subject: Re: [MOAB-dev] DMMOAB (PETSc 3.5.1)
>
> Yes, the entities in your sets with the PARALLEL_PARTITION tag look badly
> segmented. You could try running the original mesh through the mbpart tool
> with the reorder option (-R) to get a new mesh with contiguous entity
> numbering. Let me know whether the behavior changes with this new mesh.
>
> Vijay
>
> On Tue, Aug 19, 2014 at 10:41 AM, Gerd Heber <gheber at hdfgroup.org> wrote:
> > Maybe that's what's going on. Attached is the output from "mbsize -ll"
> > and the range returned by DMMoabGetAllVertices on process 1.
> > There are quite a few gaps. Does that make sense?
> >
> > G.
> >
> > -----Original Message-----
> > From: Vijay S. Mahadevan [mailto:vijay.m at gmail.com]
> > Sent: Tuesday, August 19, 2014 10:27 AM
> > To: Gerd Heber
> > Cc: MOAB dev
> > Subject: Re: [MOAB-dev] DMMOAB (PETSc 3.5.1)
> >
> > You could use the mbsize tool installed at $MOAB_INSTALL/bin/mbsize
> > with the "-ll" option to see all entities:
> >
> > mbsize -ll <filename>
> >
> > You can track down the PARALLEL_PARTITION tag on the entity sets and find
> > out whether the corresponding vertices in each element are numbered
> > contiguously (in terms of GLOBAL_ID). If the numbering is segmented,
> > DMMoab internally falls back to a native PETSc Vec.
> > Sorry about the confusion; I should've documented this better. This
> > inconsistent behavior needs to change, and I'm working on a patch that
> > will possibly perform renumbering on the fly so that contiguous memory
> > access is available for MOAB-based Vecs.
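> >
> > A rough, untested sketch of how one might check that directly on the MOAB
> > instance (mb is the moab::Interface the mesh was loaded into):
> >
> > #include <moab/Core.hpp>
> > #include <algorithm>
> > #include <vector>
> >
> > // Sketch: do the local vertices carry contiguous GLOBAL_IDs?
> > // (Ignores the owned/ghost distinction; error checks omitted.)
> > bool has_contiguous_gids(moab::Interface *mb)
> > {
> >   moab::Tag   gid;
> >   moab::Range verts;
> >   mb->tag_get_handle("GLOBAL_ID", 1, moab::MB_TYPE_INTEGER, gid);
> >   mb->get_entities_by_dimension(0, 0, verts);  // vertices in the root set
> >   if (verts.empty()) return true;
> >   std::vector<int> ids(verts.size());
> >   mb->tag_get_data(gid, verts, &ids[0]);
> >   std::sort(ids.begin(), ids.end());
> >   return ids.back() - ids.front() + 1 == (int)ids.size();
> > }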
> >
> > Vijay
> >
> > On Tue, Aug 19, 2014 at 10:13 AM, Gerd Heber <gheber at hdfgroup.org> wrote:
> >> What's the best way to verify that? G.
> >>
> >> -----Original Message-----
> >> From: Vijay S. Mahadevan [mailto:vijay.m at gmail.com]
> >> Sent: Tuesday, August 19, 2014 9:58 AM
> >> To: Gerd Heber
> >> Cc: MOAB dev
> >> Subject: Re: [MOAB-dev] DMMOAB (PETSc 3.5.1)
> >>
> >>> DMMoabCreateVector(dm, existing_tag, PETSC_NULL, PETSC_TRUE,
> >>> PETSC_FALSE, &X)
> >>
> >> Yes, this should preserve the values in the X vector. There is currently
> >> an implementation quirk: underneath the MOAB-specific Vec, we check
> >> whether the local entities (vertices) are numbered contiguously so that
> >> tag_iterate can be used. If that's not the case, it actually creates a
> >> native PETSc Vec underneath and manages the memory through that. I am
> >> working on a patch to remove this limitation, but I'm not sure whether
> >> this is the issue you're hitting now.
> >>
> >> Can you just verify whether your local vertices in the mesh are
> >> contiguously arranged per processor? I.e.,
> >>
> >> P1: (1-10), P2: (11-20) instead of P1: (1-5, 11-15), P2: (6-10, 16-20)
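> >>
> >> If it's easier to check from the DM side, an untested sketch along these
> >> lines should work (DMMoabGetAllVertices gives the local vertex range; a
> >> single subrange means the handles are contiguous):
> >>
> >> #include <petscdmmoab.h>
> >>
> >> /* Sketch: report whether this rank's local vertex handles form one
> >>    contiguous block (psize() == 1 means a single subrange). */
> >> PetscErrorCode CheckVertexContiguity(DM dm)
> >> {
> >>   moab::Range    verts;
> >>   PetscErrorCode ierr;
> >>
> >>   PetscFunctionBegin;
> >>   ierr = DMMoabGetAllVertices(dm, &verts);CHKERRQ(ierr);
> >>   ierr = PetscPrintf(PETSC_COMM_SELF, "%d subranges -> %s\n",
> >>                      (int)verts.psize(),
> >>                      (verts.psize() == 1) ? "contiguous" : "segmented");CHKERRQ(ierr);
> >>   PetscFunctionReturn(0);
> >> }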
> >>
> >> I will let you know once this patch is ready (along with a PR to track
> >> in PETSc) so that you can try it out.
> >>
> >> Vijay
> >>
> >> On Tue, Aug 19, 2014 at 9:49 AM, Gerd Heber <gheber at hdfgroup.org> wrote:
> >>> Vijay, here's something that I find confusing, or maybe I'm just doing
> >>> something wrong.
> >>>
> >>> I call
> >>>
> >>> DMMoabCreateVector(dm, existing_tag, PETSC_NULL, PETSC_TRUE,
> >>> PETSC_FALSE, &X)
> >>>
> >>> and would expect X to have the values of existing_tag (non-zero). But
> >>> X's values are all zero.
> >>> Is that the expected behavior?
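> >>>
> >>> (In case it matters, I'm sanity-checking the entries with something
> >>> roughly like this:)
> >>>
> >>> PetscReal vmin, vmax;
> >>> ierr = VecMin(X, NULL, &vmin);CHKERRQ(ierr);
> >>> ierr = VecMax(X, NULL, &vmax);CHKERRQ(ierr);
> >>> ierr = PetscPrintf(PETSC_COMM_WORLD, "X in [%g, %g]\n",
> >>>                    (double)vmin, (double)vmax);CHKERRQ(ierr);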
> >>>
> >>> Thanks, G.
> >>>
> >>>
> >>>
> >>>
>