[petsc-dev] DMPlex, Finite Volume, overlap and ghost cells...

John O'Sullivan jp.osullivan at auckland.ac.nz
Wed Mar 11 16:11:09 CDT 2015


Hi Matt,

Good morning from NZ! Thanks for the speedy response! That helps a lot.

I think the problem is the way I’m labelling the grid before setting up the ghost cells. I think…

When you say:
“When DMPlexConstructGhostCells() is called, it will put ghost cells on the other side of the true boundary”

Does that mean that overlap cells are labelled as ghost cells? Aren’t they on the other side of the true boundary? Or by true boundary do you mean the outer edge of the overlap cells?

I’m testing the code using two different 3-cell exo files. One is simpleblock-100.exo from the shared directory and the other is an exo file made using a Python script and ParaView. simpleblock-100.exo has labels already assigned whereas the other has none.

Should we be expecting the exo files to be labelled appropriately outside of PETSc, or should we check them and label them within the code? The two grids are being labelled very differently at the moment and I’m not sure which is right or (most likely) whether both are wrong…

Thanks again
John


--

Dr John O'Sullivan
Lecturer
Department of Engineering Science
University of Auckland, New Zealand
email: jp.osullivan at auckland.ac.nz
tel: +64 (0)9 923 85353




From: Matthew Knepley [mailto:knepley at gmail.com]
Sent: Thursday, 12 March 2015 3:06 a.m.
To: John O'Sullivan
Cc: petsc-dev at mcs.anl.gov
Subject: Re: [petsc-dev] DMPlex, Finite Volume, overlap and ghost cells...

On Wed, Mar 11, 2015 at 3:34 AM, John O'Sullivan <jp.osullivan at auckland.ac.nz> wrote:
Hi all,

I've managed to get myself very confused about how to use DMPlex correctly for a distributed Finite Volume grid...

My understanding was that along partition boundaries the ghost cells are used to store cell information from neighbouring partitions so that the fluxes can be calculated.

Though, debugging through ex11, it seems that the overlap is set equal to 1?

I'm solving a simple pressure-diffusion equation on a 1D column (from an exo grid) which works fine on a single processor but not in parallel. I'm certainly not setting things up right or labeling correctly...

Could someone please explain the most appropriate way to set up and label the DM, whether the overlap should be 0 or 1 and whether ghost cells should be placed on internal partition boundaries.

Yes, for FV the partition overlap should be 1, as it is in ex11. This means that when the partition happens, we will
put a layer of cells on the other side of partition boundaries.
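As a sketch, the distribution step looks like this (argument lists have varied across PETSc versions, so treat the exact signature as an assumption):

```c
/* Sketch: distribute the mesh with a 1-cell partition overlap for FV.
   Assumes the DMPlexDistribute(dm, overlap, sf, dmParallel) form. */
DM dmDist = NULL;

ierr = DMPlexDistribute(dm, 1, NULL, &dmDist);CHKERRQ(ierr);
if (dmDist) { /* NULL on a single process: nothing to distribute */
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  dm   = dmDist;
}
```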

When DMPlexConstructGhostCells() is called, it will put ghost cells on the other side of the true boundary. Both
kinds of cells are separated from interior cells by the cMax value returned by DMPlexGetHybridBounds(), where
cMax is the first ghost cell.
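In code, those two steps look roughly like this (a sketch; passing NULL for the boundary label name makes DMPlexConstructGhostCells() use the default "Face Sets" label):

```c
/* Sketch: add FV ghost cells outside the true boundary, then query cMax. */
DM       gdm = NULL;
PetscInt cMax;

ierr = DMPlexConstructGhostCells(dm, NULL, NULL, &gdm);CHKERRQ(ierr);
ierr = DMDestroy(&dm);CHKERRQ(ierr);
dm   = gdm;
ierr = DMPlexGetHybridBounds(dm, &cMax, NULL, NULL, NULL);CHKERRQ(ierr);
/* cells numbered >= cMax are ghost cells, not interior cells */
```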

Now you still need a way to get rid of faces between ghost cells (which only occur between cells across a partition boundary).
To do this, you use the "ghost" label we make during partitioning:

    ierr = DMPlexGetLabel(dm, "ghost", &ghostLabel);CHKERRQ(ierr);
    ierr = DMLabelGetValue(ghostLabel, face, &ghost);CHKERRQ(ierr);
    if (ghost >= 0) continue;

What exactly is going wrong?

   Matt

Thanks!
John


--
Dr John O'Sullivan
Lecturer
Department of Engineering Science
University of Auckland, New Zealand
email: jp.osullivan at auckland.ac.nz
tel: +64 (0)9 923 85353



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener