[petsc-dev] Simple Test example not working in parallel for DMPlex/FV/overlap/ghost cells...
Matthew Knepley
knepley at gmail.com
Thu Mar 12 10:43:41 CDT 2015
On Thu, Mar 12, 2015 at 4:19 AM, John O'Sullivan <
jp.osullivan at auckland.ac.nz> wrote:
> Hi Matt,
>
> Attached is a simple test program where I'm loading the
> simpleblock-100.exo and trying to set up a grid with closed boundaries on
> all the outer faces.
>
Just so I understand what you want: you do not want any ghost cells made
around the outer boundary. What you have will do that.
> We'd like to do it this way so we can use the same sort of architecture as
> in PetscFVM to skip ghost faces when calculating fluxes (like you suggested
> in your response yesterday)
>
Note that now, it will also mark the outer boundary faces as ghosts because
they have no partner cell.
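If you want to check what actually ends up in that label on each process,
a quick sketch like the one below should do it (my assumptions: the ghost
cells have already been constructed so the "ghost" label exists, dm and ierr
are in scope, and the synchronized printing is just for illustration):

  DMLabel     ghostLabel;
  PetscInt    fStart, fEnd, f, numGhostFaces = 0;
  PetscMPIInt rank;

  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  ierr = DMPlexGetLabel(dm, "ghost", &ghostLabel);CHKERRQ(ierr);
  ierr = DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd);CHKERRQ(ierr); /* faces */
  for (f = fStart; f < fEnd; ++f) {
    PetscInt ghost;

    ierr = DMLabelGetValue(ghostLabel, f, &ghost);CHKERRQ(ierr);
    if (ghost >= 0) ++numGhostFaces;                      /* marked as ghost */
  }
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] ghost-labeled faces: %D of %D\n", rank, numGhostFaces, fEnd - fStart);CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
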
> It works fine on a single processor, but when we run it on 2 processors
> there's an indexing error when trying to create a matrix. There's obviously
> something wrong with the way we're setting up the DM, but I don't know what
> it is. Or perhaps we're not allowed to use DMPlexConstructGhostCells in
> this way? Also, I was surprised by the values given for end_interior_cell on
> each processor, as they seem to include the overlap cells, which is not what
> I expected.
>
This is a bug in my latest preallocation push. Thanks for finding it. I
pushed a fix for this, along with many Fortran fixes, to next. I am including
my modified source so that everything is deallocated correctly (run with
-malloc_test). It seems to work for me now. Let me know if you have problems.
Thanks,
Matt
> One last thing: the Fortran interface for DMPlexCreateLabel was missing, so
> I added it to zplexlabel.c. It works fine, but I'm not sure if this was the
> right way to do it. My version is attached.
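>
> A stub in the style of the other string-handling wrappers in zplexlabel.c
> might look like the sketch below (this is just the pattern I followed, not
> necessarily exactly what is in the attached file; the FIXCHAR/FREECHAR idiom
> is my assumption about the right way to pass the Fortran string through):
>
>   #include <petsc-private/fortranimpl.h>
>   #include <petscdmplex.h>
>
>   #if defined(PETSC_HAVE_FORTRAN_CAPS)
>   #define dmplexcreatelabel_ DMPLEXCREATELABEL
>   #elif !defined(PETSC_HAVE_FORTRAN_UNDERSCORE)
>   #define dmplexcreatelabel_ dmplexcreatelabel
>   #endif
>
>   PETSC_EXTERN void PETSC_STDCALL dmplexcreatelabel_(DM *dm, CHAR name PETSC_MIXED_LEN(lenN), int *ierr PETSC_END_LEN(lenN))
>   {
>     char *labelname;
>
>     FIXCHAR(name, lenN, labelname);            /* convert the Fortran string to a C string */
>     *ierr = DMPlexCreateLabel(*dm, labelname);
>     FREECHAR(name, labelname);                 /* free the temporary C string */
>   }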
>
> The full output from the program for two processors is below.
>
> Cheers
> John
>
> (ps Is there a way to make DMView output the Labels information for all
> processors?)
>
> $ mpirun -n 2 simpleTest2
> DM Object:Parallel Mesh 2 MPI processes
> type: plex
> Parallel Mesh in 3 dimensions:
> 0-cells: 12 16
> 1-cells: 20 28
> 2-cells: 11 16
> 3-cells: 2 3
> Labels:
> ClosedBoundaries: 0 strata of sizes ()
> Face Sets: 3 strata of sizes (2, 2, 5)
> Cell Sets: 1 strata of sizes (2)
> depth: 4 strata of sizes (12, 20, 11, 2)
> DM Object: 2 MPI processes
> type: plex
> DM_0x7f91c1d27ff0_0 in 3 dimensions:
> 0-cells: 12 16
> 1-cells: 20 28
> 2-cells: 11 16
> 3-cells: 2 (0) 3 (0)
> Labels:
> ghost: 2 strata of sizes (1, 10)
> vtk: 1 strata of sizes (1)
> Cell Sets: 1 strata of sizes (2)
> Face Sets: 3 strata of sizes (2, 2, 5)
> ClosedBoundaries: 0 strata of sizes ()
> 0 3 3 19 35
> depth: 4 strata of sizes (12, 20, 11, 2)
> 0 2 2 14 25
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Argument out of range
> [1]PETSC ERROR: Section point -1 should be in [1, 3)
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: Section point -3 should be in [0, 1)
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2945-g6900bc9 GIT
> Date: 2015-03-10 18:15:26 -0500
> [0]PETSC ERROR: simpleTest2 on a arch-darwin-c-debug named
> Johns-MacBook-Air.local by jposunz Thu Mar 12 22:01:26 2015
> [0]PETSC ERROR: Configure options --download-fblaslapack --useThreads=0
> --download-triangle --download-netcdf --download-exodusii --download-hdf5
> --download-ptscotch --download-chaco
> [0]PETSC ERROR: #1 PetscSectionGetDof() line 499 in
> /Users/jposunz/projects/src/petsc/src/vec/is/utils/vsectionis.c
> [0]PETSC ERROR: #2 DMPlexUpdateAllocation_Static() line 609 in
> /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexpreallocate.c
> [0]PETSC ERROR: #3 DMPlexPreallocateOperator() line 763 in
> /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexpreallocate.c
> [0]PETSC ERROR: #4 DMCreateMatrix_Plex() line 746 in
> /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plex.c
> [0]PETSC ERROR: [1]PETSC ERROR: See
> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [1]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2945-g6900bc9 GIT
> Date: 2015-03-10 18:15:26 -0500
> [1]PETSC ERROR: simpleTest2 on a arch-darwin-c-debug named
> Johns-MacBook-Air.local by jposunz Thu Mar 12 22:01:26 2015
> #5 DMCreateMatrix() line 958 in
> /Users/jposunz/projects/src/petsc/src/dm/interface/dm.c
> [1]PETSC ERROR: Configure options --download-fblaslapack --useThreads=0
> --download-triangle --download-netcdf --download-exodusii --download-hdf5
> --download-ptscotch --download-chaco
> [1]PETSC ERROR: #1 PetscSectionGetDof() line 499 in
> /Users/jposunz/projects/src/petsc/src/vec/is/utils/vsectionis.c
> [1]PETSC ERROR: #2 DMPlexUpdateAllocation_Static() line 609 in
> /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexpreallocate.c
> [1]PETSC ERROR: #3 DMPlexPreallocateOperator() line 763 in
> /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexpreallocate.c
> [1]PETSC ERROR: #4 DMCreateMatrix_Plex() line 746 in
> /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plex.c
> [1]PETSC ERROR: #5 DMCreateMatrix() line 958 in
> /Users/jposunz/projects/src/petsc/src/dm/interface/dm.c
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
> with errorcode 63.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
>
> --
> Dr John O'Sullivan
> Lecturer
> Department of Engineering Science
> University of Auckland, New Zealand
> email: jp.osullivan at auckland.ac.nz
> tel: +64 (0)9 923 85353
>
> ------------------------------
> *From:* Matthew Knepley [knepley at gmail.com]
> *Sent:* Thursday, 12 March 2015 3:06 a.m.
> *To:* John O'Sullivan
> *Cc:* petsc-dev at mcs.anl.gov
> *Subject:* Re: [petsc-dev] DMPlex, Finite Volume, overlap and ghost
> cells...
>
> On Wed, Mar 11, 2015 at 3:34 AM, John O'Sullivan <
> jp.osullivan at auckland.ac.nz> wrote:
>
>> Hi all,
>>
>> I've managed to get myself very confused about how to use DMPlex
>> correctly for a distributed Finite Volume grid...
>>
>> My understanding was that along partition boundaries the ghost cells are
>> used to store cell information from neighbouring partitions so that the
>> fluxes can be calculated.
>>
>> From debugging through ex11, it seems that the overlap is set equal to 1?
>>
>> I'm solving a simple pressure-diffusion equation on a 1D column (from an
>> exo grid), which works fine on a single processor but not in parallel. I'm
>> certainly not setting things up right or labeling correctly...
>>
>> Could someone please explain the most appropriate way to set up and label
>> the DM, whether the overlap should be 0 or 1, and whether ghost cells should
>> be placed on internal partition boundaries?
>>
>
> Yes, for FV the partition overlap should be 1, as it is in ex11. This
> means that when the partitioning happens, we will put a layer of cells on
> the other side of each partition boundary.
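>
> A minimal sketch of that step (assuming the 3.5-style DMPlexDistribute()
> signature that takes a partitioner name, which newer versions drop, and
> using "chaco" since that is one of the partitioners you configured with):
>
>   DM dmDist = NULL;
>
>   /* overlap = 1: each rank also receives the layer of cells just across
>      its partition boundary */
>   ierr = DMPlexDistribute(dm, "chaco", 1, NULL, &dmDist);CHKERRQ(ierr);
>   if (dmDist) {                       /* NULL when running on one process */
>     ierr = DMDestroy(&dm);CHKERRQ(ierr);
>     dm   = dmDist;
>   }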
>
> When DMPlexConstructGhostCells() is called, it will put ghost cells on
> the other side of the true boundary. Both kinds of cells are separated
> from interior cells by the hybrid bound cMax returned by
> DMPlexGetHybridBounds(), where cMax is the first ghost cell.
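>
> In code, those two pieces fit together roughly like this (a sketch; passing
> NULL as the label name makes DMPlexConstructGhostCells() use the "Face Sets"
> label for the true boundary, so pass your own label name if you marked the
> outer faces yourself):
>
>   DM       dmGhost = NULL;
>   PetscInt numGhost, cStart, cEnd, cMax, c;
>
>   ierr = DMPlexConstructGhostCells(dm, NULL, &numGhost, &dmGhost);CHKERRQ(ierr);
>   ierr = DMDestroy(&dm);CHKERRQ(ierr);
>   dm   = dmGhost;
>   ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr);      /* all cells */
>   ierr = DMPlexGetHybridBounds(dm, &cMax, NULL, NULL, NULL);CHKERRQ(ierr); /* first ghost cell */
>   for (c = cStart; c < cMax; ++c) {
>     /* regular cells: note this range still includes the partition-overlap
>        cells, which is why end_interior_cell looked larger than you expected;
>        [cMax, cEnd) are the ghost cells */
>   }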
>
> Now you still need a way to get rid of the faces between ghost cells
> (which only occur across a partition boundary). To do this, you use the
> "ghost" label we make during partitioning:
>
> ierr = DMPlexGetLabel(dm, "ghost", &ghostLabel);CHKERRQ(ierr);
> ierr = DMLabelGetValue(ghostLabel, face, &ghost);CHKERRQ(ierr);
> if (ghost >= 0) continue;
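>
> A slightly fuller sketch of the flux loop built around that check (assuming
> the ghost cells have already been constructed and dm and ierr are in scope):
>
>   DMLabel  ghostLabel;
>   PetscInt fStart, fEnd, f;
>
>   ierr = DMPlexGetLabel(dm, "ghost", &ghostLabel);CHKERRQ(ierr);
>   ierr = DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd);CHKERRQ(ierr); /* faces */
>   for (f = fStart; f < fEnd; ++f) {
>     PetscInt        ghost;
>     const PetscInt *cells;
>
>     ierr = DMLabelGetValue(ghostLabel, f, &ghost);CHKERRQ(ierr);
>     if (ghost >= 0) continue;  /* face between ghost cells across a partition boundary */
>     ierr = DMPlexGetSupport(dm, f, &cells);CHKERRQ(ierr);
>     /* cells[0] and cells[1] are the two cells sharing face f; compute the
>        flux between them here */
>   }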
>
> What exactly is going wrong?
>
> Matt
>
>
>> Thanks!
>> John
>>
>> --
>> Dr John O'Sullivan
>> Lecturer
>> Department of Engineering Science
>> University of Auckland, New Zealand
>> email: jp.osullivan at auckland.ac.nz
>> tel: +64 (0)9 923 85353
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener