[petsc-dev] More on FV overlap cells and Fortran

John O'Sullivan jp.osullivan at auckland.ac.nz
Fri Mar 13 14:09:46 CDT 2015


Hi Matt,

I think we'd prefer a Fortran wrapper for DMPlexGetGlobalOffset_Internal(). We've got our heads around offsets now and are happy working with them.

In a related query: when we're setting up a new vector with a different number of dofs, we follow the approach you use in plexgeometry.c (see the example code snippet below). But I don't understand why, when we run this in parallel, we DON'T need to check whether a cell is off-process before calling PetscSectionSetDof. In fact, when I did check (using the ghost-label approach) it gave the error "Global dof 1 for point 1 is not the unconstrained 0" (see the full error log below).

Thanks!

    call DMPlexGetHeightStratum(dm, 0, start_cell, end_cell, ierr); CHKERRQ(ierr)
    call DMPlexGetHybridBounds(dm, end_interior_cell, &
         PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, ierr); CHKERRQ(ierr)

    call DMClone(dm, dmRocks, ierr); CHKERRQ(ierr)
    call PetscSectionCreate(PETSC_COMM_WORLD, rocks_section, ierr); CHKERRQ(ierr)
    call PetscSectionSetChart(rocks_section, start_cell, end_interior_cell, ierr); CHKERRQ(ierr)
    do c = start_cell, end_interior_cell-1
      ! Want a new vector with 4 dofs per cell
      ! (doesn't work if we exclude cells that are off-process)
      call PetscSectionSetDof(rocks_section, c, 4, ierr); CHKERRQ(ierr)
    end do

    call PetscSectionSetUp(rocks_section, ierr); CHKERRQ(ierr)
    call DMSetDefaultSection(dmRocks, rocks_section, ierr); CHKERRQ(ierr)
    call DMCreateGlobalVector(dmRocks, rockproperties, ierr); CHKERRQ(ierr)
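
For completeness, here's a rough sketch (declarations omitted, as above) of how I imagine we could check ownership afterwards by querying the global section rather than the local one. This assumes the Fortran binding for DMGetDefaultGlobalSection is available; as I understand it, points owned by another process get a negative offset there (encoded as -(offset+1)):

    call DMGetDefaultGlobalSection(dmRocks, global_section, ierr); CHKERRQ(ierr)
    do c = start_cell, end_interior_cell-1
      ! A negative offset in the global section marks a cell owned by another
      ! process, so we would skip writing its 4 dofs
      call PetscSectionGetOffset(global_section, c, cell_offset, ierr); CHKERRQ(ierr)
      if (cell_offset >= 0) then
        ! ... safe to write the rock properties for cell c here ...
      end if
    end do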

Here is the full error log when we exclude the off-process cells from the PetscSectionSetDof loop:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Global dof 1 for point 1 is not the unconstrained 0
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Invalid argument
[1]PETSC ERROR: Global dof 1 for point 2 is not the unconstrained 0
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2966-g3546a49  GIT Date: 2015-03-12 10:43:22 -0500
[1]PETSC ERROR: darcy on a arch-darwin-c-debug named Johns-MacBook-Air.local by jposunz Sat Mar 14 08:07:25 2015
[1]PETSC ERROR: Configure options --download-fblaslapack --useThreads=0 --download-triangle --download-netcdf --download-exodusii --download-hdf5 --download-ptscotch --download-chaco
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2966-g3546a49  GIT Date: 2015-03-12 10:43:22 -0500
[0]PETSC ERROR: [1]PETSC ERROR: #1 PetscSectionCreateGlobalSection() line 1010 in /Users/jposunz/projects/src/petsc/src/vec/is/utils/vsectionis.c
[1]PETSC ERROR: #2 DMGetDefaultGlobalSection() line 3269 in /Users/jposunz/projects/src/petsc/src/dm/interface/dm.c
[1]PETSC ERROR: #3 DMCreateGlobalVector_Section_Private() line 13 in /Users/jposunz/projects/src/petsc/src/dm/interface/dmi.c
[1]PETSC ERROR: #4 DMCreateGlobalVector_Plex() line 1170 in /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexcreate.c
[1]PETSC ERROR: #5 DMCreateGlobalVector() line 698 in /Users/jposunz/projects/src/petsc/src/dm/interface/dm.c
darcy on a arch-darwin-c-debug named Johns-MacBook-Air.local by jposunz Sat Mar 14 08:07:25 2015
[0]PETSC ERROR: Configure options --download-fblaslapack --useThreads=0 --download-triangle --download-netcdf --download-exodusii --download-hdf5 --download-ptscotch --download-chaco
[0]PETSC ERROR: #1 PetscSectionCreateGlobalSection() line 1010 in /Users/jposunz/projects/src/petsc/src/vec/is/utils/vsectionis.c
[0]PETSC ERROR: #2 DMGetDefaultGlobalSection() line 3269 in /Users/jposunz/projects/src/petsc/src/dm/interface/dm.c
[0]PETSC ERROR: #3 DMCreateGlobalVector_Section_Private() line 13 in /Users/jposunz/projects/src/petsc/src/dm/interface/dmi.c
[0]PETSC ERROR: #4 DMCreateGlobalVector_Plex() line 1170 in /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexcreate.c
[0]PETSC ERROR: #5 DMCreateGlobalVector() line 698 in /Users/jposunz/projects/src/petsc/src/dm/interface/dm.c
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 62.


--
Dr John O'Sullivan
Lecturer
Department of Engineering Science
University of Auckland, New Zealand
email: jp.osullivan at auckland.ac.nz
tel: +64 (0)9 923 85353

________________________________
From: Matthew Knepley [knepley at gmail.com]
Sent: Friday, 13 March 2015 11:57 p.m.
To: John O'Sullivan
Cc: petsc-dev at mcs.anl.gov
Subject: Re: More on FV overlap cells and Fortran

On Fri, Mar 13, 2015 at 4:56 AM, John O'Sullivan <jp.osullivan at auckland.ac.nz> wrote:
Hi Matt,

We're getting there with our parallel Fortran FV code. But here's another question about the best way to do things. Another simpleTest2 code is attached to demonstrate the question, and a copy of the relevant part is below:

! We worked out it's important not to write to the parts of arrays that correspond to overlap cells.
! Following ex11 it would be done something like this using C:
!
! ierr = DMPlexPointGlobalRef(dm,c,x,&xc);CHKERRQ(ierr);
! if (xc) {ierr = (*mod->solution)(mod,0.0,cg->centroid,xc,mod->solutionctx);CHKERRQ(ierr);}
!
! And according to the documentation on DMPlexPointGlobalRef xc would be returned as null for
! the overlap blocks that don't belong to the processor
!
! We can achieve the same thing using the ghost labels on the overlap cells, but is this a good
! idea? We'd prefer to follow the standard approach, i.e. the one in ex11, but I don't think there's a
! clean way to do this in Fortran, as we can't pass pointers to structures...
! Could it be done with DMPlexPointGlobalRef? If so, any ideas?

I can do two things:

  a) Make a Fortran version of DMPlexPointGlobalRef(), which returns an F90 array of size dof

or

  b) Make a Fortran wrapper for DMPlexGetGlobalOffset_Internal(), which returns start/end, and
     start < 0 if it's off-process

Tell me what you think is better. Also, you can see here that we are experimenting with different
interfaces. Jed likes getting a pointer back. I like getting offsets back.
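
To make (b) concrete, the Fortran side might end up looking something like the sketch below. The wrapper name and signature are hypothetical, since nothing is written yet:

      ! Hypothetical Fortran wrapper around DMPlexGetGlobalOffset_Internal();
      ! the real name and signature would be decided when the binding is written
      call DMPlexGetGlobalOffset(dm, c, gstart, gend, ierr); CHKERRQ(ierr)
      if (gstart >= 0) then
        ! point c is owned by this process; write its dofs here
      else
        ! point c is in the overlap (owned by another process); skip it
      end if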

  Thanks,

     Matt

  do c = start_cell, end_interior_cell-1
! ???  call DMPlexPointGlobalRef(dm, c, localx, someref)???
    call DMLabelGetValue(label, c, ghost, ierr); CHKERRQ(ierr)
    if (ghost < 0) then
      ! Cell is local: convert the 0-based section offset to a 1-based Fortran index
      call PetscSectionGetOffset(solution_section, c, cell_offset, ierr); CHKERRQ(ierr)
      cell_offset = cell_offset + 1
      localx(cell_offset) = 100.0
    end if
  end do

Cheers!
John

--
Dr John O'Sullivan
Lecturer
Department of Engineering Science
University of Auckland, New Zealand
email: jp.osullivan at auckland.ac.nz
tel: +64 (0)9 923 85353

________________________________

From: Matthew Knepley [knepley at gmail.com]
Sent: Friday, 13 March 2015 4:44 a.m.
To: John O'Sullivan
Cc: petsc-dev at mcs.anl.gov
Subject: Re: Simple Test example not working in parallel for DMPlex/FV/overlap/ghost cells...

On Thu, Mar 12, 2015 at 4:19 AM, John O'Sullivan <jp.osullivan at auckland.ac.nz> wrote:
Hi Matt,

Attached is a simple test program where I'm loading the simpleblock-100.exo and trying to set up a grid with closed boundaries on all the outer faces.

Just so I understand what you want: you do not want any ghost cells made around the outer boundary. What you have will do that.

We'd like to do it this way so we can use the same sort of architecture as in PetscFVM to skip ghost faces when calculating fluxes (as you suggested in your response yesterday).

Note that now, it will also mark the outer boundary faces as ghosts because they have no partner cell.

It works fine on a single processor, but when we run it on 2 processors there's an indexing error when trying to create a matrix. There's obviously something wrong with the way we're setting up the dm, but I don't know what it is. Or perhaps we're not allowed to use DMPlexConstructGhostCells in this way? Also, I was surprised by the values given for end_interior_cell on each processor, as it seems to include the overlap cells, which is not what I expected.

This is a bug in my latest preallocation push. Thanks for finding it. I pushed a fix for this, and for many Fortran things,
to next. I am including my modified source so that everything is deallocated correctly (run with -malloc_test).

It seems to work for me now. Let me know if you have problems.

  Thanks,

    Matt

One last thing: the Fortran interface for DMPlexCreateLabel was missing, so I added it to zplexlabel.c. It works fine, but I'm not sure if this was the right way to do it. My version is attached.

The full output from the program for two processors is below.

Cheers
John

(P.S. Is there a way to make DMView output the Labels information for all processors?)

$ mpirun -n 2 simpleTest2
DM Object:Parallel Mesh 2 MPI processes
  type: plex
Parallel Mesh in 3 dimensions:
  0-cells: 12 16
  1-cells: 20 28
  2-cells: 11 16
  3-cells: 2 3
Labels:
  ClosedBoundaries: 0 strata of sizes ()
  Face Sets: 3 strata of sizes (2, 2, 5)
  Cell Sets: 1 strata of sizes (2)
  depth: 4 strata of sizes (12, 20, 11, 2)
DM Object: 2 MPI processes
  type: plex
DM_0x7f91c1d27ff0_0 in 3 dimensions:
  0-cells: 12 16
  1-cells: 20 28
  2-cells: 11 16
  3-cells: 2 (0) 3 (0)
Labels:
  ghost: 2 strata of sizes (1, 10)
  vtk: 1 strata of sizes (1)
  Cell Sets: 1 strata of sizes (2)
  Face Sets: 3 strata of sizes (2, 2, 5)
  ClosedBoundaries: 0 strata of sizes ()
           0           3           3          19          35
  depth: 4 strata of sizes (12, 20, 11, 2)
           0           2           2          14          25
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: Section point -1 should be in [1, 3)
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Section point -3 should be in [0, 1)
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2945-g6900bc9  GIT Date: 2015-03-10 18:15:26 -0500
[0]PETSC ERROR: simpleTest2 on a arch-darwin-c-debug named Johns-MacBook-Air.local by jposunz Thu Mar 12 22:01:26 2015
[0]PETSC ERROR: Configure options --download-fblaslapack --useThreads=0 --download-triangle --download-netcdf --download-exodusii --download-hdf5 --download-ptscotch --download-chaco
[0]PETSC ERROR: #1 PetscSectionGetDof() line 499 in /Users/jposunz/projects/src/petsc/src/vec/is/utils/vsectionis.c
[0]PETSC ERROR: #2 DMPlexUpdateAllocation_Static() line 609 in /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexpreallocate.c
[0]PETSC ERROR: #3 DMPlexPreallocateOperator() line 763 in /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexpreallocate.c
[0]PETSC ERROR: #4 DMCreateMatrix_Plex() line 746 in /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plex.c
[0]PETSC ERROR: [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2945-g6900bc9  GIT Date: 2015-03-10 18:15:26 -0500
[1]PETSC ERROR: simpleTest2 on a arch-darwin-c-debug named Johns-MacBook-Air.local by jposunz Thu Mar 12 22:01:26 2015
#5 DMCreateMatrix() line 958 in /Users/jposunz/projects/src/petsc/src/dm/interface/dm.c
[1]PETSC ERROR: Configure options --download-fblaslapack --useThreads=0 --download-triangle --download-netcdf --download-exodusii --download-hdf5 --download-ptscotch --download-chaco
[1]PETSC ERROR: #1 PetscSectionGetDof() line 499 in /Users/jposunz/projects/src/petsc/src/vec/is/utils/vsectionis.c
[1]PETSC ERROR: #2 DMPlexUpdateAllocation_Static() line 609 in /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexpreallocate.c
[1]PETSC ERROR: #3 DMPlexPreallocateOperator() line 763 in /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plexpreallocate.c
[1]PETSC ERROR: #4 DMCreateMatrix_Plex() line 746 in /Users/jposunz/projects/src/petsc/src/dm/impls/plex/plex.c
[1]PETSC ERROR: #5 DMCreateMatrix() line 958 in /Users/jposunz/projects/src/petsc/src/dm/interface/dm.c
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 63.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------


--
Dr John O'Sullivan

Lecturer

Department of Engineering Science

University of Auckland, New Zealand

email: jp.osullivan at auckland.ac.nz

tel: +64 (0)9 923 85353

________________________________
From: Matthew Knepley [knepley at gmail.com]
Sent: Thursday, 12 March 2015 3:06 a.m.
To: John O'Sullivan
Cc: petsc-dev at mcs.anl.gov
Subject: Re: [petsc-dev] DMPlex, Finite Volume, overlap and ghost cells...
On Wed, Mar 11, 2015 at 3:34 AM, John O'Sullivan <jp.osullivan at auckland.ac.nz> wrote:
Hi all,

I've managed to get myself very confused about how to use DMPlex correctly for a distributed Finite Volume grid...

My understanding was that along partition boundaries the ghost cells are used to store cell information from neighbouring partitions so that the fluxes can be calculated.

Though, debugging through ex11, it seems like the overlap is set equal to 1?

I'm solving a simple pressure-diffusion equation on a 1D column (from an exo grid) which works fine on a single processor but not in parallel. I'm certainly not setting things up right or labeling correctly...

Could someone please explain the most appropriate way to set up and label the DM, whether the overlap should be 0 or 1, and whether ghost cells should be placed on internal partition boundaries?

Yes, for FV the partition overlap should be 1, as it is in ex11. This means that when the partition happens, we will
put a layer of cells on the other side of partition boundaries.

When DMPlexConstructGhostCells() is called, it will put ghost cells on the other side of the true boundary. Both
kinds of cells are separated from interior cells by the bound returned from DMPlexGetHybridBounds(&cMax), where
cMax is the first ghost cell.
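
In Fortran that separation amounts to a loop bound like the sketch below (variable names are just placeholders):

    call DMPlexGetHeightStratum(dm, 0, cStart, cEnd, ierr); CHKERRQ(ierr)
    ! After DMPlexConstructGhostCells(), cMax is the first ghost cell
    call DMPlexGetHybridBounds(dm, cMax, PETSC_NULL_INTEGER, &
         PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, ierr); CHKERRQ(ierr)
    do c = cStart, cMax-1
      ! interior cells only; cells at or above cMax are the ghost cells
    end do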

Now you still need a way to skip faces between ghost cells (these only occur across a partition boundary).
To do this, you use the "ghost" label we make during partitioning:

    ierr = DMPlexGetLabel(dm, "ghost", &ghostLabel);CHKERRQ(ierr);
    ierr = DMLabelGetValue(ghostLabel, face, &ghost);CHKERRQ(ierr);
    if (ghost >= 0) continue;
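
In Fortran the same check is just the loop below (a sketch; fStart/fEnd are the height-1 bounds from DMPlexGetHeightStratum, and I am assuming the DMPlexGetLabel binding is available to you):

    call DMPlexGetLabel(dm, "ghost", ghostLabel, ierr); CHKERRQ(ierr)
    call DMPlexGetHeightStratum(dm, 1, fStart, fEnd, ierr); CHKERRQ(ierr)
    do face = fStart, fEnd-1
      ! Skip any face carrying the "ghost" label; no flux is computed across it
      call DMLabelGetValue(ghostLabel, face, ghost, ierr); CHKERRQ(ierr)
      if (ghost >= 0) cycle
      ! ... compute the flux across this face ...
    end do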

What exactly is going wrong?

   Matt

Thanks!
John


--
Dr John O'Sullivan

Lecturer

Department of Engineering Science

University of Auckland, New Zealand

email: jp.osullivan at auckland.ac.nz

tel: +64 (0)9 923 85353



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener


