[petsc-dev] Issues with DMComplexGetTransitiveClosure in Fortran

Matthew Knepley knepley at gmail.com
Wed Aug 29 21:51:13 CDT 2012


On Wed, Aug 29, 2012 at 1:35 PM, Chris Eldred <chris.eldred at gmail.com> wrote:

> DMComplexGetTransitiveClosure (and Restore) work fine now. What are
> the new restrictions on numbering mentioned in
> src/dm/impls/complex/examples/tests/ex2f90.F?
>

In the original Sieve, you could use any id for any point. This is very
flexible, but hugely expensive in terms of memory, and somewhat expensive
in terms of time.

In DMComplex, each stratum in the DAG (it is technically a ranked poset)
must be numbered contiguously. This means that vertices, faces, etc. are
all numbered contiguously. It turns out that, in practice, this is not an
onerous requirement, and it allows very effective optimizations.
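
For example (a minimal sketch, not taken from this thread, and assuming the
auto-generated Fortran bindings for DMComplexGetHeightStratum and
DMComplexGetDepthStratum mirror their C counterparts with a trailing ierr),
contiguous numbering means each stratum comes back as a single
[start, end) range of point ids:

PetscInt cStart, cEnd, vStart, vEnd
PetscErrorCode ierr
! Cells sit at height 0 of the DAG, vertices at depth 0; each query
! returns one contiguous [start, end) range of point ids.
call DMComplexGetHeightStratum(dm, 0, cStart, cEnd, ierr)
CHKERRQ(ierr)
call DMComplexGetDepthStratum(dm, 0, vStart, vEnd, ierr)
CHKERRQ(ierr)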

In addition, we have the convention that we number cells, then vertices,
then faces, then edges. Nothing in the interface demands it, but all of the
PETSc code for FEM obeys this, since it simplifies debugging.
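
Concretely (a hypothetical renumbering, sketched here for illustration and
not taken from the sample below), the doublet mesh discussed in this thread
would look like this under that convention:

! Conventional ordering for the doublet mesh: cells first, then
! vertices, then edges (nothing in the interface requires this):
!   cells:    0, 1
!   vertices: 2, 3, 4, 5
!   edges:    6, 7, 8, 9, 10
call DMComplexSetChart(dm, 0, 11, ierr)
CHKERRQ(ierr)
call DMComplexSetConeSize(dm, 0, 3, ierr)
CHKERRQ(ierr)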

    Matt


> On Tue, Aug 28, 2012 at 8:16 PM, Matthew Knepley <knepley at gmail.com>
> wrote:
> > On Tue, Aug 28, 2012 at 5:01 PM, Chris Eldred <chris.eldred at gmail.com>
> > wrote:
> >>
> >> On Tue, Aug 28, 2012 at 3:49 PM, Matthew Knepley <knepley at gmail.com>
> >> wrote:
> >> > On Tue, Aug 28, 2012 at 4:39 PM, Chris Eldred <chris.eldred at gmail.com>
> >> > wrote:
> >> >>
> >> >> DMComplexGetTransitiveClosure does not appear to have the calling
> >> >> sequence (DM dm, PetscInt p, PetscBool useCone, PetscInt *numPoints,
> >> >> PetscInt *points[]) indicated in the documentation - instead it has
> >> >> (DM dm, PetscInt p, PetscBool useCone, PetscInt *points[]). In addition,
> >> >> it does not appear to be returning the full closure - instead it just
> >> >> gives the cone plus the point itself.
> >> >
> >> >
> >> > 1) I will turn this code into a running example
> >> >
> >>
> >> Awesome! I have been using the Doublet mesh to test DMComplex routines
> >> and my own functions, so that is great.
> >
> >
> > Okay, there was a bug in the array handling. The test is
> >
> >   src/dm/impls/complex/examples/tests/ex2f90.F
> >
> >   Thanks,
> >
> >      Matt
> >
> >>
> >> > 2) I will fix the documentation. The interface change for Fortran comes
> >> >     because the size is embedded in the array pointer. If you think I
> >> >     should still return the size in Fortran, let me know.
> >>
> >> I can get the size from the array pointer, so just returning that works.
> >>
> >> >
> >> > 3) I will also check to make sure the documentation is clear that this
> >> >     thing returns each point AND its orientation.
> >> >
> >> > Will write again when I have your thing running. It could be an error
> >> > with the Fortran array processing I put in.
> >> >
> >> >   Thanks,
> >> >
> >> >      Matt
> >> >
> >> >>
> >> >> Sample code is below:
> >> >>
> >> >> program main
> >> >> implicit none
> >> >> !
> >> >> #include <finclude/petsc.h90>
> >> >> #include <finclude/petscdmcomplex.h>
> >> >> #include <finclude/petscdmcomplex.h90>
> >> >>
> >> >> !
> >> >> DM dm
> >> >> PetscInt, target, dimension(3) :: EC
> >> >> PetscInt, target, dimension(2) :: VE
> >> >> PetscInt, pointer :: pEC(:), pVE(:)
> >> >> PetscInt, pointer :: nClosure(:)
> >> >> PetscInt, pointer :: pClosure(:), pCone(:), pCone2(:)
> >> >> PetscErrorCode ierr
> >> >>
> >> >> call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
> >> >> CHKERRQ(ierr)
> >> >> call DMComplexCreate(PETSC_COMM_WORLD, dm, ierr)
> >> >> CHKERRQ(ierr)
> >> >>
> >> >> !Make Doublet Mesh from Fig 2 of "Flexible Representation of
> >> >> !Computational Meshes" (except indexing is from 0 instead of 1)
> >> >> call DMComplexSetChart(dm, 0, 11, ierr)
> >> >> CHKERRQ(ierr)
> >> >>
> >> >> call DMComplexSetConeSize(dm, 9, 3,ierr)
> >> >> CHKERRQ(ierr)
> >> >> call DMComplexSetConeSize(dm, 10, 3,ierr)
> >> >> CHKERRQ(ierr)
> >> >> call DMComplexSetConeSize(dm, 4, 2,ierr)
> >> >> CHKERRQ(ierr)
> >> >> call DMComplexSetConeSize(dm, 5, 2,ierr)
> >> >> CHKERRQ(ierr)
> >> >> call DMComplexSetConeSize(dm, 6, 2,ierr)
> >> >> CHKERRQ(ierr)
> >> >> call DMComplexSetConeSize(dm, 7, 2,ierr)
> >> >> CHKERRQ(ierr)
> >> >> call DMComplexSetConeSize(dm, 8, 2,ierr)
> >> >> CHKERRQ(ierr)
> >> >>
> >> >> call DMSetUp(dm, ierr)
> >> >> CHKERRQ(ierr)
> >> >>
> >> >> EC(1) = 4
> >> >> EC(2) = 5
> >> >> EC(3) = 6
> >> >> pEC => EC
> >> >> call DMComplexSetCone(dm, 9 , pEC, ierr)
> >> >> CHKERRQ(ierr)
> >> >> EC(1) = 6
> >> >> EC(2) = 7
> >> >> EC(3) = 8
> >> >> pEC => EC
> >> >> call DMComplexSetCone(dm, 10 , pEC, ierr)
> >> >> CHKERRQ(ierr)
> >> >>
> >> >> VE(1) = 0
> >> >> VE(2) = 1
> >> >> pVE => VE
> >> >> call DMComplexSetCone(dm, 4 , pVE, ierr)
> >> >> CHKERRQ(ierr)
> >> >> VE(1) = 0
> >> >> VE(2) = 2
> >> >> pVE => VE
> >> >> call DMComplexSetCone(dm, 5 , pVE, ierr)
> >> >> CHKERRQ(ierr)
> >> >> VE(1) = 1
> >> >> VE(2) = 2
> >> >> pVE => VE
> >> >> call DMComplexSetCone(dm, 6 , pVE, ierr)
> >> >> CHKERRQ(ierr)
> >> >> VE(1) = 2
> >> >> VE(2) = 3
> >> >> pVE => VE
> >> >> call DMComplexSetCone(dm, 7 , pVE, ierr)
> >> >> CHKERRQ(ierr)
> >> >> VE(1) = 1
> >> >> VE(2) = 3
> >> >> pVE => VE
> >> >> call DMComplexSetCone(dm, 8 , pVE, ierr)
> >> >> CHKERRQ(ierr)
> >> >>
> >> >> call DMComplexSymmetrize(dm,ierr)
> >> >> CHKERRQ(ierr)
> >> >> call DMComplexStratify(dm,ierr)
> >> >> CHKERRQ(ierr)
> >> >>
> >> >> !Test Stuff
> >> >>
> >> >> call DMComplexGetTransitiveClosure(dm,10,PETSC_TRUE,nClosure,ierr)
> >> >> !pClosure,ierr)
> >> >> CHKERRQ(ierr)
> >> >> write(*,*) nClosure
> >> >> write(*,*) pClosure
> >> >>
> >> >> call DMDestroy(dm,ierr)
> >> >> CHKERRQ(ierr)
> >> >> call PetscFinalize(ierr)
> >> >> CHKERRQ(ierr)
> >> >> end
> >> >>
> >> >>
> >> >> The output is:
> >> >>           10           0           6           0           7           0           8
> >> >>
> >> >> and then there is an error in DMDestroy
> >> >>
> >> >> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> >> >> [0]PETSC ERROR: Object is in wrong state!
> >> >> [0]PETSC ERROR: Work array still checked out!
> >> >> [0]PETSC ERROR: ------------------------------------------------------------------------
> >> >> [0]PETSC ERROR: Petsc Development HG revision:
> >> >> b6fe82991deee4a0d3f9a20654bc7750a6b1fe0f  HG Date: Mon Aug 27 13:17:10 2012 -0500
> >> >> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> >> >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> >> >> [0]PETSC ERROR: See docs/index.html for manual pages.
> >> >> [0]PETSC ERROR: ------------------------------------------------------------------------
> >> >> [0]PETSC ERROR: ./test on a arch-linu named Puget-101334 by user Tue
> >> >> Aug 28 15:31:06 2012
> >> >> [0]PETSC ERROR: Libraries linked from
> >> >> /home/user/Desktop/LIBRARIES/petsc-dev/arch-linux2-cxx-debug/lib
> >> >> [0]PETSC ERROR: Configure run at Mon Aug 27 12:55:14 2012
> >> >> [0]PETSC ERROR: Configure options --download-boost --download-chaco
> >> >> --download-ctetgen --download-f-blas-lapack --download-fiat
> >> >> --download-generator --download-metis --download-ml --download-mpich
> >> >> --download-parmetis --download-scientificpython --download-triangle
> >> >> --with-clanguage=cxx --with-dynamic-loading --with-shared-libraries
> >> >> --with-sieve PETSC_ARCH=arch-linux2-cxx-debug
> >> >> [0]PETSC ERROR: ------------------------------------------------------------------------
> >> >> [0]PETSC ERROR: DMDestroy() line 234 in
> >> >> /home/user/Desktop/LIBRARIES/petsc-dev/src/dm/interface/dm.c
> >> >> application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0
> >> >> [unset]: aborting job:
> >> >> application called MPI_Abort(MPI_COMM_WORLD, 73) - process 0
> >> >>
> >> >> Ideas?
> >> >>
> >> >> --
> >> >> Chris Eldred
> >> >> DOE Computational Science Graduate Fellow
> >> >> Graduate Student, Atmospheric Science, Colorado State University
> >> >> B.S. Applied Computational Physics, Carnegie Mellon University, 2009
> >> >> chris.eldred at gmail.com
> >> >
> >> >
> >> >
> >> >
> >> > --
> >> > What most experimenters take for granted before they begin their
> >> > experiments
> >> > is infinitely more interesting than any results to which their
> >> > experiments
> >> > lead.
> >> > -- Norbert Wiener
> >>
> >>
> >>
> >> --
> >> Chris Eldred
> >> DOE Computational Science Graduate Fellow
> >> Graduate Student, Atmospheric Science, Colorado State University
> >> B.S. Applied Computational Physics, Carnegie Mellon University, 2009
> >> chris.eldred at gmail.com
> >
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their experiments
> > is infinitely more interesting than any results to which their experiments lead.
> > -- Norbert Wiener
>
>
>
> --
> Chris Eldred
> DOE Computational Science Graduate Fellow
> Graduate Student, Atmospheric Science, Colorado State University
> B.S. Applied Computational Physics, Carnegie Mellon University, 2009
> chris.eldred at gmail.com
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener