[petsc-dev] calling DMPlexPointLocalRef() from fortran
Matthew Knepley
knepley at gmail.com
Mon Apr 7 13:52:43 CDT 2014
On Mon, Apr 7, 2014 at 12:05 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Sun, Apr 6, 2014 at 9:43 PM, Adrian Croucher <a.croucher at auckland.ac.nz> wrote:
>
>>
>> On 05/04/14 02:31, Matthew Knepley wrote:
>>
>>
>> Instead of this, use
>>
>> call DMPlexGetDefaultSection(cell_dm, section, ierr);  (move this outside of the loop)
>> call PetscSectionGetOffset(section, c, off, ierr);
>> part(off) = rank
>>
>>
>> OK thanks. The code compiles now.
>>
>> However, at runtime I'm getting another error when it calls
>> DMSetPointSF():
>>
>> [0]PETSC ERROR: #1 DMSetPointSF() line 3204 in
>> /home/acro018/software/PETSc/code/src/dm/interface/dm.c
>> [1]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [1]PETSC ERROR: Invalid argument
>> [1]PETSC ERROR: Wrong type of object: Parameter # 1
>> [1]PETSC ERROR: See
>> http://http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>> shooting.
>> [1]PETSC ERROR: Petsc Development GIT revision: v3.4.4-5306-g9849faf GIT
>> Date: 2014-04-06 15:35:41 -0500
>>
>> This seems to work fine in C, so I'm wondering if I need to do something
>> different in Fortran, or if the interface isn't quite working right. My
>> test code is below. The offending call to DMSetPointSF() is in the routine
>> create_partition_vector(). Any tips much appreciated!
>>
>
> 1) You do not need those calls since DMClone() now does this automatically.
>
> 2) However, it looks like DMClone() is not functioning correctly for you
> since cell_dm should be alright.
>
> I will try and run your code today.
>
I got your code running. There were a few code problems (fixed in my
attached version), but the problem you are seeing above is a bug in our
system for generating Fortran wrappers. The fix will be pushed out soon,
but in the meantime I am attaching dmf.c, which you can put in
src/dm/interface/ftn-auto and rebuild, and things should work for you.
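
By the way, since DMClone() now copies over the point SF and the coordinate
section/vector, the manual plumbing at the top of create_partition_vector()
can be dropped once you pick up the fix; the start of the routine reduces to
something like this (untested sketch, reusing your variable names):

    ! DMClone() already carries over the point SF and the coordinates,
    ! so the DMGet/SetPointSF and coordinate-copy calls are unnecessary
    call DMClone(dm, cell_dm, ierr); CHKERRQ(ierr)
    call PetscObjectGetComm(dm, comm, ierr); CHKERRQ(ierr)
    call PetscSectionCreate(comm, cell_section, ierr); CHKERRQ(ierr)
    call DMPlexGetHeightStratum(cell_dm, 0, cStart, cEnd, ierr); CHKERRQ(ierr)
    call PetscSectionSetChart(cell_section, cStart, cEnd, ierr); CHKERRQ(ierr)
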
Thanks,
Matt
>> It also occurred to me there might just be something wrong with my mesh,
>> so I also tested it using a mesh created with DMPlexCreateBoxMesh(), but
>> this gave the same error (and also indicated that the Fortran interface to
>> DMPlexSetRefinementLimit() is probably also missing in action!).
>>
>
> Fixed. Will push it today.
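> Once that is in, the Fortran call should just be the C signature with ierr
> appended, e.g. (untested sketch; refinement_limit and refined_dm are
> placeholders you would declare yourself):
>
>   call DMPlexSetRefinementLimit(dm, refinement_limit, ierr); CHKERRQ(ierr)
>   call DMRefine(dm, PETSC_COMM_WORLD, refined_dm, ierr); CHKERRQ(ierr)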
>
> Thanks,
>
> Matt
>
>
>> - Adrian
>>
>> --
>>
>> program testplex
>>
>> ! Testing PETSc DMPlex from Fortran, using a line of blocks in 3-D.
>> ! Create the DMPlex by passing the cell definitions to DMPlexCreateFromDAG(),
>> ! interpolate the result to form the faces and edges, distribute over the
>> ! processors, and visualize the partitions in VTK.
>>
>> implicit none
>>
>> #include <finclude/petsc.h90>
>>
>> PetscInt, parameter :: num_cells = 20
>> PetscReal, parameter :: cell_size = 10.
>> PetscErrorCode :: ierr
>> DM :: dm, dist_dm, cell_dm
>> character(5) :: partitioner = "chaco"
>> PetscInt :: overlap = 0
>> Vec :: partition
>>
>> call PetscInitialize(PETSC_NULL_CHARACTER, ierr); CHKERRQ(ierr)
>>
>> call create_1d_grid(num_cells, cell_size, dm, ierr); CHKERRQ(ierr)
>>
>> call DMPlexDistribute(dm, partitioner, overlap, PETSC_NULL_OBJECT, &
>>      dist_dm, ierr); CHKERRQ(ierr)
>> call DMDestroy(dm, ierr); CHKERRQ(ierr)
>> dm = dist_dm
>>
>> call DMView(dm, PETSC_VIEWER_STDOUT_WORLD, ierr); CHKERRQ(ierr)
>>
>> call create_partition_vector(dm, cell_dm, partition, ierr); CHKERRQ(ierr)
>>
>> ! Visualize partition vector here...
>>
>> call DMDestroy(dm, ierr); CHKERRQ(ierr)
>> call PetscFinalize(ierr); CHKERRQ(ierr)
>>
>> end program testplex
>>
>> !------------------------------------------------------------------------
>>
>> subroutine create_1d_grid(num_cells, cell_size, dm, ierr)
>>
>> ! Creates a grid consisting of a 1D line of cube cells in 3D.
>>
>> implicit none
>>
>> #include <finclude/petsc.h90>
>>
>> PetscInt, intent(in) :: num_cells
>> PetscReal, intent(in):: cell_size
>> DM, intent(out) :: dm
>> PetscErrorCode, intent(out) :: ierr
>> ! Locals:
>> PetscInt, parameter :: dim = 3
>> PetscMPIInt :: rank
>> PetscInt :: num_vertices_per_face, num_vertices_per_cell
>> PetscInt :: num_vertices, total_num_points
>> PetscInt :: i, i0, i1, l
>> PetscInt :: depth
>> real(8) :: x0, dx
>> PetscInt, allocatable :: num_points(:)
>> PetscInt, allocatable :: cone_size(:)
>> PetscInt, allocatable :: cones(:)
>> PetscInt, allocatable :: cone_orientations(:)
>> PetscScalar, allocatable :: vertex_coords(:)
>> DM :: interp_dm
>>
>> ierr = 0
>> call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr); CHKERRQ(ierr)
>>
>> call DMPlexCreate(PETSC_COMM_WORLD, dm, ierr); CHKERRQ(ierr)
>> call PetscObjectSetName(dm, "testplex", ierr); CHKERRQ(ierr)
>> call DMPlexSetDimension(dm, dim, ierr); CHKERRQ(ierr)
>>
>> if (rank == 0) then
>>
>> num_vertices_per_face = 2 ** (dim-1)
>> num_vertices_per_cell = 2 * num_vertices_per_face
>> num_vertices = num_vertices_per_face * (num_cells + 1)
>> l = num_vertices_per_face * dim
>> total_num_points = num_vertices + num_cells ! # points in DAG
>>
>> allocate(vertex_coords(0:num_vertices*dim-1), &
>>          cones(0:num_cells*num_vertices_per_cell-1))
>>
>> ! Set up vertex coordinates
>> dx = real(cell_size)
>>
>> do i = 0, num_cells
>> i0 = i * l
>> i1 = (i+1) * l - 1
>> x0 = real(i * cell_size)
>> vertex_coords(i0:i1) = [x0, 0.d0, 0.d0, &
>>                         x0,   dx, 0.d0, &
>>                         x0, 0.d0,   dx, &
>>                         x0,   dx,   dx]
>> end do
>>
>> ! Set up cells
>> do i = 0, num_cells-1
>> i0 = i * num_vertices_per_cell
>> i1 = (i+1) * num_vertices_per_cell - 1
>> cones(i0:i1) = [0, 1, 5, 4, 2, 6, 7, 3] + i * num_vertices_per_face
>> end do
>> cones = cones + num_cells
>>
>> num_points = [num_vertices, num_cells]
>> allocate(cone_size(0: total_num_points-1))
>> allocate(cone_orientations(0:num_cells*num_vertices_per_cell-1))
>> cone_size = 0
>> cone_size(0: num_cells-1) = num_vertices_per_cell
>> cone_orientations = 0
>> depth = 1
>>
>> call DMPlexCreateFromDAG(dm, depth, num_points, cone_size, cones, &
>> cone_orientations, vertex_coords, ierr); CHKERRQ(ierr)
>>
>> else ! rank > 0: create empty DMPlex
>>
>> num_points = [0, 0, 0]; cone_size = [0]
>> cones = [0]; cone_orientations = [0]
>> vertex_coords = [0]
>> call DMPlexCreateFromDAG(dm, 1, num_points, cone_size, &
>> cones, cone_orientations, vertex_coords, ierr); CHKERRQ(ierr)
>>
>> end if
>>
>> interp_dm = PETSC_NULL_OBJECT
>> call DMPlexInterpolate(dm, interp_dm, ierr); CHKERRQ(ierr)
>> call DMPlexCopyCoordinates(dm, interp_dm, ierr); CHKERRQ(ierr)
>> call DMDestroy(dm, ierr); CHKERRQ(ierr)
>> dm = interp_dm
>>
>> deallocate(num_points, cone_size, cones, cone_orientations, vertex_coords)
>>
>> end subroutine create_1d_grid
>>
>> !------------------------------------------------------------------------
>>
>> subroutine create_partition_vector(dm, cell_dm, partition, ierr)
>>
>> ! Returns a DM and vector containing partitioning data
>>
>> implicit none
>>
>> #include <finclude/petsc.h90>
>>
>> DM, intent(in) :: dm
>> DM, intent(out) :: cell_dm
>> Vec, intent(out) :: partition
>> PetscErrorCode, intent(out) :: ierr
>> ! Locals:
>> MPI_Comm :: comm
>> PetscMPIInt :: rank
>> PetscSF :: sfPoint
>> PetscSection :: coord_section
>> Vec :: coordinates
>> PetscSection :: cell_section
>> PetscScalar, pointer :: part(:)
>> PetscInt :: cStart, cEnd, c, off
>>
>> call DMGetCoordinateSection(dm, coord_section, ierr); CHKERRQ(ierr)
>> call DMGetCoordinatesLocal(dm, coordinates, ierr); CHKERRQ(ierr)
>> call DMClone(dm, cell_dm, ierr); CHKERRQ(ierr)
>> call DMGetPointSF(dm, sfPoint, ierr); CHKERRQ(ierr)
>> call DMSetPointSF(cell_dm, sfPoint, ierr); CHKERRQ(ierr)
>> call DMSetCoordinateSection(cell_dm, coord_section, ierr); CHKERRQ(ierr)
>> call DMSetCoordinatesLocal(cell_dm, coordinates, ierr); CHKERRQ(ierr)
>> call PetscObjectGetComm(dm, comm, ierr); CHKERRQ(ierr)
>> call PetscSectionCreate(comm, cell_section, ierr); CHKERRQ(ierr)
>> call DMPlexGetHeightStratum(cell_dm, 0, cStart, cEnd, ierr); CHKERRQ(ierr)
>> call PetscSectionSetChart(cell_section, cStart, cEnd, ierr); CHKERRQ(ierr)
>> do c = cStart, cEnd-1
>> call PetscSectionSetDof(cell_section, c, 1, ierr); CHKERRQ(ierr)
>> end do
>> call PetscSectionSetUp(cell_section, ierr); CHKERRQ(ierr)
>> call DMSetDefaultSection(cell_dm, cell_section, ierr); CHKERRQ(ierr)
>> call PetscSectionDestroy(cell_section, ierr); CHKERRQ(ierr)
>> call DMCreateLocalVector(cell_dm, partition, ierr); CHKERRQ(ierr)
>> call PetscObjectSetName(partition, "partition", ierr); CHKERRQ(ierr)
>> call VecGetArrayF90(partition, part, ierr); CHKERRQ(ierr)
>> call MPI_Comm_rank(comm, rank, ierr); CHKERRQ(ierr)
>> do c = cStart, cEnd-1
>> call PetscSectionGetOffset(cell_section, c, off, ierr); CHKERRQ(ierr)
>> part(off) = rank
>> end do
>> call VecRestoreArrayF90(partition, part, ierr); CHKERRQ(ierr)
>>
>> end subroutine create_partition_vector
>>
>>
>>
>> --
>> Dr Adrian Croucher
>> Department of Engineering Science
>> University of Auckland
>> New Zealand
>> tel 64-9-373-7599 ext 84611
>>
>>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
A non-text attachment was scrubbed...
Name: testplex.F90
Type: application/octet-stream
Size: 6437 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20140407/c224f6da/attachment.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: dmf.c
Type: text/x-csrc
Size: 22571 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20140407/c224f6da/attachment.bin>