[petsc-users] Vertex only unstructured mesh with DMPlex and DMSWARM
Matthew Knepley
knepley at gmail.com
Wed Feb 12 10:52:42 CST 2020
On Wed, Feb 12, 2020 at 3:56 AM Hill, Reuben <reuben.hill10 at imperial.ac.uk>
wrote:
> I'm a new Firedrake developer working on getting my head around PETSc. As
> far as I'm aware, all our PETSc calls are done via petsc4py.
>
> I'm after general help and advice on two fronts:
>
>
> *1*:
>
> I’m trying to represent a point cloud as a vertex-only mesh so that it
> plays nicely with the Firedrake stack. If I try to do this through
> Firedrake I trigger a PETSc segfault at the point of calling
>
> PETSc.DMPlex().createFromCellList(dim, cells, coords, comm=comm)
>
>
> with dim=0, cells=[[0]], coords=[[1., 2.]] and comm=COMM_WORLD.
>
> Output:
>
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see
> https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS
> X to find memory corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and
> run
> [0]PETSC ERROR: to get more information on the crash.
> application called MPI_Abort(MPI_COMM_WORLD, 50152059) - process 0
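
For reference, the petsc4py call above maps to roughly the following at the C
level. This is only a sketch: the counts are inferred from the array shapes
you pass in, interpolate is set to PETSC_FALSE here, and I have not checked
whether dim = 0 is actually supported by this routine.

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm;
  const int      cells[]  = {0};          /* one "cell" consisting of vertex 0 */
  const double   coords[] = {1.0, 2.0};   /* that vertex, embedded in 2D space */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* dim = 0, numCells = 1, numVertices = 1, numCorners = 1,
     interpolate = PETSC_FALSE, spaceDim = 2 */
  ierr = DMPlexCreateFromCellList(PETSC_COMM_WORLD, 0, 1, 1, 1, PETSC_FALSE,
                                  cells, 2, coords, &dm);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}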
>
>
> I’m now looking into getting Firedrake to make a DMSWARM, which seems to
> have been designed for something closer to this and allows nice things
> such as letting the particles (for me, the vertices of the mesh) move and
> jump between MPI ranks à la particle-in-cell. I note the DMSwarm docs
> don't suggest there is an equivalent of the plex.distribute() method in
> petsc4py (which I believe calls DMPlexDistribute) that a DMPlex has. In
> Firedrake we create empty DMPlexes on every rank except 0, then call
> plex.distribute() to take care of the parallel partitioning. How,
> therefore, am I meant to go about distributing particles across MPI
> ranks?
>
Patrick is right. Here is an example:
https://gitlab.com/petsc/petsc/-/blob/master/src/snes/examples/tutorials/ex63.c
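
On the distribution question: a DMSwarm does not have a direct analogue of
plex.distribute(). Instead, for a DMSWARM_PIC swarm you attach a cell DM,
set the point coordinates, and call DMSwarmMigrate(), which bins each point
into a cell of the cell DM and sends it to the rank owning that cell; the
same call re-distributes points after they move. A rough, untested C sketch
of that setup (the box mesh here is just a stand-in for whatever Plex you
already have):

#include <petscdmplex.h>
#include <petscdmswarm.h>

int main(int argc, char **argv)
{
  DM             plex, swarm;
  PetscReal     *coords;
  PetscInt       blocksize, npoints = 1, p;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* Background cell DM: a placeholder 2D box mesh; use your existing Plex instead */
  ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_TRUE, NULL, NULL, NULL,
                             NULL, PETSC_TRUE, &plex);CHKERRQ(ierr);

  /* The swarm is the vertex-only "mesh": a point cloud bound to cells of the plex */
  ierr = DMCreate(PETSC_COMM_WORLD, &swarm);CHKERRQ(ierr);
  ierr = DMSetType(swarm, DMSWARM);CHKERRQ(ierr);
  ierr = DMSetDimension(swarm, 2);CHKERRQ(ierr);
  ierr = DMSwarmSetType(swarm, DMSWARM_PIC);CHKERRQ(ierr);
  ierr = DMSwarmSetCellDM(swarm, plex);CHKERRQ(ierr);
  /* Register any extra per-point fields here, then finalize the registration */
  ierr = DMSwarmFinalizeFieldRegister(swarm);CHKERRQ(ierr);

  /* Create some points on this rank (second argument is extra buffer space) */
  ierr = DMSwarmSetLocalSizes(swarm, npoints, 4);CHKERRQ(ierr);
  ierr = DMSwarmGetField(swarm, DMSwarmPICField_coor, &blocksize, NULL, (void **) &coords);CHKERRQ(ierr);
  for (p = 0; p < npoints; ++p) {
    coords[blocksize*p + 0] = 0.5;
    coords[blocksize*p + 1] = 0.5;
  }
  ierr = DMSwarmRestoreField(swarm, DMSwarmPICField_coor, &blocksize, NULL, (void **) &coords);CHKERRQ(ierr);

  /* The "distribute" step: points are located in cells of the cell DM and shipped
     to the owning rank; PETSC_TRUE removes the sent points from the sending rank */
  ierr = DMSwarmMigrate(swarm, PETSC_TRUE);CHKERRQ(ierr);

  ierr = DMDestroy(&swarm);CHKERRQ(ierr);
  ierr = DMDestroy(&plex);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Note that because points move between ranks, the local size of the swarm can
change across DMSwarmMigrate() calls, so query DMSwarmGetLocalSize() again
afterwards rather than caching it.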
Thanks
Matt
> *2*:
>
> I'm aware these questions may be very naive. Any advice on learning the
> relevant bits of PETSc would be very much appreciated. I'm in chapter 2
> of the excellent manual (
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manual.pdf) and I'm also
> attempting to understand the DMSwarm example. I presume that in order to
> understand DMSWARM I really ought to understand DMs more generally (i.e.
> read the whole manual)?
>
>
> Many thanks
> Reuben Hill
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/