[petsc-users] PETSc on unstructured meshes / Sieve
Gong Ding
gdiso at ustc.edu
Thu Aug 25 11:46:54 CDT 2011
I had added vertex-based FVM support to libmesh for my semiconductor simulator; it was not very difficult.
Back in 2007, I spent about one month reading through the source code of libmesh,
and finally learned how to write 3D, parallel numerical code based on unstructured grids:
How to use OO to describe different mesh cells (tri, quad, tet, prism, hex...) with a uniform interface.
How to organize the data structures of unstructured mesh cells in memory.
How to partition the mesh between processors.
How to map variables onto PETSc Vec and Mat objects.
How to synchronize values on ghost nodes/cells.
How to assemble the PETSc Vec and Mat.
Many, many issues.
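One of the issues in the list above, synchronizing values on ghost nodes, can be illustrated with a minimal sketch. This is plain Python with no PETSc: the two-rank layout, node numbering, and the `ghost_update` helper are all hypothetical, and a dict lookup stands in for the MPI exchange that PETSc's ghosted vectors perform.

```python
# Hypothetical 1D mesh with 6 nodes split across two "ranks".
owned = {0: [0, 1, 2], 1: [3, 4, 5]}   # rank -> node ids it owns
ghosts = {0: [3], 1: [2]}              # rank -> ghost node ids owned elsewhere

# Per-rank local storage: owned values plus ghost slots (initially in sync).
values = {r: {n: float(n) for n in owned[r] + ghosts[r]} for r in (0, 1)}

def ghost_update(values, owned, ghosts):
    """Copy each ghost slot from the rank that owns the node.
    In real PETSc this round of communication is what a ghost
    update performs; here a dict lookup replaces MPI messages."""
    owner_of = {n: r for r, nodes in owned.items() for n in nodes}
    for r, gs in ghosts.items():
        for n in gs:
            values[r][n] = values[owner_of[n]][n]

# Rank 0 updates a boundary node; rank 1's ghost copy is stale until sync.
values[0][2] = 42.0
ghost_update(values, owned, ghosts)
print(values[1][2])  # 42.0 after the exchange
```

The point of the sketch is only the ownership discipline: every node has exactly one owner, and ghost copies are read-only until a collective update refreshes them.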
Thanks to all of you for your suggestions. I'll give Sieve a try.
I do understand that PETSc is neither an FEM nor an FVM framework. What I expect from PETSc is that it helps with the parallel communication across mesh partition boundaries; I don't think there is a better toolkit for this. At the same time, I would like to implement the matrix coefficients myself for learning purposes. For this reason projects like libmesh don't help (besides, it is for FEM, not FVM). By the way, why does libmesh reinvent its own mesh classes instead of using PETSc's?
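Filling in the matrix coefficients by hand, as proposed above, can be sketched for the simplest case: a cell-centred two-point-flux FVM Laplacian. This is a toy example, not PETSc code; the face list and transmissibilities are made up, and a dense Python list-of-lists stands in for a PETSc Mat (real code would call something like MatSetValues per face).

```python
n_cells = 4
# Hypothetical interior faces of a 1D chain of 4 cells:
# (cell_i, cell_j, transmissibility).
faces = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0)]

# Dense matrix as a stand-in for the sparse system matrix.
A = [[0.0] * n_cells for _ in range(n_cells)]

for i, j, t in faces:
    # Each face's two-point flux contributes a symmetric 2x2 stencil.
    A[i][i] += t
    A[j][j] += t
    A[i][j] -= t
    A[j][i] -= t

# With no boundary conditions, every row sums to zero (pure-Neumann operator).
print([sum(row) for row in A])  # [0.0, 0.0, 0.0, 0.0]
```

The per-face assembly loop is the part that carries over to an unstructured mesh unchanged; only the face list (and the transmissibility geometry behind it) gets more complicated.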
While the PETSc PDF manual has several pages on working with distributed arrays, I can't find anything on distributed meshes except for a description of index sets. Is the use of DMMesh documented somewhere?
Marek
________________________________
From: Matthew Knepley <knepley at gmail.com>
To: Marek Schmitt <marek.schmitt at yahoo.com>; PETSc users list <petsc-users at mcs.anl.gov>
Sent: Wednesday, August 24, 2011 3:45 PM
Subject: Re: [petsc-users] PETSc on unstructured meshes / Sieve
On Wed, Aug 24, 2011 at 1:25 PM, Marek Schmitt <marek.schmitt at yahoo.com> wrote:
I would like to experiment with PETSc for learning FVM on unstructured grids. I get the impression that PETSc is primarily developed for structured grids with Cartesian topology; is this true?
PETSc is not primarily developed for discretization and topology; it solves systems of nonlinear algebraic equations. It does have
some extensions that handle structured (DMDA) and unstructured (DMMesh) grids under the recently developed DM interface.
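The structured/unstructured split mentioned here comes down to where connectivity lives. A hedged illustration in plain Python (no PETSc; the grid size and helper names are invented): on a structured grid, neighbours follow from index arithmetic, while an unstructured mesh must carry explicit adjacency data.

```python
nx, ny = 3, 3  # hypothetical 3x3 structured grid

def structured_neighbours(i, j):
    """Neighbours of cell (i, j) come for free from the Cartesian layout."""
    cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in cand if 0 <= a < nx and 0 <= b < ny]

# The same grid stored as an unstructured mesh: connectivity is now data
# that must be built, partitioned, and communicated explicitly.
unstructured_adjacency = {
    (i, j): structured_neighbours(i, j) for i in range(nx) for j in range(ny)
}

print(len(structured_neighbours(1, 1)))      # interior cell: 4 neighbours
print(len(unstructured_adjacency[(0, 0)]))   # corner cell: 2 neighbours
```

Roughly speaking, DMDA can exploit the first form (a few grid dimensions describe everything), while an unstructured DM has to manage something like the second, which is why it is the harder component to make robust.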
PyLith and FEniCS seem to use Sieve for unstructured grids. Is Sieve part of PETSc?
Sieve is both the name of an interface (implemented by FEniCS) and an implementation of that interface (in PETSc).
Why is it so hidden? The very quiet sieve-dev mailing list has existed for four years, but there is a recent post:
I am not sure it's hidden. I answer all the questions on the list. It's a big project for one person to support. I put most of
my time into supporting PyLith (http://geodynamics.org/cig/software/pylith), which uses it for parallel FEM simulation.
PETSc is 20 years old, so we have had time to make some components more robust.
"2) Unstructured meshes. This is not well-documented. There is a tutorial presentation and a repository of code for it. A few people have used this, but it is nowhere near the level of clarity and robustness that the rest of PETSc has." (from http://lists.mcs.anl.gov/pipermail/sieve-dev/2010-October/000098.html)
Is the sieve-dev list about a different Sieve than what is used by PyLith and FEniCS?
They are two different implementations.
There is a PETSc FAQ entry, "Do you have examples of doing unstructured grid finite element computations (FEM) with PETSc?". It mentions Sieve but gives no further links or documentation.
Is the directory petsc-3.1-p8/include/sieve all that is needed to work with Sieve? Or are these only header files, so that I have to link to the Sieve library from somewhere else (and if so, where can I find Sieve)?
You must use petsc-dev in order for the examples, such as SNES ex12, to work.
Thanks,
Matt
Please shed some light on the mysterious Sieve.
Marek
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener