[petsc-users] sieve-dev Unstructured meshes in PETSC using Sieve

Chris Eldred chris.eldred at gmail.com
Wed Jul 25 14:30:58 CDT 2012


Where are the Fortran include files for DMComplex? I checked
${PETSC_DIR}/include/finclude, but they are not there. The C/C++ headers
are in ${PETSC_DIR}/include/, though.

On Wed, Jul 25, 2012 at 11:48 AM, Chris Eldred <chris.eldred at gmail.com> wrote:

> Thanks for the info! I will modify my code to use DMComplex instead of
> DMMesh (and migrate to the latest versions of petsc-dev and slepc-dev).
> I'll let you know if that does not get rid of the segfault as well.
>
>
> On Wed, Jul 25, 2012 at 11:34 AM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Tue, Jul 24, 2012 at 12:31 PM, Chris Eldred <chris.eldred at gmail.com> wrote:
>>
>>> Hey PETSc/Sieve Developers,
>>>
>>> I am building a nonlinear shallow water testbed model (along with an
>>> associated eigensolver for the linear equations) intended to work on
>>> unstructured Voronoi meshes and cubed-sphere grids (with arbitrary
>>> block-structured refinement). It will be a 2-D code. There will NOT be
>>> any adaptive mesh refinement: the mesh is defined once at the start of
>>> the application. It will support finite difference, finite volume, and
>>> finite element-type (spectral element and Discontinuous Galerkin)
>>> schemes, so variables will be defined on edges, cells, and vertices. I
>>> would like to use PETSc/SLEPc (currently limited to v3.2 for both,
>>> since that is the latest version of SLEPc) for the sparse linear
>>> algebra and eigenvalue solvers. This is intended as a useful tool for
>>> researchers in atmospheric model development: it will allow easy
>>> intercomparison of different grids and schemes under a common framework.
>>>
>>
>> Cool. Use slepc-dev.
>>
>>
>>> Right now I have a serial version (written in Fortran 90) that
>>> implements a few different finite-difference schemes (along with a
>>> multigrid solver for square and hexagonal meshes) on unstructured
>>> Voronoi meshes, and I would like to move to a parallel version (also
>>> in Fortran 90). The Sieve framework seems like an excellent fit for
>>> defining the unstructured mesh, managing variables defined on
>>> edges/faces/vertices, and handling scatter/gather operations between
>>> processes. I was planning on doing the parallel partitioning with ParMetis.
>>>
>>
>> That is definitely what it is for.
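>>
>> As a quick sketch of that path (using the current petsc-dev DMComplex
>> names; this interface is still moving, so check the man pages and treat
>> the exact signature below as illustrative):
>>
>>     DM dmDist = PETSC_NULL;
>>
>>     /* Partition with ParMetis and move cells/edges/vertices to their owners */
>>     ierr = DMComplexDistribute(dm, "parmetis", &dmDist);CHKERRQ(ierr);
>>     if (dmDist) {
>>       ierr = DMDestroy(&dm);CHKERRQ(ierr);  /* replace the serial mesh */
>>       dm   = dmDist;
>>     }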
>>
>>
>>> My understanding is that DMMesh handles mesh topology (interconnections,
>>> etc) while Sections define variables and mesh geometry (edge lengths,
>>> areas, etc.). Sections can be created over different depths/heights (chains
>>> of points in Sieve) in order to define variables on vertices/edges/cells.
>>>
>>
>> Yes.
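>>
>> As a minimal sketch (again assuming current petsc-dev names), one
>> unknown per vertex plus one per cell would be laid out like this; the
>> helper name is mine, not part of the API:
>>
>>     #include <petscdmcomplex.h>
>>
>>     /* Lay out one dof on every vertex and one on every cell of dm */
>>     PetscErrorCode CreateVertexCellSection(DM dm, PetscSection *section)
>>     {
>>       MPI_Comm       comm;
>>       PetscInt       pStart, pEnd, vStart, vEnd, cStart, cEnd, p;
>>       PetscErrorCode ierr;
>>
>>       PetscFunctionBegin;
>>       ierr = DMComplexGetChart(dm, &pStart, &pEnd);CHKERRQ(ierr);
>>       ierr = DMComplexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr);  /* depth 0  = vertices */
>>       ierr = DMComplexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* height 0 = cells    */
>>       ierr = PetscObjectGetComm((PetscObject) dm, &comm);CHKERRQ(ierr);
>>       ierr = PetscSectionCreate(comm, section);CHKERRQ(ierr);
>>       ierr = PetscSectionSetChart(*section, pStart, pEnd);CHKERRQ(ierr);
>>       for (p = vStart; p < vEnd; ++p) {ierr = PetscSectionSetDof(*section, p, 1);CHKERRQ(ierr);}
>>       for (p = cStart; p < cEnd; ++p) {ierr = PetscSectionSetDof(*section, p, 1);CHKERRQ(ierr);}
>>       ierr = PetscSectionSetUp(*section);CHKERRQ(ierr);
>>       PetscFunctionReturn(0);
>>     }
>>
>> Hand the result to DMSetDefaultSection() and DMCreateGlobalVector() /
>> DMCreateLocalVector() will use that layout, including the ghost exchange
>> in DMGlobalToLocalBegin/End().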
>>
>>
>>> I am looking for documentation and examples of code use. I found:
>>>
>>> http://www.mcs.anl.gov/petsc/petsc-dev/src/snes/examples/tutorials/ex62.c.html
>>>
>>> http://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex12.c.html
>>>
>>> Are there other examples/documentation available?
>>>
>>
>> Here is my simple tutorial:
>>
>> Building and Running ex62
>> --------------------------------------
>>
>> First, configure with FEM stuff turned on:
>>
>>     '--download-triangle',
>>     '--download-ctetgen',
>>     '--download-fiat',
>>     '--download-generator',
>>     '--download-chaco',
>>     '--download-metis',
>>     '--download-parmetis',
>>     '--download-scientificpython',
>>
>> I also use
>>
>>     '--with-dynamic-loading',
>>     '--with-shared-libraries',
>>     '--download-mpich',
>>     '--download-ml',
>>
>> and if you want to try GPU stuff
>>
>>     '--with-cuda',
>>     '--with-cuda-arch=sm_10',
>>     '--with-cuda-only',
>>     '--with-cudac=nvcc -m64',
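>>
>> Those entries go in a configure script's args list; equivalently, you
>> can pass them straight on the command line:
>>
>>     ./configure --download-triangle --download-ctetgen --download-fiat \
>>       --download-generator --download-chaco --download-metis \
>>       --download-parmetis --download-scientificpython \
>>       --with-shared-libraries --download-mpich --download-ml
>>
>> with the CUDA options added on top if you want the GPU path.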
>>
>> Then build PETSc with the Python make:
>>
>>    python2.7 ./config/builder2.py clean
>>    python2.7 ./config/builder2.py build         <This rebuilds only changed things correctly as well>
>>
>>    python2.7 ./config/builder2.py --help        <This is useful>
>>    python2.7 ./config/builder2.py build --help  <This is too>
>>    python2.7 ./config/builder2.py check --help  <And this>
>>
>> Once you have this, you should be able to build and run ex62:
>>
>>    python2.7 ./config/builder2.py check src/snes/examples/tutorials/ex62.c --testnum=0
>>
>> which runs the first test; run with no --testnum argument to run them
>> all. All the options are listed at the top of ./config/builder.py.
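>>
>> For example, running the whole ex62 suite is just:
>>
>>    python2.7 ./config/builder2.py check src/snes/examples/tutorials/ex62.c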
>>
>>
>>
>>> Also, I was wondering what the difference is between DMMesh and
>>> DMComplex. It appears that they both implement the Sieve framework?
>>>
>>
>> DMMesh is the old version of DMComplex. I decided that C++ is a blight
>> upon mankind and templates are its Furies, so I rewrote all of DMMesh
>> in C, used Jed's new communication stuff, got rid of iterators, and
>> made things integrate much better with the solvers.
>>
>>    Thanks,
>>
>>       Matt
>>
>>
>>> Thanks,
>>> Chris Eldred
>>>
>>> --
>>> Chris Eldred
>>> DOE Computational Science Graduate Fellow
>>> Graduate Student, Atmospheric Science, Colorado State University
>>> B.S. Applied Computational Physics, Carnegie Mellon University, 2009
>>> chris.eldred at gmail.com
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>
>
> --
> Chris Eldred
> DOE Computational Science Graduate Fellow
> Graduate Student, Atmospheric Science, Colorado State University
> B.S. Applied Computational Physics, Carnegie Mellon University, 2009
> chris.eldred at gmail.com
>



-- 
Chris Eldred
DOE Computational Science Graduate Fellow
Graduate Student, Atmospheric Science, Colorado State University
B.S. Applied Computational Physics, Carnegie Mellon University, 2009
chris.eldred at gmail.com