[petsc-users] DM question.

Afanasiev Michael michael.afanasiev at erdw.ethz.ch
Fri Oct 23 03:01:03 CDT 2015


Hi Matthew,

So I’m discretizing via a tensor product of Lagrange polynomials co-located at the Gauss-Lobatto-Legendre (GLL) points. The polynomial order will range from roughly 4 to 10. The current problem is solved on 2D hexes.
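(For concreteness, order p puts (p+1)^2 GLL nodes on each 2D element, so p = 4 means 5 x 5 = 25 nodes per element; the helper below is purely illustrative.)

    // Illustrative only: GLL nodes per 2D tensor-product element of order p.
    PetscInt nodesPerElement2D(PetscInt p) { return (p + 1) * (p + 1); }  // p = 4 -> 25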

I had found DMPlexCreateSection, and followed plex/ex1 to get things set up. You can see my attempt below. Basically I’ve got 4 fields (displacement, velocity, acceleration, and strain) over each element. Here I’ve tried to set up a tensor product of 4th-order (5 GLL points) Lagrange polynomials (line 11). This seemed to roughly achieve what I wanted: I created a global vector and wrote it to a VTK file with VecView, and the numbering seemed to make sense. You can also see my attempt at defining a boundary condition (it looks like DMPlexCreateFromExodus labels side sets as “Face Sets”, which seems to have worked).

Does this seem to be headed in the right direction?

Cheers,
Mike.

DM
mesh::createSection(const DM &dm)
{

01        // displacement, velocity, acceleration, strain
02        IS bcPointIs[1];
03        PetscInt numBc = 1;
04        PetscInt bcField[1];
05        PetscInt numFields = 4;
06        PetscInt dim; DMGetDimension(dm, &dim);
07        PetscInt numComp[numFields];
08        for (auto i=0; i<numFields; i++) {numComp[i] = dim;}
09        PetscInt numDof[numFields*(dim+1)];
10        for (auto i=0; i<numFields*(dim+1); i++) {numDof[i] = 0;}
11        for (auto i=0; i<numFields; i++) {numDof[i*(dim+1)+dim] = 5;}
12        bcField[0] = 0;
13        PetscSection section;
14        DMPlexGetStratumIS(dm, "Face Sets", 1, &bcPointIs[0]);
15        DMPlexCreateSection(dm, dim, numFields, numComp, numDof, numBc, bcField,
16                                                NULL, bcPointIs, NULL, &section);
17        ISDestroy(&bcPointIs[0]);
18        PetscSectionSetFieldName(section, 0, "u");
19        PetscSectionSetFieldName(section, 1, "v");
20        PetscSectionSetFieldName(section, 2, "a");
21        PetscSectionSetFieldName(section, 3, "e");
22        DMSetDefaultSection(dm, section);
23        PetscSectionDestroy(&section); // the DM keeps its own reference
24        return dm;
}
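For reference, the global-vector/VTK check described above would look roughly like the sketch below (the vector name and "fields.vtk" are made up for the example, headers such as petscviewer.h are assumed, and error checking is omitted):

    // Rough sketch of the VecView-to-VTK sanity check mentioned above.
    Vec         u;
    PetscViewer viewer;
    DMCreateGlobalVector(dm, &u);
    PetscObjectSetName((PetscObject) u, "fields");
    PetscViewerVTKOpen(PETSC_COMM_WORLD, "fields.vtk", FILE_MODE_WRITE, &viewer);
    VecView(u, viewer);
    PetscViewerDestroy(&viewer);
    VecDestroy(&u);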

--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut für Geophysik
ETH Zürich

Sonneggstrasse 5, NO H 39.2
CH 8092 Zürich
michael.afanasiev at erdw.ethz.ch

On Oct 22, 2015, at 2:32 AM, Matthew Knepley <knepley at gmail.com> wrote:

On Wed, Oct 21, 2015 at 3:07 PM, Dave May <dave.may at erdw.ethz.ch> wrote:
Hey Mike,



On 21 October 2015 at 18:01, Afanasiev Michael <michael.afanasiev at erdw.ethz.ch> wrote:
Hey Dave,

So I’ve got a couple of days where there’s nothing pressing to work on… I was thinking of ripping out the (ugly) parallel routines in my wave propagation code and replacing them with PETSc DM routines. I can read in my ExodusII mesh with DMPlex and partition it with Chaco, which gives me a nicely partitioned DM. This takes me about 5 lines of code. That’s amazing.

But here I’m stuck, and am having a whale of a time with the documentation. All I think I need is a way to modify the Exodus-created DM and add to it the degrees of freedom that are introduced by my quadrature rule. This would be really neat: I could just treat each sub-domain as its own mesh, with its own numbering, and then whenever necessary scatter to the real global degrees of freedom with something like VecLocToGlob. Most of the things I like about the code could stay the same (its element-wise, matrix-free nature); just these parallel broadcasts would be infinitely nicer.
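(As a sketch of what that scatter looks like in DM terms once the DM knows the dof layout; the insert modes below are the usual pattern, not necessarily what this code needs, and petscdm.h is assumed:)

    // Sketch: move data between the global Vec and a per-rank local Vec with ghosts.
    Vec gvec, lvec;
    DMCreateGlobalVector(dm, &gvec);
    DMCreateLocalVector(dm, &lvec);
    DMGlobalToLocalBegin(dm, gvec, INSERT_VALUES, lvec);  // pull ghost values onto this rank
    DMGlobalToLocalEnd(dm, gvec, INSERT_VALUES, lvec);
    // ... element-wise, matrix-free work on lvec ...
    DMLocalToGlobalBegin(dm, lvec, ADD_VALUES, gvec);     // accumulate contributions back
    DMLocalToGlobalEnd(dm, lvec, ADD_VALUES, gvec);
    VecDestroy(&lvec);
    VecDestroy(&gvec);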


First off - I don't use DMPLEX.

Dave is refreshingly candid about his shortcomings ;)


But I just can’t figure out how to set this up. The main problem really boils down to: what’s the best way to add my quadrature points to an already-created DM, which was constructed with an exodus file? I guess I could do this after the file is read, but before the partitioning. In this case though, what’s stopping the partitioner from cutting an element in half? It seems like it would be a lot cleaner to do this post-partitioning.
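(A sketch of the post-partitioning order of operations, assuming the 3.6-era DMPlexDistribute signature, petscdmplex.h, and a made-up file name; the section with the GLL dofs would be built on the distributed DM afterwards:)

    // Sketch: read the Exodus mesh, distribute it, then attach the dof layout,
    // so the partitioner only ever sees the plain hex topology.
    DM dm, dmDist = NULL;
    DMPlexCreateExodusFromFile(PETSC_COMM_WORLD, "mesh.exo", PETSC_TRUE, &dm);
    DMPlexDistribute(dm, 0, NULL, &dmDist);  // 0 = no overlap
    if (dmDist) { DMDestroy(&dm); dm = dmDist; }
    // ... now build the PetscSection with the GLL dofs on dm ...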


Presumably what is read from Exodus is just the vertices of the hexes, and what you want to do is define the function space (given by your GLL locations) on top of the element geometry that was read in. Is that what you are asking about?

So Dave is right. We read in the topology and geometry from ExodusII, and then you define a function space on top. How exactly are you discretizing? In order to create vectors, do local-to-global, etc., PETSc really only needs to know the amount of data associated with each mesh piece. You can define this with PetscSection. If you give me an idea of what you want, I think I can help you write the code easily.
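A bare-bones sketch of that, assuming a single field with dofs stored only on cells (the dof count of 25 is just a placeholder for a 5x5 GLL layout, and petscdmplex.h is assumed), could look like:

    // Minimal PetscSection: tell PETSc how many dofs live on each mesh point.
    PetscSection s;
    PetscInt     pStart, pEnd, cStart, cEnd, p;
    DMPlexGetChart(dm, &pStart, &pEnd);
    DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);  // height 0 = cells
    PetscSectionCreate(PetscObjectComm((PetscObject) dm), &s);
    PetscSectionSetChart(s, pStart, pEnd);
    for (p = cStart; p < cEnd; ++p) PetscSectionSetDof(s, p, 25);  // e.g. 25 GLL nodes per cell
    PetscSectionSetUp(s);
    DMSetDefaultSection(dm, s);
    PetscSectionDestroy(&s);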

  Thanks,

     Matt

Any hints here?

Actually I have no experience with this object.
I would just send an email to
  petsc-users at mcs.anl.gov
asking for help.

The developer of DMPLEX (Matt Knepley) will definitely answer within a day.

Cheers,
  Dave


Best,
Mike.
--
Michael Afanasiev
Ph.D. Candidate
Computational Seismology
Institut für Geophysik
ETH Zürich

Sonneggstrasse 5, NO H 39.2
CH 8092 Zürich
michael.afanasiev at erdw.ethz.ch





--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
