[petsc-users] DMPlex in Fortran

Matthew Knepley knepley at gmail.com
Fri Aug 26 09:30:44 CDT 2022


On Thu, Aug 25, 2022 at 4:25 PM Mike Michell <mi.mike1021 at gmail.com> wrote:

> Thank you for the quick note.
> To use PetscFECreateLagrange(), the meaning of "Nc", which the manual page
> explains as 'number of components', is unclear to me.
> I looked at an example with Nc=1 in
> src/dm/impls/swarm/tutorials/ex1f90.F90, but could you please detail the
> meaning of Nc? If only one value (component) is defined on each of the P0 and
> P1 DMs, is Nc=1?
>

Sure. Nc is the number of tensor components, although we usually think of
the field as a rank-1 tensor (a vector). For most elements, such as Lagrange
elements, this means the Nc-fold tensor product of the scalar element. For
example, suppose you ask for P_2 with Nc = 3. That would mean a vector field
in 3D where each component is quadratic. On a tetrahedral mesh, each vertex
and edge would have 3 dofs (degrees of freedom), one each for the x, y, and z
components of the field. So yes, if only one scalar value lives at each point
of your P0 and P1 DMs, Nc = 1.
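
As a rough, untested Fortran sketch of that P_2 / Nc = 3 case (assuming the
usual Fortran calling sequence with a trailing ierr argument, and the
includes/modules used in the ex1f90.F90 tutorial you mention):

   PetscFE        fe
   PetscInt       dim, nc, k
   PetscErrorCode ierr

   dim = 3          ! simplex (tetrahedral) mesh in 3D
   nc  = 3          ! vector field: x, y, and z components
   k   = 2          ! P_2, i.e. quadratic in each component
   ! simplex element; let PETSc choose the quadrature order
   call PetscFECreateLagrange(PETSC_COMM_SELF, dim, nc, PETSC_TRUE, k, &
                              PETSC_DETERMINE, fe, ierr)
   ! ... attach fe to a DM with DMSetField()/DMCreateDS(), then
   call PetscFEDestroy(fe, ierr)

For your P1/P0 pair you would instead use nc = 1 with k = 1 (values at
vertices) and k = 0 (values at cell centroids).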

   Thanks,

     Matt


> Thanks,
>
>
>> On Thu, Aug 25, 2022 at 12:52 PM Mike Michell <mi.mike1021 at gmail.com>
>> wrote:
>>
>>> Hi, I am trying to find the function you mentioned,
>>> DMPlexCreateInterpolator(), on the DMPlex manual pages, but it does not
>>> seem to be there. Any comment about this?
>>>
>>
>> Sorry, it is DMCreateInterpolation(), which in turn calls
>> DMPlexComputeInterpolatorNested/General.
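>>
>> As a rough, untested Fortran sketch of the whole chain for your case (a P0
>> cell-centroid field mapped to a P1 vertex field), assuming DMSetField() and
>> PETSC_NULL_DMLABEL are callable from Fortran in your PETSc version (please
>> double-check those two), and with dm being your existing Plex:
>>
>>    DM             dmVertex, dmCell
>>    PetscFE        feP1, feP0
>>    Mat            interp
>>    Vec            scale
>>    PetscInt       dim, zero, one
>>    PetscErrorCode ierr
>>
>>    zero = 0
>>    one  = 1
>>    call DMGetDimension(dm, dim, ierr)
>>
>>    ! P1 (vertex) scalar field, Nc = 1, on one clone of the mesh
>>    call DMClone(dm, dmVertex, ierr)
>>    call PetscFECreateLagrange(PETSC_COMM_SELF, dim, one, PETSC_TRUE, one, &
>>                               PETSC_DETERMINE, feP1, ierr)
>>    call DMSetField(dmVertex, zero, PETSC_NULL_DMLABEL, feP1, ierr)
>>    call DMCreateDS(dmVertex, ierr)
>>    call PetscFEDestroy(feP1, ierr)
>>
>>    ! P0 (cell-centroid) scalar field, Nc = 1, on another clone
>>    call DMClone(dm, dmCell, ierr)
>>    call PetscFECreateLagrange(PETSC_COMM_SELF, dim, one, PETSC_TRUE, zero, &
>>                               PETSC_DETERMINE, feP0, ierr)
>>    call DMSetField(dmCell, zero, PETSC_NULL_DMLABEL, feP0, ierr)
>>    call DMCreateDS(dmCell, ierr)
>>    call PetscFEDestroy(feP0, ierr)
>>
>>    ! Matrix mapping global vectors on dmCell to global vectors on dmVertex;
>>    ! apply it with MatMult(interp, xCell, xVertex, ierr)
>>    call DMCreateInterpolation(dmCell, dmVertex, interp, scale, ierr)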
>>
>>   Thanks,
>>
>>       Matt
>>
>>
>>> Thanks,
>>>
>>>
>>>> On Fri, Jul 8, 2022 at 11:26 PM Mike Michell <mi.mike1021 at gmail.com>
>>>> wrote:
>>>>
>>>>> I am using DMPlex for a code written in Fortran in 2D and 3D.
>>>>> I have two questions.
>>>>>
>>>>> - As a follow up of the previous inquiry:
>>>>> https://www.mail-archive.com/petsc-users@mcs.anl.gov/msg43856.html
>>>>> Is the local-to-local halo exchange available in Fortran now, or is it
>>>>> still pending? Currently, local-to-global and global-to-local communication
>>>>> are used, since local-to-local has not been available from Fortran.
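>>>>>
>>>>> For reference, the round trip we use looks roughly like this (untested
>>>>> sketch; xLocal and xGlobal come from DMCreateLocalVector and
>>>>> DMCreateGlobalVector):
>>>>>
>>>>>    ! push owned values into the global vector, then pull ghost values back
>>>>>    call DMLocalToGlobalBegin(dm, xLocal, INSERT_VALUES, xGlobal, ierr)
>>>>>    call DMLocalToGlobalEnd(dm, xLocal, INSERT_VALUES, xGlobal, ierr)
>>>>>    call DMGlobalToLocalBegin(dm, xGlobal, INSERT_VALUES, xLocal, ierr)
>>>>>    call DMGlobalToLocalEnd(dm, xGlobal, INSERT_VALUES, xLocal, ierr)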
>>>>>
>>>>
>>>> Sorry, it is still on my TODO list. I am trying to get stuff cleared
>>>> out.
>>>>
>>>>
>>>>> - One code solves discretized equations at each vertex, and another code
>>>>> that I want to couple with solves equations at each cell centroid.
>>>>> Basically, the value at a cell centroid needs to be mapped to the vertices
>>>>> (or vice versa) through interpolation/extrapolation to couple the two codes.
>>>>> Does PETSc provide this kind of mapping between cell centroids and vertices?
>>>>> The grids for the two codes can overlap. I was trying to find some FEM
>>>>> infrastructure in PETSc, but so far I have not found that kind of
>>>>> functionality. Can I get any comments on that?
>>>>>
>>>>
>>>> Yes, you can create both P0 and P1 discretizations
>>>> (PetscFECreateLagrange) in two different DMs using DMClone(), and then
>>>> create an interpolation operator (DMPlexCreateInterpolator) which maps
>>>> between them. Let me know if something is not clear there.
>>>>
>>>>   Thanks,
>>>>
>>>>     Matt
>>>>
>>>>
>>>>> Thanks,
>>>>> Mike
>>>>>
>>>>>
>>>>
>>>
>>
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>