[petsc-users] Output cell data related to DMDA
Denis Davydov
davydden at gmail.com
Fri Dec 18 05:33:11 CST 2020
Thanks Barry,
> A quick and dirty approach would be to create a new DMDA with a single DOF and
I would need something like DG with a constant value per cell. Can one create such a DMDA and make sure that the MPI partitioning is the same between the two DMDAs?
> use VecStrideGather() to grab that single component into a DMCreateGlobalVector() of the new DMDA from the previous and then pass that new vector to the viewer routine. If the original DMDA is vertex
Yes, the original one is vertex centered (Q1), but I really only care about the output/visualization of the per-cell scalars used in the weak form of the original DMDA. I don’t need to combine them in any way (DMStag-like, if I understand you correctly).
Sincerely
Denis
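[Editor's note: a rough, untested sketch of the matching-partition question above. All names (`da`, `cda`, `ierr`) are assumptions; the idea is that each rank owns the cells whose lowest-index vertex it owns, so the last rank in each direction gets one fewer cell than vertices.]

```c
/* Sketch (untested): a 1-DOF cell-centered DMDA partitioned like an
 * existing 3D vertex-centered DMDA `da` with M x N x P vertices. */
PetscInt        M, N, P, m, n, p;
const PetscInt *lx, *ly, *lz;
PetscInt       *clx, *cly, *clz;
DM              cda;

ierr = DMDAGetInfo(da, NULL, &M, &N, &P, &m, &n, &p, NULL, NULL, NULL, NULL, NULL, NULL);CHKERRQ(ierr);
ierr = DMDAGetOwnershipRanges(da, &lx, &ly, &lz);CHKERRQ(ierr);
/* Copy the vertex ownership ranges and shrink the last range in each
 * direction by one, since there is one fewer cell than vertices. */
ierr = PetscMalloc3(m, &clx, n, &cly, p, &clz);CHKERRQ(ierr);
ierr = PetscArraycpy(clx, lx, m);CHKERRQ(ierr); clx[m-1]--;
ierr = PetscArraycpy(cly, ly, n);CHKERRQ(ierr); cly[n-1]--;
ierr = PetscArraycpy(clz, lz, p);CHKERRQ(ierr); clz[p-1]--;
ierr = DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                    DMDA_STENCIL_BOX, M-1, N-1, P-1, m, n, p,
                    1 /* dof */, 1 /* stencil width */, clx, cly, clz, &cda);CHKERRQ(ierr);
ierr = DMSetUp(cda);CHKERRQ(ierr);
ierr = PetscFree3(clx, cly, clz);CHKERRQ(ierr);
```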
> centered and you want computed values on a cell-centered grid, then it becomes more complicated. If you truly need a combination of vertex- and cell-centered values you might find DMSTAG more useful for your needs.
>
> Barry
>
>
>
>
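[Editor's note: Barry's two-DMDA suggestion might look roughly like the following untested sketch; `da` (a multi-component DMDA), its global vector `X`, the component index `comp`, and `viewer` are all assumed to exist.]

```c
DM  da1;
Vec X1;
/* Compatible DMDA: same grid and MPI partitioning as `da`, but a single DOF */
ierr = DMDACreateCompatibleDMDA(da, 1, &da1);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(da1, &X1);CHKERRQ(ierr);
/* Copy component `comp` of X into the single-component vector ... */
ierr = VecStrideGather(X, comp, X1, INSERT_VALUES);CHKERRQ(ierr);
/* ... and hand that vector to the viewer */
ierr = VecView(X1, viewer);CHKERRQ(ierr);
```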
>> On Dec 18, 2020, at 4:05 AM, Denis Davydov <davydden at gmail.com> wrote:
>>
>> Hi Barry,
>>
>> What I am after is to output one scalar per cell of a DMDA (for example, the heat conductivity on that cell, or the MPI partitioning of the computational domain). I hope that’s what is meant by PETSC_VTK_CELL_FIELD.
>>
>> My understanding is that DMCreateGlobalVector will create a vector associated with the field/discretization/nodal unknowns/etc (that would be PETSC_VTK_POINT_FIELD?), which is not what I would like to visualize.
>>
>> Could you point me in the right direction?
>>
>> If this is not possible with the VTK interface, I am fine with going for other viewer formats (maybe it’s coincidentally easier to visualize in MATLAB).
>>
>> Sincerely,
>> Denis
>>
>>>> Am 18.12.2020 um 10:03 schrieb Barry Smith <bsmith at petsc.dev>:
>>>>
>>>
>>>
>>>> On Dec 18, 2020, at 2:29 AM, Denis Davydov <davydden at gmail.com> wrote:
>>>>
>>>> Hi Matt,
>>>>
>>>> By global vector you mean one created with
>>>>
>>>> VecCreateMPI(..., nel, PETSC_DETERMINE,...)
>>>>
>>>> ? If so, that gives a segfault (even with 1 MPI process) in the user write function, which is just
>>>>
>>>> VecView((Vec)obj,viewer);
>>>>
>>>> which clearly indicates that I misunderstand your comment.
>>>>
>>>> Would you please clarify what PETSc expects as a “global” vector in the case of cell-based quantities, as opposed to unknowns/fields associated with the DMDA discretization?
>>>>
>>>
>>> Denis,
>>>
>>> Not sure what you mean by cell-based, but if your vector is associated with a DMDA you need to use DMCreateGlobalVector() to get the proper layout with respect to the DM. If you use VecCreateMPI(), the vector just has a naive 1-D layout not associated with the DMDA in any way, so it won't be compatible. (Of course we would hope the code would not "crash" with an incompatible vector but would instead produce a useful error message.)
>>>
>>> Barry
>>>
>>>
>>>
>>>
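[Editor's note: the layout difference Barry describes could be sketched as below (untested; `da` is the DMDA and `nel` the local cell count from the earlier messages, and the filename is made up).]

```c
Vec         bad, good;
PetscViewer vtk;
/* Naive 1-D parallel layout: not tied to the DMDA, so a VTK viewer
 * has no way to map its entries onto the structured grid */
ierr = VecCreateMPI(PETSC_COMM_WORLD, nel, PETSC_DETERMINE, &bad);CHKERRQ(ierr);
/* DMDA-aware layout: VecView() through a VTK viewer knows the grid */
ierr = DMCreateGlobalVector(da, &good);CHKERRQ(ierr);
ierr = PetscViewerVTKOpen(PETSC_COMM_WORLD, "field.vts", FILE_MODE_WRITE, &vtk);CHKERRQ(ierr);
ierr = VecView(good, vtk);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&vtk);CHKERRQ(ierr);
```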
>>>> Sincerely,
>>>> Denis
>>>>
>>>>>> Am 17.12.2020 um 18:58 schrieb Matthew Knepley <knepley at gmail.com>:
>>>>>>
>>>>>
>>>>>> On Thu, Dec 17, 2020 at 12:18 PM Denis Davydov <davydden at gmail.com> wrote:
>>>>>
>>>>>> Dear all,
>>>>>>
>>>>>> I would like to output cell data (e.g. a conductivity coefficient) in VTK for a DMDA setup.
>>>>>>
>>>>>> Given that I know how many elements/cells are owned locally, I hoped that PetscViewerVTKAddField with PETSC_VTK_CELL_FIELD would do the job.
>>>>>> However, I am not sure whether the provided vector should be fully distributed (no ghosts). If not, can I get the required ghosts from a DMDA created with DMDACreate3d?
>>>>>
>>>>> I believe that it outputs global vectors, meaning that there are no ghosts.
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Matt
>>>>>
>>>>>> Ps. I saw just one relevant discussion on the mailing list.
>>>>>>
>>>>>> Sincerely,
>>>>>> Denis
>>>>>
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>> -- Norbert Wiener
>>>>>
>>>>> https://www.cse.buffalo.edu/~knepley/
>>>
>