[petsc-users] Implementing periodicity using DMPlex

Jed Brown jed at jedbrown.org
Sat Jun 13 00:55:28 CDT 2020


Please always use "reply-all" so that your messages go to the list.
This is standard mailing list etiquette.  It is important to preserve
threading for people who find this discussion later and so that we do
not waste our time re-answering the same questions that have already
been answered in private side-conversations.  You'll likely get an
answer faster that way too.

Shashwat Tiwari <shaswat121994 at gmail.com> writes:

> Yeah, I have used that too. I am getting zero values even in the local vector.

Corresponding to the ghost cells you created for that purpose, or otherwise?
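
For reference, a minimal sketch of the DMGlobalToLocal workflow mentioned below, assuming `dm` is the DMPlex after DMPlexConstructGhostCells with a section already set (error checking omitted; this is an illustrative sketch, not the attached code):

    Vec gvec, lvec;

    DMGetGlobalVector(dm, &gvec);
    DMGetLocalVector(dm, &lvec);

    /* ... set initial values in the global vector gvec ... */

    /* Scatter owned values into the local vector, filling the overlap
       (including the ghost cells created by DMPlexConstructGhostCells). */
    DMGlobalToLocalBegin(dm, gvec, INSERT_VALUES, lvec);
    DMGlobalToLocalEnd(dm, gvec, INSERT_VALUES, lvec);

    /* Read owned + ghost values through the local vector lvec here. */

    DMRestoreLocalVector(dm, &lvec);
    DMRestoreGlobalVector(dm, &gvec);

A global vector created from a DM section is not a VecGhost, which is why VecGhostUpdateBegin reports "Vector is not ghosted".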

> On Sat, 13 Jun 2020, 12:57 am , <jed at jedbrown.org> wrote:
>
>> You need DMGlobalToLocal.
>>
>> VecGhost is a different thing.
>>
>> On Jun 12, 2020 13:17, Shashwat Tiwari <shaswat121994 at gmail.com> wrote:
>>
>> Hi,
>> I am writing a first-order 2D solver for unstructured grids with periodic
>> boundaries using DMPlex. After generating the mesh, I use the
>> "DMSetPeriodicity" function to set periodicity in both directions, after
>> which I partition the mesh (DMPlexDistribute), construct ghost cells
>> (DMPlexConstructGhostCells), create a section, and set some initial values
>> in the global vector. Then I use "VecGhostUpdateBegin" to start updating
>> the boundary ghost cell values, but I get the following error when I run
>> with multiple processes:
>>
>> [0]PETSC ERROR: Invalid argument
>> [0]PETSC ERROR: Vector is not ghosted
>> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>>
>> If I run with a single process, there is no error, but the values remain
>> zero and are not updated. Kindly let me know if I am missing some crucial
>> step before I can update the ghost values in order to implement the
>> periodic BC, or if there is another approach to achieve it. I am attaching
>> a small code to demonstrate the issue for your reference.
>>
>> Regards,
>> Shashwat
>>

