[petsc-users] Implementing periodicity using DMPlex

Stefano Zampini stefano.zampini at gmail.com
Mon Jun 15 04:24:33 CDT 2020


It is enough if you use DMPlexDistribute with 1 level of overlap, set the local section for your dofs, and call DMGlobalToLocal. The local vector will then contain data for what you call "ghost" cells. I call them "not-owned" cells.
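A minimal sketch of that sequence, assuming a cell-centered layout with one dof per cell; the function and variable names below are illustrative, not taken from the attached code:

  #include <petscdmplex.h>

  /* Sketch: distribute with one level of overlap, attach a cell-centered
     local section, and fill the not-owned cells through DMGlobalToLocal. */
  static PetscErrorCode SetupAndFill(DM *dm, Vec *locX)
  {
    DM             dmDist = NULL;
    PetscSection   s;
    Vec            X;
    PetscInt       cStart, cEnd, c;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    /* 1 level of overlap: each rank also stores its halo ("not-owned") cells */
    ierr = DMPlexDistribute(*dm, 1, NULL, &dmDist);CHKERRQ(ierr);
    if (dmDist) {ierr = DMDestroy(dm);CHKERRQ(ierr); *dm = dmDist;}

    /* Local section: one dof per cell (illustrative layout) */
    ierr = DMPlexGetHeightStratum(*dm, 0, &cStart, &cEnd);CHKERRQ(ierr);
    ierr = PetscSectionCreate(PetscObjectComm((PetscObject)*dm), &s);CHKERRQ(ierr);
    ierr = PetscSectionSetChart(s, cStart, cEnd);CHKERRQ(ierr);
    for (c = cStart; c < cEnd; ++c) {ierr = PetscSectionSetDof(s, c, 1);CHKERRQ(ierr);}
    ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
    ierr = DMSetLocalSection(*dm, s);CHKERRQ(ierr);
    ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);

    /* Set the owned values in the global vector, then scatter to the local
       vector; the local vector also covers the not-owned overlap cells.    */
    ierr = DMCreateGlobalVector(*dm, &X);CHKERRQ(ierr);
    /* ... fill X with initial values here ... */
    ierr = DMCreateLocalVector(*dm, locX);CHKERRQ(ierr);
    ierr = DMGlobalToLocalBegin(*dm, X, INSERT_VALUES, *locX);CHKERRQ(ierr);
    ierr = DMGlobalToLocalEnd(*dm, X, INSERT_VALUES, *locX);CHKERRQ(ierr);
    ierr = VecDestroy(&X);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }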

> On Jun 15, 2020, at 12:09 PM, Shashwat Tiwari <shaswat121994 at gmail.com> wrote:
> 
> The way I am trying to implement the periodic BC is as follows. When I loop over the boundary faces, say on the left boundary of the domain, to compute the flux and residual, I need the solution values from the two cells neighbouring the face, i.e. the left cell and the right cell (the face normal points from the left cell to the right cell, and for a boundary face the right cell is a boundary ghost cell). For the boundary to be periodic, the value I read from the right cell (the boundary ghost cell) must be the solution value at its periodic counterpart, i.e. the value in the left cell of the corresponding face on the right boundary of the domain. My question is: how do I update the value at a boundary ghost cell with the value of the real cell that is its periodic counterpart on the other side of the domain? Is there some kind of mapping from the boundary ghost cells to their corresponding real cells which I can use to update the solution values at the ghost cells?
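A rough sketch of the kind of face loop described above, assuming an interpolated 2D mesh, a DM dm with a cell-centered local section attached, and a local vector locX that already holds the not-owned/ghost values (all names illustrative):

  const PetscScalar *x;
  PetscInt           fStart, fEnd, f;
  PetscErrorCode     ierr;

  ierr = VecGetArrayRead(locX, &x);CHKERRQ(ierr);
  ierr = DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd);CHKERRQ(ierr); /* faces */
  for (f = fStart; f < fEnd; ++f) {
    const PetscInt    *cells;
    PetscInt           nsupp;
    const PetscScalar *uL, *uR = NULL;

    /* The support of a face holds its one or two neighbouring cells */
    ierr = DMPlexGetSupportSize(dm, f, &nsupp);CHKERRQ(ierr);
    ierr = DMPlexGetSupport(dm, f, &cells);CHKERRQ(ierr);
    ierr = DMPlexPointLocalRead(dm, cells[0], x, &uL);CHKERRQ(ierr);
    if (nsupp == 2) {ierr = DMPlexPointLocalRead(dm, cells[1], x, &uR);CHKERRQ(ierr);}
    /* ... evaluate the numerical flux from uL and uR and add it to the residual ... */
  }
  ierr = VecRestoreArrayRead(locX, &x);CHKERRQ(ierr);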
> 
> Regards,
> Shashwat
> 
> On Sun, Jun 14, 2020 at 5:11 AM Matthew Knepley <knepley at gmail.com> wrote:
> On Fri, Jun 12, 2020 at 3:19 PM Shashwat Tiwari <shaswat121994 at gmail.com> wrote:
> Hi, 
> I am writing a first-order 2D solver for unstructured grids with periodic boundaries using DMPlex. After generating the mesh, I use the "DMSetPeriodicity" function to set periodicity in both directions. After that, I partition the mesh (DMPlexDistribute), construct ghost cells (DMPlexConstructGhostCells),
> 
> These ghost cells are for FVM boundary conditions. If you want cells to be shared across parallel partitions, then you want to give overlap=1
> to DMPlexDistribute(). Is that what you want?
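A short sketch of the two different notions of "ghost" cell, assuming a variable dm holding the serial mesh (names illustrative): parallel halo cells come from the overlap argument of DMPlexDistribute, while FVM boundary ghost cells are appended by DMPlexConstructGhostCells.

  DM             dmDist = NULL, dmGhost = NULL;
  PetscErrorCode ierr;

  /* Parallel halo cells: give one level of overlap when distributing */
  ierr = DMPlexDistribute(dm, 1, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}

  /* FVM boundary ghost cells: extra cells appended outside the boundary
     faces; a NULL label name uses the default "Face Sets" label         */
  ierr = DMPlexConstructGhostCells(dm, NULL, NULL, &dmGhost);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  dm = dmGhost;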
> 
>   Thanks,
> 
>      Matt
>  
> create a section, and set some initial values in the global vector. Then I use "VecGhostUpdateBegin" to start updating the boundary ghost cell values, but I get the following error when I run on multiple processes:
> 
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: Vector is not ghosted
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> 
> If I run with a single process, there is no error, but the values remain empty (zero) and are not updated. Kindly let me know if I am missing some crucial step before I can update the ghost values in order to implement the periodic BC, or if there is another approach to achieve it. I am attaching a small code that demonstrates the issue for your reference.
> 
> Regards,
> Shashwat
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/


