[petsc-users] Updating TS solution outside PETSc

Smith, Barry F. bsmith at mcs.anl.gov
Thu Dec 5 10:16:31 CST 2019


   Are you using a cell-centered or vertex-centered discretization (it makes a slight difference)?

   Our model is to use the DM_BOUNDARY_MIRROR DMBoundaryType. This means that u_first_real_grid_point - u_its_ghost_point = 0 (since DMGlobalToLocal will automatically put the appropriate mirror values into the physical ghost locations), so u_n is zero along the edge; that is, zero Neumann conditions. For non-zero Neumann conditions you need to put something into the "local rhs" to represent the prescribed flux. I'm not sure this is clear, but think about it in one dimension for the non-zero Neumann case.
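   To make that concrete, here is a rough, untested sketch (the struct and function names are just illustrative, not anything in PETSc) of a 2d heat equation u_t = u_xx + u_yy stepped with TSRK on a DMDA created with DM_BOUNDARY_MIRROR. Because DMGlobalToLocal fills the physical ghost points with the mirrored values, the same interior stencil is applied at every owned point with no special boundary code; zero Neumann is automatic, and non-zero Neumann would only add a term to the local rhs at the boundary points.

#include <petscts.h>
#include <petscdmda.h>

/* illustrative context struct; not part of any PETSc API */
typedef struct {
  PetscReal hx, hy;   /* grid spacings */
} AppCtx;

/* rhs of u_t = u_xx + u_yy; the same 5-point stencil is used at every owned
   point because DM_BOUNDARY_MIRROR provides usable ghost values on the
   physical boundary */
static PetscErrorCode RHSFunction(TS ts, PetscReal t, Vec U, Vec F, void *ctx)
{
  AppCtx             *user = (AppCtx *)ctx;
  DM                  da;
  Vec                 Uloc;
  const PetscScalar **u;
  PetscScalar       **f;
  PetscInt            i, j, xs, ys, xm, ym;
  PetscErrorCode      ierr;

  PetscFunctionBeginUser;
  ierr = TSGetDM(ts, &da);CHKERRQ(ierr);
  ierr = DMGetLocalVector(da, &Uloc);CHKERRQ(ierr);
  /* this fills the ghost points, including the mirrored values on the edges */
  ierr = DMGlobalToLocalBegin(da, U, INSERT_VALUES, Uloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(da, U, INSERT_VALUES, Uloc);CHKERRQ(ierr);
  ierr = DMDAVecGetArrayRead(da, Uloc, &u);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da, F, &f);CHKERRQ(ierr);
  ierr = DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);CHKERRQ(ierr);
  for (j = ys; j < ys + ym; j++) {
    for (i = xs; i < xs + xm; i++) {
      f[j][i] = (u[j][i-1] - 2.0*u[j][i] + u[j][i+1])/(user->hx*user->hx)
              + (u[j-1][i] - 2.0*u[j][i] + u[j+1][i])/(user->hy*user->hy);
      /* for NON-zero Neumann data g, add its contribution here on the boundary
         rows/columns, roughly 2*g/h; the exact factor depends on cell- vs
         vertex-centered layout, so check it in 1d first */
    }
  }
  ierr = DMDAVecRestoreArrayRead(da, Uloc, &u);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(da, F, &f);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(da, &Uloc);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

int main(int argc, char **argv)
{
  TS             ts;
  DM             da;
  Vec            u;
  AppCtx         user;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* the mirror boundary type is what makes DMGlobalToLocal produce the ghost
     values used above */
  ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_MIRROR, DM_BOUNDARY_MIRROR,
                      DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetFromOptions(da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);
  user.hx = 1.0/63.0;  user.hy = 1.0/63.0;  /* unit square, vertex-centered spacing */

  ierr = DMCreateGlobalVector(da, &u);CHKERRQ(ierr);
  ierr = VecSet(u, 1.0);CHKERRQ(ierr);       /* placeholder initial condition */

  ierr = TSCreate(PETSC_COMM_WORLD, &ts);CHKERRQ(ierr);
  ierr = TSSetDM(ts, da);CHKERRQ(ierr);
  ierr = TSSetType(ts, TSRK);CHKERRQ(ierr);
  ierr = TSSetRHSFunction(ts, NULL, RHSFunction, &user);CHKERRQ(ierr);
  ierr = TSSetTimeStep(ts, 1.e-5);CHKERRQ(ierr);
  ierr = TSSetMaxTime(ts, 0.1);CHKERRQ(ierr);
  ierr = TSSetExactFinalTime(ts, TS_EXACTFINALTIME_STEPOVER);CHKERRQ(ierr);
  ierr = TSSetFromOptions(ts);CHKERRQ(ierr);
  ierr = TSSolve(ts, u);CHKERRQ(ierr);

  ierr = VecDestroy(&u);CHKERRQ(ierr);
  ierr = TSDestroy(&ts);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

   In one dimension (thinking vertex-centered) the mirrored ghost makes the centered difference at the boundary node enforce u_x = 0; for u_x = g the exact ghost value differs from the mirrored one by 2*h*g, which is where an extra rhs term of roughly 2*g/h at the boundary point comes from. Verify the exact factor and sign for your cell-centered vs vertex-centered layout.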

   Bad news: this is not yet implemented for 3d.

   If you are using 3d we should fix this for you (or you could fix it and make an MR, because we should have this support).

   If you have a 2d version of your code, I would test your model etc. with that first, and then let us know and request the 3d mirror so we can get it written.

   Others may have alternative advice for Neumann with DMDA,

   Barry


> On Dec 5, 2019, at 10:00 AM, Ellen M. Price <ellen.price at cfa.harvard.edu> wrote:
> 
> Hi PETSc users,
> 
> I am working with a code that solves a set of PDEs on a rectangular
> domain with Neumann boundary conditions. My understanding of
> implementing the boundary condition is that I should set the boundary
> value to be the value that makes the finite difference derivative go to
> zero (or some other prescribed value) on that boundary.
> 
> I was attempting to update the solution from TS using a pre- or
> post-step/stage function and TSGetSolution, but this does not appear to
> work as expected. What would be the correct way to prescribe the
> boundary condition, given that I'm using TSRK for timestepping and a
> DMDA for discretization?
> 
> Looking forward to any help!
> 
> Ellen Price


