[petsc-users] virtual nodes at processes boundary
praveen kumar
praveenpetsc at gmail.com
Fri May 6 09:30:16 CDT 2016
Thanks Matt, thanks Barry. I'll get back to you.
Thanks,
Praveen
On Fri, May 6, 2016 at 7:48 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> > On May 6, 2016, at 5:08 AM, praveen kumar <praveenpetsc at gmail.com> wrote:
> >
> > Hi,
> >
> > I am trying to use PETSc for domain decomposition in a serial Fortran
> > FVM code, and I want to keep the solver from the serial code itself: a
> > Gauss-Seidel + TDMA solver. The boundary conditions are applied inside
> > the solver at virtual boundary nodes, e.g. CALL TDMA(0,nx+1), where the
> > BCs are set at nodes 0 and nx+1, virtual nodes that take no part in the
> > computation. I partitioned the domain using DMDACreate and obtained the
> > ghost-node information with DMDAGetCorners. How do I create the virtual
> > nodes at the process boundaries where the BCs are to be applied? Please
> > suggest all the possibilities other than using PETSc to parallelize the
> > solver itself.
>
> DMCreateGlobalVector(dm,gvector,ierr);
> DMCreateLocalVector(dm,lvector,ierr);
>
> /* fill gvector with the initial guess or whatever */
>
> DMGlobalToLocalBegin(dm,gvector,INSERT_VALUES,lvector,ierr)
> DMGlobalToLocalEnd(dm,gvector,INSERT_VALUES,lvector,ierr)
>
> Now the vector lvector has the ghost values. You can use
>
>     DMDAVecGetArrayF90(dm,lvector,fortran_array_pointer,ierr)
>
> where the pointer has the correct dimension for your problem (1, 2, or 3d).
>
> Note that the indexing into the fortran_array_pointer uses the global
> indexing, not the local indexing. You can use DMDAGetCorners() to get the
> start and end indices for each process.
>
> Barry
>
>
>
> >
> > Thanks,
> > Praveen
> >
>
>
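For reference, below is a minimal, untested Fortran sketch of the sequence
Barry outlines, assuming PETSc 3.8 or later with the Fortran modules. The
DM_BOUNDARY_GHOSTED boundary type, the grid size nx = 100, and the empty sweep
body are illustrative assumptions, not taken from the thread: with
DM_BOUNDARY_GHOSTED, a 1-D DMDA of nx interior nodes (numbered 0 .. nx-1) gets
extra local slots at global indices -1 and nx, which play the role of the
serial code's virtual nodes 0 and nx+1.

! Untested sketch: 1-D DMDA with ghosted physical boundaries so that the
! local vector has room for the boundary virtual nodes.
program ghosted_dmda
#include <petsc/finclude/petscdmda.h>
  use petscdmda
  implicit none

  DM                   :: da
  Vec                  :: gvec, lvec
  PetscScalar, pointer :: u(:)
  PetscScalar          :: zero
  PetscInt             :: nx, one, xs, xm, i
  PetscErrorCode       :: ierr

  call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

  nx   = 100       ! hypothetical number of interior nodes
  one  = 1         ! 1 dof per node, stencil width 1
  zero = 0.0
  call DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_GHOSTED, nx, one, one, &
                    PETSC_NULL_INTEGER, da, ierr)
  call DMSetUp(da, ierr)

  call DMCreateGlobalVector(da, gvec, ierr)
  call DMCreateLocalVector(da, lvec, ierr)

  ! fill the global vector with the initial guess, then scatter it to the
  ! local vector so each process also gets its ghost values
  call VecSet(gvec, zero, ierr)
  call DMGlobalToLocalBegin(da, gvec, INSERT_VALUES, lvec, ierr)
  call DMGlobalToLocalEnd(da, gvec, INSERT_VALUES, lvec, ierr)

  ! global indexing: this process owns nodes xs .. xs+xm-1
  call DMDAGetCorners(da, xs, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, &
                      xm, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, ierr)

  call DMDAVecGetArrayF90(da, lvec, u, ierr)
  ! u can also be read at xs-1 and xs+xm (ghost or boundary virtual nodes),
  ! which is where the serial solver would apply its boundary conditions
  do i = xs, xs + xm - 1
     u(i) = u(i)           ! local Gauss-Seidel/TDMA sweep would go here
  end do
  call DMDAVecRestoreArrayF90(da, lvec, u, ierr)

  ! push the updated owned values back to the global vector
  call DMLocalToGlobalBegin(da, lvec, INSERT_VALUES, gvec, ierr)
  call DMLocalToGlobalEnd(da, lvec, INSERT_VALUES, gvec, ierr)

  call VecDestroy(lvec, ierr)
  call VecDestroy(gvec, ierr)
  call DMDestroy(da, ierr)
  call PetscFinalize(ierr)
end program ghosted_dmda

With this layout each process can run the unmodified serial sweep over its
owned range xs .. xs+xm-1, reading the boundary or ghost values at xs-1 and
xs+xm, and then scatter the result back with DMLocalToGlobalBegin/End.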