[petsc-users] PETSc DM ex13f90

praveen kumar praveen1bharadwaj at gmail.com
Mon May 2 08:38:06 CDT 2016


Thanks a lot for such a lucid explanation. The note you mentioned at
the end is very important for me, as my code contains loops going both
from 1 to N and from 0 to N+1.

Thanks,
Praveen
Research Scholar,
Computational Combustion Lab,
Dept. of Aerospace Engg.
IIT Madras

On Mon, May 2, 2016 at 12:48 PM, Åsmund Ervik <asmund.ervik at ntnu.no> wrote:

> Hi Praveen,
>
> First of all: I'm cc-ing the petsc-users list, so that this discussion
> is also preserved for others. You'll also get better/quicker help by
> asking the list. (Please reply to the list as well if you have more
> questions.)
>
> On 29. april 2016 12:15, praveen kumar wrote:
> > Hi Asmund,
> >
> > I am trying to use PETSc for domain decomposition in a serial Fortran
> > code. I've gone through your Van der Pol example. I'm confused about
> > the subroutines petsc_to_local and local_to_petsc. Please correct me if
> > I've misunderstood something.
> >
>
> Good that you've found dm/ex13f90 at least a little useful. I'll try to
> clarify further.
>
> > While explaining these subroutines in ex13f90aux.F90, you have mentioned
> > that "Petsc gives you local arrays which are indexed using global
> > coordinates".
> > What are 'local arrays'? Do you mean the local vector which is created
> > by DMCreateLocalVector?
>
> To answer your questions, here is a brief recap of how PETSc stores
> your data. In PETSc terminology there are local and global vectors
> that store your fields. The only real difference between local and
> global is that the local vectors also have the ghost values set,
> gathered from the other processes after you have done
> DMGlobalToLocalBegin/End. There is also a difference in use: the
> global vectors are the ones intended for solvers like KSP etc.
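>
> For concreteness, here is a minimal sketch of that flow (the vector
> names are placeholders, and I assume the DMDA "da" and the usual PETSc
> includes from ex13f90.F90 are already set up):
>
>   Vec            u_global, u_local
>   PetscErrorCode ierr
>
>   call DMCreateGlobalVector(da, u_global, ierr)  ! owned points only
>   call DMCreateLocalVector(da, u_local, ierr)    ! owned points + ghosts
>   ! ... fill u_global, e.g. with an initial condition or a KSP solution ...
>   ! Gather the ghost values from the neighbouring processes:
>   call DMGlobalToLocalBegin(da, u_global, INSERT_VALUES, u_local, ierr)
>   call DMGlobalToLocalEnd(da, u_global, INSERT_VALUES, u_local, ierr)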
>
> The vectors are stored in an opaque data structure that is hard to
> work with directly. Therefore we use DMDAVecGetArray (called as
> DMDAVecGetArrayF90 from Fortran), which gives us an array that views
> the same data as the vector. The array coming from the local vector
> is what I call the "local array". The point of the petsc_to_local and
> local_to_petsc subroutines, and of the sentence you quoted, is that
> when PETSc gives you this array in a program running in parallel with
> MPI, the array has different indexing on each process.
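>
> In code this looks roughly like the following (again a sketch with
> placeholder names; the pointer rank matches the 3D, multi-component
> DMDA used in ex13f90):
>
>   PetscScalar, pointer :: arr(:,:,:,:)   ! (component, i, j, k)
>
>   call DMDAVecGetArrayF90(da, u_local, arr, ierr)
>   ! ... read/write arr here; the spatial indices are global ...
>   call DMDAVecRestoreArrayF90(da, u_local, arr, ierr)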
>
> Let's think about 1D to keep it simple, and say we have 80 grid points
> in total distributed across 4 processes. On the first process the
> array is indexed from 0 to 19, on the second process from 20 to 39,
> on the third from 40 to 59, and on the fourth from 60 to 79. In
> addition, the local array on the first process will also hold the
> values at index 20, 21, etc. (up to the stencil width you have
> specified) that belong to the second process, once you have done
> DMGlobalToLocalBegin/End, but it cannot change these values. It can
> only use them in computations, for instance when computing the value
> at index 19.
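>
> As a sketch, with the owned range from DMDAGetCorners (placeholder
> names again; pretend da is now the 1D DMDA from this example, with one
> degree of freedom, stencil width 1 and, to keep the end points simple,
> periodic boundaries; du_global is a second global vector created like
> u_global above):
>
>   PetscInt             xs, xm, i
>   PetscReal            h
>   PetscScalar, pointer :: u(:), dudx(:)
>
>   h = 1.0d0/80.0d0
>   ! On the second of the 4 processes this returns xs=20, xm=20
>   call DMDAGetCorners(da, xs, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, &
>                       xm, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, ierr)
>   call DMDAVecGetArrayF90(da, u_local,   u,    ierr)  ! owned + ghost values
>   call DMDAVecGetArrayF90(da, du_global, dudx, ierr)  ! owned values only
>   do i = xs, xs+xm-1
>      ! u(i-1) and u(i+1) may be ghost values owned by a neighbour;
>      ! we may read them, but we only write to our owned range.
>      dudx(i) = (u(i+1) - u(i-1)) / (2.0d0*h)
>   end do
>   call DMDAVecRestoreArrayF90(da, du_global, dudx, ierr)
>   call DMDAVecRestoreArrayF90(da, u_local,   u,    ierr)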
>
> The petsc_to_local subroutine changes this numbering, such that on
> every process the array is indexed from 1 to 20. This makes it easier
> to use with an existing serial Fortran code, which typically does all
> loops from 1 to N (with 1 hard-coded). The local array then has the
> correct ghost values below 1 and above 20, except where the subdomain
> touches the global domain edge(s); there you must set the correct
> boundary conditions yourself.
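>
> The trick behind this is plain Fortran rather than PETSc: a dummy
> argument declared with an explicit lower bound re-bases the indices of
> the actual argument. A tiny standalone illustration (not taken from
> the example):
>
>   program rebase_demo
>     implicit none
>     integer, parameter :: stw = 1   ! stencil width
>     real :: a(22)                   ! 20 "owned" points plus one ghost per side
>     a = 1.0
>     call work(a, stw, 20)
>   contains
>     subroutine work(f, stw, n)
>       integer, intent(in) :: stw, n
>       ! f is indexed from 1-stw upwards; here that means 0..21
>       real, intent(inout), dimension(1-stw:) :: f
>       integer :: i
>       do i = 1, n   ! an existing serial loop "do i = 1, N" works unchanged
>          f(i) = 0.5*(f(i-1) + f(i+1))
>       end do
>     end subroutine work
>   end program rebase_demo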
>
> > As far as I know, if petsc_to_local is called before a DO loop going
> > from i=0, nx; then nx becomes local.
>
> The petsc_to_local subroutine does not change the value of nx. This
> value comes from DMDAGetCorners, which is called on line 87 in
> ex13f90.F90. The petsc_to_local subroutine only calls DMDAVecGetArrayF90
> to go from vector to array, and then changes the array indexing.
>
> Note also that petsc_to_local assumes you want the indices to start at 1
> (as is standard in Fortran). If you want them to start at 0, you must
> change the array definitions for "f" and "array" in the subroutines
> transform_petsc_us and transform_us_petsc,
> from
>   PetscReal,intent(inout),dimension(:,1-stw:,1-stw:,1-stw:) :: f
> to
>   PetscReal,intent(inout),dimension(:,0-stw:,0-stw:,0-stw:) :: f
>
> Hope that helps,
> Åsmund
>
> >
> > Thanks,
> > Praveen
> > Research Scholar,
> > Computational Combustion Lab,
> > Dept. of Aerospace Engg.
> > IIT Madras
> >
>
>


-- 
B. Praveen Kumar
Research Scholar,
Computational Combustion Lab,
Dept. of Aerospace Engg.
IIT Madras