<div dir="ltr">Thanks a lot for such a lucid explanation. The note which you mentioned at the end is very important for me, as my code contains loops going from both 1 to N and 0 to N+1.<div><br></div><div>Thanks,</div><div>Praveen</div><div>Research Scholar,</div><div>Computational Combustion Lab,</div><div>Dept. of Aerospace Engg.</div><div>IIT Madras</div></div><div class="gmail_extra"><br><div class="gmail_quote">On Mon, May 2, 2016 at 12:48 PM, Åsmund Ervik <span dir="ltr"><<a href="mailto:asmund.ervik@ntnu.no" target="_blank">asmund.ervik@ntnu.no</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hi Praveen,<br>

First of all: I'm cc-ing the petsc-users list, to preserve this
discussion also for others. And, you'll get better/quicker help by
asking the list. (Please reply also to the list if you have more questions.)

On 29 April 2016 12:15, praveen kumar wrote:
> Hi Åsmund,
>
> I am trying to implement PETSc in a serial Fortran code for domain
> decomposition. I've gone through your Van der Pol example. I'm confused
> about the subroutines petsc_to_local and local_to_petsc. Please correct me
> if I've misunderstood something.
>

Good that you've found dm/ex13f90 at least a little useful. I'll try to
clarify further.

> While explaining these subroutines in ex13f90aux.F90, you have mentioned
> that "Petsc gives you local arrays which are indexed using global
> coordinates".
> What are 'local arrays'? Do you mean the local vector that is derived
> from DMCreateLocalVector?

To answer your questions, first a brief recap of how PETSc stores data.
In PETSc terminology, there are local and global vectors that store your
fields. The only real difference between local and global is that the
local vectors also have ghost values set, which have been gathered from
other processors after you have done DMGlobalToLocalBegin/End. There is
also a difference in use, where global vectors are intended to be used
with solvers like KSP etc.

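For reference, a minimal sketch of the calls involved (variable names such
as da, gvec, lvec and ierr are placeholders, not taken from ex13f90, and the
usual PETSc Fortran include/use statements are assumed):

! Create a global and a local vector from an existing DMDA "da", then
! gather ghost values from the neighbouring processes into the local vector.
call DMCreateGlobalVector(da, gvec, ierr)
call DMCreateLocalVector(da, lvec, ierr)
! ... fill gvec, or obtain it from a solver such as KSP ...
call DMGlobalToLocalBegin(da, gvec, INSERT_VALUES, lvec, ierr)
call DMGlobalToLocalEnd(da, gvec, INSERT_VALUES, lvec, ierr)
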
The vectors are of a special data structure that is hard to work with
manually. Therefore we use the DMDAVecGetArray command, which gives us
an array that has the same data as the vector. The array coming from the
local vector is what I call the "local array". The point of the
petsc_to_local and local_to_petsc subroutines, and the point of the
sentence you quoted, is that when PETSc gives you this array in a
program running in parallel with MPI, the array has different indexing
on each process.

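In Fortran the relevant call is DMDAVecGetArrayF90. A rough sketch for a 1D
DMDA with one degree of freedom (arr is just an illustrative name, with the
same placeholder da/lvec/ierr as above):

PetscScalar, pointer :: arr(:)
! arr becomes a view of the local vector's data, indexed with *global*
! grid indices for the owned points, plus the ghost entries.
call DMDAVecGetArrayF90(da, lvec, arr, ierr)
! ... read/write arr(i) here ...
call DMDAVecRestoreArrayF90(da, lvec, arr, ierr)
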
Let's think about 1D to keep it simple, and say we have 80 grid points
in total distributed across 4 processes. On the first process the array
is indexed from 0 to 19, on the second process it is indexed from
20 to 39, the third process has 40 to 59, and the fourth process has the
indices from 60 to 79. In addition, in the local array, the first
process will also have the values at indices 20, 21, etc. (up to the
stencil width you have specified) that belong to the second process,
after you have done DMGlobalToLocalBegin/End, but it cannot change these
values. It can only use these values in computations, for instance when
computing the value at index 19.

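If it helps to see this concretely: each process can query the range it owns
with DMDAGetCorners. A sketch for this 1D example (variable names again
illustrative) might be:

PetscInt :: xs, ys, zs, xm, ym, zm, i
! On 4 processes with 80 points, (xs, xm) is (0,20), (20,20), (40,20), (60,20).
call DMDAGetCorners(da, xs, ys, zs, xm, ym, zm, ierr)
do i = xs, xs + xm - 1
   ! Work on the owned points in global indices; ghost values such as
   ! arr(xs-1) or arr(xs+xm) may be read here, but should not be changed.
end do
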
The petsc_to_local subroutine changes this numbering system, such that
on all processors the array is indexed from 1 to 20. This makes it
easier to use with an existing serial Fortran code, which typically does
all loops from 1 to N (and 1 is hard-coded). The local array then has
the correct ghost values below 1 and above 20, unless the array is next
to the global domain edge(s), where you must set the correct boundary
conditions.

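The renumbering relies on a standard Fortran feature: an assumed-shape dummy
argument can be declared with any lower bound you like. A simplified 1D,
one-component sketch of the idea (not the actual code in ex13f90aux.F90,
which works on 4D arrays; stw is the stencil width, nx the number of owned
points, and the subroutine is assumed to live in a module so its interface
is explicit):

subroutine work_on_local_array(stw, nx, f)
   PetscInt, intent(in) :: stw, nx
   ! Lower bound 1-stw: interior points run from 1 to nx, ghosts lie outside.
   PetscReal, intent(inout), dimension(1-stw:) :: f
   PetscInt :: i
   do i = 1, nx
      f(i) = 0.5d0*(f(i-1) + f(i+1))   ! ghost values usable below 1 / above nx
   end do
end subroutine work_on_local_array
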
> As far as I know, if petsc_to_local is called before a DO loop going
> from i = 0, nx, then nx becomes local.

The petsc_to_local subroutine does not change the value of nx. This
value comes from DMDAGetCorners, which is called on line 87 in
ex13f90.F90. The petsc_to_local subroutine only calls DMDAVecGetArrayF90
to go from vector to array, and then changes the array indexing.

Note also that petsc_to_local assumes you want the indices to start at 1
(as is standard in Fortran). If you want them to start at 0, you must
change the array definitions for "f" and "array" in the subroutines
transform_petsc_us and transform_us_petsc,
from
PetscReal, intent(inout), dimension(:,1-stw:,1-stw:,1-stw:) :: f
to
PetscReal, intent(inout), dimension(:,0-stw:,0-stw:,0-stw:) :: f

Hope that helps,
Åsmund

>
> Thanks,
> Praveen
> Research Scholar,
> Computational Combustion Lab,
> Dept. of Aerospace Engg.
> IIT Madras
>


--
B. Praveen Kumar
Research Scholar,
Computational Combustion Lab,
Dept. of Aerospace Engg.
IIT Madras