[petsc-users] transfer vector data diagonally on DA
khalid ashraf
khalid_eee at yahoo.com
Mon Apr 2 16:53:08 CDT 2012
Thanks Matt. It works now.
I have another problem with the following program:
for (k=zs; k<zs+zm; k++) {
  for (j=ys; j<ys+ym; j++) {
    for (i=xs; i<xs+xm; i++) {
      if (i!=mx-1 || j!=my-1 || k!=mz-1) {
        bodyFx[0]=1;
        bodyFx[1]=-0.3;
        bodyFx[2]=-0.3;
        bodyFx[3]=-0.3;
        row[0]=k*mx*my+j*mx+i;        /* (i,j,k)   */
        row[1]=k*mx*my+j*mx+i+1;      /* (i+1,j,k) */
        row[2]=k*mx*my+(j+1)*mx+i;    /* (i,j+1,k) */
        row[3]=(k+1)*mx*my+j*mx+i;    /* (i,j,k+1) */
        VecSetValues(fx_test,4,row,bodyFx,ADD_VALUES);
      }
    }
  }
}
VecAssemblyBegin(fx_test);
VecAssemblyEnd(fx_test);
Then I print fx_test.
Here fx_test is a global vector. This program gives the correct output on 1 processor, but on 4 processors
the program runs and the output is wrong.
I also tried
for (k=zs; k<zs+zm; k++) {
  for (j=ys; j<ys+ym; j++) {
    for (i=xs; i<xs+xm; i++) {
      if (i!=mx-1 || j!=my-1 || k!=mz-1) {
        bodyFx[0]=1;
        bodyFx[1]=-0.3;
        bodyFx[2]=-0.3;
        bodyFx[3]=-0.3;
        fx1_localptr[k][j][i+1]+=bodyFx[1]; /* was bodyFx[4], out of bounds for a 4-element array */
        fx1_localptr[k][j+1][i]+=bodyFx[2];
        fx1_localptr[k+1][j][i]+=bodyFx[3];
        fx1_localptr[k][j][i]+=bodyFx[0];
      }
    }
  }
}
Here, fx1_localptr is an array from a local vector that is communicated from the global vector fx_test.
In this case as well, I get similar errors.
Thanks.
Khalid
Some final entries of the vector on an 8*8*6 grid, 4 processors:
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
1.0000000000000009e-01
1.0000000000000009e-01
1.0000000000000009e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
0.0000000000000000e+00
4.0000000000000002e-01
1.0000000000000009e-01
1.0000000000000009e-01
1.0000000000000009e-01
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
1.0000000000000009e-01
1.0000000000000009e-01
1.0000000000000009e-01
-2.9999999999999999e-01
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
Some final entries of the vector on an 8*8*6 grid, 1 processor:
0.0000000000000000e+00
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
0.0000000000000000e+00
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
0.0000000000000000e+00
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
-2.9999999999999999e-01
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
0.0000000000000000e+00
________________________________
From: Matthew Knepley <knepley at gmail.com>
To: khalid ashraf <khalid_eee at yahoo.com>; PETSc users list <petsc-users at mcs.anl.gov>
Sent: Monday, April 2, 2012 5:41 AM
Subject: Re: [petsc-users] transfer vector data diagonally on DA
On Mon, Apr 2, 2012 at 3:36 AM, khalid ashraf <khalid_eee at yahoo.com> wrote:
Hi Jed,
>
>
>I am using petsc/3.1 and the include file "petscdmda.h" is not working. I am using the "petscda.h"
>and DACreate3D.
>unew_localptr[][][] is from a global vector using VecGetArray
>u_localptr[][][] is from a local vector communicated from a global vector.
>I tried
>unew_localptr[k][j][i]=u_localptr[k-1][j-1][i-1]
>
>but it gives Segmentation Violation error.
I would guess (because you provide almost no information about what you are doing) that
this is a domain on the edge. If you truly mean to have a periodic domain, you must set that in the
creation call. Then the local vector will also ghost regions outside the domain boundary.
Matt
>I also tried unew_localptr[k][j][i]=u_localptr[k][j+1][i+1], which works, but
>unew_localptr[k][j][i]=u_localptr[k+1][j+1][i+1] or unew_localptr[k][j][i]=u_localptr[k][j-1][i-1]
>does not work. I am running the program on 4 processors.
>
>
>
>Thanks.
>
>
>
>
> ierr = DACreate3d(PETSC_COMM_WORLD,DA_YPERIODIC,DA_STENCIL_BOX,appctx.l,appctx.m,appctx.n,
> PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,1,1,PETSC_NULL,PETSC_NULL,PETSC_NULL,
> &appctx.da);CHKERRQ(ierr);
> for (k=zs; k<zs+zm; k++) {
> for (j=ys; j<ys+ym; j++) {
> for (i=xs; i<xs+xm; i++) {
>
>
> if (i!=0 || j!=0 || k!=0 || i!=mx-1 || j!=my-1 || k!=mz-1)
>
>
> unew_localptr[k][j][i]=u_localptr[k+1][j+1][i+1];
>
>}}}
>
>________________________________
>
>From: Jed Brown <jedbrown at mcs.anl.gov>
>To: khalid ashraf <khalid_eee at yahoo.com>; PETSc users list <petsc-users at mcs.anl.gov>
>Sent: Sunday, April 1, 2012 10:07 PM
>Subject: Re: [petsc-users] transfer vector data diagonally on DA
>
>
>
>On Sun, Apr 1, 2012 at 22:01, khalid ashraf <khalid_eee at yahoo.com> wrote:
>
>I want to transfer vector data diagonally in the DA grid like
>> for (k=zs; k<zs+zm; k++) {
>> for (j=ys; j<ys+ym; j++) {
>> for (i=xs; i<xs+xm; i++) {
>>if(i!=mx-1 || j!=my-1 || k!=mz-1){
>>u_new[k+1][j+1][i+1]=u[k][j][i];}
>>}}}
>>
>>
>>Could you please suggest the best way to do it minimizing interprocessor assignments.
>
>Both are on the same DMDA?
>
>
>Communicate U to Ulocal (DMGlobalToLocalBegin/End) using a BOX stencil with width at least 1, get the global array u_new[][][] from UGlobalNew and the local arrays u[][][] from Ulocal, then assign u_new[k][j][i] = u[k-1][j-1][i-1].
>
>
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener