[petsc-users] VecSetValues

Barry Smith bsmith at mcs.anl.gov
Tue Apr 3 07:40:29 CDT 2012


   I think you are misunderstanding the global ordering used for DA vectors in parallel. Consider using VecSetValuesLocal() or VecSetValuesStencil() instead of VecSetValues(). Read up on PETSc global ordering in the users manual and examples.

   Barry

On Apr 3, 2012, at 3:46 AM, khalid ashraf wrote:

> Hi Matt, I ran the program on two processors. I am trying to insert a value into a 
> vector associated with a DA. I am using VecSetValues() since I may need to assign nonlocal 
> values to the vector. However, I find that on 2 procs a general parallel vector works, but
> the DA vector indexing does not. Do you see any problem with the DA code?
> 
> Thanks.
> 
> Khalid    
> 
> THIS WORKS: 30 is written at index 5
> -------------------------------
> ppy=5; ppx=30;
> VecCreate(PETSC_COMM_WORLD,&testVec);
> VecSetSizes(testVec,PETSC_DECIDE,384);  /* sizes must be set before the type */
> VecSetFromOptions(testVec);
> 
> VecSetValues(testVec,1,&ppy,&ppx,INSERT_VALUES);
> VecAssemblyBegin(testVec);
> VecAssemblyEnd(testVec);
> =====================================
> 
> THIS DOES NOT WORK: puts 30 at index 9 instead of 5 when run on two procs.
> ---------------------------------
> ppy=5; ppx=30;
> ierr = DACreate3d(PETSC_COMM_WORLD,DA_YPERIODIC,DA_STENCIL_BOX,8,8,6,
>                     PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,1,1,PETSC_NULL,PETSC_NULL,PETSC_NULL,
>                     &appctx.da);CHKERRQ(ierr);
>  ierr = DACreateGlobalVector(appctx.da,&testVec);CHKERRQ(ierr);
> VecSetValues(testVec,1,&ppy,&ppx,INSERT_VALUES);
> VecAssemblyBegin(testVec);
> VecAssemblyEnd(testVec);
> 
> From: Matthew Knepley <knepley at gmail.com>
> To: khalid ashraf <khalid_eee at yahoo.com> 
> Cc: PETSc users list <petsc-users at mcs.anl.gov> 
> Sent: Monday, April 2, 2012 3:08 PM
> Subject: Re: [petsc-users] transfer vector data diagonally on DA
> 
> On Mon, Apr 2, 2012 at 4:53 PM, khalid ashraf <khalid_eee at yahoo.com> wrote:
> Thanks Matt. It works now. 
> 
> I have another problem. I am writing this program. 
> 
> 1) Always test on 2 procs once
> 
> 2) I have no idea what you actually want to do. However, it is really simple to just print out
>     the indices you are using in row[] and the values in bodyFx[] and check that VecSetValues()
>     is working as you expect.
> 
>    Matt
>  
>  
>  for (k=zs; k<zs+zm; k++) {
>   for (j=ys; j<ys+ym; j++) {
>   for (i=xs; i<xs+xm; i++) {
>  if ( i!=(mx-1) || j!=my-1 || k!=mz-1)
> {
> bodyFx[0]=1;
> bodyFx[1]=-0.3;
> bodyFx[2]=-0.3;
> bodyFx[3]=-0.3;
> row[0]=k*mx*my+j*mx+i;
> row[1]=k*mx*my+j*mx+i+1;
> row[2]=k*mx*my+(j+1)*mx+i;
> row[3]=(k+1)*mx*my+j*mx+i;
> VecSetValues(fx_test,4,row,bodyFx,ADD_VALUES);
> }
> }}}
> VecAssemblyBegin(fx_test);
> VecAssemblyEnd(fx_test);
> 
> Output: Print fx_test
> 
>  Here fx_test is a global vector. This program gives the correct output on 1 proc., but on 4
> processors it runs and produces wrong output.
> 
> I also tried
>  for (k=zs; k<zs+zm; k++) {
>   for (j=ys; j<ys+ym; j++) {
>   for (i=xs; i<xs+xm; i++) {
>  if ( i!=(mx-1) || j!=my-1 || k!=mz-1)
> {
> bodyFx[0]=1;
> bodyFx[1]=-0.3;
> bodyFx[2]=-0.3;
> bodyFx[3]=-0.3;
>  fx1_localptr[k][j][i+1]+=bodyFx[1];
>  fx1_localptr[k][j+1][i]+=bodyFx[2];
>  fx1_localptr[k+1][j][i]+=bodyFx[3];
>  fx1_localptr[k][j][i]+=bodyFx[0];
> }
> }}}
> Here, fx1_localptr is an array from a local vector that is communicated from the global vector fx_test.
> In this case as well, I get similar errors. 
> 
> Thanks.
> Khalid
> 
> Some final numbers of the vector (8*8*6 grid, 4 procs):
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> 1.0000000000000009e-01
> 1.0000000000000009e-01
> 1.0000000000000009e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> 0.0000000000000000e+00
> 4.0000000000000002e-01
> 1.0000000000000009e-01
> 1.0000000000000009e-01
> 1.0000000000000009e-01
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 1.0000000000000009e-01
> 1.0000000000000009e-01
> 1.0000000000000009e-01
> -2.9999999999999999e-01
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 
> 
> Some final numbers of the vector (8*8*6 grid, 1 proc):
> 0.0000000000000000e+00
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> 0.0000000000000000e+00
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> 0.0000000000000000e+00
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> -2.9999999999999999e-01
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 0.0000000000000000e+00
> 
> 
> From: Matthew Knepley <knepley at gmail.com>
> To: khalid ashraf <khalid_eee at yahoo.com>; PETSc users list <petsc-users at mcs.anl.gov> 
> Sent: Monday, April 2, 2012 5:41 AM
> Subject: Re: [petsc-users] transfer vector data diagonally on DA
> 
> On Mon, Apr 2, 2012 at 3:36 AM, khalid ashraf <khalid_eee at yahoo.com> wrote:
> Hi Jed,
> 
> I am using petsc/3.1, and the include file "petscdmda.h" does not work there, so I am using
> "petscda.h" and DACreate3d. 
> unew_localptr[][][] is from a global vector using VecGetArray
> u_localptr[][][] is from a local vector communicated from a global vector.
> I tried 
> unew_localptr[k][j][i]=u_localptr[k-1][j-1][i-1]
> but it gives Segmentation Violation error. 
> 
> I would guess (since you provide almost no information about what you are doing) that
> this is a point on the domain edge. If you truly mean to have a periodic domain, you must set that in the
> creation call. Then the local vector will also have ghost regions outside the domain boundary.
> 
>    Matt
>  
> I also tried unew_localptr[k][j][i]=u_localptr[k][j+1][i+1] which works but 
> unew_localptr[k][j][i]=u_localptr[k+1][j+1][i+1] or unew_localptr[k][j][i]=u_localptr[k][j-1][i-1]
> does not work. I am running the program on 4 processors.
> 
> Thanks.
> 
> 
>   ierr = DACreate3d(PETSC_COMM_WORLD,DA_YPERIODIC,DA_STENCIL_BOX,appctx.l,appctx.m,appctx.n,
>                     PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,1,1,PETSC_NULL,PETSC_NULL,PETSC_NULL,
>                     &appctx.da);CHKERRQ(ierr);
>  for (k=zs; k<zs+zm; k++) {
>   for (j=ys; j<ys+ym; j++) {
>   for (i=xs; i<xs+xm; i++) {
> 
>   if (i!=0 || j!=0 || k!=0 || i!=mx-1 || j!=my-1 || k!=mz-1)
> 
>   unew_localptr[k][j][i]=u_localptr[k+1][j+1][i+1];
> }}}
> From: Jed Brown <jedbrown at mcs.anl.gov>
> To: khalid ashraf <khalid_eee at yahoo.com>; PETSc users list <petsc-users at mcs.anl.gov> 
> Sent: Sunday, April 1, 2012 10:07 PM
> Subject: Re: [petsc-users] transfer vector data diagonally on DA
> 
> On Sun, Apr 1, 2012 at 22:01, khalid ashraf <khalid_eee at yahoo.com> wrote:
> I want to transfer vector data diagonally in the DA grid like 
>  for (k=zs; k<zs+zm; k++) {
>   for (j=ys; j<ys+ym; j++) {
>   for (i=xs; i<xs+xm; i++) {
> if(i!=mx-1 || j!=my-1 || k!=mz-1){
> u_new[k+1][j+1][i+1]=u[k][j][i];}
> }}}
> 
> Could you please suggest the best way to do it minimizing interprocessor assignments.
> 
> Both are on the same DMDA?
> 
> Communicate U to Ulocal (DMGlobalToLocalBegin/End) using a BOX stencil with width at least 1, get the global array u_new[][][] from UGlobalNew and the local arrays u[][][] from Ulocal, then assign u_new[k][j][i] = u[k-1][j-1][i-1].
> 
> 
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 
> 
> 
> 
> 
> 
> 


