[petsc-users] How to access Vector data properly
Chu Songtao
st.chu at outlook.com
Mon Apr 9 10:19:54 CDT 2018
Hong,
I added the PetscSynchronizedFlush() call at the end, but the results did not change.
Then I tried something else: filling global directly with DMDAVecGetArray()/DMDAVecRestoreArray() instead of going through DMDANaturalToGlobal(). Now the results are reversed: VecView() prints zeros while PetscSynchronizedPrintf() prints the correct values. Strange.
ierr = DMDACreate1d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,5,1,1,NULL,&da);CHKERRQ(ierr);
ierr = DMSetUp(da);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(da,&global);CHKERRQ(ierr);
ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "Rank=%d\n", rank);CHKERRQ(ierr);
for (i = x; i < x + xm; ++i) {
  val[i] = i;
}
ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);
VecView(global,PETSC_VIEWER_STDOUT_WORLD);
ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "Rank=%d\n", rank);CHKERRQ(ierr);
for (i = x; i < x + xm; ++i) {
  ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "%4.f ", val[i]);CHKERRQ(ierr);
}
ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);
PetscSynchronizedFlush(PETSC_COMM_SELF, PETSC_STDOUT);
$ mpiexec -n 2 ./test0
Rank=0
Rank=1
Vec Object: 2 MPI processes
type: mpi
Vec Object: Vec_0x7fffd6629a90_0 2 MPI processes
type: mpi
Rank=1
3 4 Process [0]
0.
0.
0.
Process [1]
0.
0.
Rank=0
0 1 2
________________________________
Songtao:
You may need to add PetscSynchronizedFlush() at the end. See petsc/src/dm/examples/tests/ex43.c
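Roughly, the usage is that all the synchronized prints and the final PetscSynchronizedFlush() are made on the same communicator (PETSC_COMM_WORLD if you want output gathered from all ranks). A minimal sketch along those lines (just a sketch, assuming the same 1-D DMDA with dof=1 as in your code):

#include <petscdmda.h>

int main(int argc,char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank;
  DM             da;
  Vec            global;
  PetscScalar    *val;
  PetscInt       i,x,xm;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
  ierr = DMDACreate1d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,5,1,1,NULL,&da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(da,&global);CHKERRQ(ierr);

  /* fill the locally owned part of the global vector; DMDAVecGetArray() is indexed by the global i */
  ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da,global,&val);CHKERRQ(ierr);
  for (i = x; i < x + xm; ++i) val[i] = i;
  ierr = DMDAVecRestoreArray(da,global,&val);CHKERRQ(ierr);

  ierr = VecView(global,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  /* every print and the final flush use the same communicator, PETSC_COMM_WORLD */
  ierr = DMDAVecGetArray(da,global,&val);CHKERRQ(ierr);
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,"Rank=%d:",rank);CHKERRQ(ierr);
  for (i = x; i < x + xm; ++i) {
    ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD," %4.f",(double)PetscRealPart(val[i]));CHKERRQ(ierr);
  }
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,"\n");CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(da,global,&val);CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD,PETSC_STDOUT);CHKERRQ(ierr);

  ierr = VecDestroy(&global);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}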
Hong
Hello,
I am just a beginner, and I am confused about how to use Vec and DMDA correctly.
As a test, I fill a 1D DMDA global vector with its natural-ordering indices, then view the data both with VecView() and through DMDAVecGetArray(), but the two results differ and I don't know why.
This is the code:
PetscMPIInt rank;
PetscErrorCode ierr;
Vec global,local,natural;
DM da;
PetscReal *xnatural,*val;
PetscInt i,start,end,x,xm;
ierr = PetscInitialize(&argc,&argv,(char*)0,help);if (ierr) return ierr;
ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
ierr = DMDACreate1d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,5,1,1,NULL,&da);CHKERRQ(ierr);
ierr = DMSetUp(da);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(da,&global);CHKERRQ(ierr);
ierr = DMDACreateNaturalVector(da,&natural);CHKERRQ(ierr);
ierr = VecGetOwnershipRange(natural,&start,&end);CHKERRQ(ierr);
ierr = VecGetArray(natural,&xnatural);CHKERRQ(ierr);
for (i=start; i<end; i++) {
  xnatural[i-start] = i;
}
ierr = VecRestoreArray(natural,&xnatural);CHKERRQ(ierr);
ierr = DMDANaturalToGlobalBegin(da,natural,INSERT_VALUES,global);CHKERRQ(ierr);
ierr = DMDANaturalToGlobalEnd(da,natural,INSERT_VALUES,global);CHKERRQ(ierr);
ierr = VecDestroy(&natural);CHKERRQ(ierr);
VecView(global,PETSC_VIEWER_STDOUT_WORLD);
ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "Rank=%d\n", rank);CHKERRQ(ierr);
for (i = x; i < x + xm; ++i) {
  ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "%4.f ", val[i]);CHKERRQ(ierr);
}
ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);
And the results are:
$ ./test0
Vec Object: 1 MPI processes
type: seq
Vec Object: Vec_0x7ffff3cbc500_0 1 MPI processes
type: mpi
Process [0]
0.
1.
2.
3.
4.
Rank=0
0 1 2 3 4
$ mpiexec -n 2 ./test0
Vec Object: 2 MPI processes
type: mpi
Vec Object: Vec_0x7fffcf948a90_0 2 MPI processes
type: mpi
Process [0]
0.
1.
2.
Process [1]
3.
4.
Rank=0
0 0 0 Rank=1
0 0
The sequential run is correct, but the MPI run is wrong: every element of val prints as 0.
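To separate what is in the vector from how it is printed, the same locally owned entries could also be read straight from the raw array over the ownership range; a rough sketch (lo, hi, k and g here are new locals, not from the code above):

const PetscScalar *g;
PetscInt          lo,hi,k;

ierr = VecGetOwnershipRange(global,&lo,&hi);CHKERRQ(ierr);
ierr = VecGetArrayRead(global,&g);CHKERRQ(ierr);
for (k = lo; k < hi; ++k) {
  /* g is indexed locally, so subtract the start of the ownership range */
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,"[%d] global[%D] = %g\n",rank,k,(double)PetscRealPart(g[k-lo]));CHKERRQ(ierr);
}
ierr = VecRestoreArrayRead(global,&g);CHKERRQ(ierr);
ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD,PETSC_STDOUT);CHKERRQ(ierr);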