[petsc-users] How to access Vector data properly

Songtao Chu st.chu at outlook.com
Mon Apr 9 12:13:30 CDT 2018


Yes, but the bug remains the same.
I then switched my development environment to Linux just now, and the same code ran without any problem. So it seems this is a bug in WSL?
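
To rule out the terminal garbling the output under WSL, I could also dump the vector to a file through an ASCII viewer instead of stdout. A minimal sketch (the file name is arbitrary):

    PetscViewer viewer;
    ierr = PetscViewerASCIIOpen(PETSC_COMM_WORLD, "global.txt", &viewer);CHKERRQ(ierr);
    ierr = VecView(global, viewer);CHKERRQ(ierr);     /* write the whole vector to global.txt */
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);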

________________________________
From: Matthew Knepley <knepley at gmail.com>
Sent: Tuesday, April 10, 2018 0:42
To: Songtao Chu
Cc: Hong; petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] How to access Vector data properly

Did you build EXACTLY the code I sent back?

  Thanks,

    Matt

On Mon, Apr 9, 2018 at 12:13 PM, Songtao Chu <st.chu at outlook.com> wrote:
The two outputs are different in my test.
If I assign the values with DMDAVecGetArray, as in the current code, I get incorrect output from VecView, as below:
Vec Object: 2 MPI processes
  type: mpi
Vec Object: Vec_0x7fffd6895a90_0 2 MPI processes
  type: mpi
Process [0]
0.
0.
0.
Process [1]
0.
0.

If I instead use the previous way (DMDANaturalToGlobal), I get incorrect output from PetscSynchronizedPrintf:
Rank=0
   0    0    0
Rank=1
   0    0

Both are fine if I run without mpiexec.

For reference, my development environment is CLion + WSL + GCC 7.3.
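
If it helps, a variant of the printing that goes through PETSC_COMM_WORLD instead of PETSC_COMM_SELF (a sketch only, not what I ran; it should serialize the output rank by rank) would be:

    ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
    ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "Rank=%d\n", rank);CHKERRQ(ierr);
    for (i = x; i < x + xm; ++i) {
        ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "%4.f ", val[i]);CHKERRQ(ierr);
    }
    ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "\n");CHKERRQ(ierr);
    ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);
    ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);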


________________________________
From: Matthew Knepley <knepley at gmail.com>
Sent: Tuesday, April 10, 2018 0:01
To: Songtao Chu
Cc: Hong; petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] How to access Vector data properly

This works fine for me:


knepley/feature-pc-patch $:/PETSc3/petsc/petsc-pylith/tmp$ ../arch-master-debug/bin/mpiexec -n 2 ./tester
Vec Object: 2 MPI processes
  type: mpi
Process [0]
0.
1.
2.
Process [1]
3.
4.
Rank=0
   0    1    2
Rank=1
   3    4

What is the problem?

   Matt


On Mon, Apr 9, 2018 at 11:51 AM, Songtao Chu <st.chu at outlook.com> wrote:

static char help[] = "\n\n";
#include <petscdm.h>
#include <petscdmda.h>


int main(int argc,char **argv)
{
    PetscMPIInt     rank;
    PetscErrorCode  ierr;
    Vec             global,local,natural;
    DM              da;
    PetscReal       *val;
    PetscInt        i,x,xm;


    ierr = PetscInitialize(&argc,&argv,(char*)0,help);if (ierr) return ierr;
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);

    ierr = DMDACreate1d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,5,1,1,NULL,&da);CHKERRQ(ierr);
    ierr = DMSetUp(da);CHKERRQ(ierr);
    ierr = DMCreateGlobalVector(da,&global);CHKERRQ(ierr);

    /* Fill the locally owned entries; DMDAVecGetArray() is indexed by the global
       grid indices returned by DMDAGetCorners() */
    ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
    ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "Rank=%d\n", rank);CHKERRQ(ierr);
    for (i = x; i < x + xm; ++i) {
        val[i] = i;
    }
    ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);

    ierr = VecView(global,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

    /* Print the locally owned entries, then flush on the same communicator */
    ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
    ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "Rank=%d\n", rank);CHKERRQ(ierr);
    for (i = x; i < x + xm; ++i) {
        ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "%4.f ", val[i]);CHKERRQ(ierr);
    }
    ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);
    ierr = PetscSynchronizedFlush(PETSC_COMM_SELF, PETSC_STDOUT);CHKERRQ(ierr);

    ierr = VecDestroy(&global);CHKERRQ(ierr);
    ierr = DMDestroy(&da);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
}


________________________________
From: Matthew Knepley <knepley at gmail.com>
Sent: Monday, April 9, 2018 23:29
To: Chu Songtao
Cc: Hong; petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] How to access Vector data properly

On Mon, Apr 9, 2018 at 11:19 AM, Chu Songtao <st.chu at outlook.com> wrote:
Hong,
I added the function at the end, but the results did not change.
Then I tested something else: filling global directly with DMDAVecGetArray/DMDAVecRestoreArray instead of going through DMDANaturalToGlobal. The results switched: VecView now prints 0 while PetscSynchronizedPrintf prints the correct values. Strange.
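
For reference, the two fill patterns I am comparing look roughly like this (a sketch using the variable names from my code; both should leave the same values in global):

    ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);

    /* (a) fill directly; DMDAVecGetArray() is indexed by global grid indices */
    ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
    for (i = x; i < x + xm; ++i) val[i] = i;
    ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);

    /* (b) fill a natural-ordering vector using local indices, then scatter */
    ierr = DMDACreateNaturalVector(da,&natural);CHKERRQ(ierr);
    ierr = VecGetOwnershipRange(natural,&start,&end);CHKERRQ(ierr);
    ierr = VecGetArray(natural,&xnatural);CHKERRQ(ierr);
    for (i = start; i < end; ++i) xnatural[i-start] = i;
    ierr = VecRestoreArray(natural,&xnatural);CHKERRQ(ierr);
    ierr = DMDANaturalToGlobalBegin(da,natural,INSERT_VALUES,global);CHKERRQ(ierr);
    ierr = DMDANaturalToGlobalEnd(da,natural,INSERT_VALUES,global);CHKERRQ(ierr);
    ierr = VecDestroy(&natural);CHKERRQ(ierr);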

Send the whole code. You have a logic error somewhere.

   Matt


    ierr = DMDACreate1d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,5,1,1,NULL,&da);CHKERRQ(ierr);
    ierr = DMSetUp(da);CHKERRQ(ierr);
    ierr = DMCreateGlobalVector(da,&global);CHKERRQ(ierr);

    ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
    ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "Rank=%d\n", rank);CHKERRQ(ierr);
    for (i = x; i < x + xm; ++i) {
        val[i] = i;
    }
    ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);

    VecView(global,PETSC_VIEWER_STDOUT_WORLD);

    ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
    ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "Rank=%d\n", rank);CHKERRQ(ierr);
    for (i = x; i < x + xm; ++i) {
        ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "%4.f ", val[i]);CHKERRQ(ierr);
    }
    ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);
    PetscSynchronizedFlush(PETSC_COMM_SELF, PETSC_STDOUT);

$ mpiexec -n 2 ./test0
Rank=0
Rank=1
Vec Object: 2 MPI processes
  type: mpi
Vec Object: Vec_0x7fffd6629a90_0 2 MPI processes
  type: mpi
Rank=1
   3    4 Process [0]
0.
0.
0.
Process [1]
0.
0.
Rank=0
   0    1    2


________________________________

Songtao:
You may need to add PetscSynchronizedFlush() at the end. See petsc/src/dm/examples/tests/ex43.c
Hong
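
Roughly, the pairing looks like this (an untested sketch using the variable names from your snippet; the flush must use the same communicator as the prints):

    ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "Rank=%d\n", rank);CHKERRQ(ierr);
    for (i = x; i < x + xm; ++i) {
        ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "%4.f ", val[i]);CHKERRQ(ierr);
    }
    ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);
    ierr = PetscSynchronizedFlush(PETSC_COMM_SELF, PETSC_STDOUT);CHKERRQ(ierr);  /* the missing flush */
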
Hello,
    I am just a beginner, and I am confused about how to use Vec and DMDA correctly.
    As a test, I fill a 1D DMDA global vector with its natural-ordering values, then view the data with both VecView and DMDAVecGetArray, but the two results differ. I don't know why.

This is the code:

PetscMPIInt     rank;
PetscErrorCode  ierr;
Vec             global,local,natural;
DM              da;
PetscReal       *xnatural,*val;
PetscInt        i,start,end,x,xm;


ierr = PetscInitialize(&argc,&argv,(char*)0,help);if (ierr) return ierr;
ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);

ierr = DMDACreate1d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,5,1,1,NULL,&da);CHKERRQ(ierr);
ierr = DMSetUp(da);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(da,&global);CHKERRQ(ierr);

ierr = DMDACreateNaturalVector(da,&natural);CHKERRQ(ierr);
ierr = VecGetOwnershipRange(natural,&start,&end);CHKERRQ(ierr);
ierr = VecGetArray(natural,&xnatural);CHKERRQ(ierr);
for (i=start; i<end; i++) {
    xnatural[i-start] = i;
}
ierr = VecRestoreArray(natural,&xnatural);CHKERRQ(ierr);
ierr = DMDANaturalToGlobalBegin(da,natural,INSERT_VALUES,global);CHKERRQ(ierr);
ierr = DMDANaturalToGlobalEnd(da,natural,INSERT_VALUES,global);CHKERRQ(ierr);
ierr = VecDestroy(&natural);CHKERRQ(ierr);

VecView(global,PETSC_VIEWER_STDOUT_WORLD);

ierr = DMDAGetCorners(da,&x,NULL,NULL,&xm,NULL,NULL);CHKERRQ(ierr);
ierr = DMDAVecGetArray(da, global, &val);CHKERRQ(ierr);
ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "Rank=%d\n", rank);CHKERRQ(ierr);
for (i = x; i < x + xm; ++i) {
    ierr = PetscSynchronizedPrintf(PETSC_COMM_SELF, "%4.f ", val[i]);CHKERRQ(ierr);
}
ierr = DMDAVecRestoreArray(da, global, &val);CHKERRQ(ierr);

and the results are:

$ ./test0
Vec Object: 1 MPI processes
  type: seq
Vec Object: Vec_0x7ffff3cbc500_0 1 MPI processes
  type: mpi
Process [0]
0.
1.
2.
3.
4.
Rank=0
   0    1    2    3    4

$ mpiexec -n 2 ./test0
Vec Object: 2 MPI processes
  type: mpi
Vec Object: Vec_0x7fffcf948a90_0 2 MPI processes
  type: mpi
Process [0]
0.
1.
2.
Process [1]
3.
4.
Rank=0
   0    0    0 Rank=1
   0    0


The seq version is correct, but the MPI version is wrong: every element of val prints as 0.





--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/