<div dir="ltr"><div>
<div>Hi,</div>
<div><br></div>
<div>I am experiencing possible memory issues with VecGetArray when it is used on sub-communicators (i.e. when I split PETSC_COMM_WORLD into multiple subcomms). A minimal example is below. In short, if the vector is parallelized over more than one processor within a subcomm, the array obtained from VecGetArray does not appear to be correct.</div>
<div><br></div>
<div>Please test it with:</div>
<div><br></div>
<div>1) mpirun -np 1 ./example -ncomms 1</div>
<div>2) mpirun -np 2 ./example -ncomms 2</div>
<div>3) mpirun -np 2 ./example -ncomms 1</div>
<div>4) mpirun -np 4 ./example -ncomms 2</div>
<div><br></div>
<div>You will find that 1) and 2) work as expected, while in 3) and 4) some entries of the array are assigned erroneous values.</div>
<div><br></div>
<div>Any input will be appreciated.</div>
<div>Thanks</div>
<div>Zin</div>
<div><br></div>
<div>Minimal code:</div>
<div><br></div>
<pre style="font-family:monospace">
#include &lt;petsc.h&gt;

int main(int argc, char **argv)
{
  MPI_Init(NULL, NULL);
  PetscInitialize(&amp;argc, &amp;argv, NULL, NULL);

  int size;
  MPI_Comm_size(MPI_COMM_WORLD, &amp;size);
  PetscPrintf(PETSC_COMM_WORLD, "\tThe total number of processors is %d\n", size);

  /* check that the number of processors is divisible by the number of subcomms */
  int ncomms = 1, np_per_comm;
  PetscOptionsGetInt(NULL, "-ncomms", &amp;ncomms, NULL);
  if (size % ncomms != 0)
    SETERRQ(PETSC_COMM_WORLD, 1, "The number of processes must be a multiple of ncomms.");
  np_per_comm = size / ncomms;
  PetscPrintf(PETSC_COMM_WORLD, "\tThe number of subcomms is %d.\n\tEach subcomm has %d processors.\n", ncomms, np_per_comm);

  /* calculate the colour of each subcomm ( = rank / number of processors per subcomm );
     once calculated, the colour is fixed throughout the entire run */
  int rank;
  MPI_Comm_rank(MPI_COMM_WORLD, &amp;rank);
  MPI_Comm subcomm;
  int colour = rank / np_per_comm;
  MPI_Comm_split(MPI_COMM_WORLD, colour, rank, &amp;subcomm);

  Vec u;
  PetscScalar *_u;
  int i, ns, ne;
  PetscScalar tmp;

  VecCreateMPI(subcomm, PETSC_DECIDE, 10, &amp;u);
  VecSet(u, 1.0 + PETSC_i * 0);

  VecGetArray(u, &amp;_u);
  VecGetOwnershipRange(u, &amp;ns, &amp;ne);
  for (i = ns; i &lt; ne; i++) {
    VecGetValues(u, 1, &amp;i, &amp;tmp);
    PetscPrintf(PETSC_COMM_SELF, "colour %d, u[%d]_array = %g + i * (%g), u[%d]_vec = %g + i * %g\n",
                colour, i, creal(_u[i]), cimag(_u[i]), i, creal(tmp), cimag(tmp));
  }
  VecRestoreArray(u, &amp;_u);

  VecDestroy(&amp;u);
  MPI_Comm_free(&amp;subcomm);
  PetscFinalize();
  return 0;
}
</pre>
</div></div>