[petsc-users] about VecGetArray()

Matthew Knepley knepley at gmail.com
Sat Jun 7 13:16:23 CDT 2014


On Sat, Jun 7, 2014 at 1:02 PM, LikunTan <tlk0812 at hotmail.com> wrote:

> Hello,
>
> Thank you for your reply. Sorry I did not make it clear. I added the part
> where I set values for M before calling VecGetArray(). Below is a more
> complete version of the code. I am not sure how to access the elements of
> M consistently through aM.
>

As Barry pointed out, this cannot work in parallel. A Vec stores contiguous
chunks of memory in process order, and VecGetArray() gives access only to
the local chunk, starting at the process offset. This is detailed in the
manual. Random access across processes cannot be done efficiently.
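
For example, here is a minimal sketch of the local access pattern (the
doubling is just a placeholder update; only global indices in
[rstart, rend) are addressable through aM on this process):

**************************************************************
PetscInt    rstart, rend, gi;
PetscScalar *aM;

VecGetOwnershipRange(M, &rstart, &rend); /* global indices owned locally */
VecGetArray(M, &aM);                     /* aM[0] is global entry rstart */
for (gi = rstart; gi < rend; gi++) {
  aM[gi - rstart] *= 2.0;                /* local index = global - rstart */
}
VecRestoreArray(M, &aM);
**************************************************************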

  Thanks,

      Matt


> **************************************************************
> VecCreate(PETSC_COMM_WORLD, &M);
> VecSetSizes(M, LEN*DOF, NODE*DOF);
>
> //get_nodes_process() sets LEN and NODENP[]; the nodes owned by this
> //process are not contiguous in the global numbering
> MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
> MPI_Comm_size(PETSC_COMM_WORLD, &size);
> get_nodes_process(NODE, size, rank);
>
> //set the values of M
> for(int i=0; i<LEN; i++)
> {
>        node=NODENP[i];
>        for(int j=0; j<DOF; j++)
>        {
>               col[j]=node*DOF+j;
>               val[j]=1.0*j;
>        }
>        VecSetValues(M, DOF, col, val, INSERT_VALUES);
> }
> VecAssemblyBegin(M);
> VecAssemblyEnd(M);
>
> //change values of M
> VecGetArray(M, &aM);
> for(int i=0; i<LEN; i++)
> {
>        node=NODENP[i];
>        for(int j=0; j<DOF; j++)
>        {
>               aM[node*DOF+j] or aM[i*DOF+j] ? //accessing the elements of M
>        }
> }
> VecRestoreArray(M, &aM);
> *********************************************************
>
> best,
>
>
> > Subject: Re: [petsc-users] about VecGetArray()
> > From: bsmith at mcs.anl.gov
> > Date: Sat, 7 Jun 2014 10:45:58 -0500
> > CC: petsc-users at mcs.anl.gov
> > To: tlk0812 at hotmail.com
> >
> >
> > It looks like you are trying to access the entire vector on all
> > processes. You cannot do this. VecGetArray() only gives you access to
> > the local values. If you need to access the entire vector on each
> > process (which is neither scalable nor a good idea), you can use
> > VecScatterCreateToAll() or VecScatterCreateToZero(), scatter the vector,
> > and then use VecGetArray() on the new sequential vector.
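> >
> > A minimal sketch of that scatter-to-all pattern (ctx, Mseq, and aMseq
> > are placeholder names):
> >
> > **************************************************************
> > VecScatter  ctx;
> > Vec         Mseq;
> > PetscScalar *aMseq;
> >
> > VecScatterCreateToAll(M, &ctx, &Mseq); /* Mseq gets every entry of M */
> > VecScatterBegin(ctx, M, Mseq, INSERT_VALUES, SCATTER_FORWARD);
> > VecScatterEnd(ctx, M, Mseq, INSERT_VALUES, SCATTER_FORWARD);
> > VecGetArray(Mseq, &aMseq);  /* aMseq[node*DOF+j] is valid for any node */
> > /* ... read the values ... */
> > VecRestoreArray(Mseq, &aMseq);
> > VecScatterDestroy(&ctx);
> > VecDestroy(&Mseq);
> > **************************************************************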
> >
> > Barry
> >
> > On Jun 7, 2014, at 1:40 AM, LikunTan <tlk0812 at hotmail.com> wrote:
> >
> > > Hello,
> > >
> > > I defined the partition of the Vector, which is not stored
> > > contiguously. Here is a part of my code. The total number of entries
> > > is NODE*DOF. Before executing the following code, I defined LEN, the
> > > number of nodes on each processor, and an array NODENP[] holding the
> > > list of those nodes. I accessed the elements using aM[node*DOF+j] and
> > > aM[i*DOF+j], but neither of them gave me the correct answer. Your help
> > > is much appreciated.
> > >
> > > **************************************************************
> > > VecCreate(PETSC_COMM_WORLD, &M);
> > > VecSetSizes(M, LEN*DOF, NODE*DOF);
> > >
> > > VecGetArray(M, &aM);
> > >
> > > for(int i=0; i<LEN; i++)
> > > {
> > >        node=NODENP[i];
> > >        for(int j=0; j<DOF; j++)
> > >        {
> > >               aM[node*DOF+j] or aM[i*DOF+j] ? //accessing the elements of M
> > >        }
> > > }
> > > VecRestoreArray(M, &aM);
> > > *********************************************************
> >
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener