[petsc-dev] Vec set a block of values

Junchao Zhang jczhang at mcs.anl.gov
Sat Apr 21 07:47:15 CDT 2018


On Sat, Apr 21, 2018 at 6:21 AM, Matthew Knepley <knepley at gmail.com> wrote:

> On Fri, Apr 20, 2018 at 5:02 PM, Junchao Zhang <jczhang at mcs.anl.gov>
> wrote:
>
>> VecScatter is too heavy (in both coding and runtime) for this simple
>> task. I just want to pad a vector loaded from a PetscViewer to match an
>> MPIBAIJ matrix. Thus the majority of the work is memcpy, with a few
>> off-process puts to neighbors.
>>
>>
>
> Now this makes no sense. You want to "pad" a Vec? What does this mean?
>
> 1) You want to extend its length, which none of these can do. You have to
> VecCreate another vector.
>
> 2) You want some elements to be zero. Just VecSet(v, 0) before setting any
> elements.
>
> If it's not those, what is the operation?
>
I don't know why no one replied to my post on petsc-users; it has the
background of this question. The word "pad" is also taken from the comments of
this example.


Subject: How to adapt vectors to matrices loaded from a viewer
To: PETSc users list <petsc-users at mcs.anl.gov>

 In a PETSc example (ex10.c
<http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex10.c.html>)
one calls MatLoad for A and then VecLoad for b from a viewer. Since one can
change the matrix block size through the option -matload_block_size, the code
tries to pad b if the sizes of A and b mismatch, using the following test:

186:   MatGetSize(A,&M,NULL);
187:   VecGetSize(b,&m);
188:   if (M != m) {   /* Create a new vector b by padding the old one */

  I think the code is wrong. One has to compare the local column size of A with
the local size of b, and if there is a mismatch on ANY process, then one has
to create a new b.
  My questions are: Are users supposed to handle this complexity? Does PETSc
provide a neat way to do it, for example, MatVecLoad(A,b,viewer)?



>    Matt
>
>
>> --Junchao Zhang
>>
>> On Fri, Apr 20, 2018 at 3:57 PM, Jed Brown <jed at jedbrown.org> wrote:
>>
>>> Junchao, If you need to access off-process values and put them into a
>>> new vector, you should use VecScatter.
>>>
>>> "Smith, Barry F." <bsmith at mcs.anl.gov> writes:
>>>
>>> >   Setting large contiguous blocks of values is not a common use case.
>>> In finite elements the values are not contiguous.
>>> >
>>> >> On Apr 20, 2018, at 3:45 PM, Zhang, Junchao <jczhang at mcs.anl.gov>
>>> wrote:
>>> >>
>>> >> I agree the extra overhead can be small, but users are forced to
>>> write a loop where a single line would do.
>>> >>
>>> >> --Junchao Zhang
>>> >>
>>> >> On Fri, Apr 20, 2018 at 3:36 PM, Smith, Barry F. <bsmith at mcs.anl.gov>
>>> wrote:
>>> >>
>>> >>    When setting values into matrices and vectors we consider the
>>> "extra" overhead of needing to pass in the indices for all the values
>>> (instead of being able to set an arbitrary block of values without using
>>> indices for each one) to be a minimal overhead that we can live with.
>>> >>
>>> >>    Barry
>>> >>
>>> >>
>>> >> > On Apr 20, 2018, at 3:33 PM, Junchao Zhang <jczhang at mcs.anl.gov>
>>> wrote:
>>> >> >
>>> >> >
>>> >> > On Fri, Apr 20, 2018 at 3:18 PM, Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>> >> > On Fri, Apr 20, 2018 at 4:10 PM, Junchao Zhang <jczhang at mcs.anl.gov>
>>> wrote:
>>> >> > To pad a vector, i.e., copy a vector to a new one, I have to call
>>> VecSetValues(newb,1,&idx,...) for each element. But to be efficient, what I
>>> really need is to set a block of values in one call. It looks like PETSc
>>> does not have a routine for that(?). I looked at VecSetValuesBlocked, but
>>> it does not seem to be for that purpose.
>>> >> > Should we have something like VecSetValuesBlock(Vec v,PetscInt
>>> i,PetscInt cnt,PetscScalar *value,InsertMode mode) to set cnt values
>>> starting at index i?
>>> >> >
>>> >> > Use VecGetArray().
>>> >> > Did you mean VecGetArray on b and newb, do a memcpy from b to newb,
>>> and then restore them? If so, that does not work, since some of the values
>>> I want to set might be remote.
>>> >> > E.g., I have 4 processes. b's size is 181 and it is distributed as
>>> 46,45,45,45; newb is distributed as 48,45,45,45 to match a matrix of block
>>> size 3.
>>> >> >
>>> >> >
>>> >> >   Matt
>>> >> >
>>> >> > --Junchao Zhang
>>> >> >
>>> >> >
>>> >> >
>>> >> > --
>>> >> > What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> >> > -- Norbert Wiener
>>> >> >
>>> >> > https://www.cse.buffalo.edu/~knepley/
>>> >> >
>>> >>
>>> >>
>>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>

