[petsc-dev] Vec set a block of values
Junchao Zhang
jczhang at mcs.anl.gov
Fri Apr 20 15:45:46 CDT 2018
I agree the extra overhead can be small, but users are forced to write a
loop where a single line would be enough.
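For concreteness, here is a minimal sketch of the kind of loop I mean
(InsertChunk and the variable names are only illustrative, not from PETSc
or from my code): inserting cnt values starting at global index start
takes one VecSetValue() call per entry, whereas the proposed routine
would be a single line.

#include <petscvec.h>

/* Illustrative sketch only: insert cnt values vals[] into y starting at
   global index start; InsertChunk, y, start, cnt, vals are made-up names. */
static PetscErrorCode InsertChunk(Vec y, PetscInt start, PetscInt cnt, const PetscScalar *vals)
{
  PetscErrorCode ierr;
  PetscInt       i;

  PetscFunctionBeginUser;
  for (i = 0; i < cnt; i++) {                       /* the per-entry loop */
    ierr = VecSetValue(y, start + i, vals[i], INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = VecAssemblyBegin(y);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(y);CHKERRQ(ierr);
  /* The proposed routine would replace the loop with one call, roughly
     VecSetValuesBlock(y, start, cnt, vals, INSERT_VALUES);  (not in PETSc) */
  PetscFunctionReturn(0);
}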
--Junchao Zhang
On Fri, Apr 20, 2018 at 3:36 PM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
>
> When setting values into matrices and vectors, we consider the "extra"
> overhead of needing to pass in an index for each value (instead of being
> able to set an arbitrary block of values without indices for each one) to
> be minimal and something we can live with.
>
> Barry
>
>
> > On Apr 20, 2018, at 3:33 PM, Junchao Zhang <jczhang at mcs.anl.gov> wrote:
> >
> >
> > On Fri, Apr 20, 2018 at 3:18 PM, Matthew Knepley <knepley at gmail.com>
> > wrote:
> > On Fri, Apr 20, 2018 at 4:10 PM, Junchao Zhang <jczhang at mcs.anl.gov>
> > wrote:
> > To pad a vector, i.e., copy a vector to a new one, I have to call
> > VecSetValue(newb,1,&idx,...) for each element. But to be efficient, what
> > I really need is to set a block of values in one call. It looks like
> > PETSc does not have a routine for that(?). I looked at
> > VecSetValuesBlocked, but it does not seem to be meant for that purpose.
> > Should we have something like VecSetValuesBlock(Vec v, PetscInt i,
> > PetscInt cnt, PetscScalar *value, InsertMode mode) to set cnt values
> > starting at index i?
> >
> > Use VecGetArray().
> > Did you mean VecGetArray on b and newb, do a memcpy from b to newb, and
> > then restore them? If so, that does not work, since some of the values I
> > want to set might be remote.
> > E.g., I have 4 processors. b's size is 181 and it is distributed as
> > 46,45,45,45; newb is distributed as 48,45,45,45 to match a matrix of
> > block size 3.
> >
> >
> > Matt
> >
> > --Junchao Zhang
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which
> > their experiments lead.
> > -- Norbert Wiener
> >
> > https://www.cse.buffalo.edu/~knepley/
> >
>
>
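For reference, below is a sketch of how the padding can be done with the
existing interface (PadVec and the variable names are only illustrative):
each rank inserts its local entries of b into newb at the same global
indices, assuming the two extra padded entries sit at the end of newb's
global numbering, and VecSetValues() stashes the entries whose owner
differs between the 46,45,45,45 and 48,45,45,45 layouts and communicates
them during assembly.

#include <petscvec.h>

/* Sketch under the assumption that b's 181 entries keep their global
   indices in newb and the padding sits at the end; PadVec and the variable
   names are illustrative, not from the thread. */
static PetscErrorCode PadVec(Vec b, Vec newb)
{
  PetscErrorCode     ierr;
  PetscInt           rstart, rend, n, i, *idx;
  const PetscScalar *barray;

  PetscFunctionBeginUser;
  ierr = VecSet(newb, 0.0);CHKERRQ(ierr);                 /* zero the padded entries */
  ierr = VecGetOwnershipRange(b, &rstart, &rend);CHKERRQ(ierr);
  n    = rend - rstart;
  ierr = PetscMalloc1(n, &idx);CHKERRQ(ierr);
  for (i = 0; i < n; i++) idx[i] = rstart + i;            /* same global indices in newb */
  ierr = VecGetArrayRead(b, &barray);CHKERRQ(ierr);
  /* Off-process entries are stashed here and communicated during assembly */
  ierr = VecSetValues(newb, n, idx, barray, INSERT_VALUES);CHKERRQ(ierr);
  ierr = VecRestoreArrayRead(b, &barray);CHKERRQ(ierr);
  ierr = PetscFree(idx);CHKERRQ(ierr);
  ierr = VecAssemblyBegin(newb);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(newb);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The same pattern works when the target indices are not contiguous; the
only extra cost is carrying the idx[] array, which is exactly the
overhead Barry refers to.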