[petsc-dev] Vec set a block of values

Junchao Zhang jczhang at mcs.anl.gov
Fri Apr 20 16:18:31 CDT 2018


I don't care about the little extra overhead. I just feel the avoidable loop
in the user code is a bit ugly.

--Junchao Zhang
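
For reference, a minimal sketch of the VecSetValues() approach discussed below.
It assumes b and newb already exist with their own layouts, that newb[i] = b[i]
for every global index i of b, and that the trailing padding entries are left
untouched; the function name is illustrative and error handling is abbreviated.

    #include <petscvec.h>

    /* Sketch: copy every entry of b into newb at the same global index using
       VecSetValues(); the few entries whose owner differs between the two
       layouts are moved during assembly. Assumes newb[i] = b[i] for all i. */
    static PetscErrorCode PadVecWithVecSetValues(Vec b, Vec newb)
    {
      PetscErrorCode    ierr;
      PetscInt          rstart, rend, n, i, *idx;
      const PetscScalar *ba;

      PetscFunctionBegin;
      ierr = VecGetOwnershipRange(b, &rstart, &rend);CHKERRQ(ierr);
      n    = rend - rstart;
      ierr = PetscMalloc1(n, &idx);CHKERRQ(ierr);
      for (i = 0; i < n; i++) idx[i] = rstart + i;  /* the loop (and indices) under discussion */
      ierr = VecGetArrayRead(b, &ba);CHKERRQ(ierr);
      ierr = VecSetValues(newb, n, idx, ba, INSERT_VALUES);CHKERRQ(ierr);
      ierr = VecRestoreArrayRead(b, &ba);CHKERRQ(ierr);
      ierr = PetscFree(idx);CHKERRQ(ierr);
      ierr = VecAssemblyBegin(newb);CHKERRQ(ierr);  /* ships the few off-process values */
      ierr = VecAssemblyEnd(newb);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }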

On Fri, Apr 20, 2018 at 4:11 PM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:

>
>   I would just use VecSetValues(); since almost all values are local it
> will be scalable, and a little extra time in setting values is not a big
> deal for this test code.
>
>    Barry
>
>
> > On Apr 20, 2018, at 4:09 PM, Jed Brown <jed at jedbrown.org> wrote:
> >
> > Junchao Zhang <jczhang at mcs.anl.gov> writes:
> >
> >> VecScatter is too heavy (in both coding and runtime) for this simple task.
> >> I just want to pad a vector loaded from a PetscViewer to match an MPIBAIJ
> >> matrix. Thus the majority is memcpy, with a few neighborhood off-processor
> >> puts.
> >
> > At what address do those puts go, how do you avoid race conditions from
> > multiple processors having overlapping neighborhoods, and how does the
> > recipient know that the put has completed?  Just use VecScatter.  It
> > could be optimized to recognize contiguous runs above a certain size and
> > convert to memcpy.
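
A rough sketch of the VecScatter route, under the same assumption that
newb[i] = b[i] for each global index i of b and that the padding entries are
left alone. Each process builds a stride index set over its part of b and the
same global indices are used on both sides, so most of the scatter stays
on-process; names are illustrative.

    /* Sketch: scatter all of b into the leading part of newb with a VecScatter
       built from one stride IS per process; indices are global and identical
       on both sides. Assumes newb[i] = b[i] for every global index i of b. */
    static PetscErrorCode PadVecWithVecScatter(Vec b, Vec newb)
    {
      PetscErrorCode ierr;
      PetscInt       rstart, rend;
      IS             is;
      VecScatter     sct;

      PetscFunctionBegin;
      ierr = VecGetOwnershipRange(b, &rstart, &rend);CHKERRQ(ierr);
      ierr = ISCreateStride(PetscObjectComm((PetscObject)b), rend - rstart, rstart, 1, &is);CHKERRQ(ierr);
      ierr = VecScatterCreate(b, is, newb, is, &sct);CHKERRQ(ierr);
      ierr = VecScatterBegin(sct, b, newb, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
      ierr = VecScatterEnd(sct, b, newb, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
      ierr = VecScatterDestroy(&sct);CHKERRQ(ierr);
      ierr = ISDestroy(&is);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

The scatter context could also be created once and reused if the padding has
to be repeated.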
> >
> >> --Junchao Zhang
> >>
> >> On Fri, Apr 20, 2018 at 3:57 PM, Jed Brown <jed at jedbrown.org> wrote:
> >>
> >>> Junchao, if you need to access off-process values and put them into a
> >>> new vector, you should use VecScatter.
> >>>
> >>> "Smith, Barry F." <bsmith at mcs.anl.gov> writes:
> >>>
> >>>>  Setting large contiguous blocks of values is not a common use case. In
> >>> finite elements the values are not contiguous.
> >>>>
> >>>>> On Apr 20, 2018, at 3:45 PM, Zhang, Junchao <jczhang at mcs.anl.gov>
> >>> wrote:
> >>>>>
> >>>>> I agree the extra overhead can be small, but users are forced to write
> >>> a loop where a single line would do the job.
> >>>>>
> >>>>> --Junchao Zhang
> >>>>>
> >>>>> On Fri, Apr 20, 2018 at 3:36 PM, Smith, Barry F. <bsmith at mcs.anl.gov>
> >>> wrote:
> >>>>>
> >>>>>   When setting values into matrices and vectors we consider the
> >>> "extra" overhead of needing to pass in the indices for all the values
> >>> (instead of being able to set an arbitrary block of values without using
> >>> indices for each one) to be a minimal overhead that we can live with.
> >>>>>
> >>>>>   Barry
> >>>>>
> >>>>>
> >>>>>> On Apr 20, 2018, at 3:33 PM, Junchao Zhang <jczhang at mcs.anl.gov>
> >>> wrote:
> >>>>>>
> >>>>>>
> >>>>>> On Fri, Apr 20, 2018 at 3:18 PM, Matthew Knepley <knepley at gmail.com>
> >>> wrote:
> >>>>>> On Fri, Apr 20, 2018 at 4:10 PM, Junchao Zhang <jczhang at mcs.anl.gov>
> >>> wrote:
> >>>>>> To pad a vector, i.e., copy a vector to a new one, I have to call
> >>> VecSetValue(newb,1,&idx,...) for each element. But to be efficient, what I
> >>> really need is to set a block of values in one call. It looks like PETSc
> >>> does not have a routine for that(?). I looked at VecSetValuesBlocked, but
> >>> it looks like it is not for that purpose.
> >>>>>> Should we have something like VecSetValuesBlock(Vec v,PetscInt
> >>> i,PetscInt cnt,PetscScalar *value, InsertMode mode) to set cnt values
> >>> starting at index i?
> >>>>>>
> >>>>>> Use VecGetArray().
> >>>>>> Did you mean VecGetArray on b and newb, do a memcpy from b to newb, and
> >>> then restore them? If yes, it does not work since some of the values I want
> >>> to set might be remote.
> >>>>>> E.g., I have 4 processors. b's size is 181 and is distributed as
> >>> 46,45,45,45; newb is distributed as 48,45,45,45 to match a matrix of block
> >>> size 3.
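
To make the layout mismatch concrete: with b split 46,45,45,45 and newb split
48,45,45,45, the two ownership ranges overlap but do not coincide, so a plain
VecGetArray/memcpy cannot place every value. Below is a sketch of the "mostly
memcpy, plus a few off-process puts" idea, again assuming newb[i] = b[i] for
every global index i of b; the helper name is hypothetical.

    /* Sketch: copy the indices owned by this rank in both b and newb directly
       through the arrays, and hand the remaining few entries (owned elsewhere
       in newb) to VecSetValues/VecAssembly. Assumes newb[i] = b[i] for all i. */
    static PetscErrorCode PadVecMostlyMemcpy(Vec b, Vec newb)
    {
      PetscErrorCode    ierr;
      PetscInt          bs, be, ns, ne, lo, hi, i;
      const PetscScalar *ba;
      PetscScalar       *na;

      PetscFunctionBegin;
      ierr = VecGetOwnershipRange(b, &bs, &be);CHKERRQ(ierr);
      ierr = VecGetOwnershipRange(newb, &ns, &ne);CHKERRQ(ierr);
      lo   = PetscMax(bs, ns);                      /* overlap of the two ownership ranges */
      hi   = PetscMin(be, ne);
      ierr = VecGetArrayRead(b, &ba);CHKERRQ(ierr);
      ierr = VecGetArray(newb, &na);CHKERRQ(ierr);
      if (hi > lo) {ierr = PetscMemcpy(na + (lo - ns), ba + (lo - bs), (hi - lo)*sizeof(PetscScalar));CHKERRQ(ierr);}
      ierr = VecRestoreArray(newb, &na);CHKERRQ(ierr);
      for (i = bs; i < be; i++) {                   /* the few neighborhood puts */
        if (i < lo || i >= hi) {ierr = VecSetValue(newb, i, ba[i - bs], INSERT_VALUES);CHKERRQ(ierr);}
      }
      ierr = VecRestoreArrayRead(b, &ba);CHKERRQ(ierr);
      ierr = VecAssemblyBegin(newb);CHKERRQ(ierr);
      ierr = VecAssemblyEnd(newb);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }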
> >>>>>>
> >>>>>>
> >>>>>>  Matt
> >>>>>>
> >>>>>> --Junchao Zhang
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>>> --
> >>>>>> What most experimenters take for granted before they begin their
> >>> experiments is infinitely more interesting than any results to which their
> >>> experiments lead.
> >>>>>> -- Norbert Wiener
> >>>>>>
> >>>>>> https://www.cse.buffalo.edu/~knepley/
> >>>>>>
> >>>>>
> >>>>>
> >>>
>
>