[petsc-users] Arbitrary ownership IS for a matrix

Matthew Knepley knepley at gmail.com
Wed Mar 16 05:20:33 CDT 2022


On Tue, Mar 15, 2022 at 9:58 PM Nicolás Barnafi <nabw91 at gmail.com> wrote:

> Hello, sorry to bring back this issue.
>
> I am observing some behavior that I don't understand while trying to
> debug our code, so my question is: what happens when one sets values in
> a matrix at dofs that do not belong to the processor? I.e., if processor
> 0 owns dofs [0 1 2] and proc 1 owns dofs [3 4 5], and I set the value at
> position (3,3) from proc 0, will assembly go through without complaint
> as long as I preallocated sufficient rows, even if these do not coincide
> with the ones from MatSetSizes?
>

Here is what happens. When you call MatSetValues(), if the value is for
an off-process row, it is stored in a MatStash object. When you call
MatAssemblyBegin(), those values are sent to the correct process and
inserted with the normal call. If there is insufficient allocation, you
would get an error at that time.
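
For example, here is a minimal sketch (the 3+3 row split over two ranks
mirrors your example; MatSetUp() stands in for real preallocation):

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscInt       row = 3, col = 3;
  PetscScalar    v = 1.0;
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, 3, 3, 6, 6);CHKERRQ(ierr); /* run on 2 ranks: rows 0-2 and 3-5 */
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);

  /* Rank 0 sets an entry in a row owned by rank 1; the value is kept
     in the MatStash until assembly. */
  if (rank == 0) {
    ierr = MatSetValues(A, 1, &row, 1, &col, &v, INSERT_VALUES);CHKERRQ(ierr);
  }

  /* The stashed value is communicated to rank 1 here; insufficient
     preallocation on the owning rank errors at this point. */
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}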

  Thanks,

    Matt


> Thanks in advance,
> Nicolas
>
> On Thu, Mar 10, 2022 at 2:50 AM Nicolás Barnafi <nabw91 at gmail.com> wrote:
> >
> > Thank you both very much, it is exactly what I needed.
> >
> > Best regards
> >
> > On Wed, Mar 9, 2022, 21:19 Matthew Knepley <knepley at gmail.com> wrote:
> >>
> >> On Wed, Mar 9, 2022 at 5:13 PM Barry Smith <bsmith at petsc.dev> wrote:
> >>>
> >>>
> >>>   You need to do a mapping of your global numbering to the standard
> >>> PETSc numbering and use the PETSc numbering for all access to vectors
> >>> and matrices.
> >>>
> >>>    https://petsc.org/release/docs/manualpages/AO/AOCreate.html
> >>> provides one approach to managing the renumbering.
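> >>>
> >>>   A rough sketch (the local sizes and application indices here are
> >>> made up for illustration): each rank lists the application numbers of
> >>> the dofs it will own, and the AO pairs them with PETSc's numbering.
> >>>
> >>>     AO       ao;
> >>>     PetscInt nlocal   = 3;
> >>>     PetscInt myapp[3] = {7, 2, 5};  /* this rank's dofs, your numbering */
> >>>     PetscInt rows[2]  = {7, 2};     /* indices you want to access */
> >>>
> >>>     /* NULL: pair the application numbers with PETSc's contiguous
> >>>        numbering in rank order */
> >>>     AOCreateBasic(PETSC_COMM_WORLD, nlocal, myapp, NULL, &ao);
> >>>
> >>>     /* convert in place; use the result in MatSetValues()/VecSetValues() */
> >>>     AOApplicationToPetsc(ao, 2, rows);
> >>>
> >>>     AODestroy(&ao);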
> >>
> >>
> >> You can think of this as the mapping to offsets that you would need
> >> in any event to store your values (they could not be directly
> >> addressed with your random indices).
> >>
> >>   Thanks,
> >>
> >>      Matt
> >>
> >>>
> >>>   Barry
> >>>
> >>>
> >>> On Mar 9, 2022, at 3:42 PM, Nicolás Barnafi <nabw91 at gmail.com> wrote:
> >>>
> >>> Hi community,
> >>>
> >>> I have an application with polytopal meshes (elements of arbitrary
> >>> shape) where the distribution of dofs is not PETSc friendly, meaning
> >>> that it is not true that cpu0 owns dofs [0,a), then cpu1 owns [a,b),
> >>> and so on; instead the distribution is in fact random. Another
> >>> important detail is that boundary dofs are shared, meaning that if
> >>> dof 150 is on the boundary, each subdomain vector has dof 150.
> >>>
> >>> Under these considerations:
> >>>
> >>> i) Is it possible to give an arbitrary mapping to the matrix
> >>> structure, or is the blocked distribution hard coded?
> >>> ii) Are the repeated boundary dofs an issue when computing a
> >>> Fieldsplit preconditioner in parallel?
> >>>
> >>> Best regards,
> >>> Nicolas
> >>>
> >>>
> >>
> >>
> >> --
> >> What most experimenters take for granted before they begin their
> >> experiments is infinitely more interesting than any results to which
> >> their experiments lead.
> >> -- Norbert Wiener
> >>
> >> https://www.cse.buffalo.edu/~knepley/
>
>
>
> --
> Nicolás Alejandro Barnafi Wittwer
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/