[petsc-users] Question about a parallel implementation of PCFIELDSPLIT

Matthew Knepley knepley at gmail.com
Fri Jan 19 15:31:27 CST 2024


On Fri, Jan 19, 2024 at 4:25 PM Barry Smith <bsmith at petsc.dev> wrote:

>
>    Generally, fieldsplit is used on problems that have a natural "split"
> of the variables into two or more subsets, for example
> u0,v0,u1,v1,u2,v2,u3,v3. This is often indicated in the vectors and
> matrices with the "blocksize" argument, 2 in this case. A DM also often
> provides this information.
>
>    When laying out a vector/matrix with a blocksize, one must ensure that
> a whole number of blocks lands on each MPI process. So, for example, if
> the above vector is distributed over 3 MPI processes one could use
> u0,v0,u1,v1 | u2,v2 | u3,v3, but one cannot use u0,v0,u1 | v1,u2,v2 |
> u3,v3. Another way to think about it is that one must split up the
> vector, as indexed by block, among the processes. For most multicomponent
> problems this type of decomposition is very natural in the logic of the
> code.
>

This blocking is only convenient, not necessary. You can specify your own
field division using PCFieldSplitSetIS().

  Thanks,

     Matt


>   Barry
>
>
> On Jan 19, 2024, at 3:19 AM, Pantelis Moschopoulos <
> pmoschopoulos at outlook.com> wrote:
>
> Dear all,
>
> When I use PCFIELDSPLIT with PC type "schur" in serial mode, everything
> works fine. When I switch to parallel, I observe that the number of ranks
> must divide N, the number of unknowns, without remainder. Otherwise, an
> error of the following form emerges: "Local columns of A10 3473 do not
> equal local rows of A00 3471".
>
> Can I do something to overcome this?
>
> Thanks,
> Pantelis
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

