[petsc-users] Repeated global indices in local maps
Matthew Knepley
knepley at gmail.com
Wed May 27 06:12:44 CDT 2020
On Wed, May 27, 2020 at 4:14 AM Prateek Gupta <prateekgupta1709 at gmail.com>
wrote:
> Hi,
> I am new to using petsc and need its nonlinear solvers for my code. I am
> currently using parmetis (outside petsc) to partition an unstructured mesh
> element-wise, but working with data on the vertices of the mesh.
> Consequently, I have repeated vertices in different MPI-processes/ranks.
> At the solver stage, I need to solve for the data on the vertices (the
> solution vector is defined on the vertices). So, I need to create a
> distributed vector over the vertices of the mesh, but the distribution
> across MPI ranks is not contiguous, since partitioning is (has to be) done
> element-wise. I am trying to figure out:
> 1. Do I need only a local-to-global IS, or do I need to combine it with an AO?
> 2. Even at the VecCreateMPI stage, is it possible to inform PETSc that,
> although, say, rank_i has n_i components of the vector, those components
> are not arranged contiguously?
>
> For instance,
>
> Global vertices vector v : [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
> v_rank_1 : [2, 3, 4, 8, 9, 7] ; v_rank_2 : [0, 1, 2, 3, 6, 10, 11, 8, 9, 5]
>
PETSc is going to number the unknowns contiguously by process. If you also
want another numbering, as above, you must handle it yourself. You could use
an AO. However, I believe it is easier to just renumber your mesh after
partitioning; this is what PETSc does in its own unstructured mesh code.
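To make that concrete, here is a minimal, untested sketch against the 2020-era
C API of the AO route, using the vertex lists from your example (ranks shifted
to 0-based). Ownership of the shared vertices 2, 3, 8, 9 is arbitrarily given
to the lower rank; how you break such ties is up to your application.
AOCreateBasic() with NULL for the PETSc ordering hands each rank a contiguous
block of new global indices, AOApplicationToPetsc() translates both owned and
ghost vertices into that numbering, and the resulting ISLocalToGlobalMapping
lets you assemble with local vertex indices via VecSetValueLocal().

#include <petscvec.h>

int main(int argc, char **argv)
{
  AO                     ao;
  Vec                    v;
  ISLocalToGlobalMapping l2g;
  PetscErrorCode         ierr;
  PetscMPIInt            rank, size;
  PetscInt               i, nowned, nghost, locals[10];
  const PetscInt        *owned, *ghost;
  /* Vertex lists from the example; shared vertices 2,3,8,9 owned by rank 0 */
  const PetscInt owned0[6] = {2, 3, 4, 8, 9, 7};
  const PetscInt owned1[6] = {0, 1, 6, 10, 11, 5};
  const PetscInt ghost1[4] = {2, 3, 8, 9};

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRQ(ierr);
  if (size != 2) SETERRQ(PETSC_COMM_WORLD, PETSC_ERR_ARG_WRONG, "Run with exactly 2 ranks");
  if (rank == 0) {owned = owned0; nowned = 6; ghost = NULL;   nghost = 0;}
  else           {owned = owned1; nowned = 6; ghost = ghost1; nghost = 4;}

  /* New numbering, contiguous by rank: rank 0 owns 0..5, rank 1 owns 6..11.
     Passing NULL for the PETSc ordering makes it the natural contiguous one. */
  ierr = AOCreateBasic(PETSC_COMM_WORLD, nowned, owned, NULL, &ao);CHKERRQ(ierr);

  /* Translate owned then ghost vertices into the new global numbering */
  for (i = 0; i < nowned; i++) locals[i] = owned[i];
  for (i = 0; i < nghost; i++) locals[nowned + i] = ghost[i];
  ierr = AOApplicationToPetsc(ao, nowned + nghost, locals);CHKERRQ(ierr);

  /* Vector distributed by ownership; local index i maps to global locals[i] */
  ierr = VecCreateMPI(PETSC_COMM_WORLD, nowned, PETSC_DETERMINE, &v);CHKERRQ(ierr);
  ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, nowned + nghost, locals, PETSC_COPY_VALUES, &l2g);CHKERRQ(ierr);
  ierr = VecSetLocalToGlobalMapping(v, l2g);CHKERRQ(ierr);

  /* Assemble with local vertex indices; ghost contributions are summed
     into the owning rank's entry during assembly */
  for (i = 0; i < nowned + nghost; i++) {
    ierr = VecSetValueLocal(v, i, 1.0, ADD_VALUES);CHKERRQ(ierr);
  }
  ierr = VecAssemblyBegin(v);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(v);CHKERRQ(ierr);

  ierr = ISLocalToGlobalMappingDestroy(&l2g);CHKERRQ(ierr);
  ierr = AODestroy(&ao);CHKERRQ(ierr);
  ierr = VecDestroy(&v);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

If you instead renumber the mesh itself after partitioning, the AO becomes
unnecessary and only the ISLocalToGlobalMapping part remains.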
Thanks,
Matt
> Any help is greatly appreciated.
>
> Thank you.
> Prateek Gupta
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/