[petsc-users] interpreting results of ISLocalToGlobalMappingView

Wang quanwang.us at gmail.com
Sun Jun 4 20:53:03 CDT 2017


Thanks for your quick response. (Sorry, Matt, I didn't reply-all the first
time.)

When I add another field, I get the output shown below (now
MatGetOwnershipRange gives [0 6] for rank 0 and [6 18] for rank 1).

It seems that the global indices for the second field also start from zero,
instead of starting from N_{first field} (9 in this case), while the local
indices in the second column keep accumulating. Is this reasonable, or did I
make a mistake when defining the dofs to set up the section?
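
For reference, here is roughly how I set up the section for the two fields.
This is only a simplified sketch of what I do, not the exact attached code:
declarations and includes are abbreviated, error checking is omitted, and the
variable names are placeholders (dm is the distributed DMPlex).

     PetscSection   :: section
     PetscInt       :: v, vStart, vEnd
     PetscErrorCode :: ierr

     call PetscSectionCreate(PETSC_COMM_WORLD, section, ierr)
     call PetscSectionSetNumFields(section, 2, ierr)
     call DMPlexGetDepthStratum(dm, 0, vStart, vEnd, ierr)     ! depth 0 = vertices
     call PetscSectionSetChart(section, vStart, vEnd, ierr)
     do v = vStart, vEnd - 1
        call PetscSectionSetDof(section, v, 2, ierr)           ! 2 dofs per vertex in total
        call PetscSectionSetFieldDof(section, v, 0, 1, ierr)   ! 1 dof for field 0
        call PetscSectionSetFieldDof(section, v, 1, 1, ierr)   ! 1 dof for field 1
     end do
     call PetscSectionSetUp(section, ierr)
     call DMSetDefaultSection(dm, section, ierr)               ! DMSetLocalSection in newer PETSc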

I am also confused by ISLocalToGlobalMappingGetInfo(ltog, nproc_nbr,
procs_nbr, numprocs, indices_nbr, ierr). What does *numprocs* mean here? I
assume its ith element is the number of indices that have "ghost" copies on
processor procs_nbr(i). But for this example my test code gives numprocs =
(48, 24) on rank 0, which is larger than the total number of dofs in my
problem, which is only 18. ISLocalToGlobalMpngGetInfoSize(ltog, nproc_nbr,
numprocmax, ierr) gives nproc_nbr=2 and numprocmax=48 on both processors.
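
In case it matters, this is how I call those two routines (again a sketch
with placeholder declarations; in particular, the shape I allocate for
indices_nbr is only my guess at what the Fortran interface expects):

     PetscInt              :: nproc_nbr, numprocmax
     PetscInt, allocatable :: procs_nbr(:), numprocs(:), indices_nbr(:,:)
     PetscErrorCode        :: ierr

     ! query the sizes first so the output arrays can be allocated large enough
     call ISLocalToGlobalMpngGetInfoSize(ltog, nproc_nbr, numprocmax, ierr)
     allocate(procs_nbr(nproc_nbr), numprocs(nproc_nbr))
     allocate(indices_nbr(numprocmax, nproc_nbr))
     call ISLocalToGlobalMappingGetInfo(ltog, nproc_nbr, procs_nbr, numprocs, &
                                        indices_nbr, ierr)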

I attach the code and the input file, both of which were originally found on
this mailing list.

ISLocalToGlobalMapping results:
ISLocalToGlobalMapping Object: 2 MPI processes
  type: basic
[0] 0 0
[0] 1 0
[0] 2 1
[0] 3 1
[0] 4 5
[0] 5 5
[0] 6 2
[0] 7 2
[0] 8 6
[0] 9 6
[0] 10 8
[0] 11 8
[1] 0 3
[1] 1 3
[1] 2 4
[1] 3 4
[1] 4 5
[1] 5 5
[1] 6 6
[1] 7 6
[1] 8 7
[1] 9 7
[1] 10 8
[1] 11 8

On Sun, Jun 4, 2017 at 8:15 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Sun, Jun 4, 2017 at 7:08 PM, Wang <quanwang.us at gmail.com> wrote:
>
>> Hello. I have some confusion about the results given
>> by ISLocalToGlobalMappingView.
>>
>> After reading a simple mesh and associating each vertex with a scalar dof,
>> the test code uses DMPlexDistribute to get a distributed dm. Then I use the
>> following calls:
>>
>>  call DMGetLocalToGlobalMapping(dm,ltog,ierr)
>>  call ISLocalToGlobalMappingView(ltog, PETSC_VIEWER_STDOUT_WORLD, ierr);
>>
>> and get the following results for the l2g mapping (MatGetOwnershipRange
>> gives [0 3] for rank 0 and [3 9] for rank 1):
>>
>> ISLocalToGlobalMapping Object: 2 MPI processes
>>   type: basic
>> [0] 0 0
>> [0] 1 1
>> [0] 2 5
>> [0] 3 2
>> [0] 4 6
>> [0] 5 8
>> [1] 0 3
>> [1] 1 4
>> [1] 2 5
>> [1] 3 6
>> [1] 4 7
>> [1] 5 8
>>
>>
>> The question is why, on rank 0, the global indices (I assume the third
>> column) are not grouped into a local chunk and a ghost chunk. I understand
>> how to do the local-to-global mapping without any concern for the actual
>> ordering, but I was under the impression that in PETSc the ghost entries
>> always come last in the local vector.
>>
>
> Nope. That is only guaranteed when using VecGhost.
>
>
>> In this case, on rank 0, global index 5 should appear later than 0, 1, 2,
>> because it is a ghost vertex for rank 0.
>>
>> I'm not trying to use this for FEM; instead I'm using the mesh management
>> in DMPlex for other tasks, so I need to know more details.
>>
>
> We base the dof ordering on the mesh ordering. If you ordered the shared
> parts of the mesh last, this would produce the ordering you expect.
>
>    Matt
>
>
>> Thank you.
>>
>> QW
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> http://www.caam.rice.edu/~mk51/
>
Attachments:
diy.f90 (text/x-fortran, 15295 bytes):
<http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170604/b9c0ee7a/attachment-0001.bin>
Q1_4cells.msh (model/mesh, 440 bytes):
<http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170604/b9c0ee7a/attachment-0001.msh>