[petsc-users] Accessing Vector's ghost values

Mohammad Mirzadeh mirzadeh at gmail.com
Thu Feb 23 15:00:28 CST 2012


Are you using local numbering when accessing the local part of ghost nodes?
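
In case it helps: the usual pattern with a ghosted vector is to first
scatter the owned values into the ghost slots, and then read everything
through the local form, where the ghost entries follow the owned ones in
local numbering.  The "Can only get local values" errors below come from
asking the global vector for a ghost index, and the global vector only
hands back locally owned entries.  A minimal sketch (error checks
omitted; x, n and N as in your code):

  VecGhostUpdateBegin(x, INSERT_VALUES, SCATTER_FORWARD);
  VecGhostUpdateEnd  (x, INSERT_VALUES, SCATTER_FORWARD);

  Vec xloc;
  VecGhostGetLocalForm(x, &xloc);

  PetscScalar *a;
  VecGetArray(xloc, &a);
  /* a[0 .. n-1] : owned entries, in local numbering            */
  /* a[n .. N-1] : ghosts, in the order given to VecCreateGhost */
  VecRestoreArray(xloc, &a);
  VecGhostRestoreLocalForm(x, &xloc);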

On Thu, Feb 23, 2012 at 12:43 PM, Bojan Niceno <bojan.niceno at psi.ch> wrote:

>  Dear Matt,
>
>
> I have a new insight, although it is not the full resolution.  If I change
> my code in PETScSolver.cpp from:
>
>
>   /*----------------------------------------------------+
>   |  Make necessary PETSc initializations for vectors   |
>   +----------------------------------------------------*/
>   Int   nghost = N - n;
>   Int * ghosts = new Int(nghost);
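>   // NB: 'new Int(nghost)' allocates a single Int initialized to nghost,
>   // not an array of nghost entries ('new Int[nghost]' was intended)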
>   for(Int n=0; n<M.mesh.nodes.size(); n++) {
>     assert( M.mesh.nodes[n].global_number >= 0);
>     assert( M.mesh.nodes[n].global_number < 14065);
>   }
>   for(Int i=n; i<N; i++) {
>     assert( M.mesh.nodes[i].global_number >= 0);
>     assert( M.mesh.nodes[i].global_number < 14065);
>     assert( ! (M.mesh.nodes[i].global_number >= n_start &&
>                M.mesh.nodes[i].global_number <  n_end) );
>       ghosts[i] = M.mesh.nodes[i].global_number;
>   }
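>   // NB: 'ghosts[i]' above runs from i=n to N-1, indexing past the end of
>   // the allocation; 'ghosts[i-n]', counting from 0, was intended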
>
>   VecCreateGhost(PETSC_COMM_WORLD, n, PETSC_DECIDE, nghost, &ghosts[0], &x);
>
> to:
>
>   /*----------------------------------------------------+
>   |  Make necessary PETSc initializations for vectors   |
>   +----------------------------------------------------*/
>   Int     nghost = N - n;
>   Indices ghosts;                                         // <---= NEW!
>   for(Int n=0; n<M.mesh.nodes.size(); n++) {
>     assert( M.mesh.nodes[n].global_number >= 0);
>     assert( M.mesh.nodes[n].global_number < 14065);
>   }
>   for(Int i=n; i<N; i++) {
>     assert( M.mesh.nodes[i].global_number >= 0);
>     assert( M.mesh.nodes[i].global_number < 14065);
>     assert( ! (M.mesh.nodes[i].global_number >= n_start &&
>                M.mesh.nodes[i].global_number <  n_end) );
>       ghosts.push_back( M.mesh.nodes[i].global_number );  // <---= NEW!
>
>   }
>   assert( ghosts.size() == nghost );                      // <---= NEW!
>
>   VecCreateGhost(PETSC_COMM_WORLD, n, PETSC_DECIDE, nghost, &ghosts[0], &x);
>
> I pass the VecCreateGhost phase.  "Indices" is an STL container of
> integers.  It seems to work better than a classical C array for this case.
>
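> The difference comes down to the two problems flagged above: "new
> Int(nghost)" allocates a single Int initialized to nghost rather than an
> array of nghost entries, and "ghosts[i]" then indexes past it.  A plain C
> array works just as well once both are fixed; a minimal sketch of the
> corrected variant:
>
>   Int   nghost = N - n;
>   Int * ghosts = new Int[nghost];                   // array form
>   for(Int i=n; i<N; i++)
>     ghosts[i-n] = M.mesh.nodes[i].global_number;    // index from 0
>   VecCreateGhost(PETSC_COMM_WORLD, n, PETSC_DECIDE, nghost, ghosts, &x);
>   delete [] ghosts;   // safe: VecCreateGhost copies the index array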
>
> However, I still do not see the ghost values, i.e. I get the following
> error:
>
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Argument out of range!
> [0]PETSC ERROR: Can only get local values, trying 3529!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: --------------------- Error Message ------------------------------------
> [1]PETSC ERROR: Argument out of range!
> [1]PETSC ERROR: Can only get local values, trying 22!
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Petsc Release Version 3.2.0, Patch 6, Wed Jan 11 09:28:45 CST 2012
> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [1]PETSC ERROR: See docs/index.html for manual pages.
> [2]PETSC ERROR: --------------------- Error Message ------------------------------------
> [2]PETSC ERROR: Argument out of range!
> [2]PETSC ERROR: Can only get local values, trying 86!
> [2]PETSC ERROR: ------------------------------------------------------------------------
> [2]PETSC ERROR: Petsc Release Version 3.2.0, Patch 6, Wed Jan 11 09:28:45 CST 2012
>
> when I try to access values in ghost cells.  What do I have to use to see
> the ghost values?  I reckon VecGhostGetLocalForm should be used, right?
>
>
>     Kind regards,
>
>
>     Bojan
>
>
>
> On 2/23/2012 8:36 PM, Matthew Knepley wrote:
>
> On Thu, Feb 23, 2012 at 1:33 PM, Bojan Niceno <bojan.niceno at psi.ch> wrote:
>
>>  Dear Matt,
>>
>>
>> I sent the code as an attached tarball.  I sent it with the case I run, so
>> it is 2 MB.  It is now in the queue for the moderator's approval.
>>
>
>  No, you HAVE to send it to petsc-maint at mcs.anl.gov, as I said last time,
> for exactly this reason.
>
>    Matt
>
>
>>  Thanks.
>>
>>
>>     Kind regards,
>>
>>
>>     Bojan
>>
>>
>> On 2/23/2012 8:04 PM, Matthew Knepley wrote:
>>
>> On Thu, Feb 23, 2012 at 12:51 PM, Bojan Niceno <bojan.niceno at psi.ch>wrote:
>>
>>>  Dear Matt,
>>>
>>>
>>> Are you sure?  It is almost 4000 lines long!  Shall I send only the
>>> function that bothers me?
>>>
>>> If the entire code is what you need, shall I make a tarball and attach
>>> it?
>>>
>>
>>  Send something that builds and runs.  I don't care how long it is.
>>
>>     Matt
>>
>>
>>>      Kind regards,
>>>
>>>
>>>     Bojan
>>>
>>> On 2/23/2012 7:44 PM, Matthew Knepley wrote:
>>>
>>> On Thu, Feb 23, 2012 at 12:28 PM, Bojan Niceno <bojan.niceno at psi.ch>wrote:
>>>
>>>>  On 2/23/2012 7:24 PM, Matthew Knepley wrote:
>>>>
>>>> On Thu, Feb 23, 2012 at 12:05 PM, Bojan Niceno <bojan.niceno at psi.ch>wrote:
>>>>
>>>>>  Dear Matthew,
>>>>>
>>>>>
>>>>> thank you for your response.  When I use VecCreateGhost, I get the
>>>>> following:
>>>>>
>>>>
>>>>  It appears that you passed a bad communicator. Did you not initialize
>>>> a 'comm' variable?
>>>>
>>>>
>>>> I pass PETSC_COMM_WORLD to VecCreateGhost.
>>>>
>>>> I don't know what you mean by 'comm' variable :-(  I called all the
>>>> routines to initialize PETSc.
>>>>
>>>
>>>  Send your code to petsc-maint at mcs.anl.gov.
>>>
>>>     Matt
>>>
>>>
>>>>
>>>>     Cheers,
>>>>
>>>>
>>>>     Bojan
>>>>
>>>>
>>>>     Matt
>>>>
>>>>
>>>>>  [0]PETSC ERROR:
>>>>> ------------------------------------------------------------------------
>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>>>>> probably memory access out of range
>>>>> [0]PETSC ERROR: Try option -start_in_debugger or
>>>>> -on_error_attach_debugger
>>>>> [0]PETSC ERROR: or see
>>>>> http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
>>>>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS
>>>>> X to find memory corruption errors
>>>>> [0]PETSC ERROR: likely location of problem given in stack below
>>>>> [0]PETSC ERROR: ---------------------  Stack Frames
>>>>> ------------------------------------
>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>>>>> available,
>>>>> [0]PETSC ERROR:       INSTEAD the line number of the start of the
>>>>> function
>>>>> [0]PETSC ERROR:       is given.
>>>>> [0]PETSC ERROR: [0] PetscCommDuplicate line 140 src/sys/objects/tagm.c
>>>>> [0]PETSC ERROR: [0] PetscHeaderCreate_Private line 30
>>>>> src/sys/objects/inherit.c
>>>>> [0]PETSC ERROR: [0] VecCreate line 32 src/vec/vec/interface/veccreate.c
>>>>> [0]PETSC ERROR: [0] VecCreateGhostWithArray line 567
>>>>> src/vec/vec/impls/mpi/pbvec.c
>>>>> [0]PETSC ERROR: [0] VecCreateGhost line 647
>>>>> src/vec/vec/impls/mpi/pbvec.c
>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>> ------------------------------------
>>>>> [0]PETSC ERROR: Signal received!
>>>>> [0]PETSC ERROR:
>>>>> ------------------------------------------------------------------------
>>>>> [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 6, Wed Jan 11
>>>>> 09:28:45 CST 2012
>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>>>> [0]PETSC ERROR:
>>>>> ------------------------------------------------------------------------
>>>>> [0]PETSC ERROR: ./PSI-Flow on a arch-linu named lccfd06 by niceno Thu
>>>>> Feb 23 19:02:45 2012
>>>>> [0]PETSC ERROR: Libraries linked from
>>>>> /homecfd/niceno/PETSc-3.2-p6/arch-linux2-c-debug/lib
>>>>> [0]PETSC ERROR: Configure run at Fri Feb 10 10:24:13 2012
>>>>> [0]PETSC ERROR: Configure options
>>>>> [0]PETSC ERROR:
>>>>> ------------------------------------------------------------------------
>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown directory
>>>>> unknown file
>>>>>
>>>>> I don't understand what could be causing it.  I took very good care to
>>>>> match the global numbers of ghost cells when calling VecCreateGhost.
>>>>>
>>>>>
>>>>>     Kind regards,
>>>>>
>>>>>
>>>>>     Bojan
>>>>>
>>>>>
>>>>> On 2/23/2012 5:53 PM, Matthew Knepley wrote:
>>>>>
>>>>> On Thu, Feb 23, 2012 at 10:46 AM, Bojan Niceno <bojan.niceno at psi.ch>wrote:
>>>>>
>>>>>> Hi all,
>>>>>>
>>>>>> I've never used a mailing list before, so I hope this message will
>>>>>> reach PETSc users and experts, and that someone might be willing to
>>>>>> help me.  I am also a novice in PETSc.
>>>>>>
>>>>>> I have developed an unstructured finite volume solver on top of PETSc
>>>>>> libraries.  In sequential, it works like a charm.  For the parallel
>>>>>> version, I do domain decomposition externally with Metis, and work out
>>>>>> local and global numberings, as well as communication patterns between
>>>>>> processors.  (The latter don't seem to be needed for PETSc, though.)  When I
>>>>>> run my program in parallel, it also works, but I miss values in vectors'
>>>>>> ghost points.
>>>>>>
>>>>>> I create vectors with the command: VecCreate(PETSC_COMM_WORLD, &x);
>>>>>>
>>>>>> Is it possible to get the ghost values if a vector is created like
>>>>>> this?
>>>>>>
>>>>>
>>>>>  I do not understand this question. By definition, "ghost values" are
>>>>> those not stored in the global vector.
>>>>>
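>>>>>  (If one stays with a vector created by plain VecCreate, the usual
>>>>> alternative is an explicit VecScatter from the global vector into a
>>>>> local work vector.  A rough sketch, reusing 'nghost' and the 'ghosts'
>>>>> index array from the code earlier in the thread:
>>>>>
>>>>>   IS         from;
>>>>>   Vec        xghost;
>>>>>   VecScatter ctx;
>>>>>   ISCreateGeneral(PETSC_COMM_SELF, nghost, &ghosts[0], PETSC_COPY_VALUES, &from);
>>>>>   VecCreateSeq(PETSC_COMM_SELF, nghost, &xghost);
>>>>>   VecScatterCreate(x, from, xghost, PETSC_NULL, &ctx);
>>>>>   VecScatterBegin(ctx, x, xghost, INSERT_VALUES, SCATTER_FORWARD);
>>>>>   VecScatterEnd  (ctx, x, xghost, INSERT_VALUES, SCATTER_FORWARD);
>>>>>
>>>>> Afterwards xghost holds the ghost values in the order of 'ghosts'.)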
>>>>>
>>>>>> I have tried to use VecCreateGhost, but for some reason which is
>>>>>> beyond my comprehension, PETSc goes berserk when it reaches the command:
>>>>>> VecCreateGhost(PETSC_COMM_WORLD, n, PETSC_DECIDE, nghost, ifrom, &x)
>>>>>>
>>>>>
>>>>>  I think you can understand that "berserk" tells me absolutely
>>>>> nothing. Error message? Stack trace? Did you try to run an
>>>>> example which uses VecGhost?
>>>>>
>>>>>    Thanks,
>>>>>
>>>>>       Matt
>>>>>
>>>>>
>>>>>> Can anyone help me?  Either how to reach ghost values for a vector
>>>>>> created by VecCreate, or how to use VecCreateGhost properly?
>>>>>>
>>>>>>
>>>>>>   Kind regards,
>>>>>>
>>>>>>   Bojan
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>>  --
>>>>> What most experimenters take for granted before they begin their
>>>>> experiments is infinitely more interesting than any results to which their
>>>>> experiments lead.
>>>>> -- Norbert Wiener