[petsc-users] Questions abt ex22f

TAY wee-beng zonexo at gmail.com
Sun Apr 22 14:30:48 CDT 2012


Hi,

Sorry for the mistake. I tried again, changing to "call VecView(x,0,ierr)"

but still got the same error:

[wtay at hpc12:tutorials]$ ./ex22f
[0]PETSC ERROR: 
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see 
http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to 
find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames 
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] VecView line 735 
/home/wtay/Codes/petsc-3.2-p5/src/vec/vec/interface/vector.c
[0]PETSC ERROR: --------------------- Error Message 
------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: 
------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 5, Sat Oct 29 
13:45:54 CDT 2011
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: 
------------------------------------------------------------------------
[0]PETSC ERROR: ./ex22f on a arch-linu named hpc12 by wtay Sun Apr 22 
21:25:01 2012
[0]PETSC ERROR: Libraries linked from 
/home/wtay/Lib/petsc-3.2-p5_mumps_debug/lib
[0]PETSC ERROR: Configure run at Sun Nov 27 15:39:26 2011
[0]PETSC ERROR: Configure options --with-mpi-dir=/opt/openmpi-1.5.3/ 
--with-blas-lapack-dir=/opt/intel_xe_2011/mkl/lib/intel64/ 
--with-debugging=1 --download-hypre=1 
--prefix=/home/wtay/Lib/petsc-3.2-p5_mumps_debug COPTFLAGS=-O0 
FOPTFLAGS=-O0 --download-mumps=1 --download-parmetis=1 
--download-scalapack=1 --download-blacs=1
[0]PETSC ERROR: 
------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory 
unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------
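[Editor's note: as Jed points out below, VecView needs a PetscViewer argument, so neither "call VecView(x,ierr)" nor "call VecView(x,0,ierr)" is valid from Fortran. A minimal sketch of the presumably intended fragment, assuming the PETSc 3.2 Fortran bindings used in this thread, where PETSC_VIEWER_STDOUT_WORLD is the built-in ASCII viewer on PETSC_COMM_WORLD:]

```fortran
!     Sketch only (assumes PETSc 3.2 Fortran usage from this thread):
!     fetch the solution vector from the KSP, then view it with a real
!     PetscViewer instead of a bare 0 or a missing argument.
      Vec              x
      PetscErrorCode   ierr

      call KSPGetSolution(ksp,x,ierr)
!     VecView takes (Vec, PetscViewer, ierr); PETSC_VIEWER_STDOUT_WORLD
!     prints the vector entries to standard output.
      call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr)
```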

Yours sincerely,

TAY wee-beng


On 22/4/2012 9:20 PM, Jed Brown wrote:
>
> Look at the man page for VecView, it needs a PetscViewer argument.
>
> On Apr 22, 2012 2:19 PM, "TAY wee-beng" <zonexo at gmail.com 
> <mailto:zonexo at gmail.com>> wrote:
>
>     Hi,
>
>     I have attached the ex22f.F file. The changes I added are given in
>     bold:
>
>     ...
>
>     PetscErrorCode   ierr
>           DM               da
>           KSP              ksp
>     *Vec              x,b*
>           external         ComputeRHS,ComputeMatrix
>
>     ....
>
>     call KSPSetUp(ksp,ierr)
>           call KSPSolve(ksp,PETSC_NULL_OBJECT,PETSC_NULL_OBJECT,ierr)
>     *call KSPGetSolution(ksp,x,ierr)
>           call VecView(x,ierr)*
>           call KSPDestroy(ksp,ierr)
>           call DMDestroy(da,ierr)
>
>     The error is:
>
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
>     Violation, probably memory access out of range
>     [0]PETSC ERROR: Try option -start_in_debugger or
>     -on_error_attach_debugger
>     [0]PETSC ERROR: or see
>     http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
>     [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
>     to find memory corruption errors
>     [0]PETSC ERROR: likely location of problem given in stack below
>     [0]PETSC ERROR: ---------------------  Stack Frames
>     ------------------------------------
>     [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>     available,
>     [0]PETSC ERROR:       INSTEAD the line number of the start of the
>     function
>     [0]PETSC ERROR:       is given.
>     [0]PETSC ERROR: [0] VecView line 735
>     /home/wtay/Codes/petsc-3.2-p5/src/vec/vec/interface/vector.c
>     [0]PETSC ERROR: --------------------- Error Message
>     ------------------------------------
>     [0]PETSC ERROR: Signal received!
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 5, Sat Oct 29
>     13:45:54 CDT 2011
>     [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>     [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>     [0]PETSC ERROR: See docs/index.html for manual pages.
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: ./ex22f on a arch-linu named hpc12 by wtay Sun Apr
>     22 21:11:39 2012
>     [0]PETSC ERROR: Libraries linked from
>     /home/wtay/Lib/petsc-3.2-p5_mumps_debug/lib
>     [0]PETSC ERROR: Configure run at Sun Nov 27 15:39:26 2011
>     [0]PETSC ERROR: Configure options
>     --with-mpi-dir=/opt/openmpi-1.5.3/
>     --with-blas-lapack-dir=/opt/intel_xe_2011/mkl/lib/intel64/
>     --with-debugging=1 --download-hypre=1
>     --prefix=/home/wtay/Lib/petsc-3.2-p5_mumps_debug COPTFLAGS=-O0
>     FOPTFLAGS=-O0 --download-mumps=1 --download-parmetis=1
>     --download-scalapack=1 --download-blacs=1
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: User provided function() line 0 in unknown
>     directory unknown file
>     --------------------------------------------------------------------------
>     MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>     with errorcode 59.
>
>     NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>     You may or may not see output from other processes, depending on
>     exactly when Open MPI kills them.
>
>     Yours sincerely,
>
>     TAY wee-beng
>
>
>     On 22/4/2012 9:06 PM, Jed Brown wrote:
>>
>>     Run in a debugger and/or use --with-debugging=1 so that the error
>>     trace has more information. You could also show us the exact code
>>     that you used.
>>
>>     On Apr 22, 2012 2:03 PM, "TAY wee-beng" <zonexo at gmail.com
>>     <mailto:zonexo at gmail.com>> wrote:
>>
>>         Hi,
>>
>>         I added "Vec x,b" after "KSP ksp"
>>         and then "call KSPGetSolution(ksp, x, ierr)"
>>
>>         I wanted to see the output so I added "call VecView(x,ierr)"
>>         but I got this error:
>>
>>         [0]PETSC ERROR:
>>         ------------------------------------------------------------------------
>>         [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
>>         Violation, probably memory access out of range
>>         [0]PETSC ERROR: Try option -start_in_debugger or
>>         -on_error_attach_debugger
>>         [0]PETSC ERROR: or see
>>         http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
>>         [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple
>>         Mac OS X to find memory corruption errors
>>         [0]PETSC ERROR: configure using --with-debugging=yes,
>>         recompile, link, and run
>>         [0]PETSC ERROR: to get more information on the crash.
>>         [0]PETSC ERROR: --------------------- Error Message
>>         ------------------------------------
>>         [0]PETSC ERROR: Signal received!
>>         [0]PETSC ERROR:
>>         ------------------------------------------------------------------------
>>         [0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 5, Sat Oct
>>         29 13:45:54 CDT 2011
>>         [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>         [0]PETSC ERROR: See docs/faq.html for hints about trouble
>>         shooting.
>>         [0]PETSC ERROR: See docs/index.html for manual pages.
>>         [0]PETSC ERROR:
>>         ------------------------------------------------------------------------
>>         [0]PETSC ERROR: ./ex22f on a arch-linu named hpc12 by wtay
>>         Sun Apr 22 21:02:14 2012
>>         [0]PETSC ERROR: Libraries linked from
>>         /home/wtay/Lib/petsc-3.2-p5_mumps_rel/lib
>>         [0]PETSC ERROR: Configure run at Sun Nov 27 15:18:15 2011
>>         [0]PETSC ERROR: Configure options
>>         --with-mpi-dir=/opt/openmpi-1.5.3/
>>         --with-blas-lapack-dir=/opt/intel_xe_2011/mkl/lib/intel64/
>>         --with-debugging=0 --download-hypre=1
>>         --prefix=/home/wtay/Lib/petsc-3.2-p5_mumps_rel COPTFLAGS=-O3
>>         FOPTFLAGS=-O3 --download-mumps=1 --download-parmetis=1
>>         --download-scalapack=1 --download-blacs=1
>>         [0]PETSC ERROR:
>>         ------------------------------------------------------------------------
>>         [0]PETSC ERROR: User provided function() line 0 in unknown
>>         directory unknown file
>>         --------------------------------------------------------------------------
>>         MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>>         with errorcode 59.
>>
>>
>>         Yours sincerely,
>>
>>         TAY wee-beng
>>
>>
>>         On 22/4/2012 2:53 PM, Matthew Knepley wrote:
>>>         On Sun, Apr 22, 2012 at 3:31 AM, TAY wee-beng
>>>         <zonexo at gmail.com <mailto:zonexo at gmail.com>> wrote:
>>>
>>>             Hi,
>>>
>>>             I am using petsc-dev 2012-04-20.
>>>
>>>             Btw, I'm referring to:
>>>
>>>             http://www.mcs.anl.gov/petsc/petsc-dev/src/ksp/ksp/examples/tutorials/ex22f.F.html
>>>
>>>             Part of the code is :
>>>
>>>             call KSPSetFromOptions(ksp,ierr)
>>>                    call KSPSetUp(ksp,ierr)
>>>                    call KSPSolve(ksp,PETSC_NULL_OBJECT,PETSC_NULL_OBJECT,ierr)
>>>                    call KSPDestroy(ksp,ierr)
>>>                    call DMDestroy(da,ierr)
>>>                    call PetscFinalize(ierr)
>>>
>>>
>>>
>>>             Unlike other codes like ex29c or ex45c, there isn't a
>>>             "call KSPGetSolution(ksp,x,ierr)"
>>>
>>>
>>>         You need to declare "Vec x", and then you can call
>>>         KSPGetSolution(ksp, x, ierr)
>>>
>>>            Matt
>>>
>>>             Also I want to add "call VecView(x,ierr)" to print out
>>>             the results, which is usually added after the above.
>>>
>>>             Thank you
>>>
>>>             Yours sincerely,
>>>
>>>             TAY wee-beng
>>>
>>>
>>>             On 22/4/2012 1:14 AM, Matthew Knepley wrote:
>>>>             On Sat, Apr 21, 2012 at 6:31 PM, TAY wee-beng
>>>>             <zonexo at gmail.com <mailto:zonexo at gmail.com>> wrote:
>>>>
>>>>                 Hi,
>>>>
>>>>                 May I know if ex22f is complete? I can't find:
>>>>
>>>>                 call KSPGetSolution(ksp,x,ierr)
>>>>
>>>>                 If I entered it, it says x not found.
>>>>
>>>>
>>>>             This is correct in petsc-dev. What version are you using?
>>>>
>>>>               Thanks,
>>>>
>>>>                 Matt
>>>>
>>>>                 Thank you!
>>>>
>>>>                 -- 
>>>>                 Yours sincerely,
>>>>
>>>>                 TAY wee-beng
>>>>
>>>>
>>>>
>>>>
>>>>             -- 
>>>>             What most experimenters take for granted before they
>>>>             begin their experiments is infinitely more interesting
>>>>             than any results to which their experiments lead.
>>>>             -- Norbert Wiener
>>>
>>>
>>>
>>>
>>>         -- 
>>>         What most experimenters take for granted before they begin
>>>         their experiments is infinitely more interesting than any
>>>         results to which their experiments lead.
>>>         -- Norbert Wiener
>>