[petsc-users] error in calling VecGetArrayf90()

Peter Wang pengxwang at hotmail.com
Wed Jan 5 14:21:47 CST 2011


Thanks to all of you.
 
   Yes, Ethan is correct. I just changed the code as shown below, and the result is correct:
!========================
         do i=1,Iend-Istart
              write(6,*)'check xx_v',i,xx_v(i),myid
         enddo
!========================
 Now I understand the main concept of the pointer array: it is local and begins with index 1. If I want to use the entries globally, I should convert the local index to the global one.
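 For example, a minimal sketch of that conversion (iglobal here is only an illustrative PetscInt variable, and Istart/Iend are assumed to come from VecGetOwnershipRange on the same Vec) could be:
!========================
         call VecGetOwnershipRange(x,Istart,Iend,ierr)
         call VecGetArrayF90(x,xx_v,ierr)
         do i=1,Iend-Istart
              iglobal = Istart + i - 1        ! global index of the local entry xx_v(i)
              write(6,*)'global row',iglobal,xx_v(i),myid
         enddo
         call VecRestoreArrayF90(x,xx_v,ierr)
!========================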
 
Thanks again for all the considerate responses. Your advice always clears up the trouble on the way to the final goal. I appreciate it very much.
 
> From: ecoon at lanl.gov
> To: petsc-users at mcs.anl.gov
> Date: Wed, 5 Jan 2011 12:52:20 -0700
> Subject: Re: [petsc-users] error in calling VecGetArrayf90()
> 
> See the below program and output (done with dev, but I don't think this
> has changed). In F90, the local arrays, gotten by VecGetArrayF90, are
> all indexed:
> 
> xx_v(1:(Iend-Istart)). 
> 
> This should really be moot -- It's way easier to follow the example code
> for ex50f90.F and pass their arrays into a subroutine which casts the
> shape in the "PETSc way", indexed from (Istart:Iend-1). 
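> For reference, a minimal sketch of that pattern (the subroutine name and
> argument list below are only illustrative, not copied from ex50f90.F, and it
> assumes the subroutine lives in the same file so the petscdef includes apply):
> 
>       subroutine print_global(xx, Istart, Iend, rank)
>       implicit none
>       PetscInt Istart, Iend, rank
>       PetscScalar xx(Istart:Iend-1)   ! dummy argument re-shaped to use global indices
>       PetscInt i
>       do i = Istart, Iend-1
>          write (*,*) 'global index', i, xx(i), rank
>       enddo
>       end subroutine print_global
> 
> and, in the main program, right after VecGetArrayF90:
> 
>       call print_global(xx_v, Istart, Iend, rank)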
> 
> Ethan
> 
> 
> ---
> 
> 
> #define PETSC_USE_FORTRAN_MODULES 1
> #include "finclude/petscsysdef.h"
> #include "finclude/petscvecdef.h"
> 
> program test
> use petsc
> implicit none
> 
> Vec x
> PetscInt Istart, Iend
> PetscScalar, pointer:: xx_v(:)
> PetscInt i
> PetscErrorCode ierr
> PetscInt twenty
> PetscInt rank
> 
> twenty = 20
> 
> call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
> call mpi_comm_rank(PETSC_COMM_WORLD,rank,ierr)
> call VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, twenty, x, ierr)
> 
> call VecGetArrayF90(x, xx_v, ierr)
> xx_v = rank+1
> call VecRestoreArrayF90(x, xx_v, ierr)
> 
> call VecView(x, PETSC_VIEWER_STDOUT_WORLD, ierr)
> 
> call VecGetOwnershipRange(x, Istart, Iend, ierr)
> write (*,*) 'range on rank', rank, ':', Istart, Iend
> 
> call VecGetArrayF90(x, xx_v, ierr)
> CHKMEMQ
> do i = 1,(Iend-Istart)
> write (*,*) i, xx_v(i), rank
> enddo
> CHKMEMQ
> call VecRestoreArrayF90(x, xx_v, ierr)
> 
> call VecDestroy(x, ierr)
> call PetscFinalize(ierr)
> end program test
> 
> -----
> 
> $> ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec -n 2 ./test
> Vector Object:
> type: mpi
> Process [0]
> 1
> 1
> 1
> 1
> 1
> 1
> 1
> 1
> 1
> 1
> Process [1]
> 2
> 2
> 2
> 2
> 2
> 2
> 2
> 2
> 2
> 2
> range on rank 1 : 10 20
> 1 2.0000000000000000 1
> 2 2.0000000000000000 1
> 3 2.0000000000000000 1
> 4 2.0000000000000000 1
> 5 2.0000000000000000 1
> 6 2.0000000000000000 1
> 7 2.0000000000000000 1
> 8 2.0000000000000000 1
> 9 2.0000000000000000 1
> 10 2.0000000000000000 1
> range on rank 0 : 0 10
> 1 1.00000000000000000 0
> 2 1.00000000000000000 0
> 3 1.00000000000000000 0
> 4 1.00000000000000000 0
> 5 1.00000000000000000 0
> 6 1.00000000000000000 0
> 7 1.00000000000000000 0
> 8 1.00000000000000000 0
> 9 1.00000000000000000 0
> 10 1.00000000000000000 0
> 
> ---
> 
> 
> On Wed, 2011-01-05 at 13:21 -0600, Barry Smith wrote:
> > > I don't think this is correct. You are supposed to use the local indexing for each process. With the strange index starting at 1 instead of 0.
> > 
> > 
> > Barry
> > 
> > 
> > On Jan 5, 2011, at 1:03 PM, Ethan Coon wrote:
> > 
> > > On all processors, the array you get is indexed:
> > > 
> > > xx_v(1:(Iend-Istart))
> > > 
> > > while the Istart and Iend are global indices into the global Vec.
> > > 
> > > It's only by luck that the values on proc 3 are correct (this code
> > > should probably seg fault as it is accessing memory outside of the
> > > bounds of xx_v).
> > > 
> > > To access the array, you'll want:
> > > 
> > > do i=1, (Iend-Istart)
> > > write(6,*)'check xx_v',i,xx_v(i),myid
> > > enddo
> > > 
> > > or (better yet) pass it into a subroutine to get the array indexed
> > > correctly, like demonstrated in the example.
> > > 
> > > Ethan
> > > 
> > > On Wed, 2011-01-05 at 11:23 -0600, Peter Wang wrote:
> > >> Thanks, Satish,
> > >> 
> > >> The index of the array is modified to i+1:
> > >> !===================
> > >> do i=Istart,Iend-1
> > >> write(6,*)'check xx_v',i+1,xx_v(i+1),myid
> > >> enddo 
> > >> !===================
> > >> 
> > >> However, only the elements on the root process (process 0) and the
> > >> last process (process 3) are correct; is there any other logical
> > >> error?
> > >> 
> > >> check xx_v 1 3999.9999999999982 0
> > >> check xx_v 2 3999.9999999999982 0
> > >> check xx_v 3 3999.9999999999982 0
> > >> check xx_v 4 3999.9999999999982 0
> > >> check xx_v 5 3999.9999999999982 0
> > >> check xx_v 6 3000.0000000000005 0
> > >> check xx_v 7 3000.0000000000005 0
> > >> check xx_v 8 2.61360726650019422E-321 1 
> > >> check xx_v 9 7.90505033345994471E-323 1
> > >> check xx_v 10 1.69759663277221785E-312 1
> > >> check xx_v 11 6.16840020108069212E-317 1
> > >> check xx_v 12 6.16840316547456717E-317 1
> > >> check xx_v 13 6.16832658529946177E-317 1
> > >> check xx_v 14 1.99665037664579820E-314 2
> > >> check xx_v 15 6.19784009071943448E-317 2
> > >> check xx_v 16 6.20249221284067566E-317 2
> > >> check xx_v 17 6.20218737433719161E-317 2
> > >> check xx_v 18 6.18236051996958238E-317 2
> > >> check xx_v 19 6.16840316547456717E-317 2
> > >> check xx_v 20 6.18107199676522841E-317 3
> > >> check xx_v 21 0.0000000000000000 3
> > >> check xx_v 22 0.0000000000000000 3
> > >> check xx_v 23 0.0000000000000000 3
> > >> check xx_v 24 0.0000000000000000 3
> > >> check xx_v 25 0.0000000000000000 3
> > >> 
> > >> 
> > >> 
> > >>> Date: Tue, 4 Jan 2011 22:49:50 -0600
> > >>> From: balay at mcs.anl.gov
> > >>> To: petsc-users at mcs.anl.gov
> > >>> Subject: Re: [petsc-users] error in calling VecGetArrayf90()
> > >>> 
> > >>> The global index starts at Istart - but the array index starts at 1 [for the Fortran array].
> > >>> 
> > >>> Satish
> > >>> 
> > >>> On Tue, 4 Jan 2011, Peter Wang wrote:
> > >>> 
> > >>>> 
> > >>>> In the last question, the pointer xx_v holds local data. However, if I
> > >>>> write the values to the monitor or assign them to another array, the
> > >>>> values are incorrect.
> > >>>> 
> > >>>> The portion of the code that displays them on the monitor is as follows:
> > >>>> call MatGetOwnershipRange(A,Istart,Iend,ierr)
> > >>>> call VecGetArrayF90(x,xx_v,ierr) ! Vector x is matched with Matrix A in the same communicator
> > >>>> 
> > >>>> write(*,*)xx_v,myid ! write the whole pointer array at once
> > >>>> 
> > >>>> do i=Istart,Iend-1
> > >>>>   write(6,*)'check xx_v',i,xx_v(i),myid ! write the elements of the array one by one over the local range (Istart to Iend-1)
> > >>>> enddo
> > >>>> 
> > >>>> 
> > >>>> =========The result is as follows: (the values of the elements from 7 to 20 are not correct!!)
> > >>>> 
> > >>>> 3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3000.0000000000005 3000.0000000000005 0
> > >>>> 
> > >>>> 3000.0000000000009 3000.0000000000009 3000.0000000000009 2000.0000000000011 2000.0000000000011 2000.0000000000000 1
> > >>>> 
> > >>>> 2000.0000000000009 2000.0000000000009 1000.0000000000003 1000.0000000000003 1000.0000000000003 999.99999999999989 2
> > >>>> 
> > >>>> 1000.0000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 3
> > >>>> 
> > >>>> 
> > >>>> check xx_v 0 0.0000000000000000 0.0000000000000000 0
> > >>>> check xx_v 1 3999.9999999999982 3999.9999999999982 0
> > >>>> check xx_v 2 3999.9999999999982 3999.9999999999982 0
> > >>>> check xx_v 3 3999.9999999999982 3999.9999999999982 0
> > >>>> check xx_v 4 3999.9999999999982 3999.9999999999982 0
> > >>>> check xx_v 5 3999.9999999999982 3999.9999999999982 0
> > >>>> check xx_v 6 3000.0000000000005 3000.0000000000005 0
> > >>>> check xx_v 7 1.99665037664579820E-314 1.99665037664579820E-314 1
> > >>>> check xx_v 8 2.61360726650019422E-321 2.61360726650019422E-321 1
> > >>>> check xx_v 9 7.90505033345994471E-323 7.90505033345994471E-323 1
> > >>>> check xx_v 10 1.69759663277221785E-312 1.69759663277221785E-312 1
> > >>>> check xx_v 11 6.16846344148335980E-317 6.16846344148335980E-317 1
> > >>>> check xx_v 12 6.16846640587723485E-317 6.16846640587723485E-317 1
> > >>>> check xx_v 13 6.16838982570212945E-317 6.16838982570212945E-317 2
> > >>>> check xx_v 14 1.99665037664579820E-314 1.99665037664579820E-314 2
> > >>>> check xx_v 15 6.19790333112210216E-317 6.19790333112210216E-317 2
> > >>>> check xx_v 16 6.20255545324334334E-317 6.20255545324334334E-317 2
> > >>>> check xx_v 17 6.20225061473985929E-317 6.20225061473985929E-317 2
> > >>>> check xx_v 18 6.18242376037225006E-317 6.18242376037225006E-317 2
> > >>>> check xx_v 19 6.16846640587723485E-317 6.16846640587723485E-317 3
> > >>>> check xx_v 20 6.18113523716789609E-317 6.18113523716789609E-317 3
> > >>>> check xx_v 21 0.0000000000000000 0.0000000000000000 3
> > >>>> check xx_v 22 0.0000000000000000 0.0000000000000000 3
> > >>>> check xx_v 23 0.0000000000000000 0.0000000000000000 3
> > >>>> check xx_v 24 0.0000000000000000 0.0000000000000000 3
> > >>>> 
> > >>>> ======The vector x is :
> > >>>> Process [0]
> > >>>> 4000
> > >>>> 4000
> > >>>> 4000
> > >>>> 4000
> > >>>> 4000
> > >>>> 3000
> > >>>> 3000
> > >>>> Process [1]
> > >>>> 3000
> > >>>> 3000
> > >>>> 3000
> > >>>> 2000
> > >>>> 2000
> > >>>> 2000
> > >>>> Process [2]
> > >>>> 2000
> > >>>> 2000
> > >>>> 1000
> > >>>> 1000
> > >>>> 1000
> > >>>> 1000
> > >>>> Process [3]
> > >>>> 1000
> > >>>> 0
> > >>>> 0
> > >>>> 0
> > >>>> 0
> > >>>> 0
> > >>>> 
> > >>>> 
> > >>>> 
> > >>>> 
> > >>>>> Date: Tue, 4 Jan 2011 17:50:11 -0600
> > >>>>> From: balay at mcs.anl.gov
> > >>>>> To: petsc-users at mcs.anl.gov
> > >>>>> Subject: Re: [petsc-users] error in calling VecGetArrayf90()
> > >>>>> 
> > >>>>> Did you include "finclude/petscvec.h90" in your code - as the example did?
> > >>>>> 
> > >>>>> satish
> > >>>>> 
> > >>>>> On Tue, 4 Jan 2011, Peter Wang wrote:
> > >>>>> 
> > >>>>>> 
> > >>>>>> I am trying to obtain the value of each element of a solution Vector from KSPSolve().
> > >>>>>> 
> > >>>>>> The variables are defined according to the example ex4f90.F in \petsc-3.1-p5\src\snes\examples\tutorials\ as follows,
> > >>>>>> 
> > >>>>>> PetscScalar, pointer :: xx_v(:)
> > >>>>>> 
> > >>>>>> ...
> > >>>>>> call KSPSolve(ksp,b,x,ierr)
> > >>>>>> call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr)
> > >>>>>> 
> > >>>>>> call VecGetArrayF90(x,xx_v,ierr)
> > >>>>>> call VecRestoreArrayF90(x,xx_v,ierr)
> > >>>>>> 
> > >>>>>> ...
> > >>>>>> 
> > >>>>>> But the error keeps coming out when the calls to VecGetArrayF90(x,xx_v,ierr) and VecRestoreArrayF90(x,xx_v,ierr) are not commented out.
> > >>>>>> 
> > >>>>>> 
> > >>>>>> The error information shows:
> > >>>>>> Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> > >>>>>> 
> > >>>>>> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> > >>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> > >>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the function
> > >>>>>> [0]PETSC ERROR: is given.
> > >>>>>> [0]PETSC ERROR: [0] F90Array1dCreate line 52 src/sys/f90-src/f90_cwrap.c
> > >>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> > >>>>>> 
> > >>>>>> I checked the code against the example, but cannot see any difference. I just don't know why the pointer array xx_v doesn't work here. Thanks.
> > >>>>>> 
> > >>>>>> 
> > >>>>> 
> > >>>> 
> > >>> 
> > > 
> > > -- 
> > > -------------------------------------
> > > Ethan Coon
> > > Post-Doctoral Researcher
> > > Mathematical Modeling and Analysis
> > > Los Alamos National Laboratory
> > > 505-665-8289
> > > 
> > > http://www.ldeo.columbia.edu/~ecoon/
> > > -------------------------------------
> > > 
> > 
> 
> -- 
> -------------------------------------
> Ethan Coon
> Post-Doctoral Researcher
> Mathematical Modeling and Analysis
> Los Alamos National Laboratory
> 505-665-8289
> 
> http://www.ldeo.columbia.edu/~ecoon/
> -------------------------------------
> 
 		 	   		  