[petsc-users] Getting a copy of a PETSC vector into a user sequential vector

Ormiston, Scott J. SJ_Ormiston at UManitoba.ca
Fri Jan 14 14:14:59 CST 2011


We have been trying to use a VecScatter to collect the entries of a PETSc 
solution vector into an ordinary sequential vector, which we need for our 
own post-processing and other calculations.
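
For reference, the kind of gather we are after, condensed into a sketch 
based on the VecScatterCreateToZero manual page (that routine collects a 
whole parallel vector onto rank 0; the names seqAll, gather, and xg below 
are illustrative only and are not from our actual code, which is excerpted 
further down):

   Vec              seqAll
   VecScatter       gather
   PetscScalar, pointer :: xg(:)

!  gather every entry of the parallel vector Vx into a sequential vector
!  that has full length on rank 0 and length zero on all other ranks
   call VecScatterCreateToZero(Vx,gather,seqAll,ierr)
   call VecScatterBegin(gather,Vx,seqAll,INSERT_VALUES,SCATTER_FORWARD,ierr)
   call VecScatterEnd(gather,Vx,seqAll,INSERT_VALUES,SCATTER_FORWARD,ierr)

   if (rank .eq. 0) then
!     copy the gathered values into our own array via the F90 interface
      call VecGetArrayF90(seqAll,xg,ierr)
      do i = 1, m*n
         sol(i) = xg(i)
      end do
      call VecRestoreArrayF90(seqAll,xg,ierr)
   end if

   call VecScatterDestroy(gather,ierr)
   call VecDestroy(seqAll,ierr)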

We were able to get ex15f.F to create and solve a linear system. Then we 
added code from ex30f.F and got that combination to work as well. When we 
put the same code into our own mainline, however, it fails with the 
run-time error message:

  0: Subscript out of range for array xx_v (main-petsc.f95: 1649)
      subscript=261373973, lower bound=1, upper bound=1, dimension=1
  --------------------------------------------------------------------------
  mpiexec has exited due to process rank 0 with PID 23105 on
  node mecfd01 exiting without calling "finalize". This may
  have caused other processes in the application to be
  terminated by signals sent by mpiexec (as reported here).
  --------------------------------------------------------------------------

Any ideas of what might be wrong? We believe the link libraries are the 
same between the two makefiles, and we simply inserted the working 
ex15f/ex30f code into our existing, working mainline.

For reference, an excerpt of the key code segments and the full relevant 
output are appended.

Scott Ormiston
=========================CODE EXCERPT===============================
!...
   REAL*8  sol(NILU) !NILU is the length of the solution vector
!
#include "finclude/petscsys.h"
#include "finclude/petscvec.h"
#include "finclude/petscmat.h"
#include "finclude/petscpc.h"
#include "finclude/petscksp.h"
#include "finclude/petscis.h"

#define xx_a(ib)  xx_v(xx_i + (ib))
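!  F77-style access to the array from VecGetArray(): xx_v(1) is a dummy
!  array and xx_i is the offset PETSc returns, so xx_a(ib) refers to
!  element ib of the vector's underlying storage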
   PetscOffset     xx_i
   PetscScalar     xx_v(1)

#include "finclude/petscvec.h90"
   PetscScalar, pointer ::  xp_v(:)
   Vec             seqV
   VecScatter      scat1
   IS              fromis, tois
   PetscInt        from(12), to(12)

   Vec              Vx,Vb,Vu
   Mat              A
   PC               pc
   KSP              ksp
   PetscScalar      PSv,one,neg_one
   double precision norm,tol
   PetscErrorCode   ierr
   PetscInt         PETSCi,PETSCj,II,JJ,Istart
   PetscInt         Iend,m,n,i1,its,five
   PetscMPIInt      rank
   PetscTruth       user_defined_pc,Pflg

   external SampleShellPCSetUp, SampleShellPCApply, SampleShellPCDestroy

!  Common block to store data for user-provided preconditioner
   common /myshellpc/ diag
   Vec    diag
!
   call PetscInitialize(PETSC_NULL_CHARACTER,ierr)

! ex15f.F code ...
!
         call KSPSolve(ksp,Vb,Vx,ierr)
!
!#######################################################
!-- put the solution from parallel vector vx into sol(*)
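!  from/to describe an identity mapping: 0-based global entry i-1 of Vx
!  is sent to 0-based local entry i-1 of seqV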
   do i=1,m*n
      from(i)=i-1
      to(i)  =i-1
   end do

   call VecCreateSeq(PETSC_COMM_SELF,m*n,seqV,ierr)
   call VecSet(seqV,neg_one,ierr)
   if (rank.eq.0) then
      print *, "seqV OLD"
      call VecView(seqV,PETSC_VIEWER_STDOUT_SELF,ierr)
   end if

   call ISCreateGeneral(PETSC_COMM_SELF,m*n,from,fromis,ierr)
   call ISCreateGeneral(PETSC_COMM_SELF,m*n,to,tois,ierr)

   call VecScatterCreate(Vx,fromis,seqV,tois,scat1,ierr)

   call ISDestroy(fromis,ierr)
   call ISDestroy(tois,ierr)

   call VecScatterBegin(scat1,Vx,seqV,INSERT_VALUES,SCATTER_FORWARD,ierr)
   call VecScatterEnd(scat1,Vx,seqV,INSERT_VALUES,SCATTER_FORWARD,ierr)

   if (rank.eq.0) then
      print *, "seqV NEW"
      call VecView(seqV,PETSC_VIEWER_STDOUT_SELF,ierr)
      !
      !------*.F version of VecGetArray
      call VecGetArray(seqV,xx_v,xx_i,ierr)
      write(6,*) 'after VecGetArray'
      write(6,*) 'n=',n, ' m=',m
      do PETSCi=1,m*n
         xx_a(PETSCi) = 100.0*PETSCi
         sol(i) = xx_a(i) + 1.55
         sol(i) = xx_v( xx_i + (i))
      end do
      call VecRestoreArray(seqV,xx_v,xx_i,ierr)
      !
      !------*.f95 version of VecGetArray
      call VecGetArrayF90(seqV,xp_v,ierr)
      print*,ierr
      do i=1,m*n
         sol(i)= xp_v(i) - 5.0
         xp_v(i) = 100.0*i
      end do
      call VecRestoreArrayF90(seqV,xp_v,ierr)
      print*,'the f95 version of test solution vector is:'
      do i=1,m*n
         print*,sol(i)
      end do
   end if
!...
!  Free work space.  All PETSc objects should be destroyed when they
!  are no longer needed.

         call KSPDestroy(ksp,ierr)
         call VecDestroy(Vu,ierr)
         call VecDestroy(Vx,ierr)
         call VecDestroy(Vb,ierr)
         call MatDestroy(A,ierr)

!  Always call PetscFinalize() before exiting a program.
!
       call PetscFinalize(ierr)

=========================OUTPUT=====================================
KSP Object:
   type: gmres
     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
     GMRES: happy breakdown tolerance 1e-30
   maximum iterations=10000, initial guess is zero
   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
   left preconditioning
   using PRECONDITIONED norm type for convergence test
PC Object:
   type: jacobi
   linear system matrix = precond matrix:
   Matrix Object:
     type=seqaij, rows=12, cols=12
     total: nonzeros=46, allocated nonzeros=60
       not using I-node routines
  seqV OLD
-1
-1
-1
-1
-1
-1
-1
-1
-1
-1
-1
-1
  seqV NEW
1
1
1
1
1
1
1
1
1
1
1
1
  after VecGetArray
  n=            4  m=            3
0: Subscript out of range for array xx_v (main-petsc.f95: 1649)
     subscript=261373973, lower bound=1, upper bound=1, dimension=1
--------------------------------------------------------------------------
mpiexec has exited due to process rank 0 with PID 23105 on
node mecfd01 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpiexec (as reported here).
--------------------------------------------------------------------------
