Newbie question about synchronization

Knut Erik Teigen knutert at stud.ntnu.no
Fri Mar 30 01:18:22 CDT 2007


I knew there had to be a simpler solution than mine. 
Thanks a lot, Satish!

On Thu, 2007-03-29 at 09:51 -0500, Satish Balay wrote:
> On Thu, 29 Mar 2007, Knut Erik Teigen wrote:
> 
> > On Thu, 2007-03-29 at 03:42 -0400, Diego Nehab wrote:
> 
> > > So far, using only one process, everything is simple and
> > > works (it took me longer to compile and test MPI and Petsc
> > > than to write code that solves my problem :)).
> > > 
> > > When I move to multiple processes, the solver still works, but I
> > > couldn't figure out how to select one of the processes to
> > > consolidate the solution vector and use it to save a file on disk.
> > > I always get an error of the form
> > > 
> > >     [0]PETSC ERROR: Argument out of range!
> > >     [0]PETSC ERROR: Can only get local values, trying xxx!
> > > 
> > > I assume I must bring the solution vector back from the other
> > > processes, right?
> > > 
> > > 2) If so, how do I do this?
> > If you only want to save the results to disk, you can use the VecView
> > function. Just create a viewer, e.g.
> > PetscViewerASCIIOpen(PETSC_COMM_WORLD,"filename",&viewer)
> > VecView(solution,viewer)
> > You can also output in binary format using PetscViewerBinaryOpen
> > instead. Check the chapter on Viewers in the manual.
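> > For illustration, a minimal C sketch of that approach, assuming a
> > parallel Vec named solution (note the PetscViewerDestroy() calling
> > convention has changed across PETSc versions, so check your manual
> > pages):
> >
> >  /* every process calls this collectively; PETSc gathers the
> >     values internally and process 0 writes the file */
> >  PetscViewer viewer;
> >  PetscViewerASCIIOpen(PETSC_COMM_WORLD,"solution.txt",&viewer);
> >  VecView(solution,viewer);
> >  PetscViewerDestroy(&viewer);
> >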
> > If you need to gather the results on one processor for further
> > computations, I use standard MPI calls, like this (in Fortran):
> > 
> >  call VecGetArray(sol,sol_ptr,sol_i,ierr)
> > 
> >  ! gather solution on process 0
> >  if (rank==0) then
> >    do i=low,high !copy local solution to global solution
> >      global_sol(i)=sol_ptr(sol_i+i)
> >    enddo
> >    do p=1,size-1 !receive local solution from the other processes
> >      call MPI_Recv(local_sol,loc_n,MPI_REAL,p,1,PETSC_COMM_WORLD, &
> >                    istat,mpierr)
> >      do i=1,loc_n !copy local part to correct position in global
> >        global_sol(high+i)=local_sol(i)
> >      enddo
> >      high=high+loc_n !advance past the block just received
> >    enddo
> >  else
> >    do j=1,loc_n
> >      local_sol(j)=sol_ptr(sol_i+j)
> >    enddo
> >    call MPI_Send(local_sol,loc_n,MPI_REAL,0,1,PETSC_COMM_WORLD,mpierr)
> >  endif
> > 
> >  call VecRestoreArray(sol,sol_ptr,sol_i,ierr)
> > 
> >  !copy global solution vector back to grid array
> >  if (rank==0) then
> >    do j=1,jmax
> >      do i=1,imax
> >        T(i,j)=global_sol((j-1)*imax+i)
> >      end do
> >    end do
> >  endif
> > 
> > This is probably not the recommended way of doing things. I'm quite new
> > at using PETSc myself, so if anyone has a better solution, please
> > enlighten me! I should use PETSc data structures for everything, but I'm
> > trying to integrate PETSc into already existing code, so it's not that
> > easy to do.
> 
> VecView() is the correct thing to do for dumping the vec into a file.
> However - if you need all the values of the vec on one proc - then use
> VecScatterCreateToZero()
> 
> http://www-unix.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Vec/VecScatterCreateToZero.html
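>
> For reference, a minimal C sketch of that approach, reusing the sol and
> rank variables from the Fortran snippet above (the argument order of
> VecScatterBegin/End and the Destroy calls differ between PETSc
> versions, so treat this as a sketch and check your installation's man
> pages):
>
>  VecScatter  ctx;
>  Vec         sol_seq;   /* sequential Vec holding all values on rank 0 */
>  PetscScalar *a;
>
>  /* create the scatter context and the destination vector */
>  VecScatterCreateToZero(sol,&ctx,&sol_seq);
>  /* move every value of the parallel vec onto process 0 */
>  VecScatterBegin(ctx,sol,sol_seq,INSERT_VALUES,SCATTER_FORWARD);
>  VecScatterEnd(ctx,sol,sol_seq,INSERT_VALUES,SCATTER_FORWARD);
>  if (rank == 0) {
>    VecGetArray(sol_seq,&a);   /* a[] now holds the global solution */
>    /* ... copy a[] into the grid array, as in the loop above ... */
>    VecRestoreArray(sol_seq,&a);
>  }
>  VecScatterDestroy(&ctx);
>  VecDestroy(&sol_seq);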
> 
> Satish