[petsc-users] How to extract a slice at a given coordinate and view it?

Timothée Nicolas timothee.nicolas at gmail.com
Fri Aug 28 22:15:30 CDT 2015


Hi,

I have been thinking about this problem for several hours and can't find an
efficient solution; however, I imagine this must be possible somehow with
PETSc.

My problem is the following:

I work in 3D (R,Z,Phi), which makes my data quite heavy, and I don't want to
save all the data of all my fields, even just once in a while. Instead, I
would like to save to a binary file a slice at a given angle, say phi=0.

As I could not find whether this is natively possible in PETSc, I considered
creating a second, 2D DMDA, on which I can create 2D vectors and view them
with the binary viewer. So far so good. However, the processor distribution
of the 2D DMDA naturally does not correspond to that of the 3D DMDA. So I was
considering creating global arrays, filling them with the data of the 3D
array at phi=0, then doing an MPI_Allgather so that all the processors have
the information and each of them can fill its part of the 2D PETSc Vec. The
code would be something along the lines of:

  PetscScalar, pointer :: gX2D(:,:,:)
  PetscScalar, pointer :: gX(:,:,:,:)

  ! LocalArray is locally filled
  ! It is transmitted to GlobalArray via MPI_Allgather
  real(8)              :: LocalArray(user%dof,user%mr,user%mz)
  real(8)              :: GlobalArray(user%dof,user%mr,user%mz)

  call DMDAVecGetArrayF90(da_phi0,X2D,gX2D,ierr)
  call DMDAVecGetArrayF90(da,X,gX,ierr)

  do k = user%phis,user%phie
     do j = user%zs,user%ze
        do i = user%rs,user%re
           do l=1,user%dof
              if (k.eq.phi_print) then
                 ! Numbering obtained with DMDAVecGetArrayF90 differs from
                 ! the usual Fortran numbering (indices are global and zero-based)
                 LocalArray(l,i,j) = gX(l-1,i-1,j-1,k-1)
              end if
           end do
        end do
     end do
  end do

  nvals = user%dof*user%rm*user%zm

  ! buffers are real(8), so the matching MPI type is MPI_DOUBLE_PRECISION
  call MPI_Allgather(LocalArray(1,user%rs,user%zs),              &
       &             nvals, MPI_DOUBLE_PRECISION,                &
       &             GlobalArray,                                &
       &             nvals, MPI_DOUBLE_PRECISION,                &
       &             MPI_COMM_WORLD, ierr)

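  ! Fill the 2D Vec through its (zero-based) DMDA array with the gathered data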
  do j = zs2D,ze2D
     do i = rs2D,re2D
        do l=1,user%dof
           gX2D(l-1,i-1,j-1) = GlobalArray(l,i,j)
        end do
     end do
  end do

  call DMDAVecRestoreArrayF90(da_phi0,X2D,gX2D,ierr)
  call DMDAVecRestoreArrayF90(da,X,gX,ierr)
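
For reference, the 2D DMDA da_phi0 and the binary viewer I have in mind would
be set up along these lines (only a sketch; the stencil choice and the file
name 'slice_phi0.bin' are arbitrary placeholders):

  DM             :: da_phi0
  Vec            :: X2D
  PetscViewer    :: viewer

  ! 2D DMDA holding one (R,Z) plane, same dof as the 3D DMDA
  call DMDACreate2d(PETSC_COMM_WORLD,                             &
       &            DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,           &
       &            DMDA_STENCIL_STAR, user%mr, user%mz,          &
       &            PETSC_DECIDE, PETSC_DECIDE, user%dof, 1,      &
       &            PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,       &
       &            da_phi0, ierr)
  call DMCreateGlobalVector(da_phi0, X2D, ierr)

  ! ... fill X2D as above, then dump the slice to disk ...
  call PetscViewerBinaryOpen(PETSC_COMM_WORLD, 'slice_phi0.bin',  &
       &                     FILE_MODE_WRITE, viewer, ierr)
  call VecView(X2D, viewer, ierr)
  call PetscViewerDestroy(viewer, ierr)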

The problem is that MPI_Allgather is not at all that simple. Exchanging
array information is much more complicated than I had anticipated! See
this long post on Stack Overflow:
http://stackoverflow.com/questions/17508647/sending-2d-arrays-in-fortran-with-mpi-gather
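
Just to give an idea, I think the gather alone would have to look something
like the following (packing my local patch into a contiguous buffer and using
MPI_Allgatherv, since in general not every processor even owns the
phi=phi_print plane), and after that I would still need every rank's corner
indices to unpack recvbuf into GlobalArray in the right places:

  real(8), allocatable :: sendbuf(:), recvbuf(:)
  integer, allocatable :: recvcounts(:), displs(:)
  integer              :: nprocs, p
  logical              :: owns_slice

  call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)

  ! only the processors owning the phi=phi_print plane contribute data
  owns_slice = (phi_print >= user%phis .and. phi_print <= user%phie)
  nvals = 0
  if (owns_slice) nvals = user%dof*user%rm*user%zm

  ! pack the local (R,Z) patch contiguously
  allocate(sendbuf(nvals))
  if (owns_slice) then
     sendbuf = reshape(LocalArray(:,user%rs:user%re,user%zs:user%ze), (/nvals/))
  end if

  ! every processor needs to know how much every other one sends
  allocate(recvcounts(nprocs), displs(nprocs))
  call MPI_Allgather(nvals, 1, MPI_INTEGER,                       &
       &             recvcounts, 1, MPI_INTEGER,                  &
       &             MPI_COMM_WORLD, ierr)
  displs(1) = 0
  do p = 2, nprocs
     displs(p) = displs(p-1) + recvcounts(p-1)
  end do

  allocate(recvbuf(sum(recvcounts)))
  call MPI_Allgatherv(sendbuf, nvals, MPI_DOUBLE_PRECISION,       &
       &              recvbuf, recvcounts, displs,                &
       &              MPI_DOUBLE_PRECISION, MPI_COMM_WORLD, ierr)

  ! ... and then unpack recvbuf into GlobalArray, which requires knowing
  ! every rank's (rs,zs,rm,zm) as well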

I could probably get it to work eventually, but it's pretty complicated, and
I was wondering whether there is a simpler alternative that I am not seeing.
Besides, I am concerned about what happens if the number of processors is so
large that the 2D Vec gets fewer than 2 points per processor (I have lots of
points in phi, so this can easily happen); then PETSc would complain.

Does anybody have ideas ?

Best

Timothée