[petsc-users] Problem in MPI communicator

Michele Rosso michele.rosso84 at gmail.com
Tue Jun 10 16:34:01 CDT 2014


Barry,

thank you for pointing this out. Now it works as expected. Thanks for 
your help.

Michele

On 06/10/2014 11:13 AM, Barry Smith wrote:
>    You have a bug in your code unrelated to the reordering. Note that the variable x is declared in your module and then again inside the subroutine init. Thus, when init does call VecDuplicate(b,x,ierr), it sets the value of the inner x, not the outer x.
>
>     It would be nice if the Fortran compilers warned you about variables like this.
>
>     Barry
>
>
> module poisson_solver
>      implicit none
> #include <finclude/petsc.h90>
>      private
>      Vec                    :: b, x
>      Mat                    :: A
>      KSP                    :: ksp
>      DM                     :: da
>      PetscScalar            :: dxdyqdz, dxdzqdy, dydzqdx
>      integer, parameter     :: dp = kind(0.0D0)
>      logical                :: reset = .false.
>      integer, parameter     :: max_number_iter = 15
>      
>      
>      public init, finalize, solve
>      save   b, x, da, A, ksp, dxdyqdz, dxdzqdy, dydzqdx, reset
>      
> contains
>
>      subroutine init(px,py,pz,nx,ny,nz,nxl,nyl,nzl)
>          integer,     intent(in)   :: px,py,pz,nx,ny,nz,nxl,nyl,nzl
>          PetscInt                  :: nnx(px), nny(py), nnz(pz)
>          PetscErrorCode            :: ierr
>          MatNullSpace              :: nullspace
>          integer                   :: newcomm, rank, newrank, x, y, z   ! this local x shadows the module-level Vec x
>          PetscInt                  :: xs,xm,ys,zs,ym,zm
>          
>          call mpi_comm_rank(MPI_COMM_WORLD, rank, ierr)
> !        if(PETSC_COMM_WORLD/=MPI_COMM_WORLD) write(*,*) 'Communicator problem'
>          x = rank / (pz*py);
>          y = mod(rank,(pz*py))/pz
>          z = mod(mod(rank,pz*py),pz)
>          newrank = z*py*px + y*px + x;
>          call mpi_comm_split(MPI_COMM_WORLD, 1, newrank, newcomm, ierr)
>          PETSC_COMM_WORLD = newcomm
>          call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>          
>          
>          ! Create DM context
>          nnx = nxl
>          nny = nyl
>          nnz = nzl
>          call DMDACreate3d( PETSC_COMM_WORLD ,                                &
>                          & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC,   &
>                          & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR,        &
>                          & nx, ny, nz, px, py, pz, 1, 1, nnx, nny, nnz, da , ierr)
>          call DMDASetInterpolationType(da, DMDA_Q0, ierr);
>          call DMCreateGlobalVector(da,b,ierr) ! Create Global Vectors
>          call VecDuplicate(b,x,ierr)
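>
>    For illustration, a minimal sketch of one possible fix: rename the local indices so they no longer shadow the module-level Vec x (the names ix, iy, iz are only illustrative):
>
>      subroutine init(px,py,pz,nx,ny,nz,nxl,nyl,nzl)
>          integer,     intent(in)   :: px,py,pz,nx,ny,nz,nxl,nyl,nzl
>          PetscErrorCode            :: ierr
>          integer                   :: newcomm, rank, newrank, ix, iy, iz
>
>          call mpi_comm_rank(MPI_COMM_WORLD, rank, ierr)
>          ix = rank / (pz*py)
>          iy = mod(rank, pz*py) / pz
>          iz = mod(mod(rank, pz*py), pz)
>          newrank = iz*py*px + iy*px + ix
>          call mpi_comm_split(MPI_COMM_WORLD, 1, newrank, newcomm, ierr)
>          PETSC_COMM_WORLD = newcomm
>          call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
>          ! ... create the DMDA and the global vector b exactly as before ...
>          call VecDuplicate(b, x, ierr)   ! now duplicates into the module-level Vec x
>      end subroutine init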
>
>
> On Jun 10, 2014, at 2:19 AM, Michele Rosso <mrosso at uci.edu> wrote:
>
>> Barry,
>>
>> thanks for your reply. I tried what you suggested, but, as you expected, it did not solve the problem.
>> I attached a minimal version of my code. The file "test.f90" contains both a main program and a module. The main program is set to run on 8 processors, but you can easily change the parameters in it. It seems the issue affects the
>> DMGlobalToLocalBegin() call in my subroutine restore_rhs().
>> Hope this helps.
>>
>> Thanks,
>> Michele
>>
>> On 06/09/2014 09:06 PM, Barry Smith wrote:
>>>     This may not be the only problem, but you can never ever ever change PETSC_COMM_WORLD after PetscInitialize(). You need to:
>>>
>>>    1) call MPI_Init() your self first then
>>>
>>>    2) do your renumbering using MPI_COMM_WORLD, not PETSC_COMM_WORLD
>>>
>>>    3) set PETSC_COMM_WORLD to your new comm
>>>
>>>    4) call PetscInitialize()  (a minimal sketch of steps 1-4 follows).
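>>>
>>> For illustration only (px = py = pz = 2 and the rank mapping are placeholder values, borrowed from your renumbering code):
>>>
>>>     program reorder_then_init
>>>         implicit none
>>> #include <finclude/petsc.h90>
>>>         PetscErrorCode :: ierr
>>>         integer        :: rank, newrank, newcomm
>>>         integer        :: px, py, pz, ix, iy, iz
>>>
>>>         call MPI_Init(ierr)                                ! 1) initialize MPI yourself
>>>         call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)     ! 2) renumber using MPI_COMM_WORLD
>>>         px = 2; py = 2; pz = 2
>>>         ix = rank / (pz*py)
>>>         iy = mod(rank, pz*py) / pz
>>>         iz = mod(mod(rank, pz*py), pz)
>>>         newrank = iz*py*px + iy*px + ix
>>>         call MPI_Comm_split(MPI_COMM_WORLD, 1, newrank, newcomm, ierr)
>>>         PETSC_COMM_WORLD = newcomm                         ! 3) set PETSC_COMM_WORLD to the new comm
>>>         call PetscInitialize(PETSC_NULL_CHARACTER, ierr)   ! 4) only now initialize PETSc
>>>
>>>         call PetscFinalize(ierr)
>>>         call MPI_Finalize(ierr)
>>>     end program reorder_then_init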
>>>
>>> Let us know how it goes; if you still get a failure, send the entire .F file so we can run it and track down the issue.
>>>
>>>     Barry
>>>
>>> On Jun 9, 2014, at 10:56 PM, Michele Rosso <michele.rosso84 at gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I am trying to re-number the MPI ranks in order to have the domain decomposition obtained from DMDACreate3d() match the default decomposition provided by MPI_Cart_create(). I followed the method described in the FAQ:
>>>>
>>>>         call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>>>>         call mpi_comm_rank(MPI_COMM_WORLD, rank, ierr)
>>>>         if(PETSC_COMM_WORLD/=MPI_COMM_WORLD) write(*,*) 'Communicator problem'
>>>>         x = rank / (pz*py);
>>>>         y = mod(rank,(pz*py))/pz
>>>>         z = mod(mod(rank,pz*py),pz)
>>>>         newrank = z*py*px + y*px + x;
>>>>         call mpi_comm_split(PETSC_COMM_WORLD, 1, newrank, newcomm, ierr)
>>>>         PETSC_COMM_WORLD = newcomm
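>>>>
>>>> (For example, taking px = py = pz = 2 for illustration, rank 3 gives x = 0, y = 1, z = 1, so newrank = 1*4 + 1*2 + 0 = 6.)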
>>>>
>>>> I tried to run my code (it works fine with the standard PETSc decomposition) with the new decomposition, but I received an error message; I attached the full output. I ran with only one processor to test the setup, and I commented out all the lines where I actually insert/get data into/from the PETSc arrays.
>>>> Could you please help me fix this?
>>>>
>>>> Thanks,
>>>> Michele
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> <outfile.txt>
>> <test.f90>


