[petsc-users] Problem in MPI communicator

Michele Rosso mrosso at uci.edu
Tue Jun 10 02:19:27 CDT 2014


Barry,

thanks for your reply. I tried what you suggested, but, as you expected, 
it did not solve the problem.
I attached a minimal version of my code. The file "test.f90" contains 
both a main program and a module. The main program is set to run on 8 
processors, but you can easily change the parameters in it. The issue 
seems to affect the call to DMGlobalToLocalBegin() inside my subroutine 
restore_rhs().
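For context, that call is just the usual global-to-local scatter on the 
DMDA. A minimal sketch of the pattern (not the actual restore_rhs() from 
test.f90; the names da, gvec, lvec are placeholders, and the include 
paths are the PETSc 3.4-era ones) is:

      subroutine fill_ghosted(da, gvec, lvec, ierr)
        implicit none
#include <finclude/petscsys.h>
#include <finclude/petscvec.h>
#include <finclude/petscdm.h>
#include <finclude/petscdmda.h>
        DM da
        Vec gvec, lvec
        PetscErrorCode ierr

        ! scatter the global vector into the ghosted local vector;
        ! this is the call reported to fail in restore_rhs()
        call DMGlobalToLocalBegin(da, gvec, INSERT_VALUES, lvec, ierr)
        call DMGlobalToLocalEnd(da, gvec, INSERT_VALUES, lvec, ierr)
      end subroutine fill_ghosted
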
Hope this helps.

Thanks,
Michele

On 06/09/2014 09:06 PM, Barry Smith wrote:
>     This may not be the only problem, but you can never ever ever change PETSC_COMM_WORLD after PetscInitialize(). You need to:
>
>    1) call MPI_Init() yourself first, then
>
>    2) do your renumbering using MPI_COMM_WORLD, not PETSC_COMM_WORLD
>
>    3) set PETSC_COMM_WORLD to your new comm
>
>    4) call PetscInitialize().
>
> Let us know how it goes; if you still get a failure, send the entire .F file so we can run it and track down the issue.
>
>     Barry
>
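
For reference, a minimal sketch of the initialization order Barry 
describes in steps 1-4 above. The renumbering formula and the px/py/pz 
process-grid variables are taken from the snippet quoted further below; 
the hard-coded 2x2x2 grid and the PETSc 3.4-era include path are 
assumptions, and the source would need to be a preprocessed .F90 file:

      program reorder_then_init
        implicit none
#include <finclude/petscsys.h>
        PetscErrorCode ierr
        integer :: rank, newrank, newcomm
        integer :: x, y, z
        integer, parameter :: px = 2, py = 2, pz = 2

        ! 1) initialize MPI ourselves, before PETSc
        call MPI_Init(ierr)

        ! 2) renumber using MPI_COMM_WORLD, not PETSC_COMM_WORLD
        call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
        x = rank/(pz*py)
        y = mod(rank, pz*py)/pz
        z = mod(mod(rank, pz*py), pz)
        newrank = z*py*px + y*px + x
        call MPI_Comm_split(MPI_COMM_WORLD, 1, newrank, newcomm, ierr)

        ! 3) hand the reordered communicator to PETSc while PETSc is
        !    still uninitialized
        PETSC_COMM_WORLD = newcomm

        ! 4) only now initialize PETSc
        call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

        ! ... create the DMDA and do the actual work here ...

        call PetscFinalize(ierr)
        call MPI_Finalize(ierr)
      end program reorder_then_init
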
> On Jun 9, 2014, at 10:56 PM, Michele Rosso <michele.rosso84 at gmail.com> wrote:
>
>> Hi,
>>
>> I am trying to re-number the MPI ranks in order to have the domain decomposition obtained from DMDACreate3d() match the default decomposition provided by MPI_Cart_create(). I followed the method described in the FAQ:
>>
>>         call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>>         call mpi_comm_rank(MPI_COMM_WORLD, rank, ierr)
>>         if(PETSC_COMM_WORLD/=MPI_COMM_WORLD) write(*,*) 'Communicator problem'
>>         x = rank / (pz*py);
>>         y = mod(rank,(pz*py))/pz
>>         z = mod(mod(rank,pz*py),pz)
>>         newrank = z*py*px + y*px + x;
>>         call mpi_comm_split(PETSC_COMM_WORLD, 1, newrank, newcomm, ierr)
>>         PETSC_COMM_WORLD = newcomm
>>
>> I tried to run my code with the new decomposition (it works fine with the standard PETSc decomposition), but I received an error message; I attached the full output. I ran with only one processor to test the setup, and I commented out all the lines where I actually insert/get data into/from the PETSc arrays.
>> Could you please help me fix this?
>>
>> Thanks,
>> Michele
>>
>>
>> <outfile.txt>

-------------- next part --------------
A non-text attachment was scrubbed...
Name: test.f90
Type: text/x-fortran
Size: 10583 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20140610/733b536d/attachment.bin>

