[petsc-dev] Unevenly distributed MatNest and FieldSplit

Smith, Barry F. bsmith at mcs.anl.gov
Wed Jun 12 05:16:06 CDT 2019


  mpiexec -n <n> ./myprogram <your other options> -log_trace > afile

  grep "\[0\]" afile > process0
  grep "\[1\]" afile > process1 

  paste process0 process1 | more

  For the two processes, pick ones that take the different paths in the code.

  Almost certainly something is defined on a sub-communicator, and the group of processes in that sub-communicator is getting "behind" the other processes, each group caught in its own MPI reduction. Hopefully the above will make clear where the two sets of processes branch.
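
  For reference, a minimal sketch (hypothetical code, not taken from the attached stacks) of the kind of branch meant here: in a debug build PETSc checks that all ranks of a communicator reach MPI_Allreduce() from the same source line, so a rank-dependent branch like the one below should produce the same "called in different locations (code lines) on different processors" error.

    #include <petscvec.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;
      PetscMPIInt    rank;
      Vec            v;
      PetscReal      nrm;
      PetscScalar    sum;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
      ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
      ierr = VecCreateMPI(PETSC_COMM_WORLD, 2, PETSC_DECIDE, &v);CHKERRQ(ierr);
      ierr = VecSet(v, 1.0);CHKERRQ(ierr);
      if (!rank) {
        /* rank 0 reaches a reduction inside VecNorm() ... */
        ierr = VecNorm(v, NORM_2, &nrm);CHKERRQ(ierr);
      } else {
        /* ... while the other ranks reach a different one inside VecSum(),
           so the debug-mode consistency check sees different code lines */
        ierr = VecSum(v, &sum);CHKERRQ(ierr);
      }
      ierr = VecDestroy(&v);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }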

   Good luck,

   Barry


> On Jun 12, 2019, at 1:44 AM, Pierre Jolivet via petsc-dev <petsc-dev at mcs.anl.gov> wrote:
> 
> Hello,
> We are using a SNES to solve a steady-state FSI problem.
> The operator is defined as a MatNest with multiple fields.
> Some submatrices are entirely defined on a subset of processes (but they are still created on the same communicator as the MatNest).
> The preconditioner is defined as a FieldSplit.
> 
> During the first call to KSPSolve within SNESSolve, I’m getting this (with debugging turned on):
> [0]PETSC ERROR: VecGetSubVector() line 1243 in /Users/jolivet/Documents/repositories/petsc/src/vec/vec/interface/rvector.c MPI_Allreduce() called in different locations (code lines) on different processors
> [2]PETSC ERROR: VecNorm_MPI() line 57 in /Users/jolivet/Documents/repositories/petsc/src/vec/vec/impls/mpi/pvec2.c MPI_Allreduce() called in different locations (code lines) on different processors
> [1]PETSC ERROR: VecNorm_MPI() line 57 in /Users/jolivet/Documents/repositories/petsc/src/vec/vec/impls/mpi/pvec2.c MPI_Allreduce() called in different locations (code lines) on different processors
> 
> As you may have guessed, process 0 (resp. 1–2) is where the structure (resp. fluid) is handled.
> I’m attaching both stacks. I don’t see what could trigger such an error from my side, since everything is delegated to PETSc in the SNESSolve.
> Is there an easy way to debug this?
> Is there some way to dump _everything_ related to a KSP (Mat + PC + ISes) for “easier” debugging?
> 
> Thank you,
> Pierre
> 
> <stack_1--2.txt><stack_0.txt>
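
  On the question above about dumping everything the KSP uses: one possible route (a sketch only; A, isU, isP and the file name are placeholders for whatever you pass to KSPSetOperators() and PCFieldSplitSetIS()) is to write the operator and the index sets into a single PETSc binary file, which can later be reloaded with MatLoad()/ISLoad(). There are also -ksp_view_mat style command-line options if those are more convenient.

    #include <petscksp.h>

    /* Sketch: dump the operator and the two fieldsplit index sets into one
       PETSc binary file for offline inspection / reloading.                 */
    static PetscErrorCode DumpKSPPieces(Mat A, IS isU, IS isP, const char *fname)
    {
      PetscErrorCode ierr;
      PetscViewer    viewer;
      Mat            Aaij;

      PetscFunctionBeginUser;
      ierr = PetscViewerBinaryOpen(PetscObjectComm((PetscObject)A), fname,
                                   FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
      /* a MATNEST may not be viewable in binary directly; convert to AIJ first */
      ierr = MatConvert(A, MATAIJ, MAT_INITIAL_MATRIX, &Aaij);CHKERRQ(ierr);
      ierr = MatView(Aaij, viewer);CHKERRQ(ierr);
      ierr = ISView(isU, viewer);CHKERRQ(ierr);
      ierr = ISView(isP, viewer);CHKERRQ(ierr);
      ierr = MatDestroy(&Aaij);CHKERRQ(ierr);
      ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }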


