[petsc-dev] Unevenly distributed MatNest and FieldSplit

Pierre Jolivet pierre.jolivet at enseeiht.fr
Wed Jun 12 01:44:25 CDT 2019


Hello,
We are using a SNES to solve a steady-state FSI problem.
The operator is defined as a MatNest with multiple fields.
Some submatrices are entirely defined on a subset of processes (but they are still created on the same communicator as the MatNest).
The preconditioner is defined as a FieldSplit.
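
To make the layout concrete, the setup is roughly along the lines of the untested sketch below: every block is created on PETSC_COMM_WORLD, but the structure field only has rows on rank 0 and the fluid field only on ranks 1 and 2, and the two splits are handed to PCFIELDSPLIT through the ISes returned by MatNestGetISs. Block sizes, split names, and the identity fill-in are placeholders (not our actual FSI operator), and error checking is omitted for brevity.

/* minimal sketch of the layout described above; run with 3 MPI processes,
   e.g. mpiexec -n 3 ./sketch -ksp_view */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         sub[4], A;
  IS          is[2];
  KSP         ksp;
  PC          pc;
  Vec         x, b;
  PetscMPIInt rank;
  PetscInt    n0, n1, i, rstart, rend;

  PetscInitialize(&argc, &argv, NULL, NULL); /* error checking omitted for brevity */
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  n0 = (rank == 0) ? 10 : 0; /* structure rows: only on rank 0 */
  n1 = (rank == 0) ? 0 : 10; /* fluid rows: only on ranks 1 and 2 */

  /* every block lives on PETSC_COMM_WORLD, some with zero local rows/columns */
  MatCreateAIJ(PETSC_COMM_WORLD, n0, n0, PETSC_DETERMINE, PETSC_DETERMINE, 1, NULL, 1, NULL, &sub[0]);
  MatCreateAIJ(PETSC_COMM_WORLD, n0, n1, PETSC_DETERMINE, PETSC_DETERMINE, 1, NULL, 1, NULL, &sub[1]);
  MatCreateAIJ(PETSC_COMM_WORLD, n1, n0, PETSC_DETERMINE, PETSC_DETERMINE, 1, NULL, 1, NULL, &sub[2]);
  MatCreateAIJ(PETSC_COMM_WORLD, n1, n1, PETSC_DETERMINE, PETSC_DETERMINE, 1, NULL, 1, NULL, &sub[3]);
  for (PetscInt k = 0; k < 4; ++k) {
    if (k == 0 || k == 3) { /* put something invertible on the diagonal blocks */
      MatGetOwnershipRange(sub[k], &rstart, &rend);
      for (i = rstart; i < rend; ++i) MatSetValue(sub[k], i, i, 2.0, INSERT_VALUES);
    }
    MatAssemblyBegin(sub[k], MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(sub[k], MAT_FINAL_ASSEMBLY);
  }

  MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, sub, &A);
  MatNestGetISs(A, is, NULL); /* global ISes of the two fields */

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);
  PCFieldSplitSetIS(pc, "structure", is[0]);
  PCFieldSplitSetIS(pc, "fluid", is[1]);
  KSPSetFromOptions(ksp);

  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);
  KSPSolve(ksp, b, x); /* in our application, the error below appears during the first KSPSolve inside SNESSolve */

  VecDestroy(&x); VecDestroy(&b); MatDestroy(&A);
  for (PetscInt k = 0; k < 4; ++k) MatDestroy(&sub[k]);
  KSPDestroy(&ksp);
  PetscFinalize();
  return 0;
}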

During the first call to KSPSolve within SNESSolve, I’m getting this (with debugging turned on):
[0]PETSC ERROR: VecGetSubVector() line 1243 in /Users/jolivet/Documents/repositories/petsc/src/vec/vec/interface/rvector.c MPI_Allreduce() called in different locations (code lines) on different processors
[2]PETSC ERROR: VecNorm_MPI() line 57 in /Users/jolivet/Documents/repositories/petsc/src/vec/vec/impls/mpi/pvec2.c MPI_Allreduce() called in different locations (code lines) on different processors
[1]PETSC ERROR: VecNorm_MPI() line 57 in /Users/jolivet/Documents/repositories/petsc/src/vec/vec/impls/mpi/pvec2.c MPI_Allreduce() called in different locations (code lines) on different processors

As you may have guessed, process 0 (resp. 1–2) is where the structure (resp. fluid) is handled.
I’m attaching both stack traces. I don’t see what could trigger such an error on my side, since everything is delegated to PETSc inside SNESSolve.
Is there an easy way to debug this?
Is there some way to dump _everything_ related to a KSP (Mat + PC + ISes) for “easier” debugging?
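For reference, the closest thing I can come up with myself is a hand-rolled dump along the lines of the untested sketch below (DumpKSP is just a hypothetical helper name, the file names are arbitrary, and it only covers the operator and the nest ISes, not the PC internals), but I suspect there is a more systematic, options-based way.

#include <petscksp.h>

/* hypothetical helper (not an existing PETSc routine): dump the KSP operator and the
   field ISes of a 2-field MatNest to binary files for offline inspection;
   error checking omitted for brevity */
static PetscErrorCode DumpKSP(KSP ksp)
{
  Mat         A, Aaij;
  IS          isfield[2];
  PetscViewer viewer;

  KSPGetOperators(ksp, &A, NULL);
  /* assuming MatConvert from MATNEST to MATAIJ works for this operator */
  MatConvert(A, MATAIJ, MAT_INITIAL_MATRIX, &Aaij);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "ksp_operator.dat", FILE_MODE_WRITE, &viewer);
  MatView(Aaij, viewer);
  PetscViewerDestroy(&viewer);
  MatDestroy(&Aaij);

  MatNestGetISs(A, isfield, NULL);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "field_ises.dat", FILE_MODE_WRITE, &viewer);
  for (PetscInt f = 0; f < 2; ++f) {
    IS iscopy;
    ISDuplicate(isfield[f], &iscopy);
    ISToGeneral(iscopy); /* the nest ISes may be strided; a general IS can be viewed in binary */
    ISView(iscopy, viewer);
    ISDestroy(&iscopy);
  }
  PetscViewerDestroy(&viewer);
  return 0;
}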

Thank you,
Pierre

Attachments:
stack_1--2.txt: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20190612/aae46049/attachment.txt>
stack_0.txt: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20190612/aae46049/attachment-0001.txt>

