[petsc-users] fieldsplit recursive

Barry Smith bsmith at mcs.anl.gov
Thu Jun 28 13:55:07 CDT 2012


  Anton,

   This came about because we are now much more pedantic about the block sizes of PETSc objects and do not allow them to be casually changed when they shouldn't be. 

   You can resolve this problem by editing the file src/ksp/pc/impls/fieldsplit/fieldsplit.c. Locate the function 

#undef __FUNCT__  
#define __FUNCT__ "PCApply_FieldSplit"
static PetscErrorCode PCApply_FieldSplit(PC pc,Vec x,Vec y)
{
  PC_FieldSplit     *jac = (PC_FieldSplit*)pc->data;
  PetscErrorCode    ierr;
  PC_FieldSplitLink ilink = jac->head;
  PetscInt          cnt,bs;

  PetscFunctionBegin;

and add the following two lines immediately after PetscFunctionBegin;

  x->map->bs = jac->bs;
  y->map->bs = jac->bs;


Then run make cmake in that directory. 
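For reference, after that edit the beginning of the function should look like this (a sketch combining the snippets above; only the two commented lines are new):

#undef __FUNCT__  
#define __FUNCT__ "PCApply_FieldSplit"
static PetscErrorCode PCApply_FieldSplit(PC pc,Vec x,Vec y)
{
  PC_FieldSplit     *jac = (PC_FieldSplit*)pc->data;
  PetscErrorCode    ierr;
  PC_FieldSplitLink ilink = jac->head;
  PetscInt          cnt,bs;

  PetscFunctionBegin;
  x->map->bs = jac->bs;  /* new: force the incoming vectors to ... */
  y->map->bs = jac->bs;  /* ... carry the fieldsplit block size */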

To resolve this permanently we will need to figure out how to ensure that those inner vectors are created with the correct block size. Would you be willing to share your code with petsc-maint at mcs.anl.gov so that we can reproduce the problem and fix it properly for the long run? (The problem is in PETSc, not in your code.)
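
To illustrate the idea behind such a fix (this is only a sketch of the intent, not the actual change that will go into PETSc), the work vectors would have to carry the fieldsplit block size explicitly when they are created, e.g.

  Vec w;
  ierr = MatGetVecs(pc->pmat,&w,PETSC_NULL);CHKERRQ(ierr);  /* work vector compatible with the preconditioner matrix */
  ierr = VecSetBlockSize(w,jac->bs);CHKERRQ(ierr);          /* make it carry the fieldsplit block size */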

   Barry


   
On Jun 28, 2012, at 10:44 AM, Anton Popov wrote:

> Dear petsc team,
> 
> I'm trying to use a fieldsplit preconditioner for the velocity block of a Stokes system that is itself
> preconditioned by fieldsplit (i.e. recursively).
> 
> Running example 42 from src/ksp/ksp/examples/tutorials with petsc-3.2, as follows:
> 
> mpiexec -np 4 ./ex42 \
> -stokes_ksp_type gcr \
> -stokes_ksp_rtol 1.0e-6 \
> -stokes_pc_type fieldsplit \
> -stokes_pc_fieldsplit_type SCHUR \
> -stokes_pc_fieldsplit_schur_factorization_type UPPER \
> -stokes_fieldsplit_u_ksp_type fgmres \
> -stokes_fieldsplit_u_ksp_rtol 1e-3 \
> -stokes_fieldsplit_u_pc_type fieldsplit \
> -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \
> -stokes_fieldsplit_u_fieldsplit_pc_type ml \
> -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \
> -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \
> -stokes_fieldsplit_p_ksp_type preonly \
> -stokes_fieldsplit_p_pc_type jacobi \
> -stokes_ksp_monitor_blocks \
> -mx 16 \
> -model 3
> 
> gives nice-looking output.
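> 
> For completeness, the same nested ("recursive") fieldsplit setup could also be built up in code; the following is only a rough sketch of the idea (it assumes the outer KSP ksp already has the Stokes operators and the u/p splits defined), not what ex42 actually does:
> 
> PC       pc,pc_u;
> KSP      *subksp;
> PetscInt nsplits;
> 
> ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
> ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr);
> ierr = PCFieldSplitSetType(pc,PC_COMPOSITE_SCHUR);CHKERRQ(ierr);
> ierr = KSPSetUp(ksp);CHKERRQ(ierr);                           /* the sub-KSPs exist only after setup */
> ierr = PCFieldSplitGetSubKSP(pc,&nsplits,&subksp);CHKERRQ(ierr);
> /* subksp[0] is the velocity (u) solve; precondition it by another fieldsplit */
> ierr = KSPGetPC(subksp[0],&pc_u);CHKERRQ(ierr);
> ierr = PCSetType(pc_u,PCFIELDSPLIT);CHKERRQ(ierr);
> ierr = PCFieldSplitSetBlockSize(pc_u,3);CHKERRQ(ierr);
> ierr = PCFieldSplitSetType(pc_u,PC_COMPOSITE_ADDITIVE);CHKERRQ(ierr);
> ierr = PetscFree(subksp);CHKERRQ(ierr);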
> 
> But repeating the same exercise with petsc-3.3, like this (the only difference is factorization -> fact):
> 
> mpiexec -np 4 ./ex42 \
> -stokes_ksp_type gcr \
> -stokes_ksp_rtol 1.0e-6 \
> -stokes_pc_type fieldsplit \
> -stokes_pc_fieldsplit_type SCHUR \
> -stokes_pc_fieldsplit_schur_fact_type UPPER \
> -stokes_fieldsplit_u_ksp_type fgmres \
> -stokes_fieldsplit_u_ksp_rtol 1e-3 \
> -stokes_fieldsplit_u_pc_type fieldsplit \
> -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \
> -stokes_fieldsplit_u_fieldsplit_pc_type ml \
> -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \
> -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \
> -stokes_fieldsplit_p_ksp_type preonly \
> -stokes_fieldsplit_p_pc_type jacobi \
> -stokes_ksp_monitor_blocks \
> -mx 16 \
> -model 3
> 
> fails with the following error:
> 
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Object is in wrong state!
> [0]PETSC ERROR: Blocksize of x vector 1 does not match fieldsplit blocksize 3!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 0, Tue Jun  5 14:20:42 CDT 2012
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: ./ex42 on a int32-deb named mac11-005.geo.uni-mainz.de by anton Thu Jun 28 17:06:53 2012
> [0]PETSC ERROR: Libraries linked from /Users/anton/LIB/petsc-3.3-p0/int32-debug/lib
> [0]PETSC ERROR: Configure run at Tue Jun 12 15:32:21 2012
> [0]PETSC ERROR: Configure options PETSC_DIR=/Users/anton/LIB/petsc-3.3-p0 PETSC_ARCH=int32-debug --download-f-blas-lapack=1 --with-debugging=1 --COPTFLAGS="-g -O0" --FOPTFLAGS="-g -O0" --CXXOPTFLAGS="-g -O0" --with-c++-support=1 --with-fortran=1 --with-fortran-kernels=1 --with-large-file-io=1 --with-mpi-compilers=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-ml=1 --download-hypre=1 --download-blacs=1 --download-scalapack=1 --download-metis=1 --download-parmetis=1 --download-mumps=1 --download-superlu_dist=1
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: PCApply_FieldSplit() line 726 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> [0]PETSC ERROR: PCApply() line 384 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: KSPFGMRESCycle() line 169 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> [0]PETSC ERROR: KSPSolve_FGMRES() line 294 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> [0]PETSC ERROR: KSPSolve() line 446 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: PCApply_FieldSplit_Schur() line 693 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> [0]PETSC ERROR: PCApply() line 384 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: KSPSolve_GCR_cycle() line 47 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gcr/gcr.c
> [0]PETSC ERROR: KSPSolve_GCR() line 117 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gcr/gcr.c
> [0]PETSC ERROR: KSPSolve() line 446 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: solve_stokes_3d_coupled() line 2045 in src/ksp/ksp/examples/tutorials/ex42.c
> [0]PETSC ERROR: main() line 2106 in src/ksp/ksp/examples/tutorials/ex42.c
> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
> 
> A similar error appeared in our own code after upgrading to petsc-3.3; we use similar functionality and options to those shown above.
> 
> Could you please explain this issue? Any advice on how to get rid of the error would also be appreciated.
> 
> Thanks a lot
> 
> Anton


