[petsc-dev] [petsc-users] Recursive fieldsplit

Matthew Knepley knepley at gmail.com
Mon May 20 11:29:58 CDT 2013


On Mon, May 20, 2013 at 11:13 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:

> "Kaus, Boris" <kaus at uni-mainz.de> writes:
>
> > Hi,
> >
> > Example 42 (src/ksp/ksp/examples/tutorials) with recursive fieldsplit
> used to work in petsc 3.2 and petsc 3.3p2 with the patch suggested below by
> Barry.
>
> Damn,
>
>
> https://bitbucket.org/petsc/petsc/commits/ce780c64fa296d753b6ed81263eb2f3164d5f63f
>
> The two reference commits are:
>
>
> https://bitbucket.org/petsc/petsc/commits/5847835f01a83a65f10e398c0972b3c7a1e1c5f4
>
> https://bitbucket.org/petsc/petsc/commits/4442dace165024532baeb2e4a78f8791bee44482
>
> We cannot just force the new block size into the input vector because
> that breaks other tests.  We can fix it with disgusting hacks:
>
>   x_bs = x->map->bs;
>   x->map->bs = jac->bs;
>   VecStrideGatherAll(x,jac->x,INSERT_VALUES);
>   x->map->bs = x_bs;
>
> or we can replace our use of VecStrideGatherAll and VecStrideScatterAll
> with something else (or add an argument to the VecStride functions so
> that we can pass the block size---traditional users would pass
> PETSC_DECIDE).
>
> I don't consider making PetscLayout intentionally mutable acceptable,
> because it is shared in many different ways.


I think we have to allow an argument to the strided operations, but the arg
should be a factor of the bs of the Vec.

   Matt
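
A sketch of what such an interface could look like is below. The routine
VecStrideGatherAllWithBS() is hypothetical (the name, signature, and error
handling are illustrative only, not an existing PETSc API): it takes the block
size explicitly instead of reading it from the Vec's PetscLayout, with
PETSC_DECIDE falling back to today's behaviour, and per Matt's suggestion the
passed bs could additionally be required to divide the Vec's native block size:

  #include <petscvec.h>

  /* Hypothetical strided gather that takes the block size as an argument
     rather than taking it from the Vec's layout; pass PETSC_DECIDE for bs
     to reproduce the current VecStrideGatherAll() behaviour. */
  static PetscErrorCode VecStrideGatherAllWithBS(Vec x,PetscInt bs,Vec y[],InsertMode addv)
  {
    PetscErrorCode    ierr;
    PetscInt          i,j,n,nblocks;
    const PetscScalar *xa;
    PetscScalar       *ya;

    PetscFunctionBegin;
    if (bs == PETSC_DECIDE) {ierr = VecGetBlockSize(x,&bs);CHKERRQ(ierr);}
    ierr = VecGetLocalSize(x,&n);CHKERRQ(ierr);
    if (n % bs) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_ARG_INCOMP,"Local size not divisible by requested block size");
    nblocks = n/bs;
    ierr = VecGetArrayRead(x,&xa);CHKERRQ(ierr);
    for (j=0; j<bs; j++) {                      /* one output vector per field */
      ierr = VecGetArray(y[j],&ya);CHKERRQ(ierr);
      for (i=0; i<nblocks; i++) {
        if (addv == INSERT_VALUES) ya[i]  = xa[i*bs+j];
        else                       ya[i] += xa[i*bs+j];
      }
      ierr = VecRestoreArray(y[j],&ya);CHKERRQ(ierr);
    }
    ierr = VecRestoreArrayRead(x,&xa);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }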


> > mpiexec -np 4 ./ex42 \
> > -stokes_ksp_type gcr \
> > -stokes_ksp_rtol 1.0e-6 \
> > -stokes_pc_type fieldsplit \
> > -stokes_pc_fieldsplit_type SCHUR \
> > -stokes_pc_fieldsplit_schur_factorization_type UPPER \
> > -stokes_fieldsplit_u_ksp_type fgmres \
> > -stokes_fieldsplit_u_ksp_rtol 1e-3 \
> > -stokes_fieldsplit_u_pc_type fieldsplit \
> > -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \
> > -stokes_fieldsplit_u_fieldsplit_pc_type ml \
> > -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \
> > -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \
> > -stokes_fieldsplit_p_ksp_type preonly \
> > -stokes_fieldsplit_p_pc_type jacobi \
> > -stokes_ksp_monitor_blocks \
> > -mx 16 \
> > -model 3
> >
> > It no longer works in petsc 3.4.0. Is this something that can be fixed
> and potentially added to 3.4.1?
> > Our production code relies on similar functionality.
> >
> > thanks a lot!
> >
> > Boris
> >
> >
> >
> > On Jun 28, 2012, at 8:55 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> >>  Anton,
> >>
> >>   This came about because we are now being much more pedantic about the
> blocksizes of PETSc objects and not allowing them to be casually changed
> when they shouldn't be.
> >>
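
(For reference, the check that produces the "does not match fieldsplit
blocksize" error quoted further down is essentially a comparison of the
incoming vector's block size against the one stored in the fieldsplit PC;
paraphrased, not the exact source, it amounts to:

  PetscInt bs;
  ierr = VecGetBlockSize(x,&bs);CHKERRQ(ierr);
  if (bs != jac->bs) SETERRQ2(PETSC_COMM_SELF,PETSC_ERR_ARG_WRONGSTATE,
    "Blocksize of x vector %D does not match fieldsplit blocksize %D",bs,jac->bs);

which is why forcing x->map->bs, as below, silences it.)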
> >>   You can resolve this problem by editing the file
> src/ksp/pc/impls/fieldsplit/fieldsplit.c and locating the function
> >>
> >> #undef __FUNCT__
> >> #define __FUNCT__ "PCApply_FieldSplit"
> >> static PetscErrorCode PCApply_FieldSplit(PC pc,Vec x,Vec y)
> >> {
> >>  PC_FieldSplit     *jac = (PC_FieldSplit*)pc->data;
> >>  PetscErrorCode    ierr;
> >>  PC_FieldSplitLink ilink = jac->head;
> >>  PetscInt          cnt,bs;
> >>
> >>  PetscFunctionBegin;
> >>
> >> and add the two lines right here
> >>
> >>  x->map->bs = jac->bs;
> >>  y->map->bs = jac->bs;
> >>
> >>
> >> then run make cmake in that directory.
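
In context, the beginning of the patched function would then read roughly as
follows (the two added lines simply make the incoming and outgoing vectors
advertise the fieldsplit block size before the strided gathers/scatters run):

  static PetscErrorCode PCApply_FieldSplit(PC pc,Vec x,Vec y)
  {
    PC_FieldSplit     *jac = (PC_FieldSplit*)pc->data;
    PetscErrorCode    ierr;
    PC_FieldSplitLink ilink = jac->head;
    PetscInt          cnt,bs;

    PetscFunctionBegin;
    /* workaround: force the fieldsplit block size onto the vectors */
    x->map->bs = jac->bs;
    y->map->bs = jac->bs;
    ...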
> >>
> >> To resolve this permanently we will need to figure out how to ensure
> those inner vectors are created with the correct block size. Are you
> willing to share your code with petsc-maint at mcs.anl.gov so that we can
> reproduce the problem and fix it properly for the long run? (The problem is
> in PETSc, not in your code).
> >>
> >>   Barry
> >>
> >>
> >>
> >> On Jun 28, 2012, at 10:44 AM, Anton Popov wrote:
> >>
> >>> Dear petsc team,
> >>>
> >>> I'm trying to use a fieldsplit preconditioner for the velocity block of
> the Stokes system, which is itself preconditioned by
> >>> fieldsplit (i.e. recursively).
> >>>
> >>> Running example 42 from src/ksp/ksp/examples/tutorials with petsc-3.2,
> as follows:
> >>>
> >>> mpiexec -np 4 ./ex42 \
> >>> -stokes_ksp_type gcr \
> >>> -stokes_ksp_rtol 1.0e-6 \
> >>> -stokes_pc_type fieldsplit \
> >>> -stokes_pc_fieldsplit_type SCHUR \
> >>> -stokes_pc_fieldsplit_schur_factorization_type UPPER \
> >>> -stokes_fieldsplit_u_ksp_type fgmres \
> >>> -stokes_fieldsplit_u_ksp_rtol 1e-3 \
> >>> -stokes_fieldsplit_u_pc_type fieldsplit \
> >>> -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \
> >>> -stokes_fieldsplit_u_fieldsplit_pc_type ml \
> >>> -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \
> >>> -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \
> >>> -stokes_fieldsplit_p_ksp_type preonly \
> >>> -stokes_fieldsplit_p_pc_type jacobi \
> >>> -stokes_ksp_monitor_blocks \
> >>> -mx 16 \
> >>> -model 3
> >>>
> >>> gives nice-looking output.
> >>>
> >>> But! Repeating the same exercise with petsc-3.3, like this (actually,
> there is only one difference: factorization -> fact):
> >>>
> >>> mpiexec -np 4 ./ex42 \
> >>> -stokes_ksp_type gcr \
> >>> -stokes_ksp_rtol 1.0e-6 \
> >>> -stokes_pc_type fieldsplit \
> >>> -stokes_pc_fieldsplit_type SCHUR \
> >>> -stokes_pc_fieldsplit_schur_fact_type UPPER \
> >>> -stokes_fieldsplit_u_ksp_type fgmres \
> >>> -stokes_fieldsplit_u_ksp_rtol 1e-3 \
> >>> -stokes_fieldsplit_u_pc_type fieldsplit \
> >>> -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \
> >>> -stokes_fieldsplit_u_fieldsplit_pc_type ml \
> >>> -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \
> >>> -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \
> >>> -stokes_fieldsplit_p_ksp_type preonly \
> >>> -stokes_fieldsplit_p_pc_type jacobi \
> >>> -stokes_ksp_monitor_blocks \
> >>> -mx 16 \
> >>> -model 3
> >>>
> >>> curses me harshly by claiming:
> >>>
> >>> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> >>> [0]PETSC ERROR: Object is in wrong state!
> >>> [0]PETSC ERROR: Blocksize of x vector 1 does not match fieldsplit
> blocksize 3!
> >>> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> >>> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 0, Tue Jun  5
> 14:20:42 CDT 2012
> >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> >>> [0]PETSC ERROR: See docs/index.html for manual pages.
> >>> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> >>> [0]PETSC ERROR: ./ex42 on a int32-deb named mac11-005.geo.uni-mainz.de by anton Thu Jun 28 17:06:53 2012
> >>> [0]PETSC ERROR: Libraries linked from
> /Users/anton/LIB/petsc-3.3-p0/int32-debug/lib
> >>> [0]PETSC ERROR: Configure run at Tue Jun 12 15:32:21 2012
> >>> [0]PETSC ERROR: Configure options
> PETSC_DIR=/Users/anton/LIB/petsc-3.3-p0 PETSC_ARCH=int32-debug
> --download-f-blas-lapack=1 --with-debugging=1 --COPTFLAGS="-g -O0"
> --FOPTFLAGS="-g -O0" --CXXOPTFLAGS="-g -O0" --with-c++-support=1
> --with-fortran=1 --with-fortran-kernels=1 --with-large-file-io=1
> --with-mpi-compilers=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> --download-ml=1 --download-hypre=1 --download-blacs=1
> --download-scalapack=1 --download-metis=1 --download-parmetis=1
> --download-mumps=1 --download-superlu_dist=1
> >>> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> >>> [0]PETSC ERROR: PCApply_FieldSplit() line 726 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> >>> [0]PETSC ERROR: PCApply() line 384 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/interface/precon.c
> >>> [0]PETSC ERROR: KSPFGMRESCycle() line 169 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> >>> [0]PETSC ERROR: KSPSolve_FGMRES() line 294 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
> >>> [0]PETSC ERROR: KSPSolve() line 446 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/interface/itfunc.c
> >>> [0]PETSC ERROR: PCApply_FieldSplit_Schur() line 693 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> >>> [0]PETSC ERROR: PCApply() line 384 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/interface/precon.c
> >>> [0]PETSC ERROR: KSPSolve_GCR_cycle() line 47 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gcr/gcr.c
> >>> [0]PETSC ERROR: KSPSolve_GCR() line 117 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gcr/gcr.c
> >>> [0]PETSC ERROR: KSPSolve() line 446 in
> /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/interface/itfunc.c
> >>> [0]PETSC ERROR: solve_stokes_3d_coupled() line 2045 in
> src/ksp/ksp/examples/tutorials/ex42.c
> >>> [0]PETSC ERROR: main() line 2106 in
> src/ksp/ksp/examples/tutorials/ex42.c
> >>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
> >>>
> >>> A similar error appeared in our code after upgrading to petsc-3.3; we're
> using similar functionality and options to what I posted above.
> >>>
> >>> Please explain this issue. Any advice on how to get rid of the error
> would also be appreciated.
> >>>
> >>> Thanks a lot
> >>>
> >>> Anton
> >>
> >
> >
> >
> >
> -----------------------------------------------------------------------------
> > Boris J.P. Kaus
> >
> > Institute of Geosciences,
> > Geocycles Research Center &
> > Center for Computational Sciences.
> > University of Mainz, Mainz, Germany
> > Office:       00-285
> > Tel:          +49.6131.392.4527
> >
> > http://www.geophysik.uni-mainz.de
> >
> -----------------------------------------------------------------------------
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener