[petsc-users] PCFIELDSPLIT question

Matthew Knepley knepley at gmail.com
Tue Feb 9 09:19:31 CST 2016


On Tue, Feb 9, 2016 at 9:10 AM, Hom Nath Gharti <hng.email at gmail.com> wrote:

> Thank you so much Barry!
>
> For my small test case, with -pc_fieldsplit_block_size 4, the program
> runs, although the answer was not correct. At least now I have
> something to look at. I am using PCFieldSplitSetIS to set the
> fields. Do I still need to use -pc_fieldsplit_block_size?
>

No, this is only for collocated discretizations.
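
Since you define the fields with PCFieldSplitSetIS, those calls alone define
the splits. A minimal sketch, based on the calls in your program below
(gdof0_is and gdof1_is are your index sets):

  call PCSetType(pc,PCFIELDSPLIT,ierr)
  call PCFieldSplitSetIS(pc,'0',gdof0_is,ierr)
  call PCFieldSplitSetIS(pc,'1',gdof1_is,ierr)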


> In my case each grid point may have a different set of variables. For
> example, a grid point in the solid region has displacement and
> gravity potential: ux, uy, uz, and \phi, whereas a grid point in the
> fluid region has scalar potential, pressure, and gravity potential:
> \chi, p, and \phi. The solid-fluid interface has all of them. Do
> you think I can still use PCFIELDSPLIT in this situation?
>

We have examples, like SNES ex62, which do exactly this.

  Thanks,

    Matt


> Best,
> Hom Nath
>
>
>
> On Mon, Feb 8, 2016 at 2:27 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> >
> >    In this case you need to provide two pieces of information to the
> > PCFIELDSPLIT. The first is what we call the "block size", or bs, which
> > is the number of "basic fields" in the problem. For example, if at each
> > grid point you have x-velocity, y-velocity, and pressure, the block size
> > is 3. The second is which "basic fields" belong in each split you want
> > to use.
> >
> >    In your case you can do this with -pc_fieldsplit_block_size 4.
> > (Note that if you use a DM to define your problem, PCFIELDSPLIT will
> > automatically get the bs from the DM, so you do not need to provide it.
> > Similarly, if you set a block size on your Mat, PCFIELDSPLIT will use
> > that, so you don't need to set it.)
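> >
> >    For example, a minimal sketch (A stands for your assembled Mat; set
> > the block size before assembly):
> >
> >       call MatSetBlockSize(A,4,ierr)
> >
> >    and PCFIELDSPLIT will then pick up bs = 4 from the matrix.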
> >
> >    Then use -pc_fieldsplit_0_fields 0,1 to indicate that your first
> > split is the 0 and 1 basic fields, and -pc_fieldsplit_1_fields 2,3 to
> > indicate that the second split is the 2 and 3 basic fields.
> > (By default, if you do not provide this, PCFIELDSPLIT will use bs
> > splits, 4 in your case, one for each basic field.)
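> >
> >    Putting it together, the options would look something like this
> > sketch (the sub-solver choices are placeholders, not recommendations):
> >
> >       -pc_type fieldsplit -pc_fieldsplit_block_size 4 \
> >       -pc_fieldsplit_0_fields 0,1 -pc_fieldsplit_1_fields 2,3 \
> >       -fieldsplit_0_pc_type bjacobi -fieldsplit_1_pc_type jacobi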
> >
> >    Barry
> >
> >> On Feb 8, 2016, at 11:20 AM, Hom Nath Gharti <hng.email at gmail.com> wrote:
> >>
> >> Hi Matt, Hi all,
> >>
> >> I am trying to get a feel for PCFIELDSPLIT by testing on a very small
> >> matrix (attached below). I have 4 degrees of freedom and use 4
> >> processors. I want to split the 4 dofs into two fields, each having two
> >> dofs. I don't know whether this may be a problem for PETSc. When I use
> >> the command:
> >> srun -n 4 ./xtestfs -ksp_monitor -ksp_view
> >>
> >> It runs fine.
> >>
> >> But when I want to use field split options using the command:
> >> srun -n 4 ./xtestfs -ksp_monitor -ksp_type fgmres -pc_type fieldsplit
> >> -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type bjacobi
> >> -fieldsplit_1_pc_type jacobi
> >>
> >> I get the following error message:
> >> [0]PETSC ERROR: Petsc has generated inconsistent data
> >> [0]PETSC ERROR: Unhandled case, must have at least two fields, not 1
> >> [0]PETSC ERROR: See
> >> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
> >> shooting.
> >> [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> >> [0]PETSC ERROR: /tigress/hgharti/test/testfs/./xtestfs on a
> >> arch-linux2-c-debug named tiger-r11n1 by hgharti Mon Feb  8 11:40:03 2016
> >> [0]PETSC ERROR: Configure options
> >> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64/
> >> --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> >> --with-mpiexec=mpiexec --with-debugging=1 --download-scalapack
> >> --download-mumps --download-pastix --download-superlu
> >> --download-superlu_dist --download-metis --download-parmetis
> >> --download-ptscotch --download-hypre
> >> [0]PETSC ERROR: #1 PCFieldSplitSetDefaults() line 469 in
> >> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> >> [0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 486 in
> >> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> >> [0]PETSC ERROR: #3 PCSetUp() line 983 in
> >> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/interface/precon.c
> >> [0]PETSC ERROR: #4 KSPSetUp() line 332 in
> >> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interface/itfunc.c
> >> [0]PETSC ERROR: #5 KSPSolve() line 546 in
> >> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interface/itfunc.c
> >> WARNING! There are options you set that were not used!
> >> WARNING! could be spelling mistake, etc!
> >> Option left: name:-fieldsplit_0_ksp_type value: gmres
> >> Option left: name:-fieldsplit_0_pc_type value: bjacobi
> >> Option left: name:-fieldsplit_1_pc_type value: jacobi
> >> forrtl: error (76): Abort trap signal
> >>
> >> I tried several things but could not fix the problem. Is it a
> >> Fortran problem, or am I doing something wrong? Any suggestions would
> >> be greatly appreciated.
> >> For your information I use:
> >> PETSc-3.6.3
> >> intel-16.0 compiler
> >> intel-mpi-5.1.1
> >>
> >> Program
> >>
> >>
> >> Best,
> >> Hom Nath
> >>
> >> ! simple test case for PCFIELDSPLIT
> >> !-----------------------------------------------------------------------
> >> program testfs
> >> implicit none
> >> #include "petsc/finclude/petscsys.h"
> >> #include "petsc/finclude/petscvec.h"
> >> #include "petsc/finclude/petscvec.h90"
> >> #include "petsc/finclude/petscmat.h"
> >> #include "petsc/finclude/petscksp.h"
> >> #include "petsc/finclude/petscpc.h"
> >> #include "petsc/finclude/petscviewer.h"
> >> #include "petsc/finclude/petscviewer.h90"
> >>
> >> Vec              x,b,u
> >> Mat              A
> >> KSP              ksp
> >> PC               pc
> >> PetscErrorCode   ierr
> >> PetscBool        flg
> >> PetscInt         errcode,its,maxitr,myrank,nproc
> >> PetscInt         nedof,nelmt,ndof,nenod,ngdof,fixdof,nsparse,neq
> >> PetscInt,allocatable  :: krow_sparse(:),ggdof_elmt(:,:),           &
> >> inode_elmt(:,:)
> >>
> >> PetscReal,allocatable :: storef(:,:),storekmat(:,:,:)
> >> PetscInt gdof0(2),gdof1(2)
> >>
> >> ! initialize MPI
> >>
> >> call MPI_INIT(errcode)
> >> if(errcode /= 0)stop 'ERROR: cannot initialize MPI!'
> >> call MPI_COMM_RANK(MPI_COMM_WORLD,myrank,errcode)
> >> if(errcode /= 0)stop 'ERROR: cannot find processor ID'
> >> call MPI_COMM_SIZE(MPI_COMM_WORLD,nproc,errcode)
> >> if(errcode /= 0)stop 'ERROR: cannot find number of processors!'
> >>
> >> ! define some parameters
> >> ! 1D model with 4 elements. Each element consists of 2 nodes. The
> >> ! leftmost node is fixed.
> >> ! 1-----------2-----------3-----------4-----------5
> >> nelmt=1 ! per processor
> >> nenod=2 ! number of nodes per element
> >> nedof=2 ! number of DOFs per element
> >> ndof=2  ! number of DOFs
> >> ngdof=4 ! total number of global DOFs
> >> fixdof=1
> >>
> >> if(myrank==0)then
> >>  neq=1 ! number of equations
> >>  nsparse=1
> >> else
> >>  neq=2 ! number of equations
> >>  nsparse=4
> >> endif
> >>
> >> allocate(storef(nedof,nelmt),storekmat(nedof,nedof,nelmt),        &
> >> ggdof_elmt(nedof,nelmt),inode_elmt(nenod,nelmt),krow_sparse(nsparse))
> >>
> >> storef=0.0_8 ! local RHS vector
> >> storekmat=0.0_8 ! local stiffness matrix
> >> if(myrank==0)then
> >>  krow_sparse(:)=1
> >>  storef(:,1)=(/12.5000_8, 12.5000_8/)
> >>  storekmat(:,:,1) = reshape((/1.2333333332_8, 0.0166666667_8, &
> >>                               0.0166666667_8, 1.2333333332_8/),(/nedof,nedof/))
> >>  inode_elmt(:,1)  = (/1,2/)
> >>  ggdof_elmt(:,1)  = (/0,1/) ! 0 because left node is fixed
> >> elseif(myrank==1)then
> >>  krow_sparse(:)=(/1,1,2,2/)
> >>  storef(:,1)=(/12.5000_8, 12.5000_8/)
> >>  storekmat(:,:,1) = reshape((/1.2333333332_8, 0.0166666667_8, &
> >>                               0.0166666667_8, 1.2333333332_8/),(/nedof,nedof/))
> >>  inode_elmt(:,1)  = (/1,2/)
> >>  ggdof_elmt(:,1)  = (/1,2/)
> >> elseif(myrank==2)then
> >>  krow_sparse(:)=(/1,1,2,2/)
> >>  storef(:,1)=(/12.5000_8, 12.5000_8/)
> >>  storekmat(:,:,1) = reshape((/1.2333333332_8, 0.0166666667_8, &
> >>                               0.0166666667_8, 1.2333333332_8/),(/nedof,nedof/))
> >>  inode_elmt(:,1)  = (/1,2/)
> >>  ggdof_elmt(:,1)  = (/2,3/)
> >> elseif(myrank==3)then
> >>  krow_sparse(:)=(/1,1,2,2/)
> >>  storef(:,1)=(/12.5000_8, 22.5000_8/)
> >>  storekmat(:,:,1) = reshape((/1.2333333332_8, 0.0166666667_8, &
> >>                               0.0166666667_8, 1.2333333332_8/),(/nedof,nedof/))
> >>  inode_elmt(:,1)  = (/1,2/)
> >>  ggdof_elmt(:,1)  = (/3,4/)
> >> endif
> >>
> >> call petsc_initialize
> >> call petsc_set_matrix
> >> call petsc_set_vector
> >> call petsc_solve
> >> call petsc_finalize
> >>
> >> ! deallocate
> >> deallocate(storef,storekmat,ggdof_elmt,inode_elmt)
> >> deallocate(krow_sparse)
> >> contains
> >>
> >> !-------------------------------------------------------------------------------
> >> subroutine petsc_initialize()
> >> implicit none
> >> PetscInt :: istart,iend
> >> PetscInt :: nzeros_max,nzeros_min
> >> PetscReal :: atol,dtol,rtol
> >> PetscInt,allocatable :: nzeros(:)
> >> IS gdof0_is,gdof1_is
> >>
> >> call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
> >> call PetscOptionsGetInt(PETSC_NULL_CHARACTER,'-n',ngdof,flg,ierr)
> >>
> >> ! count number of nonzeros per row
> >> allocate(nzeros(neq))
> >> nzeros=0
> >> nzeros(krow_sparse)=nzeros(krow_sparse)+1
> >> nzeros_max=maxval(nzeros)
> >> nzeros_min=minval(nzeros)
> >>
> >> ! create matrix
> >> call MatCreate(PETSC_COMM_WORLD,A,ierr)
> >> call MatSetType(A,MATMPIAIJ,ierr)
> >> CHKERRQ(ierr)
> >> call MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,ngdof,ngdof,ierr)
> >> CHKERRQ(ierr)
> >> ! preallocation
> >> call MatMPIAIJSetPreallocation(A,nzeros_max,PETSC_NULL_INTEGER,nzeros_max,     &
> >> PETSC_NULL_INTEGER,ierr)
> >> CHKERRQ(ierr)
> >>
> >> call MatGetOwnershipRange(A,istart,iend,ierr)
> >> CHKERRQ(ierr)
> >> print*,'ownership:',myrank,istart,iend
> >> ! create vector
> >> call VecCreate(PETSC_COMM_WORLD,x,ierr)
> >> call VecSetSizes(x,PETSC_DECIDE,ngdof,ierr)
> >> call VecSetType(x,VECMPI,ierr)
> >> call VecDuplicate(x,b,ierr)
> >> call VecDuplicate(x,u,ierr)
> >>
> >>
> >> !*******************************************************************************
> >> ! get IS for split fields
> >> gdof0=(/0,1/)
> >> gdof1=(/2,3/)
> >> call ISCreateGeneral(PETSC_COMM_WORLD,2,gdof0,PETSC_COPY_VALUES,gdof0_is,ierr)
> >> CHKERRQ(ierr)
> >> call ISCreateGeneral(PETSC_COMM_WORLD,2,gdof1,PETSC_COPY_VALUES,gdof1_is,ierr)
> >> CHKERRQ(ierr)
> >>
> >> !*******************************************************************************
> >>
> >> ! Create linear solver context
> >> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
> >> call KSPSetOperators(ksp,A,A,ierr)
> >>
> >> call KSPGetPC(ksp,pc,ierr)
> >> CHKERRQ(ierr)
> >>
> >>
> >> !*******************************************************************************
> >> ! split PC
> >> call PCFieldSplitSetIS(pc,"0",gdof0_is,ierr);
> >> CHKERRQ(ierr)
> >> call ISDestroy(gdof0_is,ierr)
> >> CHKERRQ(ierr)
> >> call PCFieldSplitSetIS(pc,"1",gdof1_is,ierr);
> >> CHKERRQ(ierr)
> >> call ISDestroy(gdof1_is,ierr)
> >> CHKERRQ(ierr)
> >>
> >> !*******************************************************************************
> >>
> >> rtol = 1.0d-6
> >> atol = 1.0d-6
> >> dtol = 1.0d10
> >> maxitr = 1000
> >> call KSPSetTolerances(ksp,rtol,atol,dtol,maxitr,ierr)
> >> CHKERRQ(ierr)
> >> call KSPSetFromOptions(ksp,ierr)
> >> end subroutine petsc_initialize
> >>
> >> !-------------------------------------------------------------------------------
> >>
> >> subroutine petsc_set_matrix()
> >>
> >> implicit none
> >> integer :: i,i_elmt,j,ncount
> >> integer :: ggdof(NEDOF),idof(NEDOF),igdof(NEDOF)
> >>
> >> ! Set and assemble matrix.
> >> call MatZeroEntries(A,ierr)
> >> CHKERRQ(ierr)
> >> do i_elmt=1,nelmt
> >>  ggdof(:)=ggdof_elmt(:,i_elmt)
> >>  ggdof(:)=ggdof(:)-1 ! petsc index starts from 0
> >>  ncount=0; idof=-1; igdof=-1
> >>  do i=1,NEDOF
> >>    do j=1,NEDOF
> >>    if(ggdof(i).ge.0.and.ggdof(j).ge.0)then
> >>      call MatSetValues(A,1,ggdof(i),1,ggdof(j),storekmat(i,j,i_elmt), &
> >>      ADD_VALUES,ierr)
> >>      CHKERRQ(ierr)
> >>    endif
> >>    enddo
> >>  enddo
> >> enddo
> >> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)
> >> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)
> >> if(myrank==0)print*,'matrix setting & assembly complete!'
> >>
> >> end subroutine petsc_set_matrix
> >>
> >> !-------------------------------------------------------------------------------
> >>
> >> subroutine petsc_set_vector()
> >> implicit none
> >> PetscScalar      zero
> >> integer :: i,i_elmt,ncount
> >> integer :: ggdof(NEDOF),idof(NEDOF),igdof(NEDOF)
> >>
> >> ! set vector
> >> zero=0.0_8
> >> call VecSet(b,zero,ierr)
> >> do i_elmt=1,nelmt
> >>  ggdof(:)=ggdof_elmt(:,i_elmt)
> >>  ggdof(:)=ggdof(:)-1 ! petsc index starts from 0
> >>  ncount=0; idof=-1; igdof=-1
> >>  do i=1,NEDOF
> >>    if(ggdof(i).ge.0)then
> >>      call VecSetValues(b,1,ggdof(i),storef(i,i_elmt),ADD_VALUES,ierr);
> >>      CHKERRQ(ierr)
> >>    endif
> >>  enddo
> >> enddo
> >>
> >> ! assemble vector
> >> call VecAssemblyBegin(b,ierr)
> >> call VecAssemblyEnd(b,ierr)
> >> if(myrank==0)print*,'vector setting & assembly complete!'
> >>
> >> end subroutine petsc_set_vector
> >>
> >> !-------------------------------------------------------------------------------
> >>
> >> subroutine petsc_solve()
> >> implicit none
> >> PetscInt ireason
> >> PetscScalar a_v(1)
> >> PetscOffset a_i
> >> PetscInt n
> >> PetscReal,allocatable :: sdata(:)
> >>
> >> call VecGetSize(b,n,ierr)
> >> CHKERRQ(ierr)
> >> allocate(sdata(n))
> >> sdata=0.0_8
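> >> ! copy the assembled RHS into an ordinary array for printing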
> >> call VecGetArray(b,a_v,a_i,ierr)
> >> CHKERRQ(ierr)
> >> sdata(1:n)=a_v(a_i+1:a_i+n)
> >> call VecRestoreArray(b,a_v,a_i,ierr)
> >> CHKERRQ(ierr)
> >> print*,'rhs:',sdata
> >>
> >> call KSPSolve(ksp,b,x,ierr)
> >> sdata=0.0_8
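> >> ! copy the solution into an ordinary array for printing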
> >> call VecGetArray(x,a_v,a_i,ierr)
> >> CHKERRQ(ierr)
> >> sdata(1:n)=a_v(a_i+1:a_i+n)
> >> call VecRestoreArray(x,a_v,a_i,ierr)
> >> CHKERRQ(ierr)
> >> print*,'solution:',sdata
> >>
> >> ! Check convergence
> >> call KSPGetConvergedReason(ksp,ireason,ierr)
> >> if(myrank==0)print*,'converges reason',myrank,ireason
> >> call KSPGetIterationNumber(ksp,its,ierr)
> >> if(myrank==0)print*,'Iterations:',its
> >> deallocate(sdata)
> >> end subroutine petsc_solve
> >>
> >> !-------------------------------------------------------------------------------
> >>
> >> subroutine petsc_finalize()
> >> implicit none
> >>
> >> ! Free work space.
> >> call VecDestroy(x,ierr)
> >> call VecDestroy(u,ierr)
> >> call VecDestroy(b,ierr)
> >> call MatDestroy(A,ierr)
> >> call KSPDestroy(ksp,ierr)
> >> call PetscFinalize(ierr)
> >> CHKERRQ(ierr)
> >>
> >> end subroutine petsc_finalize
> >>
> >> !-------------------------------------------------------------------------------
> >>
> >> end program testfs
> >>
> >> On Tue, Feb 2, 2016 at 4:54 PM, Matthew Knepley <knepley at gmail.com> wrote:
> >>> On Tue, Feb 2, 2016 at 3:11 PM, Hom Nath Gharti <hng.email at gmail.com> wrote:
> >>>>
> >>>> Thanks a lot. I will try.
> >>>
> >>>
> >>> Also, if you send a small test case, we can run it and tell you the problem.
> >>>
> >>>  Matt
> >>>
> >>>>
> >>>> Hom Nath
> >>>>
> >>>> On Tue, Feb 2, 2016 at 4:02 PM, Matthew Knepley <knepley at gmail.com> wrote:
> >>>>> On Tue, Feb 2, 2016 at 3:01 PM, Hom Nath Gharti <hng.email at gmail.com> wrote:
> >>>>>>
> >>>>>> Thanks again, Matt. Unfortunately I got the same errors with '0'. I
> >>>>>> think both are valid in Fortran.
> >>>>>
> >>>>>
> >>>>> Then you will have to go into the debugger and see why it's not
> >>>>> creating 4 splits, since this is exactly what it does in our tests.
> >>>>> This is how all the SNES tests that I use work. I am sure it's a
> >>>>> Fortran problem.
> >>>>>
> >>>>>  Thanks,
> >>>>>
> >>>>>     Matt
> >>>>>
> >>>>>>
> >>>>>> Hom Nath
> >>>>>>
> >>>>>> On Tue, Feb 2, 2016 at 3:42 PM, Matthew Knepley <knepley at gmail.com> wrote:
> >>>>>>> On Tue, Feb 2, 2016 at 2:35 PM, Hom Nath Gharti <hng.email at gmail.com> wrote:
> >>>>>>>>
> >>>>>>>> Thank you so much Matt.
> >>>>>>>>
> >>>>>>>> I now tried the following:
> >>>>>>>>
> >>>>>>>> ======================================================
> >>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
> >>>>>>>> call KSPGetPC(ksp,pc,ierr)
> >>>>>>>>
> >>>>>>>> call PCFieldSplitSetIS(pc,"0",gdofu_is,ierr);
> >>>>>>>> call ISDestroy(gdofu_is,ierr)
> >>>>>>>> call PCFieldSplitSetIS(pc,"1",gdofchi_is,ierr);
> >>>>>>>> call ISDestroy(gdofchi_is,ierr)
> >>>>>>>> call PCFieldSplitSetIS(pc,"2",gdofp_is,ierr);
> >>>>>>>> call ISDestroy(gdofp_is,ierr)
> >>>>>>>> call PCFieldSplitSetIS(pc,"3",gdofphi_is,ierr);
> >>>>>>>> call ISDestroy(gdofphi_is,ierr)
> >>>>>>>>
> >>>>>>>> ! Amat changes in each time steps
> >>>>>>>> call KSPSetOperators(ksp,Amat,Amat,ierr) !version >= 3.5.0
> >>>>>>>>
> >>>>>>>> call KSPSolve(ksp,bvec,xvec,ierr)
> >>>>>>>> ======================================================
> >>>>>>>
> >>>>>>>
> >>>>>>> I am guessing that "0" is not a valid string for your Fortran
> >>>>>>> compiler.
> >>>>>>> Are
> >>>>>>> you sure
> >>>>>>> its not single quotes '0'?
> >>>>>>>
> >>>>>>>  Matt
> >>>>>>>
> >>>>>>>>
> >>>>>>>> But I get the following error:
> >>>>>>>>
> >>>>>>>> [0]PETSC ERROR: --------------------- Error Message
> >>>>>>>> --------------------------------------------------------------
> >>>>>>>> [0]PETSC ERROR: Petsc has generated inconsistent data
> >>>>>>>> [0]PETSC ERROR: Unhandled case, must have at least two fields, not 1
> >>>>>>>> [0]PETSC ERROR: See
> >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
> >>>>>>>> shooting.
> >>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
> >>>>>>>> [0]PETSC ERROR:
> >>>>>>>> /tigress/hgharti/gitwork/SPECFEM3D_GLOBEVSPP/./bin/xspecfem3D on a
> >>>>>>>> arch-linux2-c-debug named tiger-r3c1n7 by hgharti Tue Feb  2 15:29:30 2016
> >>>>>>>> [0]PETSC ERROR: Configure options
> >>>>>>>> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64/
> >>>>>>>> --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> >>>>>>>> --with-mpiexec=mpiexec --with-debugging=1
> >>>>>>>> --download-scalapack --download-mumps --download-pastix
> >>>>>>>> --download-superlu --download-superlu_dist --download-metis
> >>>>>>>> --download-parmetis --download-ptscotch --download-hypre
> >>>>>>>> [0]PETSC ERROR: #1 PCFieldSplitSetDefaults() line 469 in
> >>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> >>>>>>>> [0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 486 in
> >>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
> >>>>>>>> [0]PETSC ERROR: #3 PCSetUp() line 983 in
> >>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/interface/precon.c
> >>>>>>>> [0]PETSC ERROR: #4 KSPSetUp() line 332 in
> >>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interface/itfunc.c
> >>>>>>>> [0]PETSC ERROR: #5 KSPSolve() line 546 in
> >>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interface/itfunc.c
> >>>>>>>> forrtl: error (76): Abort trap signal
> >>>>>>>>
> >>>>>>>> Am I missing something?
> >>>>>>>>
> >>>>>>>> Thanks,
> >>>>>>>> Hom Nath
> >>>>>>>>
> >>>>>>>> On Tue, Feb 2, 2016 at 3:24 PM, Matthew Knepley <knepley at gmail.com> wrote:
> >>>>>>>>> On Tue, Feb 2, 2016 at 2:20 PM, Hom Nath Gharti <hng.email at gmail.com> wrote:
> >>>>>>>>>>
> >>>>>>>>>> Hi Matt, hi all,
> >>>>>>>>>>
> >>>>>>>>>> I am trying to use PCFIELDSPLIT for my solver, which consists
> >>>>>>>>>> of 4 different variables, namely, u (displacement vector), \chi
> >>>>>>>>>> (displacement potential), p (pressure), and \phi (gravity
> >>>>>>>>>> potential).
> >>>>>>>>>>
> >>>>>>>>>> My code segment looks like the following:
> >>>>>>>>>> ======================================================
> >>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
> >>>>>>>>>> call KSPGetPC(ksp,pc,ierr)
> >>>>>>>>>> nsplit=4
> >>>>>>>>>> call PCFieldSplitSetBlockSize(pc,nsplit,ierr);
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>> You do not need the statement above.
> >>>>>>>>>
> >>>>>>>>>>
> >>>>>>>>>> call PCFieldSplitSetIS(pc,"0",gdofu_is,ierr);
> >>>>>>>>>> call ISDestroy(gdofu_is,ierr)
> >>>>>>>>>> call PCFieldSplitSetIS(pc,"1",gdofchi_is,ierr);
> >>>>>>>>>> call ISDestroy(gdofchi_is,ierr)
> >>>>>>>>>> call PCFieldSplitSetIS(pc,"2",gdofp_is,ierr);
> >>>>>>>>>> call ISDestroy(gdofp_is,ierr)
> >>>>>>>>>> call PCFieldSplitSetIS(pc,"3",gdofphi_is,ierr);
> >>>>>>>>>> call ISDestroy(gdofphi_is,ierr)
> >>>>>>>>>>
> >>>>>>>>>> call PCFieldSplitGetSubKSP(pc,nsplit,subksp,ierr);
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>> These SetOperators() calls are wrong. I am not sure why you want
> >>>>>>>>> them here. Also, that means you do not need the call above.
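> >>>>>>>>>
> >>>>>>>>> A minimal sketch of the sequence being suggested (using the IS
> >>>>>>>>> names from your code; the sub-solvers are then configured from
> >>>>>>>>> the options database rather than by hand):
> >>>>>>>>>
> >>>>>>>>>   call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
> >>>>>>>>>   call KSPSetOperators(ksp,Amat,Amat,ierr)
> >>>>>>>>>   call KSPGetPC(ksp,pc,ierr)
> >>>>>>>>>   call PCSetType(pc,PCFIELDSPLIT,ierr)
> >>>>>>>>>   call PCFieldSplitSetIS(pc,'0',gdofu_is,ierr)
> >>>>>>>>>   call PCFieldSplitSetIS(pc,'1',gdofchi_is,ierr)
> >>>>>>>>>   call PCFieldSplitSetIS(pc,'2',gdofp_is,ierr)
> >>>>>>>>>   call PCFieldSplitSetIS(pc,'3',gdofphi_is,ierr)
> >>>>>>>>>   call KSPSetFromOptions(ksp,ierr)
> >>>>>>>>>   call KSPSolve(ksp,bvec,xvec,ierr)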
> >>>>>>>>>
> >>>>>>>>>  Thanks,
> >>>>>>>>>
> >>>>>>>>>     Matt
> >>>>>>>>>
> >>>>>>>>>>
> >>>>>>>>>> call KSPSetOperators(subksp(1),Amat,Amat,ierr);
> >>>>>>>>>> call KSPSetOperators(subksp(2),Amat,Amat,ierr);
> >>>>>>>>>> call KSPSetOperators(subksp(3),Amat,Amat,ierr);
> >>>>>>>>>> call KSPSetOperators(subksp(4),Amat,Amat,ierr);
> >>>>>>>>>>
> >>>>>>>>>> call KSPSolve(ksp,bvec,xvec,ierr)
> >>>>>>>>>> ======================================================
> >>>>>>>>>>
> >>>>>>>>>> But I am getting the following error:
> >>>>>>>>>> [79]PETSC ERROR: Null argument, when expecting valid pointer
> >>>>>>>>>> [79]PETSC ERROR: Null Object: Parameter # 1
> >>>>>>>>>> [79]PETSC ERROR: #1 KSPSetOperators() line 536 in
> >>>>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interf
> >>>>>>>>>>
> >>>>>>>>>> It looks like I am doing something wrong in "call
> >>>>>>>>>> KSPSetOperators"
> >>>>>>>>>> but
> >>>>>>>>>> I could not figure that out.
> >>>>>>>>>>
> >>>>>>>>>> Could anybody help me to fix this problem? I looked into almost
> >>>>>>>>>> all related examples but I could not really figure out the
> >>>>>>>>>> correct steps after "call PCFieldSplitSetIS".
> >>>>>>>>>>
> >>>>>>>>>> Any help will be greatly appreciated.
> >>>>>>>>>>
> >>>>>>>>>> Best,
> >>>>>>>>>> Hom nath
> >>>>>>>>>>
> >>>>>>>>>> On Sun, Jan 24, 2016 at 7:14 PM, Hom Nath Gharti <hng.email at gmail.com> wrote:
> >>>>>>>>>>> Thank you so much Matt! I will try.
> >>>>>>>>>>>
> >>>>>>>>>>> Hom Nath
> >>>>>>>>>>>
> >>>>>>>>>>> On Sun, Jan 24, 2016 at 6:26 AM, Matthew Knepley <knepley at gmail.com> wrote:
> >>>>>>>>>>>> On Fri, Jan 22, 2016 at 2:19 PM, Hom Nath Gharti <hng.email at gmail.com> wrote:
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Dear all,
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> I am new to PCFIELDSPLIT.
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> I have a matrix formed using MATMPIAIJ. Is it possible to use
> >>>>>>>>>>>>> PCFIELDSPLIT operations on this type of matrix? Or does it
> >>>>>>>>>>>>> have to be in MATMPIBAIJ or MATNEST format?
> >>>>>>>>>>>>
> >>>>>>>>>>>>
> >>>>>>>>>>>> Yes, you can split AIJ.
> >>>>>>>>>>>>
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> If possible for MATMPIAIJ, could anybody provide me a simple
> >>>>>>>>>>>>> example or a few steps? Variables in the equations are
> >>>>>>>>>>>>> displacement vector, scalar potential, and pressure.
> >>>>>>>>>>>>
> >>>>>>>>>>>>
> >>>>>>>>>>>> If you do not have a collocated discretization, then you have
> >>>>>>>>>>>> to use
> >>>>>>>>>>>>
> >>>>>>>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetIS.html
> >>>>>>>>>>>>
> >>>>>>>>>>>>  Thanks,
> >>>>>>>>>>>>
> >>>>>>>>>>>>     Matt
> >>>>>>>>>>>>
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Thanks for help.
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Hom Nath
> >>>>>>>>>>>>
> >>>>>>>>>
> >>>>>>>
> >>>>>
> >>>
> >
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener