On Tue, Feb 9, 2016 at 9:10 AM, Hom Nath Gharti <hng.email@gmail.com> wrote:
> Thank you so much, Barry!
>
> For my small test case, with -pc_fieldsplit_block_size 4, the program
> runs, although the answer was not correct. At least now I have
> something to work with. I am using PCFieldSplitSetIS to set the
> fields. Do I still need to use -pc_fieldsplit_block_size?

No, this is only for co-located discretizations.
> In my case each grid point may have a different variable set. For
> example, a grid point in the solid region has displacement and gravity
> potential: ux, uy, uz, and \phi, whereas a grid point in the fluid
> region has scalar potential, pressure, and gravity potential: \chi, p,
> and \phi. The solid-fluid interface has all of them. Do you think I
> can still use PCFIELDSPLIT in this situation?

We have examples, like SNES ex62, which do exactly this.
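The essential pattern is what PCFieldSplitSetIS is for. As a minimal
sketch (every name below is hypothetical), gather on each process the
global dof indices that belong to each physical field (a grid point
that lacks a field simply contributes nothing to that field's IS) and
register one IS per split:

  ! nloc_u / nloc_chi: local dof counts of each field on this rank
  ! gdof_u / gdof_chi: their global indices, gathered from your mesh
  call ISCreateGeneral(PETSC_COMM_WORLD,nloc_u,gdof_u, &
       PETSC_COPY_VALUES,u_is,ierr)
  call PCFieldSplitSetIS(pc,"u",u_is,ierr)
  call ISDestroy(u_is,ierr)
  call ISCreateGeneral(PETSC_COMM_WORLD,nloc_chi,gdof_chi, &
       PETSC_COPY_VALUES,chi_is,ierr)
  call PCFieldSplitSetIS(pc,"chi",chi_is,ierr)
  call ISDestroy(chi_is,ierr)

  Thanks,

     Matt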
> Best,
> Hom Nath
>
> On Mon, Feb 8, 2016 at 2:27 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
>>
>> In this case you need to provide two pieces of information to the
>> PCFIELDSPLIT. The first is what we call the "block size", or bs, which
>> is the number of "basic fields" in the problem. For example, if at
>> each grid point you have x-velocity, y-velocity, and pressure, the
>> block size is 3. The second is that you need to tell PCFIELDSPLIT
>> which "basic fields" are in each split you want to use.
>>
>> In your case you can do this with -pc_fieldsplit_block_size 4. (Note
>> that if you use a DM to define your problem, the PCFIELDSPLIT will
>> automatically get the bs from the DM, so you do not need to provide
>> it. Similarly, if you set a block size on your Mat, the PCFIELDSPLIT
>> will use that, so you don't need to set it.)
>>
>> Then use -pc_fieldsplit_0_fields 0,1 to indicate that your first split
>> contains the 0 and 1 basic fields, and -pc_fieldsplit_1_fields 2,3 to
>> indicate that the second split contains the 2 and 3 basic fields. (By
>> default, if you do not provide this, PCFIELDSPLIT will create bs
>> splits, one for each basic field: 4 in your case.)
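>>
>> Putting these together, a run combining the two splits might look like
>> this (the executable name is just a placeholder):
>>
>>   ./app -ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_block_size 4 \
>>         -pc_fieldsplit_0_fields 0,1 -pc_fieldsplit_1_fields 2,3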
>>
>>    Barry
>>
>>> On Feb 8, 2016, at 11:20 AM, Hom Nath Gharti <hng.email@gmail.com> wrote:
>>>
>>> Hi Matt, hi all,
>>>
>>> I am trying to get a feel for PCFIELDSPLIT by testing it on a very
>>> small matrix (attached below). I have 4 degrees of freedom and use 4
>>> processors. I want to split the 4 dofs into two fields, each having
>>> two dofs. I don't know whether this may be a problem for PETSc. When
>>> I use the command:
>>>
>>>   srun -n 4 ./xtestfs -ksp_monitor -ksp_view
>>>
>>> it runs fine.
>>>
>>> But when I add the field split options with the command:
>>>
>>>   srun -n 4 ./xtestfs -ksp_monitor -ksp_type fgmres -pc_type fieldsplit \
>>>        -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type bjacobi \
>>>        -fieldsplit_1_pc_type jacobi
>>>
>>> I get the following error message:
>>> [0]PETSC ERROR: Petsc has generated inconsistent data
>>> [0]PETSC ERROR: Unhandled case, must have at least two fields, not 1
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
>>> [0]PETSC ERROR: /tigress/hgharti/test/testfs/./xtestfs on a
>>> arch-linux2-c-debug named tiger-r11n1 by hgharti Mon Feb 8 11:40:03 2016
>>> [0]PETSC ERROR: Configure options
>>> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64/
>>> --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
>>> --with-mpiexec=mpiexec --with-debugging=1 --download-scalapack
>>> --download-mumps --download-pastix --download-superlu
>>> --download-superlu_dist --download-metis --download-parmetis
>>> --download-ptscotch --download-hypre
>>> [0]PETSC ERROR: #1 PCFieldSplitSetDefaults() line 469 in
>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>> [0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 486 in
>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>> [0]PETSC ERROR: #3 PCSetUp() line 983 in
>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/interface/precon.c
>>> [0]PETSC ERROR: #4 KSPSetUp() line 332 in
>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interface/itfunc.c
>>> [0]PETSC ERROR: #5 KSPSolve() line 546 in
>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interface/itfunc.c
>>> WARNING! There are options you set that were not used!
>>> WARNING! could be spelling mistake, etc!
>>> Option left: name:-fieldsplit_0_ksp_type value: gmres
>>> Option left: name:-fieldsplit_0_pc_type value: bjacobi
>>> Option left: name:-fieldsplit_1_pc_type value: jacobi
>>> forrtl: error (76): Abort trap signal
>>>
>>> I have tried several things but could not fix the problem. Is it a
>>> Fortran problem, or am I doing something wrong? Any suggestions would
>>> be greatly appreciated.
>>>
>>> For your information, I use:
>>> PETSc-3.6.3
>>> intel-16.0 compiler
>>> intel-mpi-5.1.1
>>>
>>> The program is below.
>>>
>>> Best,
>>> Hom Nath
>>>
>>> ! simple test case for PCFIELDSPLIT
>>> ! -----------------------------------------------------------------------
>>> program testfs
>>> implicit none
>>> #include "petsc/finclude/petscsys.h"
>>> #include "petsc/finclude/petscvec.h"
>>> #include "petsc/finclude/petscvec.h90"
>>> #include "petsc/finclude/petscmat.h"
>>> #include "petsc/finclude/petscksp.h"
>>> #include "petsc/finclude/petscpc.h"
>>> #include "petsc/finclude/petscviewer.h"
>>> #include "petsc/finclude/petscviewer.h90"
>>>
>>> Vec x,b,u
>>> Mat A
>>> KSP ksp
>>> PC pc
>>> PetscErrorCode ierr
>>> PetscBool flg
>>> PetscInt errcode,its,maxitr,myrank,nproc
>>> PetscInt nedof,nelmt,ndof,nenod,ngdof,fixdof,nsparse,neq
>>> PetscInt,allocatable :: krow_sparse(:),ggdof_elmt(:,:), &
>>>                         inode_elmt(:,:)
>>> PetscReal,allocatable :: storef(:,:),storekmat(:,:,:)
>>> PetscInt gdof0(2),gdof1(2)
>>>
>>> ! initialize MPI
>>> call MPI_INIT(errcode)
>>> if(errcode /= 0)stop 'ERROR: cannot initialize MPI!'
>>> call MPI_COMM_RANK(MPI_COMM_WORLD,myrank,errcode)
>>> if(errcode /= 0)stop 'ERROR: cannot find processor ID'
>>> call MPI_COMM_SIZE(MPI_COMM_WORLD,nproc,errcode)
>>> if(errcode /= 0)stop 'ERROR: cannot find number of processors!'
>>>
>>> ! define some parameters
>>> ! 1D model with 4 elements. Each element consists of 2 nodes. Node 0 is fixed.
>>> ! 1-----------2-----------3-----------4-----------5
>>> nelmt=1  ! number of elements per processor
>>> nenod=2  ! number of nodes per element
>>> nedof=2  ! number of DOFs per element
>>> ndof=2   ! number of DOFs
>>> ngdof=4  ! total number of global DOFs
>>> fixdof=1
>>>
>>> if(myrank==0)then
>>>   neq=1  ! number of equations
>>>   nsparse=1
>>> else
>>>   neq=2  ! number of equations
>>>   nsparse=4
>>> endif
>>>
>>> allocate(storef(nedof,nelmt),storekmat(nedof,nedof,nelmt), &
>>>          ggdof_elmt(nedof,nelmt),inode_elmt(nenod,nelmt),krow_sparse(nsparse))
>>>
>>> storef=0.0_8    ! local RHS vector
>>> storekmat=0.0_8 ! local stiffness matrix
>>> if(myrank==0)then
>>>   krow_sparse(:)=1
>>>   storef(:,1)=(/12.5000_8, 12.5000_8/)
>>>   storekmat(:,:,1) = reshape((/1.2333333332_8, 0.0166666667_8, &
>>>        0.0166666667_8, 1.2333333332_8/),(/nedof,nedof/))
>>>   inode_elmt(:,1) = (/1,2/)
>>>   ggdof_elmt(:,1) = (/0,1/) ! 0 because left node is fixed
>>> elseif(myrank==1)then
>>>   krow_sparse(:)=(/1,1,2,2/)
>>>   storef(:,1)=(/12.5000_8, 12.5000_8/)
>>>   storekmat(:,:,1) = reshape((/1.2333333332_8, 0.0166666667_8, &
>>>        0.0166666667_8, 1.2333333332_8/),(/nedof,nedof/))
>>>   inode_elmt(:,1) = (/1,2/)
>>>   ggdof_elmt(:,1) = (/1,2/)
>>> elseif(myrank==2)then
>>>   krow_sparse(:)=(/1,1,2,2/)
>>>   storef(:,1)=(/12.5000_8, 12.5000_8/)
>>>   storekmat(:,:,1) = reshape((/1.2333333332_8, 0.0166666667_8, &
>>>        0.0166666667_8, 1.2333333332_8/),(/nedof,nedof/))
>>>   inode_elmt(:,1) = (/1,2/)
>>>   ggdof_elmt(:,1) = (/2,3/)
>>> elseif(myrank==3)then
>>>   krow_sparse(:)=(/1,1,2,2/)
>>>   storef(:,1)=(/12.5000_8, 22.5000_8/)
>>>   storekmat(:,:,1) = reshape((/1.2333333332_8, 0.0166666667_8, &
>>>        0.0166666667_8, 1.2333333332_8/),(/nedof,nedof/))
>>>   inode_elmt(:,1) = (/1,2/)
>>>   ggdof_elmt(:,1) = (/3,4/)
>>> endif
>>>
>>> call petsc_initialize
>>> call petsc_set_matrix
>>> call petsc_set_vector
>>> call petsc_solve
>>> call petsc_finalize
>>>
>>> ! deallocate
>>> deallocate(storef,storekmat,ggdof_elmt,inode_elmt)
>>> deallocate(krow_sparse)
>>>
>>> contains
>>> !-------------------------------------------------------------------------------
>>> subroutine petsc_initialize()
>>> implicit none
>>> PetscInt :: istart,iend
>>> PetscInt :: nzeros_max,nzeros_min
>>> PetscReal :: atol,dtol,rtol
>>> PetscInt,allocatable :: nzeros(:)
>>> IS gdof0_is,gdof1_is
>>>
>>> call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>>> call PetscOptionsGetInt(PETSC_NULL_CHARACTER,'-n',ngdof,flg,ierr)
>>>
>>> ! count number of nonzeros per row
>>> allocate(nzeros(neq))
>>> nzeros=0
>>> nzeros(krow_sparse)=nzeros(krow_sparse)+1
>>> nzeros_max=maxval(nzeros)
>>> nzeros_min=minval(nzeros)
>>>
>>> ! create matrix
>>> call MatCreate(PETSC_COMM_WORLD,A,ierr)
>>> call MatSetType(A,MATMPIAIJ,ierr)
>>> CHKERRQ(ierr)
>>> call MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,ngdof,ngdof,ierr)
>>> CHKERRQ(ierr)
>>>
>>> ! preallocation
>>> call MatMPIAIJSetPreallocation(A,nzeros_max,PETSC_NULL_INTEGER,nzeros_max, &
>>>      PETSC_NULL_INTEGER,ierr)
>>> CHKERRQ(ierr)
>>>
>>> call MatGetOwnershipRange(A,istart,iend,ierr)
>>> CHKERRQ(ierr)
>>> print*,'ownership:',myrank,istart,iend
>>>
>>> ! create vectors
>>> call VecCreate(PETSC_COMM_WORLD,x,ierr)
>>> call VecSetSizes(x,PETSC_DECIDE,ngdof,ierr)
>>> call VecSetType(x,VECMPI,ierr)
>>> call VecDuplicate(x,b,ierr)
>>> call VecDuplicate(x,u,ierr)
>>>
>>> !*******************************************************************************
>>> ! get IS for split fields
>>> gdof0=(/0,1/)
>>> gdof1=(/2,3/)
>>> call ISCreateGeneral(PETSC_COMM_WORLD,2,gdof0,PETSC_COPY_VALUES,gdof0_is,ierr)
>>> CHKERRQ(ierr)
>>> call ISCreateGeneral(PETSC_COMM_WORLD,2,gdof1,PETSC_COPY_VALUES,gdof1_is,ierr)
>>> CHKERRQ(ierr)
>>> !*******************************************************************************
>>>
>>> ! create linear solver context
>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
>>> call KSPSetOperators(ksp,A,A,ierr)
>>>
>>> call KSPGetPC(ksp,pc,ierr)
>>> CHKERRQ(ierr)
>>>
>>> !*******************************************************************************
>>> ! split PC
>>> call PCFieldSplitSetIS(pc,"0",gdof0_is,ierr)
>>> CHKERRQ(ierr)
>>> call ISDestroy(gdof0_is,ierr)
>>> CHKERRQ(ierr)
>>> call PCFieldSplitSetIS(pc,"1",gdof1_is,ierr)
>>> CHKERRQ(ierr)
>>> call ISDestroy(gdof1_is,ierr)
>>> CHKERRQ(ierr)
>>> !*******************************************************************************
>>>
>>> rtol = 1.0d-6
>>> atol = 1.0d-6
>>> dtol = 1.0d10
>>> maxitr = 1000
>>> call KSPSetTolerances(ksp,rtol,atol,dtol,maxitr,ierr)
>>> CHKERRQ(ierr)
>>> call KSPSetFromOptions(ksp,ierr)
>>> end subroutine petsc_initialize
>>> !-------------------------------------------------------------------------------
>>>
>>> subroutine petsc_set_matrix()
>>> implicit none
>>> integer :: i,i_elmt,j,ncount
>>> integer :: ggdof(NEDOF),idof(NEDOF),igdof(NEDOF)
>>>
>>> ! set and assemble matrix
>>> call MatZeroEntries(A,ierr)
>>> CHKERRQ(ierr)
>>> do i_elmt=1,nelmt
>>>   ggdof(:)=ggdof_elmt(:,i_elmt)
>>>   ggdof(:)=ggdof(:)-1 ! petsc index starts from 0
>>>   ncount=0; idof=-1; igdof=-1
>>>   do i=1,NEDOF
>>>     do j=1,NEDOF
>>>       if(ggdof(i).ge.0.and.ggdof(j).ge.0)then
>>>         call MatSetValues(A,1,ggdof(i),1,ggdof(j),storekmat(i,j,i_elmt), &
>>>              ADD_VALUES,ierr)
>>>         CHKERRQ(ierr)
>>>       endif
>>>     enddo
>>>   enddo
>>> enddo
>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)
>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)
>>> if(myrank==0)print*,'matrix setting & assembly complete!'
>>>
>>> end subroutine petsc_set_matrix
>>> !-------------------------------------------------------------------------------
>>>
>>> subroutine petsc_set_vector()
>>> implicit none
>>> PetscScalar zero
>>> integer :: i,i_elmt,ncount
>>> integer :: ggdof(NEDOF),idof(NEDOF),igdof(NEDOF)
>>>
>>> ! set vector
>>> zero=0.0_8
>>> call VecSet(b,zero,ierr)
>>> do i_elmt=1,nelmt
>>>   ggdof(:)=ggdof_elmt(:,i_elmt)
>>>   ggdof(:)=ggdof(:)-1 ! petsc index starts from 0
>>>   ncount=0; idof=-1; igdof=-1
>>>   do i=1,NEDOF
>>>     if(ggdof(i).ge.0)then
>>>       call VecSetValues(b,1,ggdof(i),storef(i,i_elmt),ADD_VALUES,ierr)
>>>       CHKERRQ(ierr)
>>>     endif
>>>   enddo
>>> enddo
>>>
>>> ! assemble vector
>>> call VecAssemblyBegin(b,ierr)
>>> call VecAssemblyEnd(b,ierr)
>>> if(myrank==0)print*,'vector setting & assembly complete!'
>>>
>>> end subroutine petsc_set_vector
>>> !-------------------------------------------------------------------------------
>>>
>>> subroutine petsc_solve()
>>> implicit none
>>> PetscInt ireason
>>> PetscScalar a_v(1)
>>> PetscOffset a_i
>>> PetscInt n
>>> PetscReal,allocatable :: sdata(:)
>>>
>>> ! VecGetArray exposes only the locally owned entries, so use the
>>> ! local size here rather than VecGetSize
>>> call VecGetLocalSize(b,n,ierr)
>>> CHKERRQ(ierr)
>>> allocate(sdata(n))
>>> sdata=0.0_8
>>> call VecGetArray(b,a_v,a_i,ierr)
>>> CHKERRQ(ierr)
>>> sdata(1:n)=a_v(a_i+1:a_i+n)
>>> call VecRestoreArray(b,a_v,a_i,ierr)
>>> CHKERRQ(ierr)
>>> print*,'rhs:',sdata
>>>
>>> call KSPSolve(ksp,b,x,ierr)
>>> sdata=0.0_8
>>> call VecGetArray(x,a_v,a_i,ierr)
>>> CHKERRQ(ierr)
>>> sdata(1:n)=a_v(a_i+1:a_i+n)
>>> call VecRestoreArray(x,a_v,a_i,ierr) ! must match the VecGetArray on x
>>> CHKERRQ(ierr)
>>> print*,'solution:',sdata
>>>
>>> ! check convergence
>>> call KSPGetConvergedReason(ksp,ireason,ierr)
>>> if(myrank==0)print*,'converged reason',myrank,ireason
>>> call KSPGetIterationNumber(ksp,its,ierr)
>>> if(myrank==0)print*,'Iterations:',its
>>> deallocate(sdata)
>>> end subroutine petsc_solve
>>> !-------------------------------------------------------------------------------
>>>
>>> subroutine petsc_finalize()
>>> implicit none
>>>
>>> ! free work space
>>> call VecDestroy(x,ierr)
>>> call VecDestroy(u,ierr)
>>> call VecDestroy(b,ierr)
>>> call MatDestroy(A,ierr)
>>> call KSPDestroy(ksp,ierr)
>>> call PetscFinalize(ierr)
>>> CHKERRQ(ierr)
>>>
>>> end subroutine petsc_finalize
>>> !-------------------------------------------------------------------------------
>>>
>>> end program testfs
>>>
>>> On Tue, Feb 2, 2016 at 4:54 PM, Matthew Knepley <knepley@gmail.com> wrote:
>>>> On Tue, Feb 2, 2016 at 3:11 PM, Hom Nath Gharti <hng.email@gmail.com> wrote:
>>>>>
>>>>> Thanks a lot. I will try.
>>>>
>>>> Also, if you send a small test case, we can run it and tell you the
>>>> problem.
>>>>
>>>>    Matt
>>>>
>>>>> Hom Nath
>>>>>
>>>>> On Tue, Feb 2, 2016 at 4:02 PM, Matthew Knepley <knepley@gmail.com> wrote:
>>>>>> On Tue, Feb 2, 2016 at 3:01 PM, Hom Nath Gharti <hng.email@gmail.com> wrote:
>>>>>>>
>>>>>>> Thanks again, Matt. Unfortunately I got the same errors with '0'. I
>>>>>>> think both are valid in Fortran.
>>>>>>
>>>>>> Then you will have to go into the debugger and see why it is not
>>>>>> creating 4 splits, since this is exactly what it does in our tests.
>>>>>> This is how all the SNES tests that I use work. I am sure it is a
>>>>>> Fortran problem.
>>>>>>
>>>>>>   Thanks,
>>>>>>
>>>>>>      Matt
>>>>>>
>>>>>>> Hom Nath
>>>>>>>
>>>>>>> On Tue, Feb 2, 2016 at 3:42 PM, Matthew Knepley <knepley@gmail.com> wrote:
>>>>>>>> On Tue, Feb 2, 2016 at 2:35 PM, Hom Nath Gharti <hng.email@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> Thank you so much, Matt.
>>>>>>>>>
>>>>>>>>> I now tried the following:
>>>>>>>>>
>>>>>>>>> ======================================================
>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
>>>>>>>>> call KSPGetPC(ksp,pc,ierr)
>>>>>>>>>
>>>>>>>>> call PCFieldSplitSetIS(pc,"0",gdofu_is,ierr)
>>>>>>>>> call ISDestroy(gdofu_is,ierr)
>>>>>>>>> call PCFieldSplitSetIS(pc,"1",gdofchi_is,ierr)
>>>>>>>>> call ISDestroy(gdofchi_is,ierr)
>>>>>>>>> call PCFieldSplitSetIS(pc,"2",gdofp_is,ierr)
>>>>>>>>> call ISDestroy(gdofp_is,ierr)
>>>>>>>>> call PCFieldSplitSetIS(pc,"3",gdofphi_is,ierr)
>>>>>>>>> call ISDestroy(gdofphi_is,ierr)
>>>>>>>>>
>>>>>>>>> ! Amat changes in each time step
>>>>>>>>> call KSPSetOperators(ksp,Amat,Amat,ierr) ! version >= 3.5.0
>>>>>>>>>
>>>>>>>>> call KSPSolve(ksp,bvec,xvec,ierr)
>>>>>>>>> ======================================================
>>>>>>>>
>>>>>>>> I am guessing that "0" is not a valid string for your Fortran
>>>>>>>> compiler. Are you sure it's not single quotes '0'?
>>>>>>>>
>>>>>>>>    Matt
>>>>>>>>
>>>>>>>>>
>>>>>>>>> But I get the following error:
>>>>>>>>>
>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>>>>>>>> [0]PETSC ERROR: Petsc has generated inconsistent data
>>>>>>>>> [0]PETSC ERROR: Unhandled case, must have at least two fields, not 1
>>>>>>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>>>>>>>> for trouble shooting.
>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015
>>>>>>>>> [0]PETSC ERROR: /tigress/hgharti/gitwork/SPECFEM3D_GLOBEVSPP/./bin/xspecfem3D
>>>>>>>>> on a arch-linux2-c-debug named tiger-r3c1n7 by hgharti Tue Feb 2 15:29:30 2016
>>>>>>>>> [0]PETSC ERROR: Configure options
>>>>>>>>> --with-blas-lapack-dir=/opt/intel/composer_xe_2015.2.164/mkl/lib/intel64/
>>>>>>>>> --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-mpiexec=mpiexec
>>>>>>>>> --with-debugging=1 --download-scalapack --download-mumps --download-pastix
>>>>>>>>> --download-superlu --download-superlu_dist --download-metis
>>>>>>>>> --download-parmetis --download-ptscotch --download-hypre
>>>>>>>>> [0]PETSC ERROR: #1 PCFieldSplitSetDefaults() line 469 in
>>>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>>>>>>> [0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 486 in
>>>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>>>>>>> [0]PETSC ERROR: #3 PCSetUp() line 983 in
>>>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/pc/interface/precon.c
>>>>>>>>> [0]PETSC ERROR: #4 KSPSetUp() line 332 in
>>>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interface/itfunc.c
>>>>>>>>> [0]PETSC ERROR: #5 KSPSolve() line 546 in
>>>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interface/itfunc.c
>>>>>>>>> forrtl: error (76): Abort trap signal
>>>>>>>>>
>>>>>>>>> Am I missing something?
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Hom Nath
>>>>>>>>>
>>>>>>>>> On Tue, Feb 2, 2016 at 3:24 PM, Matthew Knepley <knepley@gmail.com> wrote:
>>>>>>>>>> On Tue, Feb 2, 2016 at 2:20 PM, Hom Nath Gharti <hng.email@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>> Hi Matt, hi all,
>>>>>>>>>>>
>>>>>>>>>>> I am trying to use PCFIELDSPLIT for my solver, which consists of
>>>>>>>>>>> 4 different variables, namely u (displacement vector), \chi
>>>>>>>>>>> (displacement potential), p (pressure), and \phi (gravity
>>>>>>>>>>> potential).
>>>>>>>>>>>
>>>>>>>>>>> My code segment looks like the following:
>>>>>>>>>>> ======================================================
>>>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
>>>>>>>>>>> call KSPGetPC(ksp,pc,ierr)
>>>>>>>>>>> nsplit=4
>>>>>>>>>>> call PCFieldSplitSetBlockSize(pc,nsplit,ierr)
>>>>>>>>>>
>>>>>>>>>> You do not need the statement above.
>>>>>>>>>>
>>>>>>>>>>> call PCFieldSplitSetIS(pc,"0",gdofu_is,ierr)
>>>>>>>>>>> call ISDestroy(gdofu_is,ierr)
>>>>>>>>>>> call PCFieldSplitSetIS(pc,"1",gdofchi_is,ierr)
>>>>>>>>>>> call ISDestroy(gdofchi_is,ierr)
>>>>>>>>>>> call PCFieldSplitSetIS(pc,"2",gdofp_is,ierr)
>>>>>>>>>>> call ISDestroy(gdofp_is,ierr)
>>>>>>>>>>> call PCFieldSplitSetIS(pc,"3",gdofphi_is,ierr)
>>>>>>>>>>> call ISDestroy(gdofphi_is,ierr)
>>>>>>>>>>>
>>>>>>>>>>> call PCFieldSplitGetSubKSP(pc,nsplit,subksp,ierr)
>>>>>>>>>>
>>>>>>>>>> These SetOperators() calls are wrong; I am not sure why you want
>>>>>>>>>> them here. Also, that means you do not need the
>>>>>>>>>> PCFieldSplitGetSubKSP() call above.
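>>>>>>>>>>
>>>>>>>>>> If you do need handles on the split solvers, a minimal sketch of
>>>>>>>>>> the usual order (assuming the outer ksp already has its operators
>>>>>>>>>> set) is to set up first; each sub-KSP then gets its operator from
>>>>>>>>>> the split automatically:
>>>>>>>>>>
>>>>>>>>>>   call KSPSetUp(ksp,ierr)
>>>>>>>>>>   call PCFieldSplitGetSubKSP(pc,nsplit,subksp,ierr)
>>>>>>>>>>   ! subksp(1:nsplit) are now configured; tune them with options
>>>>>>>>>>   ! such as -fieldsplit_0_ksp_type or -fieldsplit_0_pc_type
>>>>>>>>>>   ! rather than by setting operators by hand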
>>>>>>>>>>
>>>>>>>>>>   Thanks,
>>>>>>>>>>
>>>>>>>>>>      Matt
>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> call KSPSetOperators(subksp(1),Amat,Amat,ierr)
>>>>>>>>>>> call KSPSetOperators(subksp(2),Amat,Amat,ierr)
>>>>>>>>>>> call KSPSetOperators(subksp(3),Amat,Amat,ierr)
>>>>>>>>>>> call KSPSetOperators(subksp(4),Amat,Amat,ierr)
>>>>>>>>>>>
>>>>>>>>>>> call KSPSolve(ksp,bvec,xvec,ierr)
>>>>>>>>>>> ======================================================
>>>>>>>>>>>
>>>>>>>>>>> But I am getting the following error:
>>>>>>>>>>> [79]PETSC ERROR: Null argument, when expecting valid pointer
>>>>>>>>>>> [79]PETSC ERROR: Null Object: Parameter # 1
>>>>>>>>>>> [79]PETSC ERROR: #1 KSPSetOperators() line 536 in
>>>>>>>>>>> /home/hgharti/lsoft/petsc-3.6.3-intel16-mpi5/src/ksp/ksp/interf
>>>>>>>>>>>
>>>>>>>>>>> It looks like I am doing something wrong in "call KSPSetOperators",
>>>>>>>>>>> but I could not figure out what.
>>>>>>>>>>>
>>>>>>>>>>> Could anybody help me fix this problem? I looked into almost all
>>>>>>>>>>> related examples, but I could not really figure out the correct
>>>>>>>>>>> steps after "call PCFieldSplitSetIS".
>>>>>>>>>>>
>>>>>>>>>>> Any help will be greatly appreciated.
>>>>>>>>>>>
>>>>>>>>>>> Best,
>>>>>>>>>>> Hom Nath
>>>>>>>>>>>
>>>>>>>>>>> On Sun, Jan 24, 2016 at 7:14 PM, Hom Nath Gharti <hng.email@gmail.com> wrote:
>>>>>>>>>>>> Thank you so much, Matt! I will try.
>>>>>>>>>>>>
>>>>>>>>>>>> Hom Nath
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Jan 24, 2016 at 6:26 AM, Matthew Knepley <knepley@gmail.com> wrote:
>>>>>>>>>>>>> On Fri, Jan 22, 2016 at 2:19 PM, Hom Nath Gharti <hng.email@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Dear all,
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I am new to PCFieldSplit.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I have a matrix formed using MATMPIAIJ. Is it possible to use
>>>>>>>>>>>>>> PCFIELDSPLIT operations on this type of matrix, or does it
>>>>>>>>>>>>>> have to be in MATMPIBAIJ or MATNEST format?
>>>>>>>>>>>>>
>>>>>>>>>>>>> Yes, you can split AIJ.
>>>>>>>>>>>>>
>>>>>>>>>>>>>> If it is possible for MATMPIAIJ, could anybody provide me a
>>>>>>>>>>>>>> simple example or a few steps? Variables in the equations are
>>>>>>>>>>>>>> displacement vector, scalar potential, and pressure.
>>>>>>>>>>>>>
>>>>>>>>>>>>> If you do not have a collocated discretization, then you have
>>>>>>>>>>>>> to use
>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetIS.html
>>>>>>>>>>>>>
>>>>>>>>>>>>>   Thanks,
>>>>>>>>>>>>>
>>>>>>>>>>>>>      Matt
>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Thanks for the help.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Hom Nath

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener