[petsc-users] PETSC_NULL_OBJECT gets corrupt after call to MatNestGetISs in fortran

Satish Balay balay at mcs.anl.gov
Fri May 1 09:15:43 CDT 2015


Great! Glad it works!

Satish

On Fri, 1 May 2015, Klaij, Christiaan wrote:

> Satish,
> 
> Today, I tried again and now I also get:
> 
>                      0
>                      0
> 
> Sorry, I must have done something wrong earlier... Thanks again,
> 
> Chris
> 
> ________________________________________
> From: Satish Balay <balay at mcs.anl.gov>
> Sent: Thursday, April 30, 2015 9:54 PM
> To: Klaij, Christiaan
> Cc: petsc-users
> Subject: Re: [petsc-users] PETSC_NULL_OBJECT gets corrupt after call to MatNestGetISs in fortran
> 
> Hmm, it works for me with the test code you sent...
> 
> <petsc-3.5.3 before patch>
> 
> balay at asterix /home/balay/download-pine/x
> $ ./ex1f
>                     0
>              40357424
> 
> <petsc-3.5.3 after patch>
> 
> balay at asterix /home/balay/download-pine/x
> $ ./ex1f
>                     0
>                     0
> balay at asterix /home/balay/download-pine/x
> 
> Satish
> 
> 
> On Thu, 30 Apr 2015, Klaij, Christiaan wrote:
> 
> > Satish,
> >
> > Thanks for the files! I've copied them to the correct location in my petsc-3.5.3 dir and rebuilt the whole thing from scratch, but I still get the same memory corruption for the example below...
> >
> > Chris
> > ________________________________________
> > From: Satish Balay <balay at mcs.anl.gov>
> > Sent: Friday, April 24, 2015 10:54 PM
> > To: Klaij, Christiaan
> > Cc: petsc-users at mcs.anl.gov
> > Subject: Re: [petsc-users] PETSC_NULL_OBJECT gets corrupt after call to MatNestGetISs in fortran
> >
> > Sorry for dropping the ball on this issue. I pushed the following fix to the maint branch.
> >
> > https://bitbucket.org/petsc/petsc/commits/3a4d7b9a6c83003720b45dc0635fc32ea52a4309
> >
> > To use this change with petsc-3.5.3, you can drop in the attached replacement files at:
> >
> > src/mat/impls/nest/ftn-custom/zmatnestf.c
> > src/mat/impls/nest/ftn-auto/matnestf.c
> >
> > Satish
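
For context, a minimal sketch of what such a hand-written stub can look like, assuming the CHKFORTRANNULLOBJECT macro and the stub-naming conventions PETSc 3.5 uses in its ftn-custom sources; the authoritative version is the commit linked above:

#include <petsc-private/fortranimpl.h>
#include <petscmat.h>

#if defined(PETSC_HAVE_FORTRAN_CAPS)
#define matnestgetiss_ MATNESTGETISS
#elif !defined(PETSC_HAVE_FORTRAN_UNDERSCORE)
#define matnestgetiss_ matnestgetiss
#endif

PETSC_EXTERN void PETSC_STDCALL matnestgetiss_(Mat *A,IS rows[],IS cols[],PetscErrorCode *ierr)
{
  /* Map a Fortran PETSC_NULL_OBJECT argument to a C NULL pointer so that
     MatNestGetISs() skips that output instead of writing an IS handle into
     the shared PETSC_NULL_OBJECT variable. */
  CHKFORTRANNULLOBJECT(rows);
  CHKFORTRANNULLOBJECT(cols);
  *ierr = MatNestGetISs(*A,rows,cols);
}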
> >
> > On Fri, 24 Apr 2015, Klaij, Christiaan wrote:
> >
> > > Barry, Satish
> > >
> > > Any news on this issue?
> > >
> > > Chris
> > >
> > > > On Feb 12, 2015, at 07:13:08 CST, Smith, Barry <bsmith at mcs.anl.gov> wrote:
> > > >
> > > >    Thanks for reporting this. Currently the Fortran stub for this function is generated automatically, which means it does not have the logic for handling a PETSC_NULL_OBJECT argument.
> > > >
> > > >     Satish, could you please see if you can add a custom Fortran stub for this function in maint?
> > > >
> > > >   Thanks
> > > >
> > > >    Barry
> > > >
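
For contrast with the hand-written stub sketched earlier in this thread, the auto-generated binding has roughly the following shape (a sketch, not the verbatim contents of ftn-auto/matnestf.c): the IS array arguments are forwarded unchecked, so when the Fortran caller passes PETSC_NULL_OBJECT, MatNestGetISs() writes an IS handle straight into that shared variable, which is the nonzero value printed in the example below.

#include <petscmat.h>

PETSC_EXTERN void PETSC_STDCALL matnestgetiss_(Mat *A,IS rows[],IS cols[],PetscErrorCode *ierr)
{
  /* No PETSC_NULL_OBJECT handling: whatever address Fortran passed for
     rows/cols is handed to MatNestGetISs(), which stores IS handles through it. */
  *ierr = MatNestGetISs(*A,rows,cols);
}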
> > > > > On Feb 12, 2015, at 3:02 AM, Klaij, Christiaan <C.Klaij at marin.nl> wrote:
> > > > >
> > > > > Using petsc-3.5.3, I noticed that PETSC_NULL_OBJECT gets corrupted after calling MatNestGetISs in Fortran. Here's a small example:
> > > > >
> > > > > $ cat fieldsplittry2.F90
> > > > > program fieldsplittry2
> > > > >
> > > > >  use petscksp
> > > > >  implicit none
> > > > > #include <finclude/petsckspdef.h>
> > > > >
> > > > >  PetscErrorCode :: ierr
> > > > >  PetscInt       :: size,i,j,start,end,n=4,numsplit=1
> > > > >  PetscScalar    :: zero=0.0,one=1.0
> > > > >  Vec            :: diag3,x,b
> > > > >  Mat            :: A,subA(4),myS
> > > > >  PC             :: pc,subpc(2)
> > > > >  KSP            :: ksp,subksp(2)
> > > > >  IS             :: isg(2)
> > > > >
> > > > >  call PetscInitialize(PETSC_NULL_CHARACTER,ierr); CHKERRQ(ierr)
> > > > >  call MPI_Comm_size(PETSC_COMM_WORLD,size,ierr); CHKERRQ(ierr);
> > > > >
> > > > >  ! vectors
> > > > >  call VecCreateMPI(MPI_COMM_WORLD,3*n,PETSC_DECIDE,diag3,ierr); CHKERRQ(ierr)
> > > > >  call VecSet(diag3,one,ierr); CHKERRQ(ierr)
> > > > >
> > > > >  call VecCreateMPI(MPI_COMM_WORLD,4*n,PETSC_DECIDE,x,ierr); CHKERRQ(ierr)
> > > > >  call VecSet(x,zero,ierr); CHKERRQ(ierr)
> > > > >
> > > > >  call VecDuplicate(x,b,ierr); CHKERRQ(ierr)
> > > > >  call VecSet(b,one,ierr); CHKERRQ(ierr)
> > > > >
> > > > >  ! matrix a00
> > > > >  call MatCreateAIJ(MPI_COMM_WORLD,3*n,3*n,PETSC_DECIDE,PETSC_DECIDE,1,PETSC_NULL_INTEGER,0,PETSC_NULL_INTEGER,subA(1),ierr);CHKERRQ(ierr)
> > > > >  call MatDiagonalSet(subA(1),diag3,INSERT_VALUES,ierr);CHKERRQ(ierr)
> > > > >  call MatAssemblyBegin(subA(1),MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
> > > > >  call MatAssemblyEnd(subA(1),MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
> > > > >
> > > > >  ! matrix a01
> > > > >  call MatCreateAIJ(MPI_COMM_WORLD,3*n,n,PETSC_DECIDE,PETSC_DECIDE,1,PETSC_NULL_INTEGER,1,PETSC_NULL_INTEGER,subA(2),ierr);CHKERRQ(ierr)
> > > > >  call MatGetOwnershipRange(subA(2),start,end,ierr);CHKERRQ(ierr);
> > > > >  do i=start,end-1
> > > > >     j=mod(i,size*n)
> > > > >     call MatSetValue(subA(2),i,j,one,INSERT_VALUES,ierr);CHKERRQ(ierr)
> > > > >  end do
> > > > >  call MatAssemblyBegin(subA(2),MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
> > > > >  call MatAssemblyEnd(subA(2),MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
> > > > >
> > > > >  ! matrix a10
> > > > >  call  MatTranspose(subA(2),MAT_INITIAL_MATRIX,subA(3),ierr);CHKERRQ(ierr)
> > > > >
> > > > >  ! matrix a11 (empty)
> > > > >  call MatCreateAIJ(MPI_COMM_WORLD,n,n,PETSC_DECIDE,PETSC_DECIDE,0,PETSC_NULL_INTEGER,0,PETSC_NULL_INTEGER,subA(4),ierr);CHKERRQ(ierr)
> > > > >  call MatAssemblyBegin(subA(4),MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
> > > > >  call MatAssemblyEnd(subA(4),MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
> > > > >
> > > > >  ! nested mat [a00,a01;a10,a11]
> > > > >  call MatCreateNest(MPI_COMM_WORLD,2,PETSC_NULL_OBJECT,2,PETSC_NULL_OBJECT,subA,A,ierr);CHKERRQ(ierr)
> > > > >  call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
> > > > >  call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr);CHKERRQ(ierr)
> > > > >  print *, PETSC_NULL_OBJECT
> > > > >  call MatNestGetISs(A,isg,PETSC_NULL_OBJECT,ierr);CHKERRQ(ierr);
> > > > >  print *, PETSC_NULL_OBJECT
> > > > >
> > > > >  call PetscFinalize(ierr)
> > > > >
> > > > > end program fieldsplittry2
> > > > > $ ./fieldsplittry2
> > > > >                     0
> > > > >              39367824
> > > > > $
> > > > >
> > > > >
> > > > > dr. ir. Christiaan Klaij
> > > > > CFD Researcher
> > > > > Research & Development
> > > > > E mailto:C.Klaij at marin.nl
> > > > > T +31 317 49 33 44
> > > > >
> > > > >
> > > > > MARIN
> > > > > 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands
> > > > > T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl
> > > > >
> > >
> >
> 
> 


