[petsc-users] PetscScatterCreate type mismatch after update.

Manuel Valera mvalera-w at sdsu.edu
Tue Mar 12 20:02:31 CDT 2019


Hi,

So I just solved that problem, but now it looks like my code broke somewhere
else. I have a routine in place to scatter/gather the information to root in
order to write it to a file (I know, we need to make this parallel I/O, but
that's future work). The routine looks like this:


>       SUBROUTINE WriteToFile_grid()
>
>             PetscErrorCode        :: ierrp
>             PetscMPIInt           :: rank
>             PetscInt              :: iter
>             Vec                   :: CenterX,CenterY,CenterZ,Nat1,Nat2,seqvec
>             PetscScalar, pointer  :: tmp3d(:,:,:),tmp4d(:,:,:,:),arr(:)
>             VecScatter            :: LargerToSmaller, scatterctx
>             INTEGER               :: i,j,k, ierr
>
>             call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierrp)
>
> !#####################################################################################!
>             !                         Grid Cell Centers: x-component                         !
> !#####################################################################################!
>             ! Extract x-component
>             call DMDAVecGetArrayF90(daGrid,GridCenters,tmp4d,ierrp)
>             call DMCreateGlobalVector(daSingle,CenterX,ierrp)
>             call DMDAVecGetArrayF90(daSingle,CenterX,tmp3d,ierrp)
>             tmp3d(:,:,:) = tmp4d(0,:,:,:)
>             call DMDAVecRestoreArrayF90(daSingle,CenterX,tmp3d,ierrp)
>             call DMDAVecRestoreArrayF90(daGrid,GridCenters,tmp4d,ierrp)
>             ! Scatter to daWriteCenters
>             call DMDACreateNaturalVector(daSingle,Nat1,ierrp)
>             call DMDAGlobalToNaturalBegin(daSingle,CenterX,INSERT_VALUES,Nat1,ierrp)
>             call DMDAGlobalToNaturalEnd(daSingle,CenterX,INSERT_VALUES,Nat1,ierrp)
>             call VecDestroy(CenterX,ierrp)
>             call DMDACreateNaturalVector(daWriteCenters,Nat2,ierrp)
>             call VecScatterCreate(Nat1,SingleIS,Nat2,WriteIS,LargerToSmaller,ierrp)
>             call VecScatterBegin(LargerToSmaller,Nat1,Nat2,INSERT_VALUES,SCATTER_FORWARD,ierrp)
>             call VecScatterEnd(LargerToSmaller,Nat1,Nat2,INSERT_VALUES,SCATTER_FORWARD,ierrp)
>             call VecScatterDestroy(LargerToSmaller,ierrp)
>             call VecDestroy(Nat1,ierrp)
>             ! Send to root
>             call VecScatterCreateToZero(Nat2,scatterctx,seqvec,ierrp)
>             call VecScatterBegin(scatterctx,Nat2,seqvec,INSERT_VALUES,SCATTER_FORWARD,ierrp)
>             call VecScatterEnd(scatterctx,Nat2,seqvec,INSERT_VALUES,SCATTER_FORWARD,ierrp)
>             call VecScatterDestroy(scatterctx,ierrp)
>             call VecDestroy(Nat2,ierrp)
>             ! Let root write to netCDF file
>             if (rank == 0) then
>                 allocate(buffer(1:IMax-1,1:JMax-1,1:KMax-1),STAT=ierr)
>                 call VecGetArrayReadF90(seqvec,arr,ierrp)
>                 iter = 1
>                 do k=1,KMax-1
>                     do j=1,JMax-1
>                         do i=1,IMax-1
>                             buffer(i,j,k) = arr(iter)
>                             iter = iter + 1
>                         enddo
>                     enddo
>                 enddo
>                 call VecRestoreArrayReadF90(seqvec,arr,ierrp)
>                 call nc_check(nf90_put_var(ncid,xID,buffer,start=(/1,1,1/),count=(/IMax-1,JMax-1,KMax-1/)), &
>                               'WriteNetCDF', context='put_var GridCenterX in '//trim(output_filename))
>                 deallocate(buffer,STAT=ierr)
>             endif
>             call VecDestroy(seqvec,ierrp)


The same process is then repeated for each variable to be written. Notice that
the vector seqvec is destroyed at the end. With PETSc v3.10.0 this routine
worked without problems; after updating to v3.10.4 it no longer works and
gives the following error:

[0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Null argument, when expecting valid pointer
> [0]PETSC ERROR: Null Object: Parameter # 3
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.4, unknown
> [0]PETSC ERROR: ./gcmSeamount on a petsc-debug named ocean by valera Tue
> Mar 12 17:59:43 2019
> [0]PETSC ERROR: Configure options --known-level1-dcache-size=32768
> --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8
> --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2
> --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8
> --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8
> --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=8
> --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1
> --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1
> PETSC_ARCH=petsc-debug --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2
> --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort
> --with-shared-libraries=1 --with-debugging=1 --download-hypre --download-ml
> --with-batch --known-mpi-shared-libraries=1 --known-64-bit-blas-indices=0
> [0]PETSC ERROR: #1 VecScatterBegin() line 85 in
> /usr/dataC/home/valera/petsc/src/vec/vscat/interface/vscatfce.c
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Null argument, when expecting valid pointer
> [0]PETSC ERROR: Null Object: Parameter # 3
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.4, unknown
> [0]PETSC ERROR: ./gcmSeamount on a petsc-debug named ocean by valera Tue
> Mar 12 17:59:43 2019
> [0]PETSC ERROR: Configure options --known-level1-dcache-size=32768
> --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8
> --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2
> --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8
> --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8
> --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=8
> --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1
> --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1
> PETSC_ARCH=petsc-debug --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2
> --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort
> --with-shared-libraries=1 --with-debugging=1 --download-hypre --download-ml
> --with-batch --known-mpi-shared-libraries=1 --known-64-bit-blas-indices=0
> [0]PETSC ERROR: #2 VecScatterEnd() line 150 in
> /usr/dataC/home/valera/petsc/src/vec/vscat/interface/vscatfce.c
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Null argument, when expecting valid pointer
> [0]PETSC ERROR: Null Object: Parameter # 1
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.4, unknown
> [0]PETSC ERROR: ./gcmSeamount on a petsc-debug named ocean by valera Tue
> Mar 12 17:59:43 2019
> [0]PETSC ERROR: Configure options --known-level1-dcache-size=32768
> --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8
> --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2
> --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8
> --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8
> --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=8
> --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1
> --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1
> PETSC_ARCH=petsc-debug --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2
> --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort
> --with-shared-libraries=1 --with-debugging=1 --download-hypre --download-ml
> --with-batch --known-mpi-shared-libraries=1 --known-64-bit-blas-indices=0
> [0]PETSC ERROR: #3 VecGetArrayRead() line 1649 in
> /usr/dataC/home/valera/petsc/src/vec/vec/interface/rvector.c
> [0]PETSC ERROR:
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS
> X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames
> ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] VecGetArrayRead line 1648
> /usr/dataC/home/valera/petsc/src/vec/vec/interface/rvector.c
> [0]PETSC ERROR: [0] VecScatterEnd line 147
> /usr/dataC/home/valera/petsc/src/vec/vscat/interface/vscatfce.c
> [0]PETSC ERROR: [0] VecScatterBegin line 82
> /usr/dataC/home/valera/petsc/src/vec/vscat/interface/vscatfce.c
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.4, unknown
> [0]PETSC ERROR: ./gcmSeamount on a petsc-debug named ocean by valera Tue
> Mar 12 17:59:43 2019
> [0]PETSC ERROR: Configure options --known-level1-dcache-size=32768
> --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8
> --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2
> --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8
> --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8
> --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=8
> --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1
> --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1
> PETSC_ARCH=petsc-debug --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2
> --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort
> --with-shared-libraries=1 --with-debugging=1 --download-hypre --download-ml
> --with-batch --known-mpi-shared-libraries=1 --known-64-bit-blas-indices=0
> [0]PETSC ERROR: #4 User provided function() line 0 in  unknown file
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
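
For what it's worth, here is how I read the traceback against the routine
above; the annotations are mine and I may be matching the frames wrong, but
if the argument numbering follows the C interface, parameter #3 of
VecScatterBegin/VecScatterEnd and parameter #1 of VecGetArrayRead would all
be seqvec:

>             ! "Null Object: Parameter # 3" -> third argument, i.e. seqvec
>             call VecScatterBegin(scatterctx,Nat2,seqvec,INSERT_VALUES,SCATTER_FORWARD,ierrp)
>             ! "Null Object: Parameter # 3" -> third argument, i.e. seqvec
>             call VecScatterEnd(scatterctx,Nat2,seqvec,INSERT_VALUES,SCATTER_FORWARD,ierrp)
>             ! "Null Object: Parameter # 1" -> first argument, i.e. seqvec
>             call VecGetArrayReadF90(seqvec,arr,ierrp)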



Now, the interesting part: if I comment out the VecDestroy(seqvec,...) call,
this error doesn't show up, at least for this part of the code.

Can you help me understand this error and how to fix it? I know the error
occurs after the "! Send to root" comment in the code.
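
In case it helps to separate this from the rest of the grid code, here is a
small standalone test that just repeats the create / scatter-to-zero / destroy
cycle a few times, which is essentially what WriteToFile_grid() does once per
output variable. The names (test_scatter_to_zero, vglobal, vseq, etc.) are
placeholders of mine, and I have not yet confirmed whether this reproduces the
error under v3.10.4, so please take it as a sketch of the pattern rather than
a verified reproducer:

>       program test_scatter_to_zero
> #include <petsc/finclude/petscvec.h>
>             use petscvec
>             implicit none
>             Vec            :: vglobal, vseq
>             VecScatter     :: ctx
>             PetscErrorCode :: ierr
>             PetscInt       :: nlocal, it
>             PetscScalar    :: one
>
>             call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>             nlocal = 10
>             one    = 1.0
>             ! same create / scatter-to-root / destroy cycle the write
>             ! routine runs once per output variable
>             do it = 1, 3
>                 call VecCreateMPI(PETSC_COMM_WORLD,nlocal,PETSC_DETERMINE,vglobal,ierr)
>                 call VecSet(vglobal,one,ierr)
>                 call VecScatterCreateToZero(vglobal,ctx,vseq,ierr)
>                 call VecScatterBegin(ctx,vglobal,vseq,INSERT_VALUES,SCATTER_FORWARD,ierr)
>                 call VecScatterEnd(ctx,vglobal,vseq,INSERT_VALUES,SCATTER_FORWARD,ierr)
>                 call VecScatterDestroy(ctx,ierr)
>                 call VecDestroy(vseq,ierr)
>                 call VecDestroy(vglobal,ierr)
>             enddo
>             call PetscFinalize(ierr)
>       end program test_scatter_to_zero

If this pattern is no longer supposed to work, i.e. if seqvec is not meant to
be destroyed and re-created for every variable, that would explain what I am
seeing.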

Thanks,

Manuel

On Tue, Mar 12, 2019 at 3:58 PM Jed Brown <jed at jedbrown.org> wrote:

> Did you just update to 'master'?  See VecScatter changes:
>
> https://www.mcs.anl.gov/petsc/documentation/changes/dev.html
>
> Manuel Valera via petsc-users <petsc-users at mcs.anl.gov> writes:
>
> > Hello,
> >
> > I just updated PETSc from the repo to the latest master branch version,
> and
> > a compilation problem popped up. It seems like the variable types are not
> > being recognized properly. What I have, as a minimum working example, is:
> >
> > #include <petsc/finclude/petscvec.h>
> >> #include <petsc/finclude/petscdmda.h>
> >> #include <petsc/finclude/petscdm.h>
> >> #include <petsc/finclude/petscis.h>
> >> #include <petsc/finclude/petscksp.h>
> >> USE petscvec
> >> USE petscdmda
> >> USE petscdm
> >> USE petscis
> >> USE petscksp
> >> IS                     :: ScalarIS
> >> IS                     :: DummyIS
> >> VecScatter             :: LargerToSmaller,to0,from0
> >> VecScatter             :: SmallerToLarger
> >> PetscInt, ALLOCATABLE  :: pScalarDA(:), pDummyDA(:)
> >> PetscScalar            :: rtol
> >> Vec                    :: Vec1
> >> Vec                    :: Vec2
> >> ! Create index sets
> >>             allocate( pScalarDA(0:(gridx-1)*(gridy-1)*(gridz-1)-1), pDummyDA(0:(gridx-1)*(gridy-1)*(gridz-1)-1) )
> >>             iter=0
> >>             do k=0,gridz-2
> >>                 kplane = k*gridx*gridy
> >>                 do j=0,gridy-2
> >>                     do i=0,gridx-2
> >>                         pScalarDA(iter) = kplane + j*(gridx) + i
> >>                         iter = iter+1
> >>                     enddo
> >>                 enddo
> >>             enddo
> >>             pDummyDA = (/ (ind, ind=0,((gridx-1)*(gridy-1)*(gridz-1))-1) /)
> >>             call ISCreateGeneral(PETSC_COMM_WORLD,(gridx-1)*(gridy-1)*(gridz-1), &
> >>                                  pScalarDA,PETSC_COPY_VALUES,ScalarIS,ierr)
> >>             call ISCreateGeneral(PETSC_COMM_WORLD,(gridx-1)*(gridy-1)*(gridz-1), &
> >>                                  pDummyDA,PETSC_COPY_VALUES,DummyIS,ierr)
> >>             deallocate(pScalarDA,pDummyDA, STAT=ierr)
> >>             ! Create VecScatter contexts: LargerToSmaller & SmallerToLarger
> >>             call DMDACreateNaturalVector(daScalars,Vec1,ierr)
> >>             call DMDACreateNaturalVector(daDummy,Vec2,ierr)
> >>             call VecScatterCreate(Vec1,ScalarIS,Vec2,DummyIS,LargerToSmaller,ierr)
> >>             call VecScatterCreate(Vec2,DummyIS,Vec1,ScalarIS,SmallerToLarger,ierr)
> >>             call VecDestroy(Vec1,ierr)
> >>             call VecDestroy(Vec2,ierr)
> >
> >
> > And the error i get is the part i cannot really understand:
> >
> >> matrixobjs.f90:99.34:
> >>             call VecScatterCreate(Vec1,ScalarIS,Vec2,DummyIS,LargerToSmaller,ie
> >>                                   1
> >> Error: Type mismatch in argument 'a' at (1); passed TYPE(tvec) to INTEGER(4)
> >> matrixobjs.f90:100.34:
> >>             call VecScatterCreate(Vec2,DummyIS,Vec1,ScalarIS,SmallerToLarger,ie
> >>                                   1
> >> Error: Type mismatch in argument 'a' at (1); passed TYPE(tvec) to INTEGER(4)
> >> make[1]: *** [matrixobjs.o] Error 1
> >> make[1]: Leaving directory `/usr/scratch/valera/ParGCCOM-Master/Src'
> >> make: *** [gcmSeamount] Error 2
> >
> >
> > What I find hard to understand is why/where my code is finding an integer
> > type; as you can see from the MWE header, the variable types look correct.
> >
> > Any help is appreciated,
> >
> > Thanks,
>