<div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr">Hi,<div><br></div><div>So i just solved that problem but now it looks my code broke somewhere else, i have a script in place to scatter/gather the information to root in order to write it to a file (i know, we need to make this parallel I/O but that's future work). Such script looks like this:</div><div><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>      SUBROUTINE WriteToFile_grid()<br><br>            PetscErrorCode        :: ierrp<br>            PetscMPIInt           :: rank<br>            PetscInt              :: iter<br>            Vec                   :: CenterX,CenterY,CenterZ,Nat1,Nat2,seqvec<br>            PetscScalar, pointer  :: tmp3d(:,:,:),tmp4d(:,:,:,:),arr(:)<br>            VecScatter            :: LargerToSmaller, scatterctx<br>            INTEGER               :: i,j,k, ierr<br><br>            call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierrp)<br>            !#####################################################################################!<br>            !                         Grid Cell Centers: x-component                              !<br>            !#####################################################################################!<br>            ! Extract x-component<br>            call DMDAVecGetArrayF90(daGrid,GridCenters,tmp4d,ierrp)<br>            call DMCreateGlobalVector(daSingle,CenterX,ierrp)<br>            call DMDAVecGetArrayF90(daSingle,CenterX,tmp3d,ierrp)<br>            tmp3d(:,:,:) = tmp4d(0,:,:,:)<br>            call DMDAVecRestoreArrayF90(daSingle,CenterX,tmp3d,ierrp)<br>            call DMDAVecRestoreArrayF90(daGrid,GridCenters,tmp4d,ierrp)<br>            ! Scatter to daWriteCenters<br>            call DMDACreateNaturalVector(daSingle,Nat1,ierrp)<br>            call DMDAGlobalToNaturalBegin(daSingle,CenterX,INSERT_VALUES,Nat1,ierrp)<br>            call DMDAGlobalToNaturalEnd(daSingle,CenterX,INSERT_VALUES,Nat1,ierrp)<br>            call VecDestroy(CenterX,ierrp)<br>            call DMDACreateNaturalVector(daWriteCenters,Nat2,ierrp)<br>            call VecScatterCreate(Nat1,SingleIS,Nat2,WriteIS,LargerToSmaller,ierrp)<br>            call VecScatterBegin(LargerToSmaller,Nat1,Nat2,INSERT_VALUES,SCATTER_FORWARD,ierrp)<br>            call VecScatterEnd(LargerToSmaller,Nat1,Nat2,INSERT_VALUES,SCATTER_FORWARD,ierrp)<br>            call VecScatterDestroy(LargerToSmaller,ierrp)<br>            call VecDestroy(Nat1,ierrp)<br>            ! Send to root<br>            call VecScatterCreateToZero(Nat2,scatterctx,seqvec,ierrp)<br>            call VecScatterBegin(scatterctx,Nat2,seqvec,INSERT_VALUES,SCATTER_FORWARD,ierrp)<br>            call VecScatterEnd(scatterctx,Nat2,seqvec,INSERT_VALUES,SCATTER_FORWARD,ierrp)<br>            call VecScatterDestroy(scatterctx,ierrp)<br>           call VecDestroy(Nat2,ierrp)<br>            ! 
The same process is then repeated for each variable to output. Notice that the vector seqvec is destroyed at the end. Using PETSc v3.10.0 this routine worked without problems; after updating to v3.10.4 it no longer works and gives the following error:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Null argument, when expecting valid pointer
[0]PETSC ERROR: Null Object: Parameter # 3
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.4, unknown
[0]PETSC ERROR: ./gcmSeamount on a petsc-debug named ocean by valera Tue Mar 12 17:59:43 2019
[0]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=8 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1 PETSC_ARCH=petsc-debug --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2 --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --with-shared-libraries=1 --with-debugging=1 --download-hypre --download-ml --with-batch --known-mpi-shared-libraries=1 --known-64-bit-blas-indices=0
[0]PETSC ERROR: #1 VecScatterBegin() line 85 in /usr/dataC/home/valera/petsc/src/vec/vscat/interface/vscatfce.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Null argument, when expecting valid pointer
[0]PETSC ERROR: Null Object: Parameter # 3
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.4, unknown
[0]PETSC ERROR: ./gcmSeamount on a petsc-debug named ocean by valera Tue Mar 12 17:59:43 2019
[0]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=8 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1 PETSC_ARCH=petsc-debug --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2 --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --with-shared-libraries=1 --with-debugging=1 --download-hypre --download-ml --with-batch --known-mpi-shared-libraries=1 --known-64-bit-blas-indices=0
[0]PETSC ERROR: #2 VecScatterEnd() line 150 in /usr/dataC/home/valera/petsc/src/vec/vscat/interface/vscatfce.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Null argument, when expecting valid pointer
[0]PETSC ERROR: Null Object: Parameter # 1
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.4, unknown
[0]PETSC ERROR: ./gcmSeamount on a petsc-debug named ocean by valera Tue Mar 12 17:59:43 2019
[0]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=8 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1 PETSC_ARCH=petsc-debug --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2 --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --with-shared-libraries=1 --with-debugging=1 --download-hypre --download-ml --with-batch --known-mpi-shared-libraries=1 --known-64-bit-blas-indices=0
[0]PETSC ERROR: #3 VecGetArrayRead() line 1649 in /usr/dataC/home/valera/petsc/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] VecGetArrayRead line 1648 /usr/dataC/home/valera/petsc/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: [0] VecScatterEnd line 147 /usr/dataC/home/valera/petsc/src/vec/vscat/interface/vscatfce.c
[0]PETSC ERROR: [0] VecScatterBegin line 82 /usr/dataC/home/valera/petsc/src/vec/vscat/interface/vscatfce.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.4, unknown
[0]PETSC ERROR: ./gcmSeamount on a petsc-debug named ocean by valera Tue Mar 12 17:59:43 2019
[0]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=8 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1 PETSC_ARCH=petsc-debug --COPTFLAGS=-O2 --CXXOPTFLAGS=-O2 --FOPTFLAGS=-O2 --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --with-shared-libraries=1 --with-debugging=1 --download-hypre --download-ml --with-batch --known-mpi-shared-libraries=1 --known-64-bit-blas-indices=0
[0]PETSC ERROR: #4 User provided function() line 0 in  unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD

Now, the interesting part is that if I comment out the VecDestroy(seqvec,...) call, this error doesn't show up, at least for this part of the code.

Can you help me understand this error and how to fix it? I know the error happens after the "! Send to root" part of the code.
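In case it is useful to look at this outside of the full model, here is a stripped-down sketch of the same call sequence as the "Send to root" block, repeated the way the full routine repeats it for each output variable (made-up names and sizes, plain VecCreateMPI instead of the DMDA natural vector, and untested in exactly this form):

      program tozero_sequence
#include <petsc/finclude/petscvec.h>
      use petscvec
      implicit none

      PetscErrorCode       :: ierr
      PetscMPIInt          :: rank
      PetscInt             :: nglobal, rep
      PetscScalar          :: one
      PetscScalar, pointer :: arr(:)
      Vec                  :: par, seqvec
      VecScatter           :: scatterctx

      call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
      call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr)
      nglobal = 12
      one     = 1.0
      do rep = 1,2                                   ! the real routine does this once per output variable
         call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nglobal,par,ierr)
         call VecSet(par,one,ierr)
         ! Send to root
         call VecScatterCreateToZero(par,scatterctx,seqvec,ierr)
         call VecScatterBegin(scatterctx,par,seqvec,INSERT_VALUES,SCATTER_FORWARD,ierr)
         call VecScatterEnd(scatterctx,par,seqvec,INSERT_VALUES,SCATTER_FORWARD,ierr)
         ! Root reads the gathered values
         if (rank == 0) then
            call VecGetArrayReadF90(seqvec,arr,ierr)
            call VecRestoreArrayReadF90(seqvec,arr,ierr)
         endif
         call VecScatterDestroy(scatterctx,ierr)
         call VecDestroy(seqvec,ierr)                ! the destroy I comment out to make the error go away
         call VecDestroy(par,ierr)
      enddo
      call PetscFinalize(ierr)
      end program tozero_sequence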
Thanks,

Manuel

On Tue, Mar 12, 2019 at 3:58 PM Jed Brown <jed@jedbrown.org> wrote:

Did you just update to 'master'?  See VecScatter changes:

https://www.mcs.anl.gov/petsc/documentation/changes/dev.html

Manuel Valera via petsc-users <petsc-users@mcs.anl.gov> writes:
> Hello,
>
> I just updated petsc from the repo to the latest master branch version, and
> a compilation problem popped up, it seems like the variable types are not
> being acknowledged properly, what i have in a minimum working example
> fashion is:
>
>> #include <petsc/finclude/petscvec.h>
>> #include <petsc/finclude/petscdmda.h>
>> #include <petsc/finclude/petscdm.h>
>> #include <petsc/finclude/petscis.h>
>> #include <petsc/finclude/petscksp.h>
>> USE petscvec
>> USE petscdmda
>> USE petscdm
>> USE petscis
>> USE petscksp
>> IS                     :: ScalarIS
>> IS                     :: DummyIS
>> VecScatter             :: LargerToSmaller,to0,from0
>> VecScatter             :: SmallerToLarger
>> PetscInt, ALLOCATABLE  :: pScalarDA(:), pDummyDA(:)
>> PetscScalar            :: rtol
>> Vec                    :: Vec1
>> Vec                    :: Vec2
>> ! Create index sets
>>             allocate( pScalarDA(0:(gridx-1)*(gridy-1)*(gridz-1)-1) , pDummyDA(0:(gridx-1)*(gridy-1)*(gridz-1)-1) )
>>             iter=0
>>             do k=0,gridz-2
>>                 kplane = k*gridx*gridy
>>                 do j=0,gridy-2
>>                     do i=0,gridx-2
>>                         pScalarDA(iter) = kplane + j*(gridx) + i
>>                         iter = iter+1
>>                     enddo
>>                 enddo
>>             enddo
>>             pDummyDA = (/ (ind, ind=0,((gridx-1)*(gridy-1)*(gridz-1))-1) /)
>>             call ISCreateGeneral(PETSC_COMM_WORLD,(gridx-1)*(gridy-1)*(gridz-1), &
>>                                  pScalarDA,PETSC_COPY_VALUES,ScalarIS,ierr)
>>             call ISCreateGeneral(PETSC_COMM_WORLD,(gridx-1)*(gridy-1)*(gridz-1), &
>>                                  pDummyDA,PETSC_COPY_VALUES,DummyIS,ierr)
>>             deallocate(pScalarDA,pDummyDA, STAT=ierr)
>>             ! Create VecScatter contexts: LargerToSmaller & SmallerToLarger
>>             call DMDACreateNaturalVector(daScalars,Vec1,ierr)
>>             call DMDACreateNaturalVector(daDummy,Vec2,ierr)
>>             call VecScatterCreate(Vec1,ScalarIS,Vec2,DummyIS,LargerToSmaller,ierr)
>>             call VecScatterCreate(Vec2,DummyIS,Vec1,ScalarIS,SmallerToLarger,ierr)
>>             call VecDestroy(Vec1,ierr)
>>             call VecDestroy(Vec2,ierr)
>
>
> And the error i get is the part i cannot really understand:
>
> matrixobjs.f90:99.34:
>>             call
>> VecScatterCreate(Vec1,ScalarIS,Vec2,DummyIS,LargerToSmaller,ie
>>                                                  1
>> Error: Type mismatch in argument 'a' at (1); passed TYPE(tvec) to
>> INTEGER(4)
>> matrixobjs.f90:100.34:
>>             call
>> VecScatterCreate(Vec2,DummyIS,Vec1,ScalarIS,SmallerToLarger,ie
>>                                                  1
>> Error: Type mismatch in argument 'a' at (1); passed TYPE(tvec) to
>> INTEGER(4)
>> make[1]: *** [matrixobjs.o] Error 1
>> make[1]: Leaving directory `/usr/scratch/valera/ParGCCOM-Master/Src'
>> make: *** [gcmSeamount] Error 2
>
>
> What i find hard to understand is why/where my code is finding an integer
> type? as you can see from the MWE header the variables types look correct,
>
> Any help is appreaciated,
>
> Thanks,