[petsc-users] Different communicators in the two objects
Barry Smith
bsmith at mcs.anl.gov
Tue Mar 2 16:41:56 CST 2010
GRID is a set of local vectors, which means each process has its own
copy that lives on MPI_COMM_SELF. You are trying to view them with a
viewer created on PETSC_COMM_WORLD, hence the "different
communicators" error.
I don't think you want to view the local vectors; you want to view the
parallel vectors. If you do want to view the local vectors, then each
needs to go into its own file and you need to create a PETSC_COMM_SELF
viewer for each file.
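For example (untested sketch, reusing the names from your code; the
per-rank file name below is just one possible convention):

    PetscMPIInt rank;
    char        localName[PETSC_MAX_PATH_LEN];

    /* Option 1: view the parallel solution vector X; the vector and the
       viewer then share PETSC_COMM_WORLD, so the communicators match */
    ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,fileName,FILE_MODE_WRITE,&viewer_g);CHKERRQ(ierr);
    ierr = VecView(X,viewer_g);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(viewer_g);CHKERRQ(ierr);

    /* Option 2: if you really want the local vectors, write one file per
       process with a viewer created on PETSC_COMM_SELF */
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
    sprintf(localName,"%s.rank%d",fileName,(int)rank);  /* example per-rank name */
    ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF,localName,FILE_MODE_WRITE,&viewer_g);CHKERRQ(ierr);
    ierr = VecView(GRID,viewer_g);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(viewer_g);CHKERRQ(ierr);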
Barry
On Mar 2, 2010, at 4:24 PM, (Rebecca) Xuefei YUAN wrote:
> Hi,
>
> I tried to save the solution from DMComposite() as two binary
> files (one for the DA and one for a scalar), but I get the error
> messages below when running on two processors:
>
> (gdb) where
> #0 0xb7f1a410 in __kernel_vsyscall ()
> #1 0xb7c9a085 in raise () from /lib/tls/i686/cmov/libc.so.6
> #2 0xb7c9ba01 in abort () from /lib/tls/i686/cmov/libc.so.6
> #3 0x0873f24d in PetscAbortErrorHandler (line=697, fun=0x8868fd6 "VecView", file=0x8868e50 "vector.c", dir=0x8868e59 "src/vec/vec/interface/", n=80, p=1, mess=0xbfc40b74 "Different communicators in the two objects: Argument # 1 and 2", ctx=0x0) at errabort.c:62
> #4 0x086b41be in PetscError (line=697, func=0x8868fd6 "VecView", file=0x8868e50 "vector.c", dir=0x8868e59 "src/vec/vec/interface/", n=80, p=1, mess=0x8869130 "Different communicators in the two objects: Argument # %d and %d") at err.c:482
> #5 0x085f2356 in VecView (vec=0x8a14000, viewer=0x89cc6f0) at vector.c:697
> #6 0x0804f30b in DumpSolutionToMatlab (dmmg=0x89b3370, fn=0xbfc416b7 "twmgoreggt_tx7_ty6_x7_y6_nl1_s100_t375000_r30_pn10.m") at twmgoreggt.c:430
> #7 0x0804d72c in main (argc=Cannot access memory at address 0x2072) at twmgoreggt.c:234
>
> The piece of code is:
>
> X = DMMGGetx(dmmg);
>
> ierr = DMCompositeGetEntries(dm,&da1,PETSC_IGNORE);CHKERRQ(ierr);
> ierr = DAGetLocalInfo(da1,&info1);CHKERRQ(ierr);
>
> // ierr = DMCompositeGetAccess(dm,X,&GRID,&c);CHKERRQ(ierr);
> ierr = DMCompositeGetLocalVectors(dm,&GRID,&c);CHKERRQ(ierr);
> ierr = DMCompositeScatter(dm,X,GRID,c);CHKERRQ(ierr);
>
> if(parameters->adaptiveTimeStepSize){
> sprintf(fileName, "g_atwgcqt2unffnictv_tx%i_ty%i_x%i_y%i_nl
> %i_nt%1.5f.dat",info1.mx,info1.my, parameters->mxgrid,parameters-
> >mygrid,parameters->numberOfLevels,parameters->timeToGenerateGrid);
> }else{
> sprintf(fileName, "g_twgcqt2unffnictv_tx%i_ty%i_x%i_y%i_nl
> %i_nt%1.5f.dat",info1.mx,info1.my, parameters->mxgrid,parameters-
> >mygrid,parameters->numberOfLevels,parameters->timeToGenerateGrid);
> }
>
> PetscViewerBinaryOpen(PETSC_COMM_WORLD,fileName,FILE_MODE_WRITE,&viewer_g);
> VecView(GRID,viewer_g);
> ierr = PetscViewerDestroy (viewer_g); CHKERRQ (ierr);
> if(parameters->adaptiveTimeStepSize){
> sprintf(fileName, "g_atwgcqt2unffnictv_tx%i_ty%i_x%i_y%i_nl
> %i_nt%1.5f.c.dat",info1.mx,info1.my, parameters->mxgrid,parameters-
> >mygrid,parameters->numberOfLevels,parameters->timeToGenerateGrid);
> }else{
> sprintf(fileName, "g_twgcqt2unffnictv_tx%i_ty%i_x%i_y%i_nl
> %i_nt%1.5f.c.dat",info1.mx,info1.my, parameters->mxgrid,parameters-
> >mygrid,parameters->numberOfLevels,parameters->timeToGenerateGrid);
> }
> int fd;
>
> PetscViewerBinaryOpen(PETSC_COMM_SELF,fileName,FILE_MODE_WRITE,&viewer_out);
> PetscViewerBinaryGetDescriptor(viewer_out,&fd);
> PetscBinaryWrite(fd,&c[0],1,PETSC_DOUBLE,PETSC_FALSE);
> // ierr = DMCompositeRestoreAccess(dm,X,&GRID,&c);CHKERRQ(ierr);
> ierr = DMCompositeGather(dm,X,GRID,c);CHKERRQ(ierr);
> ierr = DMCompositeRestoreLocalVectors(dm,&GRID,&c);CHKERRQ(ierr);
> ierr = PetscViewerDestroy (viewer_out); CHKERRQ (ierr);
>
>
> When debugging in gdb,
>
> in processor one:
> Breakpoint 1, DumpSolutionToMatlab (dmmg=0x89946b0, fn=0xbfc63f17 "twmgoreggt_tx7_ty6_x7_y6_nl1_s100_t375000_r30_pn10.m") at twmgoreggt.c:430
> 430 VecView(GRID,viewer_g);
> (gdb) s
> VecView (vec=0x8a139e0, viewer=0x89b9750) at vector.c:690
> 690 PetscFunctionBegin;
> (gdb) n
> 691 PetscValidHeaderSpecific(vec,VEC_COOKIE,1);
> (gdb)
> 692 PetscValidType(vec,1);
> (gdb)
> 693 if (!viewer) {
> (gdb)
> 696 PetscValidHeaderSpecific(viewer,PETSC_VIEWER_COOKIE,2);
> (gdb)
> 697 PetscCheckSameComm(vec,1,viewer,2);
> (gdb) s
> PMPI_Comm_compare (comm1=-2080374780, comm2=-2080374782, result=0xbfc63c14) at comm_compare.c:81
> 81 MPIU_THREADPRIV_GET;
>
> and in processor two:
>
> Breakpoint 1, DumpSolutionToMatlab (dmmg=0x89b3370, fn=0xbf867a67 "twmgoreggt_tx7_ty6_x7_y6_nl1_s100_t375000_r30_pn10.m") at twmgoreggt.c:430
> 430 VecView(GRID,viewer_g);
> (gdb) s
> VecView (vec=0x8a14000, viewer=0x89ae380) at vector.c:690
> 690 PetscFunctionBegin;
> (gdb) n
> 691 PetscValidHeaderSpecific(vec,VEC_COOKIE,1);
> (gdb)
> 692 PetscValidType(vec,1);
> (gdb)
> 693 if (!viewer) {
> (gdb)
> 696 PetscValidHeaderSpecific(viewer,PETSC_VIEWER_COOKIE,2);
> (gdb)
> 697 PetscCheckSameComm(vec,1,viewer,2);
> (gdb) s
> PMPI_Comm_compare (comm1=-2080374777, comm2=-2080374780, result=0xbf867764) at comm_compare.c:81
> 81 MPIU_THREADPRIV_GET;
> (gdb)
>
> In processor one, comm1=-2080374780, comm2=-2080374782, while in
> processor two, comm1=-2080374777, comm2=-2080374780. I do not know
> what causes the two communicators to be different.
>
> Any idea about it? Thanks very much!
>
> -
> (Rebecca) Xuefei YUAN
> Department of Applied Physics and Applied Mathematics
> Columbia University
> Tel:917-399-8032
> www.columbia.edu/~xy2102
>