[petsc-users] PetscSplitOwnership()
(Rebecca) Xuefei YUAN
xy2102 at columbia.edu
Tue Mar 9 18:52:09 CST 2010
Dear Matt,
Thanks for your reply. I did not do my own partitioning, and even when I
use VecLoadIntoVector(), the same error shows up.
Anything wrong with it?
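
For reference, here is the pattern I tried, as far as I understand your
suggestion (a minimal sketch, reusing the names da2_4, fileName, viewer,
and FIELD from my code quoted below; the actual call site is simplified):

  ierr = DACreateGlobalVector(da2_4,&FIELD);CHKERRQ(ierr);  /* Vec inherits the DA's parallel layout */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,fileName,FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = VecLoadIntoVector(viewer,FIELD);CHKERRQ(ierr);     /* fill the existing layout */
  ierr = DAVecGetArray(da2_4,FIELD,&field);CHKERRQ(ierr);

As I understand it, plain VecLoad() creates the Vec itself and partitions
it with PetscSplitOwnership(), which need not match the DA decomposition.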
Cheers,
Rebecca
Quoting Matthew Knepley <knepley at gmail.com>:
> If you do your own partitioning, you need to use VecLoadIntoVector() with a
> Vec you get from the DA.
>
> Matt
>
> On Tue, Mar 9, 2010 at 5:16 PM, (Rebecca) Xuefei YUAN
> <xy2102 at columbia.edu> wrote:
>
>> Hi,
>>
>> I ran with np = 4 on a small (-da_grid_x 6, -da_grid_y 5) problem, with
>> DA_STENCIL_BOX, stencil width = 2, and dof = 4.
>>
>> When I use VecLoad() to load the binary file with the following code:
>>
>> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,fileName,FILE_MODE_READ,&viewer);CHKERRQ(ierr);
>> ierr = VecLoad(viewer,PETSC_NULL,&FIELD);CHKERRQ(ierr);
>> ierr = DAVecGetArray(da2_4,FIELD,&field);CHKERRQ(ierr);
>> /* ... use the field array ... */
>> ierr = DAVecRestoreArray(da2_4,FIELD,&field);CHKERRQ(ierr);
>> ierr = VecDestroy(FIELD);CHKERRQ(ierr);
>> ierr = DADestroy(da2_4);CHKERRQ(ierr);
>> ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr);
>>
>> However, I got the following error messages:
>>
>> [0]PETSC ERROR: DAVecGetArray() line 53 in src/dm/da/src/dagetarray.c
>> Vector local size 32 is not compatible with DA local sizes 36 100
>>
>> [1]PETSC ERROR: DAVecGetArray() line 53 in src/dm/da/src/dagetarray.c
>> Vector local size 32 is not compatible with DA local sizes 36 100
>>
>> [2]PETSC ERROR: DAVecGetArray() line 53 in src/dm/da/src/dagetarray.c
>> Vector local size 28 is not compatible with DA local sizes 24 80
>>
>> [3]PETSC ERROR: DAVecGetArray() line 53 in src/dm/da/src/dagetarray.c
>> Vector local size 28 is not compatible with DA local sizes 24 80
>>
>> Then I tracked it down to find where these sizes 32, 32, 28, 28 come from.
>>
>> It turns out that my da2_4 has the following parameters on each processor:
>>
>>         xs  xe  ys  ye   x   y  Xs  Xe  Ys  Ye   Nl  base  nlocal  Nlocal
>> p0:      0  12   0   3  12   3   0  20   0   5  100     0     100      36
>> p1:     12  24   0   3  12   3   4  24   0   5  100    36     100      36
>> p2:      0  12   3   5  12   2   0  20   1   5   80    72      80      24
>> p3:     12  24   3   5  12   2   4  24   1   5   80    96      80      24
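
(Reading the table: xs/xe/Xs/Xe look like dof-scaled indices, and Nlocal
and Nl look like the owned and ghosted entry counts. A minimal sketch of
querying the same information through the public API, assuming the same
da2_4 and dof = 4; the variable names are mine:)

  PetscInt xm, ym, gxm, gym;
  /* widths of the owned region, in grid nodes */
  ierr = DAGetCorners(da2_4,PETSC_NULL,PETSC_NULL,PETSC_NULL,&xm,&ym,PETSC_NULL);CHKERRQ(ierr);
  /* widths of the owned+ghost region, in grid nodes */
  ierr = DAGetGhostCorners(da2_4,PETSC_NULL,PETSC_NULL,PETSC_NULL,&gxm,&gym,PETSC_NULL);CHKERRQ(ierr);
  /* Nlocal = xm*ym*dof   (p0: 3*3*4 = 36)  */
  /* Nl     = gxm*gym*dof (p0: 5*5*4 = 100) */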
>>
>> Digging deeper, the call stack is:
>>
>> #0 PetscSplitOwnership (comm=-2080374782, n=0x8a061b4, N=0x8a061b8)
>> at psplit.c:81
>> #1 0x08628384 in PetscLayoutSetUp (map=0x8a061b0) at pmap.c:140
>> #2 0x08618320 in VecCreate_MPI_Private (v=0x8a05c50, alloc=PETSC_TRUE,
>> nghost=0, array=0x0) at pbvec.c:182
>> #3 0x08618ba7 in VecCreate_MPI (vv=0x8a05c50) at pbvec.c:232
>> #4 0x085f1554 in VecSetType (vec=0x8a05c50, method=0x885dd0b "mpi")
>> at vecreg.c:54
>> #5 0x085ec4f0 in VecSetTypeFromOptions_Private (vec=0x8a05c50)
>> at vector.c:1335
>> #6 0x085ec909 in VecSetFromOptions (vec=0x8a05c50) at vector.c:1370
>> #7 0x085d7a7e in VecLoad_Binary (viewer=0x89f70b0, itype=0x885ce3d "mpi",
>> newvec=0xbfe148e4) at vecio.c:228
>> #8 0x085d70e4 in VecLoad (viewer=0x89f70b0, outtype=0x885ce3d "mpi",
>> newvec=0xbfe148e4) at vecio.c:134
>> #9 0x0804f140 in FormInitialGuess_physical (dmmg=0x89a2880, X=0x89b34f0)
>> at vecviewload_out.c:390
>> #10 0x08052ced in DMMGSolve (dmmg=0x89a2720) at damg.c:307
>> #11 0x0804d479 in main (argc=-1, argv=0xbfe14cd4) at vecviewload_out.c:186
>>
>> it says that
>>
>> *n = *N/size + ((*N % size) > rank);
>>
>> where *N = 30, size = 4, and rank = 0, 1, 2, 3. That splits the 30
>> nodes into 8, 8, 7, 7, and with dof = 4 this apparently gives me
>>
>> local sizes of 32, 32, 28, 28 for processors 0, 1, 2, 3, respectively.
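
(A standalone check of that arithmetic; a minimal sketch, assuming the
30 is the number of grid nodes and the dof = 4 factor is applied on top
of the node split, which is what the numbers above suggest:)

  #include <stdio.h>

  int main(void)
  {
    int N = 30, size = 4, dof = 4;          /* nodes, ranks, dof per node */
    for (int rank = 0; rank < size; rank++) {
      int n = N/size + ((N % size) > rank); /* default split: 8, 8, 7, 7 */
      printf("rank %d: %d nodes -> %d entries\n", rank, n, n*dof);
    }
    return 0;                               /* prints 32, 32, 28, 28 */
  }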
>>
>> What could be wrong to cause this mismatch between the local vector size
>> and the DA local size (excluding or including ghost points)?
>>
>> Thanks so much!
>>
>> Rebecca
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their experiments
> is infinitely more interesting than any results to which their experiments
> lead.
> -- Norbert Wiener
>
--
(Rebecca) Xuefei YUAN
Department of Applied Physics and Applied Mathematics
Columbia University
Tel: 917-399-8032
www.columbia.edu/~xy2102