Use of MatCreateMPIAIJ and VecCreateMPI when ghost cells are present
Wee-Beng TAY
zonexo at gmail.com
Tue Apr 14 09:17:00 CDT 2009
Hi Matthew,
Suppose that for my 8x8 grid there are ghost cells on the edges, so the
grid changes to 0->9 x 0->9. I divide it along the y direction so that
for 4 processors,
myid=0
y=0 to 2
myid=1
y=3 to 4
myid=2
y=5 to 6
myid=3
y=7 to 9
Therefore, myid = 0 and 3 will have slightly more cells. However, for
VecCreateMPI, if I use:
call VecCreateMPI(MPI_COMM_WORLD,PETSC_DECIDE,size_x*size_y,b_rhs,ierr)
should size_x = size_y = 8? Also, each myid has a different number of
cells, but PETSC_DECIDE is the same on all processors. Will that cause
an error?
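I was also thinking that, instead of PETSC_DECIDE, each process could
perhaps pass its own local size and let PETSc determine the global size,
something like the sketch below (local_size_y, n_local and A_mat are just
placeholder names I made up, size_x here means the 10 columns including the
ghost cells, and the 5/2 preallocation values are only rough guesses for a
5-point stencil):

   ! number of y rows owned by this process, following the partition above
   if (myid == 0 .or. myid == 3) then
      local_size_y = 3      ! myid=0 owns y=0 to 2, myid=3 owns y=7 to 9
   else
      local_size_y = 2      ! myid=1 owns y=3 to 4, myid=2 owns y=5 to 6
   end if
   n_local = local_size_y * size_x    ! size_x = 10, including ghost cells

   ! give the local size explicitly; PETSC_DETERMINE lets PETSc sum up
   ! the global size from all processes
   call VecCreateMPI(MPI_COMM_WORLD,n_local,PETSC_DETERMINE,b_rhs,ierr)

   ! matrix with the same local row/column sizes as the vector
   call MatCreateMPIAIJ(MPI_COMM_WORLD,n_local,n_local,PETSC_DETERMINE, &
        PETSC_DETERMINE,5,PETSC_NULL_INTEGER,2,PETSC_NULL_INTEGER,A_mat,ierr)

Would that be the right approach?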
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Matthew Knepley wrote:
> On Mon, Apr 13, 2009 at 10:50 PM, Wee-Beng TAY <zonexo at gmail.com> wrote:
>
> Hi,
>
> In the past, I did not use ghost cells. Hence, for example, on an
> 8x8 grid I can divide it into 8x2 blocks for 4 processors, i.e.,
> divide along the y direction, because in my computation the number
> of cells in y is usually greater than in x. This will minimize the
> exchange of values.
>
> Now, with ghost cells, x and y have changed from 1 to 8 to 0 to 9,
> i.e., the grid is now 10x10. Hence, dividing among 4 processors
> does not give an integer, because 10/4 is not an integer. I'm
> thinking of using a 6x6 grid instead, which becomes 8x8 once the
> ghost cells are included. Is this the right/best way?
>
> Thank you very much and have a nice day!
>
>
> Usually it is not crucial to divide the grid into exactly equal parts
> since the number of elements is large.
>
> Matt
>
>
>
> Yours sincerely,
>
> Wee-Beng Tay
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
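P.S. Regarding the question in the quoted message about 10 rows not
dividing evenly among 4 processors: I suppose the rows could also be
spread automatically with something like the sketch below (nprocs and
nrows_local are placeholder names; size_y = 10 here, including the ghost
rows). It gives a 3,3,2,2 split instead of my 3,2,2,3 split above:

   ! simple block split of size_y rows over nprocs processes:
   ! the first mod(size_y,nprocs) processes each get one extra row
   nrows_local = size_y / nprocs
   if (myid < mod(size_y, nprocs)) nrows_local = nrows_local + 1
   n_local = nrows_local * size_x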