Which way to decompose domain/grid

Stephen Wornom stephen.wornom at sophia.inria.fr
Thu Dec 10 05:55:13 CST 2009


Wee-Beng Tay wrote:
> Hi,
>
> I'm working on a 2D Cartesian grid that I am going to decompose for 
> MPI in my CFD Fortran code. The grid size is 110 x 70. I wonder how I 
> should decompose the grid - horizontally or vertically?
>
> For example, with 2 processors, should it be two 55x70 grids or two 110x35 grids?
>
> I thought that communication between the subgrids would be less if I do 
> 55x70, because each exchange would only involve 70 values. 
I wrote a Cartesian mesh partitioner that partitions in slices, since 
that involves the minimum communication time.
For general meshes I partition with METIS. Question: I would like to use 
the PETSc partitioner if one has the option to partition on the x-y 
coordinates. Is that possible?
Hope my question is clear.
Stephen
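
For a column-major Fortran array, slicing along the j direction keeps each 
interface contiguous in memory. As a minimal sketch only (the names jstart, 
jend, chunk and rest are hypothetical; rank, nprocs and size_y are assumed 
to come from MPI_Comm_rank/MPI_Comm_size and the surrounding code), the 
local j-range per rank could be computed like this:

    ! Minimal sketch, not from the original post: split the j direction
    ! into nearly equal slices, one per MPI rank.  'rank', 'nprocs' and
    ! 'size_y' are assumed to be set elsewhere (MPI_Comm_rank/size).
    integer :: jstart, jend, chunk, rest
    chunk  = size_y / nprocs
    rest   = mod(size_y, nprocs)
    jstart = rank*chunk + min(rank, rest) + 1
    jend   = jstart + chunk - 1
    if (rank < rest) jend = jend + 1
    ! each rank then loops over  do j = jstart, jend / do i = 1, size_x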
> However, if it's 110x35, it'll involve 110 values
>
> Hence does it matter in PETSc how the decomposition is done?
>
> On the other hand, Fortran is column major, and hence I do the 
> calculation:
>
> do j = 1, size_y
>    do i = 1, size_x
>       f(i,j) = ...
>    end do
> end do
>
> I'll need to "package" the 70 values into a chunk for efficient 
> sending. However, if it's 110x35, I can use mpi_isend directly since 
> the data is contiguous.
>
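One alternative to hand-packing the strided 70 values is an MPI derived 
datatype that describes the column-major stride, so mpi_isend can be used 
for the 55x70 split as well. A sketch only, with hypothetical names (iface 
for the interface column, dest and tag for the neighbour rank and message 
tag), assuming f is declared as f(size_x, size_y) double precision:

    ! Sketch only (assumes 'use mpi' or include 'mpif.h' is in scope):
    ! describe the strided interface line f(iface, 1:size_y) with a
    ! derived type so that no manual packing buffer is needed.
    integer :: linetype, req, ierr
    call MPI_Type_vector(size_y, 1, size_x, MPI_DOUBLE_PRECISION, linetype, ierr)
    call MPI_Type_commit(linetype, ierr)
    call MPI_Isend(f(iface,1), 1, linetype, dest, tag, MPI_COMM_WORLD, req, ierr)
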
> So is there a better option since there seems to be a conflict? I read 
> about the use of DMMG. Will this problem be dealt with much better if 
> I use DMMG instead?
>
> Thank you very much
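
On the DMMG question: in later PETSc releases the structured-grid 
decomposition is handled by the DMDA object (DA/DMMG were its 
predecessors), which chooses the processor layout and fills ghost values 
itself. A rough sketch, assuming a recent PETSc; the exact include paths 
and null-array constants differ between releases:

    ! Rough sketch only: let PETSc's DMDA choose and manage the 2D
    ! decomposition of the 110 x 70 grid.  Names assume a recent release.
          program decomp_sketch
    #include <petsc/finclude/petscdmda.h>
          use petscdmda
          implicit none
          DM             da
          PetscInt       xs, ys, xm, ym
          PetscErrorCode ierr

          call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
          ! 110 x 70 global grid, 1 dof per node, stencil width 1;
          ! PETSC_DECIDE lets PETSc pick the processor layout in x and y.
          call DMDACreate2d(PETSC_COMM_WORLD,                              &
               DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DMDA_STENCIL_STAR,      &
               110, 70, PETSC_DECIDE, PETSC_DECIDE, 1, 1,                  &
               PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, da, ierr)
          call DMSetFromOptions(da, ierr)
          call DMSetUp(da, ierr)
          ! Each rank loops only over its own corner and extent; ghost
          ! values come from DMGlobalToLocalBegin/End, not hand-coded sends.
          call DMDAGetCorners(da, xs, ys, PETSC_NULL_INTEGER,              &
               xm, ym, PETSC_NULL_INTEGER, ierr)
          call DMDestroy(da, ierr)
          call PetscFinalize(ierr)
          end program decomp_sketch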


-- 
stephen.wornom at sophia.inria.fr
2004 route des lucioles - BP93
Sophia Antipolis
06902 CEDEX
		
Tel: 04 92 38 50 54
Fax: 04 97 15 53 51
