Parallel partitioning of the matrix

Nguyen, Hung V ERDC-ITL-MS Hung.V.Nguyen at
Wed Apr 1 13:34:18 CDT 2009


Thank you for the info.


-----Original Message-----
From: petsc-users-bounces at
[mailto:petsc-users-bounces at] On Behalf Of Matthew Knepley
Sent: Wednesday, April 01, 2009 11:58 AM
To: PETSc users list
Subject: Re: Parallel partitioning of the matrix

On Tue, Mar 31, 2009 at 3:58 PM, Nguyen, Hung V ERDC-ITL-MS
<Hung.V.Nguyen at> wrote:

	I have a test case in which each processor reads its own part of the
	matrix in CSR format, dumped out by a CFD application.
	Note: the partitioning of the matrix was done by ParMetis.
	The code below shows how to insert the data into the PETSc matrix
	(gmap is the local-to-global index mapping).
	The solution from PETSc is very close to the CFD solution, so I think
	it is correct.
	My question: is the parallel partitioning of the matrix determined by
	PETSc at runtime, or is it the same as the ParMetis partitioning?
	Thank you,
	     /* create a matrix object */
	       MatCreateMPIAIJ(PETSC_COMM_WORLD, my_own, my_own, M, M, mnnz,


You have determined the partitioning right here: the local row count my_own
fixes which rows each process owns, so PETSc does not repartition at runtime.


	                  PETSC_NULL, mnnz, PETSC_NULL, &A);

	     for (i = 0; i < my_own; i++) {
	         int row = gmap[i];
	         for (j = ia[i]; j < ia[i+1]; j++) {
	             int col = ja[j];
	             jj = gmap[col];
	             MatSetValues(A, 1, &row, 1, &jj, &val[j], INSERT_VALUES);
	         }
	     }

	     /* free temporary arrays */
	     free(val); free(ja); free(ia);

	     /* assemble the matrix */
	     MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
	     MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

More information about the petsc-users mailing list