Parallel partitioning of the matrix
Matthew Knepley
knepley at gmail.com
Wed Apr 1 11:57:56 CDT 2009
On Tue, Mar 31, 2009 at 3:58 PM, Nguyen, Hung V ERDC-ITL-MS <
Hung.V.Nguyen at usace.army.mil> wrote:
> All,
>
> I have a test case in which each processor reads its own part of the
> matrix, in CSR format, as dumped out by a CFD application.
> Note: the partitioning of the matrix was done by ParMetis.
>
> The code below shows how the data is inserted into the PETSc matrix (gmap
> is the global map). The solution from PETSc is very close to the CFD
> solution, so I think it is correct.
>
> My question is whether the parallel partitioning of the matrix is
> determined by PETSc at runtime, or whether it is the same as the
> ParMetis partitioning.
>
> Thank you,
>
> -hung
> ---
> /* create a matrix object */
> MatCreateMPIAIJ(PETSC_COMM_WORLD, my_own, my_own, M, M, mnnz,
                                    ^^^^^^^^^^^^^^
You have determined the partitioning right here.
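To see this concretely, here is a minimal sketch (the only assumptions are
the my_own and A from your code) showing that the local size you pass fixes
which contiguous block of global rows each process owns; you can confirm it
with MatGetOwnershipRange():

  PetscInt    rstart, rend;
  PetscMPIInt rank;

  /* Each process owns the contiguous global rows [rstart, rend);
     rend - rstart equals the my_own you passed to MatCreateMPIAIJ(). */
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MatGetOwnershipRange(A, &rstart, &rend);
  PetscPrintf(PETSC_COMM_SELF, "[%d] rows %d to %d\n",
              rank, (int)rstart, (int)rend - 1);
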
Matt
>
>                 PETSC_NULL, mnnz, PETSC_NULL, &A);
>
> for (i = 0; i < my_own; i++) {        /* loop over locally owned rows */
>   int row = gmap[i];                  /* map local row to global index */
>   for (j = ia[i]; j < ia[i+1]; j++) {
>     int col = ja[j];
>     jj = gmap[col];                   /* map local column to global index */
>     MatSetValues(A, 1, &row, 1, &jj, &val[j], INSERT_VALUES);
>   }
> }
> /* free temporary arrays */
> free(val); free(ja); free(ia);
>
> /* assemble the matrix and vectors */
> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>
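One more note: PETSc stores a contiguous range of global rows on each
process, so the ParMetis partition is carried over only through your gmap
renumbering. As a sketch (this prefix sum is illustrative, not taken from
your code), a contiguous PETSc numbering consistent with the ParMetis
local sizes can be computed as:

  PetscInt rstart;

  /* Exclusive prefix sum of the local sizes over the ranks: this
     process's rows get the global numbers rstart .. rstart + my_own - 1. */
  MPI_Scan(&my_own, &rstart, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
  rstart -= my_own;
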
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener