[petsc-users] MATELEMENTAL

Hong Zhang hzhang at mcs.anl.gov
Tue Jun 11 22:25:32 CDT 2013


Shuangshuang:

Use
  ierr = MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, 12, 12); CHKERRQ(ierr);
for any number of processors (np), or
  ierr = MatSetSizes(J, 6, 6, PETSC_DECIDE, PETSC_DECIDE); CHKERRQ(ierr);
for np=2, or
  ierr = MatSetSizes(J, 4, 4, PETSC_DECIDE, PETSC_DECIDE); CHKERRQ(ierr);
for np=3.
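
For code that should run unchanged at any number of processes, a minimal
sketch (using the same calls and the matrix name J from your code below)
is to give only the global sizes and let PETSc pick the local split:

  ierr = MatCreate(PETSC_COMM_WORLD, &J); CHKERRQ(ierr);
  /* PETSC_DECIDE for the local sizes lets PETSc choose the split, so the
     same code runs for np = 1, 2, 3, ... even when np does not divide 12
     evenly */
  ierr = MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, 12, 12); CHKERRQ(ierr);
  ierr = MatSetType(J, MATELEMENTAL); CHKERRQ(ierr);
  ierr = MatSetFromOptions(J); CHKERRQ(ierr);
  ierr = MatSetUp(J); CHKERRQ(ierr);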

>
> Hi, Hong, does that mean in my code, where my J is 12x12:
>
> if I use one processor, I should set the matrix size as
>
>       ierr = MatSetSizes(J, 12, 12, 12, 12); CHKERRQ(ierr);
>
> if I use two processors, I should write
>
>       ierr = MatSetSizes(J, 6, 6, 12, 12); CHKERRQ(ierr);
>
> and for three processors, it should be
>
>       ierr = MatSetSizes(J, 4, 4, 12, 12); CHKERRQ(ierr);
>
> which means the number of local rows = the number of global rows / the
> number of processors?
>
Yes.
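
Note also that MatGetOwnershipIS() returns the global indices owned by
this process in Elemental's cyclic layout, so the values passed to
MatSetValues() must be looked up by those global indices, not by the loop
counters. A sketch of the fill loop, following the pattern in ex38.c and
assuming jacVal holds the full 12x12 Jacobian on every process (as in
your code below):

  for (i = 0; i < nrows; i++) {
    for (j = 0; j < ncols; j++) {
      /* rows[i] and cols[j] are global indices in [0, 11] */
      v[i*ncols + j] = jacVal[rows[i]][cols[j]];
    }
  }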

Hong

> *From:* Hong Zhang [mailto:hzhang at mcs.anl.gov <hzhang at mcs.anl.gov>]
> *Sent:* Tuesday, June 11, 2013 3:05 PM
> *To:* Jin, Shuangshuang
> *Cc:* petsc-users at mcs.anl.gov
> *Subject:* Re: [petsc-users] MATELEMENTAL
>
> Shuangshuang,
>
> In ex38.c, m and n are local block sizes, not global. When running np=2 on
> a matrix with global sizes 12 by 12, you should give
>
> mpiexec -n 2 ./ex38 -Cexp_view_ownership 1 -m 6 -n 6
>
> Hong
>
> On Tue, Jun 11, 2013 at 3:22 PM, Jin, Shuangshuang <
> Shuangshuang.Jin at pnnl.gov> wrote:
>
> Hello, I'm trying to set my Jacobian matrix to MATELEMENTAL type.
>
> Following the Elemental matrix example code
> /src/mat/examples/tests/ex38.c, I wrote my piece of code as follows:
>
>   ierr = MatCreate(PETSC_COMM_WORLD, &J); CHKERRQ(ierr); // J: Jacobian matrix
>   ierr = MatSetSizes(J, 12, 12, PETSC_DECIDE, PETSC_DECIDE); CHKERRQ(ierr);
>   ierr = MatSetType(J, MATELEMENTAL); CHKERRQ(ierr);
>   ierr = MatSetFromOptions(J); CHKERRQ(ierr);
>   ierr = MatSetUp(J); CHKERRQ(ierr);
>
>   PetscInt       nrows,ncols;
>   const PetscInt *rows,*cols;
>   IS             isrows,iscols;
>   PetscScalar    *v;
>   ierr = MatGetOwnershipIS(J, &isrows, &iscols); CHKERRQ(ierr);
>
>   // Set local matrix entries
>   ierr = ISGetLocalSize(isrows, &nrows); CHKERRQ(ierr);
>   ierr = ISGetIndices(isrows, &rows); CHKERRQ(ierr);
>   ierr = ISGetLocalSize(iscols, &ncols); CHKERRQ(ierr);
>   ierr = ISGetIndices(iscols, &cols); CHKERRQ(ierr);
>   ierr = PetscMalloc(nrows*ncols*sizeof(*v), &v); CHKERRQ(ierr);
>   printf("nrows=%d, ncols=%d\n", nrows, ncols);
>   for (i = 0; i < nrows; i++) {
>     for (j = 0; j < ncols; j++) {
>       v[i*ncols+j] = jacVal[i][j]; // jacVal stores the values of the 12x12 Jacobian matrix
>     }
>   }
>   ierr = MatSetValues(J, nrows, rows, ncols, cols, v, INSERT_VALUES); CHKERRQ(ierr);
>   ierr = ISRestoreIndices(isrows, &rows); CHKERRQ(ierr);
>   ierr = ISRestoreIndices(iscols, &cols); CHKERRQ(ierr);
>   ierr = MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
>   ierr = MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
>   ierr = PetscFree(v); CHKERRQ(ierr); // free the scratch array of values
>   ierr = ISDestroy(&isrows); CHKERRQ(ierr);
>   ierr = ISDestroy(&iscols); CHKERRQ(ierr);
>
> This works for one processor. When it runs on multiple processors, it fails.
>
> I notice that when the # of processors equals 1, nrows and ncols are each
> 12, which is the dimension of the Jacobian matrix.
>
> However, when the # of processors goes up, for example to 2, nrows=24 and
> ncols=12, and it fails because the indices of my jacVal[i][j] are within
> [0, 11].
>
> My question is: to use MATELEMENTAL, what is the rule for fitting the
> values of my 12x12 matrix into a 24x12 array with the two-level for loops?
>
> And I don't understand what the index sets mean here:
>
> [d3m956 at olympus tests]$ mpiexec -n 1 ex38 -Cexp_view_ownership 1 -m 12 -n 12
>
> Ownership of explicit C:
> Row index set:
> Number of indices in set 12
> 0 0
> 1 1
> 2 2
> 3 3
> 4 4
> 5 5
> 6 6
> 7 7
> 8 8
> 9 9
> 10 10
> 11 11
> Column index set:
> Number of indices in set 12
> 0 0
> 1 1
> 2 2
> 3 3
> 4 4
> 5 5
> 6 6
> 7 7
> 8 8
> 9 9
> 10 10
> 11 11
> nrows=12, ncols=12
>
> [d3m956 at olympus tests]$ mpiexec -n 2 ex38 -Cexp_view_ownership 1 -m 12 -n 12
>
> Ownership of explicit C:
> Row index set:
> [0] Number of indices in set 24
> [0] 0 0
> [0] 1 12
> [0] 2 1
> [0] 3 13
> [0] 4 2
> [0] 5 14
> [0] 6 3
> [0] 7 15
> [0] 8 4
> [0] 9 16
> [0] 10 5
> [0] 11 17
> [0] 12 6
> [0] 13 18
> [0] 14 7
> [0] 15 19
> [0] 16 8
> [0] 17 20
> [0] 18 9
> [0] 19 21
> [0] 20 10
> [0] 21 22
> [0] 22 11
> [0] 23 23
> nrows=24, ncols=12
> [1] Number of indices in set 24
> [1] 0 0
> [1] 1 12
> [1] 2 1
> [1] 3 13
> [1] 4 2
> [1] 5 14
> [1] 6 3
> [1] 7 15
> [1] 8 4
> [1] 9 16
> [1] 10 5
> [1] 11 17
> [1] 12 6
> [1] 13 18
> [1] 14 7
> [1] 15 19
> [1] 16 8
> [1] 17 20
> [1] 18 9
> [1] 19 21
> [1] 20 10
> [1] 21 22
> [1] 22 11
> [1] 23 23
> Column index set:
> [0] Number of indices in set 12
> [0] 0 0
> [0] 1 1
> [0] 2 2
> [0] 3 3
> [0] 4 4
> [0] 5 5
> [0] 6 6
> [0] 7 7
> [0] 8 8
> [0] 9 9
> [0] 10 10
> [0] 11 11
> [1] Number of indices in set 12
> [1] 0 12
> [1] 1 13
> [1] 2 14
> [1] 3 15
> [1] 4 16
> [1] 5 17
> [1] 6 18
> [1] 7 19
> [1] 8 20
> [1] 9 21
> [1] 10 22
> [1] 11 23
> nrows=24, ncols=12
>
> Thanks,
> Shuangshuang