[petsc-users] slepc on 1D row distributed matrix
Kannan, Ramakrishnan
kannanr at ornl.gov
Wed May 31 16:14:55 CDT 2017
Jose,
Thank you for the quick reply.
In this specific example, there are 5 MPI processes and each process owns a 3 x 15 block of the 1D row-distributed matrix. According to MatSetSizes, I should set local rows, local cols, global rows, global cols, which in this case are 3, 15, 15, 15 respectively. Why would I set 3, 3, 15, 15 instead?
Also, in our program I use global_row_idx and global_col_idx for MatSetValues. If I set 3, 3, 15, 15 instead of 3, 15, 15, 15, my MatSetValues fails with the error “nnz cannot be greater than row length”. Also, to check that 3, 15, 15, 15 in MatSetSizes is right, we called MatCreateVecs and MatMult from PETSc, which seemed to work fine too.
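For concreteness, here is a minimal sketch of the setup under discussion (a sketch only: it assumes MPIAIJ storage and the 5-process, 15 x 15 case above; the diagonal fill and preallocation counts are purely illustrative, and error checking is omitted):

    Mat      A;
    PetscInt rstart, rend, i;

    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, 3, 3, 15, 15);   /* local n = local length of x, not 15 */
    MatSetFromOptions(A);
    /* each rank's "diagonal" block is 3 x 3, so per-row preallocation counts
       for it cannot exceed 3; larger counts typically trigger the quoted
       "nnz cannot be greater than row length" error */
    MatMPIAIJSetPreallocation(A, 3, NULL, 12, NULL);
    MatGetOwnershipRange(A, &rstart, &rend);
    for (i = rstart; i < rend; i++) {
      PetscScalar v = 1.0;
      MatSetValues(A, 1, &i, 1, &i, &v, INSERT_VALUES); /* indices are global */
    }
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

With these sizes, MatCreateVecs(A, &x, &y) returns vectors of local length 3 on every rank, consistent with y = A*x.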
Appreciate your kind help.
--
Regards,
Ramki
On 5/31/17, 4:26 PM, "Jose E. Roman" <jroman at dsic.upv.es> wrote:
> On 31 May 2017, at 21:46, Kannan, Ramakrishnan <kannanr at ornl.gov> wrote:
>
> Hello,
>
> I have a sparse 1D row-distributed matrix in which every MPI process owns an (m/p) x n block of the global m x n matrix. I am running NHEP with Krylov-Schur on it, and it throws an error. For your reference, I have attached the modified ex5.c, in which I call MatSetSizes to emulate the 1D row distribution, along with the log file containing the error.
>
> In the unmodified ex5.c, with 5 MPI processes and N=15, local_m and local_n are 3 and 3. How is the global 15x15 matrix distributed locally as 3x3 matrices? When I print the global matrix, it doesn’t appear to be diagonal either.
>
> If SLEPc doesn’t support sparse 1D row-distributed matrices, how do I redistribute mine so that I can run NHEP on it?
> --
> Regards,
> Ramki
>
> <ex5.c><slepc.o607511>
As explained in the manpage, the local column size n must match the local size of the x vector, so it must also be N/mpisize:
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetSizes.html
But be warned that your code will not work when N is not divisible by mpisize: in that case the local sizes will not add up to the global dimension.
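If the local sizes are set explicitly, one way to keep them correct for any N is to let PETSc compute the split (a sketch, assuming the same square matrix A as above; error checking omitted):

    PetscInt n = PETSC_DECIDE, N = 15;
    PetscSplitOwnership(PETSC_COMM_WORLD, &n, &N); /* this rank's share of N */
    MatSetSizes(A, n, n, N, N); /* local sizes now sum to N on any comm size */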
Setting local sizes is not necessary in your case, since by default PETSc is already doing a 1D block-row distribution.
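Equivalently, a minimal sketch of leaving the decision to PETSc entirely, which gives the same 1D block-row layout with no size arithmetic in user code:

    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 15, 15); /* default block-row split */
    MatSetUp(A);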
Jose