[petsc-users] slepc on 1D row distributed matrix
Kannan, Ramakrishnan
kannanr at ornl.gov
Wed May 31 14:46:18 CDT 2017
Hello,
I have a sparse 1D row-distributed matrix in which every MPI process owns an m/p x n block of the global m x n matrix. I am running NHEP with krylovschur on it, and it throws an error. For your reference, I have attached a modified ex5.c, in which I call MatSetSizes on the matrix to emulate the 1D row distribution, along with the log file containing the error.
In the unmodified ex5.c, for m=5, N=15, the local_m and local_n are 3 and 3. How is the global 15x15 matrix distributed locally as 3x3 matrices? When I print the global matrix, it doesn't appear to be diagonal either.
If SLEPc doesn't support a sparse 1D row-distributed matrix, how do I redistribute it so that I can run NHEP on it?
--
Regards,
Ramki