[petsc-users] Slepc reading distributed matrix from files
Barry Smith
bsmith at mcs.anl.gov
Thu Jul 6 15:02:30 CDT 2017
> On Jul 6, 2017, at 10:48 AM, errabii sohaib <errabiisohaib at gmail.com> wrote:
>
> Hi everyone,
>
> I am trying to use Mat.view and Mat.load as you suggested. Leaving aside the fact that the matrix I get back is not what I had initially, my question is about how Mat.load reads the matrix in parallel. Am I able to read the matrix in parallel regardless of how it was written with Mat.view()?
Yes, you can save on one process and read it in parallel.
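For example, a minimal petsc4py sketch of that workflow (the filename 'matJ.bin' and the sizes are illustrative, not from your script):

from petsc4py import PETSc

# write once, e.g. in a single-rank run
M = PETSc.Mat().createAIJ([100, 100], comm=PETSc.COMM_SELF)
M.setUp()
M.assemblyBegin()
M.assemblyEnd()
vw = PETSc.Viewer().createBinary('matJ.bin', mode='w', comm=PETSc.COMM_SELF)
M.view(vw)
vw.destroy()

# read back in a parallel run; MatLoad spreads the rows across the ranks
A = PETSc.Mat().create(comm=PETSc.COMM_WORLD)
vw = PETSc.Viewer().createBinary('matJ.bin', mode='r', comm=PETSc.COMM_WORLD)
A.load(vw)
vw.destroy()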
> My second question is about the difference between PETSc.COMM_WORLD and MPI.COMM_WORLD. I read in the user's manual that they are practically the same, but creating a viewer with MPI.COMM_WORLD returns an error, while it works with PETSc.COMM_WORLD.
This may be a petsc4py thing; just use PETSc.COMM_WORLD.
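If you need an actual mpi4py communicator on the Python side, go from the PETSc wrapper to mpi4py rather than the other way around. A small sketch, assuming mpi4py is installed:

from petsc4py import PETSc

comm = PETSc.COMM_WORLD        # PETSc.Comm wrapper around MPI_COMM_WORLD
mpicomm = comm.tompi4py()      # the corresponding mpi4py.MPI.Comm
rank = mpicomm.Get_rank()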
>
> ps:
> Here is the code I am using to write the matrix:
>
> from petsc4py import PETSc
>
> comm = PETSc.COMM_WORLD
> rank = comm.Get_rank()
> size = comm.Get_size()
>
> # one file per rank; sizeLoc renamed from "size" so it does not shadow comm.Get_size()
> sizeGlob, nnzGlob, sizeLoc, nnz, I, J, A = readMat('jacobian/matJ%i' % rank)
>
> M = PETSc.Mat()
> M.create()
> M.setSizes(size=(sizeGlob, sizeGlob))
> M.setType('aij')
> M.setUp()
> for i in range(nnz):  # each proc holds different I, J, A and simply fills in the matrix
>     M[I[i]-1, J[i]-1] = A[i]  # 1-based indices in the file, 0-based in PETSc
> M.assemblyBegin()
> M.assemblyEnd()
>
> vi = PETSc.Viewer()
> vi.createBinary('matJacobian', mode=1, comm=comm)  # mode=1 is FILE_MODE_WRITE, i.e. mode='w'
> M.view(viewer=vi)
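A side note on the assembly loop above: inserting entries one at a time into an AIJ matrix without preallocation can be very slow, because PETSc repeatedly reallocates storage. A sketch of preallocating before the insertion loop (the per-row count of 10 is just an illustrative guess):

from petsc4py import PETSc

sizeGlob = 100                     # illustrative global size
M = PETSc.Mat().create(comm=PETSc.COMM_WORLD)
M.setSizes((sizeGlob, sizeGlob))
M.setType('aij')
M.setPreallocationNNZ(10)          # expect at most ~10 nonzeros per row
M.setUp()

Depending on your PETSc version, exceeding the preallocation is either merely slow or an error; M.setOption(PETSc.Mat.Option.NEW_NONZERO_ALLOCATION_ERR, False) disables the error.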
>
> And here is the code I am using to read it:
>
> comm = PETSc.COMM_WORLD
> rank = comm.Get_rank()
> size = comm.Get_size()
>
> A = PETSc.Mat(comm)
> vi = PETSc.Viewer()
> vi.createBinary('matJacobian', mode=0, comm=comm)  # mode=0 is FILE_MODE_READ, i.e. mode='r'
> A.load(vi)
>
> Checking the matrix after reading it with A.getType() returns seqaij, as if it weren't distributed across the processes!
This works fine in C, so again it could be a petsc4py thing; I don't know why it wouldn't work. Can you run with -info to get more information about what PETSc is doing?
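One thing to check: petsc4py only forwards command-line options to PETSc if it is initialized with them before PETSc is imported, so the top of the script needs something like

import sys
import petsc4py
petsc4py.init(sys.argv)    # forwards options such as -info to PETSc
from petsc4py import PETSc

and then you can run, e.g., mpiexec -n 4 python script.py -info. You can also print A.getSize(), A.getOwnershipRange(), and A.getComm().getSize() on each rank to see how (or whether) the rows were actually distributed.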
>
> Thank you so much for your time,
> Sohaib
>