[petsc-users] MatView for parallel jobs
Jose E. Roman
jroman at dsic.upv.es
Thu Jul 10 10:55:34 CDT 2025
You can try this:
B = A.convert('dense')                       # convert to the dense matrix type (MATDENSE)
dense_mat = B.getDenseArray(readonly=True)   # NumPy view of the dense values
print(dense_mat)
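For reference, a minimal self-contained sketch of the whole round trip (assuming the file was written with -mat_view binary -viewer_binary_filename matrix, as in your script below, and that mpi4py is available). Note that getDenseArray() returns only the locally owned rows when running on more than one process, so this sketch gathers the row blocks on rank 0 before printing:

from petsc4py import PETSc
import numpy as np

# Load the binary dump produced by -mat_view binary
viewer = PETSc.Viewer().createBinary("matrix", "r", comm=PETSc.COMM_WORLD)
A = PETSc.Mat().load(viewer)

# Convert to MATDENSE and view the local row block as a NumPy array
B = A.convert('dense')
local_block = B.getDenseArray(readonly=True)  # rows owned by this process only

# Gather the row blocks on rank 0 so the full matrix prints in order
comm = PETSc.COMM_WORLD.tompi4py()
blocks = comm.gather(np.array(local_block), root=0)
if comm.rank == 0:
    print(np.vstack(blocks))

On a single process this reduces to the three lines above.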
Jose
> On 10 Jul 2025, at 10:35, Arun Soman Pillai - STFC UKRI via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
> Hi,
> I am trying to view a matrix object when running a job in parallel. The application is BOUT++ (https://boutproject.github.io/), which uses PETSc as its solver.
> While the option -mat_view gives jumbled output on STDOUT, interleaved across the processors, redirecting the output to a specific file results in a race condition. The only option that works for me is -mat_view binary with -viewer_binary_filename <name>. I am processing this binary file with petsc4py as follows:
> from petsc4py import PETSc
> import numpy as np
> import matplotlib.pyplot as plt
> import glob
>
> def read_binary(filename):
>     viewer = PETSc.Viewer().createMPIIO(filename, "r", comm=PETSc.COMM_WORLD)
>     A = PETSc.Mat().load(viewer)
>     A.assemble()
>     print("Matrix type:", A.getType())
>     print("Matrix size:", A.getSize())
>     info = A.getInfo()
>     print("Number of nonzeros:", info['nz_used'])
>     print("Nonzeros allocated:", info['nz_allocated'])
>     m, n = A.getSize()
>     dense_mat = np.zeros((m, n), dtype=np.float64)
>     for i in range(m):
>         cols, vals = A.getRow(i)
>         dense_mat[i, cols] = vals
>     rstart, rend = A.getOwnershipRange()
>     rows, cols = [], []
>     for i in range(rstart, rend):
>         cols_i, _ = A.getRow(i)
>         rows.extend([i] * len(cols_i))
>         cols.extend(cols_i)
>     rows = np.array(rows)
>     cols = np.array(cols)
>     return dense_mat, rows, cols
>
> A, rows, cols = read_binary("matrix")
> This yields the correct number of nonzeros and the matrix sparsity pattern. However, the dense matrix turns out to be a null matrix. I wonder what is going wrong here.
> Thank you, best regards
> Arun Pillai