[petsc-users] Binary input-output for parallel dense matrices (petsc4py)

Analabha Roy hariseldon99 at gmail.com
Thu Jun 30 11:30:29 CDT 2016


Hi all,

 I'm trying to do basic binary input/output for parallel matrices.
Following the example in "petsc4py/demo/binary-io/matvecio.py
<https://github.com/erdc-cm/petsc4py/blob/master/demo/binary-io/matvecio.py>",
the code below (raw Python file attached) runs fine with
"mpirun -np 2 python ./test_binwrite.py" until I uncomment
"#A.setType(PETSc.Mat.Type.DENSE)" to do the same for a parallel dense
matrix. It then crashes with the error shown below (stderr also
attached). What am I doing wrong?

Python Code
========================================================================================

#!/usr/bin/env python

"""Created on Wed Jun 29 22:05:59 2016 at author: daneel at utexas.edu"""
import numpy as np
from petsc4py import PETSc

Print = PETSc.Sys.Print

matsize = 100
filename = 'tmpMatrix-A.dat'

A = PETSc.Mat().create(PETSc.COMM_WORLD)
A.setSizes([matsize, matsize])
#A.setType(PETSc.Mat.Type.DENSE)
A.setUp()

Istart, Iend = A.getOwnershipRange()
for I in xrange(Istart, Iend):
    A[I,:] = np.random.random(matsize)

A.assemble()
Print("Random matrix assembled. Now writing to temporary file ...")

# save
viewer = PETSc.Viewer().createBinary(filename, 'w')
viewer.pushFormat(viewer.Format.NATIVE)
viewer.view(A)
Print("Written to ", filename)

# reload
Print("Now reloading from ", filename)
viewer_new = PETSc.Viewer().createBinary(filename, 'r')
viewer_new.pushFormat(viewer.getFormat())
A_new = PETSc.Mat().load(viewer_new)
assert A_new.equal(A), "Reload unsuccessful"
Print("Reload successful")

========================================================================================
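
For clarity, the crashing variant is the same script with the matrix
type set explicitly before setUp(), i.e. only this part of the setup
changes:

A = PETSc.Mat().create(PETSc.COMM_WORLD)
A.setSizes([matsize, matsize])
A.setType(PETSc.Mat.Type.DENSE)  # uncommented for the parallel dense case
A.setUp()
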


stderr output for parallel dense matrices:

========================================================================================

Random matrix assembled. Now writing to temporary file ...
Written to  tmpMatrix-A.dat
Now reloading from  tmpMatrix-A.dat
Traceback (most recent call last):
  File "./test_binwrite.py", line 36, in <module>
Traceback (most recent call last):
    A_new = PETSc.Mat().load(viewer_new)
  File "./test_binwrite.py", line 36, in <module>
      File "PETSc/Mat.pyx", line 642, in petsc4py.PETSc.Mat.load (src/petsc4py.PETSc.c:118243)
A_new = PETSc.Mat().load(viewer_new)
  File "PETSc/Mat.pyx", line 642, in petsc4py.PETSc.Mat.load (src/petsc4py.PETSc.c:118243)
petsc4py.PETSc.Errorpetsc4py.PETSc.Error: : error code 55
[0] MatLoad() line 1013 in /home/daneel/.local/src/petsc-3.7.2/src/mat/interface/matrix.c
[0] MatLoad_MPIAIJ() line 2947 in /home/daneel/.local/src/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[0] MatLoad_MPIAIJ() line 2947 in /home/daneel/.local/src/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[0] Out of memory. Allocated: 0, Used by process: 23687168
[0] Memory requested 5942180016
error code 55
[1] MatLoad() line 1013 in /home/daneel/.local/src/petsc-3.7.2/src/mat/interface/matrix.c
[1] MatLoad_MPIAIJ() line 2967 in /home/daneel/.local/src/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[1] MatLoad_MPIAIJ() line 2967 in /home/daneel/.local/src/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
[1] Out of memory. Allocated: 0, Used by process: 23506944
[1] Memory requested 18446744072415633408
--------------------------------------------------------------------------
mpirun noticed that the job aborted, but has no info as to the process
that caused that situation.
--------------------------------------------------------------------------

========================================================================================
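
From the traceback it looks like MatLoad dispatches to MatLoad_MPIAIJ
on the reading side even though the file was written from a dense
matrix, so my guess is that the destination matrix also needs its type
set before load(). A minimal sketch of what I have in mind (untested on
my side, and I am assuming that load() on a pre-created Mat behaves
like a plain MatLoad on that matrix):

# untested sketch: pre-create the destination matrix as dense so that
# MatLoad does not assume the default (MPI)AIJ format when reading back
viewer_new = PETSc.Viewer().createBinary(filename, 'r')
A_new = PETSc.Mat().create(PETSc.COMM_WORLD)
A_new.setType(PETSc.Mat.Type.DENSE)
A_new.load(viewer_new)

Is something like that the intended way to read a parallel dense matrix
back in, or should I be using a different viewer format?
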







-- 
---
Analabha Roy
Postdoctoral Fellow
National Institute of Theoretical Physics (NiTheP)
<http://www.nithep.ac.za/>
Private Bag X1, Matieland,
Stellenbosch, South Africa,7602
Emails: daneel at utexas.edu, daneel at sun.ac.za, hariseldon99 at gmail.com
Webpage: http://www.ph.utexas.edu/~daneel/

