[petsc-users] Binary input-output for parallel dense matrices (petsc4py)
Analabha Roy
hariseldon99 at gmail.com
Thu Jun 30 17:22:57 CDT 2016
Hi,
Thanks for your attention. I created A_new explicitly and set the type to
dense, and it ran as expected. I also did
A_new = A.duplicate()
A_new.load(viewer_new)
and it worked too!
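
For the archives, the explicit version looks roughly like this (a sketch, not
the exact code I ran; viewer_new is the read-mode binary viewer from the
script quoted below):

# create A_new explicitly and set its type to dense before loading
A_new = PETSc.Mat().create(PETSc.COMM_WORLD)
A_new.setType(PETSc.Mat.Type.DENSE)
A_new.load(viewer_new)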
Regards,
AR
On Thu, Jun 30, 2016 at 7:12 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> Ahh. When you use the "native" format with dense matrices, you can only
> read the matrix back in later into a dense matrix. You need to set the
> A_new matrix type to dense before calling the load.
>
> We need more error checking in the PETSc MatLoad() for non-dense matrix
> formats, so that the code stops cleanly with an error message saying that
> the file format is not handled by the matrix type. I will do this.
>
> Barry
>
>
>
> > On Jun 30, 2016, at 11:30 AM, Analabha Roy <hariseldon99 at gmail.com>
> wrote:
> >
> > Hi all,
> >
> > I'm trying to do basic binary input/output for parallel matrices.
> > Following the example in "petsc4py/demo/binary-io/matvecio.py", the code
> > below seems to work fine (raw python file attached) with "mpirun -np 2
> > python ./test_binwrite.py" until I uncomment
> > "#A.setType(PETSc.Mat.Type.DENSE)" in order to do it for parallel dense
> > matrices. Then it crashes with an error (stderr file attached; see below
> > for the highlighted message). What am I doing wrong?
> >
> > Python Code
> >
> ========================================================================================
> >
> > #!/usr/bin/env python
> > """
> > Created on Wed Jun 29 22:05:59 2016
> >
> > @author: daneel at utexas.edu
> > """
> >
> > import numpy as np
> > from petsc4py import PETSc
> >
> > Print = PETSc.Sys.Print
> >
> > matsize = 100
> > filename = 'tmpMatrix-A.dat'
> >
> > A = PETSc.Mat().create(PETSc.COMM_WORLD)
> > A.setSizes([matsize, matsize])
> > #A.setType(PETSc.Mat.Type.DENSE)
> > A.setUp()
> >
> > Istart, Iend = A.getOwnershipRange()
> > for I in xrange(Istart, Iend):
> >     A[I, :] = np.random.random(matsize)
> >
> > A.assemble()
> > Print("Random matrix assembled. Now writing to temporary file ...")
> >
> > # save
> > viewer = PETSc.Viewer().createBinary(filename, 'w')
> > viewer.pushFormat(viewer.Format.NATIVE)
> > viewer.view(A)
> > Print("Written to ", filename)
> >
> > # reload
> > Print("Now reloading from ", filename)
> > viewer_new = PETSc.Viewer().createBinary(filename, 'r')
> > viewer_new.pushFormat(viewer.getFormat())
> > A_new = PETSc.Mat().load(viewer_new)
> > assert A_new.equal(A), "Reload unsuccessful"
> > Print("Reload successful")
> ========================================================================================
> >
> > stderr output for parallel dense matrices:
> >
> ========================================================================================
> > Random matrix assembled. Now writing to temporary file ...
> > Written to tmpMatrix-A.dat
> > Now reloading from tmpMatrix-A.dat
> > Traceback (most recent call last):
> >   File "./test_binwrite.py", line 36, in <module>
> >     A_new = PETSc.Mat().load(viewer_new)
> >   File "PETSc/Mat.pyx", line 642, in petsc4py.PETSc.Mat.load (src/petsc4py.PETSc.c:118243)
> > petsc4py.PETSc.Error: error code 55
> > [0] MatLoad() line 1013 in /home/daneel/.local/src/petsc-3.7.2/src/mat/interface/matrix.c
> > [0] MatLoad_MPIAIJ() line 2947 in /home/daneel/.local/src/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [0] Out of memory. Allocated: 0, Used by process: 23687168
> > [0] Memory requested 5942180016
> > Traceback (most recent call last):
> >   File "./test_binwrite.py", line 36, in <module>
> >     A_new = PETSc.Mat().load(viewer_new)
> >   File "PETSc/Mat.pyx", line 642, in petsc4py.PETSc.Mat.load (src/petsc4py.PETSc.c:118243)
> > petsc4py.PETSc.Error: error code 55
> > [1] MatLoad() line 1013 in /home/daneel/.local/src/petsc-3.7.2/src/mat/interface/matrix.c
> > [1] MatLoad_MPIAIJ() line 2967 in /home/daneel/.local/src/petsc-3.7.2/src/mat/impls/aij/mpi/mpiaij.c
> > [1] Out of memory. Allocated: 0, Used by process: 23506944
> > [1] Memory requested 18446744072415633408
> > --------------------------------------------------------------------------
> > mpirun noticed that the job aborted, but has no info as to the process
> > that caused that situation.
> > --------------------------------------------------------------------------
> >
> >
> ========================================================================================
> >
> > --
> > ---
> > Analabha Roy
> > Postdoctoral Fellow
> > National Institute of Theoretical Physics (NiTheP)
> > Private Bag X1, Matieland,
> > Stellenbosch, South Africa, 7602
> > Emails: daneel at utexas.edu, daneel at sun.ac.za, hariseldon99 at gmail.com
> > Webpage: http://www.ph.utexas.edu/~daneel/
> > <stderr.txt><test_binwrite.py>
>
>
--
---
Analabha Roy
Postdoctoral Fellow
National Institute of Theoretical Physics (NiTheP) <http://www.nithep.ac.za/>
Private Bag X1, Matieland,
Stellenbosch, South Africa, 7602
Emails: daneel at utexas.edu, daneel at sun.ac.za, hariseldon99 at gmail.com
Webpage: http://www.ph.utexas.edu/~daneel/