[petsc-dev] MatView on dense matrices crashes

Matthew Knepley knepley at gmail.com
Thu Dec 8 10:34:26 CST 2011


On Thu, Dec 8, 2011 at 10:19 AM, Alexander Grayver
<agrayver at gfz-potsdam.de> wrote:

> Hi dev-team,
>
> Using this code from Fortran:
>
> call PetscViewerBinaryOpen(sub%comm3d,'S.dat',FILE_MODE_WRITE,dviewer,ierr)
> call PetscViewerSetFormat(dviewer,PETSC_VIEWER_NATIVE,ierr)
> call MatView(S,dviewer,ierr)
>

Good diagnosis. Pushed a fix.

PETSc: A great case for Python code generation. In fact, all the Fortran
headers should be generated.
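
For illustration, here is a minimal, hypothetical sketch of the idea in
Python (not PETSc's actual generator; the argument-type mapping is
deliberately naive): emit a Fortran interface stub directly from a C
prototype, so the binding cannot drift out of sync with the C signature
the way a hand-written stub can.

    #!/usr/bin/env python
    # Hypothetical sketch: generate a Fortran interface stub from a C prototype.
    # A real generator would parse the PETSc headers; this handles one simple case.
    import re

    def fortran_stub(c_prototype):
        m = re.match(r'PetscErrorCode\s+(\w+)\s*\((.*)\);', c_prototype)
        name, args = m.group(1), m.group(2)
        arg_types = [a.strip() for a in args.split(',') if a.strip()]
        dummies = ', '.join('a%d' % i for i in range(len(arg_types)))
        lines = ['      subroutine %s(%s, ierr)' % (name, dummies)]
        for i, ftype in enumerate(arg_types):
            # Reuse the C type name; PETSc defines matching Fortran types.
            lines.append('      %s a%d' % (ftype, i))
        lines.append('      PetscErrorCode ierr')
        lines.append('      end subroutine %s' % name)
        return '\n'.join(lines)

    print(fortran_stub('PetscErrorCode PetscViewerSetFormat(PetscViewer, PetscViewerFormat);'))

Generating every stub this way would make a missing or mismatched Fortran
binding, like the one behind the crash below, much harder to introduce.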

   Matt


> crashes program with error:
>
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type!
> [0]PETSC ERROR: To store a parallel dense matrix you must first call
> PetscViewerSetFormat(viewer,PETSC_VIEWER_NATIVE)!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Development HG revision: 65c529a610ceec3880844014d1188a718bb5f1fa
> HG Date: Wed Dec 07 04:10:48 2011 -0600
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: /home/emp on a openmpi-i named glic1 by agrayver Thu Dec  8 17:06:27 2011
> [0]PETSC ERROR: Libraries linked from /home/lib/petsc-dev/openmpi-intel-complex-debug-f-ds/lib
> [0]PETSC ERROR: Configure run at Wed Dec  7 17:18:42 2011
> [0]PETSC ERROR: Configure options --with-petsc-arch=openmpi-intel-complex-debug-f-ds
> --with-fortran-interfaces=1 --download-superlu_dist --download-mumps
> --download-parmetis --download-metis
> --with-scalapack-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_scalapack_lp64.a
> --with-scalapack-include=/opt/intel/Compiler/11.1/072/mkl/include
> --with-blacs-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_blacs_openmpi_lp64.a
> --with-blacs-include=/opt/intel/Compiler/11.1/072/mkl/include
> --with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2 --with-scalar-type=complex
> --with-blas-lapack-dir=/opt/intel/Compiler/11.1/072/mkl/lib/em64t
> --with-precision=double --with-x=0
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: MatView_MPIDense_Binary() line 678 in
> /home/lib/petsc-dev/src/mat/impls/dense/mpi/mpidense.c
> [0]PETSC ERROR: MatView_MPIDense() line 800 in
> /home/lib/petsc-dev/src/mat/impls/dense/mpi/mpidense.c
> [0]PETSC ERROR: MatView() line 761 in
> /home/lib/petsc-dev/src/mat/interface/matrix.c
>
> Whereas the same code works well with petsc-3.2-p5.
> The same problem for the release version was reported and fixed here:
> https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2011-September/010130.html
>
> Regards,
> Alexander
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener