[petsc-users] [petsc-3.2] MatView on dense matrix
Alexander Grayver
agrayver at gfz-potsdam.de
Fri Sep 16 12:15:16 CDT 2011
Matt,
Can you explain what you mean by seeing what format is returned?
I will check in the debugger, of course, but I still find it strange.
I have two versions of PETSc (3.1-p7 and 3.2-p1) configured with the
same options; my code and runs are absolutely identical, yet I get this
error only with petsc-3.2.
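
Here is, roughly, what I plan to check first (a minimal sketch; I am
assuming the Fortran binding for PetscViewerGetFormat is available in
this build, and fmt/rank are just placeholder names):

PetscViewer       :: dviewer
PetscViewerFormat :: fmt
PetscMPIInt       :: rank

call MPI_Comm_rank(MPI_COMM_WORLD,rank,ierr)
call PetscViewerBinaryOpen(MPI_COMM_WORLD,'A.dat',FILE_MODE_WRITE,dviewer,ierr); CHKERRQ(ierr)
call PetscViewerSetFormat(dviewer,PETSC_VIEWER_NATIVE,ierr); CHKERRQ(ierr)
! Ask the viewer which format it actually holds on this process,
! to see whether PETSC_VIEWER_NATIVE really made it onto every rank
call PetscViewerGetFormat(dviewer,fmt,ierr); CHKERRQ(ierr)
if (fmt /= PETSC_VIEWER_NATIVE) then
   print *, 'rank ', rank, ': viewer reports unexpected format ', fmt
end if
call MatView(A,dviewer,ierr); CHKERRQ(ierr)
call PetscViewerDestroy(dviewer,ierr); CHKERRQ(ierr)
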
Regards,
Alexander
On 16.09.2011 15:04, Matthew Knepley wrote:
> On Fri, Sep 16, 2011 at 5:38 AM, Alexander Grayver
> <agrayver at gfz-potsdam.de> wrote:
>
> Hello,
>
> With petsc-3.1 I used this code to output an MPI dense matrix:
>
> PetscViewer :: dviewer
> call PetscViewerBinaryOpen(MPI_COMM_WORLD,'A.dat',FILE_MODE_WRITE,dviewer,ierr); CHKERRQ(ierr)
> call PetscViewerSetFormat(dviewer,PETSC_VIEWER_NATIVE,ierr); CHKERRQ(ierr)
> call MatView(A,dviewer,ierr); CHKERRQ(ierr)
> call PetscViewerDestroy(dviewer,ierr); CHKERRQ(ierr)
>
> With petsc-3.2 this code fails with the following error:
>
>
> This error just says that the viewer did not return a format of
> PETSC_VIEWER_NATIVE. Either you
> accidentally set it in the wrong viewer, or not on every process, or
> there is memory corruption. I recommend
> using the debugger to see what format is returned.
>
> Thanks,
>
> Matt
>
> [7]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [7]PETSC ERROR: No support for this operation for this object type!
> [7]PETSC ERROR: To store a parallel dense matrix you must first
> call PetscViewerSetFormat(viewer,PETSC_VIEWER_NATIVE)!
> [7]PETSC ERROR:
> ------------------------------------------------------------------------
> [7]PETSC ERROR: Petsc Release Version 3.2.0, Patch 1, Mon Sep 12
> 16:01:51 CDT 2011
> [7]PETSC ERROR: See docs/changes/index.html for recent updates.
> [7]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [7]PETSC ERROR: See docs/index.html for manual pages.
> [7]PETSC ERROR:
> ------------------------------------------------------------------------
> [7]PETSC ERROR: /home/run/test on a openmpi-i named node205 by
> user Fri Sep 16 09:17:13 2011
> [7]PETSC ERROR: Libraries linked from
> /home/lib/petsc-3.2-p1/openmpi-intel-complex-debug-f/lib
> [7]PETSC ERROR: Configure run at Wed Sep 14 14:32:55 2011
> [7]PETSC ERROR: Configure options
> --with-petsc-arch=openmpi-intel-complex-debug-f
> --with-fortran-interfaces=1
> --with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2
> --with-scalar-type=complex
> --with-blas-lapack-dir=/opt/intel/Compiler/11.1/072/mkl/lib/em64t
> --with-precision=double --with-x=0
> [7]PETSC ERROR:
> ------------------------------------------------------------------------
> [7]PETSC ERROR: MatView_MPIDense_Binary() line 667 in
> /home/lib/petsc-3.2-p1/src/mat/impls/dense/mpi/mpidense.c
> [7]PETSC ERROR: MatView_MPIDense() line 789 in
> /home/lib/petsc-3.2-p1/src/mat/impls/dense/mpi/mpidense.c
> [7]PETSC ERROR: MatView() line 757 in
> /home/lib/petsc-3.2-p1/src/mat/interface/matrix.c
>
> See full error attached. Is there something new in petsc-3.2
> concerning matrix output?
>
> Regards,
> Alexander
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener