[petsc-users] [petsc-3.2] MatView on dense matrix
Alexander Grayver
agrayver at gfz-potsdam.de
Tue Sep 20 02:44:11 CDT 2011
Hi Satish,
The problem seems to be fixed now. Thanks!
Regards,
Alexander
On 20.09.2011 05:16, Satish Balay wrote:
> Can you try the attached patch - and see if it fixes the problem?
>
> cd petsc-3.2-p2
> patch -Np1 < viewer-f.patch
>
> Satish
>
> On Mon, 19 Sep 2011, Alexander Grayver wrote:
>
>> Hello,
>>
>> Is it possible to get a fixed version?
>>
>> Regards,
>> Alexander
>>
>> On 16.09.2011 19:59, Barry Smith wrote:
>>> The problem is that the C values of these parameters have gotten out of
>>> sync with the Fortran values.
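>>>
>>> For illustration only (the numbers below are made up; the real ones live in
>>> include/petscviewer.h and include/finclude/petscviewer.h), the Fortran header
>>> mirrors each member of the C PetscViewerFormat enum with a plain integer
>>> parameter, roughly like this, and the two values must agree:
>>>
>>>       PetscEnum PETSC_VIEWER_NATIVE
>>>       parameter (PETSC_VIEWER_NATIVE = 6)  ! illustrative value only; must equal
>>>                                            ! the value PETSC_VIEWER_NATIVE has in
>>>                                            ! the C PetscViewerFormat enum
>>>
>>> If the Fortran value drifts from the C one, PetscViewerSetFormat() called from
>>> Fortran ends up setting a different format, and MatView() then refuses to
>>> write the parallel dense matrix in native form, which is the error reported
>>> below.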
>>>
>>> Satish,
>>>
>>> Could you please fix the finclude/petscviewer.h values to match the C values
>>> in petsc-3.2 and send a patch?
>>>
>>> Thanks
>>>
>>> Barry
>>>
>>>
>>> On Sep 16, 2011, at 5:38 AM, Alexander Grayver wrote:
>>>
>>>> Hello,
>>>>
>>>> With petsc-3.1 I used this code to output an MPI dense matrix:
>>>>
>>>> PetscViewer :: dviewer
>>>> call PetscViewerBinaryOpen(MPI_COMM_WORLD,'A.dat',FILE_MODE_WRITE,dviewer,ierr); CHKERRQ(ierr)
>>>> call PetscViewerSetFormat(dviewer,PETSC_VIEWER_NATIVE,ierr); CHKERRQ(ierr)
>>>> call MatView(A,dviewer,ierr); CHKERRQ(ierr)
>>>> call PetscViewerDestroy(dviewer,ierr); CHKERRQ(ierr)
>>>>
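>>>> For completeness, here is a minimal self-contained sketch of how these calls
>>>> sit in a full program (the matrix sizes, the MatCreateMPIDense call, and the
>>>> file name write_dense.F90 are placeholders, not my actual code):
>>>>
>>>> program write_dense
>>>>   implicit none
>>>> #include "finclude/petscsys.h"
>>>> #include "finclude/petscmat.h"
>>>> #include "finclude/petscviewer.h"
>>>>   Mat            :: A
>>>>   PetscViewer    :: dviewer
>>>>   PetscErrorCode :: ierr
>>>>
>>>>   call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>>>>   ! create a small distributed dense matrix (sizes are arbitrary here)
>>>>   call MatCreateMPIDense(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE, &
>>>>        10,10,PETSC_NULL_SCALAR,A,ierr)
>>>>   ! ... fill A with MatSetValues ...
>>>>   call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)
>>>>   call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)
>>>>   ! write A in the native binary format, exactly as in the snippet above
>>>>   call PetscViewerBinaryOpen(MPI_COMM_WORLD,'A.dat',FILE_MODE_WRITE,dviewer,ierr)
>>>>   call PetscViewerSetFormat(dviewer,PETSC_VIEWER_NATIVE,ierr)
>>>>   call MatView(A,dviewer,ierr)
>>>>   call PetscViewerDestroy(dviewer,ierr)
>>>>   call MatDestroy(A,ierr)
>>>>   call PetscFinalize(ierr)
>>>> end program write_dense
>>>>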
>>>> With petsc-3.2 this code fails with the following error:
>>>>
>>>> [7]PETSC ERROR: --------------------- Error Message
>>>> ------------------------------------
>>>> [7]PETSC ERROR: No support for this operation for this object type!
>>>> [7]PETSC ERROR: To store a parallel dense matrix you must first call
>>>> PetscViewerSetFormat(viewer,PETSC_VIEWER_NATIVE)!
>>>> [7]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> [7]PETSC ERROR: Petsc Release Version 3.2.0, Patch 1, Mon Sep 12 16:01:51
>>>> CDT 2011
>>>> [7]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>> [7]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>> [7]PETSC ERROR: See docs/index.html for manual pages.
>>>> [7]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> [7]PETSC ERROR: /home/run/test on a openmpi-i named node205 by user Fri
>>>> Sep 16 09:17:13 2011
>>>> [7]PETSC ERROR: Libraries linked from
>>>> /home/lib/petsc-3.2-p1/openmpi-intel-complex-debug-f/lib
>>>> [7]PETSC ERROR: Configure run at Wed Sep 14 14:32:55 2011
>>>> [7]PETSC ERROR: Configure options
>>>> --with-petsc-arch=openmpi-intel-complex-debug-f
>>>> --with-fortran-interfaces=1 --with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2
>>>> --with-scalar-type=complex
>>>> --with-blas-lapack-dir=/opt/intel/Compiler/11.1/072/mkl/lib/em64t
>>>> --with-precision=double --with-x=0
>>>> [7]PETSC ERROR:
>>>> ------------------------------------------------------------------------
>>>> [7]PETSC ERROR: MatView_MPIDense_Binary() line 667 in
>>>> /home/lib/petsc-3.2-p1/src/mat/impls/dense/mpi/mpidense.c
>>>> [7]PETSC ERROR: MatView_MPIDense() line 789 in
>>>> /home/lib/petsc-3.2-p1/src/mat/impls/dense/mpi/mpidense.c
>>>> [7]PETSC ERROR: MatView() line 757 in
>>>> /home/lib/petsc-3.2-p1/src/mat/interface/matrix.c
>>>>
>>>> See full error attached. Is there something new in petsc-3.2 concerning
>>>> matrix output?
>>>>
>>>> Regards,
>>>> Alexander
>>>> <error.txt>
>>