On Thu, Dec 8, 2011 at 10:19 AM, Alexander Grayver <agrayver@gfz-potsdam.de> wrote:
> Hi dev-team,
>
> Using this code from Fortran:
>
> call PetscViewerBinaryOpen(sub%comm3d,'S.dat',FILE_MODE_WRITE,dviewer,ierr)
> call PetscViewerSetFormat(dviewer,PETSC_VIEWER_NATIVE,ierr)
> call MatView(S,dviewer,ierr)

Good diagnosis. Pushed a fix.

PETSc is a great case for Python code generation. In fact, all the Fortran headers should be generated.
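Just to illustrate the idea (a rough, hypothetical sketch, not an existing PETSc script): a generator could scrape the C prototypes and emit the Fortran interface stubs, so things like the PetscViewerSetFormat() stub never have to be maintained by hand.

#!/usr/bin/env python
# Hypothetical sketch: turn a C prototype into a fixed-form Fortran interface stub.
# A real generator would also map C types to the proper Fortran kinds.
import re

def fortran_stub(c_prototype):
    # e.g. "PetscErrorCode PetscViewerSetFormat(PetscViewer,PetscViewerFormat)"
    m = re.match(r'\s*PetscErrorCode\s+(\w+)\((.*)\)\s*$', c_prototype)
    name = m.group(1)
    argtypes = [a.strip() for a in m.group(2).split(',') if a.strip()]
    argnames = ['a%d' % i for i in range(len(argtypes))] + ['ierr']
    lines = ['      subroutine %s(%s)' % (name, ','.join(argnames))]
    for ctype, aname in zip(argtypes, argnames):
        lines.append('      %s %s' % (ctype, aname))
    lines.append('      PetscErrorCode ierr')
    lines.append('      end subroutine %s' % name)
    return '\n'.join(lines)

print(fortran_stub('PetscErrorCode PetscViewerSetFormat(PetscViewer,PetscViewerFormat)'))

Run over all the public prototypes, something along those lines would keep the Fortran stubs in sync with the C interface automatically.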
   Matt
> crashes the program with this error:
>
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type!
> [0]PETSC ERROR: To store a parallel dense matrix you must first call PetscViewerSetFormat(viewer,PETSC_VIEWER_NATIVE)!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Development HG revision: 65c529a610ceec3880844014d1188a718bb5f1fa HG Date: Wed Dec 07 04:10:48 2011 -0600
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: /home/emp on a openmpi-i named glic1 by agrayver Thu Dec 8 17:06:27 2011
> [0]PETSC ERROR: Libraries linked from /home/lib/petsc-dev/openmpi-intel-complex-debug-f-ds/lib
> [0]PETSC ERROR: Configure run at Wed Dec 7 17:18:42 2011
> [0]PETSC ERROR: Configure options --with-petsc-arch=openmpi-intel-complex-debug-f-ds --with-fortran-interfaces=1 --download-superlu_dist --download-mumps --download-parmetis --download-metis --with-scalapack-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_scalapack_lp64.a --with-scalapack-include=/opt/intel/Compiler/11.1/072/mkl/include --with-blacs-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_blacs_openmpi_lp64.a --with-blacs-include=/opt/intel/Compiler/11.1/072/mkl/include --with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2 --with-scalar-type=complex --with-blas-lapack-dir=/opt/intel/Compiler/11.1/072/mkl/lib/em64t --with-precision=double --with-x=0
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: MatView_MPIDense_Binary() line 678 in /home/lib/petsc-dev/src/mat/impls/dense/mpi/mpidense.c
> [0]PETSC ERROR: MatView_MPIDense() line 800 in /home/lib/petsc-dev/src/mat/impls/dense/mpi/mpidense.c
> [0]PETSC ERROR: MatView() line 761 in /home/lib/petsc-dev/src/mat/interface/matrix.c
>
> The same code works fine with petsc-3.2-p5.
> The same problem was reported for the release version and fixed here:
> https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2011-September/010130.html
>
> Regards,
> Alexander
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
   -- Norbert Wiener