<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=us-ascii">
</head>
<body>
<div class="BodyFragment"><font size="2"><span style="font-size:11pt;">
<div class="PlainText"><br>
  Sorry about this. The numerical values of the viewer formats got out of sync between C and Fortran. I've attached a patch file, which you can apply with
<br>
<br>
  patch -p1 < format.patch</div>
</span></font></div>
<div class="BodyFragment"><font size="2"><span style="font-size:11pt;">
<div class="PlainText"><br>
<br>
  or you can use the branch <a href="https://gitlab.com/petsc/petsc/merge_requests/2346">
https://gitlab.com/petsc/petsc/merge_requests/2346</a><br>
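<br>
  If you go the branch route, something like the following may work (a sketch: the local branch name mr-2346 is my own choice, and the ref relies on GitLab's convention of exposing merge requests under refs/merge-requests/2346/head):<br>
<br>

```shell
# From inside your PETSc source tree, fetch the merge-request ref from GitLab
# (ref name follows GitLab's merge-request convention; adjust if needed)
git fetch https://gitlab.com/petsc/petsc.git refs/merge-requests/2346/head:mr-2346
git checkout mr-2346
```

  then reconfigure and rebuild PETSc as usual.<br>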
<br>
   Barry<br>
<br>
<br>
> On Dec 3, 2019, at 1:10 AM, Marius Buerkle <mbuerkle@web.de> wrote:<br>
> <br>
> Hi,<br>
>  <br>
> I am trying to save a matrix in Elemental format to disk. I am doing the following, where p_matout is of type MATELEMENTAL:<br>
>  <br>
>     call PetscViewerCreate(PETSC_COMM_WORLD,v_file,ierr)    <br>
>     call PetscViewerPushFormat(v_file,PETSC_VIEWER_NATIVE,ierr)<br>
>     call PetscViewerSetType(v_file,PETSCVIEWERBINARY,ierr)    <br>
>     call PetscViewerFileSetMode(v_file,FILE_MODE_WRITE,ierr)    <br>
>     call PetscViewerFileSetName(v_file,trim(filename),ierr)        <br>
>     call MatView(p_matout,v_file,ierr)<br>
>     call PetscViewerDestroy(v_file,ierr)   <br>
>  <br>
> This gives the following error:<br>
> [18]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
> [18]PETSC ERROR: No support for this operation for this object type<br>
> [18]PETSC ERROR: To store a parallel dense matrix you must first call PetscViewerPushFormat(viewer,PETSC_VIEWER_NATIVE)<br>
> [18]PETSC ERROR: See <a href="https://www.mcs.anl.gov/petsc/documentation/faq.html">
https://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>
> [18]PETSC ERROR: Petsc Development GIT revision: v3.12.1-317-gcc59f4f82c  GIT Date: 2019-11-08 00:07:59 -0600<br>
> [18]PETSC ERROR: /home/marius/prog/ownstuff/fortran/programs/transomat_dev/save_load_fs/transomat/transomat on a  named tono-hpc1 by marius Tue Dec  3 16:08:38 2019<br>
> [18]PETSC ERROR: Configure options --prefix=/home/marius/prog/petsc/petsc_slepc_opt --with-scalar-type=complex --with-fortran-kernels=1 --with-64-bit-indices=0 --CC=mpicc --COPTFLAGS="-g -Ofast -std=c11 -qopenmmp" --CXX=mpicxx --CXXOPTFLAGS="-g -Ofast -std=c++14
 -qopenmp" --FC=mpif90 --FOPTFLAGS="-g -Ofast -traceback -qopenmp" --with-mpi=1 --with-x=0 --download-parmetis=1 --download-metis=1 --download-superlu_dist=1 --download-superlu_dist-commit=f8ace664ec4ca10e96e258a764552cbda299ba6e --download-superlu_dist-cmake-arguments=-Denable_openmp:BOOL=TRUE
 --download-hwloc=1 --download-sowing=1 --with-openmp=1 --with-pthread=1 --download-elemental=1 --download-elemental-commit=6eb15a0da2a4998bf1cf971ae231b78e06d989d9 --download-elemental-cmake-arguments=-DEL_HYBRID:BOOL=TRUE --with-cxx-dialect=c++11 --with-debugging=0
 --with-valgrind=0 --with-blaslapack-lib=" /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_intel_lp64.a
 /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_intel_thread.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_core.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.a
 -Wl,--end-group -liomp5 -lpthread -lm -ldl -lmpi_wrapper" --with-scalapack-lib=" /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_intel_lp64.a
 /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_intel_thread.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_core.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.a
 -Wl,--end-group -liomp5 -lpthread -lm -ldl -lmpi_wrapper" --with-mkl_pardiso-dir=/home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl --with-mkl_cpardiso-dir=/home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl --with-mkl_sparse-dir=/home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl
 --with-mkl_sparse_optimize-dir=/home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl --download-slepc=1 --download-slepc-commit=658271f --download-make=1 --download-cmake=1<br>
> [18]PETSC ERROR: #1 MatView_MPIDense_Binary() line 682 in /home/marius/prog/petsc/git/petsc/src/mat/impls/dense/mpi/mpidense.c<br>
> [18]PETSC ERROR: #2 MatView_MPIDense() line 786 in /home/marius/prog/petsc/git/petsc/src/mat/impls/dense/mpi/mpidense.c<br>
> [18]PETSC ERROR: #3 MatView() line 1066 in /home/marius/prog/petsc/git/petsc/src/mat/interface/matrix.c<br>
> [18]PETSC ERROR: #4 MatView_Elemental() line 83 in /home/marius/prog/petsc/git/petsc/src/mat/impls/elemental/matelem.cxx<br>
> [18]PETSC ERROR: #5 MatView() line 1066 in /home/marius/prog/petsc/git/petsc/src/mat/interface/matrix.c<br>
> [19]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
>  <br>
>  <br>
> and on stdout:<br>
> Elemental matrix (explicit ordering)<br>
>  <br>
> Any suggestions?<br>
>  <br>
> Best,<br>
> Marius<br>
<br>
</div>
</span></font></div>
</body>
</html>