[petsc-users] MatView to disk for Elemental

Smith, Barry F. bsmith at mcs.anl.gov
Tue Dec 10 07:56:58 CST 2019


  What went wrong? Since you don't have write access to the repository, you will need to make a fork and then open an MR from the fork.

  Send all the details, including screenshots of what fails; we've had other people do this successfully in the past.

  Barry
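
A rough sketch of that fork workflow, purely for illustration; the fork URL and branch name below are placeholders and not taken from this thread:

# after creating a fork of https://gitlab.com/petsc/petsc in the GitLab web UI
git clone https://gitlab.com/petsc/petsc.git
cd petsc
git remote add fork https://gitlab.com/<your-username>/petsc.git   # placeholder fork URL
git checkout -b <your-username>/my-fix                             # placeholder branch name
# ... commit the fix ...
git push -u fork <your-username>/my-fix
# then open the merge request against petsc/petsc from the fork's GitLab page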

> On Dec 9, 2019, at 7:59 PM, Marius Buerkle <mbuerkle at web.de> wrote:
> 
> Hi,
>  
> Is it actually possible to submit a pull (merge) request? I followed the PETSc wiki, but it didn't work.
>  
> Best
> Marius
>  
>  
> Sent: Thursday, 05 December 2019 at 07:45
> From: "Marius Buerkle" <mbuerkle at web.de>
> To: "Smith, Barry F." <bsmith at mcs.anl.gov>
> Cc: "PETSc users list" <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] MatView to disk for Elemental
> It doesn't look difficult to fix; I can do it and create a pull request.
>  
> 
> From: "Smith, Barry F." <bsmith at mcs.anl.gov>
> To: "Marius Buerkle" <mbuerkle at web.de>
> Cc: "PETSc users list" <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] MatView to disk for Elemental
> 
> Agreed. The fix for the bug you found is now in maint. I will try to do another MR that fixes this, but there is a lot to do today, so it may take a while.
> 
> Barry
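
A minimal sketch of picking up a fix that has landed in maint, assuming an existing PETSc git clone with PETSC_DIR and PETSC_ARCH already set for the current build:

cd $PETSC_DIR
git checkout maint
git pull
make all    # rebuild; re-running configure may be needed if dependencies changed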
> 
> 
> 
> > On Dec 4, 2019, at 1:02 AM, Marius Buerkle <mbuerkle at web.de> wrote:
> >
> > Thanks for the swift fix, it works now. One more question, though: it still prints "Elemental matrix (explicit ordering)" to stdout, which is rather annoying. Is there any way to turn this off?
> >
> >
> > From: "Smith, Barry F." <bsmith at mcs.anl.gov>
> > To: "Marius Buerkle" <mbuerkle at web.de>
> > Cc: "petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
> > Subject: Re: [petsc-users] MatView to disk for Elemental
> >
> > Sorry about this. The numerical values of the viewer format got out of sync between C and Fortran. I've attached a patch file that you can apply with
> >
> > patch -p1 < format.patch
> >
> >
> > or you can use the branch https://gitlab.com/petsc/petsc/merge_requests/2346
> >
> > Barry
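
Both routes can be sketched roughly as follows; the refs/merge-requests/<id>/head fetch syntax is GitLab's generic merge-request ref convention and is an assumption here, not something stated in the thread:

cd $PETSC_DIR
# option 1: apply the attached patch in the source tree
patch -p1 < format.patch
# option 2: fetch the merge-request branch directly from GitLab (assumed ref convention)
git fetch origin refs/merge-requests/2346/head
git checkout -b mr-2346 FETCH_HEAD
# rebuild PETSc afterwards (e.g. make all)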
> >
> >
> > > On Dec 3, 2019, at 1:10 AM, Marius Buerkle <mbuerkle at web.de> wrote:
> > >
> > > Hi,
> > >
> > > I am trying to save a matrix in Elemental format to disk. I am doing the following, where p_matout is of type MATELEMENTAL:
> > >
> > > call PetscViewerCreate(PETSC_COMM_WORLD,v_file,ierr)
> > > call PetscViewerPushFormat(v_file,PETSC_VIEWER_NATIVE,ierr)
> > > call PetscViewerSetType(v_file,PETSCVIEWERBINARY,ierr)
> > > call PetscViewerFileSetMode(v_file,FILE_MODE_WRITE,ierr)
> > > call PetscViewerFileSetName(v_file,trim(filename),ierr)
> > > call MatView(p_matout,v_file,ierr)
> > > call PetscViewerDestroy(v_file,ierr)
> > >
> > > This gives the following error
> > > [18]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > > [18]PETSC ERROR: No support for this operation for this object type
> > > [18]PETSC ERROR: To store a parallel dense matrix you must first call PetscViewerPushFormat(viewer,PETSC_VIEWER_NATIVE)
> > > [18]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > > [18]PETSC ERROR: Petsc Development GIT revision: v3.12.1-317-gcc59f4f82c GIT Date: 2019-11-08 00:07:59 -0600
> > > [18]PETSC ERROR: /home/marius/prog/ownstuff/fortran/programs/transomat_dev/save_load_fs/transomat/transomat on a named tono-hpc1 by marius Tue Dec 3 16:08:38 2019
> > > [18]PETSC ERROR: Configure options --prefix=/home/marius/prog/petsc/petsc_slepc_opt --with-scalar-type=complex --with-fortran-kernels=1 --with-64-bit-indices=0 --CC=mpicc --COPTFLAGS="-g -Ofast -std=c11 -qopenmmp" --CXX=mpicxx --CXXOPTFLAGS="-g -Ofast -std=c++14 -qopenmp" --FC=mpif90 --FOPTFLAGS="-g -Ofast -traceback -qopenmp" --with-mpi=1 --with-x=0 --download-parmetis=1 --download-metis=1 --download-superlu_dist=1 --download-superlu_dist-commit=f8ace664ec4ca10e96e258a764552cbda299ba6e --download-superlu_dist-cmake-arguments=-Denable_openmp:BOOL=TRUE --download-hwloc=1 --download-sowing=1 --with-openmp=1 --with-pthread=1 --download-elemental=1 --download-elemental-commit=6eb15a0da2a4998bf1cf971ae231b78e06d989d9 --download-elemental-cmake-arguments=-DEL_HYBRID:BOOL=TRUE --with-cxx-dialect=c++11 --with-debugging=0 --with-valgrind=0 --with-blaslapack-lib=" /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_intel_lp64.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_intel_thread.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_core.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.a -Wl,--end-group -liomp5 -lpthread -lm -ldl -lmpi_wrapper" --with-scalapack-lib=" /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_intel_lp64.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_intel_thread.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_core.a /home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64/libmkl_blacs_openmpi_lp64.a -Wl,--end-group -liomp5 -lpthread -lm -ldl -lmpi_wrapper" --with-mkl_pardiso-dir=/home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl --with-mkl_cpardiso-dir=/home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl --with-mkl_sparse-dir=/home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl --with-mkl_sparse_optimize-dir=/home/marius/intel/compilers_and_libraries_2019.5.281/linux/mkl --download-slepc=1 --download-slepc-commit=658271f --download-make=1 --download-cmake=1
> > > [18]PETSC ERROR: #1 MatView_MPIDense_Binary() line 682 in /home/marius/prog/petsc/git/petsc/src/mat/impls/dense/mpi/mpidense.c
> > > [18]PETSC ERROR: #2 MatView_MPIDense() line 786 in /home/marius/prog/petsc/git/petsc/src/mat/impls/dense/mpi/mpidense.c
> > > [18]PETSC ERROR: #3 MatView() line 1066 in /home/marius/prog/petsc/git/petsc/src/mat/interface/matrix.c
> > > [18]PETSC ERROR: #4 MatView_Elemental() line 83 in /home/marius/prog/petsc/git/petsc/src/mat/impls/elemental/matelem.cxx
> > > [18]PETSC ERROR: #5 MatView() line 1066 in /home/marius/prog/petsc/git/petsc/src/mat/interface/matrix.c
> > > [19]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > >
> > >
> > > and on StdOut
> > > Elemental matrix (explicit ordering)
> > >
> > > Any suggestions?
> > >
> > > Best,
> > > Marius
> >
>  


