[petsc-users] reading petsc binary files.

Sanjay Kharche Sanjay.Kharche at lhsc.on.ca
Wed Feb 6 09:49:54 CST 2019


Hi Matt, Barry

Thanks for replying so quickly.

Both clusters are 64-bit and neither runs MS Windows (I wouldn't know how to use a Windows machine), nor did my file transfer go through Windows. I just did some test runs with various combinations. It appears that this was a one-off where the bin files got corrupted during sftp/scp (I forget which I used; cluster 1 is always Fedora, cluster 2 Red Hat Enterprise). I have other queries, but I will make another posting for those tonight/tomorrow.

Thanks for pointing out that sftp/scp may sometimes (but not always) need to be in binary mode, unless I want a spurious 4-bit word attached to the transferred file.
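For what it's worth, scp and sftp always transfer raw bytes (the ASCII/binary mode distinction belongs to classic FTP), so a corrupted copy is more likely a truncated or interrupted transfer. A quick way to rule the transfer in or out is to compare checksums on both ends; a minimal sketch, using the my_3d*.bin names from the code below (the destination path is an assumption):

```shell
# On the cluster that wrote the files: record checksums.
sha256sum my_3d*.bin > my_3d.sha256

# Copy both the data and the checksum file, e.g.:
#   scp my_3d*.bin my_3d.sha256 user@cluster2:/scratch/run1/

# On the receiving cluster: verify every file arrived intact.
# Prints "<file>: OK" per file, or "FAILED" if any byte differs.
sha256sum -c my_3d.sha256
```

If `sha256sum -c` reports OK but the file still will not load, the problem is in how the file was written or is being read, not in the transfer.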

Sanjay

________________________________________
From: Matthew Knepley <knepley at gmail.com>
Sent: Wednesday, February 6, 2019 6:32:40 AM
To: Smith, Barry F.
Cc: Sanjay Kharche; petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] reading petsc binary files.

On Tue, Feb 5, 2019 at 6:36 PM Smith, Barry F. via petsc-users <petsc-users at mcs.anl.gov> wrote:
> On Feb 5, 2019, at 5:06 PM, Sanjay Kharche via petsc-users <petsc-users at mcs.anl.gov> wrote:
>
>
> Hi
>
> I use two MPI clusters (cluster 1 and cluster 2). Whereas the PETSc binary files I generate can be read on cluster 1, I get errors doing so on cluster 2. I also output vts files corresponding to each binary file, and both clusters appear to produce meaningful results. I use version 3.7.4 on cluster 1 and 3.7.5 on cluster 2. Also, the same simulation produces binary files of slightly different sizes on the two clusters.

   This is truly strange. PETSc has always used the exact same format for its binary files and automatically handles any byte-swapping that may be needed, so the on-disk representation is always consistent. A difference in file size is also strange. In over 20 years we've never had an issue with unreadable binary files.

  Can cluster 1 read the binary files from cluster 1, and cluster 2 the binary files from cluster 2? Is it just that cluster 2 cannot read the files from cluster 1?

  Could the file be changed somehow as it is copied between the clusters?

  Is this difference in size reproducible if you delete the files and create them again?

  Is one of the clusters by any chance a Microsoft Windows cluster?

Is one configured with 64-bit integers and the other not?
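Matt's 64-bit-integer question can be checked directly against the files. As I understand the format (treat the exact values as assumptions and check against your petscsys.h), a PETSc binary Vec starts with a big-endian header whose first integer is the Vec class id (1211214, VEC_FILE_CLASSID) followed by the vector length, and a build with --with-64-bit-indices writes wider header integers, which would also change the file size. Something like the following dumps the leading header words on a GNU system (od's --endian flag needs coreutils 8.23 or later):

```shell
# Dump the first 16 bytes of a PETSc binary file as big-endian
# 32-bit integers. For a Vec written by a 32-bit-int build, the
# first value should be the Vec class id (1211214) and the second
# the global vector length. If the class id does not appear, the
# file is corrupted or was written with a different integer size.
od -An -td4 --endian=big -N16 my_3d0.bin
```

Running this on the same file on both clusters would show immediately whether the headers differ.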

  Thanks,

     Matt

   Barry

> Can you comment on what I need to do to be able to read the binary files on cluster 2?
>
> thanks
> Sanjay
>
> Code snippet from parallel petsc code that does output:
>
> if (time_int % (int)(1.0/DELTAT) == 0) { // a smaller time step and more output files make the CV estimate better.
>   sprintf(str, "my_2d%d.vts", file_Counter); // this confirms that the simulation does something meaningful.
>   PetscViewer viewer;
>   PetscViewerCreate(PETSC_COMM_WORLD, &viewer);
>   PetscViewerSetType(viewer, PETSCVIEWERVTK);
>   PetscViewerFileSetName(viewer, str);
>   VecView(u, viewer);
>   PetscViewerDestroy(&viewer);
>
>   sprintf(str, "my_3d%d.bin", (int)file_Counter);
>   PetscViewer viewer2;
>   PetscViewerBinaryOpen(PETSC_COMM_WORLD, str, FILE_MODE_WRITE, &viewer2);
>   VecView(u, viewer2);
>   PetscViewerDestroy(&viewer2);
>   file_Counter++;
> }
>
>
> How I am trying to read it (typically serial code with binary called ecg):
>
>   // Inputs are PETSc binary files; create and destroy the viewer for each file for simplicity.
> PetscViewer viewer_in;
> sprintf(str, "my_3d%d.bin", file_Counter);
> PetscViewerBinaryOpen(PETSC_COMM_WORLD, str, FILE_MODE_READ, &viewer_in);
> VecLoad(u, viewer_in);
> PetscViewerDestroy(&viewer_in);
>
>
> Errors I got when I ran ecg:
>
> login3 endo]$ ./ecg
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: Not a vector next in file
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.7.5, Jan, 01, 2017
> [0]PETSC ERROR: ./ecg on a arch-linux2-c-opt named gra-login3 by kharches Tue Feb  5 17:43:36 2019
> [0]PETSC ERROR: Configure options --prefix=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/petsc/3.7.5 --with-mkl_pardiso=1 --with-mkl_pardiso-dir=/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl --with-hdf5=1 --with-hdf5-dir=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/hdf5-mpi/1.8.18 --download-hypre=1 --download-metis=1 --download-triangle=1 --download-ptscotch=1 --download-superlu_dist=1 --download-ml=1 --download-superlu=1 --download-prometheus=1 --download-mumps=1 --download-parmetis=1 --download-suitesparse=1 --download-mumps-shared=0 --download-ptscotch-shared=0 --download-superlu-shared=0 --download-superlu_dist-shared=0 --download-parmetis-shared=0 --download-metis-shared=0 --download-ml-shared=0 --download-suitesparse-shared=0 --download-hypre-shared=0 --download-prometheus-shared=0 --with-cc=mpicc --with-cxx=mpicxx --with-c++-support --with-fc=mpifort --CFLAGS="-O2 -xCore-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --CXXFLAGS="-O2 -xCore-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --FFLAGS="-O2 -xCore-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --with-gnu-compilers=0 --with-mpi=1 --with-build-step-np=8 --with-shared-libraries=1 --with-debugging=0 --with-pic=1 --with-x=0 --with-windows-graphics=0 --with-scalapack=1 --with-scalapack-include=/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/include --with-scalapack-lib="[/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/lib/intel64/libmkl_scalapack_lp64.a,libmkl_blacs_openmpi_lp64.a,libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]" --with-blas-lapack-lib="[/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/lib/intel64/libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]" --with-hdf5=1 --with-hdf5-dir=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/hdf5-mpi/1.8.18 --with-fftw=1 --with-fftw-dir=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/fftw-mpi/3.3.6
> [0]PETSC ERROR: #1 PetscViewerBinaryReadVecHeader_Private() line 28 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/utils/vecio.c
> [0]PETSC ERROR: #2 VecLoad_Binary() line 90 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/utils/vecio.c
> [0]PETSC ERROR: #3 VecLoad_Default() line 413 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/utils/vecio.c
> [0]PETSC ERROR: #4 VecLoad() line 975 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/interface/vector.c
> [0]PETSC ERROR: #5 VecLoad_Binary_DA() line 931 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/dm/impls/da/gr2.c
> [0]PETSC ERROR: #6 VecLoad_Default_DA() line 964 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/dm/impls/da/gr2.c
> [0]PETSC ERROR: #7 VecLoad() line 975 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/interface/vector.c
>
> This email is directed in confidence solely to the person named above and may contain confidential, privileged or personal health information. Please be aware that this email may also be released to members of the public under Ontario's Freedom of Information and Protection of Privacy Act if required. Review, distribution, or disclosure of this email by anyone other than the person(s) for whom it was originally intended is strictly prohibited. If you are not an intended recipient, please notify the sender immediately via a return email and destroy all copies of the original message. Thank you for your cooperation.



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


