<div dir="ltr"><div dir="ltr">On Tue, Feb 5, 2019 at 6:36 PM Smith, Barry F. via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">> On Feb 5, 2019, at 5:06 PM, Sanjay Kharche via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:<br>
> <br>
> <br>
> Hi<br>
> <br>
> I use two MPI clusters (cluster 1 and cluster 2). The PETSc binary files I generate can be read on cluster 1, but I get errors reading them on cluster 2. I also output a vts file corresponding to each binary file, and both clusters appear to produce meaningful results. I use PETSc version 3.7.4 on cluster 1 and 3.7.5 on cluster 2. Also, the same simulation produces binary files of slightly different sizes on the two clusters.<br>
<br>
   This is truly strange. PETSc has always used the exact same format for its binary files and automatically handles any byte-swapping that may be needed to keep a consistent representation on disk. A difference in file size is also strange. In over 20 years we've never had an issue with unreadable binary files.<br>
<br>
  Can cluster 1 read the binary files from cluster 1, and can cluster 2 read the binary files from cluster 2? Is it just that cluster 2 cannot read the files from cluster 1? <br>
<br>
  Could the file be changed somehow as it is copied between the clusters?<br>
<br>
  Is this difference in size reproducible if you delete the files and create them again?<br>
<br>
  Is one of the clusters by any chance a Microsoft Windows cluster?<br></blockquote><div><br></div><div>Is one configured with 64-bit integers and the other not?</div><div><br></div><div>  Thanks,</div><div><br></div><div>     Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
   Barry<br>
<br>
> Can you comment on what I need to do to be able to read the binary files on cluster 2?<br>
> <br>
> thanks<br>
> Sanjay<br>
> <br>
> Code snippet from parallel petsc code that does output:<br>
> <br>
> if (time_int % (int)(1.0/DELTAT) == 0) { // a smaller time step and more files output makes the CV estimate better.<br>
>   // VTK output; this confirms that the simulation does something meaningful.<br>
>   sprintf(str, "my_2d%d.vts", file_Counter);<br>
>   PetscViewer viewer;<br>
>   PetscViewerCreate(PETSC_COMM_WORLD, &viewer);<br>
>   PetscViewerSetType(viewer, PETSCVIEWERVTK);<br>
>   PetscViewerFileSetName(viewer, str);<br>
>   VecView(u, viewer);<br>
>   PetscViewerDestroy(&viewer);<br>
>   // Binary output.<br>
>   sprintf(str, "my_3d%d.bin", (int)file_Counter);<br>
>   PetscViewer viewer2;<br>
>   PetscViewerBinaryOpen(PETSC_COMM_WORLD, str, FILE_MODE_WRITE, &viewer2);<br>
>   VecView(u, viewer2);<br>
>   PetscViewerDestroy(&viewer2);<br>
>   file_Counter++;<br>
> }<br>
> <br>
> <br>
> How I am trying to read it (a serial code whose binary is called ecg):<br>
> <br>
>   // Inputs are PETSc binary files; create and destroy the viewer for each file for simplicity.<br>
> PetscViewer viewer_in;<br>
> sprintf(str, "my_3d%d.bin", file_Counter);<br>
> PetscViewerBinaryOpen(PETSC_COMM_WORLD, str, FILE_MODE_READ, &viewer_in);<br>
> VecLoad(u, viewer_in);<br>
> PetscViewerDestroy(&viewer_in);<br>
> <br>
> <br>
> Errors I got when I ran ecg:<br>
> <br>
> login3 endo]$ ./ecg<br>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
> [0]PETSC ERROR: Invalid argument<br>
> [0]PETSC ERROR: Not a vector next in file<br>
> [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>
> [0]PETSC ERROR: Petsc Release Version 3.7.5, Jan, 01, 2017<br>
> [0]PETSC ERROR: ./ecg on a arch-linux2-c-opt named gra-login3 by kharches Tue Feb  5 17:43:36 2019<br>
> [0]PETSC ERROR: Configure options --prefix=/cvmfs/<a href="http://soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/petsc/3.7.5" rel="noreferrer" target="_blank">soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/petsc/3.7.5</a> --with-mkl_pardiso=1 --with-mkl_pardiso-dir=/cvmfs/<a href="http://soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl" rel="noreferrer" target="_blank">soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl</a> --with-hdf5=1 --with-hdf5-dir=/cvmfs/<a href="http://soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/hdf5-mpi/1.8.18" rel="noreferrer" target="_blank">soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/hdf5-mpi/1.8.18</a> --download-hypre=1 --download-metis=1 --download-triangle=1 --download-ptscotch=1 --download-superlu_dist=1 --download-ml=1 --download-superlu=1 --download-prometheus=1 --download-mumps=1 --download-parmetis=1 --download-suitesparse=1 --download-mumps-shared=0 --download-ptscotch-shared=0 --download-superlu-shared=0 --download-superlu_dist-shared=0 --download-parmetis-shared=0 --download-metis-shared=0 --download-ml-shared=0 --download-suitesparse-shared=0 --download-hypre-shared=0 --download-prometheus-shared=0 --with-cc=mpicc --with-cxx=mpicxx --with-c++-support --with-fc=mpifort --CFLAGS="-O2 -xCore-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --CXXFLAGS="-O2 -xCore-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --FFLAGS="-O2 -xCore-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --with-gnu-compilers=0 --with-mpi=1 --with-build-step-np=8 --with-shared-libraries=1 --with-debugging=0 --with-pic=1 --with-x=0 --with-windows-graphics=0 --with-scalapack=1 --with-scalapack-include=/cvmfs/<a href="http://soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/include" rel="noreferrer" 
target="_blank">soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/include</a> --with-scalapack-lib="[/cvmfs/<a href="http://soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/lib/intel64/libmkl_scalapack_lp64.a,libmkl_blacs_openmpi_lp64.a,libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a" rel="noreferrer" target="_blank">soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/lib/intel64/libmkl_scalapack_lp64.a,libmkl_blacs_openmpi_lp64.a,libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a</a>]" --with-blas-lapack-lib="[/cvmfs/<a href="http://soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/lib/intel64/libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a" rel="noreferrer" target="_blank">soft.computecanada.ca/easybuild/software/2017/Core/imkl/11.3.4.258/mkl/lib/intel64/libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a</a>]" --with-hdf5=1 --with-hdf5-dir=/cvmfs/<a href="http://soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/hdf5-mpi/1.8.18" rel="noreferrer" target="_blank">soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/hdf5-mpi/1.8.18</a> --with-fftw=1 --with-fftw-dir=/cvmfs/<a href="http://soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/fftw-mpi/3.3.6" rel="noreferrer" target="_blank">soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/fftw-mpi/3.3.6</a><br>
> [0]PETSC ERROR: #1 PetscViewerBinaryReadVecHeader_Private() line 28 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/utils/vecio.c<br>
> [0]PETSC ERROR: #2 VecLoad_Binary() line 90 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/utils/vecio.c<br>
> [0]PETSC ERROR: #3 VecLoad_Default() line 413 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/utils/vecio.c<br>
> [0]PETSC ERROR: #4 VecLoad() line 975 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/interface/vector.c<br>
> [0]PETSC ERROR: #5 VecLoad_Binary_DA() line 931 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/dm/impls/da/gr2.c<br>
> [0]PETSC ERROR: #6 VecLoad_Default_DA() line 964 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/dm/impls/da/gr2.c<br>
> [0]PETSC ERROR: #7 VecLoad() line 975 in /dev/shm/ebuser/PETSc/3.7.5/iomkl-2016.4.11/petsc-3.7.5/src/vec/vec/interface/vector.c<br>
> <br>
<br>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
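If one build uses 64-bit PetscInts and the other does not, the file header itself will show it. Below is a minimal diagnostic sketch, assuming PETSc's big-endian binary layout: a PetscInt classid (1211214 for a Vec, per petscvec.h) followed by the vector length, each the width of PetscInt in the build that wrote the file. The filename my_3d0.bin in the usage comment is illustrative.

```python
import struct

VEC_FILE_CLASSID = 1211214  # assumed on-disk Vec marker (petscvec.h)

def inspect_vec_header(raw):
    """Guess the integer width of a PETSc binary Vec header.

    Tries to read the first bytes as big-endian 32-bit integers, then
    as big-endian 64-bit integers, and reports whichever interpretation
    yields the Vec classid. Returns None if neither matches.
    """
    if len(raw) >= 8:
        classid, n = struct.unpack(">ii", raw[:8])
        if classid == VEC_FILE_CLASSID:
            return {"int_bits": 32, "length": n}
    if len(raw) >= 16:
        classid, n = struct.unpack(">qq", raw[:16])
        if classid == VEC_FILE_CLASSID:
            return {"int_bits": 64, "length": n}
    return None  # not a Vec header under either integer width

# Usage (illustrative filename):
# with open("my_3d0.bin", "rb") as f:
#     print(inspect_vec_header(f.read(16)))
```

If the same file reports different widths on the two clusters, or one returns None, the builds disagree on PetscInt size, which would also explain the slightly different file sizes.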