[petsc-users] Scaling of the Petsc Binary Viewer
Thibault Bridel-Bertomeu
thibault.bridelbertomeu at gmail.com
Wed Jul 7 14:49:39 CDT 2021
Hi Dave,
Thank you for your fast answer.
To postprocess the files in python, I use the PetscBinaryIO package that is
provided with PETSc, yes.
I load the file like this:

    import numpy as np
    import meshio
    import PetscBinaryIO as pio
    import matplotlib as mpl
    import matplotlib.pyplot as plt
    import matplotlib.cm as cm
    mpl.use('Agg')

    restartname = "restart_00001001.bin"
    print("Reading {} ...".format(restartname))
    io = pio.PetscBinaryIO()
    fh = open(restartname)
    objecttype = io.readObjectType(fh)
    data = None
    if objecttype == 'Vec':
        data = io.readVec(fh)
    print("Size of data = ", data.size)
    print("Size of a single variable (4 variables) = ", data.size / 4)
    assert(np.isclose(data.size / 4.0, np.floor(data.size / 4.0)))
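For reference, this is how I then split the flat array into the four variables. It is only a sketch: it assumes the components are interleaved cell by cell (block size 4) and that density comes first, which may well be wrong.

    ncells = data.size // 4
    fields = data.reshape(ncells, 4)   # assumed layout: 4 components interleaved per cell
    rho = fields[:, 0]                 # assumed: density is the first component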
Then I load the mesh (it's from Gmsh, so I use the meshio package):

    meshname = "ForwardFacing.msh"
    print("Reading {} ...".format(meshname))
    mesh = meshio.read(meshname)
    print("Number of vertices = ", mesh.points.shape[0])
    print("Number of cells = ", mesh.cells_dict['quad'].shape[0])
From the 'data' and the 'mesh' I use tricontourf from matplotlib to plot
the figure.
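Roughly, the plotting step looks like the sketch below (continuing the script above). It assumes the quad cell centroids are suitable sample points for tricontourf and reuses the 'rho' array from the split sketch:

    quads = mesh.cells_dict['quad']              # (ncells, 4) vertex indices
    centroids = mesh.points[quads].mean(axis=1)  # (ncells, 3) cell centers
    fig, ax = plt.subplots()
    tcf = ax.tricontourf(centroids[:, 0], centroids[:, 1], rho,
                         levels=64, cmap=cm.viridis)
    fig.colorbar(tcf)
    fig.savefig("restart_rho.png", dpi=200)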
I removed the call to PetscViewerBinarySetUseMPIIO and yes, it gives the same
kind of data (I attached a figure of the data obtained with the binary viewer
without MPI I/O).
Maybe it's just a connectivity issue? Maybe the way the Vec is written by
the PETSc viewer somehow does not match the connectivity from the original
Gmsh file but some other connectivity of the partitioned DMPlex? If so, is
there a way to get the latter? I know the binary viewer does not work on
DMPlex, the VTK viewer yields a corrupted dataset, and I have issues with the
HDF5 viewer with MPI (see another recent thread of mine) ...
Thanks again for your help!!
Thibault
On Wed, Jul 7, 2021 at 20:54, Dave May <dave.mayhem23 at gmail.com> wrote:
>
>
> On Wed 7. Jul 2021 at 20:41, Thibault Bridel-Bertomeu <
> thibault.bridelbertomeu at gmail.com> wrote:
>
>> Dear all,
>>
>> I have been having issues with large Vec (based on DMPlex) and massive
>> MPI I/O ... it looks like the data that is written by the Petsc Binary
>> Viewer is gibberish for large meshes split on a high number of processes.
>> For instance, I am using a mesh that has around 50 million cells, split on
>> 1024 processors.
>> The computation seems to run fine, and the timestep computed from the data
>> makes sense, so I think internally everything is fine. But when I look at
>> the solution (one example attached) it's noise - at this point it should
>> show a bow shock developing on the left near the step.
>> The piece of code I use for the output is below:
>>
>>     call DMGetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)
>>     call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr); CHKERRA(ierr)
>>     write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"
>>     call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr); CHKERRA(ierr)
>>     call PetscViewerSetType(binViewer, PETSCVIEWERBINARY, ierr); CHKERRA(ierr)
>>     call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE, ierr); CHKERRA(ierr)
>>     call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE, ierr); CHKERRA(ierr)
>>
>>
>
> Do you get the correct output if you don't call the function above (or
> equivalently use PETSC_FALSE)?
>
>
>>     call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)
>>     call VecView(X, binViewer, ierr); CHKERRA(ierr)
>>     call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)
>>     call DMSetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)
>>
>> I do not think there is anything wrong with it, but of course I would be
>> happy to hear your feedback.
>> Nonetheless, my question was: how far have you tested the binary MPI I/O
>> of a Vec? Does it make sense that, for a 50 million cell mesh split on
>> 1024 processes, it could somehow fail?
>> Or is it my Python drawing method that is completely incapable of
>> handling this dataset? (ParaView displays the same thing though, so I'm
>> not sure ...)
>>
>
> Are you using the Python tools provided with PETSc to load the Vec from
> file?
>
>
> Thanks,
> Dave
>
>
>
>> Thank you very much for your advice and help!!!
>>
>> Thibault
>>
>
[Attachment: restart_00001001.png (image/png, 20605 bytes) - <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20210707/1b334739/attachment-0001.png>]