[petsc-users] MPI-IO

Barry Smith bsmith at mcs.anl.gov
Mon Jul 21 10:49:29 CDT 2014


On Jul 21, 2014, at 8:41 AM, Jed Brown <jed at jedbrown.org> wrote:

> Matthew Knepley <knepley at gmail.com> writes:
> 
>> On Jul 21, 2014 5:44 AM, "Stephen Wornom" <stephen.wornom at inria.fr> wrote:
>>> 
>>> I have an unstructured mesh code used to compute vortex shedding problems,
>> saving the solutions every 500-1000 time steps. The mesh size is 3 to 20
>> MNodes. The minimum number of cores that I use is 128 for the 3MNode mesh.
>>> I would like to know if PETSc could be used to save the solutions
>> using MPI-IO?
>> 
>> The normal VecView() for the binary viewer will use MPI/IO.
> 
> You need -viewer_binary_mpiio or PetscViewerBinarySetMPIIO().

    We should change this to use a PetscViewerFormat instead of special-casing it.
> 
> PETSc devs, do you suppose MPI-IO support is stable enough that we could
> make this a default?

     It is not so much a question of stability; it is more that (I believe) MPI-IO is simply much slower for what typical PETSc users do, and only pays off at large numbers of nodes.

   Barry

>  In any case, PetscViewerBinarySetMPIIO should take
> a PetscBool.
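
For anyone following along, a minimal sketch of what the thread describes: write a Vec with the binary viewer and turn on MPI-IO, either via PetscViewerBinarySetMPIIO() or the -viewer_binary_mpiio option. The filename "solution.dat" and the vector size are placeholders, not anything from the original question.

```c
#include <petscvec.h>

int main(int argc,char **argv)
{
  Vec            x;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;

  /* Stand-in for a real solution vector; size is arbitrary for illustration */
  ierr = VecCreate(PETSC_COMM_WORLD,&x);CHKERRQ(ierr);
  ierr = VecSetSizes(x,PETSC_DECIDE,1000000);CHKERRQ(ierr);
  ierr = VecSetFromOptions(x);CHKERRQ(ierr);
  ierr = VecSet(x,1.0);CHKERRQ(ierr);

  /* Save with the binary viewer; MPI-IO is enabled explicitly here,
     or equivalently on the command line with -viewer_binary_mpiio */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"solution.dat",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = PetscViewerBinarySetMPIIO(viewer);CHKERRQ(ierr);
  ierr = VecView(x,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```

The same file can later be read back with VecLoad() on a different number of processes, which is the usual reason to prefer a single binary file over per-rank output.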


