[petsc-users] MPI-IO
Jed Brown
jed at jedbrown.org
Mon Jul 21 08:41:57 CDT 2014
Matthew Knepley <knepley at gmail.com> writes:
> On Jul 21, 2014 5:44 AM, "Stephen Wornom" <stephen.wornom at inria.fr> wrote:
>>
>> I have an unstructured mesh code used to compute vortex shedding problems,
>> saving the solutions every 500-1000 time steps. The mesh size is 3 to 20
>> MNodes. The minimum number of cores that I use is 128 for the 3 MNode mesh.
>> I would like to know if PETSc could be used to save the solutions
>> using MPI-IO?
>
> The normal VecView() for the binary viewer will use MPI-IO.

You need -viewer_binary_mpiio or PetscViewerBinarySetMPIIO().

PETSc devs, do you suppose MPI-IO support is stable enough that we could
make this a default? In any case, PetscViewerBinarySetMPIIO() should take
a PetscBool.
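
For reference, here is a minimal sketch (untested) of the write path being
discussed: a distributed Vec written with the PETSc binary viewer, with
MPI-IO enabled explicitly via PetscViewerBinarySetMPIIO(), which at this
point takes no PetscBool, per the note above. The vector size and file name
are placeholders.

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            u;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* Distributed solution vector; global size is a placeholder. */
  ierr = VecCreate(PETSC_COMM_WORLD, &u);CHKERRQ(ierr);
  ierr = VecSetSizes(u, PETSC_DECIDE, 3000000);CHKERRQ(ierr);
  ierr = VecSetFromOptions(u);CHKERRQ(ierr);
  ierr = VecSet(u, 1.0);CHKERRQ(ierr);

  /* Write in PETSc binary format; enable MPI-IO explicitly
     (equivalently, run with -viewer_binary_mpiio). */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "solution.dat",
                               FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = PetscViewerBinarySetMPIIO(viewer);CHKERRQ(ierr);
  ierr = VecView(u, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  ierr = VecDestroy(&u);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Launched in parallel (e.g. mpiexec -n 128 ./app), each rank writes its
local portion of the vector collectively through MPI-IO rather than
funneling everything through rank 0.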