[petsc-users] Binary I/O

Mohamad M. Nasr-Azadani mmnasr at gmail.com
Wed Oct 12 19:40:02 CDT 2011


Hi again,

On a similar topic, how hard would it be to write a function, similar to
PETSc's VecView() with the binary viewer, that does exactly the same
thing, i.e. writes a parallel vector into one single file, but where all
the processors perform the write simultaneously?
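
For context, a minimal sketch of the pattern I mean here: writing a parallel
Vec to one file through the standard binary viewer opened with
PetscViewerBinaryOpen(). The vector size and the file name "x.dat" are just
placeholders, and error checking is omitted:

  #include <petscvec.h>
  #include <petscviewer.h>

  int main(int argc, char **argv)
  {
    Vec         x;
    PetscViewer viewer;

    PetscInitialize(&argc, &argv, NULL, NULL);

    /* A parallel vector distributed across PETSC_COMM_WORLD */
    VecCreate(PETSC_COMM_WORLD, &x);
    VecSetSizes(x, PETSC_DECIDE, 1000000);
    VecSetFromOptions(x);
    VecSet(x, 1.0);

    /* The binary viewer opens the file on rank 0 only; VecView() funnels
       the locally owned pieces of x to rank 0, which writes them to disk. */
    PetscViewerBinaryOpen(PETSC_COMM_WORLD, "x.dat", FILE_MODE_WRITE, &viewer);
    VecView(x, viewer);
    PetscViewerDestroy(&viewer);

    VecDestroy(&x);
    PetscFinalize();
    return 0;
  }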

Best,
Mohamad


On Wed, Oct 12, 2011 at 4:17 PM, Mohamad M. Nasr-Azadani
<mmnasr at gmail.com> wrote:

> Thanks Barry. That makes perfect sense.
>
> Best,
> Mohamad
>
>
> On Wed, Oct 12, 2011 at 3:50 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>> On Oct 12, 2011, at 5:42 PM, Mohamad M. Nasr-Azadani wrote:
>>
>> > Hi everyone,
>> >
>> > I think I know the answer to my question, but I was double checking.
>> > When using
>> > PetscViewerBinaryOpen();
>> >
>> > It is mentioned that
>> > "For writing files it only opens the file on processor 0 in the
>> communicator."
>> >
>> > Does that mean that when writing a parallel vector to file using VecView(),
>> > all the data from the other processors is first sent to processor zero and
>> > then dumped into the file?
>>
>>    No, the data is not all sent to process zero before writing. That is,
>> process 0 does not need enough memory to store all the data before writing.
>>
>>    Instead, the processes take turns sending data to process 0, which
>> immediately writes it out to disk.
>>
>> > If so, wouldn't that be a very slow process for big datasets and a large
>> > number of processors?
>>
>>    For fewer than a few thousand processes this is completely fine, and
>> nothing else would be much faster.
>>
>> > Any suggestions to speed that process up?
>>
>>    We have the various MPI-IO options, which use MPI-IO to have several
>> processes writing to disk at the same time; that is useful for very large
>> numbers of processes.
>>
>>   Barry
>>
>> >
>> > Best,
>> > Mohamad
>> >
>>
>>
>
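
For completeness, a hedged sketch of the MPI-IO route mentioned above. It
assumes a PETSc build configured with MPI-IO support and a release that
provides PetscViewerBinarySetUseMPIIO(); older releases expose the same
switch under a slightly different call and the -viewer_binary_mpiio runtime
option, so the manual pages for the installed version are the authority here.
The file name and the Vec x are placeholders.

  PetscViewer viewer;

  /* Create the binary viewer explicitly so the MPI-IO switch is set
     before the output file is attached. */
  PetscViewerCreate(PETSC_COMM_WORLD, &viewer);
  PetscViewerSetType(viewer, PETSCVIEWERBINARY);
  PetscViewerBinarySetUseMPIIO(viewer, PETSC_TRUE); /* concurrent writes via MPI-IO */
  PetscViewerFileSetMode(viewer, FILE_MODE_WRITE);
  PetscViewerFileSetName(viewer, "x_mpiio.dat");

  VecView(x, viewer);               /* x is the parallel Vec being written */
  PetscViewerDestroy(&viewer);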