[petsc-users] Vec I/O using different parallel layout

Mohamad M. Nasr-Azadani mmnasr at gmail.com
Mon Oct 10 19:47:05 CDT 2011


Awesome! That is fantastic. :-)

Thanks, Barry, for your prompt response.
Mohamad


On Mon, Oct 10, 2011 at 5:45 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> On Oct 10, 2011, at 7:41 PM, Mohamad M. Nasr-Azadani wrote:
>
> > Thanks Barry,
> >
> > I am still not 100% sure if I can do this.
> > Say I have saved the global vector obtained from a DA (3D) that is
> > shared amongst 16 processors.
> > Can I load that data into a vector obtained from a DA (3D, same size
> > obviously) that is shared on 1 processor?
>
>   ABSOLUTELY. Or a DA on 2 processes etc.
>
>   Barry
>
> >
> > Thanks,
> > Best,
> > Mohamad
> >
> >
> > On Mon, Oct 10, 2011 at 5:36 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > On Oct 10, 2011, at 7:31 PM, Mohamad M. Nasr-Azadani wrote:
> >
> > > Hi
> > > I was wondering whether it is possible to write a global vector
> > > (associated with a certain 3D distributed array) to file
> > > via:
> > >
> > >             ierr = PetscViewerBinaryOpen(PCW1, filename, FILE_MODE_WRITE, &writer);
> > >             ierr = VecView(vec_data, writer);
> > >             ierr = PetscViewerDestroy(writer);
> > >
> > > And then load the data into a global vector that is not created using
> > > the same parallel layout? A simple example of this would be to write the
> > > runtime data (parallel vector) to file and then load the saved vector to
> > > do some simple SERIAL post-processing.
> >
> >  YES.
> >
> >   The vector is saved to the file in the "natural ordering", that is,
> > starting with the logical 0,0,0 coordinate and then increasing through the
> > x axis, then the y axis, then the z axis. To load it back in, in parallel,
> > you need to pass VecLoad() a vector obtained with the appropriate
> > DMCreateGlobalVector().
> >
> >
> >  Barry
> >
> > >
> > > Thanks in advance,
> > > Best
> > > Mohamad
> > >
> > >
> > >
> > >
> >
> >
>
>
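A minimal sketch of the read-back side described above, for reference: create a
DMDA with the same global grid size as the one used when writing, get a global
vector from it with DMCreateGlobalVector(), and hand that vector to VecLoad().
The grid size (64x64x64), dof count, and file name "vec_data.bin" are
placeholders, and the sketch assumes a recent PETSc API; the 2011-era release
quoted above spells a few of these calls slightly differently (e.g. no
DMSetUp(), PetscViewerDestroy() taking the viewer rather than its address).

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM             da;
      Vec            u;
      PetscViewer    viewer;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

      /* Placeholder 64x64x64 grid with 1 dof: the global sizes must match the
         DMDA that produced the file, but the process counts (PETSC_DECIDE)
         are free to differ -- running on a single rank gives a serial vector
         for post-processing. */
      ierr = DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                          DM_BOUNDARY_NONE, DMDA_STENCIL_STAR, 64, 64, 64,
                          PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                          1, 1, NULL, NULL, NULL, &da);CHKERRQ(ierr);
      ierr = DMSetFromOptions(da);CHKERRQ(ierr);
      ierr = DMSetUp(da);CHKERRQ(ierr);

      /* The vector handed to VecLoad() must come from this DMDA so that the
         file's natural ordering is redistributed into the current layout. */
      ierr = DMCreateGlobalVector(da, &u);CHKERRQ(ierr);

      ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "vec_data.bin",
                                   FILE_MODE_READ, &viewer);CHKERRQ(ierr);
      ierr = VecLoad(u, viewer);CHKERRQ(ierr);
      ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

      /* ... post-processing on u ... */

      ierr = VecDestroy(&u);CHKERRQ(ierr);
      ierr = DMDestroy(&da);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }

The same file written by 16 processes can be loaded this way on 1, 2, or any
other number of processes, since VecLoad() maps the natural ordering onto
whatever layout the supplied DMDA vector has.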