[petsc-users] petscviewerbinaryread
Matthew Knepley
knepley at gmail.com
Mon Jun 17 09:13:29 CDT 2013
On Mon, Jun 17, 2013 at 2:47 PM, Frederik Treue <frtr at fysik.dtu.dk> wrote:
> OK, so I got a little further: now I can read 1D fields on any number of
> processors, and 2D fields on 1 processor :). My code:
>
What are you trying to do? Why not just use VecView() and VecLoad(), which
work in parallel and are scalable? I doubt you want to reproduce that code.
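
A rough sketch (untested, reusing the "./testvector" file name and the ReRead
vector from your code) of writing and reading back a DMDA vector in parallel:

  PetscViewer viewer;

  /* write: every process calls VecView(); PETSc handles the parallel layout */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"./testvector",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = VecView(ReRead,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* read it back into a vector created from the same DMDA */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"./testvector",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = VecLoad(ReRead,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

The same pattern works for 1D and 2D DMDAs, so no hand-written per-rank
chunking is needed.
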
Matt
> MPI_Comm_size(PETSC_COMM_WORLD,&cs);
> MPI_Comm_rank(MPI_COMM_WORLD,&rank);
>
> ierr = VecGetArray(dummy,&dump); CHKERRQ(ierr);
> ierr = VecGetArray(ReRead,&readarray); CHKERRQ(ierr);
>
> ierr=PetscViewerBinaryOpen(PETSC_COMM_WORLD,"./testvector",FILE_MODE_READ,&fileview);CHKERRQ(ierr);
>
> for (i=0;i<rank;i++) {
> printf("Rank BEFORE: %d\n",rank);
>
> ierr=PetscViewerBinaryRead(fileview,(void*)dump,nx*ny/cs,PETSC_SCALAR);CHKERRQ(ierr);
> }
>
>
> ierr=PetscViewerBinaryRead(fileview,(void*)readarray,nx*ny/cs,PETSC_SCALAR);CHKERRQ(ierr);
>
> for (i=rank+1;i<cs;i++) {
> printf("Rank: AFTER: %d\n",rank);
>
> ierr=PetscViewerBinaryRead(fileview,(void*)dump,nx*ny/cs,PETSC_SCALAR);CHKERRQ(ierr);
> }
>
> However, this fails for 2D with more than one processor: the resulting
> vector is garbled and I get memory corruption. Am I on the right track,
> or is there another way to do a parallel binary read? The code above
> seems somewhat cumbersome...
>
> /Frederik Treue
>
> On Mon, 2013-06-17 at 11:06 +0200, Matthew Knepley wrote:
> > On Mon, Jun 17, 2013 at 10:59 AM, Frederik Treue <frtr at fysik.dtu.dk>
> > wrote:
> > Hi guys,
> >
> > is PetscViewerBinaryRead() working? The examples given on the web page
> > either fail with out-of-memory errors (ex65 and ex65dm) or don't
> > compile (ex61) for me.
> >
> > Oddly, I only have problems when trying to use MPI. My code:
> >
> >
> ierr=DMDACreate1d(PETSC_COMM_WORLD,DMDA_BOUNDARY_GHOSTED,156,1,1,PETSC_NULL,&da);CHKERRQ(ierr);
> >
> > ierr=DMCreateGlobalVector(da,&ReRead);CHKERRQ(ierr);
> >
> > ierr=VecAssemblyBegin(ReRead);CHKERRQ(ierr);
> > ierr=VecAssemblyEnd(ReRead);CHKERRQ(ierr);
> >
> > double *workarray=(double*)malloc(156*sizeof(double));
> >
> > ierr = VecGetArray(ReRead,&workarray); CHKERRQ(ierr);
> >
> >
> > 1) This does not make sense. You allocate an array, but then overwrite
> > that pointer with the one from inside the vector ReRead (so the
> > malloc'ed array is leaked).
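> >
> > A minimal sketch of the usual pattern (no malloc; VecGetArray() hands
> > back a pointer to the vector's own storage):
> >
> >   PetscScalar *a;
> >   ierr = VecGetArray(ReRead,&a);CHKERRQ(ierr);     /* borrow the vector's storage */
> >   /* ... read into a[0] .. a[local length - 1] ... */
> >   ierr = VecRestoreArray(ReRead,&a);CHKERRQ(ierr); /* hand it back */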
> >
> >
> ierr=PetscViewerBinaryOpen(PETSC_COMM_WORLD,"./testvector",FILE_MODE_READ,&fileview);CHKERRQ(ierr);
> >
> >
> ierr=PetscViewerBinaryRead(fileview,&dummy,1,PETSC_SCALAR);CHKERRQ(ierr);
> >
> >
> ierr=PetscViewerBinaryRead(fileview,(void*)workarray,156,PETSC_SCALAR);CHKERRQ(ierr);
> > printf("TEST: %g\n",workarray[144]);
> > ierr=VecRestoreArray(ReRead,&workarray);
> >
> >
> > 2) In parallel, the local array in your vector ReRead will be smaller
> > than the global size of 156, so this read does not make sense either.
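> >
> > For illustration, a sketch of how the locally owned length could be
> > queried first:
> >
> >   PetscInt    nlocal;
> >   PetscScalar *a;
> >   ierr = VecGetLocalSize(ReRead,&nlocal);CHKERRQ(ierr); /* entries owned by this rank, not 156 */
> >   ierr = VecGetArray(ReRead,&a);CHKERRQ(ierr);          /* a has only nlocal entries here */
> >   /* ... use a[0] .. a[nlocal-1] ... */
> >   ierr = VecRestoreArray(ReRead,&a);CHKERRQ(ierr);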
> >
> >
> > Matt
> >
> > VecView(ReRead,PETSC_VIEWER_DRAW_WORLD);
> >
> > This works fine as long as I'm on a single processor. The file read
> > also works under MPI with 2 processors, as evidenced by the printf
> > statement (which gives the right result on both processors). However,
> > the VecView statement fails with "Vector not generated from a DMDA!".
> > Why?
> >
> > And a second question: how do I generalize to 2D? Can I give a 2D
> > array to PetscViewerBinaryRead and expect it to work, and how should
> > it be malloc'ed? Or does one give a 1D array to PetscViewerBinaryRead
> > and let VecRestoreArray do the rest? Or something else?
> >
> > /Frederik Treue
> >
>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener