[petsc-users] Output question for ex29.

Matthew Knepley knepley at gmail.com
Fri Jun 10 15:09:09 CDT 2011


On Fri, Jun 10, 2011 at 3:05 PM, <zhenglun.wei at gmail.com> wrote:

> Dear Matt,
> Following your suggestions, I modified the VecView_VTK in
> /src/ksp/ksp/example/tutorial/ex29.c to
>
> PetscFunctionBegin;
> PetscViewerASCIIOpen(comm, filename, &viewer);
> PetscViewerSetFormat(viewer, PETSC_VIEWER_ASCII_MATLAB);
>

I told you to use ASCII_VTK, not ASCII_MATLAB.
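[Editorial note: for reference, a minimal sketch of that sequence with the VTK
format is below. It is an illustration, not the exact ex29 code; the DM handle
da, comm, and filename are assumed to come from the surrounding function. Note
also that DMView() expects the DM (the DMDA) itself, not the Vec, and that
VecView() should write to the file viewer rather than to stdout.]

    PetscViewer viewer;

    PetscViewerASCIIOpen(comm, filename, &viewer);
    PetscViewerSetFormat(viewer, PETSC_VIEWER_ASCII_VTK);
    DMView(da, viewer);      /* the DMDA describing the grid, not the Vec */
    VecView(x, viewer);      /* write the Vec to the same file viewer     */
    PetscViewerDestroy(&viewer);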


> DMView(x, viewer);
> VecView(x, PETSC_VIEWER_STDOUT_WORLD);
> PetscViewerDestroy(&viewer);
> PetscFunctionReturn(0);
>
> However, it always gives me some errors. Could you please briefly check
> whether there is any obvious coding problem here?
>

How can I possibly know what went wrong if you do not send the ENTIRE error
message?
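[Editorial note: one way to make sure a complete PETSc traceback is produced,
so it can be pasted into a mail, is to check every return code. A small sketch
of the usual idiom, under the same assumptions as above:]

    PetscErrorCode ierr;

    ierr = PetscViewerASCIIOpen(comm, filename, &viewer);CHKERRQ(ierr);
    ierr = PetscViewerSetFormat(viewer, PETSC_VIEWER_ASCII_VTK);CHKERRQ(ierr);
    /* ... likewise for DMView(), VecView(), PetscViewerDestroy() ... */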

   Matt


> thanks,
> Alan
>
>
> On , Matthew Knepley <knepley at gmail.com> wrote:
> > On Thu, Jun 9, 2011 at 6:18 PM, Matthew Knepley <knepley at gmail.com> wrote:
> >
> > On Thu, Jun 9, 2011 at 6:01 PM, <zhenglun.wei at gmail.com> wrote:
> >
> >
> > Dear Sir/Madam,
> > I'm still studying ex29 in /src/ksp/ksp/example/tutorial. Earlier I ran into
> > a problem with VecView_VTK in parallel computation, and I'm trying to modify
> > it in order to output some data from the computation.
> >
> >
> >
> >
> > Here is a better answer. If you want output, throw away this old function,
> > which is broken, and use
> >
> >
> >   PetscViewerASCIIOpen()
> >   PetscViewerSetFormat(PETSC_VIEWER_ASCII_VTK)
> >
> >   DMView()
> >   VecView()
> >   PetscViewerDestroy()
> >
> >
> >     Thanks,
> >
> >
> >        Matt
> >
> >
> > 1) My first question: what does the following section of VecView_VTK do?
> >
> > MPI_Comm_rank(comm, &rank);
> > MPI_Comm_size(comm, &size);
> > MPI_Reduce(&n, &maxn, 1, MPIU_INT, MPI_MAX, 0, comm);
> > tag = ((PetscObject) viewer)->tag;
> > if (!rank) {
> >   PetscMalloc((maxn+1) * sizeof(PetscScalar), &values);
> >   for (i = 0; i < n; i++) {
> >     PetscViewerASCIIPrintf(viewer, "%G\n", PetscRealPart(array[i]));
> >   }
> >   for (p = 1; p < size; p++) {
> >     MPI_Recv(values, (PetscMPIInt) n, MPIU_SCALAR, p, tag, comm, &status);
> >     MPI_Get_count(&status, MPIU_SCALAR, &nn);
> >     for (i = 0; i < nn; i++) {
> >       PetscViewerASCIIPrintf(viewer, "%G\n", PetscRealPart(array[i]));
> >     }
> >   }
> >   PetscFree(values);
> > } else {
> >   MPI_Send(array, n, MPIU_SCALAR, 0, tag, comm);
> > }
> >
> > What I understand is: it gathers all the data from the different processes
> > in the parallel computation and outputs it to the 'viewer'. I commented out
> > everything in VecView_VTK except this part, and so far no error message has
> > come up in my parallel computation.
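[Editorial note: a cleaned-up sketch of that root-gather pattern follows; it is
an illustration only, not the ex29 source. One thing worth noting in the code
as quoted above: the inner loop prints array[i] even though process p's data
was just received into values, and the receive count uses the root's own n. A
corrected version would receive up to maxn entries and print from values, which
may be part of why the function is described as broken.]

    if (!rank) {                                   /* rank 0 prints everything   */
      PetscMalloc((maxn+1) * sizeof(PetscScalar), &values);
      for (i = 0; i < n; i++) {                    /* rank 0's own entries first */
        PetscViewerASCIIPrintf(viewer, "%G\n", PetscRealPart(array[i]));
      }
      for (p = 1; p < size; p++) {                 /* then each other rank       */
        MPI_Recv(values, (PetscMPIInt) maxn, MPIU_SCALAR, p, tag, comm, &status);
        MPI_Get_count(&status, MPIU_SCALAR, &nn);  /* how many actually arrived  */
        for (i = 0; i < nn; i++) {
          PetscViewerASCIIPrintf(viewer, "%G\n", PetscRealPart(values[i]));
        }
      }
      PetscFree(values);
    } else {
      MPI_Send(array, n, MPIU_SCALAR, 0, tag, comm);  /* others send to rank 0   */
    }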
> >
> >
> >
> > 2) However, I really don't know how it splits the domain for parallel
> > computation. For example, if I use 4 processes, is the domain split like
> > one of the following?
> >
> >
> > The DMDA describes the domain splitting.
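[Editorial note: to see which split the DMDA actually chose, a small sketch like
the one below can print each rank's owned patch. DMDAGetCorners() is the current
spelling; older releases call it DAGetCorners(), and the synchronized-print
argument lists also differ slightly between PETSc versions. Running with
-da_view (or -dm_view in newer versions) prints the decomposition as well.]

    PetscInt    xs, ys, xm, ym;
    PetscMPIInt rank;

    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
    /* corner (xs,ys) and extent (xm,ym) of this rank's piece of the grid */
    DMDAGetCorners(da, &xs, &ys, PETSC_NULL, &xm, &ym, PETSC_NULL);
    PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] owns i = %D..%D, j = %D..%D\n",
                            rank, xs, xs + xm - 1, ys, ys + ym - 1);
    PetscSynchronizedFlush(PETSC_COMM_WORLD);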
> >
> >
> >
> >
> >    Matt
> >
> > a)
> >          |
> >     0    |    1
> >          |
> > ---------------|---------------
> >          |
> >     2    |    3
> >          |
> >
> > b)
> >          |
> >     0    |    2
> >          |
> > ---------------|---------------
> >          |
> >     1    |    3
> >          |
> >
> > c)
> >      |     |     |
> >   0  |  1  |  2  |  3
> >      |     |     |
> >
> > d)
> >    0
> > ------------------------
> >    1
> > ------------------------
> >    2
> > ------------------------
> >    3
> >
> > thanks in advance,
> > Alan
> >
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> > -- Norbert Wiener
> >
>



-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

