[petsc-users] About MatView
Longyin Cui
cuilongyin at gmail.com
Sat Jun 20 20:27:39 CDT 2015
Thank you so much for your heartwarming and timely reply. I just need to
ask you a few more questions:
1. Correct me if I'm wrong; I am fairly new to this field. So I can only
print the matrix or vectors in their entirety, and to learn what is stored
on each processor I can only call MatGetLocalSize() or
MatGetOwnershipRange() to get a general idea (are there more functions like
these? What is PetscViewerASCIISynchronizedPrintf() used for?). The
communicator PETSC_COMM_SELF is only useful when there is a single process.
(See the sketch below, after question 3.)
2. Our project is based on another group's project, and they are
physicists... So I am trying to understand when each processor communicates
what with the others. The question is not that complicated: they first
create the matrix and the vectors, set some operators and values, assemble
them, and then solve Ax = b using FGMRES. From that point on I just want to
know how the processors divide the matrix A, because when I looked into
KSPSolve() there did not seem to be any communication, right? (Maybe I
wasn't paying enough attention.) Could you give me some hints on how to see
what they communicate? I didn't find much documentation about this.
3. Related to question 2: how is the matrix generated in general? For
example, do all the processors generate one matrix together, or does each
of them generate a whole matrix separately by itself? Does every processor
hold a full copy, or is there only one copy on rank 0?
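
To make questions 1 and 3 concrete, here is a small test program I sketched
from the manual pages (it may well be wrong, which is partly why I am
asking; the global size N = 64 and the diagonal entries are just made up).
My current mental model is that all processes share one global matrix, each
process stores and fills in only the rows it owns, and each rank can print
its ownership range like this:

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscErrorCode ierr;
  PetscMPIInt    rank;
  PetscInt       i, rstart, rend, mlocal, nlocal;
  PetscInt       N = 64;       /* made-up global size, just for the sketch */
  PetscScalar    v = 1.0;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

  /* One global N x N matrix shared by every process; PETSc decides the row split */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);

  /* Each process inserts values only into the rows it owns (here just the diagonal) */
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    ierr = MatSetValues(A, 1, &i, 1, &i, &v, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Print, in rank order, which rows this process owns */
  ierr = MatGetLocalSize(A, &mlocal, &nlocal);CHKERRQ(ierr);
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,
           "[%d] owns rows %D to %D (local size %D x %D)\n",
           rank, rstart, rend - 1, mlocal, nlocal);CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Is that the right picture, i.e. there is never a full copy of A on rank 0?
And for question 2, I am guessing that running with -log_summary would at
least show me the MPI message and reduction counts from inside KSPSolve();
is that right?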
Thank you, you are the best!
Longyin Cui (or you know me as Eric);
Student from C.S. division;
Cell: 7407047169;
return 0;
On Sat, Jun 20, 2015 at 1:38 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> Eric,
>
>
> > On Jun 20, 2015, at 1:42 AM, Longyin Cui <cuilongyin at gmail.com> wrote:
> >
> > OMG you are real!!!
> > OK, all of my error messages look like this:
> > PETSc Error ... exiting
> >
> --------------------------------------------------------------------------
> > mpirun has exited due to process rank 13 with PID 1816 on
> > node cnode174.local exiting improperly. There are two reasons this could
> occur:
> >
> > 1. this process did not call "init" before exiting, but others in
> > the job did. This can cause a job to hang indefinitely while it waits
> > for all processes to call "init". By rule, if one process calls "init",
> > then ALL processes must call "init" prior to termination.
> >
> > 2. this process called "init", but exited without calling "finalize".
> > By rule, all processes that call "init" MUST call "finalize" prior to
> > exiting or it will be considered an "abnormal termination"
> >
> > This may have caused other processes in the application to be
> > terminated by signals sent by mpirun (as reported here).
>
> This crash doesn't seem to have anything to do in particular with the code
> below. Do the PETSc examples run in parallel? Does the code that you ran
> have a PetscInitialize() in it? What about running on two processors; does
> that work?
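>
> (Just as a reference point, and not your code: the standard skeleton is
> that every rank calls PetscInitialize() before creating any PETSc objects
> and PetscFinalize() before exiting, for example
>
>   #include <petsc.h>
>   int main(int argc, char **argv)
>   {
>     PetscErrorCode ierr;
>     ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
>     /* ... create, assemble, and solve here ... */
>     ierr = PetscFinalize();
>     return ierr;
>   }
>
> If a rank exits without reaching PetscFinalize(), which calls
> MPI_Finalize(), mpirun reports it the way shown above.)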
>
> >
> > You are right, I did use PETSC_COMM_SELF, and when I used
> PETSC_COMM_WORLD alone I could get the entire matrix printed. But this is
> one whole matrix in one file. The reason I used PetscViewerASCIIOpen(
> PETSC_COMM_SELF, "mat.output", &viewer); and MatView(matrix, viewer); was
> because the documentation says "Each processor can instead write its own
> independent output by specifying the communicator PETSC_COMM_SELF".
>
> Yikes, this is completely untrue and has been for decades. We have no
> way of saving the matrix in its parts; you cannot use a PETSC_COMM_SELF
> viewer with a parallel matrix. Sorry about the wrong information in the
> documentation; I have fixed it.
>
> Why can't you just save the matrix in one file and then compare it? We
> don't provide a way to save objects one part per process because we think
> it is a bad model for parallel computing since the result depends on the
> number of processors you are using.
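>
> For example, the same two calls you already have, just with the world
> communicator, write the entire parallel matrix into a single file (a
> minimal sketch reusing the names from your own snippet, assuming the
> matrix lives on PETSC_COMM_WORLD):
>
>   PetscViewer viewer;
>   PetscViewerASCIIOpen(PETSC_COMM_WORLD, "mat.output", &viewer);
>   MatView(matrix, viewer);
>   PetscViewerDestroy(&viewer);
>
> Every process participates in the call, but the result is one file
> containing the whole matrix, which you can then compare directly.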
>
> Barry
>
> >
> > Also, I tried the following as well, which failed with the same error message:
> > PetscMPIInt my_rank;
> > MPI_Comm_rank(MPI_COMM_WORLD, &my_rank);
> > string str = KILLME(my_rank); // KILLME is an int-to-string function...
> > const char * c = str.c_str();
> > PetscViewer viewer;
> > PetscViewerASCIIOpen(PETSC_COMM_WORLD, c , &viewer);
> > MatView(impOP,viewer); //impOP is the huge matrix.
> > PetscViewerDestroy(&viewer);
> >
> > I was trying to generate 16 files recording the part of the matrix held by
> each processor so I could compare them with the big matrix... so, what do you
> think?
> >
> > Thank you very much.
> >
> > Longyin Cui (or you know me as Eric);
> > Student from C.S. division;
> > Cell: 7407047169;
> > return 0;
> >
> >
> > On Sat, Jun 20, 2015 at 1:34 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > You need to cut and paste and send the entire error message: "not
> working" makes it very difficult for us to know what has gone wrong.
> > Based on the code fragment you sent I guess one of your problems is that
> the viewer communicator is not the same as the matrix communicator. Since
> the matrix is on 16 processors (I am guessing PETSC_COMM_WORLD) the viewer
> communicator must also be the same (also PETSC_COMM_WORLD).
> > The simplest code you can use is
> >
> > > PetscViewerASCIIOpen(PETSC_COMM_WORLD,"stdout",&viewer);
> > > MatView(impOP,viewer);
> >
> > but you can get a similar effect with the command line option
> -mat_view and not write any code at all (the less code you have to write
> the better).
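> >
> > For example (just a sketch of the invocation; "yourapp" is a placeholder
> > for your executable):
> >
> >   mpirun -np 16 ./yourapp -mat_view
> >
> > prints the assembled matrix without any viewer code in the program.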
> >
> > Barry
> >
> >
> > > On Jun 19, 2015, at 10:42 PM, Longyin Cui <cuilongyin at gmail.com>
> wrote:
> > >
> > > Hi dear whoever reads this:
> > >
> > > I have a quick question:
> > > After matrix assembly, suppose I have a matrix A. Assuming I used 16
> processors, if I want each processor to print out its local contents of
> A, how do I proceed? (I simply want to know how the matrix is stored, from
> generation to communication to solving, so I would like to display it at
> every stage to get a better understanding.)
> > >
> > > I read the examples, and I tried things like the code below and many
> other variations from the examples, but it still is not working.
> > > PetscViewer viewer;
> > > PetscMPIInt my_rank;
> > > MPI_Comm_rank(PETSC_COMM_WORLD,&my_rank);
> > > PetscPrintf(MPI_COMM_SELF,"[%d] rank\n",my_rank);
> > > PetscViewerASCIIOpen(MPI_COMM_SELF,NULL,&viewer);
> > > PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_INFO);
> > > MatView(impOP,viewer);
> > >
> > > Please give me some hints.
> > >
> > > Thank you so very much!
> > >
> > >
> > > Longyin Cui (or you know me as Eric);
> > > Student from C.S. division;
> > > Cell: 7407047169;
> > > return 0;
> > >
> >
> >
>
>