[petsc-users] mat_view_info and the number of matrix objects

Barry Smith bsmith at mcs.anl.gov
Wed Dec 7 22:49:41 CST 2011


On Dec 7, 2011, at 10:41 PM, Xiangdong Liang wrote:

> Hello everyone,
> 
> I am using mat_view_info to see the matrix information in ksp/ksp/ex2.c:
> http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex2.c.html
> 
> When I use single processor, I see two matrix-objects. One is the
> operator A, but what's the other one? It seems that the other one
> (always in seqaij or seqsbaij format) is from KSPSolve. What's the
> purpose of that matrix object?
> 
> If I try two processors, I see five matrix-objects (more confused).
> Can you help me understand where these "five" matrices come from?
> Thanks.
> 
> Best,
> Xiangdong

  The -mat_view_info option is only of limited usefulness. Whenever a matrix assembly completes, or a factorization finishes, and the -mat_view_info argument was passed, PETSc prints information about that matrix.

   The reason for the "two" is that the second one comes from the PC of ILU, which generated a factored matrix; info is printed for that factored matrix as well.
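
   If you want to look at that factored matrix directly, here is a minimal sketch (assuming ksp has already been set up, say after KSPSolve(), and that the PC is the factorization itself, e.g. ILU on one process; the variable names are just for illustration):

      /* Fragment: pull the factored matrix out of a factorization-based
         PC (ILU/ICC/LU/Cholesky) and print its info. Assumes ierr and
         ksp already exist in the surrounding code. */
      PC  pc;
      Mat fact;
      ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
      ierr = PCFactorGetMatrix(pc,&fact);CHKERRQ(ierr);
      ierr = MatView(fact,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);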

    In parallel it also prints information about the sequential matrices that are stored inside the parallel matrix, so you get the original parallel matrix plus two sequential matrices for each process: the process's local diagonal block of the MPIAIJ matrix and the factored matrix from its subdomain solve. With two processes that accounts for all five.
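
    For reference, a sketch of how one could get at the sequential pieces an MPIAIJ matrix stores on each process, via MatMPIAIJGetSeqAIJ() (variable names are illustrative):

      /* Fragment: Ad is the local diagonal block, Ao holds the
         columns coupling to other processes; colmap maps Ao's
         columns back to global column numbers. */
      Mat Ad,Ao;
      const PetscInt *colmap;
      ierr = MatMPIAIJGetSeqAIJ(A,&Ad,&Ao,&colmap);CHKERRQ(ierr);
      ierr = MatView(Ad,PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);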

   If you want information about a particular matrix in your code you are better off calling MatGetInfo() on it than trying to use the command line option.
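
   For example, a minimal sketch (the MatInfo fields are PetscLogDouble, i.e. double):

      /* Fragment: query statistics for an assembled matrix A.
         MAT_LOCAL gives this process's numbers; MAT_GLOBAL_SUM
         sums across all processes. */
      MatInfo info;
      ierr = MatGetInfo(A,MAT_GLOBAL_SUM,&info);CHKERRQ(ierr);
      ierr = PetscPrintf(PETSC_COMM_WORLD,"nonzeros used %g, allocated %g, mallocs %g\n",
                         (double)info.nz_used,(double)info.nz_allocated,(double)info.mallocs);CHKERRQ(ierr);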

  Barry

> 
> 
> 
> 
> Here is the script:
> 
> -----------------------
> ./ex2 -mat_view_info
> 
> Matrix Object: 1 MPI processes
>  type: seqaij
>  rows=56, cols=56
>  total: nonzeros=250, allocated nonzeros=280
>  total number of mallocs used during MatSetValues calls =0
>    not using I-node routines
> Matrix Object: 1 MPI processes
>  type: seqsbaij
>  rows=56, cols=56
>  package used to perform factorization: petsc
>  total: nonzeros=153, allocated nonzeros=153
>  total number of mallocs used during MatSetValues calls =0
>      block size is 1
> Norm of error 0.000156044 iterations 6
> 
> --------------------------------
> 
> mpirun -np 2 ./ex2 -mat_view_info
> 
> Matrix Object: 2 MPI processes
>  type: mpiaij
>  rows=56, cols=56
>  total: nonzeros=250, allocated nonzeros=560
>  total number of mallocs used during MatSetValues calls =0
>    not using I-node (on process 0) routines
> Matrix Object: 1 MPI processes
>  type: seqaij
>  rows=28, cols=28
>  total: nonzeros=118, allocated nonzeros=118
>  total number of mallocs used during MatSetValues calls =0
>    not using I-node routines
> Matrix Object: 1 MPI processes
>  type: seqsbaij
>  rows=28, cols=28
>  package used to perform factorization: petsc
>  total: nonzeros=73, allocated nonzeros=73
>  total number of mallocs used during MatSetValues calls =0
>      block size is 1
> Matrix Object: 1 MPI processes
>  type: seqaij
>  rows=28, cols=28
>  total: nonzeros=118, allocated nonzeros=118
>  total number of mallocs used during MatSetValues calls =0
>    not using I-node routines
> Matrix Object: 1 MPI processes
>  type: seqsbaij
>  rows=28, cols=28
>  package used to perform factorization: petsc
>  total: nonzeros=73, allocated nonzeros=73
>  total number of mallocs used during MatSetValues calls =0
>      block size is 1
> Norm of error 0.000411674 iterations 7


