Comparing a PETSc matrix with a standard Fortran array in compressed row format
Barry Smith
bsmith at mcs.anl.gov
Fri Aug 3 22:31:42 CDT 2007
Unless the matrix values are huge, this is a big difference.
All you can do is print the difference matrix with ASCII MatView and
look for large entries. This will tell you the locations of the
trouble entries.
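
For example, a minimal sketch, assuming the difference matrix is the
A_mat_test built in your subroutine below (the file name is arbitrary):

      PetscViewer viewer
      ! write the difference matrix in ASCII form to a file for inspection
      call PetscViewerASCIIOpen(PETSC_COMM_SELF,'diff.out',viewer,ierr)
      call MatView(A_mat_test,viewer,ierr)
      call PetscViewerDestroy(viewer,ierr)
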
Barry
On Sat, 4 Aug 2007, Ben Tay wrote:
> Hi,
>
> I realised that the matrix storage format of NSPCG is actually the ITPACK
> storage format. Anyway, I inserted the values into a PETSc matrix, used
> MatAXPY with the scalar a = -1, and then used MatNorm on the output matrix.
>
> Using NORM_1, NORM_FROBENIUS and NORM_INFINITY, I got ~543, 3194 and 556
> respectively. Does this mean that the matrices are different? How "much"
> different are they?
>
> So what is the next step? Is there any way to find out the locations of the
> values that differ?
>
> This is my comparison subroutine:
>
> subroutine matrix_compare
>
> !compare big_A (ITPACK storage) against the assembled PETSc matrix A_mat
>
> integer :: k,kk,II,JJ,ierr
>
> Vec xx_test,b_rhs_test
> Mat A_mat_test
> PetscScalar aa
> PetscReal nrm
>
> aa=-1.
>
> ! preallocate 10 nonzeros per row, matching the ITPACK column count
> call MatCreateSeqAIJ(PETSC_COMM_SELF,total_k,total_k,10, &
>      PETSC_NULL_INTEGER,A_mat_test,ierr)
>
> call VecCreateSeq(PETSC_COMM_SELF,total_k,b_rhs_test,ierr)
> ! xx_test is kept for later solves; it is not used in the comparison
> call VecDuplicate(b_rhs_test,xx_test,ierr)
>
> ! shift the 1-based Fortran indices to PETSc's 0-based indexing
> do k=1,total_k
>    II=k-1
>    do kk=1,10
>       JJ=int_A(k,kk)-1
>       call MatSetValues(A_mat_test,1,II,1,JJ,big_A(k,kk), &
>            INSERT_VALUES,ierr)
>    end do
>    ! the rhs entry depends only on k, so set it once per row
>    call VecSetValue(b_rhs_test,II,q_p(k),INSERT_VALUES,ierr)
> end do
>
> call MatAssemblyBegin(A_mat_test,MAT_FINAL_ASSEMBLY,ierr)
> call MatAssemblyEnd(A_mat_test,MAT_FINAL_ASSEMBLY,ierr)
>
> ! assembly must come after the VecSetValue calls, not before them
> call VecAssemblyBegin(b_rhs_test,ierr)
> call VecAssemblyEnd(b_rhs_test,ierr)
>
> ! A_mat_test = A_mat_test - A_mat; DIFFERENT_NONZERO_PATTERN is the
> ! safe choice since the two nonzero patterns may not match exactly
> call MatAXPY(A_mat_test,aa,A_mat,DIFFERENT_NONZERO_PATTERN,ierr)
>
> call MatNorm(A_mat_test,NORM_INFINITY,nrm,ierr)
>
> print *, nrm
>
> end subroutine matrix_compare
>
> Thanks!
>
>
>
> Barry Smith wrote:
> > You can use MatCreateSeqAIJWithArrays() to "convert" the NSPCG matrix into
> > PETSc format, then MatAXPY() to difference them, and MatNorm() to see
> > how large the result is.
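> >
> > A rough sketch, assuming the NSPCG data is available as standard CSR
> > arrays ia, ja, and va (hypothetical names, 0-based indices):
> >
> >       Mat B
> >       PetscScalar mone
> >       PetscReal nrm
> >       mone = -1.0
> >       ! wrap the existing CSR arrays in a PETSc matrix without copying
> >       call MatCreateSeqAIJWithArrays(PETSC_COMM_SELF,n,n,ia,ja,va,B,ierr)
> >       ! B = B - A_mat, then look at the largest entry of the difference
> >       call MatAXPY(B,mone,A_mat,DIFFERENT_NONZERO_PATTERN,ierr)
> >       call MatNorm(B,NORM_INFINITY,nrm,ierr)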
> > Make sure the PETSc or hypre solver is always converging. Run with
> > -ksp_converged_reason and/or -ksp_monitor. My guess is that the matrix is
> > becoming very ill-conditioned, so the solvers with the default options are
> > failing.
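> >
> > For example (assuming the executable is called ./solver):
> >
> >       ./solver -ksp_converged_reason -ksp_monitor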
> >
> > Barry
> >
> >
> > On Fri, 3 Aug 2007, Ben Tay wrote:
> >
> >
> > > Hi,
> > >
> > > I used 2 packages to solve my Poisson eqn, which is part of my unsteady
> > > NS solver. One is the NSPCG solver package, which uses the compressed row
> > > format to store the A matrix. The other is PETSc. I found that both
> > > solvers gave me similar answers for a number of time steps. After that,
> > > however, the answer suddenly changes drastically in the PETSc case. This
> > > does not happen with the NSPCG solver.
> > >
> > > For example, in an oscillating airfoil case, over time steps 1-315 the
> > > pressure changes smoothly and both solvers give similar answers.
> > >
> > > At time step 316, the pressure changes from -3.22 to -3.2 for NSPCG, but
> > > from -3.21 to -60.2 for PETSc.
> > >
> > > This happens when I use HYPRE's AMG or PETSc's direct solver LU.
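> > >
> > > (With PETSc these correspond to options along the lines of
> > > -pc_type hypre -pc_hypre_type boomeramg or -pc_type lu.)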
> > >
> > > I have been trying to find the cause, but debugging hasn't given me the
> > > answer. I would like to compare the matrix values of the 2 solvers and
> > > see if there's any difference. However, NSPCG's matrix is in compressed
> > > row format, while PETSc's is an opaque object that can't be viewed
> > > easily. Moreover, it's a big matrix, so checking by inspection isn't
> > > possible. I'm thinking of subtracting one matrix from the other and
> > > checking whether the result is zero. What's the best way to do this?
> > > Btw, I'm using Fortran and there's no MPI.
> > >
> > > Thank you.