[petsc-users] Problem with assembling 2 matrices

Matthew Knepley knepley at gmail.com
Wed Jun 11 07:43:35 CDT 2014


On Wed, Jun 11, 2014 at 6:47 AM, Jonas Mairhofer <mairhofer at itt.uni-stuttgart.de> wrote:

> Dear PETSc-Team,
>
> I am trying to set a global array X as column 0 of a dense matrix A and
> another global array Y as the first column of a second dense matrix B.
> Everything works perfectly fine in serial and parallel as long as I do
> this only for one of the matrices. However, when I try to do it for both
> matrices, the entries of matrix A seem to be overwritten by the values of
> matrix B.
>
> This is the output I get (first vectors X and Y, then matrices A and B):
>
>  Vector Object:Vec_0x84000000_0 1 MPI processes
>   type: mpi
> Process [0]
> 1
> 1
> 1
> Vector Object:Vec_0x84000000_1 1 MPI processes
>   type: mpi
> Process [0]
> 2
> 2
> 2
> Matrix Object: 1 MPI processes
>   type: seqdense
> 2.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
> 2.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
> 2.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
> Matrix Object: 1 MPI processes
>   type: seqdense
> 2.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
> 2.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
> 2.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
>
> I also attached the Fortran code:
>
>
>
>
> program main
> implicit none
>
> #include <finclude/petscsys.h>
> #include <finclude/petscvec.h>
> #include <finclude/petscdmda.h>
> #include <finclude/petscis.h>
> #include <finclude/petscmat.h>
> #include <finclude/petscksp.h>
> #include <finclude/petscpc.h>
> #include <finclude/petscsnes.h>
> #include <finclude/petscvec.h90>
> #include <finclude/petscdmda.h90>
>
>       PetscInt  i,j,m,n
>       PetscErrorCode ierr
>
>       PetscScalar one
>       Mat         A
>       Mat         B
>
>       PetscInt row(3)
>       DM    da
>       Vec X
>       Vec Y
>       PetscScalar,pointer :: X_pointer(:)
>       PetscScalar,pointer :: Y_pointer(:)
>       PetscInt xs,xm
>
> !Initialize
>       call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
>
>       m = 3 !col
>       n = 3 !row
>
>
> call DMDACreate1d(PETSC_COMM_WORLD,   & !MPI communicator
>      & DMDA_BOUNDARY_NONE,            & !boundary type at the boundary of the physical domain
>      & n,                             & !global dimension of array (if negative, the value can be changed by the user via the command line)
>      & 1,                             & !number of degrees of freedom per grid point (number of unknowns at each grid point)
>      & 0,                             & !number of ghost points accessible to local vectors
>      & PETSC_NULL_INTEGER,            & !could be an array to specify the number of grid points per processor
>      & da,                            & !the resulting distributed array object
>      & ierr)
>
> call DMDAGetCorners(da,              & !the distributed array
>      &   xs,                         & !corner index in x direction
>      &   PETSC_NULL_INTEGER,         & !corner index in y direction
>      &   PETSC_NULL_INTEGER,         & !corner index in z direction
>      &   xm,                         & !width of locally owned part in x direction
>      &   PETSC_NULL_INTEGER,         & !width of locally owned part in y direction
>      &   PETSC_NULL_INTEGER,         & !width of locally owned part in z direction
>      &   ierr)                         !error check
>
>
>
>       one  = 1.0
>
>
> !set indices of matrix rows to be set
>       Do i = xs,xs+xm-1
>          row(i) = i
>       END DO
>
>
> !Set up vector X and Matrix A
>       call DMCreateGlobalVector(da,X,ierr)
>       call VecSet(X,one,ierr)      !set all entries of global array to 1
>       call VecGetArrayF90(X,X_pointer,ierr)
>
>       call MatCreateDense(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE, &
>      &     n,m,PETSC_NULL_CHARACTER,A,ierr)
>

The bug is here (and in the other create call). You should not be using
PETSC_NULL_CHARACTER, but PETSC_NULL_SCALAR.

This is a good reason for reconsidering the use of Fortran. It just cannot
warn you about damaging errors like this.
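
For reference, a minimal sketch of a corrected create call (it assumes you want PETSc to allocate the dense storage itself, which is what passing PETSC_NULL_SCALAR requests); the create call for B needs the same change:

      call MatCreateDense(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,  &
     &     n,m,                                                        &
     &     PETSC_NULL_SCALAR,                                          & !let PETSc allocate the matrix storage
     &     A,ierr)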

  Thanks,

     Matt


>       call MatSetValues(A,xm,row(xs:xs+xm-1),1,0,X_pointer(1:xm), &
>      &     INSERT_VALUES,ierr)
>       call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)
>       call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)
>
>
> !Set up vector Y and matrix B
>       call DMCreateGlobalVector(da,Y,ierr)
>       call VecSet(Y,2.d0*one,ierr)      !set all entries of global array to 2
>       call VecGetArrayF90(Y,Y_pointer,ierr)
>
>       call MatCreateDense(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE, &
>      &     n,m,PETSC_NULL_CHARACTER,B,ierr)
>       call MatSetValues(B,xm,row(xs:xs+xm-1),1,0,Y_pointer(1:xm), &
>      &     INSERT_VALUES,ierr)
>       call MatAssemblyBegin(B,MAT_FINAL_ASSEMBLY,ierr)
>       call MatAssemblyEnd(B,MAT_FINAL_ASSEMBLY,ierr)
>
>
> !print out matrices and vectors to check entries
>       call VecView(X,PETSC_VIEWER_STDOUT_WORLD,ierr)
>       call VecView(Y,PETSC_VIEWER_STDOUT_WORLD,ierr)
>       call MatView(A,PETSC_VIEWER_STDOUT_WORLD,ierr)
>       call MatView(B,PETSC_VIEWER_STDOUT_WORLD,ierr)
>
> !free objects
>       call VecRestoreArrayF90(X,X_pointer,ierr)
>       call VecDestroy(X,ierr)
>       call MatDestroy(A,ierr)
>       call VecRestoreArrayF90(Y,Y_pointer,ierr)
>       call VecDestroy(Y,ierr)
>       call MatDestroy(B,ierr)
>
>
>       call PetscFinalize(ierr)
> end
>
>
>
> Am I doing something wrong in my code?
> Thank you for your help!
> Jonas
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener