[petsc-users] questions regarding simple petsc matrix vector operation

Smith, Barry F. bsmith at mcs.anl.gov
Thu Apr 25 00:34:55 CDT 2019


  I ran your problem fine on one process; no errors.

   I suggest you run under valgrind on your system: https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind

   If that is not useful, run in the debugger with the option -start_in_debugger noxterm
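
For concreteness, a valgrind run along the lines of that FAQ entry typically looks something like the following (the executable name ./yourprog is a placeholder; adjust the mpiexec launcher and flags to your installation):

```shell
# Run one MPI rank under valgrind; disable PETSc's own allocator with
# -malloc off so valgrind sees the raw allocations.  %p in the log-file
# name produces one log per process.
mpiexec -n 1 valgrind -q --tool=memcheck --leak-check=yes --num-callers=20 \
        --log-file=valgrind.log.%p ./yourprog -malloc off
```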

  Good luck,

Barry



> On Apr 24, 2019, at 10:36 PM, Zhang, Junchao via petsc-users <petsc-users at mcs.anl.gov> wrote:
> 
> How many MPI ranks do you use? The following line is suspicious.  I guess you do not want a vector of global length 1. 
> 66   VecSetSizes(b,PETSC_DECIDE,1);
> 
> --Junchao Zhang
> 
> 
> On Wed, Apr 24, 2019 at 4:14 PM Karl Lin via petsc-users <petsc-users at mcs.anl.gov> wrote:
> Hi, there
> 
> I have been trying to get a simple program run with the following code:
> 
>  12 int main(int argc,char **args)
>  13 {
>  14   PetscErrorCode ierr;
>  15   Mat            A;
>  16   Mat           AT;
>  17   Mat            N;
>  18   char           name[1024];
>  19   char           vname[1024];
>  20   char           pass[1024];
>  21   PetscBool      flg;
>  22   Vec            b,x,u,Ab,Au;
>  23   PetscViewer    viewer;                        /* viewer */
>  24   PetscMPIInt    rank,size;
>  25
>  26   KSP            QRsolver;
>  27   PC             pc;
>  28   PetscInt       its;
>  29   PetscReal      norm;
>  30
>  31   PetscInt       n1, n2, n3, np1, np2, np3, p, jj;
>  32
>  33   PetscInt       *cols, *dnz, *onz;
>  34   PetscScalar    *vals;
>  35
>  36   ierr = PetscInitialize(&argc,&args,0,help);if (ierr) return ierr;
>  37
>  38   ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
>  39   ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
>  40
>  41   PetscMalloc1(1, &dnz);
>  42   PetscMalloc1(1, &onz);
>  43
>  44   dnz[0]=2;
>  45   onz[0]=1;
>  46
>  47   MatCreateMPIAIJMKL(PETSC_COMM_WORLD, 1, 2, 1, 2, 2, dnz, 2, onz, &A); CHKERRQ(ierr);
>  48
>  49   PetscMalloc1(2, &cols);
>  50   PetscMalloc1(2, &vals);
>  51
>  52   jj = rank;
>  53   cols[0]=0; cols[1]=1;
>  54   vals[0]=1.0;vals[1]=1.0;
>  55
>  56   MatSetValues(A, 1, &jj, 2, cols, vals, INSERT_VALUES);
>  57
>  58   MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>  59   MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>  60
>  61   VecCreate(PETSC_COMM_WORLD,&x);
>  62   VecSetSizes(x,PETSC_DECIDE,2);
>  63   VecSetFromOptions(x);
>  64
>  65   VecCreate(PETSC_COMM_WORLD,&b);
>  66   VecSetSizes(b,PETSC_DECIDE,1);
>  67   VecSetFromOptions(b);
>  68
>  69   VecCreate(PETSC_COMM_WORLD,&u);
>  70   VecSetSizes(u,PETSC_DECIDE,1);
>  71   VecSetFromOptions(u);
>  72
>  73   VecSet(b, 2.0);
>  74   VecSet(u, 0.0);
>  75   VecSet(x, 0.0);
>  76
>  77   MatMult(A, x, u);
>  78
>  79   VecView(x, PETSC_VIEWER_STDOUT_WORLD);
>  80   VecView(b, PETSC_VIEWER_STDOUT_WORLD);
>  81   VecView(u, PETSC_VIEWER_STDOUT_WORLD);
>  82
>  83   VecAXPY(u,-1.0,b);
> 
> However, it always crashes at line 83, even with a single process, saying:
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.4, Feb, 26, 2019
> 
> I can't figure out why this would happen. The printout from VecView shows that every Vec value is correct. I would greatly appreciate any tips.
> 
> Regards,
> Karl
> 
