[petsc-users] DMGetMatrix segfault

Klaij, Christiaan  <C.Klaij at marin.nl>
Tue Jan 17 01:44:10 CST 2012


I'm learning to use DMs. Below is a first try, but it gives a segfault. Any ideas?

#include <petscdmda.h>

int main(int argc, char **argv)
{

  DM               da0, da1;
  DMDABoundaryType bx = DMDA_BOUNDARY_PERIODIC, by = DMDA_BOUNDARY_PERIODIC;
  DMDAStencilType  stype = DMDA_STENCIL_STAR;
  PetscInt         Mx = 8, My = 8;

  PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);

  // create distributed array for Mx-by-My grid
  DMDACreate2d(PETSC_COMM_WORLD,bx,by,stype,Mx,My,PETSC_DECIDE,PETSC_DECIDE,1,1,PETSC_NULL,PETSC_NULL,&da0);
  DMDACreate2d(PETSC_COMM_WORLD,bx,by,stype,Mx,My,PETSC_DECIDE,PETSC_DECIDE,1,1,PETSC_NULL,PETSC_NULL,&da1);

  // mat nest from pack
  DM  pack;
  Mat A;
  DMCompositeCreate(PETSC_COMM_WORLD,&pack);
  DMCompositeAddDM(pack,da0);
  DMCompositeAddDM(pack,da1);
  DMGetMatrix(pack,MATNEST,&A);
  MatView(A,PETSC_VIEWER_DEFAULT);

  PetscFinalize();

  return 0;

}
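
For comparison, here is a minimal sketch of the same construction with PETSc error checking (CHKERRQ) and cleanup added, assuming PETSc 3.2 calling conventions. It is not a fix for the crash; it only makes a failure inside DMGetMatrix() surface as a PETSc traceback instead of a bare signal, and it passes MatView() an actual viewer (PETSC_VIEWER_DEFAULT is a PetscViewerFormat, not a PetscViewer).

// Sketch, assuming PETSc 3.2: same setup as above, with error checking and cleanup
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM               da0, da1, pack;
  Mat              A;
  DMDABoundaryType bx = DMDA_BOUNDARY_PERIODIC, by = DMDA_BOUNDARY_PERIODIC;
  DMDAStencilType  stype = DMDA_STENCIL_STAR;
  PetscInt         Mx = 8, My = 8;
  PetscErrorCode   ierr;

  ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);

  // two distributed arrays on the same Mx-by-My periodic grid
  ierr = DMDACreate2d(PETSC_COMM_WORLD,bx,by,stype,Mx,My,PETSC_DECIDE,PETSC_DECIDE,1,1,PETSC_NULL,PETSC_NULL,&da0);CHKERRQ(ierr);
  ierr = DMDACreate2d(PETSC_COMM_WORLD,bx,by,stype,Mx,My,PETSC_DECIDE,PETSC_DECIDE,1,1,PETSC_NULL,PETSC_NULL,&da1);CHKERRQ(ierr);

  // pack both DMDAs into a composite and ask it for a MATNEST matrix
  ierr = DMCompositeCreate(PETSC_COMM_WORLD,&pack);CHKERRQ(ierr);
  ierr = DMCompositeAddDM(pack,da0);CHKERRQ(ierr);
  ierr = DMCompositeAddDM(pack,da1);CHKERRQ(ierr);
  ierr = DMGetMatrix(pack,MATNEST,&A);CHKERRQ(ierr);

  // view on stdout; PETSC_VIEWER_STDOUT_WORLD is a PetscViewer
  ierr = MatView(A,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  // clean up
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = DMDestroy(&pack);CHKERRQ(ierr);
  ierr = DMDestroy(&da1);CHKERRQ(ierr);
  ierr = DMDestroy(&da0);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

As posted, the original program produces the output below.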

[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] DMCompositeGetISLocalToGlobalMappings line 798 src/dm/impls/composite/pack.c
[0]PETSC ERROR: [0] DMCreateLocalToGlobalMapping_Composite line 1490 src/dm/impls/composite/pack.c
[0]PETSC ERROR: [0] DMGetLocalToGlobalMapping line 355 src/dm/interface/dm.c
[0]PETSC ERROR: [0] DMGetMatrix_Composite line 222 src/dm/impls/composite/packm.c
[0]PETSC ERROR: [0] DMGetMatrix line 569 src/dm/interface/dm.c
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.2.0, Patch 5, Sat Oct 29 13:45:54 CDT 2011
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./dmda-try3 on a linux_64b named lin0133 by cklaij Tue Jan 17 08:34:36 2012
[0]PETSC ERROR: Libraries linked from /opt/refresco/64bit_intelv11.1_openmpi/petsc-3.2-p5/lib
[0]PETSC ERROR: Configure run at Mon Jan 16 14:03:34 2012
[0]PETSC ERROR: Configure options --prefix=/opt/refresco/64bit_intelv11.1_openmpi/petsc-3.2-p5 --with-mpi-dir=/opt/refresco/64bit_intelv11.1_openmpi/openmpi-1.4.4 --with-x=0 --with-mpe=0 --with-debugging=1 --with-hypre-include=/opt/refresco/64bit_intelv11.1_openmpi/hypre-2.7.0b/include --with-hypre-lib=/opt/refresco/64bit_intelv11.1_openmpi/hypre-2.7.0b/lib/libHYPRE.a --with-ml-include=/opt/refresco/64bit_intelv11.1_openmpi/ml-6.2/include --with-ml-lib=/opt/refresco/64bit_intelv11.1_openmpi/ml-6.2/lib/libml.a --with-blas-lapack-dir=/opt/intel/mkl
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec has exited due to process rank 0 with PID 1445 on
node lin0133 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpiexec (as reported here).
--------------------------------------------------------------------------
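
The options suggested in the error message above are passed on the command line when running the program; for a single process, something like

  mpiexec -n 1 ./dmda-try3 -start_in_debugger

attaches a debugger to the faulting process, and

  mpiexec -n 1 valgrind ./dmda-try3

(per the FAQ entry linked in the message) checks for memory corruption. These are example invocations only.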



dr. ir. Christiaan Klaij
CFD Researcher
Research & Development
E C.Klaij at marin.nl
T +31 317 49 33 44

MARIN
2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands
T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl


