[petsc-dev] petsc-dev: ASM with sbaij matrix and block size=1 broken?
Brad Aagaard
baagaard at usgs.gov
Fri Nov 4 19:10:06 CDT 2011
Jed,
It looks like your changeset 20985 may have introduced a bug. I just
pulled and rebuilt petsc-dev. I am trying to run a PyLith simulation
that worked in parallel with ASM, an sbaij matrix, and bs=1 a week ago
but now fails: I get either a request for a huge memory allocation or a
memory corruption error. Using an aij matrix or running in serial works
fine, and the simulation also runs fine with 2 processes but not with 3
or 4. The stack traces are:
NPROC=4 (memory corruption)
#0 0x00007f014a93bb00 in MatGetSubMatrices_MPIBAIJ_local (C=0x29560a0,
    ismax=1, isrow=0x2a06520, iscol=0x29ff6f0, scall=MAT_INITIAL_MATRIX,
    allrows=0x2a05db0, allcolumns=0x2a00ea0, submats=0x2a0ab00)
    at /tools/common/petsc-dev/src/mat/impls/baij/mpi/baijov.c:933
#1 0x00007f014aacd73d in MatIncreaseOverlap_MPISBAIJ (C=0x29560a0,
    is_max=1, is=0x29fb110, ov=1)
    at /tools/common/petsc-dev/src/mat/impls/sbaij/mpi/sbaijov.c:85
#2 0x00007f014ac476db in MatIncreaseOverlap (mat=0x29560a0, n=1,
    is=0x29fb110, ov=1)
    at /tools/common/petsc-dev/src/mat/interface/matrix.c:6669
#3 0x00007f014b016df6 in PCSetUp_ASM (pc=0x299d2f0)
    at /tools/common/petsc-dev/src/ksp/pc/impls/asm/asm.c:199
NPROC=3 (request for huge amount of memory allocation)
#0 0x00007f67d30bca75 in MatGetSubMatrices_MPIBAIJ_local (C=0x2cc8430,
    ismax=1, isrow=0x2eb4b70, iscol=0x2eadaf0, scall=MAT_INITIAL_MATRIX,
    allrows=0x2eb4400, allcolumns=0x2eaf2a0, submats=0x2eb9150)
    at /tools/common/petsc-dev/src/mat/impls/baij/mpi/baijov.c:929
#1 0x00007f67d324e73d in MatIncreaseOverlap_MPISBAIJ (C=0x2cc8430,
    is_max=1, is=0x2ea9510, ov=1)
    at /tools/common/petsc-dev/src/mat/impls/sbaij/mpi/sbaijov.c:85
#2 0x00007f67d33c86db in MatIncreaseOverlap (mat=0x2cc8430, n=1,
    is=0x2ea9510, ov=1)
    at /tools/common/petsc-dev/src/mat/interface/matrix.c:6669
#3 0x00007f67d3797df6 in PCSetUp_ASM (pc=0x2d0cdf0)
    at /tools/common/petsc-dev/src/ksp/pc/impls/asm/asm.c:199
#4 0x00007f67d36c43a5 in PCSetUp (pc=0x2d0cdf0)
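
In case a standalone reproducer helps, below is a minimal sketch of the
configuration that seems to hit this path: an MPISBAIJ matrix with bs=1
and an ASM preconditioner with the default overlap of 1, run on 3 or 4
processes. This is hypothetical code pieced together from the trace,
not taken from PyLith; the matrix (a 1D Laplacian) and sizes are made
up, and it uses the petsc-dev API of the time (e.g. the four-argument
KSPSetOperators).

/* Hypothetical reproducer sketch (not PyLith code): MPISBAIJ, bs=1,
 * PC ASM with default overlap 1, so KSPSolve -> PCSetUp_ASM ->
 * MatIncreaseOverlap_MPISBAIJ -> MatGetSubMatrices_MPIBAIJ_local. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  KSP            ksp;
  PC             pc;
  Vec            x, b;
  PetscInt       n = 100, i, rstart, rend;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, (char*)0, NULL);CHKERRQ(ierr);

  /* Symmetric 1D Laplacian stored as MPISBAIJ with block size 1 */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetType(A, MATMPISBAIJ);CHKERRQ(ierr);
  ierr = MatMPISBAIJSetPreallocation(A, 1, 2, NULL, 1, NULL);CHKERRQ(ierr); /* bs = 1 */
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {  /* upper triangle only for sbaij */
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
    if (i < n-1) {ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatGetVecs(A, &x, &b);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  /* ASM with the default overlap of 1, stated explicitly */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCASM);CHKERRQ(ierr);
  ierr = PCASMSetOverlap(pc, 1);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);  /* PCSetUp_ASM runs here */

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

Running something along these lines on 3 or 4 MPI processes should go
through the same MatIncreaseOverlap_MPISBAIJ ->
MatGetSubMatrices_MPIBAIJ_local path shown in the traces above.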
Thanks,
Brad