[petsc-dev] problem registering a new solver

Mark Adams mfadams at lbl.gov
Tue Mar 2 15:43:31 CST 2021


I see the problem, but not the solution. I put a print statement
in MatSolverTypeDestroy and see:

MatSolverTypeDestroy seqaij inext->next=0x7fb4a9114c70 inext=0x7fb4a9113c70

before it fails. This seqaij node seems to be pointing back into the list. If I
remove my registration call, it works, and I see this from my print
statement:

MatSolverTypeDestroy seqaij inext->next=0x7fe811827270 inext=0x7fe811826270
MatSolverTypeDestroy seqaijperm inext->next=0x7fe811909c70 inext=0x7fe811827270
MatSolverTypeDestroy constantdiagonal inext->next=0x7fe81190ac70 inext=0x7fe811909c70
 ....


On Tue, Mar 2, 2021 at 3:42 PM Mark Adams <mfadams at lbl.gov> wrote:

> I am trying to add a band solver to PETSc (later to be moved to CUDA and
> Kokkos), and I have started by adding some types and a copy of the current
> LU as placeholders. I register with:
>
>  ierr = MatSolverTypeRegister(MATSOLVERPETSC, MATSEQAIJ,
>  MAT_FACTOR_LUBAND,MatGetFactor_seqaij_petsc);CHKERRQ(ierr);
>
> And that is about all I do that can have any effect at this point. I
> add a switch in MatGetFactor_seqaij_petsc on (ftype == MAT_FACTOR_LUBAND)
> to set the symbolic factorization method: (*B)->ops->lufactorsymbolic =
> MatLUBandFactorSymbolic_SeqAIJ;. But this is not called (I verified this)
> because I don't know how to get MatGetFactor_seqaij_petsc to receive ftype
> == MAT_FACTOR_LUBAND. (I need help on this too.)
>
> Anyway, I would hope that these changes would not do anything, but I get an
> error (appended).
>
> It is failing in MatSolverTypeDestroy on this second PetscFree:
>
>     while (inext) {
>       ierr = PetscFree(inext->mtype);CHKERRQ(ierr);
>       iprev = inext;
>       inext = inext->next;
>       ierr = PetscFree(iprev);CHKERRQ(ierr);
>     }
>
> I tried to clone LU here, but I clearly missed something.
>
> Any ideas?
>
> And I just made an MR for this if you want to look at the code.
>
> Thanks,
> Mark
> ...
> Number of SNES iterations = 2
> [0]PETSC ERROR: PetscTrFreeDefault() called from MatSolverTypeDestroy()
> line 4513 in /Users/markadams/Codes/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: Block [id=0(48)] at address 0x7fb7fb8f7620 is corrupted
> (probably write past end of array)
> [0]PETSC ERROR: Block allocated in MatSolverTypeRegister() line 4382 in
> /Users/markadams/Codes/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Memory corruption:
> https://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind
> [0]PETSC ERROR: Corrupted memory
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.14.4-739-g575f931ef8
>  GIT Date: 2021-03-02 13:38:55 -0500
> [0]PETSC ERROR: ./ex19 on a arch-macosx-gnu-g named MarksMac-302.local by
> markadams Tue Mar  2 15:28:41 2021
> [0]PETSC ERROR: Configure options
> --with-mpi-dir=/usr/local/Cellar/mpich/3.3.2_1 COPTFLAGS="-g -O0"
> CXXOPTFLAGS="-g -O0" --download-metis=1 --download-parmetis=1
> --download-kokkos=1 --download-kokkos-kernels=1 --download-p4est=1
> --with-zlib=1 --download-superlu_dist --download-superlu --with-make-np=4
> --download-hdf5=1 -with-cuda=0 --with-x=0 --with-debugging=1
> PETSC_ARCH=arch-macosx-gnu-g --with-64-bit-indices=0 --with-openmp=0
> --with-ctable=0
> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 310 in
> /Users/markadams/Codes/petsc/src/sys/memory/mtr.c
> [0]PETSC ERROR: #2 MatSolverTypeDestroy() line 4513 in
> /Users/markadams/Codes/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: #3 MatFinalizePackage() line 57 in
> /Users/markadams/Codes/petsc/src/mat/interface/dlregismat.c
> [0]PETSC ERROR: #4 PetscRegisterFinalizeAll() line 389 in
> /Users/markadams/Codes/petsc/src/sys/objects/destroy.c
> [0]PETSC ERROR: #5 PetscFinalize() line 1474 in
> /Users/markadams/Codes/petsc/src/sys/objects/pinit.c
>

