[petsc-users] Crash when trying to use FD Jacobian

Barry Smith bsmith at mcs.anl.gov
Fri Jun 5 13:36:38 CDT 2015


> On Jun 5, 2015, at 1:30 PM, Harshad Sahasrabudhe <hsahasra at purdue.edu> wrote:
> 
> Hi Matt,
> 
> Thanks for helping me out with this.
>  
> Ah, this is going to be somewhat harder. Unless PETSc knows the connectivity of your Jacobian, meaning the influence between
> unknowns, it can only do one vector at a time
>  
> Yes, I'm discretizing the equation on a FEM mesh, so I know the connectivity between different DOFs.
> 
> Do you have a simplified Jacobian matrix you could use for preconditioner construction?
> 
>  I have an approximate diagonal Jacobian matrix, which doesn't give good results.
> 
> It is trying to get a coloring for the Jacobian
> 
> Is there some documentation on coloring that I can read up on, so that I can generate the coloring for the Jacobian myself?

  It is not the coloring you need to generate, just the nonzero structure; from that, PETSc computes the coloring. So do as I suggest in my other email.

   Barry
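
  In practice that means something like the following: create a Mat sized to the number of DOFs, insert an entry (a zero is fine) at every (i, j) position where two DOFs are coupled through an element, assemble it, and pass it to SNESSetJacobian() together with SNESComputeJacobianDefaultColor(). Below is a minimal sketch, not from the original thread; SetupColoredFDJacobian, n_local, n_global, and max_nnz_row are placeholder names for sizes LibMesh already knows, and only the diagonal insertion loop is written out. It uses the ierr/CHKERRQ style of the PETSc 3.4 era; the same calls exist in current releases.

#include <petscsnes.h>

/* Sketch: build a Jacobian Mat whose nonzero pattern follows the FEM
   connectivity, assemble it, and let SNES fill it by finite differences
   with coloring.  max_nnz_row is a rough upper bound on couplings per row. */
PetscErrorCode SetupColoredFDJacobian(SNES snes, PetscInt n_local,
                                      PetscInt n_global, PetscInt max_nnz_row)
{
  Mat            J;
  PetscInt       row, rstart, rend;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatCreate(PETSC_COMM_WORLD, &J);CHKERRQ(ierr);
  ierr = MatSetSizes(J, n_local, n_local, n_global, n_global);CHKERRQ(ierr);
  ierr = MatSetType(J, MATAIJ);CHKERRQ(ierr);
  ierr = MatSeqAIJSetPreallocation(J, max_nnz_row, NULL);CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(J, max_nnz_row, NULL, max_nnz_row, NULL);CHKERRQ(ierr);

  /* Record the sparsity pattern with explicit zeros.  Only the diagonal is
     shown here; in real code also insert a zero at every (i, j) where DOFs
     i and j share an element. */
  ierr = MatGetOwnershipRange(J, &rstart, &rend);CHKERRQ(ierr);
  for (row = rstart; row < rend; row++) {
    ierr = MatSetValue(J, row, row, 0.0, INSERT_VALUES);CHKERRQ(ierr);
  }

  /* The matrix must be assembled, otherwise MatGetColoring() fails with the
     "Not for unassembled matrix" error seen below. */
  ierr = MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Finite-difference Jacobian with a coloring computed from this pattern. */
  ierr = SNESSetJacobian(snes, J, J, SNESComputeJacobianDefaultColor, NULL);CHKERRQ(ierr);
  ierr = MatDestroy(&J);CHKERRQ(ierr);  /* SNES keeps its own reference */
  PetscFunctionReturn(0);
}

  Without such a pattern the only finite-difference option is the dense -snes_fd path Matt mentions, which differences one column at a time and is very slow for anything but small problems.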


> 
> Thanks,
> Harshad
> 
> On Fri, Jun 5, 2015 at 1:59 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Fri, Jun 5, 2015 at 12:32 PM, Harshad Sahasrabudhe <hsahasra at purdue.edu> wrote:
> Hi,
> 
> I'm solving a non-linear equation using NEWTONLS. The SNES is called from a wrapper in the LibMesh library. I'm trying to use the default FD Jacobian by not setting any Mat or callback function for the Jacobian.
> 
> When doing this I get the following error. I'm not able to figure out why I get this error. Can I get some pointers to what I might be doing wrong?
> 
> Ah, this is going to be somewhat harder. Unless PETSc knows the connectivity of your Jacobian, meaning the influence between
> unknowns, it can only do one vector at a time:
> 
>   -snes_fd
> 
> which is really slow. It is trying to get a coloring for the Jacobian so that it can do many vectors
> at once. Do you have a simplified Jacobian matrix you could use for preconditioner construction?
> Then it could use that.
> 
>   Thanks,
> 
>      Matt
>  
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Object is in wrong state!
> [0]PETSC ERROR: Not for unassembled matrix!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.4.3, Oct, 15, 2013 
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: ./nemo on a linux named conte-fe02.rcac.purdue.edu by hsahasra Fri Jun  5 13:25:27 2015
> [0]PETSC ERROR: Libraries linked from /home/hsahasra/NEMO5/libs/petsc/build-real/linux/lib
> [0]PETSC ERROR: Configure run at Fri Mar 20 15:18:25 2015
> [0]PETSC ERROR: Configure options --with-x=0 --download-hdf5=1 --with-scalar-type=real --with-single-library=0 --with-shared-libraries=0 --with-clanguage=C++ --with-fortran=1 --with-cc=mpiicc --with-fc=mpiifort --with-cxx=mpiicpc COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3 --download-metis=1 --download-parmetis=1 --with-valgrind-dir=/apps/rhel6/valgrind/3.8.1/ --download-mumps=1 --with-fortran-kernels=0 --with-blas-lapack-dir=/apps/rhel6/intel/composer_xe_2013.3.163/mkl --download-superlu_dist=1 --with-blas-lapack-dir=/apps/rhel6/intel/composer_xe_2013.3.163/mkl --with-blacs-lib=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.so --with-blacs-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include --with-scalapack-lib="-L/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64 -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64" i --with-scalapack-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include --with-pic=1 --with-debugging=1
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: MatGetColoring() line 481 in /home/hsahasra/NEMO5/libs/petsc/build-real/src/mat/color/color.c
> [0]PETSC ERROR: SNESComputeJacobianDefaultColor() line 64 in /home/hsahasra/NEMO5/libs/petsc/build-real/src/snes/interface/snesj2.c
> [0]PETSC ERROR: SNESComputeJacobian() line 2152 in /home/hsahasra/NEMO5/libs/petsc/build-real/src/snes/interface/snes.c
> [0]PETSC ERROR: SNESSolve_NEWTONLS() line 218 in /home/hsahasra/NEMO5/libs/petsc/build-real/src/snes/impls/ls/ls.c
> [0]PETSC ERROR: SNESSolve() line 3636 in /home/hsahasra/NEMO5/libs/petsc/build-real/src/snes/interface/snes.c
> [0]PETSC ERROR: solve() line 538 in "unknowndirectory/"src/solvers/petsc_nonlinear_solver.C
> application called MPI_Abort(comm=0x84000000, 73) - process 0
> 
> Thanks,
> Harshad
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 


