Hi all,

I am trying to use the tfs preconditioner to solve a large sparse mpiaij matrix.

11111111111111111111111111111111111111111

It works very well with a small 45 x 45 matrix (actually a 9 x 9 block matrix with block size 5) on 2 processors. The output is as follows:
  0 KSP preconditioned resid norm 3.014544557924e+04 true resid norm 2.219812091849e+04 ||Ae||/||Ax|| 1.000000000000e+00
  1 KSP preconditioned resid norm 3.679021546908e-03 true resid norm 1.502747104104e-03 ||Ae||/||Ax|| 6.769704109737e-08
  2 KSP preconditioned resid norm 2.331909907779e-09 true resid norm 8.737892755044e-10 ||Ae||/||Ax|| 3.936320910733e-14
KSP Object:
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-10, absolute=1e-50, divergence=10000
  left preconditioning
PC Object:
  type: tfs
  linear system matrix = precond matrix:
  Matrix Object:
    type=mpiaij, rows=45, cols=45
    total: nonzeros=825, allocated nonzeros=1350
      using I-node (on process 0) routines: found 5 nodes, limit used is 5
Norm of error 2.33234e-09, Iterations 2

2222222222222222222222222222222222222222

However, when I use the same code for a larger sparse matrix (an 18656 x 18656 block matrix with block size 5), it encounters the following error (the same error message appears whether I run on 1 or 2 processors):
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
[0]PETSC ERROR: or try http://valgrind.org on linux or man libgmalloc on Apple to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] PCSetUp_TFS line 116 src/ksp/pc/impls/tfs/tfs.c
[0]PETSC ERROR: [0] PCSetUp line 764 src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 183 src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve line 305 src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 2.3.3, Patch 15, Tue Sep 23 10:02:49 CDT 2008 HG revision: 31306062cd1a6f6a2496fccb4878f485c9b91760
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./kspex1reader_binmpiaij on a linux-gnu named vyan2000-linux by vyan2000 Fri May 15 01:06:12 2009
[0]PETSC ERROR: Libraries linked from /home/vyan2000/local/PPETSc/petsc-2.3.3-p15//lib/linux-gnu-c-debug
[0]PETSC ERROR: Configure run at Mon May 4 00:59:41 2009
[0]PETSC ERROR: Configure options --with-mpi-dir=/home/vyan2000/local/mpich2-1.0.8p1/ --with-debugger=gdb --with-shared=0 --download-hypre=1 --download-parmetis=1
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

3333333333333333333333333333333333333333333333

I have the exact solution x in hand, so before I push the matrix into the KSP solver, I checked the PETSc-loaded matrix A and right-hand-side vector b by verifying that Ax - b = 0 (roughly as in the sketch below), in both the 1-processor and 2-processor cases.
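For reference, the check is essentially of the following form (a minimal sketch with an illustrative function name, not the actual reader code):

#include "petscksp.h"

/* Given the loaded matrix A, right-hand side b, and the known exact
   solution x, report ||A*x - b||; it should be (near) zero. */
PetscErrorCode CheckLoadedSystem(Mat A, Vec b, Vec x)
{
  Vec            r;
  PetscReal      nrm;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = VecDuplicate(b,&r);CHKERRQ(ierr);
  ierr = MatMult(A,x,r);CHKERRQ(ierr);        /* r = A*x     */
  ierr = VecAXPY(r,-1.0,b);CHKERRQ(ierr);     /* r = A*x - b */
  ierr = VecNorm(r,NORM_2,&nrm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"||A*x - b|| = %G\n",nrm);CHKERRQ(ierr);
  ierr = VecDestroy(r);CHKERRQ(ierr);         /* petsc-2.3.x signature */
  PetscFunctionReturn(0);
}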
Any suggestions?

Thank you very much,

Yan
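P.S. In case it is useful, the solve portion of the code follows the standard KSP setup. The sketch below is simplified (the loading of A, b, and the exact solution is omitted, and the names are illustrative):

#include "petscksp.h"

/* Simplified sketch of the solve: GMRES with the tfs preconditioner and
   rtol 1e-10, as reflected in the KSP Object printout above. */
PetscErrorCode SolveWithTFS(Mat A, Vec b, Vec u)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPGMRES);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,"tfs");CHKERRQ(ierr);   /* same as -pc_type tfs */
  ierr = KSPSetTolerances(ksp,1.e-10,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,u);CHKERRQ(ierr);
  ierr = KSPDestroy(ksp);CHKERRQ(ierr);       /* petsc-2.3.x signature */
  PetscFunctionReturn(0);
}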