About the -pc_type tfs

Ryan Yan vyan2000 at gmail.com
Thu May 21 13:36:26 CDT 2009


Hi Matt,
May I ask about the files I sent out last week? If anything is
missing, please let me know.

I have other choices for -pc_type, but 'tfs' seems very promising,
given its excellent convergence trajectory when applied to the smaller
matrix.
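
For reference, the runs below were launched along these lines (a
sketch only; the monitoring options shown are consistent with the
output below, but may not be the exact flags that were used):

  mpiexec -n 2 ./kspex1reader_binmpiaij -pc_type tfs -ksp_type gmres \
      -ksp_rtol 1e-10 -ksp_monitor_true_residual -ksp_view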

Thank you very much,

Yan

On Fri, May 15, 2009 at 10:46 PM, Ryan Yan <vyan2000 at gmail.com> wrote:

> Hi Matt,
> Since the bin files are quite big, I did not post this email to the group.
> You can send your next response to the group.
>
> The matrix bin file is in the attachment.
> petsc_matrix_coef.bin is the bin file to generate the matrix, which has
> the AIJ format.
>
> petsc_vec_knownsolu.bin is the bin file to generate the exact solution x.
>
> petsc_vec_rhs.bin is the bin file to generate the right hand side b.
>
> A C program named "tfs_binarymatrix_verify.c" is also attached for your
> convenience, to check that Ax - b = 0.
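>
> For reference, a minimal sketch of such a check, written against the
> modern PETSc 3.x binary I/O interface (the MatLoad/VecLoad calling
> sequences in the 2.3.3 release used in this thread differ slightly;
> error checking is omitted for brevity):
>
> #include <petscksp.h>
>
> int main(int argc, char **argv)
> {
>   Mat         A;
>   Vec         x, b, r;
>   PetscViewer viewer;
>   PetscReal   norm;
>
>   PetscInitialize(&argc, &argv, NULL, NULL);
>
>   /* Load the matrix A from the binary file */
>   PetscViewerBinaryOpen(PETSC_COMM_WORLD, "petsc_matrix_coef.bin",
>                         FILE_MODE_READ, &viewer);
>   MatCreate(PETSC_COMM_WORLD, &A);
>   MatSetType(A, MATMPIAIJ);
>   MatLoad(A, viewer);
>   PetscViewerDestroy(&viewer);
>
>   /* Load the known exact solution x */
>   PetscViewerBinaryOpen(PETSC_COMM_WORLD, "petsc_vec_knownsolu.bin",
>                         FILE_MODE_READ, &viewer);
>   VecCreate(PETSC_COMM_WORLD, &x);
>   VecLoad(x, viewer);
>   PetscViewerDestroy(&viewer);
>
>   /* Load the right-hand side b */
>   PetscViewerBinaryOpen(PETSC_COMM_WORLD, "petsc_vec_rhs.bin",
>                         FILE_MODE_READ, &viewer);
>   VecCreate(PETSC_COMM_WORLD, &b);
>   VecLoad(b, viewer);
>   PetscViewerDestroy(&viewer);
>
>   /* r = A*x - b; its 2-norm should be near machine zero */
>   VecDuplicate(b, &r);
>   MatMult(A, x, r);
>   VecAXPY(r, -1.0, b);
>   VecNorm(r, NORM_2, &norm);
>   PetscPrintf(PETSC_COMM_WORLD, "||Ax - b|| = %g\n", (double)norm);
>
>   VecDestroy(&r); VecDestroy(&b); VecDestroy(&x); MatDestroy(&A);
>   PetscFinalize();
>   return 0;
> }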
>
> If you need any more information, please let me know.
>
> Thank you very much,
>
> Yan
>
> On Fri, May 15, 2009 at 1:34 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> If you send the matrix in PETSc binary format we can check this.
>>
>>   Matt
>>
>>
>> On Fri, May 15, 2009 at 12:20 PM, Ryan Yan <vyan2000 at gmail.com> wrote:
>>
>>> Hi all,
>>> I am trying to use the tfs preconditioner to solve a large sparse mpiaij
>>> matrix.
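>>>
>>> The preconditioner can be selected with -pc_type tfs on the command
>>> line, or programmatically; a minimal sketch, assuming the modern
>>> PETSc 3.x API with the matrix A and vectors b, x already loaded:
>>>
>>>   KSP ksp;
>>>   PC  pc;
>>>
>>>   KSPCreate(PETSC_COMM_WORLD, &ksp);
>>>   KSPSetOperators(ksp, A, A);
>>>   KSPGetPC(ksp, &pc);
>>>   PCSetType(pc, PCTFS);      /* equivalent to -pc_type tfs */
>>>   KSPSetFromOptions(ksp);    /* honor command-line overrides */
>>>   KSPSolve(ksp, b, x);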
>>>
>>> 11111111111111111111111111111111111111111
>>> It works very well with a small 45 x 45 matrix (actually a 9 x 9 block
>>> matrix with block size 5) on 2 processors; the output is as follows:
>>>
>>>   0 KSP preconditioned resid norm 3.014544557924e+04 true resid norm
>>> 2.219812091849e+04 ||Ae||/||Ax|| 1.000000000000e+00
>>>   1 KSP preconditioned resid norm 3.679021546908e-03 true resid norm
>>> 1.502747104104e-03 ||Ae||/||Ax|| 6.769704109737e-08
>>>   2 KSP preconditioned resid norm 2.331909907779e-09 true resid norm
>>> 8.737892755044e-10 ||Ae||/||Ax|| 3.936320910733e-14
>>> KSP Object:
>>>   type: gmres
>>>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt
>>> Orthogonalization with no iterative refinement
>>>     GMRES: happy breakdown tolerance 1e-30
>>>   maximum iterations=10000, initial guess is zero
>>>   tolerances:  relative=1e-10, absolute=1e-50, divergence=10000
>>>   left preconditioning
>>> PC Object:
>>>   type: tfs
>>>   linear system matrix = precond matrix:
>>>   Matrix Object:
>>>     type=mpiaij, rows=45, cols=45
>>>     total: nonzeros=825, allocated nonzeros=1350
>>>       using I-node (on process 0) routines: found 5 nodes, limit used is
>>> 5
>>> Norm of error 2.33234e-09, Iterations 2
>>>
>>> 2222222222222222222222222222222222222222
>>>
>>> However, when I use the same code for a larger sparse matrix (an 18656 x
>>> 18656 block matrix with block size 5), it encounters the following
>>> error. (The error message is the same on 1 and 2 processors.)
>>>
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>>> probably memory access out of range
>>> [0]PETSC ERROR: Try option -start_in_debugger or
>>> -on_error_attach_debugger
>>> [0]PETSC ERROR: or see
>>> http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
>>> [0]PETSC ERROR: or try http://valgrind.org on linux or man libgmalloc on
>>> Apple to find memory corruption errors
>>> [0]PETSC ERROR: likely location of problem given in stack below
>>> [0]PETSC ERROR: ---------------------  Stack Frames
>>> ------------------------------------
>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not
>>> available,
>>> [0]PETSC ERROR:       INSTEAD the line number of the start of the
>>> function
>>> [0]PETSC ERROR:       is given.
>>> [0]PETSC ERROR: [0] PCSetUp_TFS line 116 src/ksp/pc/impls/tfs/tfs.c
>>> [0]PETSC ERROR: [0] PCSetUp line 764 src/ksp/pc/interface/precon.c
>>> [0]PETSC ERROR: [0] KSPSetUp line 183 src/ksp/ksp/interface/itfunc.c
>>> [0]PETSC ERROR: [0] KSPSolve line 305 src/ksp/ksp/interface/itfunc.c
>>> [0]PETSC ERROR: --------------------- Error Message
>>> ------------------------------------
>>> [0]PETSC ERROR: Signal received!
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: Petsc Release Version 2.3.3, Patch 15, Tue Sep 23
>>> 10:02:49 CDT 2008 HG revision: 31306062cd1a6f6a2496fccb4878f485c9b91760
>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: ./kspex1reader_binmpiaij on a linux-gnu named
>>> vyan2000-linux by vyan2000 Fri May 15 01:06:12 2009
>>> [0]PETSC ERROR: Libraries linked from
>>> /home/vyan2000/local/PPETSc/petsc-2.3.3-p15//lib/linux-gnu-c-debug
>>> [0]PETSC ERROR: Configure run at Mon May  4 00:59:41 2009
>>> [0]PETSC ERROR: Configure options
>>> --with-mpi-dir=/home/vyan2000/local/mpich2-1.0.8p1/ --with-debugger=gdb
>>> --with-shared=0 --download-hypre=1 --download-parmetis=1
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: User provided function() line 0 in unknown directory
>>> unknown file
>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0[cli_0]:
>>> aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>>>
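>>> Following the hints in the error message itself, the crash inside
>>> PCSetUp_TFS can be localized by running under valgrind or by
>>> attaching a debugger, e.g. (a sketch):
>>>
>>>   mpiexec -n 1 valgrind --tool=memcheck ./kspex1reader_binmpiaij -pc_type tfs
>>>   mpiexec -n 1 ./kspex1reader_binmpiaij -pc_type tfs -start_in_debugger gdb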
>>>
>>> 3333333333333333333333333333333333333333333333
>>>
>>> I have the exact solution x in hand, so before I push the matrix into
>>> the KSP solver, I checked the loaded matrix A and rhs vector b by
>>> verifying Ax - b = 0, in both the 1-processor and 2-processor cases.
>>>
>>> Any suggestions?
>>>
>>> Thank you very much,
>>>
>>> Yan
>>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>