[petsc-dev] PCApplyTranspose in BiCG

Alexander Grayver agrayver at gfz-potsdam.de
Thu Mar 22 12:30:47 CDT 2012


Jed,

Ok, I see. Initially I changed the solver because I observed one thing: I 
have two petsc-dev versions compiled (one older than the other), and 
running the same program with both gives different results:

./solveTest -ksp_view -ksp_converged_reason -ksp_monitor_true_residual 
-log_summary -mat_type aij -ksp_rtol 1.0e-7 -ksp_type tfqmr -pc_type sor 
-ksp_max_it 750
   0 KSP preconditioned resid norm 9.760858225033e-10 true resid norm 4.108026443997e-08 ||r(i)||/||b|| 1.000000000000e+00
   1 KSP preconditioned resid norm 9.497939702484e-10 true resid norm 3.591828545356e-08 ||r(i)||/||b|| 8.743440662618e-01
   1 KSP preconditioned resid norm 1.341212527158e-09 true resid norm 3.754099340764e-08 ||r(i)||/||b|| 9.138449793207e-01
   2 KSP preconditioned resid norm 9.481744679086e-10 true resid norm 3.710424003325e-08 ||r(i)||/||b|| 9.032132713620e-01
   2 KSP preconditioned resid norm 1.340878639500e-09 true resid norm 3.730261027312e-08 ||r(i)||/||b|| 9.080421166135e-01
   3 KSP preconditioned resid norm 9.481415577317e-10 true resid norm 3.725585425324e-08 ||r(i)||/||b|| 9.069039540306e-01
  ...
 750 KSP preconditioned resid norm 1.340874209607e-09 true resid norm 3.726506102951e-08 ||r(i)||/||b|| 9.071280708031e-01
Linear solve did not converge due to DIVERGED_ITS iterations 750
KSP Object: 1 MPI processes
   type: tfqmr
   maximum iterations=750, initial guess is zero
   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
   left preconditioning
   using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
   type: sor
     SOR: type = local_symmetric, iterations = 1, local iterations = 1, 
omega = 1
   linear system matrix = precond matrix:
   Matrix Object:   1 MPI processes
     type: seqaij
     rows=1375920, cols=1375920
     total: nonzeros=17671714, allocated nonzeros=17671714
     total number of mallocs used during MatSetValues calls =0
       not using I-node routines

Using Petsc Development HG revision: 
59ca71b85d784abbe22f12c34ffcb2ccc6dc68b0  HG Date: Tue Mar 20 01:05:13 
2012 -0500
Configure options: --with-petsc-arch=openmpi-intel-complex-release-c 
--with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2 --with-scalar-type=complex 
--with-precision=double --with-blas-lapack-dir=/usr/local/lib --with-x=0 
--with-debugging=0

#######################################
And now the second one:
./solveTest -ksp_view -ksp_converged_reason -ksp_monitor_true_residual 
-log_summary -mat_type aij -ksp_rtol 1.0e-7 -ksp_type tfqmr -pc_type sor 
-ksp_max_it 750
   0 KSP preconditioned resid norm 9.760858225034e-10 true resid norm 4.108026443997e-08 ||r(i)||/||b|| 1.000000000000e+00
   1 KSP preconditioned resid norm 9.497930576113e-10 true resid norm 3.591633653854e-08 ||r(i)||/||b|| 8.742966246242e-01
   1 KSP preconditioned resid norm 1.341211098002e-09 true resid norm 3.754049143693e-08 ||r(i)||/||b|| 9.138327600539e-01
   2 KSP preconditioned resid norm 9.481662292281e-10 true resid norm 3.709648505700e-08 ||r(i)||/||b|| 9.030244951612e-01
   2 KSP preconditioned resid norm 1.340863953080e-09 true resid norm 3.730358317575e-08 ||r(i)||/||b|| 9.080657995827e-01
   3 KSP preconditioned resid norm 9.481308142928e-10 true resid norm 3.725036905334e-08 ||r(i)||/||b|| 9.067704300630e-01
 ...
 230 KSP preconditioned resid norm 8.875661094394e-17 true resid norm 1.062030983067e-14 ||r(i)||/||b|| 2.585258389996e-07
Linear solve converged due to CONVERGED_RTOL iterations 230
KSP Object: 1 MPI processes
   type: tfqmr
   maximum iterations=750, initial guess is zero
   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
   left preconditioning
   using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
   type: sor
     SOR: type = local_symmetric, iterations = 1, local iterations = 1, 
omega = 1
   linear system matrix = precond matrix:
   Matrix Object:   1 MPI processes
     type: seqaij
     rows=1375920, cols=1375920
     total: nonzeros=17671714, allocated nonzeros=17671714
     total number of mallocs used during MatSetValues calls =0
       not using I-node routines

Using Petsc Development HG revision: 
a4dae06b4d05ef5b78a9a9aa2cbc31b2b158834f  HG Date: Wed Feb 08 11:29:52 
2012 -0600
Configure options: --with-petsc-arch=openmpi-intel-complex-release-f-mkl 
--with-fortran-interfaces=1 --download-superlu --download-superlu_dist 
--download-mumps --download-parmetis --download-ptscotch 
--download-metis 
--with-scalapack-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_scalapack_lp64.a 
--with-scalapack-include=/opt/intel/Compiler/11.1/072/mkl/include 
--with-blacs-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_blacs_openmpi_lp64.a 
--with-blacs-include=/opt/intel/Compiler/11.1/072/mkl/include 
--with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2 --with-scalar-type=complex 
--with-blas-lapack-lib="[/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_intel_lp64.a,/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_intel_thread.a,/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_core.a,/opt/intel/Compiler/11.1/072/lib/intel64/libiomp5.a]" 
--with-precision=double --with-x=0 --with-debugging=0
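
For completeness, here is roughly what those command-line options correspond 
to in code. This is only a sketch against the petsc-dev API of that period 
(where KSPSetOperators still takes a MatStructure flag); the function name 
SolveWithTFQMRSOR and the already-assembled A, b, x are placeholders, since 
the real solveTest.c is not shown in this thread:

#include <petscksp.h>

/* Sketch of the solver configuration used in both runs above; A, b, x
   are assumed to be created and assembled elsewhere. */
PetscErrorCode SolveWithTFQMRSOR(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPTFQMR);CHKERRQ(ierr);            /* -ksp_type tfqmr */
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCSOR);CHKERRQ(ierr);                 /* -pc_type sor */
  ierr = KSPSetTolerances(ksp,1.0e-7,PETSC_DEFAULT,
                          PETSC_DEFAULT,750);CHKERRQ(ierr); /* -ksp_rtol, -ksp_max_it */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);              /* -ksp_view, monitors, ... */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  return 0;
}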


Well, the two versions were compiled with different configure lines, but I 
still don't think that is the reason...
Where should I actually look?

Thanks.

On 22.03.2012 15:42, Jed Brown wrote:
> On Thu, Mar 22, 2012 at 04:23, Alexander Grayver 
> <agrayver at gfz-potsdam.de> wrote:
>
>     I've got the following error when applying bicg+sor to my problem:
>
>     mt-soft/solveTest> ./solveTest -ksp_view -ksp_converged_reason
>     -ksp_monitor_true_residual -log_summary -mat_type aij -ksp_rtol
>     1.0e-7 -ksp_type bicg -pc_type sor -ksp_max_it 750
>     [0]PETSC ERROR: --------------------- Error Message
>     ------------------------------------
>     [0]PETSC ERROR: No support for this operation for this object type!
>     [0]PETSC ERROR: PC does not have apply transpose!
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: Petsc Development HG revision:
>     59ca71b85d784abbe22f12c34ffcb2ccc6dc68b0  HG Date: Tue Mar 20
>     01:05:13 2012 -0500
>     [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>     [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>     [0]PETSC ERROR: See docs/index.html for manual pages.
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: Configure options
>     --with-petsc-arch=openmpi-intel-complex-release-c
>     --with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2
>     --with-scalar-type=complex --with-precision=double
>     --with-blas-lapack-dir=/usr/local/lib --with-x=0 --with-debugging=0
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: PCApplyTranspose() line 515 in
>     /home/mt/agrayver/lib/petsc-dev/src/ksp/pc/interface/precon.c
>     [0]PETSC ERROR: KSPSolve_BiCG() line 55 in
>     /home/mt/agrayver/lib/petsc-dev/src/ksp/ksp/impls/bicg/bicg.c
>     [0]PETSC ERROR: KSPSolve() line 434 in
>     /home/mt/agrayver/lib/petsc-dev/src/ksp/ksp/interface/itfunc.c
>     [0]PETSC ERROR: main() line 114 in /home/solveTest.c
>
>     Any ideas?
>
>
> PCApplyTranspose_SOR is not implemented. Unfortunately, applying the 
> transpose is not an operation that can be implemented efficiently 
> in-place the way SOR itself is.
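
One way to guard against this at setup time (just a sketch; 
PickTransposeSafeKSP is a made-up helper, not something in PETSc) is to ask 
the PC whether it can apply its transpose via PCApplyTransposeExists() and, 
if it cannot, fall back to a transpose-free Krylov method such as BCGS or 
TFQMR:

#include <petscksp.h>

/* BiCG applies both the preconditioner and its transpose each iteration,
   so it needs PCApplyTranspose(). If the chosen PC (e.g. SOR) cannot
   provide that, switch to a transpose-free method instead. */
PetscErrorCode PickTransposeSafeKSP(KSP ksp)
{
  PC             pc;
  PetscBool      hastrans;
  PetscErrorCode ierr;

  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCApplyTransposeExists(pc,&hastrans);CHKERRQ(ierr);
  if (hastrans) {
    ierr = KSPSetType(ksp,KSPBICG);CHKERRQ(ierr);  /* PC^T available: BiCG is fine */
  } else {
    ierr = KSPSetType(ksp,KSPBCGS);CHKERRQ(ierr);  /* fall back to transpose-free BiCGStab */
  }
  return 0;
}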


-- 
Regards,
Alexander
