[petsc-users] Convergence of transposed linear system.

francois Pacull fpacull at hotmail.com
Mon Nov 24 15:50:51 CST 2014

Gaetan,

Do you observe this behavior on some smaller matrices? I would say that an easy way to test this on a small linear system is to build the PC with the untransposed preconditioning matrix, get the overlapping partition IS with PCASMGetLocalSubdomains, destroy the PC, create a new one with the transposed preconditioning matrix, and set the partition with PCASMSetLocalSubdomains (using the stored IS).
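In PETSc calls, that test could look roughly like the sketch below (written with the current PETSc error-checking macros; A, Pmat, b, and x are placeholders for your assembled operators, and error/ownership details may need adjusting for your code):

```c
#include <petscksp.h>

/* Sketch: build ASM from the untransposed Pmat, harvest the overlapping
   subdomain index sets, then reuse them for a solve with the transposed
   operators. A, Pmat, b, x are assumed assembled elsewhere. */
static PetscErrorCode SolveTransposeWithReusedSubdomains(Mat A, Mat Pmat, Vec b, Vec x)
{
  KSP      ksp;
  PC       pc;
  PetscInt n, i;
  IS      *is, *is_local, *is_copy;
  Mat      AT, PmatT;

  PetscFunctionBeginUser;
  /* 1) ASM preconditioner from the untransposed preconditioning matrix. */
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, Pmat));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCASM));
  PetscCall(PCASMSetOverlap(pc, 1));
  PetscCall(PCSetUp(pc));

  /* 2) Copy the overlapping partition before destroying the PC
        (the returned IS arrays are owned by the PC). */
  PetscCall(PCASMGetLocalSubdomains(pc, &n, &is, &is_local));
  PetscCall(PetscMalloc1(n, &is_copy));
  for (i = 0; i < n; i++) PetscCall(ISDuplicate(is[i], &is_copy[i]));
  PetscCall(KSPDestroy(&ksp));

  /* 3) New KSP/PC on the transposed operators, imposing the stored IS. */
  PetscCall(MatTranspose(A, MAT_INITIAL_MATRIX, &AT));
  PetscCall(MatTranspose(Pmat, MAT_INITIAL_MATRIX, &PmatT));
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, AT, PmatT));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCASM));
  PetscCall(PCASMSetLocalSubdomains(pc, n, is_copy, NULL));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));

  /* Clean up the duplicated index sets and transposed matrices. */
  for (i = 0; i < n; i++) PetscCall(ISDestroy(&is_copy[i]));
  PetscCall(PetscFree(is_copy));
  PetscCall(MatDestroy(&AT));
  PetscCall(MatDestroy(&PmatT));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```

If the partitions built from the transposed and untransposed matrices give the same convergence, the overlap construction is not the culprit.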

And did you ever observe this behavior without any overlap: ASM(overlap 0)? 

What do you mean when you say that left preconditioning fails? Does it stagnate, or converge slowly? It is rather strange, I think.

Francois.


Date: Mon, 24 Nov 2014 08:03:17 -0500
Subject: Re: [petsc-users] Convergence of transposed linear system.
From: gaetank at gmail.com
To: fpacull at hotmail.com
CC: petsc-users at mcs.anl.gov

That is a good idea to try, Francois. Do you know if there is an easy way to try that in PETSc? However, in my case, I'm not using an upwind scheme, but rather a 2nd-order JST scheme for the preconditioner. Also, we have observed the same behavior even for Euler systems, although both the direct/adjoint systems in this case are easier to solve and the difference between the systems is less dramatic.
I also thought about using left preconditioning for the adjoint system instead of right preconditioning, but left preconditioning consistently fails even for the untransposed system. I have no idea why left preconditioning doesn't work.
Gaetan
On Mon, Nov 24, 2014 at 6:24 AM, francois Pacull <fpacull at hotmail.com> wrote:



Hello,

This is just an idea, but this might be due to the fact that the structure of the preconditioner is severely nonsymmetric when using a first-order upwind scheme without viscous terms: when building the overlap, the non-zero terms in the row-wise off-diagonal blocks yield the list of vertices to add to each subdomain. If you use the transpose of the preconditioner, it still uses the row-wise and not the column-wise off-diagonal blocks. So maybe you should build the ASM(1) preconditioner with the untransposed matrix first, and then transpose the preconditioning matrix? You may also change the side of the preconditioner for the transposed system.

Francois.


Date: Sun, 23 Nov 2014 20:54:20 -0500
From: gaetank at gmail.com
To: petsc-users at mcs.anl.gov
Subject: [petsc-users] Convergence of transposed linear system.

Hi everyone,
I have a question relating to preconditioner effectiveness on large transposed systems. The linear system I'm trying to solve is the Jacobian matrix of a 3D RANS CFD solver. The block matrix consists of about 3 million block rows with a block size of 6: 5 for the inviscid part and 1 for the SA turbulence model.
The preconditioning matrix is different from the linear system matrix in two ways: it uses a first-order discretization (instead of second order), and the viscous fluxes are dropped.
The untransposed system converges about 6 orders of magnitude with GMRES(100), ASM (overlap 1), and ILU(1) with RCM reordering. The test is run on 128 processors. There are no convergence difficulties.
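For completeness, this setup corresponds to roughly the following runtime options (option names as I understand them from the PETSc manual; the last line is the sub-solver factorization-ordering option):

```text
-ksp_type gmres
-ksp_gmres_restart 100
-pc_type asm
-pc_asm_overlap 1
-sub_pc_type ilu
-sub_pc_factor_levels 1
-sub_pc_factor_mat_ordering_type rcm
```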
However, when I try to solve the transpose of the same system, either by calling KSPSolveTranspose() or by assembling the transpose of the linear system and its preconditioner and calling KSPSolve(), GMRES stagnates after a negligible drop in the residual and no further progress is made.
I have successfully solved this transposed system by using a different preconditioner that includes the complete linearization of the viscous terms (~4 times as many non-zeros in the PC matrix) and a much stronger method (ASM(2), ILU(2), with a GMRES restart of 200).
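In options form, that stronger configuration is roughly (same caveat on exact option names):

```text
-ksp_type gmres
-ksp_gmres_restart 200
-pc_type asm
-pc_asm_overlap 2
-sub_pc_type ilu
-sub_pc_factor_levels 2
```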

My question is: why does the solution of the transposed system with the same method perform so terribly? Is it normal that a vastly stronger preconditioning method is required to solve transposed systems?
Any suggestions would be greatly appreciated.
Gaetan

