[petsc-users] Preconditioning of Liouvillian Superoperator

Matthew Knepley knepley at gmail.com
Wed Jan 31 08:01:05 CST 2024


On Wed, Jan 31, 2024 at 8:21 AM Mark Adams <mfadams at lbl.gov> wrote:

> Iterative solvers have to be designed for your particular operator.
> You want to look in your field to see how people solve these problems.
> (e.g., zeros on the diagonal will need something like a block solver, or
> maybe ILU with a particular ordering.)
> I don't personally know anything about this operator. Perhaps someone else
> can help you, but you will probably need to find this yourself.
> Also, hypre's ILUTP is not well supported. You could use our (serial) ILU
> on one processor to experiment with (
> https://petsc.org/main/manualpages/PC/PCILU).
>

As Mark says, understanding your operator is key here. However, a few
comments about GMRES: full GMRES is guaranteed to converge (in exact
arithmetic). By default you are using the restarted variant GMRES(30),
which carries no such guarantee. You could look at the effect of
increasing the restart (subspace) size. This is probably not worth doing
without first understanding at least the spectrum of the operator and
other analytic characteristics (say, whether it comes from a PDE, a BIE,
etc.).
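For experimenting with the two points above, the PETSc options database can be used directly. A minimal sketch of an options file follows; the option names are standard PETSc options, but the specific values (restart size, fill level) are illustrative starting points, not recommendations:

```
# Larger restart so the Krylov space is not truncated at the default of 30
-ksp_type gmres
-ksp_gmres_restart 500
-ksp_monitor_true_residual
-ksp_converged_reason

# Serial ILU experiment (run on one process): RCM ordering, some fill,
# and a pivot shift to cope with zeros on the diagonal
-pc_type ilu
-pc_factor_levels 2
-pc_factor_mat_ordering_type rcm
-pc_factor_shift_type NONZERO
```

These can also be passed on the command line; -ksp_monitor_true_residual and -ksp_converged_reason make it easy to see whether the true residual is actually decreasing or the iteration is stagnating.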

  Thanks,

     Matt


> Mark
>
>
> On Wed, Jan 31, 2024 at 6:51 AM Niclas Götting <
> ngoetting at itp.uni-bremen.de> wrote:
>
>> Hi all,
>>
>> I've been trying for the last couple of days to solve a linear system
>> using iterative methods. The system size itself scales exponentially
>> (64^N) with the number of components, so I receive sizes of
>>
>> * (64, 64) for one component
>> * (4096, 4096) for two components
>> * (262144, 262144) for three components
>>
>> I can solve the first two cases with direct solvers and don't run into
>> any problems; however, the last case is the first nontrivial one, and it
>> is too large for a direct solution, which is why I believe I need an
>> iterative solver.
>>
>> As I know the solution for the first two cases, I tried to reproduce
>> them using GMRES and failed already on the second: GMRES didn't converge
>> and seems to have been heading in the wrong direction (the vector to
>> which it "tries" to converge is completely different from the correct
>> solution). I went as far as -ksp_max_it 1000000, which takes orders of
>> magnitude longer than the LU solution, and intuitively GMRES should not
>> take *that* much longer than LU. Here is the information I have about
>> this (4096, 4096) system:
>>
>> * not symmetric (which is why I went for GMRES)
>> * not singular (SVD: condition number 1.427743623238e+06; 0 of 4096
>> singular values are (nearly) zero)
>> * solving without preconditioning does not converge (DIVERGED_ITS)
>> * solving with ILU and natural ordering fails due to zeros on the diagonal
>> * solving with ILU and RCM ordering does not converge (DIVERGED_ITS)
>>
>> After some searching I also found
>> [this](http://arxiv.org/abs/1504.06768) paper, which mentions the use of
>> ILUTP, which I believe is accessed in PETSc via hypre; that, however,
>> threw a SEGV for me, and I'm not sure it's worth debugging at this point
>> in time, because I might be missing something entirely different.
>>
>> Does anybody have an idea how this system could be solved in finite
>> time, such that the method also scales to the three component problem?
>>
>> Thank you all very much in advance!
>>
>> Best regards
>> Niclas
>>
>>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.cse.buffalo.edu/~knepley/>