[petsc-dev] : Implementing of a variable block size BILU preconditioner

Ali Reza Khaz'ali arkhazali at cc.iut.ac.ir
Wed Dec 5 12:58:15 CST 2018


Dear Mark,

 

Thanks for your kind answer and good suggestion.

The matrix is a block matrix, with sparse blocks of dimension 5.

And about the scales: the small numbers do get lost, though they are important. In fact, that is one of my most serious problems with this system. Do you have any recommendation for alleviating it?

 

From: Mark Adams <mfadams at lbl.gov> 
Sent: Wednesday, December 05, 2018 5:22 PM
To: arkhazali at cc.iut.ac.ir
Cc: For users of the development version of PETSc <petsc-dev at mcs.anl.gov>
Subject: Re: [petsc-dev] FW: Re[2]: Implementing of a variable block size BILU preconditioner

 

If you zero a row out, then put something nonzero on the diagonal. 
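
In PETSc both steps can be done in one call; a minimal sketch, assuming A, nrows, and rows come from the application code:

  /* Zero the listed rows and put 1.0 on each zeroed diagonal entry;
     x and b may be passed instead of NULL to also fix up the
     right-hand side for known boundary values. */
  ierr = MatZeroRows(A, nrows, rows, 1.0, NULL, NULL);CHKERRQ(ierr);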

 

And your matrix data file (it does not look like it has any sparsity metadata) spans about 18 orders of magnitude. When you diagonally scale, which most solvers do implicitly, it looks like some of these numbers will simply be lost, and you will not get correct results if they are relevant.
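
For reference, an explicit symmetric Jacobi scaling can be written as the sketch below (it assumes the diagonal has no zero entries; the right-hand side and the computed solution must then be scaled consistently with D^{-1/2}):

  Vec d;
  ierr = MatCreateVecs(A, &d, NULL);CHKERRQ(ierr);
  ierr = MatGetDiagonal(A, d);CHKERRQ(ierr);
  ierr = VecSqrtAbs(d);CHKERRQ(ierr);     /* d_i = sqrt(|a_ii|)   */
  ierr = VecReciprocal(d);CHKERRQ(ierr);  /* d_i = 1/sqrt(|a_ii|) */
  ierr = MatDiagonalScale(A, d, d);CHKERRQ(ierr);  /* A <- D^{-1/2} A D^{-1/2} */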

 

On Tue, Dec 4, 2018 at 11:57 PM Ali Reza Khaz'ali via petsc-dev <petsc-dev at mcs.anl.gov> wrote:

Dear Jed,

It is ILU(0) since I did not set the levels. Also, no shift was applied. I have attached a Jacobian matrix for the smallest case that I can simulate with BILU. I've noticed that it has zeros on its diagonal since I had to remove the phase equilibria equations to make the block sizes equal. Additionally, after a discussion with one of my students, I am now convinced that zeros may temporarily appear on the diagonal of the Jacobian of the original code (with phase equilibrium) during SNES iterations.
I do not know if the attached Jacobian can be used for comparison purposes. Changing the preconditioner or the linear solver changes the convergence behavior of SNES, and hence the simulator will adjust some of its other parameters (e.g., the time step size) to get the most accurate and fastest results. In other words, we would not have the same Jacobian as the one attached if scalar ILU were used.

Many thanks,
Ali 

-----Original Message-----
From: Jed Brown <jed at jedbrown.org>
Sent: Wednesday, December 05, 2018 12:00 AM
To: Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir>; 'Smith, Barry F.' <bsmith at mcs.anl.gov>
Cc: petsc-dev at mcs.anl.gov
Subject: RE: Re[2]: [petsc-dev] Implementing of a variable block size BILU preconditioner

Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir> writes:

> Dear Jed,
>
> ILU with BAIJ works, and its performance in reducing the condition number is slightly better than PCVPBJACOBI. Thanks for your guidance.

Is it ILU(0)?  Did you need to enable shifts?  Can you write out a small matrix that succeeds with BAIJ/ILU but not with AIJ/ILU, so we can compare?
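
The Jacobian can be dumped in PETSc binary format with something along these lines (J being the assembled matrix; the file name is arbitrary):

  PetscViewer viewer;
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "jacobian.bin",
                               FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = MatView(J, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

It can then be read back with MatLoad() for offline experiments.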

> Best wishes,
> Ali
>
> -----Original Message-----
> From: Jed Brown <jed at jedbrown.org>
> Sent: Tuesday, December 04, 2018 9:40 PM
> To: Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir>; 'Smith, Barry F.'
> <bsmith at mcs.anl.gov>
> Cc: petsc-dev at mcs.anl.gov
> Subject: RE: Re[2]: [petsc-dev] Implementing of a variable block size 
> BILU preconditioner
>
> Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir> writes:
>
>> Dear Jed,
>>
>> Thanks for your kind answer. I thought scalar BJACOBI does not need
>> data from the other domains, but ILU does.
>
> There is no parallel ILU in PETSc.
>
> $ mpiexec -n 2 mpich-clang/tests/ksp/ksp/examples/tutorials/ex2 -pc_type ilu
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Could not locate a solver package. Perhaps you must ./configure with --download-<package>
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.10.2-19-g217b8b62e2  GIT Date: 2018-10-17 10:34:59 +0200
> [0]PETSC ERROR: mpich-clang/tests/ksp/ksp/examples/tutorials/ex2 on a mpich-clang named joule by jed Tue Dec  4 11:02:53 2018
> [0]PETSC ERROR: Configure options --download-chaco --download-p4est --download-sundials --download-triangle --with-fc=0 --with-mpi-dir=/home/jed/usr/ccache/mpich-clang --with-visibility --with-x --with-yaml PETSC_ARCH=mpich-clang
> [0]PETSC ERROR: #1 MatGetFactor() line 4485 in /home/jed/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: #2 PCSetUp_ILU() line 142 in /home/jed/petsc/src/ksp/pc/impls/factor/ilu/ilu.c
> [0]PETSC ERROR: #3 PCSetUp() line 932 in /home/jed/petsc/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #4 KSPSetUp() line 391 in /home/jed/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #5 KSPSolve() line 723 in /home/jed/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #6 main() line 201 in /home/jed/petsc/src/ksp/ksp/examples/tutorials/ex2.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -malloc_test
> [0]PETSC ERROR: -pc_type ilu
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 92) - process 0
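>
> To run ILU in parallel you compose it with a domain decomposition
> preconditioner instead, e.g.
>
>   mpiexec -n 2 ./ex2 -pc_type bjacobi -sub_pc_type ilu
>
> or -pc_type asm -sub_pc_type ilu for overlapping additive Schwarz.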
>
>> I have tested my code with scalar ILU. However, no KSP could converge.
>
> There are no guarantees.  See src/ksp/pc/examples/tutorials/ex1.c which tests with Kershaw's matrix, a 4x4 sparse SPD matrix where incomplete factorization yields an indefinite preconditioner.
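>
> (For reference, the Kershaw matrix used in that example is, if I
> recall it correctly, the 4x4 SPD matrix
>
>   [ 3 -2  0  2]
>   [-2  3 -2  0]
>   [ 0 -2  3 -2]
>   [ 2  0 -2  3]
>
> whose incomplete Cholesky/ILU(0) factorization produces a negative
> pivot even though the matrix itself is positive definite.)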
>
>> Also, there are no zeros on the diagonal, at least in the cases
>> that I am currently simulating; however, I will recheck.
>> Additionally, I am going to run a limited test with the available
>> BILU (ILU with BAIJ matrices) to see whether it works if I keep my
>> block sizes constant.
>
> Good.  We should understand why that works or doesn't work before proceeding.
>
>> Since PCVPBJACOBI had limited success, I feel it is going to work.
>>
>> Best wishes, Ali
>>
>> -----Original Message-----
>> From: Jed Brown <jed at jedbrown.org>
>> Sent: Tuesday, December 04, 2018 6:22 PM
>> To: Alireza Khazali <arkhazali at cc.iut.ac.ir>; Smith, Barry F.
>> <bsmith at mcs.anl.gov>
>> Cc: petsc-dev at mcs.anl.gov
>> Subject: Re: Re[2]: [petsc-dev] Implementing of a variable block size 
>> BILU preconditioner
>>
>> Alireza Khazali <arkhazali at cc.iut.ac.ir> writes:
>>
>>> Dear Barry,
>>>
>>>
>>> Thanks for your kind answer. You are right; I will try to write my own block ILU with the properties I need. However, my primary problem is matrix storage. As I understand it, each process keeps a portion of the matrix and has access to that portion only, and I have not found any routine that lets one process access the portion of the matrix stored in the memory of another process. However, some algorithms, such as ILU, need that access, and despite spending much time investigating the code, I still do not understand how it is done.
>>
>> Even scalar ILU in PETSc is used inside domain decomposition, such as block Jacobi or additive Schwarz.  Hypre has a parallel ILU, but the parallel scalability is bad.  This is pretty fundamental to ILU: it is not a good parallel algorithm.
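>>
>> If you want to set that up in code rather than via the options
>> database, the usual pattern is something like the following sketch
>> (assuming ksp is your KSP; the sub-KSPs only exist after KSPSetUp()):
>>
>>   PC       pc, subpc;
>>   KSP      *subksp;
>>   PetscInt i, nlocal, first;
>>   ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>>   ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
>>   ierr = KSPSetUp(ksp);CHKERRQ(ierr);
>>   /* One local block per process by default; each gets its own KSP/PC */
>>   ierr = PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
>>   for (i = 0; i < nlocal; i++) {
>>     ierr = KSPGetPC(subksp[i], &subpc);CHKERRQ(ierr);
>>     ierr = PCSetType(subpc, PCILU);CHKERRQ(ierr);
>>   }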
>>
>>> I am a very experienced programmer in the field of numerical analysis, but PETSc is a huge and complicated code. Therefore, I apologize for taking your precious time if you find the solution to my problem obvious, but I would be really grateful if you could give me a hint.
>>>
>>>
>>>
>>>
>>> Dear Jed,
>>>
>>>
>>> Thank you for your kind answer. I have a multi-component fluid flow
>>> simulator, which produces valid results using a direct solver (like MKL
>>> DGESV). However, direct solvers become useless as the problem size
>>> grows, and no other available combination of PC/KSP could solve the
>>> system.
>>>
>>> The simulator must solve a system of nonlinear PDEs for each node, and
>>> the primary unknowns are pressure and fluid compositions. However, at
>>> some pressures the fluid is vaporized/liquefied (i.e., it undergoes a
>>> phase change), and the number of PDEs for that node increases (phase
>>> equilibrium equations have to be solved for that node, too). Therefore,
>>> in the discretized and then linearized system, we have a block of
>>> equations for each node, but the block size varies, depending on the
>>> fluid phase state in that node. Block preconditioners can handle the
>>> problem, but only if they are designed for such a variable-block-size
>>> matrix. Thanks to Barry, we have a variable-block-size block Jacobi
>>> preconditioner (PCVPBJACOBI), but it does not provide the required
>>> precision in a few cases, and more effective preconditioning is needed.
>>> Also, I have found that others may need such variable block size
>>> handling, as can be seen in the PETSc mailing lists:
>>
>> I understand that you have variable sized blocks, but why not use scalar ILU?  Is it failing due to zeros on the diagonal and you have evidence that blocking fixes that?  If so, what ordering are you using for your fields?  Putting the dual variable (often pressure) last often works.
>> Note that incomplete factorization can break down even for SPD matrices.
>>
>> I keep asking on this point because block ILU(0) is algebraically equivalent to scalar ILU(0) on a matrix with the same nonzero pattern, modulo handling of singular blocks that is hard to achieve with scalar ILU.
>>
>>> https://lists.mcs.anl.gov/pipermail/petsc-users/2011-October/010491.html
>>>
>>> https://lists.mcs.anl.gov/pipermail/petsc-users/2018-August/036028.html
>>>
>>> Since I have always loved to contribute to open source projects, and PETSc helped me a lot in my other research, I decided to add a variable-block-size ILU preconditioner to PETSc. However, PETSc is too complicated, and I cannot accomplish such a task efficiently without help.
>>>
>>>
>>>
>>>
>>> Many thanks,
>>>
>>> Ali
>>> ----- Original Message -----
>>>
>>>
>>> From: Jed Brown (jed at jedbrown.org)
>>> Date: 13/09/97 05:09
>>> To: Smith, Barry F. (bsmith at mcs.anl.gov), Ali Reza Khaz'ali
>>> (arkhazali at cc.iut.ac.ir)
>>> Cc: petsc-dev at mcs.anl.gov
>>> Subject: Re: [petsc-dev] Implementing of a variable block size BILU 
>>> preconditioner
>>>
>>>
>>>
>>>
>>>
>>> "Smith, Barry F. via petsc-dev" <petsc-dev at mcs.anl.gov <mailto:petsc-dev at mcs.anl.gov> > writes:
>>>
>>>>> On Dec 3, 2018, at 4:49 PM, Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir> wrote:
>>>>> 
>>>>> Hi,
>>>>> 
>>>>> I think that this topic is more suited for PETSc developers than its users; therefore, I am moving it to the dev list.
>>>>> 
>>>>> Continuing the discussion on implementing a variable-block-size BILU preconditioner, would it be possible to change the block size parameter (bs) of the BAIJ format such that it can handle variable block sizes (i.e., instead of being a scalar, it could be an array)? Although BILU does not necessarily require rectangular blocks, I think it leads to less messy code.
>>>>
>>>>    That is an alternative to using the AIJ format. The problem with this approach is that you will need to write a lot of code for the variable-block-size BAIJ: MatSetValues_SeqVBAIJ, MatMult_SeqVBAIJ, etc. If you reuse AIJ instead, you only need to write new factorization and solve routines (much less code).
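>>>>
>>>>    To make the storage concrete, a variable-block analogue of BAIJ
>>>> might look like the following self-contained sketch (plain C; all
>>>> names are hypothetical, not existing PETSc code):
>>>>
>>>>   /* Block-CSR with per-block-row sizes bs[i]; block (I,J) is stored
>>>>      dense, row-major, with bs[I]*bs[J] entries. */
>>>>   typedef struct {
>>>>     int     nbrows;  /* number of block rows                  */
>>>>     int    *bs;      /* bs[i] = size of block row/column i    */
>>>>     int    *rowoff;  /* first scalar row of block row i       */
>>>>     int    *bi;      /* block-CSR row pointers                */
>>>>     int    *bj;      /* block column indices                  */
>>>>     int    *voff;    /* offset of each block's values in a[]  */
>>>>     double *a;       /* packed block entries                  */
>>>>   } VBMat;
>>>>
>>>>   /* y = A*x over variable-size blocks */
>>>>   static void VBMatMult(const VBMat *A, const double *x, double *y)
>>>>   {
>>>>     for (int I = 0; I < A->nbrows; I++) {
>>>>       int     m  = A->bs[I];
>>>>       double *yI = y + A->rowoff[I];
>>>>       for (int r = 0; r < m; r++) yI[r] = 0.0;
>>>>       for (int k = A->bi[I]; k < A->bi[I+1]; k++) {
>>>>         int           J   = A->bj[k], n = A->bs[J];
>>>>         const double *blk = A->a + A->voff[k];
>>>>         const double *xJ  = x + A->rowoff[J];
>>>>         for (int r = 0; r < m; r++)
>>>>           for (int c = 0; c < n; c++)
>>>>             yI[r] += blk[r*n + c] * xJ[c];
>>>>       }
>>>>     }
>>>>   }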
>>>
>>> Sure, but the result isn't really different (modulo associativity) 
>>> from normal ILU applied to a block matrix (at least unless you start 
>>> considering fill with incomplete blocks).
>>>
>>> Ali, what are you hoping to achieve with variable block ILU?
>>>
>>>>> Also, being a newbie on PETSc code, I do not understand some parts of the code, especially distributed matrix storage and some of the implemented numerical algorithms. Is there any reference that I can use for this?
>>>>
>>>>    There is a little discussion in the closing chapters of the users manual, and you should also read the developers manual. But there is not a lot of detail except in the actual code.
>>>>
>>>>     Barry
>>>>
>>>>> 
>>>>>  
>>>>> --
>>>>> Ali Reza Khaz’ali
>>>>> Assistant Professor of Petroleum Engineering
>>>>> Department of Chemical Engineering
>>>>> Isfahan University of Technology
>>>>> Isfahan, Iran
>>>>>
