Dear Mark,

Thanks for your kind answer and good suggestion.

The matrix is a block matrix whose sparse blocks have dimension 5.

As for the scales, the small numbers do get lost, even though they are important. In fact, that is one of my most serious problems with this system. Do you have any recommendation for alleviating it?

From: Mark Adams <mfadams@lbl.gov>
Sent: Wednesday, December 05, 2018 5:22 PM
To: arkhazali@cc.iut.ac.ir
Cc: For users of the development version of PETSc <petsc-dev@mcs.anl.gov>
Subject: Re: [petsc-dev] FW: Re[2]: Implementing of a variable block size BILU preconditioner

If you zero out a row, then put something on the diagonal.

And your matrix data file (it does not look like it has any sparsity metadata) has entries spanning about 18 orders of magnitude. When you diagonally scale, which most solvers implicitly do, it looks like some of these numbers will simply vanish, and you will not get correct results if they are relevant.
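For reference, a minimal sketch of what symmetric Jacobi (diagonal) equilibration can look like in PETSc, assuming an assembled square matrix A with a zero-free diagonal and a right-hand side b; the helper name EquilibrateSystem is made up for illustration, and where the diagonal does have zeros, a row norm from MatGetRowMaxAbs() could stand in for |a_ii|:

#include <petscksp.h>

/* Sketch: symmetric Jacobi equilibration, A <- D^{-1/2} A D^{-1/2} and
   b <- D^{-1/2} b with D = diag(|a_ii|). Assumes a_ii != 0 for all i.
   The returned vector d holds 1/sqrt(|a_ii|); after KSPSolve() on the
   scaled system, x <- d .* x recovers the unknowns of the original one. */
static PetscErrorCode EquilibrateSystem(Mat A, Vec b, Vec *dout)
{
  Vec            d;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatCreateVecs(A, &d, NULL);CHKERRQ(ierr);
  ierr = MatGetDiagonal(A, d);CHKERRQ(ierr);
  ierr = VecSqrtAbs(d);CHKERRQ(ierr);              /* d_i = sqrt(|a_ii|)   */
  ierr = VecReciprocal(d);CHKERRQ(ierr);           /* d_i = 1/sqrt(|a_ii|) */
  ierr = MatDiagonalScale(A, d, d);CHKERRQ(ierr);  /* scale rows and cols  */
  ierr = VecPointwiseMult(b, d, b);CHKERRQ(ierr);  /* scale the RHS        */
  *dout = d;
  PetscFunctionReturn(0);
}

Mark's first remark maps onto the diag argument of MatZeroRows(): calling MatZeroRows(A, n, rows, 1.0, NULL, NULL) leaves 1.0 on the diagonal of each zeroed row instead of an exact zero.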
On Tue, Dec 4, 2018 at 11:57 PM Ali Reza Khaz'ali via petsc-dev <petsc-dev@mcs.anl.gov> wrote:

Dear Jed,

It is ILU(0), since I did not set the levels. Also, no shift was applied. I have attached a Jacobian matrix for the smallest case that I can simulate with BILU. I've noticed that it has zeros on its diagonal, since I had to remove the phase equilibrium equations to make the block sizes equal. Additionally, after a discussion with one of my students, I am now convinced that zeros may temporarily appear on the diagonal of the Jacobian of the original code (with phase equilibria) during SNES iterations.

I do not know whether the attached Jacobian can be used for comparison purposes. Changing the preconditioner or the linear solver changes the convergence properties of SNES, and hence the simulator will adjust some of its other parameters (e.g., the time step size) to get the most accurate and fastest results. In other words, we won't have the same Jacobian as the one attached if scalar ILU is used.

Many thanks,
Ali

-----Original Message-----
From: Jed Brown <jed@jedbrown.org>
Sent: Wednesday, December 05, 2018 12:00 AM
To: Ali Reza Khaz'ali <arkhazali@cc.iut.ac.ir>; 'Smith, Barry F.' <bsmith@mcs.anl.gov>
Cc: petsc-dev@mcs.anl.gov
Subject: RE: Re[2]: [petsc-dev] Implementing of a variable block size BILU preconditioner

Ali Reza Khaz'ali <arkhazali@cc.iut.ac.ir> writes:

> Dear Jed,
>
> ILU with BAIJ works, and its performance in reducing the condition number is slightly better than PCVPBJACOBI. Thanks for your guidance.

Is it ILU(0)? Did you need to enable shifts? Can you write out a small matrix that succeeds with BAIJ/ILU but not with AIJ/ILU, so we can compare?
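For readers following the shift question, a sketch of how factorization shifts are typically enabled on a PETSc factorization preconditioner; the helper name ConfigureShiftedILU and the choice of MAT_SHIFT_NONZERO are illustrative assumptions, not a recommendation for this matrix:

#include <petscksp.h>

/* Sketch: request ILU(0) with a pivot shift so that zero (or tiny)
   diagonal entries encountered during factorization are perturbed
   instead of causing a breakdown. */
static PetscErrorCode ConfigureShiftedILU(KSP ksp)
{
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCILU);CHKERRQ(ierr);
  ierr = PCFactorSetLevels(pc, 0);CHKERRQ(ierr);   /* ILU(0) */
  /* or MAT_SHIFT_POSITIVE_DEFINITE / MAT_SHIFT_INBLOCKS */
  ierr = PCFactorSetShiftType(pc, MAT_SHIFT_NONZERO);CHKERRQ(ierr);
  ierr = PCFactorSetShiftAmount(pc, PETSC_DECIDE);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The command-line equivalents are -pc_factor_shift_type <nonzero|positive_definite|inblocks> and -pc_factor_shift_amount.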
> Best wishes,
> Ali
>
> -----Original Message-----
> From: Jed Brown <jed@jedbrown.org>
> Sent: Tuesday, December 04, 2018 9:40 PM
> To: Ali Reza Khaz'ali <arkhazali@cc.iut.ac.ir>; 'Smith, Barry F.' <bsmith@mcs.anl.gov>
> Cc: petsc-dev@mcs.anl.gov
> Subject: RE: Re[2]: [petsc-dev] Implementing of a variable block size BILU preconditioner
>
> Ali Reza Khaz'ali <arkhazali@cc.iut.ac.ir> writes:
>
>> Dear Jed,
>>
>> Thanks for your kind answer. I thought scalar BJACOBI does not need data from the other domains, but ILU does.
>
> There is no parallel ILU in PETSc.
>
> $ mpiexec -n 2 mpich-clang/tests/ksp/ksp/examples/tutorials/ex2 -pc_type ilu
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers
> [0]PETSC ERROR: Could not locate a solver package. Perhaps you must ./configure with --download-<package>
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.10.2-19-g217b8b62e2  GIT Date: 2018-10-17 10:34:59 +0200
> [0]PETSC ERROR: mpich-clang/tests/ksp/ksp/examples/tutorials/ex2 on a mpich-clang named joule by jed Tue Dec  4 11:02:53 2018
> [0]PETSC ERROR: Configure options --download-chaco --download-p4est --download-sundials --download-triangle --with-fc=0 --with-mpi-dir=/home/jed/usr/ccache/mpich-clang --with-visibility --with-x --with-yaml PETSC_ARCH=mpich-clang
> [0]PETSC ERROR: #1 MatGetFactor() line 4485 in /home/jed/petsc/src/mat/interface/matrix.c
> [0]PETSC ERROR: #2 PCSetUp_ILU() line 142 in /home/jed/petsc/src/ksp/pc/impls/factor/ilu/ilu.c
> [0]PETSC ERROR: #3 PCSetUp() line 932 in /home/jed/petsc/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #4 KSPSetUp() line 391 in /home/jed/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #5 KSPSolve() line 723 in /home/jed/petsc/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #6 main() line 201 in /home/jed/petsc/src/ksp/ksp/examples/tutorials/ex2.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -malloc_test
> [0]PETSC ERROR: -pc_type ilu
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 92) - process 0
>
>> I have tested my code with scalar ILU. However, no KSP could converge.
>
> There are no guarantees. See src/ksp/pc/examples/tutorials/ex1.c, which tests with Kershaw's matrix, a 4x4 sparse SPD matrix for which incomplete factorization yields an indefinite preconditioner.
>
>> Also, there are no zeros on the diagonal, at least in the current cases that I am simulating. However, I will recheck it. Additionally, I am going to run a limited test with the available BILU (ILU with BAIJ matrices) to see whether it works if I keep my block sizes constant.
>
> Good. We should understand why that works or doesn't work before proceeding.
>
>> Since PCVPBJACOBI had limited success, I feel it is going to work.
>>
>> Best wishes,
>> Ali
>>
>> -----Original Message-----
>> From: Jed Brown <jed@jedbrown.org>
>> Sent: Tuesday, December 04, 2018 6:22 PM
>> To: Alireza Khazali <arkhazali@cc.iut.ac.ir>; Smith, Barry F. <bsmith@mcs.anl.gov>
>> Cc: petsc-dev@mcs.anl.gov
>> Subject: Re: Re[2]: [petsc-dev] Implementing of a variable block size BILU preconditioner
>>
>> Alireza Khazali <arkhazali@cc.iut.ac.ir> writes:
>>
>>> Dear Barry,
>>>
>>> Thanks for your kind answer. You are right. I will try to write my own block ILU with the properties I need. However, my primary problem is the matrix storage. As I understand it, each process keeps a portion of a matrix and has access to that portion only. Additionally, I have not found any routine that lets one process access the portion of the matrix stored in the memory of another process. However, some algorithms like ILU need such access, and despite spending much time investigating the code, I still do not understand how this is done.
>>
>> Even scalar ILU in PETSc is used inside domain decomposition, such as block Jacobi or additive Schwarz. Hypre has a parallel ILU, but its parallel scalability is bad. This is pretty fundamental to ILU: it is not a good parallel algorithm.
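To illustrate that pattern, a sketch of ILU used as the subdomain solver inside block Jacobi (one block per MPI process by default); the helper name ConfigureBJacobiILU is made up:

#include <petscksp.h>

/* Sketch: "parallel ILU" in PETSc means ILU applied within each
   subdomain of a domain decomposition preconditioner such as
   PCBJACOBI (used here) or PCASM. */
static PetscErrorCode ConfigureBJacobiILU(KSP ksp)
{
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
  /* pick the subdomain solver from the options database, e.g.
     -sub_pc_type ilu -sub_pc_factor_levels 0                  */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Swapping in -pc_type asm -sub_pc_type ilu on the command line gives the additive Schwarz variant of the same idea.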
>>> I am a very experienced programmer in the field of numerical analysis, but PETSc is a very large and complicated code. Therefore, I apologize for taking your precious time if you find the solution to my problem too obvious, but I would be really grateful if you could give me a hint.
>>>
>>> Dear Jed,
>>>
>>> Thank you for your kind answer. I have a multi-component fluid flow simulator, which produces valid results using a direct solver (like MKL DGESV). However, direct solvers become useless as the problem size increases, and no other available combination of PC/KSP could solve the system. The simulator must solve a system of nonlinear PDEs for each node, and the primary unknowns are pressure and fluid compositions. However, at some pressures the fluid is vaporized/liquefied (i.e., undergoes a phase change), and the number of PDEs for that node increases (phase equilibrium equations have to be solved for that node, too). Therefore, in the discretized and then linearized system, we have a block of equations for each node, but the block size is variable, depending on the fluid phase status at that node. Block preconditioners can handle the problem, but only if they are designed for such a variable block size matrix. Thanks to Barry, we have a variable block size BJacobi preconditioner (PCVPBJACOBI), but it does not provide the required precision in a few cases, and more effective preconditioning is needed. Also, I have found that others may need such variable block size handling, as can be seen in the PETSc mailing lists:
>>
>> I understand that you have variable sized blocks, but why not use scalar ILU? Is it failing due to zeros on the diagonal, and do you have evidence that blocking fixes that? If so, what ordering are you using for your fields? Putting the dual variable (often pressure) last often works. Note that incomplete factorization can break down even for SPD matrices.
>>
>> I keep asking on this point because block ILU(0) is algebraically equivalent to scalar ILU(0) on a matrix with the same nonzero pattern, modulo the handling of singular blocks, which is hard to achieve with scalar ILU.
>>
>>> https://lists.mcs.anl.gov/pipermail/petsc-users/2011-October/010491.html
>>>
>>> https://lists.mcs.anl.gov/pipermail/petsc-users/2018-August/036028.html
>>>
>>> Since I have always loved to contribute to open source projects, and PETSc helped me a lot in my other research, I decided to add a variable block size ILU preconditioner to PETSc. However, PETSc is too complicated, and I cannot accomplish such a task efficiently without help.
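For orientation, a hedged sketch of how the existing variable block size machinery is driven: MatSetVariableBlockSizes() records the per-node block layout that PCVPBJACOBI consumes. The helper name ConfigureVPBJacobi and the sizes array (5 unknowns per node, 7 where phase equilibrium equations are active) are invented for illustration; the sizes must sum to the local number of rows:

#include <petscksp.h>

/* Sketch: register variable block sizes on the (AIJ) Jacobian and
   select the variable-point-block Jacobi preconditioner. A variable
   block ILU would presumably consume the same block metadata. */
static PetscErrorCode ConfigureVPBJacobi(KSP ksp, Mat A)
{
  PC             pc;
  PetscInt       bsizes[4] = {5, 7, 5, 7};  /* illustrative: 24 local rows */
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatSetVariableBlockSizes(A, 4, bsizes);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCVPBJACOBI);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}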
>>> Many thanks,
>>>
>>> Ali
>>>
>>> ----- Original Message -----
>>>
>>> From: Jed Brown (jed@jedbrown.org)
>>> Date: 13/09/97 05:09
>>> To: Smith, Barry F. (bsmith@mcs.anl.gov), Ali Reza Khaz'ali (arkhazali@cc.iut.ac.ir)
>>> Cc: petsc-dev@mcs.anl.gov
>>> Subject: Re: [petsc-dev] Implementing of a variable block size BILU preconditioner
>>>
>>> "Smith, Barry F. via petsc-dev" <petsc-dev@mcs.anl.gov> writes:
>>>
>>>>> On Dec 3, 2018, at 4:49 PM, Ali Reza Khaz'ali <arkhazali@cc.iut.ac.ir> wrote:
>>>>>
>>>>> Hi,
>>>>>
>>>>> I think this topic is more suited to the PETSc developers than its users, so I am moving it to the dev list.
>>>>>
>>>>> Continuing the discussion on implementing a variable block size BILU preconditioner: would it be possible to change the block size parameter (bs) of the BAIJ format so that it can handle variable block sizes (i.e., instead of being a scalar, it would be an array)? Although BILU does not necessarily require rectangular blocks, I think it leads to less messy code.
>>>>
>>>> That is an alternative to using the AIJ format. The problem with this approach is that you would need to write a lot of code for the variable block size BAIJ: MatSetValues_SeqVBAIJ, MatMult_SeqVBAIJ, etc., whereas if you reuse AIJ you only need to write new factorization and solve routines (much less code).
>>>
>>> Sure, but the result isn't really different (modulo associativity) from normal ILU applied to a block matrix (at least unless you start considering fill with incomplete blocks).
>>>
>>> Ali, what are you hoping to achieve with variable block ILU?
>>>
>>>>> Also, being a newbie to the PETSc code, I do not understand some parts of it, especially the distributed matrix storage and some of the implemented numerical algorithms. Is there any reference that I can use for this?
>>>>
>>>> There is a little discussion in the final chapters of the users manual, plus you should read the developers manual. But there is not a lot of detail except in the actual code.
>>>>
>>>> Barry
>>>>
>>>>> --
>>>>> Ali Reza Khaz’ali
>>>>> Assistant Professor of Petroleum Engineering
>>>>> Department of Chemical Engineering
>>>>> Isfahan University of Technology
>>>>> Isfahan, Iran