From elias.karabelas at uni-graz.at  Sat Apr  1 08:26:31 2023
From: elias.karabelas at uni-graz.at (Elias Karabelas)
Date: Sat, 1 Apr 2023 15:26:31 +0200
Subject: [petsc-users] Augmented Linear System
In-Reply-To:
References: <2267f28c-ec43-66b8-43dd-29b4c6288478@uni-graz.at>
Message-ID:

Dear Mark,

I suspected that all bets would be off with my attempt. So, just to illustrate what I meant by my second approach: I can start from the second line of the block system and rewrite it as

D dp = f2 - C du

Then I tried a pseudoinverse (for lack of alternatives I did this with an SVD) and get (denoting the pseudoinverse by D^+)

dp = D^+ (f2 - C du)

Plugging that into the first line I get

(K - B D^+ C) du = f1 - B D^+ f2

The left-hand side I put into a MATSHELL, where I store D^+ and the 8 vectors for B and C as a user ctx and update them on the fly during Newton. However, as you said, using this matrix as Amat and pure K as Pmat in a KSP fails more often than it succeeds, so I guess this version of the Schur complement is not really well-behaved.
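(For concreteness, a minimal sketch of what the MatMult of such a MATSHELL could look like. All names are illustrative, the four-pressure layout and a stored dense D^+ are assumed, and recent-PETSc error handling via PetscCall() is used where older versions would use ierr/CHKERRQ:)

    typedef struct {
      Mat         K;            /* assembled stiffness block              */
      Vec         B[4], C[4];   /* coupling columns/rows kept as Vecs     */
      PetscScalar Dplus[4][4];  /* pseudoinverse of D, e.g. from the SVD  */
    } SchurCtx;

    static PetscErrorCode MatMult_Schur(Mat A, Vec x, Vec y)
    {
      SchurCtx   *ctx;
      PetscScalar cx[4], w[4];

      PetscFunctionBeginUser;
      PetscCall(MatShellGetContext(A, &ctx));
      PetscCall(MatMult(ctx->K, x, y));                 /* y = K x        */
      for (PetscInt i = 0; i < 4; i++) PetscCall(VecDot(ctx->C[i], x, &cx[i])); /* c = C x */
      for (PetscInt i = 0; i < 4; i++) {                /* w = D^+ c      */
        w[i] = 0.0;
        for (PetscInt j = 0; j < 4; j++) w[i] += ctx->Dplus[i][j] * cx[j];
      }
      for (PetscInt j = 0; j < 4; j++) PetscCall(VecAXPY(y, -w[j], ctx->B[j])); /* y -= B w */
      PetscFunctionReturn(PETSC_SUCCESS);
    }

    /* hooked up along the lines of:
         MatCreateShell(comm, m, m, M, M, &ctx, &A);
         MatShellSetOperation(A, MATOP_MULT, (void (*)(void))MatMult_Schur);
         KSPSetOperators(ksp, A, K);    K as Pmat, as described above     */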
And thanks for the advice regarding Walker et al. What I meant here was that in the outer Newton I wanted to do an inexact Newton satisfying

|| F(x) + F'(x) dx || <= eta_k || F(x) ||

with F(x) being my residual comprised of the two vectors and F'(x) my block system. Now I can select eta_k according to SOME rules, but I haven't found anything on how to choose the tolerances in my KSP to finally arrive at this inequality. The reason for doing that in the first place was that I wanted to save iteration counts on my total solves, as K is non-symmetric and I have to use GMRES (preferred) or BiCGSTAB (not so preferred).

Cheers
Elias

On 31.03.2023 at 15:25, Mark Adams wrote:
>
> On Fri, Mar 31, 2023 at 4:58 AM Karabelas, Elias
> (elias.karabelas at uni-graz.at) wrote:
>
>     Hi Mark,
>
>     thanks for the input, however I didn't quite get what you mean.
>
>     Maybe I should be a bit more precise about what I want to achieve and why:
>
>     So this specific form of block system arises in a biomedical
>     application that colleagues and I published in
>     https://www.sciencedirect.com/science/article/pii/S0045782521004230
>     (the interesting part is Appendix B.3).
>
>     It boils down to a Newton method for solving nonlinear mechanics
>     describing the motion of the human heart, which is coupled on some
>     Neumann surfaces (the surfaces of the inner cavities of each
>     blood pool in the heart) with a pressure that comes from a
>     complicated 0D ODE model that describes cardiovascular physiology.
>     This comes to look like
>
>     | F_1(u,p) |   | 0 |
>     |          | = |   |
>     | F_2(u,p) |   | 0 |
>
>     where F_1 is the residual of nonlinear mechanics plus a nonlinear
>     boundary coupling term and F_2 is a coupling term to the ODE
>     system. In this case u is displacement and p is the pressure
>     calculated from the ODE model (one for each cavity in the heart,
>     which gives four).
>
>     After linearization, we arrive exactly at the aforementioned
>     block system, which we solve at the moment by a Schur complement
>     approach based on K. Using this we can get by without a MATSHELL
>     for the Schur complement, as we can just apply the KSP for K five
>     times in order to approximate the solution of the Schur complement
>     system.
>
> So you compute an explicit Schur complement (4 solves) and then the
> real solve uses 1 more K solve.
> I think this is pretty good as is. You are lucky with only 4 of these
> pressure equations.
> I've actually done this on a problem with 100s of extra equations
> (surface averaging equations), but that problem was linear and ran for
> 1000s of time steps, or more, so this huge setup cost was amortized.
>
>     However here it gets tricky: the outer Newton loop works with an
>     inexact Newton method using a forcing term from Walker et al. So
>     basically the atol and rtol of the KSP are not constant but vary,
>     and I guess this will influence how well we actually resolve the
>     solution of the Schur complement system (I tried to find works that
>     explain how to choose forcing terms in this case but found none).
>
> Honestly, Walker is a great guy, but I would not get too hung up on this.
> I've done a lot of plasticity work long ago and gave up on Walker et
> al. Others have had the same experience.
> What is new with your problem is how accurately you want the Schur
> complement (4) solves.
>
>     This brought me to think whether we can do this the other way around
>     and do a pseudo-inverse of D, because it's 4x4 and there is no need
>     for a KSP there. I did a test implementation with a MATSHELL that
>     realizes (K - B D^+ C) and used just K for building a GAMG prec;
>     however this fails spectacularly, because D^+ can behave very badly,
>     whereas the other way around I have (C K^-1 B - D), which behaves
>     more like a singular perturbation of the matrix C K^-1 B and is
>     nicer. So here I stopped investigating, because my PETSc expertise
>     is not bad but certainly not good enough to judge which approach
>     would pay off more in terms of runtime (my gut feeling was that
>     building a MATSHELL requires only one KSP solve vs the other 5).
>
>     However I'm happy to hear some alternatives that I could pursue in
>     order to speed up our current solver strategy or even be able to
>     build a nice MATSHELL.
>
> OK, so you have tried what I was alluding to.
> I don't follow what you did exactly and have not worked it out, but
> there should be an iteration on the pressure equation with a (lagged)
> Schur solve as a preconditioner.
> But with only 4 extra solves in your case, I don't think it is worth
> it unless you want to write solver papers.
> And AMG in general really has to have a normal PDE, e.g. the K^-1
> solve, and if K is too far away from the Laplacian (or elasticity)
> then all bets are off.
> Good luck,
> Mark
>
>     Thanks
>     Elias
>
>     On 30.03.23 at 19:41, Mark Adams wrote:
>>     You can lag the update of the Schur complement and use your
>>     solver as a preconditioner.
>>     If your problems don't change much you might converge fast enough
>>     (i.e., < 4 iterations with one solve per iteration), but what you
>>     have is not bad if the size of your auxiliary p space does not
>>     grow.
>>
>>     Mark
>>
>>     On Thu, Mar 30, 2023 at 11:56 AM Karabelas, Elias
>>     (elias.karabelas at uni-graz.at) wrote:
>>
>>         Dear Community,
>>
>>         I have a linear system of the form
>>
>>         | K B | | du |   | f1 |
>>         |     | |    | = |    |
>>         | C D | | dp |   | f2 |
>>
>>         where K is a big m x m sparse matrix that comes from some FE
>>         discretization, B is a coupling matrix (of the form m x 4), C
>>         is of the form 4 x m, and D is 4 x 4.
>>
>>         I save B and C as 4 Vecs and D as a 4x4 double array. D might
>>         be singular, so at the moment I use the following Schur
>>         complement approach to solve this system:
>>
>>         1) Calculate the vecs v1 = KSP(K,PrecK) * f1 and
>>         invB = [ KSP(K,PrecK) * B[0], KSP(K,PrecK) * B[1],
>>         KSP(K,PrecK) * B[2], KSP(K,PrecK) * B[3] ]
>>
>>         2) Build the Schur complement S = C K^-1 B - D via VecDots:
>>         (C K^-1 B)[i,j] = VecDot(C[i], invB[j])
>>
>>         3) Invert S (this seems to be mostly non-singular) to get dp
>>
>>         4) Calculate du with dp
>>
>>         So counting this, I need 5 calls to KSP, which can be
>>         expensive, and I thought of somehow doing the Schur complement
>>         the other way around; however, due to the (possible)
>>         singularity of D this seems like a bad idea (even using a
>>         pseudoinverse).
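(For readers skimming the thread, the five-solve procedure in the quoted steps 1-4 might look roughly like this in PETSc terms. kspK, B, C, D, and the 4-entry array f2 are illustrative names, the work vectors are assumed to have been created with VecDuplicate(), and the dense 4x4 solve is left schematic:)

    Vec         v1, invB[4];   /* created earlier via VecDuplicate(f1, ...)    */
    PetscScalar S[4][4], rhs[4];

    PetscCall(KSPSolve(kspK, f1, v1));                       /* v1 = K^-1 f1   */
    for (PetscInt j = 0; j < 4; j++) PetscCall(KSPSolve(kspK, B[j], invB[j])); /* K^-1 B */

    for (PetscInt i = 0; i < 4; i++) {
      for (PetscInt j = 0; j < 4; j++) {
        PetscCall(VecDot(invB[j], C[i], &S[i][j]));          /* (C K^-1 B)[i][j] */
        S[i][j] -= D[i][j];                                  /* S = C K^-1 B - D */
      }
      PetscCall(VecDot(v1, C[i], &rhs[i]));                  /* rhs = C K^-1 f1  */
      rhs[i] -= f2[i];                                       /*       ... - f2   */
    }
    /* step 3: solve the dense 4x4 system S dp = rhs (LAPACK or by hand);  */
    /* step 4: du = v1 - sum_j dp[j] * invB[j], i.e. a loop of VecAXPYs.   */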
>>         Two things puzzle me here still:
>>
>>         A) Can this be done more efficiently?
>>
>>         B) In case my above matrix is the Jacobian in a Newton method,
>>         how do I make sure with any form of Schur complement approach
>>         that I hit the correct residual reduction?
>>
>>         Thanks
>>
>>         Elias

--
Dr. Elias Karabelas
Universitätsassistent | PostDoc

Institut für Mathematik & Wissenschaftliches Rechnen | Institute of Mathematics & Scientific Computing
Universität Graz | University of Graz

Heinrichstraße 36, 8010 Graz
Tel.:   +43/(0)316/380-8546
E-Mail: elias.karabelas at uni-graz.at
Web:    https://ccl.medunigraz.at

Please consider the environment before printing this e-mail. Thank you!

From mark at resfrac.com  Sat Apr  1 20:48:05 2023
From: mark at resfrac.com (Mark McClure)
Date: Sat, 1 Apr 2023 18:48:05 -0700
Subject: [petsc-users] MPI linear solver reproducibility question
Message-ID:

Hello,

I have been a user of Petsc for quite a few years, though I haven't updated my version in a few years, so it's possible that my comments below could be 'out of date'.

Several years ago, I'd asked you guys about reproducibility. I observed that if I gave an identical matrix to the Petsc linear solver, I would get a bit-wise identical result back if running on one processor, but if I ran with MPI, I would see differences at the final sig figs, below the convergence criterion. Even if rerunning the same exact calculation on the same exact machine.

I.e., with repeated tests, it was always converging to the same answer 'within convergence tolerance', but not consistent in the sig figs beyond the convergence tolerance.

At the time, the response was that this was unavoidable and related to the issue that machine arithmetic is not associative, so the timing of when processors were recombining information (which was random, effectively a race condition) was causing these differences.

Am I remembering correctly? And, if so, is this still a property of the Petsc linear solver with MPI, and is there now any option available to resolve it? I would be willing to accept a performance hit in order to get guaranteed bitwise consistency, even when running with MPI.

I am using the solver KSPBCGS, without a preconditioner. This is the selection because several years ago, I did testing, and found that on the particular linear systems that I am usually working with, this solver (with no preconditioner) was the most robust, in terms of consistently converging, and in terms of performance. Actually, I also tested a variety of other linear solvers other than Petsc (including other implementations of BiCGStab), and found that the Petsc BCGS was the best performer. Though, I'm curious, have there been updates to that algorithm in recent years, where I should consider updating to a newer Petsc build and comparing?

Best regards,
Mark McClure

From jed at jedbrown.org  Sat Apr  1 23:05:16 2023
From: jed at jedbrown.org (Jed Brown)
Date: Sat, 01 Apr 2023 22:05:16 -0600
Subject: [petsc-users] MPI linear solver reproducibility question
In-Reply-To:
References:
Message-ID: <87fs9jj5dv.fsf@jedbrown.org>

If you use unpreconditioned BCGS and ensure that you assemble the same matrix (depends how you do the communication for that), I think you'll get bitwise reproducible results when using an MPI that follows the suggestion for implementers about determinism. Beyond that, it'll depend somewhat on the preconditioner.

If you like BCGS, you may want to try BCGSL, which has a longer memory and tends to be more robust. But preconditioning is usually critical and the place to devote most effort.

Mark McClure writes:

> Hello,
>
> I have been a user of Petsc for quite a few years [...]
From mark at resfrac.com  Sat Apr  1 23:31:20 2023
From: mark at resfrac.com (Mark McClure)
Date: Sat, 1 Apr 2023 21:31:20 -0700
Subject: [petsc-users] MPI linear solver reproducibility question
In-Reply-To: <87fs9jj5dv.fsf@jedbrown.org>
References: <87fs9jj5dv.fsf@jedbrown.org>
Message-ID:

Thank you, I will try BCGSL. And good to know that this is worth pursuing, and that it is possible. Step 1, I guess I should upgrade to the latest release of Petsc.

How can I make sure that I am "using an MPI that follows the suggestion for implementers about determinism"? I am using MPICH version 3.3a2.

I am pretty sure that I'm assembling the same matrix every time, but I'm not sure how it would depend on 'how you do the communication'. Each process is doing a series of MatSetValues with INSERT_VALUES, assembling the matrix by rows. My understanding of this process is that it'd be deterministic.

On Sat, Apr 1, 2023 at 9:05 PM Jed Brown wrote:

> If you use unpreconditioned BCGS and ensure that you assemble the same
> matrix (depends how you do the communication for that), I think you'll get
> bitwise reproducible results when using an MPI that follows the suggestion
> for implementers about determinism. [...]
From jed at jedbrown.org  Sat Apr  1 23:53:26 2023
From: jed at jedbrown.org (Jed Brown)
Date: Sat, 01 Apr 2023 22:53:26 -0600
Subject: [petsc-users] MPI linear solver reproducibility question
In-Reply-To:
References: <87fs9jj5dv.fsf@jedbrown.org>
Message-ID: <874jpykhq1.fsf@jedbrown.org>

Mark McClure writes:

> How can I make sure that I am "using an MPI that follows the suggestion for
> implementers about determinism"? I am using MPICH version 3.3a2.
>
> I am pretty sure that I'm assembling the same matrix every time, but I'm
> not sure how it would depend on 'how you do the communication'. Each
> process is doing a series of MatSetValues with INSERT_VALUES,
> assembling the matrix by rows. My understanding of this process is that
> it'd be deterministic.

In the typical FD implementation, you only set local rows, but with FE and sometimes FV, you also create values that need to be communicated and summed on other processors.
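(A short illustration of the distinction being drawn here, with illustrative names:)

    /* FD-style: each rank inserts only into rows it owns; there are no
       off-process contributions, hence no summation order at all.        */
    PetscCall(MatSetValues(A, 1, &row, ncols, cols, vals, INSERT_VALUES));

    /* FE/FV-style: an element integral also touches rows owned by a
       neighboring rank; those values are stashed and summed during
       assembly, which is where MPI message ordering can enter.           */
    PetscCall(MatSetValues(A, nr, elemRows, nr, elemRows, Ke, ADD_VALUES));

    PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));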
From mark at resfrac.com  Sun Apr  2 00:03:48 2023
From: mark at resfrac.com (Mark McClure)
Date: Sat, 1 Apr 2023 22:03:48 -0700
Subject: [petsc-users] MPI linear solver reproducibility question
In-Reply-To: <874jpykhq1.fsf@jedbrown.org>
References: <87fs9jj5dv.fsf@jedbrown.org> <874jpykhq1.fsf@jedbrown.org>
Message-ID:

In the typical FD implementation, you only set local rows, but with FE and sometimes FV, you also create values that need to be communicated and summed on other processors.

Makes sense.

Anyway, in this case, I am certain that I am giving the solver bitwise identical matrices from each process. I am not using a preconditioner, using BCGS, with Petsc version 3.13.3.

So then, how can I make sure that I am "using an MPI that follows the suggestion for implementers about determinism"? I am using MPICH version 3.3a2, and didn't do anything special when installing it. Does that sound OK? If so, I could upgrade to the latest Petsc, try again, and if confirmed that it persists, could provide a reproduction scenario.

On Sat, Apr 1, 2023 at 9:53 PM Jed Brown wrote:

> In the typical FD implementation, you only set local rows, but with FE and
> sometimes FV, you also create values that need to be communicated and
> summed on other processors.

From jed at jedbrown.org  Sun Apr  2 08:31:07 2023
From: jed at jedbrown.org (Jed Brown)
Date: Sun, 02 Apr 2023 07:31:07 -0600
Subject: [petsc-users] MPI linear solver reproducibility question
In-Reply-To:
References: <87fs9jj5dv.fsf@jedbrown.org> <874jpykhq1.fsf@jedbrown.org>
Message-ID: <871ql2jtr8.fsf@jedbrown.org>

Vector communication used a different code path in 3.13. If you have a reproducer with current PETSc, I'll have a look. Here's a demo that the solution is bitwise identical (the sha256sum is the same every time you run it, though it might be different on your computer from mine due to compiler version and flags).

$ mpiexec -n 8 ompi/tests/snes/tutorials/ex5 -da_refine 3 -snes_monitor -snes_view_solution binary && sha256sum binaryoutput
  0 SNES Function norm 1.265943996096e+00
  1 SNES Function norm 2.831564838232e-02
  2 SNES Function norm 4.456686729809e-04
  3 SNES Function norm 1.206531765776e-07
  4 SNES Function norm 1.740255643596e-12
5410f84e91a9db3a74a2ac33603bbbb1fb48e7eaf739614192cfd53344517986  binaryoutput
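(The same check can be scripted from user code by writing the converged solution with a binary viewer and hashing the file across runs; a sketch, with x standing for the solution Vec:)

    PetscViewer viewer;
    PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "solution.bin", FILE_MODE_WRITE, &viewer));
    PetscCall(VecView(x, viewer));
    PetscCall(PetscViewerDestroy(&viewer));
    /* then compare repeated runs with: sha256sum solution.bin */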
Mark McClure writes:

> Anyway, in this case, I am certain that I am giving the solver bitwise
> identical matrices from each process. I am not using a preconditioner,
> using BCGS, with Petsc version 3.13.3. [...]

From mark at resfrac.com  Sun Apr  2 11:35:50 2023
From: mark at resfrac.com (Mark McClure)
Date: Sun, 2 Apr 2023 09:35:50 -0700
Subject: [petsc-users] MPI linear solver reproducibility question
In-Reply-To: <871ql2jtr8.fsf@jedbrown.org>
References: <87fs9jj5dv.fsf@jedbrown.org> <874jpykhq1.fsf@jedbrown.org> <871ql2jtr8.fsf@jedbrown.org>
Message-ID:

Ok, good to know. I'll update to latest Petsc, and do some testing, and let you know either way.

On Sun, Apr 2, 2023 at 6:31 AM Jed Brown wrote:

> Vector communication used a different code path in 3.13. If you have a
> reproducer with current PETSc, I'll have a look. [...]

From tt73 at njit.edu  Sun Apr  2 17:14:18 2023
From: tt73 at njit.edu (Takahashi, Tadanaga)
Date: Sun, 2 Apr 2023 18:14:18 -0400
Subject: [petsc-users] Question about NASM initialization
Message-ID:

Hello PETSc devs,

I am using SNES NASM with Newton LS on the sub-SNES. I was wondering how the sub-SNES chooses the initial guess during each NASM iteration. Is it using the previously computed solution or is it restarting from zero?
From s_g at berkeley.edu  Sun Apr  2 17:27:29 2023
From: s_g at berkeley.edu (Sanjay Govindjee)
Date: Mon, 3 Apr 2023 00:27:29 +0200
Subject: [petsc-users] Preallocation
Message-ID:

I was looking at the release notes for 3.19.0 and noted the comment:

    Deprecate all MatPreallocate* routines. These are no longer needed since non-preallocated matrices will now be as fast as using them

My interpretation of this is that I can now comment out all the MatPreallocate* lines in my code and it will run just fine (and be as fast as before), and no other changes are necessary -- with the side benefit of no longer having to maintain my preallocation code.

Have I read this correctly?

-sanjay

From bsmith at petsc.dev  Sun Apr  2 20:09:34 2023
From: bsmith at petsc.dev (Barry Smith)
Date: Sun, 2 Apr 2023 21:09:34 -0400
Subject: [petsc-users] Preallocation
In-Reply-To:
References:
Message-ID: <3F017926-4BD5-4180-93DD-816DBDDE5A83@petsc.dev>

Yes, but it would be interesting to see a comparison of timing between your current code and code with just a call to MatSetUp() and no calls to the preallocation routines.

  Barry

> On Apr 2, 2023, at 6:27 PM, Sanjay Govindjee wrote:
>
> I was looking at the release notes for 3.19.0 and noted the comment: [...]
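(A sketch of the comparison Barry suggests; here nnz stands for the old per-row preallocation array, and the only claim relied on is the release-note statement quoted above:)

    /* before: explicit preallocation, now deprecated                    */
    /* PetscCall(MatSeqAIJSetPreallocation(A, 0, nnz));                  */

    /* after: no preallocation call at all; MatSetValues() on an
       unpreallocated matrix now uses the new fast insertion path        */
    PetscCall(MatSetUp(A));
    /* ... identical MatSetValues() loops as before ...                  */
    PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));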
From xiongziming2010 at gmail.com  Mon Apr  3 04:03:01 2023
From: xiongziming2010 at gmail.com (ziming xiong)
Date: Mon, 3 Apr 2023 11:03:01 +0200
Subject: [petsc-users] Installation issues based on Petsc-pardiso
Message-ID:

Hello,

I'm trying to configure petsc (version 3.19.0) with Pardiso, but I keep getting errors, mainly in the blaslapack part. When I try

    --with-blaslapack-dir="/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest" \
    --with-mkl_pardiso-dir="/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest"

it prompts (Could not find a functional BLAS. Run with --with-blas-lib=<lib> to indicate the library containing BLAS. Or --download-f2cblaslapack=1 to have one automatically downloaded and installed).

When I tried only

    --with-blaslapack-lib="-L/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest/lib/intel64 mkl_intel_lp64_dll.lib mkl_sequential_dll.lib mkl_core_dll.lib"

it works. But it doesn't work with --with-mkl_pardiso-lib or --with-mkl_pardiso-dir, which give the error (MKL_Pardiso cannot work with 32 bit integers but 64 bit Blas/Lapack integers).

So what should I do?

Best regards,
Ziming XIONG

From knepley at gmail.com  Mon Apr  3 08:43:38 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 3 Apr 2023 09:43:38 -0400
Subject: [petsc-users] Installation issues based on Petsc-pardiso
In-Reply-To:
References:
Message-ID:

We cannot tell what is going on without the configure.log file for each case.

  Thanks,

     Matt

On Mon, Apr 3, 2023 at 5:03 AM ziming xiong wrote:

> Hello,
> I'm trying to configure petsc (version 3.19.0) with Pardiso, but I keep
> getting errors, mainly in the blaslapack part [...]

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
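(Pending the configure.log, one combination that keeps LP64, i.e. 32-bit-integer, BLAS/LAPACK together with MKL PARDISO would look roughly like the following. This is untested here, the paths are taken from the report above, and the reported error merely suggests that the --with-blaslapack-dir probe picked up ILP64 libraries:)

    ./configure \
      --with-blaslapack-lib="-L/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest/lib/intel64 mkl_intel_lp64_dll.lib mkl_sequential_dll.lib mkl_core_dll.lib" \
      --with-mkl_pardiso-include=/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest/include \
      --with-mkl_pardiso-lib="-L/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest/lib/intel64 mkl_intel_lp64_dll.lib mkl_sequential_dll.lib mkl_core_dll.lib"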
From liufield at gmail.com  Tue Apr  4 09:42:44 2023
From: liufield at gmail.com (neil liu)
Date: Tue, 4 Apr 2023 10:42:44 -0400
Subject: [petsc-users] Questions about the PetscDSSetResidual and Nedelec element.
Message-ID:

Hello,

I am learning this case, https://petsc.org/release/src/snes/tutorials/ex62.c.html, and am trying to make myself familiar with the FEM (PetscFE) there. I have several questions.

1) PetscDSSetResidual: for example, line 291, PetscCall(PetscDSSetResidual(ds, 0, f0_quadratic_u, f1_u)). Here f0_quadratic_u and f1_u are function pointers. Where have the input parameters for these functions been calculated, e.g., u_x? If I want to check the values of these parameters before line 291, how do I print them?

2) Does PetscFE support the Nedelec element? Will it be painful to add these modules myself?

Thanks,

Xiaodong

From knepley at gmail.com  Tue Apr  4 09:59:45 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 4 Apr 2023 10:59:45 -0400
Subject: [petsc-users] Questions about the PetscDSSetResidual and Nedelec element.
In-Reply-To:
References:
Message-ID:

On Tue, Apr 4, 2023 at 10:45 AM neil liu wrote:

> 1) PetscDSSetResidual: for example, line 291,
> PetscCall(PetscDSSetResidual(ds, 0, f0_quadratic_u, f1_u)). Here
> f0_quadratic_u and f1_u are function pointers. Where have the input
> parameters for these functions been calculated, e.g., u_x?

Line 291 sets callback functions. These functions are actually called in the loop over the mesh which calculates the residual. PETSc calculates u_x, given the PetscFE and the coefficients (and the geometry).

> If I want to check the values of these parameters before line 291, how
> do I print them?

I output many things with -dm_plex_print_fem 5, but I am not sure if I print out the field jet. It would be easy to add.

> 2) Does PetscFE support the Nedelec element? Will it be painful to add
> these modules myself?

What kind? I think we support them, but have no tests. Take a look at our support for Raviart-Thomas

https://gitlab.com/petsc/petsc/-/blob/main/src/dm/dt/dualspace/impls/lagrange/tutorials/ex1.c#L78

and see a use case here

https://gitlab.com/petsc/petsc/-/blob/main/src/snes/tutorials/ex24.c

  Thanks,

     Matt
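(For reference, the pointwise functions passed to PetscDSSetResidual() have the signature below, and the field jet arrives through u / u_t / u_x at each quadrature point, so one quick way to inspect u_x is to print from inside the callback. A sketch, for serial debugging only:)

    static void f0_debug(PetscInt dim, PetscInt Nf, PetscInt NfAux,
                         const PetscInt uOff[], const PetscInt uOff_x[],
                         const PetscScalar u[], const PetscScalar u_t[], const PetscScalar u_x[],
                         const PetscInt aOff[], const PetscInt aOff_x[],
                         const PetscScalar a[], const PetscScalar a_t[], const PetscScalar a_x[],
                         PetscReal t, const PetscReal x[], PetscInt numConstants,
                         const PetscScalar constants[], PetscScalar f0[])
    {
      /* gradient of field 0 at this quadrature point */
      for (PetscInt d = 0; d < dim; ++d)
        printf("u_x[%d] = %g\n", (int)d, (double)PetscRealPart(u_x[d]));
      f0[0] = 0.0; /* the actual physics would go here */
    }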
From nbarnafi at cmm.uchile.cl  Tue Apr  4 15:02:31 2023
From: nbarnafi at cmm.uchile.cl (Nicolas Barnafi)
Date: Tue, 4 Apr 2023 16:02:31 -0400
Subject: [petsc-users] Problem setting Fieldsplit fields
In-Reply-To: <33456f00-6519-fb04-b7ca-6f54d5eb5ee5@cmm.uchile.cl>
References: <15fd4b93-6e77-b9ef-95f6-1f3c1ed45162@cmm.uchile.cl> <33456f00-6519-fb04-b7ca-6f54d5eb5ee5@cmm.uchile.cl>
Message-ID: <15a06337-ef02-9ae0-6419-857e4bd9e0d1@cmm.uchile.cl>

Hi Matt,

One further question on this. I am working on the code now, but have one issue.

The ISes I grab from all fields need to be set on the sub-PCs, but they will have a local ordering of the dofs. Is there a tool in PETSc to make this coherent? I.e., if I set the IS fields '0,4,7' on a subPC object within the Fieldsplit structure, how do I build a new PC such that setting

PCSetFieldIS(pc, '0', field[0])
PCSetFieldIS(pc, '1', field[4])
PCSetFieldIS(pc, '2', field[7])

makes sense in the new subPC?

Thanks for the help.

Best,
NB

On 22-02-23 14:54, Nicolas Barnafi wrote:
> Hi Matt,
>
> Sorry for the late answer, it was holiday time.
>
>> Just to clarify, if you call SetIS() 3 times, and then give
>>
>>   -pc_fieldsplit_0_fields 0,2
>>
>> then we should reduce the number of fields to two by calling
>> ISConcatenate() on the first and last ISes?
>
> Exactly
>
>> I think this should not be hard. It will work exactly as it does on
>> the DM case, except the ISes will come from
>> the PC, not the DM. One complication is that you will have to hold the
>> new ISes until the end, and then set them.
>>
>>   Thanks,
>>
>>     Matt
>
> Nice, then it is exactly what I want. I will work on it, and create a PR
> when things are starting to fit in.
>
> Best,
> NB
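(A sketch of the field-merging idea from the quoted exchange, in terms of the existing API; the names field[] and pc are illustrative, and this only shows the mechanics of combining ISes, not the renumbering question raised above:)

    IS parts[3] = {field[0], field[4], field[7]};
    IS merged;

    PetscCall(ISConcatenate(PETSC_COMM_WORLD, 3, parts, &merged));
    PetscCall(PCFieldSplitSetIS(pc, "0", merged));  /* becomes split 0 of the outer PC */
    /* inside the split, dof numbering is relative to the subproblem, which
       is exactly the local-ordering issue being asked about here          */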
From knepley at gmail.com  Wed Apr  5 10:16:33 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 5 Apr 2023 11:16:33 -0400
Subject: [petsc-users] Questions about the PetscDSSetResidual and Nedelec element.
In-Reply-To:
References:
Message-ID:

On Wed, Apr 5, 2023 at 10:36 AM neil liu wrote:

> Thanks, Matt.
> When you said "P and P minus space", do you have some references for
> this terminology to build the Hdiv and Hcurl for triangles or tetrahedra?

The canonical reference is https://www-users.cse.umn.edu/~arnold//papers/acta.pdf but that was hard to understand for me. There must be a better reference but I do not know it.

  Thanks,

     Matt

From knepley at gmail.com  Wed Apr  5 10:28:34 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 5 Apr 2023 11:28:34 -0400
Subject: [petsc-users] Questions about the PetscDSSetResidual and Nedelec element.
In-Reply-To:
References:
Message-ID:

On Wed, Apr 5, 2023 at 11:23 AM neil liu wrote:

> Thanks a lot, Matt. I indeed did some research using the Hcurl Nedelec
> element to solve 3D Maxwell equations. I didn't consider too much of the
> math related to the Nedelec element (P and P minus). That is why I would
> like some references from you.
>
> Could you please direct me to the source code defining the Hdiv basis
> functions?
This is what I use for RT1-P0 on quadrilaterals, with field names "phi" and "q":

  # RT1-P0 on quads
  testset:
    args: -dm_plex_simplex 0 -dm_plex_box_bd periodic,none -dm_plex_box_faces 3,1 \
          -dm_plex_box_lower 0,-1 -dm_plex_box_upper 6.283185307179586,1 \
          -phi_petscspace_degree 0 \
          -phi_petscdualspace_lagrange_use_moments \
          -phi_petscdualspace_lagrange_moment_order 2 \
          -q_petscfe_default_quadrature_order 1 \
          -q_petscspace_type sum \
          -q_petscspace_variables 2 \
          -q_petscspace_components 2 \
          -q_petscspace_sum_spaces 2 \
          -q_petscspace_sum_concatenate true \
          -q_sumcomp_0_petscspace_variables 2 \
          -q_sumcomp_0_petscspace_type tensor \
          -q_sumcomp_0_petscspace_tensor_spaces 2 \
          -q_sumcomp_0_petscspace_tensor_uniform false \
          -q_sumcomp_0_tensorcomp_0_petscspace_degree 1 \
          -q_sumcomp_0_tensorcomp_1_petscspace_degree 0 \
          -q_sumcomp_1_petscspace_variables 2 \
          -q_sumcomp_1_petscspace_type tensor \
          -q_sumcomp_1_petscspace_tensor_spaces 2 \
          -q_sumcomp_1_petscspace_tensor_uniform false \
          -q_sumcomp_1_tensorcomp_0_petscspace_degree 0 \
          -q_sumcomp_1_tensorcomp_1_petscspace_degree 1 \
          -q_petscdualspace_form_degree -1 \
          -q_petscdualspace_order 1 \
          -q_petscdualspace_lagrange_trimmed true \

  Thanks,

     Matt
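(For context, such prefixed options are typically consumed when a PetscFE is created from options with that prefix; a sketch with illustrative values, following the pattern of ex24-style drivers:)

    PetscFE fe;
    PetscCall(PetscFECreateDefault(PETSC_COMM_WORLD, 2 /* dim */, 2 /* Nc */,
                                   PETSC_FALSE /* quads */, "q_", PETSC_DETERMINE, &fe));
    PetscCall(PetscObjectSetName((PetscObject)fe, "q"));
    /* the -q_petscspace_* / -q_petscdualspace_* options above now
       configure this space and dual space                             */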
From liufield at gmail.com  Wed Apr  5 10:22:55 2023
From: liufield at gmail.com (neil liu)
Date: Wed, 5 Apr 2023 11:22:55 -0400
Subject: [petsc-users] Questions about the PetscDSSetResidual and Nedelec element.
In-Reply-To:
References:
Message-ID:

Thanks a lot, Matt. I indeed did some research using the Hcurl Nedelec element to solve 3D Maxwell equations. I didn't consider too much of the math related to the Nedelec element (P and P minus). That is why I would like some references from you.

Could you please direct me to the source code defining the Hdiv basis functions?

Thanks,

On Wed, Apr 5, 2023 at 11:16 AM Matthew Knepley wrote:

> The canonical reference is
> https://www-users.cse.umn.edu/~arnold//papers/acta.pdf but that was hard
> to understand for me. There must be a better reference but I do not know
> it. [...]
From liufield at gmail.com  Wed Apr  5 09:36:39 2023
From: liufield at gmail.com (neil liu)
Date: Wed, 5 Apr 2023 10:36:39 -0400
Subject: [petsc-users] Questions about the PetscDSSetResidual and Nedelec element.
In-Reply-To:
References:
Message-ID:

Thanks, Matt. When you said "P and P minus space", do you have some references for this terminology to build the Hdiv and Hcurl for triangles or tetrahedra?

On Wed, Apr 5, 2023 at 10:12 AM Matthew Knepley wrote:

> Yes, you make them in the same way using the P and P^- spaces. [...]

From gaochenyi14 at 163.com  Tue Apr  4 21:00:30 2023
From: gaochenyi14 at 163.com (gaochenyi14)
Date: Wed, 5 Apr 2023 10:00:30 +0800
Subject: [petsc-users] How to use PETSc real version and complex version simultaneously?
Message-ID: <8DA5B0CD-CBC9-42D0-8804-E96298299DC5@163.com>

Hi,

I rely on PETSc to deal with real and complex sparse matrices of dimension 1e4 * 1e4 or above. I want to use the real version when only real matrices are involved, to achieve better performance, and use the complex version only when complex matrices get involved. But in the manual it says different versions can not be used at the same time. Can this restriction be circumvented? If not, where does the restriction come from?

All the best,
C.-Y. GAO

From knepley at gmail.com  Wed Apr  5 09:12:05 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 5 Apr 2023 10:12:05 -0400
Subject: [petsc-users] Questions about the PetscDSSetResidual and Nedelec element.
In-Reply-To:
References:
Message-ID:

On Wed, Apr 5, 2023 at 10:00 AM neil liu wrote:

> Thanks, Matt.
Raviart-Thomas is one type of Nedelec element (Hdiv also called >> divergence conforming). Do you support Hcurl (curl conforming) Nedelec >> element ? >> > > Yes, you make them in the same way using the P and P^- spaces. > > Thanks, > > Matt > > >> On Tue, Apr 4, 2023 at 10:59?AM Matthew Knepley >> wrote: >> >>> On Tue, Apr 4, 2023 at 10:45?AM neil liu wrote: >>> >>>> Hello, >>>> >>>> I am learning this case, >>>> https://petsc.org/release/src/snes/tutorials/ex62.c.html >>>> . And try to make myself familiar with the FEM (PetscFE) there. >>>> Then I have several questions. >>>> 1) PetscDSSetResidual >>>> , >>>> For example, line 291, PetscCall >>>> ( >>>> PetscDSSetResidual >>>> (ds, 0, >>>> f0_quadratic_u, f1_u)), here, f0_quadratic_u and f1_u is function >>>> pointers. Where have the input parameter for these functions been >>>> calculated, e.g., u_x. >>>> >>> >>> Line 291 sets callback functions. These functions are actually called in >>> the loop over the mesh which calculates the residual. >>> PETSc calculates u_x, given the PetscFE and the coefficients (and the >>> geometry). >>> >>> >>>> If I want to check the values of these parameters before line 291, how >>>> to print that ? >>>> >>> >>> I output many thing with -dm_plex_print_fem 5, but I am not sure if I >>> print out the field jet. It would be easy to add. >>> >>> >>>> 2) Does PetscFE support Nedelec element? Will it be painful to add >>>> these modules myself ? >>>> >>> >>> What kind? I think we support them, but have no tests. Take a look at >>> our support for Raviart-Thomas >>> >>> >>> https://gitlab.com/petsc/petsc/-/blob/main/src/dm/dt/dualspace/impls/lagrange/tutorials/ex1.c#L78 >>> >>> and see a use case here >>> >>> https://gitlab.com/petsc/petsc/-/blob/main/src/snes/tutorials/ex24.c >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Thanks , >>>> >>>> Xiaodong >>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gaochenyi14 at 163.com Tue Apr 4 21:00:30 2023 From: gaochenyi14 at 163.com (gaochenyi14) Date: Wed, 5 Apr 2023 10:00:30 +0800 Subject: [petsc-users] How to use PETSc real version and complex version simultaneously? Message-ID: <8DA5B0CD-CBC9-42D0-8804-E96298299DC5@163.com> Hi, I rely on PETSc to deal with real and complex sparse matrices of dimension 1e4 * 1e4 or above. I want to use real version when only real matrices are involved, to achieve better performance, and use complex version only when complex matrices get involved. But in the manual it says different versions can not be used at the same time. Can this restriction be circumvented? If not, where does the restriction come from? All the best, C.-Y. GAO From knepley at gmail.com Wed Apr 5 09:12:05 2023 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 5 Apr 2023 10:12:05 -0400 Subject: [petsc-users] Questions about the PetscDSSetResidual and Nedelec element. In-Reply-To: References: Message-ID: On Wed, Apr 5, 2023 at 10:00?AM neil liu wrote: > Thanks, Matt. 
> 1) You mentioned that "PETSc calculates u_x, given the PetscFE and the > coefficients (and the geometry).". > Could you please direct me the source code that calculates u_x ? > https://gitlab.com/petsc/petsc/-/blob/main/src/dm/dt/fe/interface/fe.c#L2124 > 2) Yes. Raviart-Thomas is one type of Nedelec element (Hdiv also called > divergence conforming). Do you support Hcurl (curl conforming) Nedelec > element ? > Yes, you make them in the same way using the P and P^- spaces. Thanks, Matt > On Tue, Apr 4, 2023 at 10:59?AM Matthew Knepley wrote: > >> On Tue, Apr 4, 2023 at 10:45?AM neil liu wrote: >> >>> Hello, >>> >>> I am learning this case, >>> https://petsc.org/release/src/snes/tutorials/ex62.c.html >>> . And try to make myself familiar with the FEM (PetscFE) there. >>> Then I have several questions. >>> 1) PetscDSSetResidual >>> , >>> For example, line 291, PetscCall >>> ( >>> PetscDSSetResidual >>> (ds, 0, >>> f0_quadratic_u, f1_u)), here, f0_quadratic_u and f1_u is function >>> pointers. Where have the input parameter for these functions been >>> calculated, e.g., u_x. >>> >> >> Line 291 sets callback functions. These functions are actually called in >> the loop over the mesh which calculates the residual. >> PETSc calculates u_x, given the PetscFE and the coefficients (and the >> geometry). >> >> >>> If I want to check the values of these parameters before line 291, how >>> to print that ? >>> >> >> I output many thing with -dm_plex_print_fem 5, but I am not sure if I >> print out the field jet. It would be easy to add. >> >> >>> 2) Does PetscFE support Nedelec element? Will it be painful to add these >>> modules myself ? >>> >> >> What kind? I think we support them, but have no tests. Take a look at our >> support for Raviart-Thomas >> >> >> https://gitlab.com/petsc/petsc/-/blob/main/src/dm/dt/dualspace/impls/lagrange/tutorials/ex1.c#L78 >> >> and see a use case here >> >> https://gitlab.com/petsc/petsc/-/blob/main/src/snes/tutorials/ex24.c >> >> Thanks, >> >> Matt >> >> >>> Thanks , >>> >>> Xiaodong >>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Apr 5 13:04:36 2023 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 5 Apr 2023 14:04:36 -0400 Subject: [petsc-users] How to use PETSc real version and complex version simultaneously? In-Reply-To: <8DA5B0CD-CBC9-42D0-8804-E96298299DC5@163.com> References: <8DA5B0CD-CBC9-42D0-8804-E96298299DC5@163.com> Message-ID: On Wed, Apr 5, 2023 at 1:59?PM gaochenyi14 wrote: > Hi, > > I rely on PETSc to deal with real and complex sparse matrices of dimension > 1e4 * 1e4 or above. I want to use real version when only real matrices are > involved, to achieve better performance, and use complex version only when > complex matrices get involved. But in the manual it says different versions > can not be used at the same time. Can this restriction be circumvented? If > not, where does the restriction come from? > It is possible to do this, but it is cumbersome. 
A group at Purdue has done this, but it is involved.

We use typedefs to change between real and complex. It would not be
difficult to allow storage in several types. However, prescribing how one
type interacts with another, particularly when data is passed in or out,
is challenging. This difficulty does not go away with templates, since it
is about type interaction, not polymorphism.

  Thanks,

     Matt

> All the best,
> C.-Y. GAO

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From liufield at gmail.com Wed Apr 5 08:59:49 2023
From: liufield at gmail.com (neil liu)
Date: Wed, 5 Apr 2023 09:59:49 -0400
Subject: [petsc-users] Questions about the PetscDSSetResidual and Nedelec element.
In-Reply-To:
References:
Message-ID:

Thanks, Matt.
1) You mentioned that "PETSc calculates u_x, given the PetscFE and the
coefficients (and the geometry)."
Could you please direct me to the source code that calculates u_x?
2) Yes. Raviart-Thomas is one type of Nedelec element (Hdiv, also called
divergence conforming). Do you support Hcurl (curl conforming) Nedelec
elements?

On Tue, Apr 4, 2023 at 10:59 AM Matthew Knepley wrote:

> On Tue, Apr 4, 2023 at 10:45 AM neil liu wrote:
>
>> Hello,
>>
>> I am learning this case,
>> https://petsc.org/release/src/snes/tutorials/ex62.c.html,
>> and trying to make myself familiar with the FEM (PetscFE) there.
>> Then I have several questions.
>> 1) PetscDSSetResidual: for example, line 291,
>> PetscCall(PetscDSSetResidual(ds, 0, f0_quadratic_u, f1_u)); here,
>> f0_quadratic_u and f1_u are function pointers. Where are the input
>> parameters for these functions calculated, e.g., u_x?
>
> Line 291 sets callback functions. These functions are actually called in
> the loop over the mesh which calculates the residual.
> PETSc calculates u_x, given the PetscFE and the coefficients (and the
> geometry).
>
>> If I want to check the values of these parameters before line 291, how
>> can I print them?
>
> I output many things with -dm_plex_print_fem 5, but I am not sure if I
> print out the field jet. It would be easy to add.
>
>> 2) Does PetscFE support Nedelec elements? Will it be painful to add
>> these modules myself?
>
> What kind? I think we support them, but have no tests. Take a look at our
> support for Raviart-Thomas
>
> https://gitlab.com/petsc/petsc/-/blob/main/src/dm/dt/dualspace/impls/lagrange/tutorials/ex1.c#L78
>
> and see a use case here
>
> https://gitlab.com/petsc/petsc/-/blob/main/src/snes/tutorials/ex24.c
>
> Thanks,
>
> Matt
>
>> Thanks,
>>
>> Xiaodong
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
From knepley at gmail.com Tue Apr 4 16:07:53 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 4 Apr 2023 17:07:53 -0400
Subject: [petsc-users] Problem setting Fieldsplit fields
In-Reply-To: <15a06337-ef02-9ae0-6419-857e4bd9e0d1@cmm.uchile.cl>
References: <15fd4b93-6e77-b9ef-95f6-1f3c1ed45162@cmm.uchile.cl> <33456f00-6519-fb04-b7ca-6f54d5eb5ee5@cmm.uchile.cl> <15a06337-ef02-9ae0-6419-857e4bd9e0d1@cmm.uchile.cl>
Message-ID:

On Tue, Apr 4, 2023 at 4:02 PM Nicolas Barnafi wrote:

> Hi Matt,
>
> One further question on this. I am working on the code now, but have one
> issue.
>
> The ISes I grab from all fields need to be set on the sub PCs, but they
> will have a local ordering of the dofs. Is there a tool in PETSc to make
> this coherent? I.e., if I set the IS fields '0,4,7' on a subPC
> object within the Fieldsplit structure, how do I build a new PC such that
> setting
>
> PCSetFieldIS(pc, '0', field[0])
> PCSetFieldIS(pc, '1', field[4])
> PCSetFieldIS(pc, '2', field[7])
>
> makes sense in the new subPC?

I am not understanding yet. What might go wrong? Oh, do you mean that if
you do a nested FieldSplit, then the indices you supply for splits in the
subproblem are relative to the subproblem numbering? Naively, you would
have to search the original ISes, but if you are using whole sets, you
could search for the first entry. If you are using fields from a DM, the
CreateSubDM() calls do this map automatically.

  Thanks,

     Matt

> Thanks for the help.
>
> Best,
> NB
>
> On 22-02-23 14:54, Nicolas Barnafi wrote:
> > Hi Matt,
> >
> > Sorry for the late answer, it was holiday time.
> >
> >> Just to clarify, if you call SetIS() 3 times, and then give
> >>
> >> -pc_fieldsplit_0_fields 0,2
> >>
> >> then we should reduce the number of fields to two by calling
> >> ISConcatenate() on the first and last ISes?
> >
> > Exactly
> >
> >> I think this should not be hard. It will work exactly as it does in
> >> the DM case, except the ISes will come from
> >> the PC, not the DM. One complication is that you will have to hold the
> >> new ISes until the end, and then set them.
> >>
> >> Thanks,
> >>
> >> Matt
> >
> > Nice, then it is exactly what I want. I will work on it, and create a PR
> > when things are starting to fit in.
> >
> > Best,
> > NB

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From mi.mike1021 at gmail.com Tue Apr 4 20:18:58 2023
From: mi.mike1021 at gmail.com (Mike Michell)
Date: Tue, 4 Apr 2023 20:18:58 -0500
Subject: [petsc-users] Define sub-dm from original dm and vector mapping
Message-ID:

Dear PETSc developer team,

Hi, this is a follow-up question regarding generating a sub-dm for a
surface from the original volumetric dm
(https://lists.mcs.anl.gov/pipermail/petsc-users/2023-March/048263.html).

I have created a short code, as written below, as a piece of it to create
a sub-dm on the surface, extracted from the original volume dmplex:

-------------------------------------
call DMClone(dm_origin, dm_wall, ierr);CHKERRA(ierr)

! label for face sets
call DMGetLabel(dm_wall, "Face Sets", label_facesets, ierr);CHKERRA(ierr)
call DMPlexLabelComplete(dm_wall, label_facesets, ierr);CHKERRA(ierr)

! label for vertex on surface
call DMCreateLabel(dm_wall, "Wall", ierr);CHKERRA(ierr)
call DMGetLabel(dm_wall, "Wall", label_surf, ierr);CHKERRA(ierr)
call DMPlexGetChart(dm_wall, ist, iend, ierr);CHKERRA(ierr)
do i=ist,iend
   call DMLabelGetValue(label_facesets, i, val, ierr);CHKERRA(ierr)
   if(val .eq. ID_wall) then
      call DMLabelSetValue(label_surf, i, ID_wall, ierr);CHKERRA(ierr)
   endif
enddo
call DMPlexLabelComplete(dm_wall, label_surf, ierr);CHKERRA(ierr)

! create submesh
call DMPlexCreateSubmesh(dm_wall, label_surf, ID_wall, PETSC_TRUE, dm_sub, ierr);CHKERRA(ierr)
call DMPlexGetSubpointMap(dm_sub, label_sub, ierr);CHKERRA(ierr)
-------------------------------------

Now I can define a vector on dm_sub and view it via PETSc_Viewer. However,
it is unclear how to map a vector defined on the volumetric dm (dm_wall)
onto a vector defined on the surface dm (dm_sub). I guess label_sub,
created via DMPlexGetSubpointMap(), can help with it, but it is difficult
to get a solid picture without an example. Could I get a comment on this?

Thanks,
Mike

From knepley at gmail.com Wed Apr 5 14:07:31 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 5 Apr 2023 15:07:31 -0400
Subject: [petsc-users] Define sub-dm from original dm and vector mapping
In-Reply-To:
References:
Message-ID:

On Wed, Apr 5, 2023 at 3:04 PM Mike Michell wrote:

> Dear PETSc developer team,
>
> Hi, this is a follow-up question regarding generating a sub-dm for a
> surface from the original volumetric dm
> (https://lists.mcs.anl.gov/pipermail/petsc-users/2023-March/048263.html).
>
> I have created a short code, as written below, as a piece of it to create
> a sub-dm on the surface, extracted from the original volume dmplex:
>
> -------------------------------------
> call DMClone(dm_origin, dm_wall, ierr);CHKERRA(ierr)
>
> ! label for face sets
> call DMGetLabel(dm_wall, "Face Sets", label_facesets, ierr);CHKERRA(ierr)
> call DMPlexLabelComplete(dm_wall, label_facesets, ierr);CHKERRA(ierr)
>
> ! label for vertex on surface
> call DMCreateLabel(dm_wall, "Wall", ierr);CHKERRA(ierr)
> call DMGetLabel(dm_wall, "Wall", label_surf, ierr);CHKERRA(ierr)
> call DMPlexGetChart(dm_wall, ist, iend, ierr);CHKERRA(ierr)
> do i=ist,iend
>    call DMLabelGetValue(label_facesets, i, val, ierr);CHKERRA(ierr)
>    if(val .eq. ID_wall) then
>       call DMLabelSetValue(label_surf, i, ID_wall, ierr);CHKERRA(ierr)
>    endif
> enddo
> call DMPlexLabelComplete(dm_wall, label_surf, ierr);CHKERRA(ierr)
>
> ! create submesh
> call DMPlexCreateSubmesh(dm_wall, label_surf, ID_wall, PETSC_TRUE,
> dm_sub, ierr);CHKERRA(ierr)
> call DMPlexGetSubpointMap(dm_sub, label_sub, ierr);CHKERRA(ierr)
> -------------------------------------
>
> Now I can define a vector on dm_sub and view it via PETSc_Viewer. However,
> it is unclear how to map a vector defined on the volumetric dm (dm_wall)
> onto a vector defined on the surface dm (dm_sub). I guess label_sub,
> created via DMPlexGetSubpointMap(), can help with it, but it is difficult
> to get a solid picture without an example. Could I get a comment on this?

I would use DMProjectField(). This would project a vector field in some
FEM space on the volumetric mesh (the input U) into a vector field defined
in some FEM space on the surface (the input dm _and_ the output vector).
This should work correctly. Let me know if this does not make sense to
code.
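To make the shape of that call concrete, here is a rough, untested sketch
(it assumes a single field with a PetscFE already set up on the submesh;
copy_u, dm_sub, U_volume, and u_sub are illustrative names, not PETSc API):

/* standard pointwise-function signature used by DMProjectField() */
static void copy_u(PetscInt dim, PetscInt Nf, PetscInt NfAux,
                   const PetscInt uOff[], const PetscInt uOff_x[],
                   const PetscScalar u[], const PetscScalar u_t[], const PetscScalar u_x[],
                   const PetscInt aOff[], const PetscInt aOff_x[],
                   const PetscScalar a[], const PetscScalar a_t[], const PetscScalar a_x[],
                   PetscReal t, const PetscReal x[], PetscInt numConstants,
                   const PetscScalar constants[], PetscScalar f[])
{
  PetscInt c;
  /* copy every component of field 0 of the volume solution into the output field */
  for (c = uOff[0]; c < uOff[1]; ++c) f[c - uOff[0]] = u[c];
}

/* ... after DMPlexCreateSubmesh() and attaching a PetscFE to dm_sub ... */
void (*funcs[1])(PetscInt, PetscInt, PetscInt, const PetscInt[], const PetscInt[],
                 const PetscScalar[], const PetscScalar[], const PetscScalar[],
                 const PetscInt[], const PetscInt[], const PetscScalar[],
                 const PetscScalar[], const PetscScalar[], PetscReal,
                 const PetscReal[], PetscInt, const PetscScalar[], PetscScalar[]) = {copy_u};

PetscCall(DMProjectField(dm_sub, 0.0, U_volume, funcs, INSERT_ALL_VALUES, u_sub));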
  Thanks,

     Matt

> Thanks,
> Mike

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From kaus at uni-mainz.de Wed Apr 5 15:17:30 2023
From: kaus at uni-mainz.de (Kaus, Boris)
Date: Wed, 5 Apr 2023 20:17:30 +0000
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
Message-ID: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de>

Hi everyone,

I'm trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using
the BinaryBuilder cross-compilation:
https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works
fine: https://buildkite.com/julialang/yggdrasil/builds/2093).

Yet, on apple systems I receive a somewhat weird bug during the configure step:

[22:08:49] *******************************************************************************
[22:08:49]          TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
[22:08:49] -------------------------------------------------------------------------------
[22:08:49] invalid literal for int() with base 10: ''
[22:08:49] *******************************************************************************
[22:08:49]
[22:08:49]
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2
[22:08:49] make: *** Waiting for unfinished jobs....
[22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2

The log file is rather brief:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log
Executing: uname -s
stdout: Darwin

It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0.
Is there something that changed between 3.17 & 3.18 that could cause this?

The build system seems to use python3.9 (3.4+ as required)

Thanks!
Boris

From stefano.zampini at gmail.com Wed Apr 5 15:32:07 2023
From: stefano.zampini at gmail.com (Stefano Zampini)
Date: Wed, 5 Apr 2023 23:32:07 +0300
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de>
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de>
Message-ID:

It seems there's some typo/error in the configure command that is being
executed. Can you post it here?

On Wed, Apr 5, 2023 at 23:18 Kaus, Boris wrote:

> Hi everyone,
>
> I'm trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using
> the BinaryBuilder cross-compilation:
> https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works
> fine: https://buildkite.com/julialang/yggdrasil/builds/2093).
>
> Yet, on apple systems I receive a somewhat weird bug during the configure
> step:
>
> [22:08:49] *******************************************************************************
> [22:08:49]          TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
> [22:08:49] -------------------------------------------------------------------------------
> [22:08:49] invalid literal for int() with base 10: ''
> [22:08:49] *******************************************************************************
> [22:08:49]
> [22:08:49]
> [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
> [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
> [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
> [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
> [22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2
> [22:08:49] make: *** Waiting for unfinished jobs....
> [22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2
>
> The log file is rather brief:
>
> sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log
> Executing: uname -s
> stdout: Darwin
>
> It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0.
> Is there something that changed between 3.17 & 3.18 that could cause this?
>
> The build system seems to use python3.9 (3.4+ as required)
>
> Thanks!
> Boris

--
Stefano
From kaus at uni-mainz.de Wed Apr 5 15:37:45 2023
From: kaus at uni-mainz.de (Kaus, Boris)
Date: Wed, 5 Apr 2023 20:37:45 +0000
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To:
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de>
Message-ID: <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de>

It can be reproduced with this:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/
*******************************************************************************
         TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
-------------------------------------------------------------------------------
invalid literal for int() with base 10: ''
*******************************************************************************

File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure
  framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0)
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__
  self.createChildren()
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren
  self.getChild(moduleName)
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild
  config.setupDependencies(self)
File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies
  self.registerPythonFile(utility,'config.utilities')
File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile
  utilityObj = self.framework.require(directory+utilityName, self)
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require
  config = self.getChild(moduleName, keywordArgs)
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild
  config = type(self, *keywordArgs)
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__
  self.isDarwin = config.setCompilers.Configure.isDarwin(self.log)
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 664, in isDarwin
  if not isUname_value: config.setCompilers.Configure.isUname(log)
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in isUname
  v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in <listcomp>
  v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])

On 5. Apr 2023, at 22:32, Stefano Zampini wrote:

It seems there's some typo/error in the configure command that is being executed. Can you post it here?

On Wed, Apr 5, 2023 at 23:18 Kaus, Boris wrote:

Hi everyone,

I'm trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using the BinaryBuilder cross-compilation:
https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works fine: https://buildkite.com/julialang/yggdrasil/builds/2093).
Yet, on apple systems I receive a somewhat weird bug during the configure step:

[22:08:49] *******************************************************************************
[22:08:49]          TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
[22:08:49] -------------------------------------------------------------------------------
[22:08:49] invalid literal for int() with base 10: ''
[22:08:49] *******************************************************************************
[22:08:49]
[22:08:49]
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2
[22:08:49] make: *** Waiting for unfinished jobs....
[22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2

The log file is rather brief:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log
Executing: uname -s
stdout: Darwin

It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0.
Is there something that changed between 3.17 & 3.18 that could cause this?

The build system seems to use python3.9 (3.4+ as required)

Thanks!
Boris

--
Stefano

From balay at mcs.anl.gov Wed Apr 5 15:38:36 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 5 Apr 2023 15:38:36 -0500 (CDT)
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To:
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de>
Message-ID:

Is it possible some unicode chars are in configure options?

Perhaps debug with:

diff --git a/config/configure.py b/config/configure.py
index e00a7a9617c..a95483b61a5 100755
--- a/config/configure.py
+++ b/config/configure.py
@@ -455,6 +455,7 @@ def petsc_configure(configure_options):
   tbo = None
   framework = None
   try:
+    print(sys.argv[1:])
     framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0)
     framework.setup()
     framework.logPrintBox('Configuring PETSc to compile on your system')

[Note the error message is printed further below]

Satish

On Wed, 5 Apr 2023, Stefano Zampini wrote:

> It seems there's some typo/error in the configure command that is being
> executed. Can you post it here?
>
> On Wed, Apr 5, 2023 at 23:18 Kaus, Boris wrote:
>
> > Hi everyone,
> >
> > I'm trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using
> > the BinaryBuilder cross-compilation:
> > https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works
> > fine: https://buildkite.com/julialang/yggdrasil/builds/2093).
> > > > Yet, on apple systems I receive a somewhat weird bug during the configure > > step: > > > > [22:08:49] > > ******************************************************************************* > > [22:08:49] TypeError or ValueError possibly related to ERROR in > > COMMAND LINE ARGUMENT while running ./configure > > [22:08:49] > > ------------------------------------------------------------------------------- > > [22:08:49] invalid literal for int() with base 10: '' > > [22:08:49] > > ******************************************************************************* > > [22:08:49] > > [22:08:49] > > [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: > > /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or > > directory > > [22:08:49] make[1]: *** No rule to make target > > '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. > > [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: > > /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or > > directory > > [22:08:49] make[1]: *** No rule to make target > > '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. > > [22:08:49] make: *** [GNUmakefile:17: > > /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2 > > [22:08:49] make: *** Waiting for unfinished jobs.... > > [22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error > > 2 > > > > The log file is rather brief: > > > > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log > > Executing: uname -s > > stdout: Darwin > > > > It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0. > > Is there something that changed between 3.17 & 3.18 that could cause this? > > > > The build system seems to use python3.9 (3.4+ as required) > > > > Thanks! > > Boris > > > > > > > > > > > > From kaus at uni-mainz.de Wed Apr 5 15:44:55 2023 From: kaus at uni-mainz.de (Kaus, Boris) Date: Wed, 5 Apr 2023 20:44:55 +0000 Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems In-Reply-To: References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> Message-ID: <4C368992-C302-4599-8663-FD60A16DEAEA@uni-mainz.de> That gives this: sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ ['--prefix=/workspace/destdir/lib/petsc/double_real_Int32/'] ******************************************************************************* TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure ------------------------------------------------------------------------------- invalid literal for int() with base 10: '' ******************************************************************************* > On 5. Apr 2023, at 22:38, Satish Balay wrote: > > Is it possible some unicode chars are in configure options? 
> > Perhaps debug with: > > diff --git a/config/configure.py b/config/configure.py > index e00a7a9617c..a95483b61a5 100755 > --- a/config/configure.py > +++ b/config/configure.py > @@ -455,6 +455,7 @@ def petsc_configure(configure_options): > tbo = None > framework = None > try: > + print(sys.argv[1:]) > framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) > framework.setup() > framework.logPrintBox('Configuring PETSc to compile on your system') > > > [Note the error message is printed further below] > > Satish > > On Wed, 5 Apr 2023, Stefano Zampini wrote: > >> It seems there's some typo/error in the configure command that is being >> executed. Can you post it here? >> >> Il giorno mer 5 apr 2023 alle ore 23:18 Kaus, Boris ha >> scritto: >> >>> Hi everyone, >>> >>> I?m trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using >>> the BinaryBuilder cross-compilation: >>> https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works >>> fine: https://buildkite.com/julialang/yggdrasil/builds/2093). >>> >>> Yet, on apple systems I receive a somewhat weird bug during the configure >>> step: >>> >>> [22:08:49] >>> ******************************************************************************* >>> [22:08:49] TypeError or ValueError possibly related to ERROR in >>> COMMAND LINE ARGUMENT while running ./configure >>> [22:08:49] >>> ------------------------------------------------------------------------------- >>> [22:08:49] invalid literal for int() with base 10: '' >>> [22:08:49] >>> ******************************************************************************* >>> [22:08:49] >>> [22:08:49] >>> [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: >>> /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or >>> directory >>> [22:08:49] make[1]: *** No rule to make target >>> '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. >>> [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: >>> /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or >>> directory >>> [22:08:49] make[1]: *** No rule to make target >>> '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. >>> [22:08:49] make: *** [GNUmakefile:17: >>> /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2 >>> [22:08:49] make: *** Waiting for unfinished jobs.... >>> [22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error >>> 2 >>> >>> The log file is rather brief: >>> >>> sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log >>> Executing: uname -s >>> stdout: Darwin >>> >>> It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0. >>> Is there something that changed between 3.17 & 3.18 that could cause this? >>> >>> The build system seems to use python3.9 (3.4+ as required) >>> >>> Thanks! >>> Boris >>> >>> >>> >>> >>> >> >> From balay at mcs.anl.gov Wed Apr 5 15:45:42 2023 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 5 Apr 2023 15:45:42 -0500 (CDT) Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems In-Reply-To: <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> Message-ID: Well this doesn't trigger the error for me. Do you have any env variables set with unicode [non-ascii] chars? 
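One more guess, since the failing line 630 is the mac_ver() parse: on a
host where Python is not actually running on macOS (for instance a Linux
cross-compile sandbox that only fakes uname), platform.mac_ver() comes
back as empty strings, and that line then fails with exactly this error:

>>> import platform
>>> platform.mac_ver()   # on a non-macOS host
('', ('', '', ''), '')
>>> tuple([int(a) for a in platform.mac_ver()[0].split('.')])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 1, in <listcomp>
ValueError: invalid literal for int() with base 10: ''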
Satish --- balay at ypro petsc-3.19.0 % sw_vers ProductName: macOS ProductVersion: 13.3 BuildVersion: 22E252 balay at ypro petsc-3.19.0 % ./configure --with-mpi=0 --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= ============================================================================================= ***** WARNING ***** You have a version of GNU make older than 4.0. It will work, but may not support all the parallel testing options. You can install the latest GNU make with your package manager, such as Brew or MacPorts, or use the --download-make option to get the latest GNU make ============================================================================================= Compilers: C Compiler: gcc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -fstack-protector -fno-stack-check -Qunused-arguments -fvisibility=hidden -g3 -O0 Version: Apple clang version 14.0.3 (clang-1403.0.22.14.1) ... ... On Wed, 5 Apr 2023, Kaus, Boris wrote: > It can be reproduced with this: > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ > ******************************************************************************* > TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure > ------------------------------------------------------------------------------- > invalid literal for int() with base 10: '' > ******************************************************************************* > > > File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure > framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__ > self.createChildren() > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren > self.getChild(moduleName) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild > config.setupDependencies(self) > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies > self.registerPythonFile(utility,'config.utilities') > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile > utilityObj = self.framework.require(directory+utilityName, self) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require > config = self.getChild(moduleName, keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild > config = type(self, *keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__ > self.isDarwin = config.setCompilers.Configure.isDarwin(self.log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 664, in isDarwin > if not isUname_value: config.setCompilers.Configure.isUname(log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in isUname > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > File 
"/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > > > On 5. Apr 2023, at 22:32, Stefano Zampini wrote: > > It seems there's some typo/error in the configure command that is being executed. Can you post it here? > > Il giorno mer 5 apr 2023 alle ore 23:18 Kaus, Boris > ha scritto: > Hi everyone, > > I?m trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using the BinaryBuilder cross-compilation: > https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works fine: https://buildkite.com/julialang/yggdrasil/builds/2093). > > Yet, on apple systems I receive a somewhat weird bug during the configure step: > > [22:08:49] ******************************************************************************* > [22:08:49] TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure > [22:08:49] ------------------------------------------------------------------------------- > [22:08:49] invalid literal for int() with base 10: '' > [22:08:49] ******************************************************************************* > [22:08:49] > [22:08:49] > [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory > [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. > [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory > [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. > [22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2 > [22:08:49] make: *** Waiting for unfinished jobs.... > [22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2 > > The log file is rather brief: > > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log > Executing: uname -s > stdout: Darwin > > It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0. > Is there something that changed between 3.17 & 3.18 that could cause this? > > The build system seems to use python3.9 (3.4+ as required) > > Thanks! 
> Boris
>
>
> --
> Stefano
>

From kaus at uni-mainz.de Wed Apr 5 15:49:05 2023
From: kaus at uni-mainz.de (Kaus, Boris)
Date: Wed, 5 Apr 2023 20:49:05 +0000
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To:
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de>
Message-ID: <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de>

Don't think so:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # env _=/usr/bin/env VERBOSE=true BUILD_LD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld OLDPWD=/workspace/srcdir/petsc-3.18.0 host_libdir=/workspace/x86_64-linux-musl-cxx11/destdir/lib nproc=8 target=aarch64-apple-darwin20 bindir=/workspace/destdir/bin CC=cc READELF_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf host_bindir=/workspace/x86_64-linux-musl-cxx11/destdir/bin PATH=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi:/opt/aarch64-apple-darwin20/bin:/opt/bin/x86_64-linux-musl-cxx11:/opt/x86_64-linux-musl/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/workspace/x86_64-linux-musl-cxx11/destdir/bin:/workspace/destdir/bin nbits=64 BUILD_STRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip BUILD_OBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump CMAKE_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.cmake FC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran FC=gfortran SRC_NAME=PETSc RANLIB_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib CC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$ PKG_CONFIG_SYSROOT_DIR=/workspace/destdir LD_LIBRARY_PATH=/usr/lib/csl-musl-x86_64:/usr/local/lib64:/usr/local/lib:/usr/lib64:/usr/lib:/lib64:/lib:/workspace/x86_64-linux-musl-cxx11/destdir/lib:/opt/x86_64-linux-musl/x86_64-linux-musl/lib64:/opt/x86_64-linux-musl/x86_64-linux-musl/lib:/opt/aarch64-apple-darwin20/aarch64-apple-darwin20/lib:/opt/aarch64-apple-darwin20/lib:/workspace/destdir/lib64:/workspace/destdir/lib HOSTOBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy HOSTOBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump LIPO_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo HOSTSTRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip BUILD_OBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy ZERO_AR_DATE=1 dlext=dylib HIDDEN_PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$ CCACHE_COMPILERCHECK=content AR_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar HOSTDSYMUTIL=dsymutil SHLVL=1 OBJDUMP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump CXX_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ HOSTCXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ USER=kausb BUILD_DSYMUTIL=dsymutil CC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc OBJCOPY_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy TERM=screen LIPO_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo BUILD_LIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo NM_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm host_prefix=/workspace/x86_64-linux-musl-cxx11/destdir FC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran AR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar WORKSPACE=/workspace STRIP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip HOSTRANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib RANLIB_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib DSYMUTIL_FOR_BUILD=dsymutil HOSTAS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as HOSTAR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar
AR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar WORKSPACE=/workspace STRIP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip HOSTRANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib RANLIB_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib DSYMUTIL_FOR_BUILD=dsymutil HOSTAS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as HOSTAR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar BUILD_RANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib NM_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm LD=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/ld HOSTLD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld bb_full_target=aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi LLVM_TARGET=aarch64-apple-darwin20 BUILD_READELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf CXX_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ libdir=/workspace/destdir/lib MESON_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.meson LLVM_HOST_TARGET=x86_64-linux-musl STRIP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip AS_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as HISTFILE=/meta/.bash_history HOME=/root HOSTLIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo includedir=/workspace/destdir/include MESON_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.meson BUILD_FC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran V=true BUILD_CC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc HOSTCC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc AS_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as CXX=c++ rust_target=aarch64-apple-darwin rust_host=x86_64-unknown-linux-musl HOSTFC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran exeext= READELF_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf bb_target=aarch64-apple-darwin20 SOURCE_DATE_EPOCH=0 PWD=/workspace/srcdir/petsc-3.18.0 MACOSX_DEPLOYMENT_TARGET=11.0 proc_family=arm BUILD_NM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm BUILD_CXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ LD_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld OBJDUMP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump OBJCOPY_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy HOSTNM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm USE_CCACHE=false BUILD_AR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar BUILD_AS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as prefix=/workspace/destdir HOSTNAME=271f88c24b60 CHARSET=UTF-8 PKG_CONFIG_PATH=/workspace/destdir/lib/pkgconfig:/workspace/destdir/lib64/pkgconfig:/workspace/destdir/share/pkgconfig MACHTYPE=x86_64-linux-musl HOSTREADELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf DSYMUTIL_BUILD=dsymutil LD_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld host_includedir=/workspace/x86_64-linux-musl-cxx11/destdir/include CMAKE_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.cmake SHELL=/bin/bash On 5. Apr 2023, at 22:45, Satish Balay wrote: Well this doesn't trigger the error for me. Do you have any env variables set with unicode [non-ascii] chars? 
Satish --- balay at ypro petsc-3.19.0 % sw_vers ProductName: macOS ProductVersion: 13.3 BuildVersion: 22E252 balay at ypro petsc-3.19.0 % ./configure --with-mpi=0 --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= ============================================================================================= ***** WARNING ***** You have a version of GNU make older than 4.0. It will work, but may not support all the parallel testing options. You can install the latest GNU make with your package manager, such as Brew or MacPorts, or use the --download-make option to get the latest GNU make ============================================================================================= Compilers: C Compiler: gcc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -fstack-protector -fno-stack-check -Qunused-arguments -fvisibility=hidden -g3 -O0 Version: Apple clang version 14.0.3 (clang-1403.0.22.14.1) ... ... On Wed, 5 Apr 2023, Kaus, Boris wrote: It can be reproduced with this: sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ ******************************************************************************* TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure ------------------------------------------------------------------------------- invalid literal for int() with base 10: '' ******************************************************************************* File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__ self.createChildren() File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren self.getChild(moduleName) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild config.setupDependencies(self) File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies self.registerPythonFile(utility,'config.utilities') File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile utilityObj = self.framework.require(directory+utilityName, self) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require config = self.getChild(moduleName, keywordArgs) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild config = type(self, *keywordArgs) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__ self.isDarwin = config.setCompilers.Configure.isDarwin(self.log) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 664, in isDarwin if not isUname_value: config.setCompilers.Configure.isUname(log) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in isUname v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in v = tuple([int(a) for a in 
platform.mac_ver()[0].split('.')])

On 5. Apr 2023, at 22:32, Stefano Zampini wrote:

It seems there's some typo/error in the configure command that is being executed. Can you post it here?

On Wed, Apr 5, 2023 at 23:18 Kaus, Boris wrote:

Hi everyone,

I'm trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using the BinaryBuilder cross-compilation:
https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works fine: https://buildkite.com/julialang/yggdrasil/builds/2093).
Apr 2023, at 22:49, Kaus, Boris wrote: Don?t think so: sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # env _=/usr/bin/env VERBOSE=true BUILD_LD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld OLDPWD=/workspace/srcdir/petsc-3.18.0 host_libdir=/workspace/x86_64-linux-musl-cxx11/destdir/lib nproc=8 target=aarch64-apple-darwin20 bindir=/workspace/destdir/bin CC=cc READELF_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf host_bindir=/workspace/x86_64-linux-musl-cxx11/destdir/bin PATH=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi:/opt/aarch64-apple-darwin20/bin:/opt/bin/x86_64-linux-musl-cxx11:/opt/x86_64-linux-musl/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/workspace/x86_64-linux-musl-cxx11/destdir/bin:/workspace/destdir/bin nbits=64 BUILD_STRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip BUILD_OBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump CMAKE_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.cmake FC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran FC=gfortran SRC_NAME=PETSc RANLIB_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib CC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$ PKG_CONFIG_SYSROOT_DIR=/workspace/destdir LD_LIBRARY_PATH=/usr/lib/csl-musl-x86_64:/usr/local/lib64:/usr/local/lib:/usr/lib64:/usr/lib:/lib64:/lib:/workspace/x86_64-linux-musl-cxx11/destdir/lib:/opt/x86_64-linux-musl/x86_64-linux-musl/lib64:/opt/x86_64-linux-musl/x86_64-linux-musl/lib:/opt/aarch64-apple-darwin20/aarch64-apple-darwin20/lib:/opt/aarch64-apple-darwin20/lib:/workspace/destdir/lib64:/workspace/destdir/lib HOSTOBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy HOSTOBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump LIPO_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo HOSTSTRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip BUILD_OBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy ZERO_AR_DATE=1 dlext=dylib HIDDEN_PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$ CCACHE_COMPILERCHECK=content AR_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar HOSTDSYMUTIL=dsymutil SHLVL=1 OBJDUMP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump CXX_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ HOSTCXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ USER=kausb BUILD_DSYMUTIL=dsymutil CC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc OBJCOPY_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy TERM=screen LIPO_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo BUILD_LIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo NM_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm host_prefix=/workspace/x86_64-linux-musl-cxx11/destdir FC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran AR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar WORKSPACE=/workspace STRIP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip HOSTRANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib RANLIB_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib DSYMUTIL_FOR_BUILD=dsymutil HOSTAS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as HOSTAR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar 
BUILD_RANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib NM_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm LD=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/ld HOSTLD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld bb_full_target=aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi LLVM_TARGET=aarch64-apple-darwin20 BUILD_READELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf CXX_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ libdir=/workspace/destdir/lib MESON_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.meson LLVM_HOST_TARGET=x86_64-linux-musl STRIP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip AS_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as HISTFILE=/meta/.bash_history HOME=/root HOSTLIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo includedir=/workspace/destdir/include MESON_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.meson BUILD_FC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran V=true BUILD_CC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc HOSTCC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc AS_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as CXX=c++ rust_target=aarch64-apple-darwin rust_host=x86_64-unknown-linux-musl HOSTFC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran exeext= READELF_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf bb_target=aarch64-apple-darwin20 SOURCE_DATE_EPOCH=0 PWD=/workspace/srcdir/petsc-3.18.0 MACOSX_DEPLOYMENT_TARGET=11.0 proc_family=arm BUILD_NM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm BUILD_CXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ LD_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld OBJDUMP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump OBJCOPY_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy HOSTNM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm USE_CCACHE=false BUILD_AR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar BUILD_AS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as prefix=/workspace/destdir HOSTNAME=271f88c24b60 CHARSET=UTF-8 PKG_CONFIG_PATH=/workspace/destdir/lib/pkgconfig:/workspace/destdir/lib64/pkgconfig:/workspace/destdir/share/pkgconfig MACHTYPE=x86_64-linux-musl HOSTREADELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf DSYMUTIL_BUILD=dsymutil LD_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld host_includedir=/workspace/x86_64-linux-musl-cxx11/destdir/include CMAKE_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.cmake SHELL=/bin/bash On 5. Apr 2023, at 22:45, Satish Balay wrote: Well this doesn't trigger the error for me. Do you have any env variables set with unicode [non-ascii] chars? Satish --- balay at ypro petsc-3.19.0 % sw_vers ProductName: macOS ProductVersion: 13.3 BuildVersion: 22E252 balay at ypro petsc-3.19.0 % ./configure --with-mpi=0 --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ ============================================================================================= Configuring PETSc to compile on your system ============================================================================================= ============================================================================================= ***** WARNING ***** You have a version of GNU make older than 4.0. 
It will work, but may not support all the parallel testing options. You can install the latest GNU make with your package manager, such as Brew or MacPorts, or use the --download-make option to get the latest GNU make ============================================================================================= Compilers: C Compiler: gcc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -fstack-protector -fno-stack-check -Qunused-arguments -fvisibility=hidden -g3 -O0 Version: Apple clang version 14.0.3 (clang-1403.0.22.14.1) ... ... On Wed, 5 Apr 2023, Kaus, Boris wrote: It can be reproduced with this: sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ ******************************************************************************* TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure ------------------------------------------------------------------------------- invalid literal for int() with base 10: '' ******************************************************************************* File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__ self.createChildren() File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren self.getChild(moduleName) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild config.setupDependencies(self) File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies self.registerPythonFile(utility,'config.utilities') File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile utilityObj = self.framework.require(directory+utilityName, self) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require config = self.getChild(moduleName, keywordArgs) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild config = type(self, *keywordArgs) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__ self.isDarwin = config.setCompilers.Configure.isDarwin(self.log) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 664, in isDarwin if not isUname_value: config.setCompilers.Configure.isUname(log) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in isUname v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) On 5. Apr 2023, at 22:32, Stefano Zampini wrote: It seems there's some typo/error in the configure command that is being executed. Can you post it here? Il giorno mer 5 apr 2023 alle ore 23:18 Kaus, Boris > ha scritto: Hi everyone, I?m trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using the BinaryBuilder cross-compilation: https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works fine: https://buildkite.com/julialang/yggdrasil/builds/2093). 
Yet, on Apple systems I receive a somewhat weird bug during the configure step:

[22:08:49] *******************************************************************************
[22:08:49]           TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
[22:08:49] -------------------------------------------------------------------------------
[22:08:49] invalid literal for int() with base 10: ''
[22:08:49] *******************************************************************************
[22:08:49]
[22:08:49]
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2
[22:08:49] make: *** Waiting for unfinished jobs....
[22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2

The log file is rather brief:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log
Executing: uname -s
stdout: Darwin

It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0.
Is there something that changed between 3.17 & 3.18 that could cause this?

The build system seems to use python3.9 (3.4+ as required)

Thanks!
Boris

--
Stefano

From balay at mcs.anl.gov  Wed Apr  5 16:00:34 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 5 Apr 2023 16:00:34 -0500 (CDT)
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To: <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de>
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de>
Message-ID: <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov>

Sorry, was looking at the wrong place.

> v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])

Can you try:

balay at ypro petsc % python3
Python 3.9.6 (default, Mar 10 2023, 20:16:38)
[Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import platform
>>> platform.mac_ver()
('13.3', ('', '', ''), 'x86_64')
>>> platform.mac_ver()[0].split('.')
['13', '3']
>>> tuple([int(a) for a in platform.mac_ver()[0].split('.')])
(13, 3)
>>>

Satish

On Wed, 5 Apr 2023, Kaus, Boris wrote:

> Don't think so:
>
> sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # env
> [...]
>
> On 5. Apr 2023, at 22:45, Satish Balay wrote:
>
> Well this doesn't trigger the error for me. Do you have any env variables set with unicode [non-ascii] chars?
> [...]

From kaus at uni-mainz.de  Wed Apr  5 16:04:05 2023
From: kaus at uni-mainz.de (Kaus, Boris)
Date: Wed, 5 Apr 2023 21:04:05 +0000
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To: <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov>
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de> <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov>
Message-ID: <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de>

That indeed seems to be the issue:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # python3
Python 3.9.7 (default, Nov 24 2021, 21:15:59)
[GCC 10.3.1 20211027] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import platform
>>> platform.mac_ver()
('', ('', '', ''), '')
>>> platform.mac_ver()[0].split('.')
['']
>>> tuple([int(a) for a in platform.mac_ver()[0].split('.')])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 1, in <listcomp>
ValueError: invalid literal for int() with base 10: ''

On 5. Apr 2023, at 23:00, Satish Balay wrote:

[...]
From knepley at gmail.com  Wed Apr  5 16:09:25 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 5 Apr 2023 17:09:25 -0400
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To: <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de>
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de> <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov> <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de>
Message-ID:

The documentation does say "Entries which cannot be determined are set to ''", so I guess we need a guard.

  Matt

On Wed, Apr 5, 2023 at 5:04 PM Kaus, Boris wrote:

> That indeed seems to be the issue:
>
> [...]

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
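One shape the guard Matt asks for could take (a hypothetical helper named mac_version, sketched here for illustration only; Satish's actual patch follows below): return a sentinel instead of letting the ValueError escape and abort configure.

import platform

def mac_version(log=None):
    """Return the macOS version as a tuple of ints, or None if unknown.

    platform.mac_ver() documents that entries it cannot determine are
    set to '', and int('') raises ValueError, so the conversion is
    guarded rather than allowed to propagate.
    """
    release = platform.mac_ver()[0]
    if not release:
        if log: log.write('macOS version could not be determined\n')
        return None
    try:
        return tuple(int(a) for a in release.split('.'))
    except ValueError:
        if log: log.write('malformed macOS version string: %r\n' % release)
        return None

# Usage mirroring the Catalina check in setCompilers.py:
v = mac_version()
if v is not None and v >= (10, 15, 0):
    print('Detected Darwin/MacOSX Catalina OS')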
From balay at mcs.anl.gov  Wed Apr  5 16:11:49 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 5 Apr 2023 16:11:49 -0500 (CDT)
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To: <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de>
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de> <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov> <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de>
Message-ID:

Hm - broken python? Either way configure should not fail. Perhaps the following fix:

Satish

---

diff --git a/config/BuildSystem/config/setCompilers.py b/config/BuildSystem/config/setCompilers.py
index e4d13bea58f..ae53d1e397e 100644
--- a/config/BuildSystem/config/setCompilers.py
+++ b/config/BuildSystem/config/setCompilers.py
@@ -626,10 +626,14 @@ class Configure(config.base.Configure):
       if log: log.write('Detected Darwin')
       isDarwin_value = True
       import platform
-      v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
-      if v >= (10,15,0):
-        if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
-        isDarwinCatalina_value = True
+      try:
+        v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
+        if v >= (10,15,0):
+          if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
+          isDarwinCatalina_value = True
+      except RuntimeError:
+        if log: log.write('MacOS version detection failed!\n')
+        pass
     if output.find('freebsd') >= 0:
       if log: log.write('Detected FreeBSD')
       isFreeBSD_value = True

On Wed, 5 Apr 2023, Kaus, Boris wrote:

> That indeed seems to be the issue:
>
> [...]
> import platform > platform.mac_ver() > ('13.3', ('', '', ''), 'x86_64') > platform.mac_ver()[0].split('.') > ['13', '3'] > tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > (13, 3) > > > > Satish > > > > > On Wed, 5 Apr 2023, Kaus, Boris wrote: > > Don?t think so: > > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # env > _=/usr/bin/env > VERBOSE=true > BUILD_LD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld > OLDPWD=/workspace/srcdir/petsc-3.18.0 > host_libdir=/workspace/x86_64-linux-musl-cxx11/destdir/lib > nproc=8 > target=aarch64-apple-darwin20 > bindir=/workspace/destdir/bin > CC=cc > READELF_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf > host_bindir=/workspace/x86_64-linux-musl-cxx11/destdir/bin > PATH=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi:/opt/aarch64-apple-darwin20/bin:/opt/bin/x86_64-linux-musl-cxx11:/opt/x86_64-linux-musl/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/workspace/x86_64-linux-musl-cxx11/destdir/bin:/workspace/destdir/bin > nbits=64 > BUILD_STRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip > BUILD_OBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump > CMAKE_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.cmake > FC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran > FC=gfortran > SRC_NAME=PETSc > RANLIB_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib > CC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc > PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$ > PKG_CONFIG_SYSROOT_DIR=/workspace/destdir > LD_LIBRARY_PATH=/usr/lib/csl-musl-x86_64:/usr/local/lib64:/usr/local/lib:/usr/lib64:/usr/lib:/lib64:/lib:/workspace/x86_64-linux-musl-cxx11/destdir/lib:/opt/x86_64-linux-musl/x86_64-linux-musl/lib64:/opt/x86_64-linux-musl/x86_64-linux-musl/lib:/opt/aarch64-apple-darwin20/aarch64-apple-darwin20/lib:/opt/aarch64-apple-darwin20/lib:/workspace/destdir/lib64:/workspace/destdir/lib > HOSTOBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy > HOSTOBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump > LIPO_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo > HOSTSTRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip > BUILD_OBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy > ZERO_AR_DATE=1 > dlext=dylib > HIDDEN_PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$ > CCACHE_COMPILERCHECK=content > AR_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar > HOSTDSYMUTIL=dsymutil > SHLVL=1 > OBJDUMP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump > CXX_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ > HOSTCXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ > USER=kausb > BUILD_DSYMUTIL=dsymutil > CC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc > OBJCOPY_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy > TERM=screen > LIPO_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo > BUILD_LIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo > NM_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm > host_prefix=/workspace/x86_64-linux-musl-cxx11/destdir > FC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran > AR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar > WORKSPACE=/workspace > 
STRIP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip > HOSTRANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib > RANLIB_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib > DSYMUTIL_FOR_BUILD=dsymutil > HOSTAS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as > HOSTAR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar > BUILD_RANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib > NM_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm > LD=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/ld > HOSTLD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld > bb_full_target=aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi > LLVM_TARGET=aarch64-apple-darwin20 > BUILD_READELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf > CXX_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ > libdir=/workspace/destdir/lib > MESON_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.meson > LLVM_HOST_TARGET=x86_64-linux-musl > STRIP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip > AS_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as > HISTFILE=/meta/.bash_history > HOME=/root > HOSTLIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo > includedir=/workspace/destdir/include > MESON_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.meson > BUILD_FC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran > V=true > BUILD_CC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc > HOSTCC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc > AS_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as > CXX=c++ > rust_target=aarch64-apple-darwin > rust_host=x86_64-unknown-linux-musl > HOSTFC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran > exeext= > READELF_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf > bb_target=aarch64-apple-darwin20 > SOURCE_DATE_EPOCH=0 > PWD=/workspace/srcdir/petsc-3.18.0 > MACOSX_DEPLOYMENT_TARGET=11.0 > proc_family=arm > BUILD_NM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm > BUILD_CXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ > LD_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld > OBJDUMP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump > OBJCOPY_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy > HOSTNM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm > USE_CCACHE=false > BUILD_AR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar > BUILD_AS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as > prefix=/workspace/destdir > HOSTNAME=271f88c24b60 > CHARSET=UTF-8 > PKG_CONFIG_PATH=/workspace/destdir/lib/pkgconfig:/workspace/destdir/lib64/pkgconfig:/workspace/destdir/share/pkgconfig > MACHTYPE=x86_64-linux-musl > HOSTREADELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf > DSYMUTIL_BUILD=dsymutil > LD_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld > host_includedir=/workspace/x86_64-linux-musl-cxx11/destdir/include > CMAKE_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.cmake > SHELL=/bin/bash > > > On 5. Apr 2023, at 22:45, Satish Balay wrote: > > Well this doesn't trigger the error for me. Do you have any env variables set with unicode [non-ascii] chars? 
> > Satish > > --- > > balay at ypro petsc-3.19.0 % sw_vers > ProductName: macOS > ProductVersion: 13.3 > BuildVersion: 22E252 > balay at ypro petsc-3.19.0 % ./configure --with-mpi=0 --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ > ============================================================================================= > Configuring PETSc to compile on your system > ============================================================================================= > ============================================================================================= > ***** WARNING ***** > You have a version of GNU make older than 4.0. It will work, but may not support all the > parallel testing options. You can install the latest GNU make with your package manager, > such as Brew or MacPorts, or use the --download-make option to get the latest GNU make > ============================================================================================= > Compilers: > C Compiler: gcc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -fstack-protector -fno-stack-check -Qunused-arguments -fvisibility=hidden -g3 -O0 > Version: Apple clang version 14.0.3 (clang-1403.0.22.14.1) > ... > ... > > > On Wed, 5 Apr 2023, Kaus, Boris wrote: > > It can be reproduced with this: > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ > ******************************************************************************* > TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure > ------------------------------------------------------------------------------- > invalid literal for int() with base 10: '' > ******************************************************************************* > > > File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure > framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__ > self.createChildren() > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren > self.getChild(moduleName) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild > config.setupDependencies(self) > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies > self.registerPythonFile(utility,'config.utilities') > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile > utilityObj = self.framework.require(directory+utilityName, self) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require > config = self.getChild(moduleName, keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild > config = type(self, *keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__ > self.isDarwin = config.setCompilers.Configure.isDarwin(self.log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 664, in isDarwin > if not isUname_value: config.setCompilers.Configure.isUname(log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in isUname > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > 
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > > > On 5. Apr 2023, at 22:32, Stefano Zampini wrote: > > It seems there's some typo/error in the configure command that is being executed. Can you post it here? > > Il giorno mer 5 apr 2023 alle ore 23:18 Kaus, Boris > ha scritto: > Hi everyone, > > I?m trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using the BinaryBuilder cross-compilation: > https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works fine: https://buildkite.com/julialang/yggdrasil/builds/2093). > > Yet, on apple systems I receive a somewhat weird bug during the configure step: > > [22:08:49] ******************************************************************************* > [22:08:49] TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure > [22:08:49] ------------------------------------------------------------------------------- > [22:08:49] invalid literal for int() with base 10: '' > [22:08:49] ******************************************************************************* > [22:08:49] > [22:08:49] > [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory > [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. > [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory > [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. > [22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2 > [22:08:49] make: *** Waiting for unfinished jobs.... > [22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2 > > The log file is rather brief: > > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log > Executing: uname -s > stdout: Darwin > > It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0. > Is there something that changed between 3.17 & 3.18 that could cause this? > > The build system seems to use python3.9 (3.4+ as required) > > Thanks! > Boris > > > > > > > -- > Stefano > > From kaus at uni-mainz.de Wed Apr 5 16:31:14 2023 From: kaus at uni-mainz.de (Kaus, Boris) Date: Wed, 5 Apr 2023 21:31:14 +0000 Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems In-Reply-To: References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de> <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov> <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de> Message-ID: <23767F99-C0BD-4622-A13F-0A11BDC46A63@uni-mainz.de> Perhaps python is broken, or perhaps this is because it is not a real Mac OS, but an emulated one. 
Seems a similar error now occurs a bit down the line:

---

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/
*******************************************************************************
TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
-------------------------------------------------------------------------------
invalid literal for int() with base 10: ''
*******************************************************************************

  File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure
    framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__
    self.createChildren()
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren
    self.getChild(moduleName)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild
    config.setupDependencies(self)
  File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies
    self.registerPythonFile(utility,'config.utilities')
  File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile
    utilityObj = self.framework.require(directory+utilityName, self)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require
    config = self.getChild(moduleName, keywordArgs)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild
    config = type(self, *keywordArgs)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__
    self.isDarwin = config.setCompilers.Configure.isDarwin(self.log)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 668, in isDarwin
    if not isUname_value: config.setCompilers.Configure.isUname(log)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in isUname
    v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in <listcomp>
    v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])

On 5. Apr 2023, at 23:11, Satish Balay wrote:

Hm - broken python? Either way configure should not fail.
Perhaps the following fix:

Satish

---

diff --git a/config/BuildSystem/config/setCompilers.py b/config/BuildSystem/config/setCompilers.py
index e4d13bea58f..ae53d1e397e 100644
--- a/config/BuildSystem/config/setCompilers.py
+++ b/config/BuildSystem/config/setCompilers.py
@@ -626,10 +626,14 @@ class Configure(config.base.Configure):
       if log: log.write('Detected Darwin')
       isDarwin_value = True
       import platform
-      v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
-      if v >= (10,15,0):
-        if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
-        isDarwinCatalina_value = True
+      try:
+        v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
+        if v >= (10,15,0):
+          if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
+          isDarwinCatalina_value = True
+      except RuntimeError:
+        if log: log.write('MacOS version detection failed!\n')
+        pass
     if output.find('freebsd') >= 0:
       if log: log.write('Detected FreeBSD')
       isFreeBSD_value = True

On Wed, 5 Apr 2023, Kaus, Boris wrote:

That indeed seems to be the issue:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # python3
Python 3.9.7 (default, Nov 24 2021, 21:15:59)
[GCC 10.3.1 20211027] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import platform
>>> platform.mac_ver()
('', ('', '', ''), '')
>>> platform.mac_ver()[0].split('.')
['']
>>> tuple([int(a) for a in platform.mac_ver()[0].split('.')])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 1, in <listcomp>
ValueError: invalid literal for int() with base 10: ''

On 5. Apr 2023, at 23:00, Satish Balay wrote:

Sorry, Was looking at the wrong place.

v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])

Can you try:

balay at ypro petsc % python3
Python 3.9.6 (default, Mar 10 2023, 20:16:38)
[Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import platform
>>> platform.mac_ver()
('13.3', ('', '', ''), 'x86_64')
>>> platform.mac_ver()[0].split('.')
['13', '3']
>>> tuple([int(a) for a in platform.mac_ver()[0].split('.')])
(13, 3)

Satish

On Wed, 5 Apr 2023, Kaus, Boris wrote:

Don't think so:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # env
_=/usr/bin/env
VERBOSE=true
BUILD_LD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld
OLDPWD=/workspace/srcdir/petsc-3.18.0
host_libdir=/workspace/x86_64-linux-musl-cxx11/destdir/lib
nproc=8
target=aarch64-apple-darwin20
bindir=/workspace/destdir/bin
CC=cc
READELF_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf
host_bindir=/workspace/x86_64-linux-musl-cxx11/destdir/bin
PATH=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi:/opt/aarch64-apple-darwin20/bin:/opt/bin/x86_64-linux-musl-cxx11:/opt/x86_64-linux-musl/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/workspace/x86_64-linux-musl-cxx11/destdir/bin:/workspace/destdir/bin
nbits=64
BUILD_STRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip
BUILD_OBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump
CMAKE_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.cmake
FC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran
FC=gfortran
SRC_NAME=PETSc
RANLIB_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib
CC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc
PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$
PKG_CONFIG_SYSROOT_DIR=/workspace/destdir
LD_LIBRARY_PATH=/usr/lib/csl-musl-x86_64:/usr/local/lib64:/usr/local/lib:/usr/lib64:/usr/lib:/lib64:/lib:/workspace/x86_64-linux-musl-cxx11/destdir/lib:/opt/x86_64-linux-musl/x86_64-linux-musl/lib64:/opt/x86_64-linux-musl/x86_64-linux-musl/lib:/opt/aarch64-apple-darwin20/aarch64-apple-darwin20/lib:/opt/aarch64-apple-darwin20/lib:/workspace/destdir/lib64:/workspace/destdir/lib
HOSTOBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy
HOSTOBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump
LIPO_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo
HOSTSTRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip
BUILD_OBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy
ZERO_AR_DATE=1
dlext=dylib
HIDDEN_PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$
CCACHE_COMPILERCHECK=content
AR_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar
HOSTDSYMUTIL=dsymutil
SHLVL=1
OBJDUMP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump
CXX_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++
HOSTCXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++
USER=kausb
BUILD_DSYMUTIL=dsymutil
CC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc
OBJCOPY_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy
TERM=screen
LIPO_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo
BUILD_LIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo
NM_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm
host_prefix=/workspace/x86_64-linux-musl-cxx11/destdir
FC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran
AR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar
WORKSPACE=/workspace
STRIP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip
HOSTRANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib
RANLIB_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib
DSYMUTIL_FOR_BUILD=dsymutil
HOSTAS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as
HOSTAR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar
BUILD_RANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib
NM_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm
LD=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/ld
HOSTLD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld
bb_full_target=aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi
LLVM_TARGET=aarch64-apple-darwin20
BUILD_READELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf
CXX_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++
libdir=/workspace/destdir/lib
MESON_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.meson
LLVM_HOST_TARGET=x86_64-linux-musl
STRIP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip
AS_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as
HISTFILE=/meta/.bash_history
HOME=/root
HOSTLIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo
includedir=/workspace/destdir/include
MESON_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.meson
BUILD_FC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran
V=true
BUILD_CC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc
HOSTCC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc
AS_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as
CXX=c++
rust_target=aarch64-apple-darwin
rust_host=x86_64-unknown-linux-musl
HOSTFC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran
exeext=
READELF_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf
bb_target=aarch64-apple-darwin20
SOURCE_DATE_EPOCH=0
PWD=/workspace/srcdir/petsc-3.18.0
MACOSX_DEPLOYMENT_TARGET=11.0
proc_family=arm
BUILD_NM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm
BUILD_CXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++
LD_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld
OBJDUMP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump
OBJCOPY_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy
HOSTNM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm
USE_CCACHE=false
BUILD_AR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar
BUILD_AS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as
prefix=/workspace/destdir
HOSTNAME=271f88c24b60
CHARSET=UTF-8
PKG_CONFIG_PATH=/workspace/destdir/lib/pkgconfig:/workspace/destdir/lib64/pkgconfig:/workspace/destdir/share/pkgconfig
MACHTYPE=x86_64-linux-musl
HOSTREADELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf
DSYMUTIL_BUILD=dsymutil
LD_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld
host_includedir=/workspace/x86_64-linux-musl-cxx11/destdir/include
CMAKE_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.cmake
SHELL=/bin/bash

On 5. Apr 2023, at 22:45, Satish Balay wrote:

Well this doesn't trigger the error for me. Do you have any env variables set with unicode [non-ascii] chars?
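One way to answer that question from the same interpreter configure uses is to scan os.environ for non-ASCII bytes; a minimal sketch, not something run in the thread:

import os

# List every environment variable whose name or value contains a non-ASCII character.
suspects = {k: v for k, v in os.environ.items()
            if any(ord(ch) > 127 for ch in k + v)}
print(suspects if suspects else 'all environment variables are plain ASCII')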
Satish

---

balay at ypro petsc-3.19.0 % sw_vers
ProductName: macOS
ProductVersion: 13.3
BuildVersion: 22E252
balay at ypro petsc-3.19.0 % ./configure --with-mpi=0 --prefix=/workspace/destdir/lib/petsc/double_real_Int32/
=============================================================================================
Configuring PETSc to compile on your system
=============================================================================================
=============================================================================================
***** WARNING *****
You have a version of GNU make older than 4.0. It will work, but may not support all the
parallel testing options. You can install the latest GNU make with your package manager,
such as Brew or MacPorts, or use the --download-make option to get the latest GNU make
=============================================================================================
Compilers:
C Compiler: gcc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -fstack-protector -fno-stack-check -Qunused-arguments -fvisibility=hidden -g3 -O0
Version: Apple clang version 14.0.3 (clang-1403.0.22.14.1)
...
...

On Wed, 5 Apr 2023, Kaus, Boris wrote:

It can be reproduced with this:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/
*******************************************************************************
TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
-------------------------------------------------------------------------------
invalid literal for int() with base 10: ''
*******************************************************************************

  File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure
    framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__
    self.createChildren()
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren
    self.getChild(moduleName)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild
    config.setupDependencies(self)
  File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies
    self.registerPythonFile(utility,'config.utilities')
  File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile
    utilityObj = self.framework.require(directory+utilityName, self)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require
    config = self.getChild(moduleName, keywordArgs)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild
    config = type(self, *keywordArgs)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__
    self.isDarwin = config.setCompilers.Configure.isDarwin(self.log)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 664, in isDarwin
    if not isUname_value: config.setCompilers.Configure.isUname(log)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in isUname
    v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in <listcomp>
    v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
On 5. Apr 2023, at 22:32, Stefano Zampini wrote:

It seems there's some typo/error in the configure command that is being executed. Can you post it here?

On Wed, 5 Apr 2023 at 23:18, Kaus, Boris wrote:

Hi everyone,

I'm trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using the BinaryBuilder cross-compilation (https://github.com/JuliaPackaging/Yggdrasil/pull/6533), which mostly works fine (https://buildkite.com/julialang/yggdrasil/builds/2093).

Yet, on Apple systems I receive a somewhat weird bug during the configure step:

[22:08:49] *******************************************************************************
[22:08:49] TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
[22:08:49] -------------------------------------------------------------------------------
[22:08:49] invalid literal for int() with base 10: ''
[22:08:49] *******************************************************************************
[22:08:49]
[22:08:49]
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2
[22:08:49] make: *** Waiting for unfinished jobs....
[22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2

The log file is rather brief:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log
Executing: uname -s
stdout: Darwin

It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0.
Is there something that changed between 3.17 & 3.18 that could cause this?

The build system seems to use python3.9 (3.4+ as required)

Thanks!
Boris

--
Stefano

From balay at mcs.anl.gov Wed Apr 5 16:58:43 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 5 Apr 2023 16:58:43 -0500 (CDT)
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To: <23767F99-C0BD-4622-A13F-0A11BDC46A63@uni-mainz.de>
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de> <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov> <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de> <23767F99-C0BD-4622-A13F-0A11BDC46A63@uni-mainz.de>
Message-ID: <59b55873-a729-b85a-e2e2-89bb20531f9f@mcs.anl.gov>

Ah - my patch was buggy.
> +      except RuntimeError:

should be: 'except:'

i.e:

diff --git a/config/BuildSystem/config/setCompilers.py b/config/BuildSystem/config/setCompilers.py
index 57848e736e3..e191c1e1b4d 100644
--- a/config/BuildSystem/config/setCompilers.py
+++ b/config/BuildSystem/config/setCompilers.py
@@ -630,10 +630,14 @@ class Configure(config.base.Configure):
       if log: log.write('Detected Darwin')
       isDarwin_value = True
       import platform
-      v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
-      if v >= (10,15,0):
-        if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
-        isDarwinCatalina_value = True
+      try:
+        v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
+        if v >= (10,15,0):
+          if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
+          isDarwinCatalina_value = True
+      except:
+        if log: log.write('MacOS version detection failed!\n')
+        pass
     if output.find('freebsd') >= 0:
       if log: log.write('Detected FreeBSD')
       isFreeBSD_value = True

Satish
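The reason the first patch changed nothing: int('') raises ValueError, which is not a subclass of RuntimeError, so the except RuntimeError handler can never fire; a bare except does catch it (as would the narrower except (ValueError, TypeError)). A minimal sketch of the mismatch:

def parse(release):
    try:
        return tuple(int(a) for a in release.split('.'))
    except RuntimeError:  # never triggers here: int('') raises ValueError
        return ()

try:
    parse('')
except ValueError as exc:
    # The exception escapes the RuntimeError handler and reaches the caller,
    # which is exactly what the configure banner above reports.
    print('escaped the handler:', exc)

Note that a bare except also swallows SystemExit and KeyboardInterrupt, so except (ValueError, TypeError) would be the narrowest guard that still covers the error configure reports.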
From kaus at uni-mainz.de Wed Apr 5 17:13:58 2023
From: kaus at uni-mainz.de (Kaus, Boris)
Date: Wed, 5 Apr 2023 22:13:58 +0000
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To: <59b55873-a729-b85a-e2e2-89bb20531f9f@mcs.anl.gov>
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de> <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov> <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de> <23767F99-C0BD-4622-A13F-0A11BDC46A63@uni-mainz.de> <59b55873-a729-b85a-e2e2-89bb20531f9f@mcs.anl.gov>
Message-ID: <52C2E466-C0F9-4E57-A0F8-7EA99D75104E@uni-mainz.de>

I don't understand why this keeps giving the same error (it shouldn't):

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/
*******************************************************************************
TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
-------------------------------------------------------------------------------
invalid literal for int() with base 10: ''
*******************************************************************************

  File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure
    framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__
    self.createChildren()
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren
    self.getChild(moduleName)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild
    config.setupDependencies(self)
  File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies
    self.registerPythonFile(utility,'config.utilities')
  File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile
    utilityObj = self.framework.require(directory+utilityName, self)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require
    config = self.getChild(moduleName, keywordArgs)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild
    config = type(self, *keywordArgs)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__
    self.isDarwin = config.setCompilers.Configure.isDarwin(self.log)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 676, in isDarwin
    if not isUname_value: config.setCompilers.Configure.isUname(log)
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in isUname
    v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
  File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in <listcomp>
    v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
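One way to rule out a stale or only partially patched file would be to print the exact source lines the traceback points at, using the same python3; a hypothetical check, not something run in the thread (the path is the one from the traceback above):

import linecache

path = '/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py'
# If line 631 still shows the unguarded parse, the patch was not applied to
# the file the interpreter is actually importing.
for lineno in (631, 676):
    print(lineno, linecache.getline(path, lineno).rstrip())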
On 5. Apr 2023, at 22:32, Stefano Zampini wrote:

It seems there's some typo/error in the configure command that is being executed. Can you post it here?

On Wed, Apr 5, 2023 at 23:18, Kaus, Boris wrote:

Hi everyone,

I'm trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using the BinaryBuilder cross-compilation: https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works fine (https://buildkite.com/julialang/yggdrasil/builds/2093).

Yet, on Apple systems I receive a somewhat weird bug during the configure step:

[22:08:49] *******************************************************************************
[22:08:49]          TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
[22:08:49] -------------------------------------------------------------------------------
[22:08:49]                invalid literal for int() with base 10: ''
[22:08:49] *******************************************************************************
[22:08:49]
[22:08:49]
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
[22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
[22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2
[22:08:49] make: *** Waiting for unfinished jobs....
[22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2

The log file is rather brief:

sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log
Executing: uname -s
stdout: Darwin

It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0.
Is there something that changed between 3.17 & 3.18 that could cause this?

The build system seems to use python3.9 (3.4+ as required)

Thanks!
Boris

--
Stefano
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From balay at mcs.anl.gov Wed Apr 5 17:56:43 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 5 Apr 2023 17:56:43 -0500 (CDT)
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To: <52C2E466-C0F9-4E57-A0F8-7EA99D75104E@uni-mainz.de>
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de> <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov> <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de> <23767F99-C0BD-4622-A13F-0A11BDC46A63@uni-mainz.de> <59b55873-a729-b85a-e2e2-89bb20531f9f@mcs.anl.gov> <52C2E466-C0F9-4E57-A0F8-7EA99D75104E@uni-mainz.de>
Message-ID: <7b96989f-20f5-4bb0-f50a-8ba986484407@mcs.anl.gov>

This is strange. I can trigger the error with:

- v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
+ v = tuple([int('')])

and this fix is able to overcome it.

I pushed this change to `balay/fix-macos-version-check` branch - can you try from it?

Satish
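For reference, a minimal standalone sketch of the guarded version parse under discussion (assuming only the Python standard library; on a real macOS install platform.mac_ver()[0] is e.g. '13.3', while in the emulated sandbox it is '', so int('') raises ValueError):

import platform

def mac_version_tuple():
    # mac_ver()[0] can legitimately be '' outside a real macOS install;
    # guard the int() conversion instead of letting configure abort.
    try:
        return tuple(int(a) for a in platform.mac_ver()[0].split('.'))
    except ValueError:
        return ()  # version unknown

# An empty tuple compares less than any non-empty tuple, so an unknown
# version is safely treated as "not Catalina or newer":
print(mac_version_tuple() >= (10, 15, 0))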
On Wed, 5 Apr 2023, Kaus, Boris wrote:

> I don't understand why this keeps giving the same error (shouldn't):
>
> sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/
> *******************************************************************************
>          TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
> -------------------------------------------------------------------------------
>                 invalid literal for int() with base 10: ''
> *******************************************************************************
>
>   File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure
>     framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__
>     self.createChildren()
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren
>     self.getChild(moduleName)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild
>     config.setupDependencies(self)
>   File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies
>     self.registerPythonFile(utility,'config.utilities')
>   File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile
>     utilityObj = self.framework.require(directory+utilityName, self)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require
>     config = self.getChild(moduleName, keywordArgs)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild
>     config = type(self, *keywordArgs)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__
>     self.isDarwin = config.setCompilers.Configure.isDarwin(self.log)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 676, in isDarwin
>     if not isUname_value: config.setCompilers.Configure.isUname(log)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in isUname
>     v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in <listcomp>
>     v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
>
> On 5. Apr 2023, at 23:58, Satish Balay wrote:
>
> Ah - my patch was buggy.
>
> > + except RuntimeError:
>
> should be: 'except:' i.e:
>
> diff --git a/config/BuildSystem/config/setCompilers.py b/config/BuildSystem/config/setCompilers.py
> index 57848e736e3..e191c1e1b4d 100644
> --- a/config/BuildSystem/config/setCompilers.py
> +++ b/config/BuildSystem/config/setCompilers.py
> @@ -630,10 +630,14 @@ class Configure(config.base.Configure):
>        if log: log.write('Detected Darwin')
>        isDarwin_value = True
>        import platform
> -      v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
> -      if v >= (10,15,0):
> -        if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
> -        isDarwinCatalina_value = True
> +      try:
> +        v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
> +        if v >= (10,15,0):
> +          if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
> +          isDarwinCatalina_value = True
> +      except:
> +        if log: log.write('MacOS version detection failed!\n')
> +        pass
>      if output.find('freebsd') >= 0:
>        if log: log.write('Detected FreeBSD')
>        isFreeBSD_value = True
>
> Satish
>
> On Wed, 5 Apr 2023, Kaus, Boris wrote:
>
> Perhaps python is broken, or perhaps this is because it is not a real Mac OS, but an emulated one.
>
> Seems a similar error now occurs a bit down the line:
>
> ---
>
> sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/
> *******************************************************************************
>          TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
> -------------------------------------------------------------------------------
>                 invalid literal for int() with base 10: ''
> *******************************************************************************
>
>   File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure
>     framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__
>     self.createChildren()
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren
>     self.getChild(moduleName)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild
>     config.setupDependencies(self)
>   File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies
>     self.registerPythonFile(utility,'config.utilities')
>   File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile
>     utilityObj = self.framework.require(directory+utilityName, self)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require
>     config = self.getChild(moduleName, keywordArgs)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild
>     config = type(self, *keywordArgs)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__
>     self.isDarwin = config.setCompilers.Configure.isDarwin(self.log)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 668, in isDarwin
>     if not isUname_value: config.setCompilers.Configure.isUname(log)
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in isUname
>     v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in <listcomp>
>     v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
>
> On 5. Apr 2023, at 23:11, Satish Balay wrote:
>
> Hm - broken python? Either way configure should not fail. Perhaps the following fix:
>
> Satish
>
> ---
>
> diff --git a/config/BuildSystem/config/setCompilers.py b/config/BuildSystem/config/setCompilers.py
> index e4d13bea58f..ae53d1e397e 100644
> --- a/config/BuildSystem/config/setCompilers.py
> +++ b/config/BuildSystem/config/setCompilers.py
> @@ -626,10 +626,14 @@ class Configure(config.base.Configure):
>        if log: log.write('Detected Darwin')
>        isDarwin_value = True
>        import platform
> -      v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
> -      if v >= (10,15,0):
> -        if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
> -        isDarwinCatalina_value = True
> +      try:
> +        v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
> +        if v >= (10,15,0):
> +          if log: log.write('Detected Darwin/MacOSX Catalina OS\n')
> +          isDarwinCatalina_value = True
> +      except RuntimeError:
> +        if log: log.write('MacOS version detection failed!\n')
> +        pass
>      if output.find('freebsd') >= 0:
>        if log: log.write('Detected FreeBSD')
>        isFreeBSD_value = True
>
> On Wed, 5 Apr 2023, Kaus, Boris wrote:
>
> That indeed seems to be the issue:
>
> sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # python3
> Python 3.9.7 (default, Nov 24 2021, 21:15:59)
> [GCC 10.3.1 20211027] on linux
> Type "help", "copyright", "credits" or "license" for more information.
> >>> import platform
> >>> platform.mac_ver()
> ('', ('', '', ''), '')
> >>> platform.mac_ver()[0].split('.')
> ['']
> >>> tuple([int(a) for a in platform.mac_ver()[0].split('.')])
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "<stdin>", line 1, in <listcomp>
> ValueError: invalid literal for int() with base 10: ''
>
> On 5. Apr 2023, at 23:00, Satish Balay wrote:
>
> Sorry, was looking at the wrong place.
>
> v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
>
> Can you try:
>
> balay at ypro petsc % python3
> Python 3.9.6 (default, Mar 10 2023, 20:16:38)
> [Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin
> Type "help", "copyright", "credits" or "license" for more information.
> >>> import platform
> >>> platform.mac_ver()
> ('13.3', ('', '', ''), 'x86_64')
> >>> platform.mac_ver()[0].split('.')
> ['13', '3']
> >>> tuple([int(a) for a in platform.mac_ver()[0].split('.')])
> (13, 3)
>
> Satish
>
> On Wed, 5 Apr 2023, Kaus, Boris wrote:
>
> Don't think so:
>
> sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # env
> _=/usr/bin/env
> VERBOSE=true
> BUILD_LD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld
> OLDPWD=/workspace/srcdir/petsc-3.18.0
> host_libdir=/workspace/x86_64-linux-musl-cxx11/destdir/lib
> nproc=8
> target=aarch64-apple-darwin20
> bindir=/workspace/destdir/bin
> CC=cc
> READELF_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf
> host_bindir=/workspace/x86_64-linux-musl-cxx11/destdir/bin
> PATH=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi:/opt/aarch64-apple-darwin20/bin:/opt/bin/x86_64-linux-musl-cxx11:/opt/x86_64-linux-musl/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/workspace/x86_64-linux-musl-cxx11/destdir/bin:/workspace/destdir/bin
> nbits=64
> BUILD_STRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip
> BUILD_OBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump
> CMAKE_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.cmake
> FC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran
> FC=gfortran
> SRC_NAME=PETSc
> RANLIB_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib
> CC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc
> PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$
> PKG_CONFIG_SYSROOT_DIR=/workspace/destdir
> LD_LIBRARY_PATH=/usr/lib/csl-musl-x86_64:/usr/local/lib64:/usr/local/lib:/usr/lib64:/usr/lib:/lib64:/lib:/workspace/x86_64-linux-musl-cxx11/destdir/lib:/opt/x86_64-linux-musl/x86_64-linux-musl/lib64:/opt/x86_64-linux-musl/x86_64-linux-musl/lib:/opt/aarch64-apple-darwin20/aarch64-apple-darwin20/lib:/opt/aarch64-apple-darwin20/lib:/workspace/destdir/lib64:/workspace/destdir/lib
> HOSTOBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy
> HOSTOBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump
> LIPO_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo
> HOSTSTRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip
> BUILD_OBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy
> ZERO_AR_DATE=1
> dlext=dylib
> HIDDEN_PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$
> CCACHE_COMPILERCHECK=content
> AR_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar
> HOSTDSYMUTIL=dsymutil
> SHLVL=1
> OBJDUMP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump
> CXX_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++
> HOSTCXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++
> USER=kausb
> BUILD_DSYMUTIL=dsymutil
> CC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc
> OBJCOPY_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy
> TERM=screen
> LIPO_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo
> BUILD_LIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo
> NM_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm
> host_prefix=/workspace/x86_64-linux-musl-cxx11/destdir
> FC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran
> AR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar
> WORKSPACE=/workspace
> STRIP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip
> HOSTRANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib
> RANLIB_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib
> DSYMUTIL_FOR_BUILD=dsymutil
> HOSTAS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as
> HOSTAR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar
> BUILD_RANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib
> NM_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm
> LD=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/ld
> HOSTLD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld
> bb_full_target=aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi
> LLVM_TARGET=aarch64-apple-darwin20
> BUILD_READELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf
> CXX_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++
> libdir=/workspace/destdir/lib
> MESON_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.meson
> LLVM_HOST_TARGET=x86_64-linux-musl
> STRIP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip
> AS_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as
> HISTFILE=/meta/.bash_history
> HOME=/root
> HOSTLIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo
> includedir=/workspace/destdir/include
> MESON_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.meson
> BUILD_FC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran
> V=true
> BUILD_CC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc
> HOSTCC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc
> AS_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as
> CXX=c++
> rust_target=aarch64-apple-darwin
> rust_host=x86_64-unknown-linux-musl
> HOSTFC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran
> exeext=
> READELF_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf
> bb_target=aarch64-apple-darwin20
> SOURCE_DATE_EPOCH=0
> PWD=/workspace/srcdir/petsc-3.18.0
> MACOSX_DEPLOYMENT_TARGET=11.0
> proc_family=arm
> BUILD_NM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm
> BUILD_CXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++
> LD_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld
> OBJDUMP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump
> OBJCOPY_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy
> HOSTNM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm
> USE_CCACHE=false
> BUILD_AR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar
> BUILD_AS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as
> prefix=/workspace/destdir
> HOSTNAME=271f88c24b60
> CHARSET=UTF-8
> PKG_CONFIG_PATH=/workspace/destdir/lib/pkgconfig:/workspace/destdir/lib64/pkgconfig:/workspace/destdir/share/pkgconfig
> MACHTYPE=x86_64-linux-musl
> HOSTREADELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf
> DSYMUTIL_BUILD=dsymutil
> LD_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld
> host_includedir=/workspace/x86_64-linux-musl-cxx11/destdir/include
> CMAKE_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.cmake
> SHELL=/bin/bash
>
> On 5. Apr 2023, at 22:45, Satish Balay wrote:
>
> Well this doesn't trigger the error for me. Do you have any env variables set with unicode [non-ascii] chars?
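One quick way to check for such characters (a sketch - it assumes GNU grep with PCRE support is available in the build sandbox):

env | grep -P '[^\x00-\x7F]'   # prints any environment entry containing non-ASCII bytes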
> > Satish > > --- > > balay at ypro petsc-3.19.0 % sw_vers > ProductName: macOS > ProductVersion: 13.3 > BuildVersion: 22E252 > balay at ypro petsc-3.19.0 % ./configure --with-mpi=0 --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ > ============================================================================================= > Configuring PETSc to compile on your system > ============================================================================================= > ============================================================================================= > ***** WARNING ***** > You have a version of GNU make older than 4.0. It will work, but may not support all the > parallel testing options. You can install the latest GNU make with your package manager, > such as Brew or MacPorts, or use the --download-make option to get the latest GNU make > ============================================================================================= > Compilers: > C Compiler: gcc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -fstack-protector -fno-stack-check -Qunused-arguments -fvisibility=hidden -g3 -O0 > Version: Apple clang version 14.0.3 (clang-1403.0.22.14.1) > ... > ... > > > On Wed, 5 Apr 2023, Kaus, Boris wrote: > > It can be reproduced with this: > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ > ******************************************************************************* > TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure > ------------------------------------------------------------------------------- > invalid literal for int() with base 10: '' > ******************************************************************************* > > > File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure > framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__ > self.createChildren() > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren > self.getChild(moduleName) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild > config.setupDependencies(self) > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies > self.registerPythonFile(utility,'config.utilities') > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile > utilityObj = self.framework.require(directory+utilityName, self) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require > config = self.getChild(moduleName, keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild > config = type(self, *keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__ > self.isDarwin = config.setCompilers.Configure.isDarwin(self.log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 664, in isDarwin > if not isUname_value: config.setCompilers.Configure.isUname(log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in isUname > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > 
>   File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in <listcomp>
>     v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
>
> On 5. Apr 2023, at 22:32, Stefano Zampini wrote:
>
> It seems there's some typo/error in the configure command that is being executed. Can you post it here?
>
> On Wed, Apr 5, 2023 at 23:18, Kaus, Boris wrote:
>
> Hi everyone,
>
> I'm trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using the BinaryBuilder cross-compilation: https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works fine (https://buildkite.com/julialang/yggdrasil/builds/2093).
>
> Yet, on Apple systems I receive a somewhat weird bug during the configure step:
>
> [22:08:49] *******************************************************************************
> [22:08:49]          TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure
> [22:08:49] -------------------------------------------------------------------------------
> [22:08:49]                invalid literal for int() with base 10: ''
> [22:08:49] *******************************************************************************
> [22:08:49]
> [22:08:49]
> [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
> [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
> [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory
> [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop.
> [22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2
> [22:08:49] make: *** Waiting for unfinished jobs....
> [22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2
>
> The log file is rather brief:
>
> sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log
> Executing: uname -s
> stdout: Darwin
>
> It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0.
> Is there something that changed between 3.17 & 3.18 that could cause this?
>
> The build system seems to use python3.9 (3.4+ as required)
>
> Thanks!
> Boris
>
> --
> Stefano
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From kaus at uni-mainz.de Thu Apr 6 02:37:53 2023
From: kaus at uni-mainz.de (Kaus, Boris)
Date: Thu, 6 Apr 2023 07:37:53 +0000
Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems
In-Reply-To: <7b96989f-20f5-4bb0-f50a-8ba986484407@mcs.anl.gov>
References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de> <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov> <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de> <23767F99-C0BD-4622-A13F-0A11BDC46A63@uni-mainz.de> <59b55873-a729-b85a-e2e2-89bb20531f9f@mcs.anl.gov> <52C2E466-C0F9-4E57-A0F8-7EA99D75104E@uni-mainz.de> <7b96989f-20f5-4bb0-f50a-8ba986484407@mcs.anl.gov>
Message-ID: <1F8CE19F-4459-4A08-96BC-D63B3ACEDC03@uni-mainz.de>

Apologies, mistake on my side. Yes this works - thanks a lot for your help!

Boris

On 6. Apr 2023, at 00:56, Satish Balay wrote:

This is strange. I can trigger the error with:

- v = tuple([int(a) for a in platform.mac_ver()[0].split('.')])
+ v = tuple([int('')])

and this fix is able to overcome it.
I pushed this change to `balay/fix-macos-version-check` branch - can you try from it?

Satish
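To try such a branch, one possible sequence (a sketch - it assumes a recent git and a clone of the PETSc repository rather than a release tarball):

git fetch origin balay/fix-macos-version-check
git checkout balay/fix-macos-version-check
./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/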
From xiongziming2010 at gmail.com Thu Apr 6 04:20:23 2023
From: xiongziming2010 at gmail.com (ziming xiong)
Date: Thu, 6 Apr 2023 11:20:23 +0200
Subject: [petsc-users] Installation issues based on Petsc-pardiso et metis
Message-ID: 

Hello,
I want to configure PETSc with MKL PARDISO and METIS, but there are still errors. I used the following command for PARDISO:

./configure
--with-cc="win32fe cl"
--with-fc=0
--with-cxx="win32fe cl"
--with-shared-libraries=0
--with-mpi-dir="/cygdrive/c/PROGRA~2/Intel/MPI"
--with-mpiexec="/cygdrive/c/PROGRA~1/Microsoft_MPI/Bin/mpiexec"
--with-blaslapack-dir="/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest"
--with-mkl_pardiso-dir="/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest"

configure_error_with_pardiso.log is the configure log file for this attempt.
For METIS, I used:

./configure
--with-metis-lib=/cygdrive/f/metis-5.1.0/build/libmetis
--with-metis-include=/cygdrive/f/metis-5.1.0/include
--with-cc="win32fe cl"
--with-fc=0
--with-cxx="win32fe cl"
--with-shared-libraries=0
--with-mpi-dir="/cygdrive/c/PROGRA~2/Intel/MPI"
--with-mpiexec="/cygdrive/c/PROGRA~1/Microsoft_MPI/Bin/mpiexec"
--with-blaslapack-lib="-L/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest/lib/intel64 mkl_intel_lp64_dll.lib mkl_sequential_dll.lib mkl_core_dll.lib"

But I get the following error:

--with-metis-lib=['/cygdrive/f/metis-5.1.0/build/libmetis'] and --with-metis-include=['/cygdrive/f/metis-5.1.0/include'] did not work

configure_error_with_metis.log is the configure log for the METIS attempt.

I also tried --download-metis, but that fails with "Error configuring METIS with CMake". If you want, I can send that configure.log file too.

Please help me configure with these external packages; I have been stuck at this stage for a long time.

Best regards,
Ziming XIONG
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure_error_with_pardiso.log
Type: application/octet-stream
Size: 1490073 bytes
Desc: not available
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure_error_with_metis.log
Type: application/octet-stream
Size: 1411551 bytes
Desc: not available
URL: 

From knepley at gmail.com Thu Apr 6 06:38:42 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 6 Apr 2023 07:38:42 -0400
Subject: [petsc-users] Installation issues based on Petsc-pardiso et metis
In-Reply-To: 
References: 
Message-ID: 

On Thu, Apr 6, 2023 at 5:21 AM ziming xiong wrote:

> Hello,
> I want to configure PETSc with MKL PARDISO and METIS, but there are still errors.

This seems like a fundamental misunderstanding. There are two ways to run on a Windows machine:

1) You can use Windows compilers and generate .lib libraries. This is what you are currently doing.

2) You can use WSL2 or MSYS, using UNIX compilers, generating .a archives and .so shared libraries.

When you built METIS, you used strategy 2), since we see

  Executing: win32fe cl -o /cygdrive/c/Users/XiongZiming/AppData/Local/Temp/petsc-r1s2t4wp/config.libraries/conftest.exe -g /cygdrive/c/Users/XiongZiming/AppData/Local/Temp/petsc-r1s2t4wp/config.libraries/conftest.o -L/cygdrive/f/metis-5.1.0/build -L/cygdrive/f/metis-5.1.0/build -lmetis Ws2_32.lib
  stdout: LINK : fatal error LNK1104: cannot open file 'libmetis.lib'
  Possible ERROR while running linker: exit code 2
  stdout: LINK : fatal error LNK1104: cannot open file 'libmetis.lib'
  Linker output before filtering: LINK : fatal error LNK1104: cannot open file 'libmetis.lib'
  Linker output after filtering: LINK : fatal error LNK1104: cannot open file 'libmetis.lib'

Personally, I find strategy 2) easier to deal with, especially for external libraries.

  Thanks,

     Matt
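A quick way to see which convention a given library build followed (a sketch - the path is the METIS build directory from the report above, checked from the same Cygwin shell):

# Strategy 1 toolchains (win32fe cl) produce and consume MSVC-style .lib files
ls /cygdrive/f/metis-5.1.0/build/libmetis/*.lib
# Strategy 2 toolchains (UNIX compilers under WSL2/MSYS) produce .a archives and shared libraries
ls /cygdrive/f/metis-5.1.0/build/libmetis/*.a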
> I used the following command for PARDISO:
>
> ./configure
> --with-cc="win32fe cl"
> --with-fc=0
> --with-cxx="win32fe cl"
> --with-shared-libraries=0
> --with-mpi-dir="/cygdrive/c/PROGRA~2/Intel/MPI"
> --with-mpiexec="/cygdrive/c/PROGRA~1/Microsoft_MPI/Bin/mpiexec"
> --with-blaslapack-dir="/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest"
> --with-mkl_pardiso-dir="/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest"
>
> configure_error_with_pardiso.log is the configure log file for this attempt.
>
> For METIS, I used:
>
> ./configure
> --with-metis-lib=/cygdrive/f/metis-5.1.0/build/libmetis
> --with-metis-include=/cygdrive/f/metis-5.1.0/include
> --with-cc="win32fe cl"
> --with-fc=0
> --with-cxx="win32fe cl"
> --with-shared-libraries=0
> --with-mpi-dir="/cygdrive/c/PROGRA~2/Intel/MPI"
> --with-mpiexec="/cygdrive/c/PROGRA~1/Microsoft_MPI/Bin/mpiexec"
> --with-blaslapack-lib="-L/cygdrive/c/PROGRA~2/Intel/oneAPI/mkl/latest/lib/intel64 mkl_intel_lp64_dll.lib mkl_sequential_dll.lib mkl_core_dll.lib"
>
> But I get the following error:
>
> --with-metis-lib=['/cygdrive/f/metis-5.1.0/build/libmetis'] and --with-metis-include=['/cygdrive/f/metis-5.1.0/include'] did not work
>
> configure_error_with_metis.log is the configure log for the METIS attempt.
>
> I also tried --download-metis, but that fails with "Error configuring METIS with CMake". If you want, I can send that configure.log file too.
>
> Please help me configure with these external packages; I have been stuck at this stage for a long time.
>
> Best regards,
> Ziming XIONG

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From tt73 at njit.edu Thu Apr 6 09:20:33 2023
From: tt73 at njit.edu (Takahashi, Tadanaga)
Date: Thu, 6 Apr 2023 10:20:33 -0400
Subject: [petsc-users] Question about NASM initialization
In-Reply-To: 
References: 
Message-ID: 

I am following up from the last inquiry. I read the source code nasm.c, and it looks like the sub-SNES iteration is being initialized with a scatter call from the previous solution. In other words, if I use Newton's method for the local solver, then in each NASM iteration Newton's method uses the previous local solution as the initial guess. Can anyone confirm this?

On Sun, Apr 2, 2023 at 6:14 PM Takahashi, Tadanaga wrote:

> Hello PETSc devs,
>
> I am using SNES NASM with Newton LS on the sub-SNES. I was wondering how the sub-SNES chooses the initial guess during each NASM iteration. Is it using the previously computed solution or is it restarting from zero?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com Thu Apr 6 09:24:50 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 6 Apr 2023 10:24:50 -0400
Subject: [petsc-users] Question about NASM initialization
In-Reply-To: 
References: 
Message-ID: 

On Thu, Apr 6, 2023 at 10:21 AM Takahashi, Tadanaga wrote:

> I am following up from the last inquiry. I read the source code nasm.c, and it looks like the sub-SNES iteration is being initialized with a scatter call from the previous solution. In other words, if I use Newton's method for the local solver, then in each NASM iteration Newton's method uses the previous local solution as the initial guess. Can anyone confirm this?

This is the intention. There are not many tests, so it is possible there is a bug, but it is supposed to use the existing solution.

  Thanks,

     Matt

> On Sun, Apr 2, 2023 at 6:14 PM Takahashi, Tadanaga wrote:
>
>> Hello PETSc devs,
>>
>> I am using SNES NASM with Newton LS on the sub-SNES. I was wondering how the sub-SNES chooses the initial guess during each NASM iteration. Is it using the previously computed solution or is it restarting from zero?

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
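For reference, a minimal command-line sketch of this setup (a sketch only - the executable name myapp is a placeholder, and it assumes the standard "sub_" options prefix that SNESNASM attaches to its subdomain solvers; verify the exact option names with -help):

# Outer nonlinear solver: NASM; subdomain solver: Newton with line search.
# Each outer iteration scatters the current global iterate into the subdomain
# vectors, so the sub-SNES is warm-started from the previous local solution.
mpiexec -n 4 ./myapp -snes_type nasm -snes_monitor -sub_snes_type newtonls -sub_snes_monitor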
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From tt73 at njit.edu Thu Apr 6 09:42:39 2023
From: tt73 at njit.edu (Takahashi, Tadanaga)
Date: Thu, 6 Apr 2023 10:42:39 -0400
Subject: [petsc-users] Question about NASM initialization
In-Reply-To: 
References: 
Message-ID: 

Ok, thanks for the clarification.

On Thu, Apr 6, 2023 at 10:25 AM Matthew Knepley wrote:

> This is the intention. There are not many tests, so it is possible there is a bug, but it is supposed to use the existing solution.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From gaochenyi14 at 163.com Thu Apr 6 11:19:18 2023
From: gaochenyi14 at 163.com (gaochenyi14)
Date: Fri, 7 Apr 2023 00:19:18 +0800
Subject: [petsc-users] How to use PETSc real version and complex version simultaneously?
In-Reply-To: 
References: <8DA5B0CD-CBC9-42D0-8804-E96298299DC5@163.com>
Message-ID: <9167783E-5E47-418D-9315-2239DB589542@163.com>

Thanks for the hint. I did a naive test and managed to use the same function with different typedefs. Basically, it relies on the fact that C has no name mangling while C++ does. Could this trick apply to PETSc?

The files for the test are attached. The procedure for compilation is in `main.cpp`.

Best regards,
C.-Y. GAO

> On Apr 6, 2023, at 02:04, Matthew Knepley wrote:
>
> On Wed, Apr 5, 2023 at 1:59 PM gaochenyi14 wrote:
>
> Hi,
>
> I rely on PETSc to deal with real and complex sparse matrices of dimension 1e4 * 1e4 or above. I want to use the real version when only real matrices are involved, to achieve better performance, and use the complex version only when complex matrices get involved. But in the manual it says different versions cannot be used at the same time. Can this restriction be circumvented? If not, where does the restriction come from?
>
> It is possible to do this, but it is cumbersome. You would have to compile both versions of the library, dlopen() them, and get the symbols you need. A group at Purdue has done this, but it is involved.
>
> We use typedefs to change between real and complex. It would not be difficult to allow storage in several types. However, prescribing how one type interacts with another, particularly when data is passed in or out, is challenging. This difficulty does not go away with templates since it is about type interaction, not polymorphism.
>
> Thanks,
>
>    Matt
>
> All the best,
> C.-Y. GAO
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
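For reference, a minimal sketch of the dlopen() route described above (the library names are hypothetical placeholders; the essential points are RTLD_LOCAL, which keeps the real and complex symbol tables from mixing, and looking each symbol up in its own handle):

#include <dlfcn.h>
#include <stdio.h>

/* PetscInitialize's C signature: (int*, char***, const char[], const char[]) */
typedef int (*InitFn)(int *, char ***, const char *, const char *);

int main(void)
{
  /* Hypothetical builds of the two PETSc configurations */
  void *real_lib    = dlopen("./libpetsc_real.so",    RTLD_NOW | RTLD_LOCAL);
  void *complex_lib = dlopen("./libpetsc_complex.so", RTLD_NOW | RTLD_LOCAL);
  if (!real_lib || !complex_lib) { fprintf(stderr, "%s\n", dlerror()); return 1; }

  /* PETSc is a C library, so there is no name mangling: both builds export
     the same symbol names, and each must be looked up per library */
  InitFn init_real    = (InitFn) dlsym(real_lib,    "PetscInitialize");
  InitFn init_complex = (InitFn) dlsym(complex_lib, "PetscInitialize");
  if (!init_real || !init_complex) { fprintf(stderr, "%s\n", dlerror()); return 1; }

  /* A real program would initialize and use one build per solve, and never
     pass objects created by one library to the other */
  dlclose(complex_lib);
  dlclose(real_lib);
  return 0;
}

Compile with -ldl on Linux. The caveat raised above still applies: nothing here lets the two builds exchange objects with each other.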
This difficulty does not go away with templates since it is about type interaction, not polymorphism. > > Thanks, > > Matt > > All the best, > C.-Y. GAO > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: name-conflict.zip Type: application/zip Size: 1339 bytes Desc: not available URL: -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Thu Apr 6 12:00:30 2023 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 6 Apr 2023 12:00:30 -0500 (CDT) Subject: [petsc-users] Installation issue of 3.18.* and 3.19.0 on Apple systems In-Reply-To: <1F8CE19F-4459-4A08-96BC-D63B3ACEDC03@uni-mainz.de> References: <6625F254-C1D9-4249-BD09-6D3BFAA8CDC2@uni-mainz.de> <3219E7BA-A189-48B2-A174-3DAE9EE7F8CC@uni-mainz.de> <9253F01D-EAC1-492F-8F73-0E234B63BEF1@uni-mainz.de> <832c3cf8-0d7d-91d1-9197-e1cf37b634ae@mcs.anl.gov> <2201BE8A-8E4F-4D52-AEB2-F51FCD557D89@uni-mainz.de> <23767F99-C0BD-4622-A13F-0A11BDC46A63@uni-mainz.de> <59b55873-a729-b85a-e2e2-89bb20531f9f@mcs.anl.gov> <52C2E466-C0F9-4E57-A0F8-7EA99D75104E@uni-mainz.de> <7b96989f-20f5-4bb0-f50a-8ba986484407@mcs.anl.gov> <1F8CE19F-4459-4A08-96BC-D63B3ACEDC03@uni-mainz.de> Message-ID: Great! I created MR for this change https://gitlab.com/petsc/petsc/-/merge_requests/6289 Satish On Thu, 6 Apr 2023, Kaus, Boris wrote: > Apologies, mistake on my side. Yes this works - thanks a lot for your help! > > Boris > > On 6. Apr 2023, at 00:56, Satish Balay wrote: > > This is strange. I can trigger the error with: > > - v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > + v = tuple([int('')]) > > and this fix is able to overcome it. > > I pushed this change to `balay/fix-macos-version-check` branch - can you try from it? 
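For reference, the mechanics of the bug and of the fix: in this
cross-compilation sandbox, uname reports Darwin but platform.mac_ver()
returns ('', ('', '', ''), ''), so int('') raises ValueError -- and
ValueError is not a subclass of RuntimeError, which is why the first
version of the patch (with "except RuntimeError:") still failed. A minimal
defensive-parsing sketch in the same spirit as the MR (illustrative only,
not the exact PETSc configure code):

    import platform

    def mac_version():
        # On emulated/cross-compile hosts mac_ver() yields empty strings,
        # so the int() conversion below raises ValueError.
        try:
            return tuple(int(a) for a in platform.mac_ver()[0].split('.'))
        except ValueError:
            return None  # version detection failed; treat as non-Catalina

    v = mac_version()
    if v is not None and v >= (10, 15, 0):
        print('Detected Darwin/MacOSX Catalina OS')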
> > Satish > > On Wed, 5 Apr 2023, Kaus, Boris wrote: > > I don?t understand why this keeps giving the same error (shouldn?t): > > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ > ******************************************************************************* > TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure > ------------------------------------------------------------------------------- > invalid literal for int() with base 10: '' > ******************************************************************************* > > > File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure > framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__ > self.createChildren() > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren > self.getChild(moduleName) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild > config.setupDependencies(self) > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies > self.registerPythonFile(utility,'config.utilities') > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile > utilityObj = self.framework.require(directory+utilityName, self) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require > config = self.getChild(moduleName, keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild > config = type(self, *keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__ > self.isDarwin = config.setCompilers.Configure.isDarwin(self.log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 676, in isDarwin > if not isUname_value: config.setCompilers.Configure.isUname(log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in isUname > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > > > On 5. Apr 2023, at 23:58, Satish Balay wrote: > > Ah - my patch was buggy. 
> > + except RuntimeError: > > should be: 'except:' i.e: > > diff --git a/config/BuildSystem/config/setCompilers.py b/config/BuildSystem/config/setCompilers.py > index 57848e736e3..e191c1e1b4d 100644 > --- a/config/BuildSystem/config/setCompilers.py > +++ b/config/BuildSystem/config/setCompilers.py > @@ -630,10 +630,14 @@ class Configure(config.base.Configure): > if log: log.write('Detected Darwin') > isDarwin_value = True > import platform > - v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > - if v >= (10,15,0): > - if log: log.write('Detected Darwin/MacOSX Catalina OS\n') > - isDarwinCatalina_value = True > + try: > + v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > + if v >= (10,15,0): > + if log: log.write('Detected Darwin/MacOSX Catalina OS\n') > + isDarwinCatalina_value = True > + except: > + if log: log.write('MacOS version detecton failed!\n') > + pass > if output.find('freebsd') >= 0: > if log: log.write('Detected FreeBSD') > isFreeBSD_value = True > > Satish > > > > > On Wed, 5 Apr 2023, Kaus, Boris wrote: > > Perhaps python is broken, or perhaps this is because it is not a real Mac OS, but an emulated one. > > Seems a similar error now occurs a bit down the line: > > --- > > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ > ******************************************************************************* > TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure > ------------------------------------------------------------------------------- > invalid literal for int() with base 10: '' > ******************************************************************************* > > > File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure > framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__ > self.createChildren() > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren > self.getChild(moduleName) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild > config.setupDependencies(self) > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies > self.registerPythonFile(utility,'config.utilities') > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile > utilityObj = self.framework.require(directory+utilityName, self) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require > config = self.getChild(moduleName, keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild > config = type(self, *keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__ > self.isDarwin = config.setCompilers.Configure.isDarwin(self.log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 668, in isDarwin > if not isUname_value: config.setCompilers.Configure.isUname(log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in isUname > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > File 
"/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 631, in > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > > > > > > On 5. Apr 2023, at 23:11, Satish Balay wrote: > > Hm - broken python? Either way configure should not fail. Perhaps the following fix: > > Satish > > --- > > diff --git a/config/BuildSystem/config/setCompilers.py b/config/BuildSystem/config/setCompilers.py > index e4d13bea58f..ae53d1e397e 100644 > --- a/config/BuildSystem/config/setCompilers.py > +++ b/config/BuildSystem/config/setCompilers.py > @@ -626,10 +626,14 @@ class Configure(config.base.Configure): > if log: log.write('Detected Darwin') > isDarwin_value = True > import platform > - v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > - if v >= (10,15,0): > - if log: log.write('Detected Darwin/MacOSX Catalina OS\n') > - isDarwinCatalina_value = True > + try: > + v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > + if v >= (10,15,0): > + if log: log.write('Detected Darwin/MacOSX Catalina OS\n') > + isDarwinCatalina_value = True > + except RuntimeError: > + if log: log.write('MacOS version detecton failed!\n') > + pass > if output.find('freebsd') >= 0: > if log: log.write('Detected FreeBSD') > isFreeBSD_value = True > > On Wed, 5 Apr 2023, Kaus, Boris wrote: > > That indeed seems to be the issue: > > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # python3 > Python 3.9.7 (default, Nov 24 2021, 21:15:59) > [GCC 10.3.1 20211027] on linux > Type "help", "copyright", "credits" or "license" for more information. > import platform > platform.mac_ver() > ('', ('', '', ''), '') > platform.mac_ver()[0].split('.') > [''] > tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > Traceback (most recent call last): > File "", line 1, in > File "", line 1, in > ValueError: invalid literal for int() with base 10: ?' > > > On 5. Apr 2023, at 23:00, Satish Balay wrote: > > Sorry, Was looking at the wrong place. > > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > > Can you try: > > balay at ypro petsc % python3 > Python 3.9.6 (default, Mar 10 2023, 20:16:38) > [Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin > Type "help", "copyright", "credits" or "license" for more information. 
> import platform > platform.mac_ver() > ('13.3', ('', '', ''), 'x86_64') > platform.mac_ver()[0].split('.') > ['13', '3'] > tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > (13, 3) > > > > Satish > > > > > On Wed, 5 Apr 2023, Kaus, Boris wrote: > > Don?t think so: > > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # env > _=/usr/bin/env > VERBOSE=true > BUILD_LD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld > OLDPWD=/workspace/srcdir/petsc-3.18.0 > host_libdir=/workspace/x86_64-linux-musl-cxx11/destdir/lib > nproc=8 > target=aarch64-apple-darwin20 > bindir=/workspace/destdir/bin > CC=cc > READELF_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf > host_bindir=/workspace/x86_64-linux-musl-cxx11/destdir/bin > PATH=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi:/opt/aarch64-apple-darwin20/bin:/opt/bin/x86_64-linux-musl-cxx11:/opt/x86_64-linux-musl/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/workspace/x86_64-linux-musl-cxx11/destdir/bin:/workspace/destdir/bin > nbits=64 > BUILD_STRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip > BUILD_OBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump > CMAKE_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.cmake > FC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran > FC=gfortran > SRC_NAME=PETSc > RANLIB_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib > CC_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc > PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$ > PKG_CONFIG_SYSROOT_DIR=/workspace/destdir > LD_LIBRARY_PATH=/usr/lib/csl-musl-x86_64:/usr/local/lib64:/usr/local/lib:/usr/lib64:/usr/lib:/lib64:/lib:/workspace/x86_64-linux-musl-cxx11/destdir/lib:/opt/x86_64-linux-musl/x86_64-linux-musl/lib64:/opt/x86_64-linux-musl/x86_64-linux-musl/lib:/opt/aarch64-apple-darwin20/aarch64-apple-darwin20/lib:/opt/aarch64-apple-darwin20/lib:/workspace/destdir/lib64:/workspace/destdir/lib > HOSTOBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy > HOSTOBJDUMP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump > LIPO_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo > HOSTSTRIP=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip > BUILD_OBJCOPY=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy > ZERO_AR_DATE=1 > dlext=dylib > HIDDEN_PS1=\[\]sandbox\[\]:\[\]${PWD//$WORKSPACE/$\{WORKSPACE\}}\[\] \$ > CCACHE_COMPILERCHECK=content > AR_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar > HOSTDSYMUTIL=dsymutil > SHLVL=1 > OBJDUMP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump > CXX_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ > HOSTCXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ > USER=kausb > BUILD_DSYMUTIL=dsymutil > CC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc > OBJCOPY_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy > TERM=screen > LIPO_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo > BUILD_LIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo > NM_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm > host_prefix=/workspace/x86_64-linux-musl-cxx11/destdir > FC_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran > AR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar > WORKSPACE=/workspace > 
STRIP_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip > HOSTRANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib > RANLIB_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib > DSYMUTIL_FOR_BUILD=dsymutil > HOSTAS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as > HOSTAR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar > BUILD_RANLIB=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ranlib > NM_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm > LD=/opt/bin/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/ld > HOSTLD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld > bb_full_target=aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi > LLVM_TARGET=aarch64-apple-darwin20 > BUILD_READELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf > CXX_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ > libdir=/workspace/destdir/lib > MESON_TARGET_TOOLCHAIN=/opt/toolchains/aarch64-apple-darwin20-libgfortran5-cxx11-mpi+openmpi/target_aarch64-apple-darwin20.meson > LLVM_HOST_TARGET=x86_64-linux-musl > STRIP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-strip > AS_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as > HISTFILE=/meta/.bash_history > HOME=/root > HOSTLIPO=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-lipo > includedir=/workspace/destdir/include > MESON_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.meson > BUILD_FC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran > V=true > BUILD_CC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc > HOSTCC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gcc > AS_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as > CXX=c++ > rust_target=aarch64-apple-darwin > rust_host=x86_64-unknown-linux-musl > HOSTFC=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-gfortran > exeext= > READELF_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf > bb_target=aarch64-apple-darwin20 > SOURCE_DATE_EPOCH=0 > PWD=/workspace/srcdir/petsc-3.18.0 > MACOSX_DEPLOYMENT_TARGET=11.0 > proc_family=arm > BUILD_NM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm > BUILD_CXX=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-g++ > LD_FOR_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld > OBJDUMP_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objdump > OBJCOPY_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-objcopy > HOSTNM=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-nm > USE_CCACHE=false > BUILD_AR=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ar > BUILD_AS=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-as > prefix=/workspace/destdir > HOSTNAME=271f88c24b60 > CHARSET=UTF-8 > PKG_CONFIG_PATH=/workspace/destdir/lib/pkgconfig:/workspace/destdir/lib64/pkgconfig:/workspace/destdir/share/pkgconfig > MACHTYPE=x86_64-linux-musl > HOSTREADELF=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-readelf > DSYMUTIL_BUILD=dsymutil > LD_BUILD=/opt/bin/x86_64-linux-musl-cxx11/x86_64-linux-musl-ld > host_includedir=/workspace/x86_64-linux-musl-cxx11/destdir/include > CMAKE_HOST_TOOLCHAIN=/opt/toolchains/x86_64-linux-musl-cxx11/host_x86_64-linux-musl.cmake > SHELL=/bin/bash > > > On 5. Apr 2023, at 22:45, Satish Balay wrote: > > Well this doesn't trigger the error for me. Do you have any env variables set with unicode [non-ascii] chars? 
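(A quick way to check for that, as a minimal standard-library sketch;
str.isascii() needs Python 3.7+, which holds for the 3.9 sandbox above:

    import os
    bad = {k: v for k, v in os.environ.items() if not (k + v).isascii()}
    print(bad if bad else 'no non-ASCII environment entries')

)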
> > Satish > > --- > > balay at ypro petsc-3.19.0 % sw_vers > ProductName: macOS > ProductVersion: 13.3 > BuildVersion: 22E252 > balay at ypro petsc-3.19.0 % ./configure --with-mpi=0 --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ > ============================================================================================= > Configuring PETSc to compile on your system > ============================================================================================= > ============================================================================================= > ***** WARNING ***** > You have a version of GNU make older than 4.0. It will work, but may not support all the > parallel testing options. You can install the latest GNU make with your package manager, > such as Brew or MacPorts, or use the --download-make option to get the latest GNU make > ============================================================================================= > Compilers: > C Compiler: gcc -fPIC -Wall -Wwrite-strings -Wno-unknown-pragmas -fstack-protector -fno-stack-check -Qunused-arguments -fvisibility=hidden -g3 -O0 > Version: Apple clang version 14.0.3 (clang-1403.0.22.14.1) > ... > ... > > > On Wed, 5 Apr 2023, Kaus, Boris wrote: > > It can be reproduced with this: > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # ./configure --prefix=/workspace/destdir/lib/petsc/double_real_Int32/ > ******************************************************************************* > TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure > ------------------------------------------------------------------------------- > invalid literal for int() with base 10: '' > ******************************************************************************* > > > File "/workspace/srcdir/petsc-3.18.0/config/configure.py", line 457, in petsc_configure > framework = config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:], loadArgDB = 0) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 101, in __init__ > self.createChildren() > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 338, in createChildren > self.getChild(moduleName) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 323, in getChild > config.setupDependencies(self) > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 89, in setupDependencies > self.registerPythonFile(utility,'config.utilities') > File "/workspace/srcdir/petsc-3.18.0/config/PETSc/Configure.py", line 49, in registerPythonFile > utilityObj = self.framework.require(directory+utilityName, self) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 343, in require > config = self.getChild(moduleName, keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/framework.py", line 317, in getChild > config = type(self, *keywordArgs) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/utilities/macosFirewall.py", line 12, in __init__ > self.isDarwin = config.setCompilers.Configure.isDarwin(self.log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 664, in isDarwin > if not isUname_value: config.setCompilers.Configure.isUname(log) > File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in isUname > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > 
File "/workspace/srcdir/petsc-3.18.0/config/BuildSystem/config/setCompilers.py", line 630, in > v = tuple([int(a) for a in platform.mac_ver()[0].split('.')]) > > > On 5. Apr 2023, at 22:32, Stefano Zampini wrote: > > It seems there's some typo/error in the configure command that is being executed. Can you post it here? > > Il giorno mer 5 apr 2023 alle ore 23:18 Kaus, Boris > ha scritto: > Hi everyone, > > I?m trying to install precompiled binaries for PETSc 3.18.5 & 3.19.0 using the BinaryBuilder cross-compilation: > https://github.com/JuliaPackaging/Yggdrasil/pull/6533, which mostly works fine: https://buildkite.com/julialang/yggdrasil/builds/2093). > > Yet, on apple systems I receive a somewhat weird bug during the configure step: > > [22:08:49] ******************************************************************************* > [22:08:49] TypeError or ValueError possibly related to ERROR in COMMAND LINE ARGUMENT while running ./configure > [22:08:49] ------------------------------------------------------------------------------- > [22:08:49] invalid literal for int() with base 10: '' > [22:08:49] ******************************************************************************* > [22:08:49] > [22:08:49] > [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory > [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. > [22:08:49] /workspace/srcdir/petsc-3.18.0/lib/petsc/conf/rules:860: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules: No such file or directory > [22:08:49] make[1]: *** No rule to make target '/workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscrules'. Stop. > [22:08:49] make: *** [GNUmakefile:17: /workspace/srcdir/petsc-3.18.0//lib/petsc/conf/petscvariables] Error 2 > [22:08:49] make: *** Waiting for unfinished jobs.... > [22:08:49] make: *** [GNUmakefile:17: lib/petsc/conf/petscvariables] Error 2 > > The log file is rather brief: > > sandbox:${WORKSPACE}/srcdir/petsc-3.18.0 # more configure.log > Executing: uname -s > stdout: Darwin > > It works fine for PETSc 3.16.5/3.17.5, and this first occurs in 3.18.0. > Is there something that changed between 3.17 & 3.18 that could cause this? > > The build system seems to use python3.9 (3.4+ as required) > > Thanks! > Boris > > > > > > > -- > Stefano > > From appiazzolla at gmail.com Fri Apr 7 09:06:00 2023 From: appiazzolla at gmail.com (Astor Piaz) Date: Fri, 7 Apr 2023 08:06:00 -0600 Subject: [petsc-users] MPI+OpenMP+MKL Message-ID: Hello petsc-users, I am trying to use a code that is parallelized with a combination of OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI processes. I have carefully scheduled the processes such that the right amount is launched, at the right time. When trying to use my code inside a MatShell (for later use in an FGMRES KSPSolver), MKL processes are not being used. I am sorry if this has been asked before. What configuration should I use in order to profit from MPI+OpenMP+MKL parallelism? Thank you! -- Astor -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Fri Apr 7 09:10:11 2023 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 7 Apr 2023 10:10:11 -0400 Subject: [petsc-users] MPI+OpenMP+MKL In-Reply-To: References: Message-ID: On Fri, Apr 7, 2023 at 10:06?AM Astor Piaz wrote: > Hello petsc-users, > I am trying to use a code that is parallelized with a combination of > OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI > processes. > I have carefully scheduled the processes such that the right amount is > launched, at the right time. > When trying to use my code inside a MatShell (for later use in an FGMRES > KSPSolver), MKL processes are not being used. > > I am sorry if this has been asked before. > What configuration should I use in order to profit from MPI+OpenMP+MKL > parallelism? > You should configure using --with-threadsafety Thanks, Matt > Thank you! > -- > Astor > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Fri Apr 7 09:57:35 2023 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Fri, 7 Apr 2023 09:57:35 -0500 Subject: [petsc-users] MPI+OpenMP+MKL In-Reply-To: References: Message-ID: > OpenMP threads are able to spawn MPI processes I am curious why you have this usage. Is it because that you want a pure OpenMP code (i.e., not MPI capable) to call petsc? --Junchao Zhang On Fri, Apr 7, 2023 at 9:06?AM Astor Piaz wrote: > Hello petsc-users, > I am trying to use a code that is parallelized with a combination of > OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI > processes. > I have carefully scheduled the processes such that the right amount is > launched, at the right time. > When trying to use my code inside a MatShell (for later use in an FGMRES > KSPSolver), MKL processes are not being used. > > I am sorry if this has been asked before. > What configuration should I use in order to profit from MPI+OpenMP+MKL > parallelism? > > Thank you! > -- > Astor > -------------- next part -------------- An HTML attachment was scrubbed... URL: From appiazzolla at gmail.com Fri Apr 7 13:26:35 2023 From: appiazzolla at gmail.com (Astor Piaz) Date: Fri, 7 Apr 2023 12:26:35 -0600 Subject: [petsc-users] MPI+OpenMP+MKL In-Reply-To: References: Message-ID: Hi Matthew, Jungchau, Thank you for your advice. The code still does not work, I give more details about it below, I can specify more about it as you wish. I am implementing a spectral method resulting in a block matrix where the off-diagonal blocks are Poincare-Steklov operators of impedance-to-impedance type. Those Poincare-Steklov operators have been created hierarchically merging subdomain operators (the HPS method), and I have a well tuned (but rather complex) OpenMP+MKL code that can apply this operator very fast. I would like to use PETSc's MPI-parallel GMRES solver with a MatShell that calls my OpenMP+MKL code, while each block can be in a different MPI process. At the moment the code runs correctly, except that PETSc is not letting my OpenMP+MKL code make the scheduling of threads as I choose. 
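For context, the generic shape of such a MATSHELL is roughly the following
-- a minimal sketch, not the poster's actual code; UserApplyOperator is a
hypothetical stand-in for the threaded OpenMP+MKL apply routine:

    #include <petscmat.h>

    typedef struct {
      PetscInt n; /* plus whatever the apply code needs */
    } AppCtx;

    /* user routine that opens its own OpenMP region and calls MKL */
    extern void UserApplyOperator(AppCtx *, const PetscScalar *, PetscScalar *);

    static PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
    {
      AppCtx            *ctx;
      const PetscScalar *xa;
      PetscScalar       *ya;

      PetscFunctionBeginUser;
      PetscCall(MatShellGetContext(A, &ctx));
      PetscCall(VecGetArrayRead(x, &xa));
      PetscCall(VecGetArray(y, &ya));
      UserApplyOperator(ctx, xa, ya); /* threading is managed by the user code */
      PetscCall(VecRestoreArray(y, &ya));
      PetscCall(VecRestoreArrayRead(x, &xa));
      PetscFunctionReturn(PETSC_SUCCESS);
    }

    /* usage:
       MatCreateShell(comm, nloc, nloc, N, N, &ctx, &A);
       MatShellSetOperation(A, MATOP_MULT, (void (*)(void))MyMatMult); */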
I am using ./configure --with-scalar-type=complex --prefix=../install/fast/ --with-debugging=0 -with-openmp=1 --with-blaslapack-dir=${MKLROOT} --with-mkl_cpardiso-dir=${MKLROOT} --with-threadsafety --with-log=0 COPTFLAGS=-g -Ofast CXXOPTFLAGS=-g -Ofast FOPTFLAGS=-g -Ofast Attached is an image of htop showing that the MKL threads are indeed being spawn, but they remain unused by the code. The previous calculations on the code show that it is capable of using OpenMP and MKL, only when PETSC KSPSolver is called MKL seems to be turned off. On Fri, Apr 7, 2023 at 8:10?AM Matthew Knepley wrote: > On Fri, Apr 7, 2023 at 10:06?AM Astor Piaz wrote: > >> Hello petsc-users, >> I am trying to use a code that is parallelized with a combination of >> OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI >> processes. >> I have carefully scheduled the processes such that the right amount is >> launched, at the right time. >> When trying to use my code inside a MatShell (for later use in an FGMRES >> KSPSolver), MKL processes are not being used. >> >> I am sorry if this has been asked before. >> What configuration should I use in order to profit from MPI+OpenMP+MKL >> parallelism? >> > > You should configure using --with-threadsafety > > Thanks, > > Matt > > >> Thank you! >> -- >> Astor >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: htop.png Type: image/png Size: 298746 bytes Desc: not available URL: From knepley at gmail.com Fri Apr 7 14:25:11 2023 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 7 Apr 2023 15:25:11 -0400 Subject: [petsc-users] MPI+OpenMP+MKL In-Reply-To: References: Message-ID: On Fri, Apr 7, 2023 at 2:26?PM Astor Piaz wrote: > Hi Matthew, Jungchau, > Thank you for your advice. The code still does not work, I give more > details about it below, I can specify more about it as you wish. > > I am implementing a spectral method resulting in a block matrix where the > off-diagonal blocks are Poincare-Steklov operators of > impedance-to-impedance type. > Those Poincare-Steklov operators have been created hierarchically merging > subdomain operators (the HPS method), and I have a well tuned (but rather > complex) OpenMP+MKL code that can apply this operator very fast. > I would like to use PETSc's MPI-parallel GMRES solver with a MatShell that > calls my OpenMP+MKL code, while each block can be in a different MPI > process. > > At the moment the code runs correctly, except that PETSc is not letting my > OpenMP+MKL code make the scheduling of threads as I choose. > PETSc does not say anything about OpenMP threads. However, maybe you need to launch the executable with the correct OMP env variables? Thanks, Matt > I am using > ./configure --with-scalar-type=complex --prefix=../install/fast/ > --with-debugging=0 -with-openmp=1 --with-blaslapack-dir=${MKLROOT} > --with-mkl_cpardiso-dir=${MKLROOT} --with-threadsafety --with-log=0 > COPTFLAGS=-g -Ofast CXXOPTFLAGS=-g -Ofast FOPTFLAGS=-g -Ofast > > Attached is an image of htop showing that the MKL threads are indeed being > spawn, but they remain unused by the code. 
The previous calculations on the > code show that it is capable of using OpenMP and MKL, only when PETSC > KSPSolver is called MKL seems to be turned off. > > On Fri, Apr 7, 2023 at 8:10?AM Matthew Knepley wrote: > >> On Fri, Apr 7, 2023 at 10:06?AM Astor Piaz wrote: >> >>> Hello petsc-users, >>> I am trying to use a code that is parallelized with a combination of >>> OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI >>> processes. >>> I have carefully scheduled the processes such that the right amount is >>> launched, at the right time. >>> When trying to use my code inside a MatShell (for later use in an FGMRES >>> KSPSolver), MKL processes are not being used. >>> >>> I am sorry if this has been asked before. >>> What configuration should I use in order to profit from MPI+OpenMP+MKL >>> parallelism? >>> >> >> You should configure using --with-threadsafety >> >> Thanks, >> >> Matt >> >> >>> Thank you! >>> -- >>> Astor >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From appiazzolla at gmail.com Fri Apr 7 19:17:15 2023 From: appiazzolla at gmail.com (Astor Piaz) Date: Fri, 7 Apr 2023 18:17:15 -0600 Subject: [petsc-users] MPI+OpenMP+MKL In-Reply-To: References: Message-ID: Thanks for your reply Matt. The problem seems to be the MKL threads I just realized. Inside the MatShell I call: call omp_set_nested(.true.) call omp_set_dynamic(.false.) call mkl_set_dynamic(0) Then, inside the omp single thread I use: nMkl0 = mkl_set_num_threads_local(nMkl) where nMkl is set to 24 MKL_VERBOSE shows, that the calls to have access to 24 threads but the timings are the same as in 1 thread MKL_VERBOSE ZGEMV(N,12544,12544,0x7ffde9edc800,0x14e4662d2010,12544,0x14985e610,1,0x7ffde9edc7f0,0x189faaa90,1) 117.09ms CNR:OFF Dyn:0 FastMM:1 TID:0 NThr:24 MKL_VERBOSE ZGEMV(N,12544,12544,0x7ffe00355700,0x14c8ec1e4010,12544,0x16959c830,1,0x7ffe003556f0,0x17dd7da70,1) 117.37ms CNR:OFF Dyn:0 FastMM:1 TID:0 NThr:1 The configuration of OpenMP that is launching these MKL processes is as follows: OPENMP DISPLAY ENVIRONMENT BEGIN _OPENMP = '201511' OMP_DYNAMIC = 'FALSE' OMP_NESTED = 'TRUE' OMP_NUM_THREADS = '24' OMP_SCHEDULE = 'DYNAMIC' OMP_PROC_BIND = 'TRUE' OMP_PLACES = '{0:24}' OMP_STACKSIZE = '0' OMP_WAIT_POLICY = 'PASSIVE' OMP_THREAD_LIMIT = '4294967295' OMP_MAX_ACTIVE_LEVELS = '255' OMP_CANCELLATION = 'FALSE' OMP_DEFAULT_DEVICE = '0' OMP_MAX_TASK_PRIORITY = '0' OMP_DISPLAY_AFFINITY = 'FALSE' OMP_AFFINITY_FORMAT = 'level %L thread %i affinity %A' OMP_ALLOCATOR = 'omp_default_mem_alloc' OMP_TARGET_OFFLOAD = 'DEFAULT' GOMP_CPU_AFFINITY = '' GOMP_STACKSIZE = '0' GOMP_SPINCOUNT = '300000' OPENMP DISPLAY ENVIRONMENT END On Fri, Apr 7, 2023 at 1:25?PM Matthew Knepley wrote: > On Fri, Apr 7, 2023 at 2:26?PM Astor Piaz wrote: > >> Hi Matthew, Jungchau, >> Thank you for your advice. The code still does not work, I give more >> details about it below, I can specify more about it as you wish. 
>> >> I am implementing a spectral method resulting in a block matrix where the >> off-diagonal blocks are Poincare-Steklov operators of >> impedance-to-impedance type. >> Those Poincare-Steklov operators have been created hierarchically merging >> subdomain operators (the HPS method), and I have a well tuned (but rather >> complex) OpenMP+MKL code that can apply this operator very fast. >> I would like to use PETSc's MPI-parallel GMRES solver with a MatShell >> that calls my OpenMP+MKL code, while each block can be in a different MPI >> process. >> >> At the moment the code runs correctly, except that PETSc is not letting >> my OpenMP+MKL code make the scheduling of threads as I choose. >> > > PETSc does not say anything about OpenMP threads. However, maybe you need > to launch the executable with the correct OMP env variables? > > Thanks, > > Matt > > >> I am using >> ./configure --with-scalar-type=complex --prefix=../install/fast/ >> --with-debugging=0 -with-openmp=1 --with-blaslapack-dir=${MKLROOT} >> --with-mkl_cpardiso-dir=${MKLROOT} --with-threadsafety --with-log=0 >> COPTFLAGS=-g -Ofast CXXOPTFLAGS=-g -Ofast FOPTFLAGS=-g -Ofast >> >> Attached is an image of htop showing that the MKL threads are indeed >> being spawn, but they remain unused by the code. The previous calculations >> on the code show that it is capable of using OpenMP and MKL, only when >> PETSC KSPSolver is called MKL seems to be turned off. >> >> On Fri, Apr 7, 2023 at 8:10?AM Matthew Knepley wrote: >> >>> On Fri, Apr 7, 2023 at 10:06?AM Astor Piaz >>> wrote: >>> >>>> Hello petsc-users, >>>> I am trying to use a code that is parallelized with a combination of >>>> OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI >>>> processes. >>>> I have carefully scheduled the processes such that the right amount is >>>> launched, at the right time. >>>> When trying to use my code inside a MatShell (for later use in an >>>> FGMRES KSPSolver), MKL processes are not being used. >>>> >>>> I am sorry if this has been asked before. >>>> What configuration should I use in order to profit from MPI+OpenMP+MKL >>>> parallelism? >>>> >>> >>> You should configure using --with-threadsafety >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Thank you! >>>> -- >>>> Astor >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Fri Apr 7 21:29:03 2023 From: dave.mayhem23 at gmail.com (Dave May) Date: Fri, 7 Apr 2023 19:29:03 -0700 Subject: [petsc-users] MPI+OpenMP+MKL In-Reply-To: References: Message-ID: On Fri 7. Apr 2023 at 07:06, Astor Piaz wrote: > Hello petsc-users, > I am trying to use a code that is parallelized with a combination of > OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI > processes. > Is this really the correct way to go? 
Would it not be more suitable (or simpler) to run your application on an MPI sub communicator which maps one rank to say one compute node, and then within each rank of the sub comm you utilize your threaded OpenMP / MKL code using as many physical threads as there are cores/ node (and or hyper threads if that?s is effective for you)? Thanks, Dave I have carefully scheduled the processes such that the right amount is > launched, at the right time. > When trying to use my code inside a MatShell (for later use in an FGMRES > KSPSolver), MKL processes are not being used. > > I am sorry if this has been asked before. > What configuration should I use in order to profit from MPI+OpenMP+MKL > parallelism? > > Thank you! > -- > Astor > -------------- next part -------------- An HTML attachment was scrubbed... URL: From appiazzolla at gmail.com Fri Apr 7 21:45:06 2023 From: appiazzolla at gmail.com (Astor Piaz) Date: Fri, 7 Apr 2023 20:45:06 -0600 Subject: [petsc-users] MPI+OpenMP+MKL In-Reply-To: References: Message-ID: I'm sorry I meant OpenMP threads are able to spawn MKL processes On Fri, Apr 7, 2023 at 8:29?PM Dave May wrote: > > > On Fri 7. Apr 2023 at 07:06, Astor Piaz wrote: > >> Hello petsc-users, >> I am trying to use a code that is parallelized with a combination of >> OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI >> processes. >> > > Is this really the correct way to go? > > > Would it not be more suitable (or simpler) to run your application on an > MPI sub communicator which maps one rank to say one compute node, and then > within each rank of the sub comm you utilize your threaded OpenMP / MKL > code using as many physical threads as there are cores/ node (and or hyper > threads if that?s is effective for you)? > > Thanks, > Dave > > I have carefully scheduled the processes such that the right amount is >> launched, at the right time. >> When trying to use my code inside a MatShell (for later use in an FGMRES >> KSPSolver), MKL processes are not being used. >> >> I am sorry if this has been asked before. >> What configuration should I use in order to profit from MPI+OpenMP+MKL >> parallelism? >> >> Thank you! >> -- >> Astor >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From junchao.zhang at gmail.com Fri Apr 7 22:29:41 2023 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Fri, 7 Apr 2023 22:29:41 -0500 Subject: [petsc-users] MPI+OpenMP+MKL In-Reply-To: References: Message-ID: I don't know OpenMP, but I saw these in your configure OMP_PROC_BIND = 'TRUE' OMP_PLACES = '{0:24}' Try not to do any binding and let OS freely schedule threads. --Junchao Zhang On Fri, Apr 7, 2023 at 7:17?PM Astor Piaz wrote: > Thanks for your reply Matt. > > The problem seems to be the MKL threads I just realized. > > Inside the MatShell I call: > > call omp_set_nested(.true.) > call omp_set_dynamic(.false.) 
> call mkl_set_dynamic(0) > > Then, inside the omp single thread I use: > > nMkl0 = mkl_set_num_threads_local(nMkl) > > where nMkl is set to 24 > > MKL_VERBOSE shows, that the calls to have access to 24 threads but the > timings are the same as in 1 thread > > MKL_VERBOSE > ZGEMV(N,12544,12544,0x7ffde9edc800,0x14e4662d2010,12544,0x14985e610,1,0x7ffde9edc7f0,0x189faaa90,1) > 117.09ms CNR:OFF Dyn:0 FastMM:1 TID:0 NThr:24 > MKL_VERBOSE > ZGEMV(N,12544,12544,0x7ffe00355700,0x14c8ec1e4010,12544,0x16959c830,1,0x7ffe003556f0,0x17dd7da70,1) > 117.37ms CNR:OFF Dyn:0 FastMM:1 TID:0 NThr:1 > > The configuration of OpenMP that is launching these MKL processes is as > follows: > > OPENMP DISPLAY ENVIRONMENT BEGIN > _OPENMP = '201511' > OMP_DYNAMIC = 'FALSE' > OMP_NESTED = 'TRUE' > OMP_NUM_THREADS = '24' > OMP_SCHEDULE = 'DYNAMIC' > OMP_PROC_BIND = 'TRUE' > OMP_PLACES = '{0:24}' > OMP_STACKSIZE = '0' > OMP_WAIT_POLICY = 'PASSIVE' > OMP_THREAD_LIMIT = '4294967295' > OMP_MAX_ACTIVE_LEVELS = '255' > OMP_CANCELLATION = 'FALSE' > OMP_DEFAULT_DEVICE = '0' > OMP_MAX_TASK_PRIORITY = '0' > OMP_DISPLAY_AFFINITY = 'FALSE' > OMP_AFFINITY_FORMAT = 'level %L thread %i affinity %A' > OMP_ALLOCATOR = 'omp_default_mem_alloc' > OMP_TARGET_OFFLOAD = 'DEFAULT' > GOMP_CPU_AFFINITY = '' > GOMP_STACKSIZE = '0' > GOMP_SPINCOUNT = '300000' > OPENMP DISPLAY ENVIRONMENT END > > > > On Fri, Apr 7, 2023 at 1:25?PM Matthew Knepley wrote: > >> On Fri, Apr 7, 2023 at 2:26?PM Astor Piaz wrote: >> >>> Hi Matthew, Jungchau, >>> Thank you for your advice. The code still does not work, I give more >>> details about it below, I can specify more about it as you wish. >>> >>> I am implementing a spectral method resulting in a block matrix where >>> the off-diagonal blocks are Poincare-Steklov operators of >>> impedance-to-impedance type. >>> Those Poincare-Steklov operators have been created hierarchically >>> merging subdomain operators (the HPS method), and I have a well tuned (but >>> rather complex) OpenMP+MKL code that can apply this operator very fast. >>> I would like to use PETSc's MPI-parallel GMRES solver with a MatShell >>> that calls my OpenMP+MKL code, while each block can be in a different MPI >>> process. >>> >>> At the moment the code runs correctly, except that PETSc is not letting >>> my OpenMP+MKL code make the scheduling of threads as I choose. >>> >> >> PETSc does not say anything about OpenMP threads. However, maybe you need >> to launch the executable with the correct OMP env variables? >> >> Thanks, >> >> Matt >> >> >>> I am using >>> ./configure --with-scalar-type=complex --prefix=../install/fast/ >>> --with-debugging=0 -with-openmp=1 --with-blaslapack-dir=${MKLROOT} >>> --with-mkl_cpardiso-dir=${MKLROOT} --with-threadsafety --with-log=0 >>> COPTFLAGS=-g -Ofast CXXOPTFLAGS=-g -Ofast FOPTFLAGS=-g -Ofast >>> >>> Attached is an image of htop showing that the MKL threads are indeed >>> being spawn, but they remain unused by the code. The previous calculations >>> on the code show that it is capable of using OpenMP and MKL, only when >>> PETSC KSPSolver is called MKL seems to be turned off. >>> >>> On Fri, Apr 7, 2023 at 8:10?AM Matthew Knepley >>> wrote: >>> >>>> On Fri, Apr 7, 2023 at 10:06?AM Astor Piaz >>>> wrote: >>>> >>>>> Hello petsc-users, >>>>> I am trying to use a code that is parallelized with a combination of >>>>> OpenMP and MKL parallelisms, where OpenMP threads are able to spawn MPI >>>>> processes. 
>>>>> I have carefully scheduled the processes such that the right amount is >>>>> launched, at the right time. >>>>> When trying to use my code inside a MatShell (for later use in an >>>>> FGMRES KSPSolver), MKL processes are not being used. >>>>> >>>>> I am sorry if this has been asked before. >>>>> What configuration should I use in order to profit from MPI+OpenMP+MKL >>>>> parallelism? >>>>> >>>> >>>> You should configure using --with-threadsafety >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> Thank you! >>>>> -- >>>>> Astor >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhangzhiyu20 at mails.ucas.ac.cn Tue Apr 11 08:49:09 2023 From: zhangzhiyu20 at mails.ucas.ac.cn (=?UTF-8?B?5byg5rK75oSa?=) Date: Tue, 11 Apr 2023 21:49:09 +0800 (GMT+08:00) Subject: [petsc-users] Issue accessing SLEPc documentation Message-ID: <6af35cb3.8297.1877093d522.Coremail.zhangzhiyu20@mails.ucas.ac.cn> Dear SLEPc support team, I am a new user of SLEPc and I am having trouble accessing the online documentation. When I try to access the link https://slepc.upv.es/documentation/current/docs/manualpages/EPS/index.html, the webpage displays a 404 error message. I have tried accessing the link from different devices and browsers, but the issue persists. Could you please advise on how I can access the documentation? Is there an alternative link or any other resources that I can use to access the SLEPc documentation? Thank you for your assistance. I look forward to hearing from you soon. Best regards, zhang zhiyu -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Tue Apr 11 09:30:29 2023 From: jroman at dsic.upv.es (Jose E. Roman) Date: Tue, 11 Apr 2023 16:30:29 +0200 Subject: [petsc-users] Issue accessing SLEPc documentation In-Reply-To: <6af35cb3.8297.1877093d522.Coremail.zhangzhiyu20@mails.ucas.ac.cn> References: <6af35cb3.8297.1877093d522.Coremail.zhangzhiyu20@mails.ucas.ac.cn> Message-ID: <2E80006E-69F5-4582-BA74-B9AA22E124F4@dsic.upv.es> I have fixed this a few minutes ago. Try again using the reload button of your browser. Jose > El 11 abr 2023, a las 15:49, ??? escribi?: > > Dear SLEPc support team, > > > I am a new user of SLEPc and I am having trouble accessing the online documentation. When I try to access the link https://slepc.upv.es/documentation/current/docs/manualpages/EPS/index.html, the webpage displays a 404 error message. I have tried accessing the link from different devices and browsers, but the issue persists. > > > Could you please advise on how I can access the documentation? Is there an alternative link or any other resources that I can use to access the SLEPc documentation? > > > Thank you for your assistance. I look forward to hearing from you soon. 
> > > Best regards, > > zhang zhiyu > From joauma.marichal at uclouvain.be Wed Apr 12 03:02:21 2023 From: joauma.marichal at uclouvain.be (Joauma Marichal) Date: Wed, 12 Apr 2023 08:02:21 +0000 Subject: [petsc-users] DMSwarm with periodic B.C. Message-ID: Hello, I am using petsc DMSwarm library for some Lagrangian particle tracking. Until now, I was working in a closed box and was therefore initializing my swarm object as: // Create a DMDA staggered and without ghost cells (for DMSwarm to work) DMDACreate3d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,DMDA_STENCIL_BOX,M,N,P,m,n,p,1,1,lx,ly,lz,da_swarm); DMSetFromOptions(*da_swarm); DMSetUp(*da_swarm); PetscScalar xmin, xmax, ymin, ymax, zmin, zmax; xmin = ymin = zmin = 0.; xmax = ymax = zmax = 1.; DMDASetUniformCoordinates(*da_swarm,xmin, xmax, ymin, ymax, zmin, zmax); //SetNonUniform3DCoordinates(*da_swarm, cornp, gridp, rank); /* Create a DMShell for point location purposes */ DMShellCreate(PETSC_COMM_WORLD,dmcell); DMSetApplicationContext(*dmcell,*da_swarm); (*dmcell)->ops->locatepoints = DMLocatePoints_DMDARegular; (*dmcell)->ops->getneighbors = DMGetNeighbors_DMDARegular; // Create a Swarm DMDA DMCreate(PETSC_COMM_WORLD,swarm); DMSetType(*swarm,DMSWARM); DMSetDimension(*swarm,3); DMSwarmSetType(*swarm,DMSWARM_PIC); DMSwarmSetCellDM(*swarm,*dmcell); I am now trying to work with periodic boundary conditions. I tried replacing DM_BOUNDARY_NONE by DM_BOUNDARY_PERIODIC but it does not work? I checked for examples using periodic B.C. but have not found any. Is it possible? And if yes, how can I make it work? Thanks a lot for your answer. Best regards, Joauma -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Wed Apr 12 10:31:45 2023 From: mfadams at lbl.gov (Mark Adams) Date: Wed, 12 Apr 2023 11:31:45 -0400 Subject: [petsc-users] [petsc-maint] DMSwarm with periodic B.C. In-Reply-To: References: Message-ID: First, you don't want a DMShell. Just use da_swarm. See src/dm/tutorials/ex20.c You can run this test with > cd src/dm/tutorials > make ex20 > ./ex20 or > ./ex20 -mode 1 See the end of ex20.c for these (lack of) arguments Now change that code to one periodic direction and test. This could be a bug. This code is not well tested. Thanks, Mark On Wed, Apr 12, 2023 at 4:02?AM Joauma Marichal < joauma.marichal at uclouvain.be> wrote: > Hello, > > > > I am using petsc DMSwarm library for some Lagrangian particle tracking. 
> Until now, I was working in a closed box and was therefore initializing my > swarm object as: > > > > // Create a DMDA staggered and without ghost cells (for DMSwarm to work) > > DMDACreate3d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, > DM_BOUNDARY_NONE,DMDA_STENCIL_BOX,M,N,P,m,n,p,1,1,lx,ly,lz,da_swarm); > > > > DMSetFromOptions(*da_swarm); > > DMSetUp(*da_swarm); > > PetscScalar xmin, xmax, ymin, ymax, zmin, zmax; > > xmin = ymin = zmin = 0.; > > xmax = ymax = zmax = 1.; > > DMDASetUniformCoordinates(*da_swarm,xmin, xmax, ymin, ymax, zmin, zmax); > > //SetNonUniform3DCoordinates(*da_swarm, cornp, gridp, rank); > > > > /* Create a DMShell for point location purposes */ > > DMShellCreate(PETSC_COMM_WORLD,dmcell); > > DMSetApplicationContext(*dmcell,*da_swarm); > > (*dmcell)->ops->locatepoints = DMLocatePoints_DMDARegular; > > (*dmcell)->ops->getneighbors = DMGetNeighbors_DMDARegular; > > > > // Create a Swarm DMDA > > DMCreate(PETSC_COMM_WORLD,swarm); > > DMSetType(*swarm,DMSWARM); > > DMSetDimension(*swarm,3); > > DMSwarmSetType(*swarm,DMSWARM_PIC); > > DMSwarmSetCellDM(*swarm,*dmcell); > > > > I am now trying to work with periodic boundary conditions. I tried > replacing DM_BOUNDARY_NONE by DM_BOUNDARY_PERIODIC but it does not work? > I checked for examples using periodic B.C. but have not found any. Is it > possible? And if yes, how can I make it work? > > > > Thanks a lot for your answer. > > Best regards, > > > > Joauma > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zjorti at lanl.gov Wed Apr 12 15:08:53 2023 From: zjorti at lanl.gov (Jorti, Zakariae) Date: Wed, 12 Apr 2023 20:08:53 +0000 Subject: [petsc-users] Question about -memory_view Message-ID: <9f53a2a9df2d48608e04ae69f07ce219@lanl.gov> Hello, I am running some matrix computations on 64 nodes, using 640 MPI tasks. And I wanted to check the memory usage with the -memory_view flag. I get the following output: Summary of Memory Usage in PETSc Maximum (over computational time) process memory: total 3.1056e+11 max 5.9918e+08 min 4.2213e+08 Current process memory: total 1.9194e+11 max 3.9960e+08 min 2.2761e+08 What is the difference between maximum process memory and current process memory? What does total mean here? (in each node or total of all the nodes) Also, if the job fails because of memory shortage, will this -memory_view still output some information? Thank you. Zakariae -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Wed Apr 12 16:32:11 2023 From: bsmith at petsc.dev (Barry Smith) Date: Wed, 12 Apr 2023 17:32:11 -0400 Subject: [petsc-users] Question about -memory_view In-Reply-To: <9f53a2a9df2d48608e04ae69f07ce219@lanl.gov> References: <9f53a2a9df2d48608e04ae69f07ce219@lanl.gov> Message-ID: <0738336B-6037-404A-8AD4-A5B59AA0BDEB@petsc.dev> > On Apr 12, 2023, at 4:08 PM, Jorti, Zakariae via petsc-users wrote: > > Hello, > > I am running some matrix computations on 64 nodes, using 640 MPI tasks. > And I wanted to check the memory usage with the -memory_view flag. > I get the following output: > > Summary of Memory Usage in PETSc > Maximum (over computational time) process memory: total 3.1056e+11 max 5.9918e+08 min 4.2213e+08 > Current process memory: total 1.9194e+11 max 3.9960e+08 min 2.2761e+08 > > > What is the difference between maximum process memory and current process memory? Over computational time means the maximum it ever was (the high water mark) and current means what it is right now. 
For memory usage obtained from the OS (what we call "process memory" in the output as opposed to PetscMalloc()ed memory) often the current does not ever go below the maximum it ever was because the "extra now unneeded memory" is not returned to the OS. > What does total mean here? (in each node or total of all the nodes) Total is sum over all MPI ranks, max is maximum over all ranks, min is minimum over all ranks > Also, if the job fails because of memory shortage, will this -memory_view still output some information? Generally not if the job fails before the memory information is printed. Usually one runs with smaller memory usage increasing the problem size several times to see how the memory usage scales with the problem size (linearly, quadratically, etc) and this guides understanding the memory usage and if it can be improved. Just running with a large memory usage alone is not that useful in providing information. > > Thank you. > > Zakariae -------------- next part -------------- An HTML attachment was scrubbed... URL: From zjorti at lanl.gov Wed Apr 12 17:21:36 2023 From: zjorti at lanl.gov (Jorti, Zakariae) Date: Wed, 12 Apr 2023 22:21:36 +0000 Subject: [petsc-users] [EXTERNAL] Re: Question about -memory_view In-Reply-To: <0738336B-6037-404A-8AD4-A5B59AA0BDEB@petsc.dev> References: <9f53a2a9df2d48608e04ae69f07ce219@lanl.gov>, <0738336B-6037-404A-8AD4-A5B59AA0BDEB@petsc.dev> Message-ID: <7be8808377d548faaa2c2f81caaf3cb5@lanl.gov> Hello Barry, I appreciate the clarification. I tried to check the memory usage with seff and I got the following results: seff 7274633 Job ID: 7274633 Cluster: perlmutter User/Group: zjorti/zjorti State: COMPLETED (exit code 0) Nodes: 64 Cores per node: 256 CPU Utilized: 3-13:19:59 CPU Efficiency: 3.88% of 91-14:11:12 core-walltime Job Wall-clock time: 00:08:03 Memory Utilized: 958.68 GB (estimated maximum) Memory Efficiency: 0.00% of 0.00 MB (0.00 MB/node) Do you know how to interpret this? The memory utilized seems higher than the one given by -memory_view (3.1056e+11)... Thank you. Best, Zakariae ________________________________ From: Barry Smith Sent: Wednesday, April 12, 2023 3:32:11 PM To: Jorti, Zakariae Cc: petsc-users at mcs.anl.gov Subject: [EXTERNAL] Re: [petsc-users] Question about -memory_view On Apr 12, 2023, at 4:08 PM, Jorti, Zakariae via petsc-users wrote: Hello, I am running some matrix computations on 64 nodes, using 640 MPI tasks. And I wanted to check the memory usage with the -memory_view flag. I get the following output: Summary of Memory Usage in PETSc Maximum (over computational time) process memory: total 3.1056e+11 max 5.9918e+08 min 4.2213e+08 Current process memory: total 1.9194e+11 max 3.9960e+08 min 2.2761e+08 What is the difference between maximum process memory and current process memory? Over computational time means the maximum it ever was (the high water mark) and current means what it is right now. For memory usage obtained from the OS (what we call "process memory" in the output as opposed to PetscMalloc()ed memory) often the current does not ever go below the maximum it ever was because the "extra now unneeded memory" is not returned to the OS. What does total mean here? (in each node or total of all the nodes) Total is sum over all MPI ranks, max is maximum over all ranks, min is minimum over all ranks Also, if the job fails because of memory shortage, will this -memory_view still output some information? Generally not if the job fails before the memory information is printed. 
Usually one runs with smaller memory usage, increasing the problem size several times, to see how the memory usage scales with the problem size (linearly, quadratically, etc.), and this guides understanding of the memory usage and whether it can be improved. Just running with a large memory usage alone is not that useful in providing information.

Thank you.

Zakariae

From bsmith at petsc.dev Wed Apr 12 19:35:33 2023
From: bsmith at petsc.dev (Barry Smith)
Date: Wed, 12 Apr 2023 20:35:33 -0400
Subject: [petsc-users] [EXTERNAL] Question about -memory_view
In-Reply-To: <7be8808377d548faaa2c2f81caaf3cb5 at lanl.gov>
References: <9f53a2a9df2d48608e04ae69f07ce219 at lanl.gov> <0738336B-6037-404A-8AD4-A5B59AA0BDEB at petsc.dev> <7be8808377d548faaa2c2f81caaf3cb5 at lanl.gov>
Message-ID:

No idea. You can allocate a few vectors of known size, fill them with some value, VecSet(x,2.0), and compare what the various memory reports say against those known sizes.

Barry

> On Apr 12, 2023, at 6:21 PM, Jorti, Zakariae wrote:
>
> Hello Barry,
>
> I appreciate the clarification.
> I tried to check the memory usage with seff and I got the following results:
>
> seff 7274633
> Job ID: 7274633
> Cluster: perlmutter
> User/Group: zjorti/zjorti
> State: COMPLETED (exit code 0)
> Nodes: 64
> Cores per node: 256
> CPU Utilized: 3-13:19:59
> CPU Efficiency: 3.88% of 91-14:11:12 core-walltime
> Job Wall-clock time: 00:08:03
> Memory Utilized: 958.68 GB (estimated maximum)
> Memory Efficiency: 0.00% of 0.00 MB (0.00 MB/node)
>
> Do you know how to interpret this?
> The memory utilized seems higher than the one given by -memory_view (3.1056e+11)...
> Thank you.
> Best,
>
> Zakariae
> From: Barry Smith
> Sent: Wednesday, April 12, 2023 3:32:11 PM
> To: Jorti, Zakariae
> Cc: petsc-users at mcs.anl.gov
> Subject: [EXTERNAL] Re: [petsc-users] Question about -memory_view
>
>> On Apr 12, 2023, at 4:08 PM, Jorti, Zakariae via petsc-users wrote:
>>
>> Hello,
>>
>> I am running some matrix computations on 64 nodes, using 640 MPI tasks.
>> And I wanted to check the memory usage with the -memory_view flag.
>> I get the following output:
>>
>> Summary of Memory Usage in PETSc
>> Maximum (over computational time) process memory: total 3.1056e+11 max 5.9918e+08 min 4.2213e+08
>> Current process memory: total 1.9194e+11 max 3.9960e+08 min 2.2761e+08
>>
>> What is the difference between maximum process memory and current process memory?
>
> Over computational time means the maximum it ever was (the high-water mark) and current means what it is right now. For memory usage obtained from the OS (what we call "process memory" in the output, as opposed to PetscMalloc()ed memory), often the current does not ever go below the maximum it ever was, because the "extra now unneeded memory" is not returned to the OS.
>
>> What does total mean here? (in each node or total of all the nodes)
>
> Total is the sum over all MPI ranks, max is the maximum over all ranks, min is the minimum over all ranks.
>
>> Also, if the job fails because of memory shortage, will this -memory_view still output some information?
>
> Generally not, if the job fails before the memory information is printed. Usually one runs with smaller memory usage, increasing the problem size several times, to see how the memory usage scales with the problem size (linearly, quadratically, etc.), and this guides understanding of the memory usage and whether it can be improved. Just running with a large memory usage alone is not that useful in providing information.
>
>> Thank you.
>>
>> Zakariae

From jeremy at seamplex.com Thu Apr 13 07:17:19 2023
From: jeremy at seamplex.com (Jeremy Theler)
Date: Thu, 13 Apr 2023 09:17:19 -0300
Subject: [petsc-users] Effect of -pc_gamg_threshold vs PETSc version
Message-ID: <3dc145050abae22d80443de140c9801d6e801bb0.camel at seamplex.com>

When using GAMG+cg for linear elasticity and providing the near nullspace computed by MatNullSpaceCreateRigidBody(), I used to find "experimentally" that a small value of -pc_gamg_threshold on the order of 0.0001 would slightly decrease the solve time.

Starting with 3.18, I started seeing that any positive value for the threshold would increase the solve time. I did a quick parametric (serial) run solving an elastic problem with a matrix size of approx 570k x 570k for different values of GAMG threshold and different PETSc versions (compiled with the same compiler, options and flags).

I noted that

 1. starting from 3.18, a threshold of 0.0001 that used to improve the speed now worsens it.
 2. PETSc 3.17 looks like a "sweet spot" of speed

I would like to hear any comments you might have.

The wall time shown includes the time needed to read the mesh and assemble the stiffness matrix. It is a refined version of the NAFEMS LE10 benchmark described here:
https://seamplex.com/feenox/examples/mechanical.html#nafems-le10-thick-plate-pressure-benchmark

If you want, I could dump the matrix, rhs and near nullspace vectors and share them.

--
jeremy theler

(Attachment: threshold.pdf, application/pdf, 9656 bytes - the wall-time vs. threshold plot.)
(Attached data, five plain-text blocks, one per threshold value; each row appears to be PETSc 3.x minor version followed by wall time in seconds. The threshold each block corresponds to is not labeled in the plain-text export:

  3.15   30.06   29.23   30.11   31.68   36.74
  3.16   30.44   29.54   30.51   31.96   37.06
  3.17   26.29   24.70   25.80   26.98   31.96
  3.18   28.61   29.44   32.68   43.36   69.54
  3.19   29.65   30.58   33.78   44.24   70.14 )

From karthikeyan.chockalingam at stfc.ac.uk Thu Apr 13 08:40:01 2023
From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI)
Date: Thu, 13 Apr 2023 13:40:01 +0000
Subject: [petsc-users] non-homogenous Dirichlet bc using MatZeroRowsColumns
Message-ID:

Hi,

I am trying to solve the below parabolic system for a constant time interval dt

(M + dt * K) x = M * x^(old)

Where,
M - Mass matrix
K - Stiffness matrix
A = M + dt * K (remains constant)

I apply the boundary condition using MatZeroRowsColumns. For homogeneous boundary conditions, I notice that the A matrix remains unmodified from one time step to the next.

For the non-homogeneous boundary condition, I supply "x" with the prescribed boundary value and pass it to MatZeroRowsColumns as follows

MatZeroRowsColumnsIS(A, is, 1, x, b);

After the linear solve, I find that A is modified. So I have to reassemble A again for every time step. Is it MatZeroRowsColumns or the solve itself that modifies A, and why?

Kind regards,
Karthik.
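(For concreteness, a minimal sketch of the call pattern being described; since MatZeroRowsColumns() modifies A in place, the sketch keeps a pristine copy A0 = M + dt*K and restores it each step. All names here - A0, xold, is, gval, ksp - are placeholders, not from the original message.)

```c
/* One time step of the scheme described above (names are placeholders). */
PetscCall(MatCopy(A0, A, SAME_NONZERO_PATTERN));   /* restore A = M + dt*K      */
PetscCall(MatMult(M, xold, b));                    /* b = M * x^(old)           */
PetscCall(VecISSet(x, is, gval));                  /* prescribed boundary value */
PetscCall(MatZeroRowsColumnsIS(A, is, 1.0, x, b)); /* zeroes rows/cols of A     */
PetscCall(KSPSetOperators(ksp, A, A));
PetscCall(KSPSolve(ksp, b, x));
PetscCall(VecCopy(x, xold));
```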
From mfadams at lbl.gov Thu Apr 13 11:27:58 2023
From: mfadams at lbl.gov (Mark Adams)
Date: Thu, 13 Apr 2023 12:27:58 -0400
Subject: [petsc-users] Effect of -pc_gamg_threshold vs PETSc version
In-Reply-To: <3dc145050abae22d80443de140c9801d6e801bb0.camel at seamplex.com>
References: <3dc145050abae22d80443de140c9801d6e801bb0.camel at seamplex.com>
Message-ID:

Hi Jeremy,

We did make some changes for performance reasons that we could not avoid, but I have never seen anything like this, so let's dig into it.

0) What is your test problem? e.g., 3D Laplacian with Q1 finite elements.

First, you can get GAMG diagnostics by running with '-info :pc' and grepping on GAMG.

Second, you are going to want to look at *iteration count* and *solve times*, and you want to separate the solve time (KSPSolve) and the GAMG setup time. If you have your own timer, dig into the -log_view data and get the "KSPSolve" time (solve time) and "RAP" or "P'AP" for the setup time. You could run one warm-up solve and time a second one separately. That is what I do.

*Iteration count:* You want to look at the eigen estimates for Chebyshev. If you have an SPD problem then you want to use CG and not the default GMRES. If the eigen estimates are low, GAMG convergence can suffer, but this is usually catastrophic. *If your iteration counts increase dramatically then this could be the issue.*

*Time / iteration and setup time:* You can also see the grid sizes and number of nnz/row (ave). This will affect time/iteration and setup time.

3.17) In looking at the change logs for 3.17 (https://petsc.org/main/changes/317/#:~:text=maximum%20of%20ten-,PCMG,-%3A) we made a few changes:
* Moved default smoothing to Jacobi from SOR, because Jacobi works on GPUs.
* Some eigen estimate changes that you should look at. You should add the MatOptions, especially if your matrix is SPD. SOR usually converges faster, but is slower per iteration. *Maybe Jacobi runs a lot faster for you.*
+ Check iteration counts.
+ Check that the eigen estimates did not change. If they did, then we can dig into that.

3.18) big changes: https://petsc.org/main/changes/318/#:~:text=based%20aggregation%20algorithm-,PC,-%3A
* Some small things, but *the -pc_gamg_sym_graph bullet might be (one of) your problem(s)*. Related to the MatOptions bullet above.
* The "aggressive" coarsening strategy (used to be called "square_graph", but the old syntax is supported) is different, because the old way was very slow. I have noticed that the rate of coarsening changes a little with the new method, but not much. *But the way threshold works with the new method is a bit different, so that could explain some of this.* (The new method calls MIS twice; the old method calls MIS on A'A.)

There are two things that you want to check:
1) Eigen estimates. If eigen estimates are too small, iteration counts can increase a lot or, usually, the solver just fails. *See if there are any changes in the eigen estimates for Chebyshev.*
2) Rate of coarsening, which affects the number of NNZ per row. If coarsening is too slow, NNZ goes up and the coarse grid construction (RAP) costs go way up. Check that the coarse grid sizes, which are related to NNZ per row, do not change. I think they do, and we can dig into it. *A quick proxy for (2) is the "grid complexity" output. This should be around 1.01 to 1.2.*

Anyway, sorry for the changes.
I hate changing GAMG for this reason and I hate AMG for this reason!
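(A minimal sketch of the SPD-related hints above; the calls exist in PETSc, but the particular option names in the comment - the esteig prefix in particular - are assumptions drawn from this discussion, not from the original message. A, coords and ksp are placeholders.)

```c
/* Hint the solver that the operator is SPD, and attach the rigid-body
   near null space for elasticity, as discussed in this thread. */
MatNullSpace nearnull;
PetscCall(MatSetOption(A, MAT_SPD, PETSC_TRUE));
PetscCall(MatNullSpaceCreateRigidBody(coords, &nearnull));
PetscCall(MatSetNearNullSpace(A, nearnull));
PetscCall(MatNullSpaceDestroy(&nearnull));
PetscCall(KSPSetType(ksp, KSPCG));  /* CG instead of the default GMRES */
PetscCall(KSPSetFromOptions(ksp));  /* e.g. -pc_type gamg -pc_gamg_threshold 0.0001
                                       -mg_levels_esteig_ksp_type cg -info :pc */
```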
Thanks,
Mark

On Thu, Apr 13, 2023 at 8:17 AM Jeremy Theler wrote:

> When using GAMG+cg for linear elasticity and providing the near
> nullspace computed by MatNullSpaceCreateRigidBody(), I used to find
> "experimentally" that a small value of -pc_gamg_threshold on the order
> of 0.0001 would slightly decrease the solve time.
>
> Starting with 3.18, I started seeing that any positive value for the
> threshold would increase the solve time. I did a quick parametric
> (serial) run solving an elastic problem with a matrix size of approx
> 570k x 570k for different values of GAMG threshold and different PETSc
> versions (compiled with the same compiler, options and flags).
>
> I noted that
>
> 1. starting from 3.18, a threshold of 0.0001 that used to improve the
> speed now worsens it.
> 2. PETSc 3.17 looks like a "sweet spot" of speed
>
> I would like to hear any comments you might have.
>
> The wall time shown includes the time needed to read the mesh and
> assemble the stiffness matrix. It is a refined version of the NAFEMS
> LE10 benchmark described here:
> https://seamplex.com/feenox/examples/mechanical.html#nafems-le10-thick-plate-pressure-benchmark
>
> If you want, I could dump the matrix, rhs and near nullspace vectors
> and share them.
>
> --
> jeremy theler

From bsmith at petsc.dev Thu Apr 13 13:12:48 2023
From: bsmith at petsc.dev (Barry Smith)
Date: Thu, 13 Apr 2023 14:12:48 -0400
Subject: [petsc-users] non-homogenous Dirichlet bc using MatZeroRowsColumns
In-Reply-To:
References:
Message-ID:

This is a good question. MatZeroRowsColumns() removes the rows and columns from the matrix and (optionally) updates the right-hand side (and inserts values in the prescribed solution locations). Thus repeated calls cannot continue to update the right-hand side vector, since the needed matrix values are gone.

The way I solve problems where the boundary solution depends on time is to differentiate that boundary equation instead, and not use MatZeroRowsColumns(). So let w be the solution at some grid point on the boundary with w(t) = f(t); then I write w'(t) = f'(t) and code the f'(t) as the right-hand side for that grid point. If f(t) = c is a constant, then f'(t) is zero and I just code zero as the right-hand side for that point and use the initial condition w(0) = c.

There may be better ways to do this.

Barry

> On Apr 13, 2023, at 9:40 AM, Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote:
>
> Hi,
>
> I am trying to solve the below parabolic system for a constant time interval dt
>
> (M + dt * K) x = M * x^(old)
>
> Where,
> M - Mass matrix
> K - Stiffness matrix
> A = M + dt * K (remains constant)
>
> I apply the boundary condition using MatZeroRowsColumns. For homogeneous boundary conditions, I notice that the A matrix remains unmodified from one time step to the next.
>
> For the non-homogeneous boundary condition, I supply "x" with the prescribed boundary value and pass it to MatZeroRowsColumns as follows
>
> MatZeroRowsColumnsIS(A, is, 1, x, b);
>
> After the linear solve, I find that A is modified. So I have to reassemble A again for every time step. Is it MatZeroRowsColumns or the solve itself that modifies A, and why?
>
> Kind regards,
> Karthik.
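(A minimal sketch of the alternative Barry describes, for backward Euler on a boundary node with w(t) = f(t); everything here is schematic, and bnd, dt, wold, f_prime and t_new are placeholders. It assumes the off-diagonal entries of the boundary row were already zeroed during assembly.)

```c
/* The boundary row enforces (w^{n+1} - w^n)/dt = f'(t^{n+1}) instead of
   w = f(t): a 1/dt diagonal entry and f'(t^{n+1}) + w^n/dt on the RHS. */
PetscCall(MatSetValue(A, bnd, bnd, 1.0/dt, INSERT_VALUES));
PetscCall(VecSetValue(b, bnd, f_prime(t_new) + wold/dt, INSERT_VALUES));
PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
PetscCall(VecAssemblyBegin(b));
PetscCall(VecAssemblyEnd(b));
```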
From alexlindsay239 at gmail.com Thu Apr 13 15:33:34 2023
From: alexlindsay239 at gmail.com (Alexander Lindsay)
Date: Thu, 13 Apr 2023 13:33:34 -0700
Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split
Message-ID:

Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. My options table

-dm_moose_fieldsplit_names u,p
-dm_moose_nfieldsplits 2
-fieldsplit_p_dm_moose_vars pressure
-fieldsplit_p_ksp_type preonly
-fieldsplit_p_pc_type jacobi
-fieldsplit_u_dm_moose_vars vel_x,vel_y
-fieldsplit_u_ksp_type preonly
-fieldsplit_u_pc_hypre_type boomeramg
-fieldsplit_u_pc_type hypre
-pc_fieldsplit_schur_fact_type full
-pc_fieldsplit_schur_precondition selfp
-pc_fieldsplit_type schur
-pc_type fieldsplit

works wonderfully for a low Reynolds number of 2.2. The solver performance crushes LU as I scale up the problem. However, not surprisingly this options table struggles when I bump the Reynolds number to 220. I've read that use of AIR (approximate ideal restriction) can improve performance for advection dominated problems. I've tried setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and the option works fine. However, when applying it to my field-split preconditioned Navier-Stokes system, I get immediate non-convergence:

0 Nonlinear |R| = 1.033077e+03
0 Linear |R| = 1.033077e+03
Linear solve did not converge due to DIVERGED_NANORINF iterations 0
Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0

Does anyone have an idea as to why this might be happening? If not, I'd take a suggestion on where to set a breakpoint to start my own investigation. Alternatively, I welcome other preconditioning suggestions for an advection dominated problem.

Alex

From bsmith at petsc.dev Thu Apr 13 15:54:15 2023
From: bsmith at petsc.dev (Barry Smith)
Date: Thu, 13 Apr 2023 16:54:15 -0400
Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split
In-Reply-To:
References:
Message-ID: <89D88373-FDC8-4214-B93A-DCB87E5CF012 at petsc.dev>

It would be useful to see the convergences inside the linear solve so perhaps start with

-ksp_monitor_true_residual
-fieldsplit_u_ksp_type richardson (this is to allow the monitor below to work)
-fieldsplit_u_ksp_max_its 1
-fieldsplit_u_ksp_monitor

Perhaps others, Matt/Jed/Pierre/Stefano likely know better off the cuff than me.

We should have a convenience option like -pc_fieldsplit_schur_monitor similar to the -pc_fieldsplit_gkb_monitor

> On Apr 13, 2023, at 4:33 PM, Alexander Lindsay wrote:
>
> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. My options table
>
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
>
> works wonderfully for a low Reynolds number of 2.2. The solver performance crushes LU as I scale up the problem. However, not surprisingly this options table struggles when I bump the Reynolds number to 220. I've read that use of AIR (approximate ideal restriction) can improve performance for advection dominated problems.
I've tried setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and the option works fine. However, when applying it to my field-split preconditioned Navier-Stokes system, I get immediate non-convergence: > > 0 Nonlinear |R| = 1.033077e+03 > 0 Linear |R| = 1.033077e+03 > Linear solve did not converge due to DIVERGED_NANORINF iterations 0 > Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0 > > Does anyone have an idea as to why this might be happening? If not, I'd take a suggestion on where to set a breakpoint to start my own investigation. Alternatively, I welcome other preconditioning suggestions for an advection dominated problem. > > Alex -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexlindsay239 at gmail.com Thu Apr 13 16:07:07 2023 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Thu, 13 Apr 2023 14:07:07 -0700 Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split In-Reply-To: <89D88373-FDC8-4214-B93A-DCB87E5CF012@petsc.dev> References: <89D88373-FDC8-4214-B93A-DCB87E5CF012@petsc.dev> Message-ID: Here's the result. 0 KSP unpreconditioned resid norm 1.033076851740e+03 true resid norm 1.033076851740e+03 ||r(i)||/||b|| 1.000000000000e+00 Residual norms for fieldsplit_u_ solve. 0 KSP Residual norm -nan Residual norms for fieldsplit_p_ solve. 0 KSP Residual norm -nan Residual norms for fieldsplit_u_ solve. 0 KSP Residual norm -nan 1 KSP Residual norm -nan Residual norms for fieldsplit_u_ solve. 0 KSP Residual norm -nan Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0 PC failed due to SUBPC_ERROR I probably should have read the FAQ on `-fp_trap` before sending my first email. Working with this stack trace (gdb) bt #0 0x00007fffe83a4286 in hypre_ParMatmul._omp_fn.1 () at par_csr_matop.c:1124 #1 0x00007ffff4982a16 in GOMP_parallel () from /lib/x86_64-linux-gnu/libgomp.so.1 #2 0x00007fffe83abfd1 in hypre_ParMatmul (A=, B=B at entry=0x55555da2ffa0) at par_csr_matop.c:967 #3 0x00007fffe82f09bf in hypre_BoomerAMGSetup (amg_vdata=, A=, f=, u=) at par_amg_setup.c:2790 #4 0x00007fffe82d54f0 in HYPRE_BoomerAMGSetup (solver=, A=, b=, x=) at HYPRE_parcsr_amg.c:47 #5 0x00007fffe940d33c in PCSetUp_HYPRE (pc=) at /home/lindad/projects/moose/petsc/src/ksp/pc/impls/hypre/hypre.c:418 #6 0x00007fffe9413d87 in PCSetUp (pc=0x55555d5ef390) at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:1017 #7 0x00007fffe94f856b in KSPSetUp (ksp=ksp at entry=0x55555d5eecb0) at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:408 #8 0x00007fffe94fa6f4 in KSPSolve_Private (ksp=ksp at entry=0x55555d5eecb0, b=0x55555d619730, x=) at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:852 #9 0x00007fffe94fd8b1 in KSPSolve (ksp=ksp at entry=0x55555d5eecb0, b=, x=) at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086 #10 0x00007fffe93d84a1 in PCApply_FieldSplit_Schur (pc=0x555555bef790, x=0x555556d5a510, y=0x555556d59e30) at /home/lindad/projects/moose/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1185 #11 0x00007fffe9414484 in PCApply (pc=pc at entry=0x555555bef790, x=x at entry=0x555556d5a510, y=y at entry=0x555556d59e30) at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:445 #12 0x00007fffe9415ad7 in PCApplyBAorAB (pc=0x555555bef790, side=PC_RIGHT, x=0x555556d5a510, y=y at entry=0x555556e922a0, work=0x555556d59e30) at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:727 #13 
0x00007fffe9451fcd in KSP_PCApplyBAorAB (w=, y=0x555556e922a0, x=, ksp=0x555556068fc0) at /home/lindad/projects/moose/petsc/include/petsc/private/kspimpl.h:421 #14 KSPGMRESCycle (itcount=itcount at entry=0x7fffffffcca0, ksp=ksp at entry =0x555556068fc0) at /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:162 #15 0x00007fffe94536f9 in KSPSolve_GMRES (ksp=0x555556068fc0) at /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:247 #16 0x00007fffe94fb1c4 in KSPSolve_Private (ksp=0x555556068fc0, b=b at entry=0x55555568e510, x=, x at entry=0x55555607cce0) at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:914 #17 0x00007fffe94fd8b1 in KSPSolve (ksp=, b=b at entry=0x55555568e510, x=x at entry=0x55555607cce0) at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086 #18 0x00007fffe9582850 in SNESSolve_NEWTONLS (snes=0x555556065610) at /home/lindad/projects/moose/petsc/src/snes/impls/ls/ls.c:225 #19 0x00007fffe959c7ee in SNESSolve (snes=0x555556065610, b=0x0, x=) at /home/lindad/projects/moose/petsc/src/snes/interface/snes.c:4809 On Thu, Apr 13, 2023 at 1:54?PM Barry Smith wrote: > > It would be useful to see the convergences inside the linear solve so > perhaps start with > > -ksp_monitor_true_residual > > -fieldsplit_u_ksp_type richardson (this is to allow the monitor below > to work) > -fieldsplit_u_ksp_max_its 1 > -fieldsplit_u_ksp_monitor > > Perhaps others, Matt/Jed/Pierre/Stefano likely know better off the cuff > than me. > > We should have a convenience option like -pc_fieldsplit_schur_monitor > similar to the -pc_fieldsplit_gkb_monitor > > > > On Apr 13, 2023, at 4:33 PM, Alexander Lindsay > wrote: > > Hi, I'm trying to solve steady Navier-Stokes for different Reynolds > numbers. My options table > > -dm_moose_fieldsplit_names u,p > -dm_moose_nfieldsplits 2 > -fieldsplit_p_dm_moose_vars pressure > -fieldsplit_p_ksp_type preonly > -fieldsplit_p_pc_type jacobi > -fieldsplit_u_dm_moose_vars vel_x,vel_y > -fieldsplit_u_ksp_type preonly > -fieldsplit_u_pc_hypre_type boomeramg > -fieldsplit_u_pc_type hypre > -pc_fieldsplit_schur_fact_type full > -pc_fieldsplit_schur_precondition selfp > -pc_fieldsplit_type schur > -pc_type fieldsplit > > works wonderfully for a low Reynolds number of 2.2. The solver performance > crushes LU as I scale up the problem. However, not surprisingly this > options table struggles when I bump the Reynolds number to 220. I've read > that use of AIR (approximate ideal restriction) can improve performance for > advection dominated problems. I've tried > setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion > problem and the option works fine. However, when applying it to my > field-split preconditioned Navier-Stokes system, I get immediate > non-convergence: > > 0 Nonlinear |R| = 1.033077e+03 > 0 Linear |R| = 1.033077e+03 > Linear solve did not converge due to DIVERGED_NANORINF iterations 0 > Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0 > > Does anyone have an idea as to why this might be happening? If not, I'd > take a suggestion on where to set a breakpoint to start my own > investigation. Alternatively, I welcome other preconditioning suggestions > for an advection dominated problem. > > Alex > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Apr 13 19:10:20 2023 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 13 Apr 2023 20:10:20 -0400 Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split In-Reply-To: References: <89D88373-FDC8-4214-B93A-DCB87E5CF012@petsc.dev> Message-ID: On Thu, Apr 13, 2023 at 5:07?PM Alexander Lindsay wrote: > Here's the result. > > 0 KSP unpreconditioned resid norm 1.033076851740e+03 true resid norm > 1.033076851740e+03 ||r(i)||/||b|| 1.000000000000e+00 > Residual norms for fieldsplit_u_ solve. > 0 KSP Residual norm -nan > Residual norms for fieldsplit_p_ solve. > 0 KSP Residual norm -nan > Residual norms for fieldsplit_u_ solve. > 0 KSP Residual norm -nan > 1 KSP Residual norm -nan > Residual norms for fieldsplit_u_ solve. > 0 KSP Residual norm -nan > Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0 > PC failed due to SUBPC_ERROR > > I probably should have read the FAQ on `-fp_trap` before sending my first > email. > I think this can be mailed to Hypre now. We do not interfere in their SetUp, so I think it has to be on their side. Thanks, Matt > Working with this stack trace > > (gdb) bt > #0 0x00007fffe83a4286 in hypre_ParMatmul._omp_fn.1 () at > par_csr_matop.c:1124 > #1 0x00007ffff4982a16 in GOMP_parallel () from > /lib/x86_64-linux-gnu/libgomp.so.1 > #2 0x00007fffe83abfd1 in hypre_ParMatmul (A=, B=B at entry=0x55555da2ffa0) > at par_csr_matop.c:967 > #3 0x00007fffe82f09bf in hypre_BoomerAMGSetup (amg_vdata=, > A=, f=, > u=) at par_amg_setup.c:2790 > #4 0x00007fffe82d54f0 in HYPRE_BoomerAMGSetup (solver=, > A=, b=, > x=) at HYPRE_parcsr_amg.c:47 > #5 0x00007fffe940d33c in PCSetUp_HYPRE (pc=) > at /home/lindad/projects/moose/petsc/src/ksp/pc/impls/hypre/hypre.c:418 > #6 0x00007fffe9413d87 in PCSetUp (pc=0x55555d5ef390) > at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:1017 > #7 0x00007fffe94f856b in KSPSetUp (ksp=ksp at entry=0x55555d5eecb0) > at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:408 > #8 0x00007fffe94fa6f4 in KSPSolve_Private (ksp=ksp at entry=0x55555d5eecb0, > b=0x55555d619730, x=) > at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:852 > #9 0x00007fffe94fd8b1 in KSPSolve (ksp=ksp at entry=0x55555d5eecb0, > b=, x=) > at > /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086 > #10 0x00007fffe93d84a1 in PCApply_FieldSplit_Schur (pc=0x555555bef790, > x=0x555556d5a510, y=0x555556d59e30) > at > /home/lindad/projects/moose/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1185 > #11 0x00007fffe9414484 in PCApply (pc=pc at entry=0x555555bef790, x=x at entry=0x555556d5a510, > y=y at entry=0x555556d59e30) > at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:445 > #12 0x00007fffe9415ad7 in PCApplyBAorAB (pc=0x555555bef790, side=PC_RIGHT, > x=0x555556d5a510, > y=y at entry=0x555556e922a0, work=0x555556d59e30) > at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:727 > #13 0x00007fffe9451fcd in KSP_PCApplyBAorAB (w=, > y=0x555556e922a0, x=, > ksp=0x555556068fc0) at > /home/lindad/projects/moose/petsc/include/petsc/private/kspimpl.h:421 > #14 KSPGMRESCycle (itcount=itcount at entry=0x7fffffffcca0, ksp=ksp at entry > =0x555556068fc0) > at > /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:162 > #15 0x00007fffe94536f9 in KSPSolve_GMRES (ksp=0x555556068fc0) > at > /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:247 > #16 0x00007fffe94fb1c4 in KSPSolve_Private 
(ksp=0x555556068fc0, b=b at entry=0x55555568e510, > x=, > x at entry=0x55555607cce0) at > /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:914 > #17 0x00007fffe94fd8b1 in KSPSolve (ksp=, b=b at entry=0x55555568e510, > x=x at entry=0x55555607cce0) > at > /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086 > #18 0x00007fffe9582850 in SNESSolve_NEWTONLS (snes=0x555556065610) > at /home/lindad/projects/moose/petsc/src/snes/impls/ls/ls.c:225 > #19 0x00007fffe959c7ee in SNESSolve (snes=0x555556065610, b=0x0, > x=) > at /home/lindad/projects/moose/petsc/src/snes/interface/snes.c:4809 > > On Thu, Apr 13, 2023 at 1:54?PM Barry Smith wrote: > >> >> It would be useful to see the convergences inside the linear solve so >> perhaps start with >> >> -ksp_monitor_true_residual >> >> -fieldsplit_u_ksp_type richardson (this is to allow the monitor below >> to work) >> -fieldsplit_u_ksp_max_its 1 >> -fieldsplit_u_ksp_monitor >> >> Perhaps others, Matt/Jed/Pierre/Stefano likely know better off the cuff >> than me. >> >> We should have a convenience option like -pc_fieldsplit_schur_monitor >> similar to the -pc_fieldsplit_gkb_monitor >> >> >> >> On Apr 13, 2023, at 4:33 PM, Alexander Lindsay >> wrote: >> >> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds >> numbers. My options table >> >> -dm_moose_fieldsplit_names u,p >> -dm_moose_nfieldsplits 2 >> -fieldsplit_p_dm_moose_vars pressure >> -fieldsplit_p_ksp_type preonly >> -fieldsplit_p_pc_type jacobi >> -fieldsplit_u_dm_moose_vars vel_x,vel_y >> -fieldsplit_u_ksp_type preonly >> -fieldsplit_u_pc_hypre_type boomeramg >> -fieldsplit_u_pc_type hypre >> -pc_fieldsplit_schur_fact_type full >> -pc_fieldsplit_schur_precondition selfp >> -pc_fieldsplit_type schur >> -pc_type fieldsplit >> >> works wonderfully for a low Reynolds number of 2.2. The solver >> performance crushes LU as I scale up the problem. However, not surprisingly >> this options table struggles when I bump the Reynolds number to 220. I've >> read that use of AIR (approximate ideal restriction) can improve >> performance for advection dominated problems. I've tried >> setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion >> problem and the option works fine. However, when applying it to my >> field-split preconditioned Navier-Stokes system, I get immediate >> non-convergence: >> >> 0 Nonlinear |R| = 1.033077e+03 >> 0 Linear |R| = 1.033077e+03 >> Linear solve did not converge due to DIVERGED_NANORINF iterations 0 >> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0 >> >> Does anyone have an idea as to why this might be happening? If not, I'd >> take a suggestion on where to set a breakpoint to start my own >> investigation. Alternatively, I welcome other preconditioning suggestions >> for an advection dominated problem. >> >> Alex >> >> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Thu Apr 13 21:03:07 2023 From: mfadams at lbl.gov (Mark Adams) Date: Thu, 13 Apr 2023 22:03:07 -0400 Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split In-Reply-To: References: <89D88373-FDC8-4214-B93A-DCB87E5CF012@petsc.dev> Message-ID: Are you using OpenMP? ("OMP"). If so try without it. 
On Thu, Apr 13, 2023 at 5:07?PM Alexander Lindsay wrote: > Here's the result. > > 0 KSP unpreconditioned resid norm 1.033076851740e+03 true resid norm > 1.033076851740e+03 ||r(i)||/||b|| 1.000000000000e+00 > Residual norms for fieldsplit_u_ solve. > 0 KSP Residual norm -nan > Residual norms for fieldsplit_p_ solve. > 0 KSP Residual norm -nan > Residual norms for fieldsplit_u_ solve. > 0 KSP Residual norm -nan > 1 KSP Residual norm -nan > Residual norms for fieldsplit_u_ solve. > 0 KSP Residual norm -nan > Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0 > PC failed due to SUBPC_ERROR > > I probably should have read the FAQ on `-fp_trap` before sending my first > email. > > Working with this stack trace > > (gdb) bt > #0 0x00007fffe83a4286 in hypre_ParMatmul._omp_fn.1 () at > par_csr_matop.c:1124 > #1 0x00007ffff4982a16 in GOMP_parallel () from > /lib/x86_64-linux-gnu/libgomp.so.1 > #2 0x00007fffe83abfd1 in hypre_ParMatmul (A=, B=B at entry=0x55555da2ffa0) > at par_csr_matop.c:967 > #3 0x00007fffe82f09bf in hypre_BoomerAMGSetup (amg_vdata=, > A=, f=, > u=) at par_amg_setup.c:2790 > #4 0x00007fffe82d54f0 in HYPRE_BoomerAMGSetup (solver=, > A=, b=, > x=) at HYPRE_parcsr_amg.c:47 > #5 0x00007fffe940d33c in PCSetUp_HYPRE (pc=) > at /home/lindad/projects/moose/petsc/src/ksp/pc/impls/hypre/hypre.c:418 > #6 0x00007fffe9413d87 in PCSetUp (pc=0x55555d5ef390) > at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:1017 > #7 0x00007fffe94f856b in KSPSetUp (ksp=ksp at entry=0x55555d5eecb0) > at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:408 > #8 0x00007fffe94fa6f4 in KSPSolve_Private (ksp=ksp at entry=0x55555d5eecb0, > b=0x55555d619730, x=) > at /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:852 > #9 0x00007fffe94fd8b1 in KSPSolve (ksp=ksp at entry=0x55555d5eecb0, > b=, x=) > at > /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086 > #10 0x00007fffe93d84a1 in PCApply_FieldSplit_Schur (pc=0x555555bef790, > x=0x555556d5a510, y=0x555556d59e30) > at > /home/lindad/projects/moose/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1185 > #11 0x00007fffe9414484 in PCApply (pc=pc at entry=0x555555bef790, x=x at entry=0x555556d5a510, > y=y at entry=0x555556d59e30) > at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:445 > #12 0x00007fffe9415ad7 in PCApplyBAorAB (pc=0x555555bef790, side=PC_RIGHT, > x=0x555556d5a510, > y=y at entry=0x555556e922a0, work=0x555556d59e30) > at /home/lindad/projects/moose/petsc/src/ksp/pc/interface/precon.c:727 > #13 0x00007fffe9451fcd in KSP_PCApplyBAorAB (w=, > y=0x555556e922a0, x=, > ksp=0x555556068fc0) at > /home/lindad/projects/moose/petsc/include/petsc/private/kspimpl.h:421 > #14 KSPGMRESCycle (itcount=itcount at entry=0x7fffffffcca0, ksp=ksp at entry > =0x555556068fc0) > at > /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:162 > #15 0x00007fffe94536f9 in KSPSolve_GMRES (ksp=0x555556068fc0) > at > /home/lindad/projects/moose/petsc/src/ksp/ksp/impls/gmres/gmres.c:247 > #16 0x00007fffe94fb1c4 in KSPSolve_Private (ksp=0x555556068fc0, b=b at entry=0x55555568e510, > x=, > x at entry=0x55555607cce0) at > /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:914 > #17 0x00007fffe94fd8b1 in KSPSolve (ksp=, b=b at entry=0x55555568e510, > x=x at entry=0x55555607cce0) > at > /home/lindad/projects/moose/petsc/src/ksp/ksp/interface/itfunc.c:1086 > #18 0x00007fffe9582850 in SNESSolve_NEWTONLS (snes=0x555556065610) > at 
/home/lindad/projects/moose/petsc/src/snes/impls/ls/ls.c:225 > #19 0x00007fffe959c7ee in SNESSolve (snes=0x555556065610, b=0x0, > x=) > at /home/lindad/projects/moose/petsc/src/snes/interface/snes.c:4809 > > On Thu, Apr 13, 2023 at 1:54?PM Barry Smith wrote: > >> >> It would be useful to see the convergences inside the linear solve so >> perhaps start with >> >> -ksp_monitor_true_residual >> >> -fieldsplit_u_ksp_type richardson (this is to allow the monitor below >> to work) >> -fieldsplit_u_ksp_max_its 1 >> -fieldsplit_u_ksp_monitor >> >> Perhaps others, Matt/Jed/Pierre/Stefano likely know better off the cuff >> than me. >> >> We should have a convenience option like -pc_fieldsplit_schur_monitor >> similar to the -pc_fieldsplit_gkb_monitor >> >> >> >> On Apr 13, 2023, at 4:33 PM, Alexander Lindsay >> wrote: >> >> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds >> numbers. My options table >> >> -dm_moose_fieldsplit_names u,p >> -dm_moose_nfieldsplits 2 >> -fieldsplit_p_dm_moose_vars pressure >> -fieldsplit_p_ksp_type preonly >> -fieldsplit_p_pc_type jacobi >> -fieldsplit_u_dm_moose_vars vel_x,vel_y >> -fieldsplit_u_ksp_type preonly >> -fieldsplit_u_pc_hypre_type boomeramg >> -fieldsplit_u_pc_type hypre >> -pc_fieldsplit_schur_fact_type full >> -pc_fieldsplit_schur_precondition selfp >> -pc_fieldsplit_type schur >> -pc_type fieldsplit >> >> works wonderfully for a low Reynolds number of 2.2. The solver >> performance crushes LU as I scale up the problem. However, not surprisingly >> this options table struggles when I bump the Reynolds number to 220. I've >> read that use of AIR (approximate ideal restriction) can improve >> performance for advection dominated problems. I've tried >> setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion >> problem and the option works fine. However, when applying it to my >> field-split preconditioned Navier-Stokes system, I get immediate >> non-convergence: >> >> 0 Nonlinear |R| = 1.033077e+03 >> 0 Linear |R| = 1.033077e+03 >> Linear solve did not converge due to DIVERGED_NANORINF iterations 0 >> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0 >> >> Does anyone have an idea as to why this might be happening? If not, I'd >> take a suggestion on where to set a breakpoint to start my own >> investigation. Alternatively, I welcome other preconditioning suggestions >> for an advection dominated problem. >> >> Alex >> >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexlindsay239 at gmail.com Thu Apr 13 23:02:55 2023 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Thu, 13 Apr 2023 21:02:55 -0700 Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split In-Reply-To: References: Message-ID: <204E9576-8A6C-4E35-815C-0A05FB30CA73@gmail.com> An HTML attachment was scrubbed... URL: From pierre.jolivet at lip6.fr Thu Apr 13 23:54:28 2023 From: pierre.jolivet at lip6.fr (Pierre Jolivet) Date: Fri, 14 Apr 2023 06:54:28 +0200 Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split In-Reply-To: References: Message-ID: <2A65BB14-67A4-4B9E-AC44-7F0A23C27D56@lip6.fr> > On 13 Apr 2023, at 10:33 PM, Alexander Lindsay wrote: > > Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. 
My options table
>
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
>
> works wonderfully for a low Reynolds number of 2.2. The solver performance crushes LU as I scale up the problem. However, not surprisingly this options table struggles when I bump the Reynolds number to 220. I've read that use of AIR (approximate ideal restriction) can improve performance for advection dominated problems. I've tried setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and the option works fine. However, when applying it to my field-split preconditioned Navier-Stokes system, I get immediate non-convergence:
>
> 0 Nonlinear |R| = 1.033077e+03
> 0 Linear |R| = 1.033077e+03
> Linear solve did not converge due to DIVERGED_NANORINF iterations 0
> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>
> Does anyone have an idea as to why this might be happening?

Do not use this option, even when not part of PCFIELDSPLIT.
There is some missing plumbing in PETSc which makes it unusable, see Ben's comment here https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
In fact, it's quite easy to make HYPRE generate NaN with a very simple stabilized convection-diffusion problem near the pure convection limit (something that ℓAIR is supposed to handle).
Even worse, you can make HYPRE fill your terminal with printf-style debugging messages https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416 with this option turned on.
As a result, I have been unable to reproduce any of the ℓAIR results.
This also explains why I have been using plain BoomerAMG instead of ℓAIR for the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if you would like to try the PC we are using, I could send you the command line options).

Thanks,
Pierre

> If not, I'd take a suggestion on where to set a breakpoint to start my own investigation. Alternatively, I welcome other preconditioning suggestions for an advection dominated problem.
>
> Alex

From alexlindsay239 at gmail.com Fri Apr 14 00:02:48 2023
From: alexlindsay239 at gmail.com (Alexander Lindsay)
Date: Thu, 13 Apr 2023 22:02:48 -0700
Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split
In-Reply-To: <2A65BB14-67A4-4B9E-AC44-7F0A23C27D56 at lip6.fr>
References: <2A65BB14-67A4-4B9E-AC44-7F0A23C27D56 at lip6.fr>
Message-ID:

Pierre,

This is very helpful information. Thank you. Yes I would appreciate those command line options if you're willing to share!

> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet wrote:
>
>>> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay wrote:
>>>
>>> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers.
>>> My options table
>>>
>>> -dm_moose_fieldsplit_names u,p
>>> -dm_moose_nfieldsplits 2
>>> -fieldsplit_p_dm_moose_vars pressure
>>> -fieldsplit_p_ksp_type preonly
>>> -fieldsplit_p_pc_type jacobi
>>> -fieldsplit_u_dm_moose_vars vel_x,vel_y
>>> -fieldsplit_u_ksp_type preonly
>>> -fieldsplit_u_pc_hypre_type boomeramg
>>> -fieldsplit_u_pc_type hypre
>>> -pc_fieldsplit_schur_fact_type full
>>> -pc_fieldsplit_schur_precondition selfp
>>> -pc_fieldsplit_type schur
>>> -pc_type fieldsplit
>>>
>>> works wonderfully for a low Reynolds number of 2.2. The solver performance crushes LU as I scale up the problem. However, not surprisingly this options table struggles when I bump the Reynolds number to 220. I've read that use of AIR (approximate ideal restriction) can improve performance for advection dominated problems. I've tried setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and the option works fine. However, when applying it to my field-split preconditioned Navier-Stokes system, I get immediate non-convergence:
>>>
>>> 0 Nonlinear |R| = 1.033077e+03
>>> 0 Linear |R| = 1.033077e+03
>>> Linear solve did not converge due to DIVERGED_NANORINF iterations 0
>>> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>>>
>>> Does anyone have an idea as to why this might be happening?
>>
>> Do not use this option, even when not part of PCFIELDSPLIT.
>> There is some missing plumbing in PETSc which makes it unusable, see Ben's comment here https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
>> In fact, it's quite easy to make HYPRE generate NaN with a very simple stabilized convection-diffusion problem near the pure convection limit (something that ℓAIR is supposed to handle).
>> Even worse, you can make HYPRE fill your terminal with printf-style debugging messages https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416 with this option turned on.
>> As a result, I have been unable to reproduce any of the ℓAIR results.
>> This also explains why I have been using plain BoomerAMG instead of ℓAIR for the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if you would like to try the PC we are using, I could send you the command line options).
>>
>> Thanks,
>> Pierre
>>
>>> If not, I'd take a suggestion on where to set a breakpoint to start my own investigation. Alternatively, I welcome other preconditioning suggestions for an advection dominated problem.
>>>
>>> Alex

From pierre.jolivet at lip6.fr Fri Apr 14 00:28:55 2023
From: pierre.jolivet at lip6.fr (Pierre Jolivet)
Date: Fri, 14 Apr 2023 07:28:55 +0200
Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split
In-Reply-To:
References: <2A65BB14-67A4-4B9E-AC44-7F0A23C27D56 at lip6.fr>
Message-ID: <0DE6DDB9-759F-4DA8-86F1-CFA48F7AAE79 at lip6.fr>

> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay wrote:
>
> Pierre,
>
> This is very helpful information. Thank you. Yes I would appreciate those command line options if you're willing to share!

No problem, I'll get in touch with you in private first, because it may require some extra work (need a couple of extra options in PETSc ./configure), and this is not very related to the problem at hand, so best not to spam the mailing list.

Thanks,
Pierre

>> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet wrote:
>>
>>
>>> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay wrote:
>>>
>>> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. My options table
>>>
>>> -dm_moose_fieldsplit_names u,p
>>> -dm_moose_nfieldsplits 2
>>> -fieldsplit_p_dm_moose_vars pressure
>>> -fieldsplit_p_ksp_type preonly
>>> -fieldsplit_p_pc_type jacobi
>>> -fieldsplit_u_dm_moose_vars vel_x,vel_y
>>> -fieldsplit_u_ksp_type preonly
>>> -fieldsplit_u_pc_hypre_type boomeramg
>>> -fieldsplit_u_pc_type hypre
>>> -pc_fieldsplit_schur_fact_type full
>>> -pc_fieldsplit_schur_precondition selfp
>>> -pc_fieldsplit_type schur
>>> -pc_type fieldsplit
>>>
>>> works wonderfully for a low Reynolds number of 2.2. The solver performance crushes LU as I scale up the problem. However, not surprisingly this options table struggles when I bump the Reynolds number to 220. I've read that use of AIR (approximate ideal restriction) can improve performance for advection dominated problems. I've tried setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and the option works fine. However, when applying it to my field-split preconditioned Navier-Stokes system, I get immediate non-convergence:
>>>
>>> 0 Nonlinear |R| = 1.033077e+03
>>> 0 Linear |R| = 1.033077e+03
>>> Linear solve did not converge due to DIVERGED_NANORINF iterations 0
>>> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>>>
>>> Does anyone have an idea as to why this might be happening?
>>
>> Do not use this option, even when not part of PCFIELDSPLIT.
>> There is some missing plumbing in PETSc which makes it unusable, see Ben's comment here https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
>> In fact, it's quite easy to make HYPRE generate NaN with a very simple stabilized convection-diffusion problem near the pure convection limit (something that ℓAIR is supposed to handle).
>> Even worse, you can make HYPRE fill your terminal with printf-style debugging messages https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416 with this option turned on.
>> As a result, I have been unable to reproduce any of the ℓAIR results.
>> This also explains why I have been using plain BoomerAMG instead of ℓAIR for the comparison on page 9 of https://arxiv.org/pdf/2201.02250.pdf (if you would like to try the PC we are using, I could send you the command line options).
>>
>> Thanks,
>> Pierre
>>
>>> If not, I'd take a suggestion on where to set a breakpoint to start my own investigation. Alternatively, I welcome other preconditioning suggestions for an advection dominated problem.
>>>
>>> Alex

From thijs.smit at hest.ethz.ch Fri Apr 14 01:57:49 2023
From: thijs.smit at hest.ethz.ch (Smit Thijs)
Date: Fri, 14 Apr 2023 06:57:49 +0000
Subject: [petsc-users] Drawing a line plot with two lines
Message-ID: <7a678b6c2758440ba0461dd3e221aa74 at hest.ethz.ch>

Hi All,

I am trying to plot a log-log plot with two lines, one for my error and one with a slope of 1. Plotting only the error or only the slope of 1 works fine, but I would like both lines in the same plot (that has not worked so far). Does somebody know how to solve this?
Find a code snippet below:

PetscDraw draw;
PetscDrawLG lg;
PetscDrawAxis axis;
PetscReal xc, yc;

PetscDrawCreate(PETSC_COMM_SELF, NULL, "Log(Error) vs Log(dx)", PETSC_DECIDE, PETSC_DECIDE, PETSC_DRAW_HALF_SIZE, PETSC_DRAW_HALF_SIZE, &draw);
PetscDrawSetFromOptions(draw);
PetscDrawLGCreate(draw, 2, &lg);
PetscDrawLGSetUseMarkers(lg, PETSC_TRUE);
PetscDrawLGGetAxis(lg, &axis);
PetscDrawAxisSetLabels(axis, NULL, "Log(dx)", "Log(Error)");

for loop {
    xc = PetscLog10Real(dx);
    yc = PetscLog10Real(error);
    PetscDrawLGAddPoint(lg, &xc, &yc); // to plot the error
    PetscDrawLGAddPoint(lg, &xc, &xc); // to plot line with slope 1
    PetscDrawLGDraw(lg);
}

PetscDrawSetPause(draw, -2);
PetscDrawLGDestroy(&lg);
PetscDrawDestroy(&draw);

Best regards,

Thijs Smit

From knepley at gmail.com Fri Apr 14 06:59:07 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 14 Apr 2023 07:59:07 -0400
Subject: [petsc-users] Drawing a line plot with two lines
In-Reply-To: <7a678b6c2758440ba0461dd3e221aa74 at hest.ethz.ch>
References: <7a678b6c2758440ba0461dd3e221aa74 at hest.ethz.ch>
Message-ID:

On Fri, Apr 14, 2023 at 2:58 AM Smit Thijs wrote:

> Hi All,
>
> I am trying to plot a log-log plot with two lines, one for my error and one with a slope of 1. Plotting only the error or only the slope of 1 works fine, but I would like both lines in the same plot (that has not worked so far). Does somebody know how to solve this? Find a code snippet below:
>
> PetscDraw draw;
> PetscDrawLG lg;
> PetscDrawAxis axis;
> PetscReal xc, yc;
> PetscDrawCreate(PETSC_COMM_SELF, NULL, "Log(Error) vs Log(dx)", PETSC_DECIDE, PETSC_DECIDE, PETSC_DRAW_HALF_SIZE, PETSC_DRAW_HALF_SIZE, &draw);
> PetscDrawSetFromOptions(draw);
> PetscDrawLGCreate(draw, 2, &lg);
> PetscDrawLGSetUseMarkers(lg, PETSC_TRUE);
> PetscDrawLGGetAxis(lg, &axis);
> PetscDrawAxisSetLabels(axis, NULL, "Log(dx)", "Log(Error)");
> for loop {
>     xc = PetscLog10Real(dx);
>     yc = PetscLog10Real(error);
>     PetscDrawLGAddPoint(lg, &xc, &yc); // to plot the error

Here you want an array of points, one for each curve:

https://petsc.org/main/manualpages/Draw/PetscDrawLGAddPoint/

  Thanks,

     Matt

>     PetscDrawLGAddPoint(lg, &xc, &xc); // to plot line with slope 1
>     PetscDrawLGDraw(lg);
> }
> PetscDrawSetPause(draw, -2);
> PetscDrawLGDestroy(&lg);
> PetscDrawDestroy(&draw);
>
> Best regards,
>
> Thijs Smit

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From jeremy at seamplex.com Fri Apr 14 07:53:58 2023
From: jeremy at seamplex.com (Jeremy Theler)
Date: Fri, 14 Apr 2023 09:53:58 -0300
Subject: [petsc-users] Effect of -pc_gamg_threshold vs PETSc version
In-Reply-To:
References: <3dc145050abae22d80443de140c9801d6e801bb0.camel at seamplex.com>
Message-ID: <57b4d5f16b9d84b87675fd149dd83527d63651a5.camel at seamplex.com>

Hi Mark. So glad you answered.

> 0) What is your test problem? e.g., 3D Laplacian with Q1 finite
> elements.

I said in my first email it was linear elasticity (and I gave a link where you can see the geometry, BCs, etc.) but I did not specify further details.
It is linear elasticity with a displacement-based FEM formulation using unstructured curved 10-noded tetrahedra.
The matrix is marked as SPD with MatSetOption() and the solver is indeed CG and not the default GMRES.

> First, you can get GAMG diagnostics by running with '-info :pc' and
> grep on GAMG.

Great advice. Now I have a lot more information but I'm not sure how to analyze it. Find attached for each combination of threshold and PETSc version the output of -info :pc -ksp_monitor -ksp_view

In general it looks like 3.18 and 3.19 have fewer KSP iterations than 3.17 but the overall time is larger.

> Anyway, sorry for the changes.
> I hate changing GAMG for this reason and I hate AMG for this reason!

No need to apologize, I just want to better understand how to better exploit your code!

Thanks
--
jeremy

> Thanks,
> Mark
>
> On Thu, Apr 13, 2023 at 8:17 AM Jeremy Theler wrote:
>> When using GAMG+cg for linear elasticity and providing the near
>> nullspace computed by MatNullSpaceCreateRigidBody(), I used to find
>> "experimentally" that a small value of -pc_gamg_threshold on the order
>> of 0.0001 would slightly decrease the solve time.
>>
>> Starting with 3.18, I started seeing that any positive value for the
>> threshold would increase the solve time. I did a quick parametric
>> (serial) run solving an elastic problem with a matrix size of approx
>> 570k x 570k for different values of GAMG threshold and different PETSc
>> versions (compiled with the same compiler, options and flags).
>>
>> I noted that
>>
>> 1. starting from 3.18, a threshold of 0.0001 that used to improve the
>> speed now worsens it.
>> 2. PETSc 3.17 looks like a "sweet spot" of speed
>>
>> I would like to hear any comments you might have.
>>
>> The wall time shown includes the time needed to read the mesh and
>> assemble the stiffness matrix. It is a refined version of the NAFEMS
>> LE10 benchmark described here:
>> https://seamplex.com/feenox/examples/mechanical.html#nafems-le10-thick-plate-pressure-benchmark
>>
>> If you want, I could dump the matrix, rhs and near nullspace vectors
>> and share them.
>>
>> --
>> jeremy theler

(Attachment: log.tar.gz, application/x-compressed-tar, 21926 bytes.)

From knepley at gmail.com Fri Apr 14 08:36:12 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 14 Apr 2023 09:36:12 -0400
Subject: [petsc-users] Effect of -pc_gamg_threshold vs PETSc version
In-Reply-To: <57b4d5f16b9d84b87675fd149dd83527d63651a5.camel at seamplex.com>
References: <3dc145050abae22d80443de140c9801d6e801bb0.camel at seamplex.com> <57b4d5f16b9d84b87675fd149dd83527d63651a5.camel at seamplex.com>
Message-ID:

On Fri, Apr 14, 2023 at 8:54 AM Jeremy Theler wrote:

> Hi Mark. So glad you answered.
>
> > 0) What is your test problem? e.g., 3D Laplacian with Q1 finite
> > elements.
>
> I said in my first email it was linear elasticity (and I gave a link
> where you can see the geometry, BCs, etc.) but I did not specify
> further details.
> It is linear elasticity with a displacement-based FEM formulation using
> unstructured curved 10-noded tetrahedra.

I believe our jargon for this would be "P_2 Lagrange element".

> The matrix is marked as SPD with MatSetOption() and the solver is
> indeed CG and not the default GMRES.
>
> > First, you can get GAMG diagnostics by running with '-info :pc' and
> > grep on GAMG.
>
> Great advice. Now I have a lot more information but I'm not sure how
> to analyze it.
> Find attached for each combination of threshold and PETSc version the
> output of -info :pc -ksp_monitor -ksp_view
>
> In general it looks like 3.18 and 3.19 have fewer KSP iterations than
> 3.17 but the overall time is larger.

I will also look and see if I can figure out the change. This kind of behavior usually means that we somehow made the coarse problem larger. This can make it more accurate (fewer iterations) but more costly. It also makes sense that it is sensitive to the threshold parameter, but that is not the only thing that controls the sparsity. There is also squaring the graph.

Mark, do you know if we changed the default for squaring?

  Thanks,

     Matt

> > Anyway, sorry for the changes.
> > I hate changing GAMG for this reason and I hate AMG for this reason!
>
> No need to apologize, I just want to better understand how to better
> exploit your code!
>
> Thanks
> --
> jeremy
>
> > Thanks,
> > Mark
> >
> > On Thu, Apr 13, 2023 at 8:17 AM Jeremy Theler wrote:
> >> When using GAMG+cg for linear elasticity and providing the near
> >> nullspace computed by MatNullSpaceCreateRigidBody(), I used to find
> >> "experimentally" that a small value of -pc_gamg_threshold on the
> >> order of 0.0001 would slightly decrease the solve time.
> >>
> >> Starting with 3.18, I started seeing that any positive value for
> >> the threshold would increase the solve time. I did a quick
> >> parametric (serial) run solving an elastic problem with a matrix
> >> size of approx 570k x 570k for different values of GAMG threshold
> >> and different PETSc versions (compiled with the same compiler,
> >> options and flags).
> >>
> >> I noted that
> >>
> >> 1. starting from 3.18, a threshold of 0.0001 that used to improve
> >> the speed now worsens it.
> >> 2. PETSc 3.17 looks like a "sweet spot" of speed
> >>
> >> I would like to hear any comments you might have.
> >>
> >> The wall time shown includes the time needed to read the mesh and
> >> assemble the stiffness matrix. It is a refined version of the
> >> NAFEMS LE10 benchmark described here:
> >> https://seamplex.com/feenox/examples/mechanical.html#nafems-le10-thick-plate-pressure-benchmark
> >>
> >> If you want, I could dump the matrix, rhs and near nullspace
> >> vectors and share them.
> >>
> >> --
> >> jeremy theler

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From kabdelaz at purdue.edu Fri Apr 14 11:39:43 2023
From: kabdelaz at purdue.edu (Khaled Nabil Shar Abdelaziz)
Date: Fri, 14 Apr 2023 16:39:43 +0000
Subject: [petsc-users] Issue with SNES Solver in Fortran - Converging with Zero Residual
Message-ID:

Hello,

I hope you are well. I am currently working with the SNES solver in Fortran, utilizing Newton's method. I have provided both the residual function and the Jacobian. However, I have encountered an issue that I hope you can help me with.

In the first non-linear iteration, the solver calculates a non-zero residual value and starts the KSP solver to minimize the error. However, in the second non-linear iteration, it returns an exact zero residual, considering it converged.

As a result, the first few steps provide a somewhat accurate solution, but after around 15 steps, the solution starts diverging rapidly.
I assume this is due to accumulating errors, as the residual in subsequent steps becomes increasingly larger. Here are some outputs from the solver: start_SNES_petsc_solver: var_nd%name=u 0 SNES Function norm 3.412918650183e+01 Attempt! 0 KSP Residual norm 5.861642176595e-01 1 KSP Residual norm 1.411858179645e-01 2 KSP Residual norm 1.388288156571e-01 3 KSP Residual norm 4.900215159087e-02 4 KSP Residual norm 2.559630070894e-02 5 KSP Residual norm 1.007110997387e-02 6 KSP Residual norm 6.371175598940e-03 1 SNES Function norm 0.000000000000e+00 iteration= 1 SNESConvergedReason= 2 istep= 1 ****************************** start_SNES_petsc_solver: var_nd%name=u 0 SNES Function norm 6.831896456736e+01 Attempt! 0 KSP Residual norm 5.850729730568e+00 1 KSP Residual norm 5.176135972454e+00 2 KSP Residual norm 5.106059774079e-01 3 KSP Residual norm 2.058573608172e-01 4 KSP Residual norm 8.430267458444e-02 5 KSP Residual norm 2.421049820170e-02 6 KSP Residual norm 1.387479046692e-02 7 KSP Residual norm 6.556624109622e-03 1 SNES Function norm 0.000000000000e+00 iteration= 1 SNESConvergedReason= 2 istep= 2 ****************************** start_SNES_petsc_solver: var_nd%name=u 0 SNES Function norm 1.024330722398e+02 Attempt! 0 KSP Residual norm 1.324263482159e+01 1 KSP Residual norm 8.772774639367e+00 2 KSP Residual norm 8.717824082000e-01 3 KSP Residual norm 3.965086318719e-01 4 KSP Residual norm 1.380063519887e-01 5 KSP Residual norm 3.983781619335e-02 6 KSP Residual norm 1.690524902818e-02 7 KSP Residual norm 1.371656480592e-02 8 KSP Residual norm 4.598826535286e-03 1 SNES Function norm 0.000000000000e+00 iteration= 1 SNESConvergedReason= 2 istep= 3 ****************************** start_SNES_petsc_solver: var_nd%name=u 0 SNES Function norm 1.371337889713e+02 Attempt! 0 KSP Residual norm 1.823603533770e+01 1 KSP Residual norm 1.823546028484e+01 2 KSP Residual norm 8.167385988622e-01 3 KSP Residual norm 1.930526020067e-01 4 KSP Residual norm 1.768873013055e-01 5 KSP Residual norm 2.692456250466e-02 6 KSP Residual norm 1.130530545617e-02 7 KSP Residual norm 6.934825615412e-03 1 SNES Function norm 0.000000000000e+00 iteration= 1 SNESConvergedReason= 2 istep= 4 ****************************** start_SNES_petsc_solver: var_nd%name=u 0 SNES Function norm 1.717647834464e+02 Attempt! 0 KSP Residual norm 1.015846744437e+02 1 KSP Residual norm 7.622793741160e+00 2 KSP Residual norm 1.381803723895e+00 3 KSP Residual norm 1.305423467184e-01 4 KSP Residual norm 3.606398975875e-02 5 KSP Residual norm 3.463999556864e-02 6 KSP Residual norm 3.268229989659e-02 7 KSP Residual norm 8.847393497789e-03 1 SNES Function norm 0.000000000000e+00 iteration= 1 SNESConvergedReason= 2 Do you have any idea what might be causing this behavior? I appreciate any insights you might have. Best, Khaled -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Fri Apr 14 12:45:15 2023 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 14 Apr 2023 13:45:15 -0400 Subject: [petsc-users] Issue with SNES Solver in Fortran - Converging with Zero Residual In-Reply-To: References: Message-ID: <0DE0A214-E640-4CB7-9267-DE7973C4077E@petsc.dev> Likely there is an issue with the FormFunction you are providing. For a small problem you can call VecView() on the input vector at the start of your routine and then VecView() on the output vector at the end of your routine. This might provide some insight. 
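For illustration, a minimal sketch of what this can look like (the routine and variable names here are hypothetical, not taken from your code):

#include <petsc/finclude/petscsnes.h>
      subroutine FormFunction(snes, x, f, ctx, ierr)
      use petscsnes
      implicit none
      SNES           :: snes
      Vec            :: x, f
      PetscInt       :: ctx   ! user context, unused in this sketch
      PetscErrorCode :: ierr

      ! print the input vector before computing anything
      call VecView(x, PETSC_VIEWER_STDOUT_WORLD, ierr)

      ! ... compute the residual f(x) here ...

      ! print the residual just computed; if this is exactly zero on the
      ! second nonlinear iteration, this routine is the place to look
      call VecView(f, PETSC_VIEWER_STDOUT_WORLD, ierr)
      end subroutine FormFunction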
Since the output vector is presumably all zero you can try running in the debugger and trace through the second call to your FormFunction to see why it generating exactly zero output when it presumably should not be. Barry > On Apr 14, 2023, at 12:39 PM, Khaled Nabil Shar Abdelaziz wrote: > > Hello, > I hope you are well. I am currently working with the SNES solver in Fortran, utilizing Newton's method. I have provided both the residual function and the Jacobian. However, I have encountered an issue that I hope you can help me with. > > In the first non-linear iteration, the solver calculates a non-zero residual value and starts the KSP solver to minimize the error. However, in the second non-linear iteration, it returns an exact zero residual, considering it converged. > > As a result, the first few steps provide a somewhat accurate solution, but after around 15 steps, the solution starts diverging rapidly. I assume this is due to accumulating errors, as the residual in subsequent steps becomes increasingly larger. > > Here are some outputs from the solver: > > start_SNES_petsc_solver: var_nd%name=u > 0 SNES Function norm 3.412918650183e+01 > Attempt! > 0 KSP Residual norm 5.861642176595e-01 > 1 KSP Residual norm 1.411858179645e-01 > 2 KSP Residual norm 1.388288156571e-01 > 3 KSP Residual norm 4.900215159087e-02 > 4 KSP Residual norm 2.559630070894e-02 > 5 KSP Residual norm 1.007110997387e-02 > 6 KSP Residual norm 6.371175598940e-03 > 1 SNES Function norm 0.000000000000e+00 > iteration= 1 > SNESConvergedReason= 2 > > > istep= 1 > ****************************** > > start_SNES_petsc_solver: var_nd%name=u > 0 SNES Function norm 6.831896456736e+01 > Attempt! > 0 KSP Residual norm 5.850729730568e+00 > 1 KSP Residual norm 5.176135972454e+00 > 2 KSP Residual norm 5.106059774079e-01 > 3 KSP Residual norm 2.058573608172e-01 > 4 KSP Residual norm 8.430267458444e-02 > 5 KSP Residual norm 2.421049820170e-02 > 6 KSP Residual norm 1.387479046692e-02 > 7 KSP Residual norm 6.556624109622e-03 > 1 SNES Function norm 0.000000000000e+00 > iteration= 1 > SNESConvergedReason= 2 > > istep= 2 > ****************************** > > start_SNES_petsc_solver: var_nd%name=u > 0 SNES Function norm 1.024330722398e+02 > Attempt! > 0 KSP Residual norm 1.324263482159e+01 > 1 KSP Residual norm 8.772774639367e+00 > 2 KSP Residual norm 8.717824082000e-01 > 3 KSP Residual norm 3.965086318719e-01 > 4 KSP Residual norm 1.380063519887e-01 > 5 KSP Residual norm 3.983781619335e-02 > 6 KSP Residual norm 1.690524902818e-02 > 7 KSP Residual norm 1.371656480592e-02 > 8 KSP Residual norm 4.598826535286e-03 > 1 SNES Function norm 0.000000000000e+00 > iteration= 1 > SNESConvergedReason= 2 > > istep= 3 > ****************************** > > start_SNES_petsc_solver: var_nd%name=u > 0 SNES Function norm 1.371337889713e+02 > Attempt! > 0 KSP Residual norm 1.823603533770e+01 > 1 KSP Residual norm 1.823546028484e+01 > 2 KSP Residual norm 8.167385988622e-01 > 3 KSP Residual norm 1.930526020067e-01 > 4 KSP Residual norm 1.768873013055e-01 > 5 KSP Residual norm 2.692456250466e-02 > 6 KSP Residual norm 1.130530545617e-02 > 7 KSP Residual norm 6.934825615412e-03 > 1 SNES Function norm 0.000000000000e+00 > iteration= 1 > SNESConvergedReason= 2 > > istep= 4 > ****************************** > > start_SNES_petsc_solver: var_nd%name=u > 0 SNES Function norm 1.717647834464e+02 > Attempt! 
> 0 KSP Residual norm 1.015846744437e+02
> 1 KSP Residual norm 7.622793741160e+00
> 2 KSP Residual norm 1.381803723895e+00
> 3 KSP Residual norm 1.305423467184e-01
> 4 KSP Residual norm 3.606398975875e-02
> 5 KSP Residual norm 3.463999556864e-02
> 6 KSP Residual norm 3.268229989659e-02
> 7 KSP Residual norm 8.847393497789e-03
> 1 SNES Function norm 0.000000000000e+00
> iteration= 1
> SNESConvergedReason= 2
>
> Do you have any idea what might be causing this behavior? I appreciate any insights you might have.
>
> Best,
> Khaled

From alexlindsay239 at gmail.com Fri Apr 14 16:58:25 2023
From: alexlindsay239 at gmail.com (Alexander Lindsay)
Date: Fri, 14 Apr 2023 14:58:25 -0700
Subject: [petsc-users] PETSc testing recipes
Message-ID:

Hi, is there a place I can look to understand the testing recipes used in PETSc CI, e.g. what external packages are included (if any), what C++ dialect is used for any external packages built with C++, etc.?

Alex

From jed at jedbrown.org Fri Apr 14 17:16:32 2023
From: jed at jedbrown.org (Jed Brown)
Date: Fri, 14 Apr 2023 16:16:32 -0600
Subject: [petsc-users] PETSc testing recipes In-Reply-To: References: Message-ID: <87a5zads8v.fsf@jedbrown.org>

Look at config/examples/arch-ci-*.py for the configurations. They're driven from .gitlab-ci.yml

Alexander Lindsay writes:

> Hi, is there a place I can look to understand the testing recipes used in
> PETSc CI, e.g. what external packages are included (if any), what C++
> dialect is used for any external packages built with C++, etc.?
>
> Alex

From balay at mcs.anl.gov Fri Apr 14 17:18:28 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Fri, 14 Apr 2023 17:18:28 -0500 (CDT)
Subject: [petsc-users] PETSc testing recipes In-Reply-To: References: Message-ID:

Hm, depending on what you are seeking, you'll have to look at multiple places [and correlate the info from them]:

- you can check one of the pipelines to see exactly what jobs are run (and the logs of the jobs for details), for ex:
https://gitlab.com/petsc/petsc/-/pipelines/836609909

- most of the build scripts are at config/examples/arch-ci-*.py - so that's one way to get a view of what pkgs are tested by what jobs:

balay at p1 /home/balay/petsc (main =)
$ ls config/examples/arch-ci-*.py |wc -l
56

- And wrt specific examples corresponding to an external pkg - say, superlu_dist - you can do something like:

balay at p1 /home/balay/petsc (main =)
$ git grep 'requires:' src |grep superlu_dist |head -5
src/ksp/ksp/tests/ex17.c: requires: superlu_dist complex
src/ksp/ksp/tests/ex17.c: requires: superlu_dist complex
src/ksp/ksp/tests/ex33.c: requires: superlu_dist !complex
src/ksp/ksp/tests/ex33.c: requires: superlu_dist !complex
src/ksp/ksp/tests/ex49.c: requires: superlu_dist

etc.
- Wrt C++ dialect:

balay at p1 /home/balay/petsc (main =)
$ git grep self.minCxxVersion config/BuildSystem/config/packages
config/BuildSystem/config/packages/AMReX.py: self.minCxxVersion = 'c++14'
config/BuildSystem/config/packages/h2opus.py: self.minCxxVersion = 'c++14'
config/BuildSystem/config/packages/kokkos.py: self.minCxxVersion = 'c++17'
config/BuildSystem/config/packages/raja.py: self.minCxxVersion = 'c++14'
config/BuildSystem/config/packages/sycl.py: self.minCxxVersion = 'c++17'

balay at p1 /home/balay/petsc (main =)
$ git grep self.maxCxxVersion config/BuildSystem/config/packages
config/BuildSystem/config/packages/MOAB.py: self.maxCxxVersion = 'c++14'
config/BuildSystem/config/packages/amgx.py: self.maxCxxVersion = 'c++17' # https://github.com/NVIDIA/AMGX/issues/231
config/BuildSystem/config/packages/elemental.py: self.maxCxxVersion = 'c++14'
config/BuildSystem/config/packages/grid.py: self.maxCxxVersion = 'c++17'
config/BuildSystem/config/packages/kokkos.py: self.maxCxxVersion = 'c++17'

However, configure determines the max that the compiler supports and attempts to use that [when only min is set]. And this info is usually in configure.log [or make.log] for the corresponding build.

Satish

On Fri, 14 Apr 2023, Alexander Lindsay wrote:

> Hi, is there a place I can look to understand the testing recipes used in
> PETSc CI, e.g. what external packages are included (if any), what C++
> dialect is used for any external packages built with C++, etc.?
>
> Alex

From jacob.fai at gmail.com Fri Apr 14 17:48:50 2023
From: jacob.fai at gmail.com (Jacob Faibussowitsch)
Date: Fri, 14 Apr 2023 18:48:50 -0400
Subject: [petsc-users] PETSc testing recipes In-Reply-To: References: Message-ID:

More specifically, configure's handling of the C++ dialect has 3 modes:

1. You don't specify any C++ dialect (i.e. configure without passing `--with-cxx-dialect`):

Configure will determine the dialect range for your given setup. It takes into account all of the compilers (including "extra" ones like nvcc) it finds, as well as all of your requested packages. It will also enable GNU extensions if they are available, i.e. passing `-std=gnu++17` over `-std=c++17`. From this range, it will select the highest possible dialect and apply it to all builds.

2. You specify `--with-cxx-dialect=14`:

Configure will do the above but will impose an arbitrary cap of C++14. If something in your environment conflicts with this (i.e. maybe your compiler doesn't support it, or maybe a package requires C++17) it will produce a hard error telling you why this won't work. It still attempts to enable GNU extensions as above.

3. You specify `--with-cxx-dialect=c++14` OR you have `-std=c++14` in `[--]CXXFLAGS` and/or `[--]CXXOPTFLAGS`:

configure assumes you know something it doesn't. It will blindly apply exactly `-std=c++14` everywhere. This includes building PETSc itself and all packages that PETSc builds. It will still check that your compiler supports `-std=c++14` though (and error if it does not).

So if your goal is to have the highest available dialect chosen for you, do nothing. Do not pass any dialect flags, and do not set them in your `CXXFLAGS` or `CXXOPTFLAGS`. Configure will do The Right Thing.

If your goal is to explicitly set the dialect, we recommend using option 2, since that might still allow you to enable GNU extensions.

If something is just not working, then 1. send us your configure.log (maybe we have a bug) and 2. use option #3; it will do exactly as you ask.
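A quick illustration of the three modes as configure invocations (a sketch only; 14 is just a placeholder dialect number):

./configure                           # mode 1: highest supported dialect chosen automatically
./configure --with-cxx-dialect=14     # mode 2: cap at C++14, GNU extensions still possible
./configure --with-cxx-dialect=c++14  # mode 3: apply exactly -std=c++14 everywhere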
Best regards, Jacob Faibussowitsch (Jacob Fai - booss - oh - vitch) > On Apr 14, 2023, at 18:18, Satish Balay via petsc-users wrote: > > Hm depending on what you are seeking - you'll have to look at multiple places [and co-related the info from them] > > - you can check one of the pipelines to see exactly what jobs are run (and the logs of the jobs for details). for ex: > https://gitlab.com/petsc/petsc/-/pipelines/836609909 > > - most of the build scripts are at config/examples/arch-ci-*.py - so thats one way to get a view of what pkgs are tested by what jobs.. > > balay at p1 /home/balay/petsc (main =) > $ ls config/examples/arch-ci-*.py |wc -l > 56 > > - And wrt specific examples corresponding to external pkg - say - superlu_dist - you can do something like: > > balay at p1 /home/balay/petsc (main =) > $ git grep 'requires:' src |grep superlu_dist |head -5 > src/ksp/ksp/tests/ex17.c: requires: superlu_dist complex > src/ksp/ksp/tests/ex17.c: requires: superlu_dist complex > src/ksp/ksp/tests/ex33.c: requires: superlu_dist !complex > src/ksp/ksp/tests/ex33.c: requires: superlu_dist !complex > src/ksp/ksp/tests/ex49.c: requires: superlu_dist > > etc. > > - Wrt C++ dialect: > > balay at p1 /home/balay/petsc (main =) > $ git grep self.minCxxVersion config/BuildSystem/config/packages > config/BuildSystem/config/packages/AMReX.py: self.minCxxVersion = 'c++14' > config/BuildSystem/config/packages/h2opus.py: self.minCxxVersion = 'c++14' > config/BuildSystem/config/packages/kokkos.py: self.minCxxVersion = 'c++17' > config/BuildSystem/config/packages/raja.py: self.minCxxVersion = 'c++14' > config/BuildSystem/config/packages/sycl.py: self.minCxxVersion = 'c++17' > balay at p1 /home/balay/petsc (main =) > $ git grep self.maxCxxVersion config/BuildSystem/config/packages > config/BuildSystem/config/packages/MOAB.py: self.maxCxxVersion = 'c++14' > config/BuildSystem/config/packages/amgx.py: self.maxCxxVersion = 'c++17' # https://github.com/NVIDIA/AMGX/issues/231 > config/BuildSystem/config/packages/elemental.py: self.maxCxxVersion = 'c++14' > config/BuildSystem/config/packages/grid.py: self.maxCxxVersion = 'c++17' > config/BuildSystem/config/packages/kokkos.py: self.maxCxxVersion = 'c++17' > > However configure determines the max that the compiler supports and > attempts to use that [when only min is set]. And this info is usually > in configure.log [or make.log] for the corresponding build. > > Satish > > > On Fri, 14 Apr 2023, Alexander Lindsay wrote: > >> Hi, is there a place I can look to understand the testing recipes used in >> PETSc CI, e.g. what external packages are included (if any), what C++ >> dialect is used for any external packages built with C++, etc.? >> >> Alex >> > From alexlindsay239 at gmail.com Fri Apr 14 23:22:56 2023 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Fri, 14 Apr 2023 21:22:56 -0700 Subject: [petsc-users] PETSc testing recipes In-Reply-To: References: Message-ID: Thank you all for the info! > On Apr 14, 2023, at 3:48 PM, Jacob Faibussowitsch wrote: > > ?More specifically for configure handling of C++ dialect. It has 3 modes: > > 1. You don?t specify any C++ dialect (i.e. configure without passing `--with-cxx-dialect`): > > Configure will determine the dialect range for your given setup. It takes into account all of the compilers (including ?extra? ones like nvcc) it finds, as well as all of your requested packages. It will also enable GNU extensions if they are available, i.e. passing `-std=gnu++17` over `-std=c++17`. 
From this range, it will select the highest possible dialect and apply it to all builds. > > 2. You specify `--with-cxx-dialect=14`: > > Configure will do the above but will impose an arbitrary cap of C++14. If something in your environment conflicts with this (i.e. maybe your compiler doesn?t support it, or maybe a package requires C++17) it will produce a hard error telling you why this won?t work. It still attempts to enable GNU extensions as above. > > 3. You specify `--with-cxx-dialect=c++14` OR you have `-std=c++14` in `[--]CXXFLAGS` and/or `[--]CXXOPTFLAGS`: > > configure assumes you know something it doesn?t. It will blindly apply exactly `-std=c++14` everywhere. This includes building PETSc itself and all packages that PETSc builds. It will still check your compiler supports `-std=c++14` though (and error if it does not). > > So if your goal is to have the highest available dialect chosen for you, do nothing. Do not pass any dialect flags, and do not set them in your `CXXFLAGS` or `CXXOPTFLAGS`. Configure will do The Right Thing. > > If your goal is to explicitly set the dialect, we recommend using option 2 since that might still allow you to enable GNU extensions. > > If something is just not working then 1. send us your configure.log ? maybe we have a bug ? and 2. use option #3, it will do exactly as you ask. > > Best regards, > > Jacob Faibussowitsch > (Jacob Fai - booss - oh - vitch) > >> On Apr 14, 2023, at 18:18, Satish Balay via petsc-users wrote: >> >> Hm depending on what you are seeking - you'll have to look at multiple places [and co-related the info from them] >> >> - you can check one of the pipelines to see exactly what jobs are run (and the logs of the jobs for details). for ex: >> https://gitlab.com/petsc/petsc/-/pipelines/836609909 >> >> - most of the build scripts are at config/examples/arch-ci-*.py - so thats one way to get a view of what pkgs are tested by what jobs.. >> >> balay at p1 /home/balay/petsc (main =) >> $ ls config/examples/arch-ci-*.py |wc -l >> 56 >> >> - And wrt specific examples corresponding to external pkg - say - superlu_dist - you can do something like: >> >> balay at p1 /home/balay/petsc (main =) >> $ git grep 'requires:' src |grep superlu_dist |head -5 >> src/ksp/ksp/tests/ex17.c: requires: superlu_dist complex >> src/ksp/ksp/tests/ex17.c: requires: superlu_dist complex >> src/ksp/ksp/tests/ex33.c: requires: superlu_dist !complex >> src/ksp/ksp/tests/ex33.c: requires: superlu_dist !complex >> src/ksp/ksp/tests/ex49.c: requires: superlu_dist >> >> etc. 
>>
>> - Wrt C++ dialect:
>>
>> balay at p1 /home/balay/petsc (main =)
>> $ git grep self.minCxxVersion config/BuildSystem/config/packages
>> config/BuildSystem/config/packages/AMReX.py: self.minCxxVersion = 'c++14'
>> config/BuildSystem/config/packages/h2opus.py: self.minCxxVersion = 'c++14'
>> config/BuildSystem/config/packages/kokkos.py: self.minCxxVersion = 'c++17'
>> config/BuildSystem/config/packages/raja.py: self.minCxxVersion = 'c++14'
>> config/BuildSystem/config/packages/sycl.py: self.minCxxVersion = 'c++17'
>> balay at p1 /home/balay/petsc (main =)
>> $ git grep self.maxCxxVersion config/BuildSystem/config/packages
>> config/BuildSystem/config/packages/MOAB.py: self.maxCxxVersion = 'c++14'
>> config/BuildSystem/config/packages/amgx.py: self.maxCxxVersion = 'c++17' # https://github.com/NVIDIA/AMGX/issues/231
>> config/BuildSystem/config/packages/elemental.py: self.maxCxxVersion = 'c++14'
>> config/BuildSystem/config/packages/grid.py: self.maxCxxVersion = 'c++17'
>> config/BuildSystem/config/packages/kokkos.py: self.maxCxxVersion = 'c++17'
>>
>> However, configure determines the max that the compiler supports and
>> attempts to use that [when only min is set]. And this info is usually
>> in configure.log [or make.log] for the corresponding build.
>>
>> Satish
>>
>>> On Fri, 14 Apr 2023, Alexander Lindsay wrote:
>>>
>>> Hi, is there a place I can look to understand the testing recipes used in
>>> PETSc CI, e.g. what external packages are included (if any), what C++
>>> dialect is used for any external packages built with C++, etc.?
>>>
>>> Alex

From D.Lathouwers at tudelft.nl Sat Apr 15 06:27:45 2023
From: D.Lathouwers at tudelft.nl (Danny Lathouwers - TNW)
Date: Sat, 15 Apr 2023 11:27:45 +0000
Subject: [petsc-users] problem with PetscInitialize when moving to 3.19.0
Message-ID: <81B887C2-3A40-4BCF-9978-CE20D742CA22@tudelft.nl>

Dear community,

I am transitioning a working Fortran code based on version 3.10.4 to version 3.19.0. The code compiles except for one problem during linking, where PetscInitialize can no longer be found (undefined reference to `petscinitialize_'). I am attaching the piece of code where the call is made. Removing the call makes the whole code compile, so this is the only error. I added the include line of petsc.h as the documentation seems to imply this, but this did not help.

Any help is appreciated.

Thanks,
Danny.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: petsc_mod.f90
Type: application/octet-stream
Size: 1112 bytes

From mfadams at lbl.gov Sat Apr 15 06:35:43 2023
From: mfadams at lbl.gov (Mark Adams)
Date: Sat, 15 Apr 2023 07:35:43 -0400
Subject: [petsc-users] Effect of -pc_gamg_threshold vs PETSc version In-Reply-To: References: <3dc145050abae22d80443de140c9801d6e801bb0.camel@seamplex.com> <57b4d5f16b9d84b87675fd149dd83527d63651a5.camel@seamplex.com> Message-ID:

OK, P2 tets linear elasticity. That is good to know.

Matt, there were a bunch of changes, mostly "small", but a big change was (in 3.18) changing the aggressive coarsening (square graph) from MIS(A'A) to MIS(MIS(A)). In my experiments, the coarsening rates changed a little.
3.17 went from SOR to Jacobi default smoothing, and Jeremy's times went down a little. idk.

Jeremy, you may want more levels of aggressive coarsening with P2 tets. The old syntax will work but the new is -pc_gamg_aggressive_coarsening <1>

More to come,
Mark

On Fri, Apr 14, 2023 at 9:56 AM Matthew Knepley wrote:
> On Fri, Apr 14, 2023 at 8:54 AM Jeremy Theler wrote:
>
>> Hi Mark. So glad you answered.
>>
>>> 0) what is your test problem? eg, 3D Laplacian with Q1 finite elements.
>>
>> I said in my first email it was linear elasticity (and I gave a link
>> where you can see the geometry, BCs, etc.) but I did not specify
>> further details.
>> It is linear elasticity with displacement-based FEM formulation using
>> unstructured curved 10-noded tetrahedra.
>
> I believe our jargon for this would be "P_2 Lagrange element".
>
>> The matrix is marked as SPD with MatSetOption() and the solver is
>> indeed CG and not the default GMRES.
>>
>>> First, you can get GAMG diagnostics by running with '-info :pc' and
>>> grep on GAMG.
>>
>> Great advice. Now I have a lot more of information but I'm not sure how
>> to analyze it. Find attached for each combination of threshold and
>> PETSc version the output of -info :pc -ksp_monitor -ksp_view
>>
>> In general it looks like 3.18 and 3.19 have fewer KSP iterations than
>> 3.17 but the overall time is larger.
>
> I will also look and see if I can figure out the change. This kind of
> behavior usually means that we somehow made the coarse problem larger.
> This can make it more accurate (fewer iterations) but more costly. It
> also makes sense that it is sensitive to the threshold parameter, but
> that is not the only thing that controls the sparsity. There is also
> squaring the graph.
>
> Mark, do you know if we changed the default for squaring?
>
> Thanks,
>
> Matt
>
>>> Anyway, sorry for the changes.
>>> I hate changing GAMG for this reason and I hate AMG for this reason!
>>
>> No need to apologize, I just want to better understand how to better
>> exploit your code!
>>
>> Thanks
>> --
>> jeremy
>>
>>> Thanks,
>>> Mark
>>>
>>> On Thu, Apr 13, 2023 at 8:17 AM Jeremy Theler wrote:
>>>> When using GAMG+cg for linear elasticity and providing the near
>>>> nullspace computed by MatNullSpaceCreateRigidBody(), I used to find
>>>> "experimentally" that a small value of -pc_gamg_threshold in the order
>>>> of 0.0001 would slightly decrease the solve time.
>>>>
>>>> Starting with 3.18, I started seeing that any positive value for the
>>>> threshold would increase the solve time. I did a quick parametric
>>>> (serial) run solving an elastic problem with a matrix size of approx
>>>> 570k x 570k for different values of GAMG threshold and different PETSc
>>>> versions (compiled with the same compiler, options and flags).
>>>>
>>>> I noted that
>>>>
>>>> 1. starting from 3.18, a threshold of 0.0001 that used to improve the
>>>> speed now worsens it.
>>>> 2. PETSc 3.17 looks like a "sweet spot" of speed
>>>>
>>>> I would like to hear any comments you might have.
>>>>
>>>> The wall time shown includes the time needed to read the mesh and
>>>> assemble the stiffness matrix.
It is a refined version of the >> > > NAFEMS >> > > LE10 benchmark described here: >> > > https://seamplex.com/feenox/examples/mechanical.html#nafems-le10- >> > > thick-plate-pressure-benchmark >> > > >> > > If you want, I could dump the matrix, rhs and near nullspace >> > > vectors >> > > and share them. >> > > >> > > -- >> > > jeremy theler >> > > >> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sat Apr 15 06:41:07 2023 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 15 Apr 2023 07:41:07 -0400 Subject: [petsc-users] problem with PetscInitialize when moving to 3.19.0 In-Reply-To: <81B887C2-3A40-4BCF-9978-CE20D742CA22@tudelft.nl> References: <81B887C2-3A40-4BCF-9978-CE20D742CA22@tudelft.nl> Message-ID: On Sat, Apr 15, 2023 at 7:28?AM Danny Lathouwers - TNW via petsc-users < petsc-users at mcs.anl.gov> wrote: > Dear community, > > I am transitioning a working Fortran code based on version 3.10.4 to > version 3.19.0. > The code compiles except for one problem during linking where > PetscInitialize can no longer be found (undefined reference to > `petscinitialize_?). > PetscInitialize() only takes the "ierr" argument now. THanks, Matt > I am attaching the piece of code where the call is made. Removing the call > makes the whole code compile so this is the only error. > I added the include line of petsc.h as the documentation seems to imply > this, but this did not help. > > Any help is appreciated. > > Thanks, > Danny. > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From D.Lathouwers at tudelft.nl Sat Apr 15 06:48:21 2023 From: D.Lathouwers at tudelft.nl (Danny Lathouwers - TNW) Date: Sat, 15 Apr 2023 11:48:21 +0000 Subject: [petsc-users] problem with PetscInitialize when moving to 3.19.0 In-Reply-To: References: <81B887C2-3A40-4BCF-9978-CE20D742CA22@tudelft.nl> Message-ID: <1EA3072B-5A5D-480E-B7C3-B4DA403407B6@tudelft.nl> Dear Matt, Thanks for the quick reply. This however does not resolve the problem. It simply cannot find the routine at all during linking. Danny. > On 15 Apr 2023, at 13:41, Matthew Knepley wrote: > > On Sat, Apr 15, 2023 at 7:28?AM Danny Lathouwers - TNW via petsc-users > wrote: >> Dear community, >> >> I am transitioning a working Fortran code based on version 3.10.4 to version 3.19.0. >> The code compiles except for one problem during linking where PetscInitialize can no longer be found (undefined reference to `petscinitialize_?). > > PetscInitialize() only takes the "ierr" argument now. > > THanks, > > Matt > >> I am attaching the piece of code where the call is made. Removing the call makes the whole code compile so this is the only error. >> I added the include line of petsc.h as the documentation seems to imply this, but this did not help. >> >> Any help is appreciated. >> >> Thanks, >> Danny. >> > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
> -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 1674 bytes Desc: not available URL: From knepley at gmail.com Sat Apr 15 06:58:45 2023 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 15 Apr 2023 07:58:45 -0400 Subject: [petsc-users] problem with PetscInitialize when moving to 3.19.0 In-Reply-To: <1EA3072B-5A5D-480E-B7C3-B4DA403407B6@tudelft.nl> References: <81B887C2-3A40-4BCF-9978-CE20D742CA22@tudelft.nl> <1EA3072B-5A5D-480E-B7C3-B4DA403407B6@tudelft.nl> Message-ID: On Sat, Apr 15, 2023 at 7:48?AM Danny Lathouwers - TNW < D.Lathouwers at tudelft.nl> wrote: > Dear Matt, > > Thanks for the quick reply. > > This however does not resolve the problem. > It simply cannot find the routine at all during linking. > Perhaps your link line is wrong. Let's start with a PETSc example. Can you make Vec ex1f90/F90? cd $PETSC_DIR cd src/vec/vec/tutorials/ make ex1f90 That will also show the correct link line. Thanks, Matt > Danny. > > On 15 Apr 2023, at 13:41, Matthew Knepley wrote: > > On Sat, Apr 15, 2023 at 7:28?AM Danny Lathouwers - TNW via petsc-users < > petsc-users at mcs.anl.gov> wrote: > >> Dear community, >> >> I am transitioning a working Fortran code based on version 3.10.4 to >> version 3.19.0. >> The code compiles except for one problem during linking where >> PetscInitialize can no longer be found (undefined reference to >> `petscinitialize_?). >> > > PetscInitialize() only takes the "ierr" argument now. > > THanks, > > Matt > > >> I am attaching the piece of code where the call is made. Removing the >> call makes the whole code compile so this is the only error. >> I added the include line of petsc.h as the documentation seems to imply >> this, but this did not help. >> >> Any help is appreciated. >> >> Thanks, >> Danny. >> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Sat Apr 15 07:05:10 2023 From: mfadams at lbl.gov (Mark Adams) Date: Sat, 15 Apr 2023 08:05:10 -0400 Subject: [petsc-users] Effect of -pc_gamg_threshold vs PETSc version In-Reply-To: <57b4d5f16b9d84b87675fd149dd83527d63651a5.camel@seamplex.com> References: <3dc145050abae22d80443de140c9801d6e801bb0.camel@seamplex.com> <57b4d5f16b9d84b87675fd149dd83527d63651a5.camel@seamplex.com> Message-ID: On Fri, Apr 14, 2023 at 8:54?AM Jeremy Theler wrote: > Hi Mark. So glad you answered. > > > 0) what is your test problem? eg, 3D Lapacian with Q1 finite > > elements. > > I said in my first email it was linear elasticty (and I gave a link > where you can see the geometry, BCs, etc.) but I did not specifty > further details. > OK, this is fine. Nice 3D tet mesh with P2 (P1 will lock). I assume you have a benign Poisson ratio. > It is linear elasticity with displacement-based FEM formulation using > unstructured curved 10-noded tetrahedra. 
> The matrix is marked as SPD with MatSetOption() and the solver is
> indeed CG and not the default GMRES.

good

>> First, you can get GAMG diagnostics by running with '-info :pc' and
>> grep on GAMG.
>
> Great advice. Now I have a lot more of information but I'm not sure how
> to analyze it. Find attached for each combination of threshold and
> PETSc version the output of -info :pc -ksp_monitor -ksp_view

(new_py-env) 07:43 ~/Downloads/log 2$ grep "grid complexity" *
....
infopc-0.02-17.log:[0] PCSetUp_GAMG(): (null): 7 levels, grid complexity = 1.28914
infopc-0.02-17.log: Complexity: grid = 1.04361 operator = 1.28914
infopc-0.02-18.log: Complexity: grid = 1.05658 operator = 1.64555
infopc-0.02-19.log: Complexity: grid = 1.05658 operator = 1.64555

I was using "grid complexity" and changed to the accepted term "operator". You can see that the new coarsening method is pretty different on this problem. For an isotropic problem like this a zero threshold is a good place to start, but you can use it to tune.

> In general it looks like 3.18 and 3.19 have fewer KSP iterations than
> 3.17 but the overall time is larger.

We need to see the solve times. Run with -log_view and grep for KSPSolve. We can look at the setup time separately. In practice the setup time is amortized unless you use a full Newton nonlinear solver. Your iteration counts are reasonable. They go up a little with Jacobi.

Here are your grid sizes:

(new_py-env) 07:55 ~/Downloads/log 2$ grep N= infopc-0.0001-19.log
[0] PCSetUp_GAMG(): (null): level 0) N=568386, n data rows=3, n data cols=6, nnz/row (ave)=82, np=1
[0] PCSetUp_GAMG(): (null): 1) N=22206, n data cols=6, nnz/row (ave)=445, 1 active pes
[0] PCSetUp_GAMG(): (null): 2) N=2628, n data cols=6, nnz/row (ave)=1082, 1 active pes
[0] PCSetUp_GAMG(): (null): 3) N=180, n data cols=6, nnz/row (ave)=180, 1 active pes
[0] PCSetUp_GAMG(): (null): 4) N=18, n data cols=6, nnz/row (ave)=18, 1 active pes
[0] PCSetUp_GAMG(): (null): PCSetUp_GAMG: call KSPChebyshevSetEigenvalues on level 3 (N=180) with emax = 2.0785 emin = 0.0468896
[0] PCSetUp_GAMG(): (null): PCSetUp_GAMG: call KSPChebyshevSetEigenvalues on level 2 (N=2628) with emax = 1.78977 emin = 5.91431e-08
[0] PCSetUp_GAMG(): (null): PCSetUp_GAMG: call KSPChebyshevSetEigenvalues on level 1 (N=22206) with emax = 2.12541 emin = 4.27581e-09
[0] PCSetUp_GAMG(): (null): PCSetUp_GAMG: call KSPChebyshevSetEigenvalues on level 0 (N=568386) with emax = 3.42185 emin = 0.0914406
(new_py-env) 07:56 ~/Downloads/log 2$ grep N= infopc-0.0001-15.log
[0] PCSetUp_GAMG(): level 0) N=568386, n data rows=3, n data cols=6, nnz/row (ave)=82, np=1
[0] PCSetUp_GAMG(): 1) N=15642, n data cols=6, nnz/row (ave)=368, 1 active pes
[0] PCSetUp_GAMG(): 2) N=1266, n data cols=6, nnz/row (ave)=468, 1 active pes
[0] PCSetUp_GAMG(): 3) N=108, n data cols=6, nnz/row (ave)=108, 1 active pes
[0] PCSetUp_GAMG(): 4) N=12, n data cols=6, nnz/row (ave)=12, 1 active pes

The new version coarsens slower. BTW, use something like -pc_gamg_coarse_eq_limit 1000. Your coarse grids are too small. You can grep on MatLUFactor to check that the coarse grid solve/factor is under control, but 1000 in 3D is pretty conservative.

I am sure you are going to want more aggressive coarsening (newer versions): -pc_gamg_aggressive_coarsening <1>

Just try 10 (ie, all levels) to start.

Mark

>> Anyway, sorry for the changes.
>> I hate changing GAMG for this reason and I hate AMG for this reason!
> > No need to apologize, I just want to better understand how to better > exploit your code! > > Thanks > -- > jeremy > > > > > Thanks, > > Mark > > > > > > > > On Thu, Apr 13, 2023 at 8:17?AM Jeremy Theler > > wrote: > > > When using GAMG+cg for linear elasticity and providing the near > > > nullspace computed by MatNullSpaceCreateRigidBody(), I used to find > > > "experimentally" that a small value of -pc_gamg_threshold in the > > > order > > > of 0.0001 would slightly decrease the solve time. > > > > > > Starting with 3.18, I started seeing that any positive value for > > > the > > > treshold would increase the solve time. I did a quick parametric > > > (serial) run solving an elastic problem with a matrix size of > > > approx > > > 570k x 570k for different values of GAMG threshold and different > > > PETSc > > > versions (compiled with the same compiler, options and flags). > > > > > > I noted that > > > > > > 1. starting from 3.18, a threshold of 0.0001 that used to improve > > > the > > > speed now worsens it. > > > 2. PETSc 3.17 looks like a "sweet spot" of speed > > > > > > I would like to hear any comments you might have. > > > > > > The wall time shown includes the time needed to read the mesh and > > > assemble the stiffness matrix. It is a refined version of the > > > NAFEMS > > > LE10 benchmark described here: > > > https://seamplex.com/feenox/examples/mechanical.html#nafems-le10- > > > thick-plate-pressure-benchmark > > > > > > If you want, I could dump the matrix, rhs and near nullspace > > > vectors > > > and share them. > > > > > > -- > > > jeremy theler > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From D.Lathouwers at tudelft.nl Sat Apr 15 07:07:41 2023 From: D.Lathouwers at tudelft.nl (Danny Lathouwers - TNW) Date: Sat, 15 Apr 2023 12:07:41 +0000 Subject: [petsc-users] problem with PetscInitialize when moving to 3.19.0 In-Reply-To: References: <81B887C2-3A40-4BCF-9978-CE20D742CA22@tudelft.nl> <1EA3072B-5A5D-480E-B7C3-B4DA403407B6@tudelft.nl> Message-ID: Hi Matt, I just solved it :-) the problem is the ?only? in the use statement. After removing it the code compiles and links flawlessly. Many thanks for your suggestions ! Now off to testing this version. Danny. > On 15 Apr 2023, at 13:58, Matthew Knepley wrote: > > On Sat, Apr 15, 2023 at 7:48?AM Danny Lathouwers - TNW > wrote: >> Dear Matt, >> >> Thanks for the quick reply. >> >> This however does not resolve the problem. >> It simply cannot find the routine at all during linking. > > Perhaps your link line is wrong. Let's start with a PETSc example. Can you make Vec ex1f90/F90? > > cd $PETSC_DIR > cd src/vec/vec/tutorials/ > make ex1f90 > > That will also show the correct link line. > > Thanks, > > Matt > >> Danny. >> >>> On 15 Apr 2023, at 13:41, Matthew Knepley > wrote: >>> >>> On Sat, Apr 15, 2023 at 7:28?AM Danny Lathouwers - TNW via petsc-users > wrote: >>>> Dear community, >>>> >>>> I am transitioning a working Fortran code based on version 3.10.4 to version 3.19.0. >>>> The code compiles except for one problem during linking where PetscInitialize can no longer be found (undefined reference to `petscinitialize_?). >>> >>> PetscInitialize() only takes the "ierr" argument now. >>> >>> THanks, >>> >>> Matt >>> >>>> I am attaching the piece of code where the call is made. Removing the call makes the whole code compile so this is the only error. >>>> I added the include line of petsc.h as the documentation seems to imply this, but this did not help. 
>>>> >>>> Any help is appreciated. >>>> >>>> Thanks, >>>> Danny. >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >> > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 1674 bytes Desc: not available URL: From carl-johan.thore at liu.se Sun Apr 16 13:50:58 2023 From: carl-johan.thore at liu.se (Carl-Johan Thore) Date: Sun, 16 Apr 2023 18:50:58 +0000 Subject: [petsc-users] Fieldsplit with redistribute Message-ID: Hello, I'm solving a blocksystem [A C; C' D], where D is not zero, using the PCFIELDSPLIT preconditioner and set the split using PetscFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS-routines). Can PETSc do this automatically? Or else, any hints? Kind regards, Carl-Johan From bsmith at petsc.dev Sun Apr 16 14:11:18 2023 From: bsmith at petsc.dev (Barry Smith) Date: Sun, 16 Apr 2023 15:11:18 -0400 Subject: [petsc-users] Fieldsplit with redistribute In-Reply-To: References: Message-ID: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> There is no code to do this currently. I would start by building your IS for each split before the PCRedistribute and then adding to the PCApply_Redistribute() code that "fixes" these IS by "removing" the entries of the IS associated with removed degrees of freedom and then shifting the entries indices of the IS by taking into account the removed indices. But you have probably already been trying this? It does require digging directly into the PCApply_Redistribute() to get the needed information (which degrees of freedom are removed by the redistribute code), plus it requires shifting the MPI rank ownership of the entries of the IS in the same way the MPI rank ownership of the degrees of freedom of the vector are moved. If you have some code that you think should be doing this but doesn't work feel free to send it to us and we may be able to fix it. Barry > On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users wrote: > > Hello, > I'm solving a blocksystem > [A C; > C' D], > where D is not zero, using the PCFIELDSPLIT preconditioner and set the split using PetscFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS-routines). Can PETSc do this automatically? Or else, any hints? > Kind regards, > Carl-Johan From carl-johan.thore at liu.se Sun Apr 16 14:36:06 2023 From: carl-johan.thore at liu.se (Carl-Johan Thore) Date: Sun, 16 Apr 2023 19:36:06 +0000 Subject: [petsc-users] Fieldsplit with redistribute In-Reply-To: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> Message-ID: Thanks for the quick reply Barry! 
I have not tried the version with PCApply_Redistribute that you suggest, but I have a code that does roughly what you describe. It works when running on one rank, but fails on multiple ranks. I suspect the issue is with the use of ISEmbed as, quoting the PETSc-manual, "the resulting IS is sequential, since the index substitution it encodes is purely local" (admittedly I don't fully understand what that means). If you think using ISEmbed is not a good idea, I'll try PCApply_Redistribute() ________________________________ From: Barry Smith Sent: 16 April 2023 21:11:18 To: Carl-Johan Thore Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Fieldsplit with redistribute There is no code to do this currently. I would start by building your IS for each split before the PCRedistribute and then adding to the PCApply_Redistribute() code that "fixes" these IS by "removing" the entries of the IS associated with removed degrees of freedom and then shifting the entries indices of the IS by taking into account the removed indices. But you have probably already been trying this? It does require digging directly into the PCApply_Redistribute() to get the needed information (which degrees of freedom are removed by the redistribute code), plus it requires shifting the MPI rank ownership of the entries of the IS in the same way the MPI rank ownership of the degrees of freedom of the vector are moved. If you have some code that you think should be doing this but doesn't work feel free to send it to us and we may be able to fix it. Barry > On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users wrote: > > Hello, > I'm solving a blocksystem > [A C; > C' D], > where D is not zero, using the PCFIELDSPLIT preconditioner and set the split using PetscFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS-routines). Can PETSc do this automatically? Or else, any hints? > Kind regards, > Carl-Johan -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Sun Apr 16 15:31:14 2023 From: bsmith at petsc.dev (Barry Smith) Date: Sun, 16 Apr 2023 16:31:14 -0400 Subject: [petsc-users] Fieldsplit with redistribute In-Reply-To: References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> Message-ID: The manual page for ISEmbed is incomprehensible to me. Anyways no matter what, you need to know what degrees of freedom are removed by PCDistribute() in order to produce the reduced IS which is why I think you need information only available inside PCSetUp_Redistribute(). (Sorry it is PCSetUp_Redistribute() not PCApply_Redistribute()) Barry > On Apr 16, 2023, at 3:36 PM, Carl-Johan Thore wrote: > > Thanks for the quick reply Barry! > I have not tried the version with PCApply_Redistribute that you suggest, but I have a code that does roughly what you describe. It works when running on one rank, but fails on multiple ranks. I suspect the issue is with the use of ISEmbed as, quoting the PETSc-manual, "the resulting IS is sequential, since the index substitution it encodes is purely local" (admittedly I don't fully understand what that means). 
If you think using ISEmbed is not a good idea, I'll try PCApply_Redistribute() > From: Barry Smith > Sent: 16 April 2023 21:11:18 > To: Carl-Johan Thore > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > There is no code to do this currently. > > I would start by building your IS for each split before the PCRedistribute and then adding to the PCApply_Redistribute() code that "fixes" these IS by "removing" the entries of the IS associated with removed degrees of freedom and then shifting the entries indices of the IS by taking into account the removed indices. But you have probably already been trying this? It does require digging directly into the PCApply_Redistribute() to get the needed information (which degrees of freedom are removed by the redistribute code), plus it requires shifting the MPI rank ownership of the entries of the IS in the same way the MPI rank ownership of the degrees of freedom of the vector are moved. > > If you have some code that you think should be doing this but doesn't work feel free to send it to us and we may be able to fix it. > > Barry > > > > On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users wrote: > > > > Hello, > > I'm solving a blocksystem > > [A C; > > C' D], > > where D is not zero, using the PCFIELDSPLIT preconditioner and set the split using PetscFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS-routines). Can PETSc do this automatically? Or else, any hints? > > Kind regards, > > Carl-Johan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From carl-johan.thore at liu.se Sun Apr 16 16:05:14 2023 From: carl-johan.thore at liu.se (Carl-Johan Thore) Date: Sun, 16 Apr 2023 21:05:14 +0000 Subject: [petsc-users] Fieldsplit with redistribute In-Reply-To: References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> Message-ID: Ok I see, thanks! I'll try PCSetUp_Redistribute() then. ________________________________ From: Barry Smith Sent: 16 April 2023 22:31:14 To: Carl-Johan Thore Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Fieldsplit with redistribute The manual page for ISEmbed is incomprehensible to me. Anyways no matter what, you need to know what degrees of freedom are removed by PCDistribute() in order to produce the reduced IS which is why I think you need information only available inside PCSetUp_Redistribute(). (Sorry it is PCSetUp_Redistribute() not PCApply_Redistribute()) Barry On Apr 16, 2023, at 3:36 PM, Carl-Johan Thore wrote: Thanks for the quick reply Barry! I have not tried the version with PCApply_Redistribute that you suggest, but I have a code that does roughly what you describe. It works when running on one rank, but fails on multiple ranks. I suspect the issue is with the use of ISEmbed as, quoting the PETSc-manual, "the resulting IS is sequential, since the index substitution it encodes is purely local" (admittedly I don't fully understand what that means). If you think using ISEmbed is not a good idea, I'll try PCApply_Redistribute() ________________________________ From: Barry Smith Sent: 16 April 2023 21:11:18 To: Carl-Johan Thore Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Fieldsplit with redistribute There is no code to do this currently. 
I would start by building your IS for each split before the PCRedistribute and then adding to the PCApply_Redistribute() code that "fixes" these IS by "removing" the entries of the IS associated with removed degrees of freedom and then shifting the entries indices of the IS by taking into account the removed indices. But you have probably already been trying this? It does require digging directly into the PCApply_Redistribute() to get the needed information (which degrees of freedom are removed by the redistribute code), plus it requires shifting the MPI rank ownership of the entries of the IS in the same way the MPI rank ownership of the degrees of freedom of the vector are moved. If you have some code that you think should be doing this but doesn't work feel free to send it to us and we may be able to fix it. Barry > On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users wrote: > > Hello, > I'm solving a blocksystem > [A C; > C' D], > where D is not zero, using the PCFIELDSPLIT preconditioner and set the split using PetscFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS-routines). Can PETSc do this automatically? Or else, any hints? > Kind regards, > Carl-Johan -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexlindsay239 at gmail.com Sun Apr 16 18:10:52 2023 From: alexlindsay239 at gmail.com (Alexander Lindsay) Date: Sun, 16 Apr 2023 16:10:52 -0700 Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split In-Reply-To: <0DE6DDB9-759F-4DA8-86F1-CFA48F7AAE79@lip6.fr> References: <0DE6DDB9-759F-4DA8-86F1-CFA48F7AAE79@lip6.fr> Message-ID: <77E433C0-E692-4561-97E9-6B7ED24E0F2C@gmail.com> Are there any plans to get the missing hook into PETSc for AIR? Just curious if there?s an issue I can subscribe to or anything. (Independently I?m excited to test HPDDM out tomorrow) > On Apr 13, 2023, at 10:29 PM, Pierre Jolivet wrote: > > ? >> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay wrote: >> >> Pierre, >> >> This is very helpful information. Thank you. Yes I would appreciate those command line options if you?re willing to share! > > No problem, I?ll get in touch with you in private first, because it may require some extra work (need a couple of extra options in PETSc ./configure), and this is not very related to the problem at hand, so best not to spam the mailing list. > > Thanks, > Pierre > >>>> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet wrote: >>>> >>> ? >>> >>>> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay wrote: >>>> >>>> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. My options table >>>> >>>> -dm_moose_fieldsplit_names u,p >>>> -dm_moose_nfieldsplits 2 >>>> -fieldsplit_p_dm_moose_vars pressure >>>> -fieldsplit_p_ksp_type preonly >>>> -fieldsplit_p_pc_type jacobi >>>> -fieldsplit_u_dm_moose_vars vel_x,vel_y >>>> -fieldsplit_u_ksp_type preonly >>>> -fieldsplit_u_pc_hypre_type boomeramg >>>> -fieldsplit_u_pc_type hypre >>>> -pc_fieldsplit_schur_fact_type full >>>> -pc_fieldsplit_schur_precondition selfp >>>> -pc_fieldsplit_type schur >>>> -pc_type fieldsplit >>>> >>>> works wonderfully for a low Reynolds number of 2.2. The solver performance crushes LU as I scale up the problem. However, not surprisingly this options table struggles when I bump the Reynolds number to 220. 
From alexlindsay239 at gmail.com Sun Apr 16 18:10:52 2023
From: alexlindsay239 at gmail.com (Alexander Lindsay)
Date: Sun, 16 Apr 2023 16:10:52 -0700
Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split
In-Reply-To: <0DE6DDB9-759F-4DA8-86F1-CFA48F7AAE79@lip6.fr>
References: <0DE6DDB9-759F-4DA8-86F1-CFA48F7AAE79@lip6.fr>
Message-ID: <77E433C0-E692-4561-97E9-6B7ED24E0F2C@gmail.com>

Are there any plans to get the missing hook into PETSc for AIR? Just curious if there's an issue I can subscribe to or anything. (Independently I'm excited to test HPDDM out tomorrow)

> On Apr 13, 2023, at 10:29 PM, Pierre Jolivet wrote:
>
>> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay wrote:
>>
>> Pierre,
>>
>> This is very helpful information. Thank you. Yes I would appreciate those command line options if you're willing to share!
>
> No problem, I'll get in touch with you in private first, because it may require some extra work (need a couple of extra options in PETSc ./configure), and this is not very related to the problem at hand, so best not to spam the mailing list.
>
> Thanks,
> Pierre
>
>>> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet wrote:
>>>
>>>> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay wrote:
>>>>
>>>> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. My options table
>>>>
>>>> -dm_moose_fieldsplit_names u,p
>>>> -dm_moose_nfieldsplits 2
>>>> -fieldsplit_p_dm_moose_vars pressure
>>>> -fieldsplit_p_ksp_type preonly
>>>> -fieldsplit_p_pc_type jacobi
>>>> -fieldsplit_u_dm_moose_vars vel_x,vel_y
>>>> -fieldsplit_u_ksp_type preonly
>>>> -fieldsplit_u_pc_hypre_type boomeramg
>>>> -fieldsplit_u_pc_type hypre
>>>> -pc_fieldsplit_schur_fact_type full
>>>> -pc_fieldsplit_schur_precondition selfp
>>>> -pc_fieldsplit_type schur
>>>> -pc_type fieldsplit
>>>>
>>>> works wonderfully for a low Reynolds number of 2.2. The solver performance crushes LU as I scale up the problem. However, not surprisingly, this options table struggles when I bump the Reynolds number to 220. I've read that use of AIR (approximate ideal restriction) can improve performance for advection-dominated problems. I've tried setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion problem and the option works fine. However, when applying it to my field-split preconditioned Navier-Stokes system, I get immediate non-convergence:
>>>>
>>>> 0 Nonlinear |R| = 1.033077e+03
>>>> 0 Linear |R| = 1.033077e+03
>>>> Linear solve did not converge due to DIVERGED_NANORINF iterations 0
>>>> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>>>>
>>>> Does anyone have an idea as to why this might be happening?
>>>
>>> Do not use this option, even when not part of PCFIELDSPLIT.
>>> There is some missing plumbing in PETSc which makes it unusable, see Ben's comment here https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
>>> In fact, it's quite easy to make HYPRE generate NaN with a very simple stabilized convection-diffusion problem near the pure convection limit (something that ℓAIR is supposed to handle).
>>> Even worse, you can make HYPRE fill your terminal with printf-style debugging messages https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416 with this option turned on.
>>> As a result, I have been unable to reproduce any of the ℓAIR results.
>>> This also explains why I have been using plain BoomerAMG instead of ℓAIR for the comparison in page 9 of https://arxiv.org/pdf/2201.02250.pdf (if you would like to try the PC we are using, I could send you the command line options).
>>>
>>> Thanks,
>>> Pierre
>>>
>>>> If not, I'd take a suggestion on where to set a breakpoint to start my own investigation. Alternatively, I welcome other preconditioning suggestions for an advection-dominated problem.
>>>>
>>>> Alex

From pierre.jolivet at lip6.fr Sun Apr 16 23:27:24 2023
From: pierre.jolivet at lip6.fr (Pierre Jolivet)
Date: Mon, 17 Apr 2023 06:27:24 +0200
Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split
In-Reply-To: <77E433C0-E692-4561-97E9-6B7ED24E0F2C@gmail.com>
References: <0DE6DDB9-759F-4DA8-86F1-CFA48F7AAE79@lip6.fr> <77E433C0-E692-4561-97E9-6B7ED24E0F2C@gmail.com>
Message-ID: <5C72E83C-717E-447D-BEC4-C5399636BCE2@lip6.fr>

> On 17 Apr 2023, at 1:10 AM, Alexander Lindsay wrote:
>
> Are there any plans to get the missing hook into PETSc for AIR? Just curious if there's an issue I can subscribe to or anything.

Not that I know of, but it would make for a nice contribution if you feel like creating a PR.

Thanks,
Pierre

> (Independently I'm excited to test HPDDM out tomorrow)
> [... earlier thread trimmed ...]
From edoardo.alinovi at gmail.com Mon Apr 17 04:36:35 2023
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Mon, 17 Apr 2023 11:36:35 +0200
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
Message-ID:

Hello Barry, Matt, Jed,

I have just installed the latest and greatest version of petsc and I am hitting a problem I did not have in previous releases. Here is the error:

[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Object is in wrong state
[1]PETSC ERROR: Not for unassembled vector, did you call VecAssemblyBegin()/VecAssemblyEnd()?
[1]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!

the vector x I am filling is created with:

- call VecDuplicate(this%rhs, x, ierr)

rhs is allocated and I called VecAssemblyBegin()/VecAssemblyEnd() on it; do I need to call them on duplicated vectors as well from now on?

Thank you!

From knepley at gmail.com Mon Apr 17 04:40:35 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 17 Apr 2023 05:40:35 -0400
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

On Mon, Apr 17, 2023 at 5:36 AM Edoardo alinovi wrote:
> [...]
> rhs is allocated and I called VecAssemblyBegin()/VecAssemblyEnd() on it; do I need to call them on duplicated vectors as well from now on?

Only if you change the values. Can you show the entire stack from the error message?

Thanks,
Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/

From edoardo.alinovi at gmail.com Mon Apr 17 05:00:14 2023
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Mon, 17 Apr 2023 12:00:14 +0200
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

Hey Matt,

Thanks for the help. Here is the error:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Not for unassembled vector, did you call VecAssemblyBegin()/VecAssemblyEnd()?
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value: 0.10000000000000000E-0001 source: code
[0]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_p_pc_type value: hypre source: code
[0]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value: 0.10000000000000000E-0001 source: code
[0]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_u_pc_type value: bjacobi source: code
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[0]PETSC ERROR: flubio_coupled on a arch-gnu named betelgeuse by edo Mon Apr 17 12:05:28 2023
[0]PETSC ERROR: Configure options PETSC_ARCH=arch-gnu FOPTFLAGS=-O3 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -with-mpi-dir=/home/edo/user_software_repository/openmpi-4.1.2/build -download-fblaslapack=1 -download-superlu_dist -download-mumps -download-hypre -download-metis -download-parmetis -download-scalapack --download-ml -download-slepc -download-spai -download-fftw
[0]PETSC ERROR: #1 MatMult() at /home/edo/user_software_repository/petsc/src/mat/interface/matrix.c:2557
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Object is in wrong state
[1]PETSC ERROR: Not for unassembled vector, did you call VecAssemblyBegin()/VecAssemblyEnd()?
[1]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[1]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value: 0.10000000000000000E-0001 source: code
[1]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_p_pc_type value: hypre source: code
[1]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value: 0.10000000000000000E-0001 source: code
[1]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_u_pc_type value: bjacobi source: code
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[1]PETSC ERROR: flubio_coupled on a arch-gnu named betelgeuse by edo Mon Apr 17 12:05:28 2023
[1]PETSC ERROR: Configure options PETSC_ARCH=arch-gnu FOPTFLAGS=-O3 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -with-mpi-dir=/home/edo/user_software_repository/openmpi-4.1.2/build -download-fblaslapack=1 -download-superlu_dist -download-mumps -download-hypre -download-metis -download-parmetis -download-scalapack --download-ml -download-slepc -download-spai -download-fftw
[1]PETSC ERROR: #1 MatMult() at /home/edo/user_software_repository/petsc/src/mat/interface/matrix.c:2557
[2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[2]PETSC ERROR: Object is in wrong state
[2]PETSC ERROR: Not for unassembled vector, did you call VecAssemblyBegin()/VecAssemblyEnd()?
[2]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[2]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value: 0.10000000000000000E-0001 source: code
[2]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_p_pc_type value: hypre source: code
[2]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value: 0.10000000000000000E-0001 source: code
[2]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_u_pc_type value: bjacobi source: code
[2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Object is in wrong state
[3]PETSC ERROR: Not for unassembled vector, did you call VecAssemblyBegin()/VecAssemblyEnd()?
[3]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[3]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value: 0.10000000000000000E-0001 source: code
[3]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_p_pc_type value: hypre source: code
[3]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value: 0.10000000000000000E-0001 source: code
[3]PETSC ERROR: Option left: name:-UPeqn_fieldsplit_u_pc_type value: bjacobi source: code
See https://petsc.org/release/faq/ for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[2]PETSC ERROR: flubio_coupled on a arch-gnu named betelgeuse by edo Mon Apr 17 12:05:28 2023
[2]PETSC ERROR: Configure options PETSC_ARCH=arch-gnu FOPTFLAGS=-O3 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -with-mpi-dir=/home/edo/user_software_repository/openmpi-4.1.2/build -download-fblaslapack=1 -download-superlu_dist -download-mumps -download-hypre -download-metis -download-parmetis -download-scalapack --download-ml -download-slepc -download-spai -download-fftw
[3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[3]PETSC ERROR: flubio_coupled on a arch-gnu named betelgeuse by edo Mon Apr 17 12:05:28 2023
[3]PETSC ERROR: Configure options PETSC_ARCH=arch-gnu FOPTFLAGS=-O3 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no -with-mpi-dir=/home/edo/user_software_repository/openmpi-4.1.2/build -download-fblaslapack=1 -download-superlu_dist -download-mumps -download-hypre -download-metis -download-parmetis -download-scalapack --download-ml -download-slepc -download-spai -download-fftw
[2]PETSC ERROR: #1 MatMult() at /home/edo/user_software_repository/petsc/src/mat/interface/matrix.c:2557
[3]PETSC ERROR: #1 MatMult() at /home/edo/user_software_repository/petsc/src/mat/interface/matrix.c:2557

Here is how I am doing it:

! Duplicate vector
call VecDuplicate(this%rhs, x, ierr)
call VecZeroEntries(x, ierr)

call VecDuplicate(this%rhs, vres, ierr)
! Set the current solution
call VecSetValues(x, numberOfElements, mesh%cellGlobalAddr-1, field%phi(1:numberOfElements,iComp), INSERT_VALUES, ierr)

call flubioStopMsg('HELLO')

cheers

From knepley at gmail.com Mon Apr 17 05:03:07 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 17 Apr 2023 06:03:07 -0400
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

On Mon, Apr 17, 2023 at 6:00 AM Edoardo alinovi wrote:
> Hey Matt,
>
> Thanks for the help. Here is the error:
> [... full log trimmed, see above ...]
>
> Here is how I am doing it:
>
> ! Duplicate vector
> call VecDuplicate(this%rhs, x, ierr)
> call VecZeroEntries(x, ierr)
>
> call VecDuplicate(this%rhs, vres, ierr)
> ! Set the current solution
> call VecSetValues(x, numberOfElements, mesh%cellGlobalAddr-1, field%phi(1:numberOfElements,iComp), INSERT_VALUES, ierr)

After VecSetValues(), you must call VecAssemblyBegin/End().

Thanks,
Matt

> call flubioStopMsg('HELLO')
>
> cheers

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/

From edoardo.alinovi at gmail.com Mon Apr 17 05:08:08 2023
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Mon, 17 Apr 2023 12:08:08 +0200
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

Aaah yes, you are right. Do not ask me why, but I was not getting this with 3.18.5, odd.

From edoardo.alinovi at gmail.com Mon Apr 17 05:08:49 2023
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Mon, 17 Apr 2023 12:08:49 +0200
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

Thanks Matt, you're always there when you need <3
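For reference, the corrected sequence for the snippet discussed above is just the original code plus the two assembly calls, a minimal sketch reusing Edoardo's variable names (per the exchange above, the assembly-state check is apparently enforced more strictly in 3.19 than it was in 3.18.5):

  ! Duplicate vector
  call VecDuplicate(this%rhs, x, ierr)
  call VecZeroEntries(x, ierr)

  call VecDuplicate(this%rhs, vres, ierr)
  ! Set the current solution
  call VecSetValues(x, numberOfElements, mesh%cellGlobalAddr-1, field%phi(1:numberOfElements,iComp), INSERT_VALUES, ierr)
  ! Assemble x before it is used in any vector or matrix operation
  call VecAssemblyBegin(x, ierr)
  call VecAssemblyEnd(x, ierr)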
From knepley at gmail.com Mon Apr 17 05:15:02 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 17 Apr 2023 06:15:02 -0400
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

On Mon, Apr 17, 2023 at 6:09 AM Edoardo alinovi wrote:
> Thanks Matt, you're always there when you need <3

Glad it's working! Sometime you have to tell me what it is solving.

Thanks,
Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/

From edoardo.alinovi at gmail.com Mon Apr 17 05:16:39 2023
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Mon, 17 Apr 2023 12:16:39 +0200
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

Do you mean the solver I am messing around with? XD

From knepley at gmail.com Mon Apr 17 05:22:49 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 17 Apr 2023 06:22:49 -0400
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

On Mon, Apr 17, 2023 at 6:16 AM Edoardo alinovi wrote:
> Do you mean the solver I am messing around with? XD

Yes, and what physics it is targeting.

Thanks,
Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/

From edoardo.alinovi at gmail.com Mon Apr 17 05:36:54 2023
From: edoardo.alinovi at gmail.com (Edoardo alinovi)
Date: Mon, 17 Apr 2023 12:36:54 +0200
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

Sure thing, the solver I am working on is this one: https://gitlab.com/alie89/flubio-code-fvm.

It is a 3D, collocated, unstructured, finite volume solver for the incompressible NS equations. I can run steady or unsteady, and I can use SIMPLE, PISO and fractional step methods (both explicit and fully implicit momentum). I can also solve for turbulence (k-omega, BSL, SST, Spalart-Allmaras, LES). I have also implemented some kind of immersed boundary (2D/3D) that I need to resume at some point.

Hot topic of the moment, I am developing a fully coupled pressure-based solver using field split. What I have now is working OK: I have validated it on a lot of 2D problems and am going on with 3D right now. If all the tests pass, I'll focus on tuning the field splitting, which looks to be a quite interesting topic!

Flubio is a project I have been carrying on since my PhD days. The implementation is 99% on my shoulders, despite the fact that I am collaborating with some people around. I am coding evenings and weekends/free time; it gives me a lot of satisfaction and also a lot of insights!
From knepley at gmail.com Mon Apr 17 06:50:07 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 17 Apr 2023 07:50:07 -0400
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
In-Reply-To: References: Message-ID:

On Mon, Apr 17, 2023 at 6:37 AM Edoardo alinovi wrote:
> Sure thing, the solver I am working on is this one: https://gitlab.com/alie89/flubio-code-fvm.
> [...]
> Hot topic of the moment, I am developing a fully coupled pressure-based solver using field split.

I think a very good discussion of the issues from the point of view of FEM is

https://arxiv.org/abs/1810.03315

There should be a similar analysis from the FVM side, although it might not be possible to find a pressure discretization compatible with the FVM velocity for this purpose.

Thanks,
Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/

From alexlindsay239 at gmail.com Mon Apr 17 08:20:27 2023
From: alexlindsay239 at gmail.com (Alexander Lindsay)
Date: Mon, 17 Apr 2023 06:20:27 -0700
Subject: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split
In-Reply-To: <5C72E83C-717E-447D-BEC4-C5399636BCE2@lip6.fr>
References: <0DE6DDB9-759F-4DA8-86F1-CFA48F7AAE79@lip6.fr> <77E433C0-E692-4561-97E9-6B7ED24E0F2C@gmail.com> <5C72E83C-717E-447D-BEC4-C5399636BCE2@lip6.fr>
Message-ID:

Good to know. I may take a shot at it depending on need and time! Opened https://gitlab.com/petsc/petsc/-/issues/1362 for doing so

Alex

On Sun, Apr 16, 2023 at 9:27 PM Pierre Jolivet wrote:
>> On 17 Apr 2023, at 1:10 AM, Alexander Lindsay wrote:
>>
>> Are there any plans to get the missing hook into PETSc for AIR? Just curious if there's an issue I can subscribe to or anything.
>
> Not that I know of, but it would make for a nice contribution if you feel like creating a PR.
>
> Thanks,
> Pierre
> [... earlier thread trimmed ...]

From matteo.semplice at uninsubria.it Mon Apr 17 11:22:26 2023
From: matteo.semplice at uninsubria.it (Matteo Semplice)
Date: Mon, 17 Apr 2023 18:22:26 +0200
Subject: [petsc-users] PETSc error only in debug build
Message-ID:

Dear PETSc users,

I am investigating a strange error occurring when using my code on a cluster; I managed to reproduce it on my machine as well, and it's weird:

- on petsc 3.19, optimized build, the code runs fine, serial and parallel

- on petsc 3.19, --with-debugging=1, the code crashes without giving me a meaningful message. The output is

$ ../levelSet -options_file ../test.opts
Converting from ../pointClouds/2d/ptCloud_cerchio.txt in binary format: this is slow!
Pass in the .info file instead!
Read 50 particles from ../pointClouds/2d/ptCloud_cerchio.txt
Bounding box: [-0.665297, 0.666667] x [-0.666324, 0.666324]
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Invalid stack size 0, pop convertCloudTxt clouds.cpp:139.
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-delta value: 1.0 source: file
[0]PETSC ERROR: Option left: name:-dx value: 0.1 source: file
[0]PETSC ERROR: Option left: name:-extraCells value: 5 source: file
[0]PETSC ERROR: Option left: name:-maxIter value: 200 source: file
[0]PETSC ERROR: Option left: name:-p value: 1.0 source: file
[0]PETSC ERROR: Option left: name:-tau value: 0.1 source: file
[0]PETSC ERROR: Option left: name:-u0tresh value: 0.3 source: file
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[0]PETSC ERROR: ../levelSet on a named signalkuppe by matteo Mon Apr 17 18:04:03 2023
[0]PETSC ERROR: Configure options --download-ml --with-metis --with-parmetis --download-hdf5 --with-triangle --with-gmsh PETSC_DIR=/home/matteo/software/petsc --PETSC_ARCH=dbg --with-debugging=1 --COPTFLAGS=-O --CXXOPTFLAGS=-O --FOPTFLAGS=-O --prefix=/home/matteo/software/petsc/3.19-dbg/
[0]PETSC ERROR: #1 convertCloudTxt() at clouds.cpp:139
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF with errorcode 77.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------

Now, line 139 of clouds.cpp is PetscFunctionReturn(PETSC_SUCCESS), so I cannot understand what the offending operation in that routine is. (Note: this is a conversion routine and skipping it just makes the next routine fail in a similar way...)

My student has also tried to compile PETSc with --with-strict-petscerrorcode and fixing all the compilation errors that were raised, but it didn't help.

Do you have any guess on what to look for?

Bonus question, to assess the cluster setup: what is the default value for --with-debugging? If that option is not specified during PETSc configure, does one get an optimized or a debug build?

Thanks

Matteo

--
Professore Associato in Analisi Numerica
Dipartimento di Scienza e Alta Tecnologia
Università degli Studi dell'Insubria
Via Valleggio, 11 - Como

From pierre.jolivet at lip6.fr Mon Apr 17 11:27:00 2023
From: pierre.jolivet at lip6.fr (Pierre Jolivet)
Date: Mon, 17 Apr 2023 18:27:00 +0200
Subject: [petsc-users] PETSc error only in debug build
In-Reply-To: References: Message-ID:

> On 17 Apr 2023, at 6:22 PM, Matteo Semplice wrote:
>
> [...]
> Now, line 139 of clouds.cpp is PetscFunctionReturn(PETSC_SUCCESS), so I cannot understand what the offending operation in that routine is.

There may be a PetscFunctionBeginUser; missing at the beginning of the convertCloudTxt() function.
Could you double-check this?

Thanks,
Pierre
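To spell out the pattern Pierre is pointing at: in debug builds, PETSc's stack checking requires every function that ends with PetscFunctionReturn() to open with PetscFunctionBegin (inside PETSc itself) or PetscFunctionBeginUser (in user code); an unbalanced PetscFunctionReturn() produces exactly the "Invalid stack size 0, pop" message shown above. A minimal sketch, with the function name taken from this thread and the body elided:

  PetscErrorCode convertCloudTxt(/* ... */)
  {
    /* placed after local declarations, before the first executable statement */
    PetscFunctionBeginUser;
    /* ... read and convert the point cloud, wrapping PETSc calls in PetscCall() ... */
    PetscFunctionReturn(PETSC_SUCCESS);
  }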
From matteo.semplice at uninsubria.it Mon Apr 17 11:35:10 2023
From: matteo.semplice at uninsubria.it (Matteo Semplice)
Date: Mon, 17 Apr 2023 18:35:10 +0200
Subject: [petsc-users] PETSc error only in debug build
In-Reply-To: References: Message-ID: <08bc26ab-d0c9-e544-d143-d2bfaa92a27d@uninsubria.it>

Adding PetscFunctionBeginUser indeed seems to fix this and moves the error to the next function without PetscFunctionBeginUser...

Thanks!

Matteo

On 17/04/23 18:27, Pierre Jolivet wrote:
> [...]
> There may be a PetscFunctionBeginUser; missing at the beginning of the convertCloudTxt() function.
> Could you double-check this?

--
Professore Associato in Analisi Numerica
Dipartimento di Scienza e Alta Tecnologia
Università degli Studi dell'Insubria
Via Valleggio, 11 - Como

From jrwrigh.iii at gmail.com Mon Apr 17 12:30:30 2023
From: jrwrigh.iii at gmail.com (James Wright)
Date: Mon, 17 Apr 2023 11:30:30 -0600
Subject: [petsc-users] Composing different Field Components into single Vector (or data array)
Message-ID:

Hello,

I currently have two DMPlex objects, both `DMClone`ed from the same "original" DMPlex object. They have different fields with different numbers of components on each of them. I would like to compose certain components from each into a single contiguous array. I'd also like to do this from a DMGlobal vector. What is the best/most robust way to do that?

Obviously I can just `VecGetArrayRead` each of the vectors and loop through them manually, but that relies on knowledge of the array data layout. There's `DMPlexGetLocalOffsets`, but since I want to use the global Vec, there isn't a corresponding `DMPlexGetGlobalOffsets` (that I can see anyways). This manual looping would also require that the underlying arrangement of the DOFs is the same between the two DMPlex objects, but I assume this is true from the `DMClone` operation.

Thanks,

James Wright
Graduate Research Assistant, PhD
University of Colorado Boulder
Cell: (864) 498 8869
Email: james at jameswright.xyz
Website: jameswright.xyz

From bourdin at mcmaster.ca Mon Apr 17 13:42:11 2023
From: bourdin at mcmaster.ca (Blaise Bourdin)
Date: Mon, 17 Apr 2023 18:42:11 +0000
Subject: [petsc-users] Composing different Field Components into single Vector (or data array)
In-Reply-To: References: Message-ID: <33427F79-6425-42F8-A64A-8D65237D6929@mcmaster.ca>

An HTML attachment was scrubbed...
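Since the reply above was scrubbed from the archive, here is one possible approach, sketched as an assumption rather than as the answer that was actually given: work on local vectors, where the PetscSection offsets returned by the DM index the array directly, and rely on DMClone() preserving the mesh points so the same point can be queried in both DMs. The names dm, glob, and the field number 0 are illustrative:

  Vec                loc;
  PetscSection       sec;
  const PetscScalar *a;
  PetscInt           pStart, pEnd;

  PetscCall(DMGetLocalVector(dm, &loc));
  PetscCall(DMGlobalToLocal(dm, glob, INSERT_VALUES, loc));
  PetscCall(DMGetLocalSection(dm, &sec));
  PetscCall(DMPlexGetChart(dm, &pStart, &pEnd));
  PetscCall(VecGetArrayRead(loc, &a));
  for (PetscInt p = pStart; p < pEnd; ++p) {
    PetscInt dof, off;
    PetscCall(PetscSectionGetFieldDof(sec, p, 0, &dof));
    PetscCall(PetscSectionGetFieldOffset(sec, p, 0, &off));
    for (PetscInt d = 0; d < dof; ++d) {
      /* a[off + d] is component d of field 0 at point p; copy the wanted
         components into a single packed buffer here */
    }
  }
  PetscCall(VecRestoreArrayRead(loc, &a));
  PetscCall(DMRestoreLocalVector(dm, &loc));

Running the same loop over the second cloned DM and interleaving the copies would give the single contiguous array asked about above.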
From alexlindsay239 at gmail.com Mon Apr 17 17:22:44 2023
From: alexlindsay239 at gmail.com (Alexander Lindsay)
Date: Mon, 17 Apr 2023 16:22:44 -0600
Subject: [petsc-users] PCHPDDM and matrix type
Message-ID:

I'm likely revealing a lot of ignorance, but in order to use HPDDM as a preconditioner, does my system matrix (I am using the same matrix for A and P) need to be a block type, e.g. baij or sbaij? In MOOSE our default is aij and I am currently getting

[1]PETSC ERROR: #1 buildTwo() at /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012

with options:

-pc_type hpddm
-pc_hpddm_block_splitting
-pc_hpddm_coarse_mat_type baij
-pc_hpddm_coarse_pc_type lu
-pc_hpddm_define_subdomains
-pc_hpddm_levels_1_eps_gen_non_hermitian
-pc_hpddm_levels_1_eps_nev 50
-pc_hpddm_levels_1_eps_threshold 0.1
-pc_hpddm_levels_1_st_matstructure SAME
-pc_hpddm_levels_1_st_share_sub_ksp
-pc_hpddm_levels_1_sub_pc_factor_mat_solver_type mumps
-pc_hpddm_levels_1_sub_pc_type lu

Alex

From alexlindsay239 at gmail.com Mon Apr 17 17:24:11 2023
From: alexlindsay239 at gmail.com (Alexander Lindsay)
Date: Mon, 17 Apr 2023 16:24:11 -0600
Subject: [petsc-users] PCHPDDM and matrix type
In-Reply-To: References: Message-ID:

If it helps: if I use those exact same options in serial, then no errors and the linear solve is beautiful :-)

On Mon, Apr 17, 2023 at 4:22 PM Alexander Lindsay wrote:
> [...]

From knepley at gmail.com Mon Apr 17 17:55:30 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 17 Apr 2023 18:55:30 -0400
Subject: [petsc-users] PCHPDDM and matrix type
In-Reply-To: References: Message-ID:

I don't think so. Can you show the whole stack?

Thanks,
Matt

On Mon, Apr 17, 2023 at 6:24 PM Alexander Lindsay wrote:
> If it helps: if I use those exact same options in serial, then no errors and the linear solve is beautiful :-)
> [...]

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/

From alexlindsay239 at gmail.com Mon Apr 17 18:26:28 2023
From: alexlindsay239 at gmail.com (Alexander Lindsay)
Date: Mon, 17 Apr 2023 17:26:28 -0600
Subject: [petsc-users] PCHPDDM and matrix type
In-Reply-To: References: Message-ID:

I don't really get much more of a stack trace out:

[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[1]PETSC ERROR: Invalid argument
[1]PETSC ERROR: [0]PETSC ERROR: Option left: name:-i value: full_upwinding_2D.i source: command line
[0]PETSC ERROR: [1]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
Option left: name:-ksp_converged_reason value: ::failed source: code
[0]PETSC ERROR: Option left: name:-pc_hpddm_coarse_mat_type value: baij source: command line
[0]PETSC ERROR: Option left: name:-pc_hpddm_coarse_pc_type value: lu source: command line
[1]PETSC ERROR: Option left: name:-i value: full_upwinding_2D.i source: command line
[1]PETSC ERROR: Option left: name:-ksp_converged_reason value: ::failed source: code
[0]PETSC ERROR: Option left: name:-snes_converged_reason value: ::failed source: code
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.17.4-3368-g5a48edb989d GIT Date: 2023-04-16 17:35:24 +0000
[1]PETSC ERROR: Option left: name:-pc_hpddm_coarse_mat_type value: baij source: command line
[1]PETSC ERROR: Option left: name:-pc_hpddm_coarse_pc_type value: lu source: command line
[0]PETSC ERROR: ../../../moose_test-opt on a arch-moose named rod.hpc.inl.gov by lindad Mon Apr 17 16:11:09 2023
[0]PETSC ERROR: Configure options --download-hypre=1 --with-shared-libraries=1 --download-hdf5=1 --with-hdf5-fortran-bindings=0 --with-debugging=no --download-fblaslapack=1 --download-metis=1 --download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1 --download-mumps=1 --download-strumpack=1 --download-scalapack=1 --download-slepc=1 --with-mpi=1 --with-openmp=1 --with-cxx-dialect=C++11 --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices --with-make-np=256 --download-hpddm
[1]PETSC ERROR: Option left: name:-snes_converged_reason value: ::failed source: code
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: [0]PETSC ERROR: #1 buildTwo() at /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
Petsc Development GIT revision: v3.17.4-3368-g5a48edb989d GIT Date: 2023-04-16 17:35:24 +0000
[1]PETSC ERROR: ../../../moose_test-opt on a arch-moose named rod.hpc.inl.gov by lindad Mon Apr 17 16:11:09 2023
[1]PETSC ERROR: Configure options --download-hypre=1 --with-shared-libraries=1 --download-hdf5=1 --with-hdf5-fortran-bindings=0 --with-debugging=no --download-fblaslapack=1 --download-metis=1 --download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1 --download-mumps=1 --download-strumpack=1 --download-scalapack=1 --download-slepc=1 --with-mpi=1 --with-openmp=1 --with-cxx-dialect=C++11 --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices --with-make-np=256 --download-hpddm
[1]PETSC ERROR: #1 buildTwo() at /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012

On Mon, Apr 17, 2023 at 4:55 PM Matthew Knepley wrote:
> I don't think so. Can you show the whole stack?
> [...]

From knepley at gmail.com Mon Apr 17 18:49:04 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 17 Apr 2023 19:49:04 -0400
Subject: [petsc-users] PCHPDDM and matrix type
In-Reply-To: References: Message-ID:

Yes, I cannot figure the error out. We will wait for Pierre to weigh in.

Thanks,
Matt

On Mon, Apr 17, 2023 at 7:26 PM Alexander Lindsay wrote:
> I don't really get much more of a stack trace out:
> [... full log trimmed, see above ...]

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/

From bourdin at mcmaster.ca Mon Apr 17 20:15:32 2023
From: bourdin at mcmaster.ca (Blaise Bourdin)
Date: Tue, 18 Apr 2023 01:15:32 +0000
Subject: [petsc-users] Composing different Field Components into single Vector (or data array)
In-Reply-To: References: <33427F79-6425-42F8-A64A-8D65237D6929@mcmaster.ca>
Message-ID: <188210CA-BDD3-427E-BF2C-450719E64386@mcmaster.ca>

An HTML attachment was scrubbed...

From pierre.jolivet at lip6.fr Mon Apr 17 23:27:24 2023
From: pierre.jolivet at lip6.fr (Pierre Jolivet)
Date: Tue, 18 Apr 2023 06:23:43 +0200
Subject: [petsc-users] PCHPDDM and matrix type
In-Reply-To: References: Message-ID: <55F9689B-88DC-4D1A-8E6C-6A21DA664F85@lip6.fr>

1) PCHPDDM handles AIJ, BAIJ, SBAIJ, IS, NORMAL, NORMALHERMITIAN, SCHURCOMPLEMENT, HTOOL
2) This PC is based on domain decomposition, with no support yet for "over decomposition". If you run with a single process, it's like PCASM or PCBJACOBI: you'll get the same behavior as if you were just using the sub PC (in this case, an exact factorization)
3) The error you are seeing is likely due to a failure while coarsening; I will ask you for some info
4) Unrelated, but you should probably not use --with-cxx-dialect=C++11 and instead stick to --with-cxx-dialect=11 (unless you have a good reason to)

Thanks,
Pierre

> On 18 Apr 2023, at 1:26 AM, Alexander Lindsay wrote:
>
> I don't really get much more of a stack trace out:
> [...]

From yangzongze at gmail.com Tue Apr 18 04:20:27 2023
From: yangzongze at gmail.com (Zongze Yang)
Date: Tue, 18 Apr 2023 17:20:27 +0800
Subject: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant
Message-ID:

Hi,

I am building petsc using gcc@9.5.0, and found the following error:

```
In file included from /usr/include/alloca.h:25,
                 from /usr/include/stdlib.h:497,
                 from /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsys.h:1395,
                 from /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsf.h:7,
                 from /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:1:
/home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:15: error: expected declaration specifiers or '...' before '__builtin_offsetof'
  124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void (*)(void)) * VECOP_LOADNATIVE, "");
      |               ^~~~~~~~
In file included from /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:7:
/home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant
  124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void (*)(void)) * VECOP_LOADNATIVE, "");
      |                                                                                                  ^~
```

Could someone give me some hints to fix it? The configure.log and make.log are attached.

Best wishes,
Zongze

-------------- next part --------------
A non-text attachment was scrubbed...
Name: petsc-make.log
Type: application/octet-stream
Size: 890940 bytes
Desc: not available
From karthikeyan.chockalingam at stfc.ac.uk Tue Apr 18 04:24:00 2023
From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI)
Date: Tue, 18 Apr 2023 09:24:00 +0000
Subject: [petsc-users] Setting up a matrix for Lagrange multiplier
Message-ID:

Hello,

I'm solving a problem using a Lagrange multiplier; the matrix has the form

K = [A P^T
     P 0]

I am familiar with constructing K using MATMPIAIJ. However, I would like to know: if I had [A], could I augment it with [P], [P^T], and [0], all of type MATMPIAIJ? Likewise for vectors as well.

Can you please point me to the right resource, if this is a common operation in PETSc?

Many thanks.

Kind regards,
Karthik.

From knepley at gmail.com Tue Apr 18 05:08:03 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 18 Apr 2023 06:08:03 -0400
Subject: [petsc-users] Setting up a matrix for Lagrange multiplier
Message-ID:

On Tue, Apr 18, 2023 at 5:24 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote:
> Hello,
>
> I'm solving a problem using a Lagrange multiplier; the matrix has the form
>
> K = [A P^T
>      P 0]
>
> I am familiar with constructing K using MATMPIAIJ. However, I would like to know: if I had [A], could I augment it with [P], [P^T], and [0], all of type MATMPIAIJ? Likewise for vectors as well.
>
> Can you please point me to the right resource, if this is a common operation in PETSc?

You can do this at least 2 ways:

1) Assemble your submatrices directly into the larger matrix by constructing local-to-global maps for the emplacement, so that you do not change your assembly code, except to change MatSetValues() to MatSetValuesLocal(). This is usually preferable.

2) Use MATNEST and VECNEST to put pointers to submatrices and subvectors directly in.

Thanks,

Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From balay at mcs.anl.gov Tue Apr 18 08:09:31 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Tue, 18 Apr 2023 08:09:31 -0500 (CDT)
Subject: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant
Message-ID: <776fdd8e-c22a-0b65-783e-364e43a302c5@mcs.anl.gov>

Does this change work?
diff --git a/include/petsc/private/vecimpl.h b/include/petsc/private/vecimpl.h index dd75dbbc00b..168540b546e 100644 --- a/include/petsc/private/vecimpl.h +++ b/include/petsc/private/vecimpl.h @@ -110,7 +110,7 @@ struct _VecOps { PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode); }; -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11)) +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17)) #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23) // static_assert() is a keyword since C23, before that defined as macro in assert.h #include Satish On Tue, 18 Apr 2023, Zongze Yang wrote: > Hi, I am building petsc using gcc at 9.5.0, and found the following error: > > ``` > In file included from /usr/include/alloca.h:25, > from /usr/include/stdlib.h:497, > from > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsys.h:1395, > from > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsf.h:7, > from > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:1: > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:15: > error: expected declaration specifiers or '...' before '__builtin_offsetof' > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > (*)(void)) * VECOP_LOADNATIVE, ""); > | ^~~~~~~~ > In file included from > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:7: > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:98: > error: expected declaration specifiers or '...' before string constant > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > (*)(void)) * VECOP_LOADNATIVE, ""); > | > ^~ > ``` > > Could someone give me some hints to fix it? The configure.log and make.log > are attached. > > > Best wishes, > Zongze > From yangzongze at gmail.com Tue Apr 18 09:07:57 2023 From: yangzongze at gmail.com (Zongze Yang) Date: Tue, 18 Apr 2023 22:07:57 +0800 Subject: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant In-Reply-To: <776fdd8e-c22a-0b65-783e-364e43a302c5@mcs.anl.gov> References: <776fdd8e-c22a-0b65-783e-364e43a302c5@mcs.anl.gov> Message-ID: No, it doesn't. It has the same problem. I just `make clean` and the `make`. Do I need to reconfigure? Best wishes, Zongze On Tue, 18 Apr 2023 at 21:09, Satish Balay wrote: > Does this change work? 
> > diff --git a/include/petsc/private/vecimpl.h > b/include/petsc/private/vecimpl.h > index dd75dbbc00b..168540b546e 100644 > --- a/include/petsc/private/vecimpl.h > +++ b/include/petsc/private/vecimpl.h > @@ -110,7 +110,7 @@ struct _VecOps { > PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode); > }; > > -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11)) > +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17)) > #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23) > // static_assert() is a keyword since C23, before that defined as > macro in assert.h > #include > > > Satish > > On Tue, 18 Apr 2023, Zongze Yang wrote: > > > Hi, I am building petsc using gcc at 9.5.0, and found the following error: > > > > ``` > > In file included from /usr/include/alloca.h:25, > > from /usr/include/stdlib.h:497, > > from > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsys.h:1395, > > from > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsf.h:7, > > from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:1: > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:15: > > error: expected declaration specifiers or '...' before > '__builtin_offsetof' > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > > (*)(void)) * VECOP_LOADNATIVE, ""); > > | ^~~~~~~~ > > In file included from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:7: > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:98: > > error: expected declaration specifiers or '...' before string constant > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > > (*)(void)) * VECOP_LOADNATIVE, ""); > > | > > ^~ > > ``` > > > > Could someone give me some hints to fix it? The configure.log and > make.log > > are attached. > > > > > > Best wishes, > > Zongze > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jacob.fai at gmail.com Tue Apr 18 09:31:12 2023 From: jacob.fai at gmail.com (Jacob Faibussowitsch) Date: Tue, 18 Apr 2023 10:31:12 -0400 Subject: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant In-Reply-To: References: <776fdd8e-c22a-0b65-783e-364e43a302c5@mcs.anl.gov> Message-ID: This is a bug in GCC 9. Can you try the following: $ make clean $ make CFLAGS+='-std=gnu11? Best regards, Jacob Faibussowitsch (Jacob Fai - booss - oh - vitch) > On Apr 18, 2023, at 10:07, Zongze Yang wrote: > > No, it doesn't. It has the same problem. I just `make clean` and the `make`. Do I need to reconfigure? > > Best wishes, > Zongze > > > On Tue, 18 Apr 2023 at 21:09, Satish Balay wrote: > Does this change work? 
> > diff --git a/include/petsc/private/vecimpl.h b/include/petsc/private/vecimpl.h > index dd75dbbc00b..168540b546e 100644 > --- a/include/petsc/private/vecimpl.h > +++ b/include/petsc/private/vecimpl.h > @@ -110,7 +110,7 @@ struct _VecOps { > PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode); > }; > > -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11)) > +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17)) > #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23) > // static_assert() is a keyword since C23, before that defined as macro in assert.h > #include > > > Satish > > On Tue, 18 Apr 2023, Zongze Yang wrote: > > > Hi, I am building petsc using gcc at 9.5.0, and found the following error: > > > > ``` > > In file included from /usr/include/alloca.h:25, > > from /usr/include/stdlib.h:497, > > from > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsys.h:1395, > > from > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsf.h:7, > > from > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:1: > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:15: > > error: expected declaration specifiers or '...' before '__builtin_offsetof' > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > > (*)(void)) * VECOP_LOADNATIVE, ""); > > | ^~~~~~~~ > > In file included from > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:7: > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:98: > > error: expected declaration specifiers or '...' before string constant > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > > (*)(void)) * VECOP_LOADNATIVE, ""); > > | > > ^~ > > ``` > > > > Could someone give me some hints to fix it? The configure.log and make.log > > are attached. > > > > > > Best wishes, > > Zongze > > > From balay at mcs.anl.gov Tue Apr 18 09:36:49 2023 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 18 Apr 2023 09:36:49 -0500 (CDT) Subject: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant In-Reply-To: References: <776fdd8e-c22a-0b65-783e-364e43a302c5@mcs.anl.gov> Message-ID: I think its best if configure can handle this automatically (check for broken compilers). Until then - perhaps we should use: diff --git a/include/petsc/private/vecimpl.h b/include/petsc/private/vecimpl.h index dd75dbbc00b..dd9ef6791c5 100644 --- a/include/petsc/private/vecimpl.h +++ b/include/petsc/private/vecimpl.h @@ -110,12 +110,7 @@ struct _VecOps { PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode); }; -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11)) - #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23) - // static_assert() is a keyword since C23, before that defined as macro in assert.h - #include - #endif - +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17)) static_assert(offsetof(struct _VecOps, duplicate) == sizeof(void (*)(void)) * VECOP_DUPLICATE, ""); static_assert(offsetof(struct _VecOps, set) == sizeof(void (*)(void)) * VECOP_SET, ""); static_assert(offsetof(struct _VecOps, view) == sizeof(void (*)(void)) * VECOP_VIEW, ""); Or just: +#if defined(offsetof) && defined(__cplusplus) Satish On Tue, 18 Apr 2023, Jacob Faibussowitsch wrote: > This is a bug in GCC 9. 
Can you try the following: > > $ make clean > $ make CFLAGS+='-std=gnu11? > > Best regards, > > Jacob Faibussowitsch > (Jacob Fai - booss - oh - vitch) > > > On Apr 18, 2023, at 10:07, Zongze Yang wrote: > > > > No, it doesn't. It has the same problem. I just `make clean` and the `make`. Do I need to reconfigure? > > > > Best wishes, > > Zongze > > > > > > On Tue, 18 Apr 2023 at 21:09, Satish Balay wrote: > > Does this change work? > > > > diff --git a/include/petsc/private/vecimpl.h b/include/petsc/private/vecimpl.h > > index dd75dbbc00b..168540b546e 100644 > > --- a/include/petsc/private/vecimpl.h > > +++ b/include/petsc/private/vecimpl.h > > @@ -110,7 +110,7 @@ struct _VecOps { > > PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode); > > }; > > > > -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11)) > > +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17)) > > #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23) > > // static_assert() is a keyword since C23, before that defined as macro in assert.h > > #include > > > > > > Satish > > > > On Tue, 18 Apr 2023, Zongze Yang wrote: > > > > > Hi, I am building petsc using gcc at 9.5.0, and found the following error: > > > > > > ``` > > > In file included from /usr/include/alloca.h:25, > > > from /usr/include/stdlib.h:497, > > > from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsys.h:1395, > > > from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsf.h:7, > > > from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:1: > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:15: > > > error: expected declaration specifiers or '...' before '__builtin_offsetof' > > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > > > (*)(void)) * VECOP_LOADNATIVE, ""); > > > | ^~~~~~~~ > > > In file included from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:7: > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:98: > > > error: expected declaration specifiers or '...' before string constant > > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > > > (*)(void)) * VECOP_LOADNATIVE, ""); > > > | > > > ^~ > > > ``` > > > > > > Could someone give me some hints to fix it? The configure.log and make.log > > > are attached. > > > > > > > > > Best wishes, > > > Zongze > > > > > > From yangzongze at gmail.com Tue Apr 18 09:41:37 2023 From: yangzongze at gmail.com (Zongze Yang) Date: Tue, 18 Apr 2023 22:41:37 +0800 Subject: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant In-Reply-To: References: <776fdd8e-c22a-0b65-783e-364e43a302c5@mcs.anl.gov> Message-ID: The value of PETSC_C_VERSION = 17. I changed `PETSC_C_VERSION >= 17` to `PETSC_C_VERSION > 17`, then the error is gone. Best wishes, Zongze On Tue, 18 Apr 2023 at 22:36, Satish Balay wrote: > I think its best if configure can handle this automatically (check for > broken compilers). 
Until then - perhaps we should use: > > > diff --git a/include/petsc/private/vecimpl.h > b/include/petsc/private/vecimpl.h > index dd75dbbc00b..dd9ef6791c5 100644 > --- a/include/petsc/private/vecimpl.h > +++ b/include/petsc/private/vecimpl.h > @@ -110,12 +110,7 @@ struct _VecOps { > PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode); > }; > > -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11)) > - #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23) > - // static_assert() is a keyword since C23, before that defined as > macro in assert.h > - #include > - #endif > - > +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17)) > static_assert(offsetof(struct _VecOps, duplicate) == sizeof(void > (*)(void)) * VECOP_DUPLICATE, ""); > static_assert(offsetof(struct _VecOps, set) == sizeof(void (*)(void)) * > VECOP_SET, ""); > static_assert(offsetof(struct _VecOps, view) == sizeof(void (*)(void)) * > VECOP_VIEW, ""); > > > Or just: > > +#if defined(offsetof) && defined(__cplusplus) > > Satish > > On Tue, 18 Apr 2023, Jacob Faibussowitsch wrote: > > > This is a bug in GCC 9. Can you try the following: > > > > $ make clean > > $ make CFLAGS+='-std=gnu11? > > > > Best regards, > > > > Jacob Faibussowitsch > > (Jacob Fai - booss - oh - vitch) > > > > > On Apr 18, 2023, at 10:07, Zongze Yang wrote: > > > > > > No, it doesn't. It has the same problem. I just `make clean` and the > `make`. Do I need to reconfigure? > > > > > > Best wishes, > > > Zongze > > > > > > > > > On Tue, 18 Apr 2023 at 21:09, Satish Balay wrote: > > > Does this change work? > > > > > > diff --git a/include/petsc/private/vecimpl.h > b/include/petsc/private/vecimpl.h > > > index dd75dbbc00b..168540b546e 100644 > > > --- a/include/petsc/private/vecimpl.h > > > +++ b/include/petsc/private/vecimpl.h > > > @@ -110,7 +110,7 @@ struct _VecOps { > > > PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], > InsertMode); > > > }; > > > > > > -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= > 11)) > > > +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= > 17)) > > > #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23) > > > // static_assert() is a keyword since C23, before that defined as > macro in assert.h > > > #include > > > > > > > > > Satish > > > > > > On Tue, 18 Apr 2023, Zongze Yang wrote: > > > > > > > Hi, I am building petsc using gcc at 9.5.0, and found the following > error: > > > > > > > > ``` > > > > In file included from /usr/include/alloca.h:25, > > > > from /usr/include/stdlib.h:497, > > > > from > > > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsys.h:1395, > > > > from > > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsf.h:7, > > > > from > > > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:1: > > > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:15: > > > > error: expected declaration specifiers or '...' before > '__builtin_offsetof' > > > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == > sizeof(void > > > > (*)(void)) * VECOP_LOADNATIVE, ""); > > > > | ^~~~~~~~ > > > > In file included from > > > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:7: > > > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:98: > > > > error: expected declaration specifiers or '...' 
before string > constant > > > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == > sizeof(void > > > > (*)(void)) * VECOP_LOADNATIVE, ""); > > > > | > > > > ^~ > > > > ``` > > > > > > > > Could someone give me some hints to fix it? The configure.log and > make.log > > > > are attached. > > > > > > > > > > > > Best wishes, > > > > Zongze > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Apr 18 09:52:43 2023 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 18 Apr 2023 09:52:43 -0500 (CDT) Subject: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant In-Reply-To: References: <776fdd8e-c22a-0b65-783e-364e43a302c5@mcs.anl.gov> Message-ID: <1bf63daf-96d3-31e9-d231-e8dcb4f5cabc@mcs.anl.gov> BTW: I see this behavior with "gcc (conda-forge gcc 12.2.0-19) 12.2.0" [apart from other issues with this compiler] And CFLAGS='-std=gnu11? doesn't help here. This change works though.. -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11)) +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 23)) Satish On Tue, 18 Apr 2023, Jacob Faibussowitsch wrote: > This is a bug in GCC 9. Can you try the following: > > $ make clean > $ make CFLAGS+='-std=gnu11? > > Best regards, > > Jacob Faibussowitsch > (Jacob Fai - booss - oh - vitch) > > > On Apr 18, 2023, at 10:07, Zongze Yang wrote: > > > > No, it doesn't. It has the same problem. I just `make clean` and the `make`. Do I need to reconfigure? > > > > Best wishes, > > Zongze > > > > > > On Tue, 18 Apr 2023 at 21:09, Satish Balay wrote: > > Does this change work? > > > > diff --git a/include/petsc/private/vecimpl.h b/include/petsc/private/vecimpl.h > > index dd75dbbc00b..168540b546e 100644 > > --- a/include/petsc/private/vecimpl.h > > +++ b/include/petsc/private/vecimpl.h > > @@ -110,7 +110,7 @@ struct _VecOps { > > PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode); > > }; > > > > -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11)) > > +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17)) > > #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23) > > // static_assert() is a keyword since C23, before that defined as macro in assert.h > > #include > > > > > > Satish > > > > On Tue, 18 Apr 2023, Zongze Yang wrote: > > > > > Hi, I am building petsc using gcc at 9.5.0, and found the following error: > > > > > > ``` > > > In file included from /usr/include/alloca.h:25, > > > from /usr/include/stdlib.h:497, > > > from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsys.h:1395, > > > from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsf.h:7, > > > from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:1: > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:15: > > > error: expected declaration specifiers or '...' before '__builtin_offsetof' > > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > > > (*)(void)) * VECOP_LOADNATIVE, ""); > > > | ^~~~~~~~ > > > In file included from > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:7: > > > /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:98: > > > error: expected declaration specifiers or '...' 
before string constant > > > 124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void > > > (*)(void)) * VECOP_LOADNATIVE, ""); > > > | > > > ^~ > > > ``` > > > > > > Could someone give me some hints to fix it? The configure.log and make.log > > > are attached. > > > > > > > > > Best wishes, > > > Zongze > > > > > > From karthikeyan.chockalingam at stfc.ac.uk Tue Apr 18 10:16:46 2023 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Tue, 18 Apr 2023 15:16:46 +0000 Subject: [petsc-users] Setting up a matrix for Lagrange multiplier In-Reply-To: References: Message-ID: Thank you for your response. I spend some time understanding how MatSetValuesLocal and ISLocalToGlobalMappingCreate work. Q1) Will the matrix K be of type MATMPIAIJ or MATIS? K = [A P^T P 0] Q2) Can I use both MatSetValues() to MatSetValuesLocal() to populate K? Since I have already used MatSetValues() to construct A. Q3) What are the advantages of using MatSetValuesLocal()? Is it that I can construct P directly using local indies and map the entrees to the global index in K? Q4) I probably don?t have to construct an independent P matrix? Best regards, Karthik. From: Matthew Knepley Date: Tuesday, 18 April 2023 at 11:08 To: Chockalingam, Karthikeyan (STFC,DL,HC) Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier On Tue, Apr 18, 2023 at 5:24?AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: Hello, I'm solving a problem using the Lagrange multiplier, the matrix has the form K = [A P^T P 0] I am familiar with constructing K using MATMPIAIJ. However, I would like to know if had [A], can I augment it with [P], [P^T] and [0] of type MATMPIAIJ? Likewise for vectors as well. Can you please point me to the right resource, if it is a common operation in PETSc? You can do this at least 2 ways: 1) Assemble you submatrices directly into the larger matrix by constructing local-to-global maps for the emplacement. so that you do not change your assembly code, except to change MatSetValues() to MatSetValuesLocal(). This is usually preferable. 2) Use MATNEST and VecNEST to put pointers to submatrices and subvectors directly in. Thanks, Matt Many thanks. Kind regards, Karthik. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Apr 18 10:20:53 2023 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 18 Apr 2023 11:20:53 -0400 Subject: [petsc-users] Setting up a matrix for Lagrange multiplier In-Reply-To: References: Message-ID: On Tue, Apr 18, 2023 at 11:16?AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalingam at stfc.ac.uk> wrote: > Thank you for your response. I spend some time understanding how > > MatSetValuesLocal and ISLocalToGlobalMappingCreate work. > You can look at SNES ex28 where we do this with DMCOMPOSITE. > Q1) Will the matrix K be of type MATMPIAIJ or MATIS? > > K = [A P^T > > P 0] > I assume MPIAIJ since IS is only used for Neumann-Neumann decompositions. > Q2) Can I use both MatSetValues() to MatSetValuesLocal() to populate K? > Since I have already used MatSetValues() to construct A. > You can, and there would be no changes in serial if K is exactly the upper left block, but in parallel global indices would change. 
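For concreteness, a minimal compilable sketch of that pattern is below. Everything in it is an illustrative assumption rather than code from this thread: a 100 x 100 block A, 4 trailing constraint rows, and placeholder values.

```c
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat                    K;
  ISLocalToGlobalMapping map;
  PetscInt               M = 100, NP = 4, rstart, rend, nloc, *idx;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* K holds the M x M block A plus NP trailing constraint rows/columns */
  PetscCall(MatCreate(PETSC_COMM_WORLD, &K));
  PetscCall(MatSetSizes(K, PETSC_DECIDE, PETSC_DECIDE, M + NP, M + NP));
  PetscCall(MatSetFromOptions(K));
  PetscCall(MatSetUp(K));
  PetscCall(MatGetOwnershipRange(K, &rstart, &rend));

  /* Local numbering: this rank's owned rows first, then the NP global
     constraint rows appended, so every rank can address the P block with
     local indices (on the rank that owns rows M..M+NP-1 this aliases
     them, which is harmless for insertion) */
  nloc = rend - rstart;
  PetscCall(PetscMalloc1(nloc + NP, &idx));
  for (PetscInt i = 0; i < nloc; i++) idx[i] = rstart + i;
  for (PetscInt j = 0; j < NP; j++) idx[nloc + j] = M + j;
  PetscCall(ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, nloc + NP, idx, PETSC_OWN_POINTER, &map));
  PetscCall(MatSetLocalToGlobalMapping(K, map, map)); /* same map for rows and columns */
  PetscCall(ISLocalToGlobalMappingDestroy(&map));     /* K keeps its own reference */

  /* One coupling entry and its transpose, inserted with local indices:
     local row nloc+0 is global constraint row M, local column 0 is this
     rank's first owned row of A */
  PetscInt    prow = nloc + 0, ucol = 0;
  PetscScalar v    = 1.0;
  PetscCall(MatSetValuesLocal(K, 1, &prow, 1, &ucol, &v, ADD_VALUES)); /* into P   */
  PetscCall(MatSetValuesLocal(K, 1, &ucol, 1, &prow, &v, ADD_VALUES)); /* into P^T */

  PetscCall(MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY));
  PetscCall(MatDestroy(&K));
  PetscCall(PetscFinalize());
  return 0;
}
```

The point of the construction is that the assembly calls are identical on one process or many; only the contents of the map change.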
> Q3) What are the advantages of using MatSetValuesLocal()? Is it that I > can construct P directly using local indies and map the entrees to the > global index in K? > You have a monolithic K, so that you can use sparse direct solvers to check things. THis is impossible with separate storage. > Q4) I probably don?t have to construct an independent P matrix > You wouldn't in this case. Thanks, Matt > Best regards, > > Karthik. > > > > > > > > *From: *Matthew Knepley > *Date: *Tuesday, 18 April 2023 at 11:08 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *petsc-users at mcs.anl.gov > *Subject: *Re: [petsc-users] Setting up a matrix for Lagrange multiplier > > On Tue, Apr 18, 2023 at 5:24?AM Karthikeyan Chockalingam - STFC UKRI via > petsc-users wrote: > > Hello, > > > > I'm solving a problem using the Lagrange multiplier, the matrix has the > form > > > > K = [A P^T > > P 0] > > > > I am familiar with constructing K using MATMPIAIJ. However, I would like > to know if had [A], can I augment it with [P], [P^T] and [0] of type > MATMPIAIJ? Likewise for vectors as well. > > > > Can you please point me to the right resource, if it is a common operation > in PETSc? > > > > You can do this at least 2 ways: > > > > 1) Assemble you submatrices directly into the larger matrix by > constructing local-to-global maps for the emplacement. so that you do > > not change your assembly code, except to change MatSetValues() to > MatSetValuesLocal(). This is usually preferable. > > > > 2) Use MATNEST and VecNEST to put pointers to submatrices and subvectors > directly in. > > > > Thanks, > > > > Matt > > > > Many thanks. > > > > Kind regards, > > Karthik. > > > > > > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From joauma.marichal at uclouvain.be Wed Apr 19 07:40:09 2023 From: joauma.marichal at uclouvain.be (Joauma Marichal) Date: Wed, 19 Apr 2023 12:40:09 +0000 Subject: [petsc-users] Lagrangian particle tracking - ghost cells Message-ID: Hello, I am using the DMSwarm library in some Eulerian-Lagrangian approach to have vapor bubbles in water. I would like the bubbles to have an impact on the water fields of the same cells and the adjacent ones. Is there any petsc function that can allow to sum the ghost cell values (of an adjacent proc) to the actual ones (on the actual proc)? Thanks a lot for your help. Best regards, Joauma -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Apr 19 07:55:58 2023 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 19 Apr 2023 08:55:58 -0400 Subject: [petsc-users] [petsc-maint] Lagrangian particle tracking - ghost cells In-Reply-To: References: Message-ID: On Wed, Apr 19, 2023 at 8:40?AM Joauma Marichal < joauma.marichal at uclouvain.be> wrote: > Hello, > > > > I am using the DMSwarm library in some Eulerian-Lagrangian approach to > have vapor bubbles in water. > > I would like the bubbles to have an impact on the water fields of the same > cells and the adjacent ones. 
Is there any petsc function that can allow to > sum the ghost cell values (of an adjacent proc) to the actual ones (on the > actual proc)? > I think you can use the GlobalToLocal map to populate a local vector, and then just sum values as normal. This is our preferred way to handle these kinds of halo communication. Will that work? Thanks, Matt > Thanks a lot for your help. > > > > Best regards, > > > > Joauma > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Apr 19 08:13:41 2023 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 19 Apr 2023 09:13:41 -0400 Subject: [petsc-users] [petsc-maint] DMSwarm with periodic B.C. In-Reply-To: References: Message-ID: If this is a problem, let's make an example and I can debug it because I thought that this worked. I might have only tested with Plex. Thanks, Matt On Wed, Apr 12, 2023 at 11:32?AM Mark Adams wrote: > First, you don't want a DMShell. Just use da_swarm. > See src/dm/tutorials/ex20.c > > You can run this test with > > cd src/dm/tutorials > > make ex20 > > ./ex20 > or > > ./ex20 -mode 1 > See the end of ex20.c for these (lack of) arguments > > Now change that code to one periodic direction and test. > > This could be a bug. This code is not well tested. > > Thanks, > Mark > > > > > On Wed, Apr 12, 2023 at 4:02?AM Joauma Marichal < > joauma.marichal at uclouvain.be> wrote: > >> Hello, >> >> >> >> I am using petsc DMSwarm library for some Lagrangian particle tracking. >> Until now, I was working in a closed box and was therefore initializing my >> swarm object as: >> >> >> >> // Create a DMDA staggered and without ghost cells (for DMSwarm to work) >> >> DMDACreate3d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, >> DM_BOUNDARY_NONE,DMDA_STENCIL_BOX,M,N,P,m,n,p,1,1,lx,ly,lz,da_swarm); >> >> >> >> DMSetFromOptions(*da_swarm); >> >> DMSetUp(*da_swarm); >> >> PetscScalar xmin, xmax, ymin, ymax, zmin, zmax; >> >> xmin = ymin = zmin = 0.; >> >> xmax = ymax = zmax = 1.; >> >> DMDASetUniformCoordinates(*da_swarm,xmin, xmax, ymin, ymax, zmin, zmax); >> >> //SetNonUniform3DCoordinates(*da_swarm, cornp, gridp, rank); >> >> >> >> /* Create a DMShell for point location purposes */ >> >> DMShellCreate(PETSC_COMM_WORLD,dmcell); >> >> DMSetApplicationContext(*dmcell,*da_swarm); >> >> (*dmcell)->ops->locatepoints = DMLocatePoints_DMDARegular; >> >> (*dmcell)->ops->getneighbors = DMGetNeighbors_DMDARegular; >> >> >> >> // Create a Swarm DMDA >> >> DMCreate(PETSC_COMM_WORLD,swarm); >> >> DMSetType(*swarm,DMSWARM); >> >> DMSetDimension(*swarm,3); >> >> DMSwarmSetType(*swarm,DMSWARM_PIC); >> >> DMSwarmSetCellDM(*swarm,*dmcell); >> >> >> >> I am now trying to work with periodic boundary conditions. I tried >> replacing DM_BOUNDARY_NONE by DM_BOUNDARY_PERIODIC but it does not work? >> I checked for examples using periodic B.C. but have not found any. Is it >> possible? And if yes, how can I make it work? >> >> >> >> Thanks a lot for your answer. >> >> Best regards, >> >> >> >> Joauma >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
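A self-contained sketch of the halo pattern described in the exchange above, with a plain DMDA standing in for the flow-field DM (the grid size, stencil, and deposited values are made-up placeholders):

```c
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM  da;
  Vec g, l;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX, 16, 16, 16, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                         1, 1, NULL, NULL, NULL, &da));
  PetscCall(DMSetFromOptions(da));
  PetscCall(DMSetUp(da));

  PetscCall(DMCreateGlobalVector(da, &g));
  PetscCall(DMCreateLocalVector(da, &l));

  /* 1. Deposit source terms into the *local* vector, ghost cells
     included; here every locally visible cell just gets 1.0 */
  PetscCall(VecSet(l, 1.0));

  /* 2. Sum ghost contributions into the owning rank's entries:
     a cell that is a ghost on k neighbouring ranks ends up with 1 + k */
  PetscCall(VecZeroEntries(g));
  PetscCall(DMLocalToGlobal(da, l, ADD_VALUES, g));

  /* 3. Refresh ghost values if the summed field is needed locally */
  PetscCall(DMGlobalToLocal(da, g, INSERT_VALUES, l));

  PetscCall(VecDestroy(&l));
  PetscCall(VecDestroy(&g));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}
```

The ADD_VALUES scatter is what performs the "sum the ghost cell values into the owning process" step; the final INSERT_VALUES scatter refreshes the ghosts afterwards if needed.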
URL: From karthikeyan.chockalingam at stfc.ac.uk Wed Apr 19 11:52:38 2023 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Wed, 19 Apr 2023 16:52:38 +0000 Subject: [petsc-users] Setting up a matrix for Lagrange multiplier In-Reply-To: References: Message-ID: I have declared the mapping ISLocalToGlobalMapping mapping; ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, n, nindices, PETSC_COPY_VALUES, &mapping); But when I use MatSetValuesLocal(), how do I know the above mapping is employed because it is not one of the parameters passed to the function? Thank you. Kind regards, Karthik. From: Matthew Knepley Date: Tuesday, 18 April 2023 at 16:21 To: Chockalingam, Karthikeyan (STFC,DL,HC) Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier On Tue, Apr 18, 2023 at 11:16?AM Karthikeyan Chockalingam - STFC UKRI > wrote: Thank you for your response. I spend some time understanding how MatSetValuesLocal and ISLocalToGlobalMappingCreate work. You can look at SNES ex28 where we do this with DMCOMPOSITE. Q1) Will the matrix K be of type MATMPIAIJ or MATIS? K = [A P^T P 0] I assume MPIAIJ since IS is only used for Neumann-Neumann decompositions. Q2) Can I use both MatSetValues() to MatSetValuesLocal() to populate K? Since I have already used MatSetValues() to construct A. You can, and there would be no changes in serial if K is exactly the upper left block, but in parallel global indices would change. Q3) What are the advantages of using MatSetValuesLocal()? Is it that I can construct P directly using local indies and map the entrees to the global index in K? You have a monolithic K, so that you can use sparse direct solvers to check things. THis is impossible with separate storage. Q4) I probably don?t have to construct an independent P matrix You wouldn't in this case. Thanks, Matt Best regards, Karthik. From: Matthew Knepley > Date: Tuesday, 18 April 2023 at 11:08 To: Chockalingam, Karthikeyan (STFC,DL,HC) > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier On Tue, Apr 18, 2023 at 5:24?AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: Hello, I'm solving a problem using the Lagrange multiplier, the matrix has the form K = [A P^T P 0] I am familiar with constructing K using MATMPIAIJ. However, I would like to know if had [A], can I augment it with [P], [P^T] and [0] of type MATMPIAIJ? Likewise for vectors as well. Can you please point me to the right resource, if it is a common operation in PETSc? You can do this at least 2 ways: 1) Assemble you submatrices directly into the larger matrix by constructing local-to-global maps for the emplacement. so that you do not change your assembly code, except to change MatSetValues() to MatSetValuesLocal(). This is usually preferable. 2) Use MATNEST and VecNEST to put pointers to submatrices and subvectors directly in. Thanks, Matt Many thanks. Kind regards, Karthik. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
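A short sketch of the mechanism behind the question above (the matrix K is assumed; n and nindices are the arrays from the message): the mapping is not a per-call parameter, it is attached to the matrix once, and MatSetValuesLocal() then consults whatever mapping the matrix carries.

```c
ISLocalToGlobalMapping mapping;
ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, n, nindices, PETSC_COPY_VALUES, &mapping);
MatSetLocalToGlobalMapping(K, mapping, mapping); /* attach once: row map, then column map */
ISLocalToGlobalMappingDestroy(&mapping);         /* K holds its own reference */
/* every later MatSetValuesLocal(K, ...) translates its local indices
   through the mapping attached above; nothing is passed per call */
```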
From bojan.niceno.scientist at gmail.com Thu Apr 20 00:53:33 2023
From: bojan.niceno.scientist at gmail.com (Bojan Niceno)
Date: Thu, 20 Apr 2023 07:53:33 +0200
Subject: [petsc-users] CG fails to converge in parallel
Message-ID:

Dear all,

I am solving a Laplace equation with a finite volume method on an unstructured grid, using a Fortran code I have developed and the PETSc 3.19 library.

I first used the cg solver with the asm preconditioner, which converges nicely when executed sequentially but fails in the MPI-parallel version. I first believed there must be an error in how I set up and assemble the parallel matrices for PETSc, but I soon noticed that if I use bicg with asm, everything works fine: the parallel bicg/asm shows almost the same convergence as the sequential version.

I could carry on with bicg, but I am still worried a little bit. To my knowledge of Krylov solvers, which is admittedly basic since I am a physicist only using linear algebra, the convergence of bicg should be very similar to that of cg when symmetric systems are solved. When I run my cases sequentially, I see that is indeed the case. But in parallel, bicg converges and cg fails.

Do you see the above issues as an anomaly, and if so, could you advise how to search for a cause?

Kind regards,
Bojan

From pierre.jolivet at lip6.fr Thu Apr 20 01:07:40 2023
From: pierre.jolivet at lip6.fr (Pierre Jolivet)
Date: Thu, 20 Apr 2023 08:07:40 +0200
Subject: [petsc-users] CG fails to converge in parallel
Message-ID: <5C3847B5-E518-473C-A742-D5AE3817B883@lip6.fr>

> On 20 Apr 2023, at 7:53 AM, Bojan Niceno wrote:
>
> Dear all,
>
> I am solving a Laplace equation with a finite volume method on an unstructured grid, using a Fortran code I have developed and the PETSc 3.19 library.
>
> I first used the cg solver with the asm preconditioner, which converges nicely when executed sequentially but fails in the MPI-parallel version. I first believed there must be an error in how I set up and assemble the parallel matrices for PETSc, but I soon noticed that if I use bicg with asm, everything works fine: the parallel bicg/asm shows almost the same convergence as the sequential version.

KSPCG requires a symmetric PC.
By default, PCASMType is PC_ASM_RESTRICT, which yields a non-symmetric preconditioner.
With a single process, this does not matter, but with more than one process, it does.
If you switch to -pc_asm_type basic, KSPCG should converge.
That being said, for the Laplace equation, there are much faster alternatives than PCASM, e.g., PCGAMG.

Thanks,
Pierre

> I could carry on with bicg, but I am still worried a little bit. To my knowledge of Krylov solvers, which is admittedly basic since I am a physicist only using linear algebra, the convergence of bicg should be very similar to that of cg when symmetric systems are solved. When I run my cases sequentially, I see that is indeed the case. But in parallel, bicg converges and cg fails.
>
> Do you see the above issues as an anomaly, and if so, could you advise how to search for a cause?
>
> Kind regards,
> Bojan

From bojan.niceno.scientist at gmail.com Thu Apr 20 01:32:50 2023
From: bojan.niceno.scientist at gmail.com (Bojan Niceno)
Date: Thu, 20 Apr 2023 08:32:50 +0200
Subject: Re: [petsc-users] CG fails to converge in parallel
In-Reply-To: <5C3847B5-E518-473C-A742-D5AE3817B883@lip6.fr>
Message-ID:

Thanks a lot, Pierre, I use PCGAMG as I type this answer.
Not only does it work, it converges much faster than PCASM :-)

Have a great day,

Bojan

On Thu, Apr 20, 2023 at 8:07 AM Pierre Jolivet wrote:
> KSPCG requires a symmetric PC.
> By default, PCASMType is PC_ASM_RESTRICT, which yields a non-symmetric preconditioner.
> With a single process, this does not matter, but with more than one process, it does.
> If you switch to -pc_asm_type basic, KSPCG should converge.
> That being said, for the Laplace equation, there are much faster alternatives than PCASM, e.g., PCGAMG.
>
> Thanks,
> Pierre

From karthikeyan.chockalingam at stfc.ac.uk Thu Apr 20 05:12:45 2023
From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI)
Date: Thu, 20 Apr 2023 10:12:45 +0000
Subject: [petsc-users] question about MatSetLocalToGlobalMapping
Message-ID:

Hello,

I created a new thread, as I thought it would be more appropriate (it is a continuation of my previous post). I want to construct the below K matrix (which is composed of submatrices):

K = [A P^T
     P 0]

where K is of type MATMPIAIJ. I first constructed the top left block [A] using MatSetValues().

Now, I would like to construct the bottom left [P] and top right [P^T] blocks using MatSetValuesLocal().

To use MatSetValuesLocal(), I first have to create a local-to-global mapping using ISLocalToGlobalMappingCreate. I have created two mappings, row_mapping and column_mapping.

Q1) At what point should I call MatSetLocalToGlobalMapping()? Is it just before I use MatSetValuesLocal()? I will use MatSetLocalToGlobalMapping(K, row_mapping, column_mapping) to build the bottom left [P].

Q2) Can I then reset the mapping as MatSetLocalToGlobalMapping(K, column_mapping, row_mapping) to build the top right [P^T]?

Many thanks!

Kind regards,
Karthik.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: From knepley at gmail.com Thu Apr 20 05:37:35 2023 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 20 Apr 2023 06:37:35 -0400 Subject: [petsc-users] question about MatSetLocalToGlobalMapping In-Reply-To: References: Message-ID: On Thu, Apr 20, 2023 at 6:13?AM Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote: > Hello, > > > > I created a new thread, thought would it be more appropriate (and is a > continuation of my previous post). I want to construct the below K matrix > (which is composed of submatrices) > > > > K = [A P^T > > P 0] > > > > Where K is of type MatMPIAIJ. I first constructed the top left [A] using > MatSetValues(). > > > > Now, I would like to construct the bottom left [p] and top right [p^T] using > MatSetValuesLocal(). > > > > To use MatSetValuesLocal(), I first have to create a local-to-global > mapping using ISLocalToGlobalMappingCreate. I have created two mapping > row_mapping and column_mapping. > I do not understand why they are not the same map. Maybe I was unclear before. It looks like you have two fields, say phi and lambda, where lambda is a Lagrange multiplier imposing some constraint. Then you get a saddle point like this. You can imagine matrices (phi, phi) --> A (phi, lambda) --> P^T (lambda, phi) --> P So you make a L2G map for the phi field and the lambda field. Oh, you are calling them row and col map, but they are my phi and lambda maps. I do not like the row and col names since in P they reverse. > Q1) At what point should I declare MatSetLocalToGlobalMapping ? is it > just before I use MatSetValuesLocal()? > Okay, it is good you are asking this because my thinking was somewhat confused. I think the precise steps are: 1) Create the large saddle point matrix K 1a) We must call https://petsc.org/main/manualpages/Mat/MatSetLocalToGlobalMapping/ on it. In the simplest case, this just maps the local rows numbers [0, Nrows) to the global rows numbers [rowStart, rowStart + Nrows). 2) To form each piece: 2a) Extract that block using https://petsc.org/main/manualpages/Mat/MatGetLocalSubMatrix/ This gives back a Mat object that you subsequently restore using https://petsc.org/main/manualpages/Mat/MatRestoreLocalSubMatrix/ 2b) Insert values using https://petsc.org/main/manualpages/Mat/MatSetValuesLocal/ The local indices used for insertion here are indices relative to the block itself, and the L2G map for this matrix has been rewritten to insert into that block in the larger matrix. Thus this looks like just calling MatSetValuesLocal() on the smaller matrix block, but inserts correctly into the larger matrix. Therefore, the code you write code in 2) could work equally well making the large matrix from 1), or independent smaller matrix blocks. Does this make sense? Thanks, Matt > I will use MatSetLocalToGlobalMapping(K, row_mapping, column_mapping) to > build the bottom left [P]. > > > > > > Q2) Can now I reset the mapping as MatSetLocalToGlobalMapping(K, > column_mapping, row_mapping) to build the top right [P^T]? > > > > > > Many thanks! > > > > Kind regards, > > Karthik. > > > > > > > > > > > > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
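Spelled out as code, steps 1a)-2b) above might look like the following sketch. The matrix K, its attached local-to-global map, and the local field sizes nu (phi dofs) and np (lambda dofs) are assumed to exist already; the inserted value is a placeholder.

```c
IS          isu, isp;
Mat         Pblk;
PetscInt    i = 0, j = 0; /* indices relative to the extracted block */
PetscScalar v = 1.0;

/* local index sets for the two fields, in K's local numbering:
   phi dofs first, lambda dofs after them */
PetscCall(ISCreateStride(PETSC_COMM_SELF, nu, 0, 1, &isu));
PetscCall(ISCreateStride(PETSC_COMM_SELF, np, nu, 1, &isp));

PetscCall(MatGetLocalSubMatrix(K, isp, isu, &Pblk));              /* step 2a: the (lambda, phi) block P */
PetscCall(MatSetValuesLocal(Pblk, 1, &i, 1, &j, &v, ADD_VALUES)); /* step 2b: block-relative indices */
PetscCall(MatRestoreLocalSubMatrix(K, isp, isu, &Pblk));

PetscCall(ISDestroy(&isu));
PetscCall(ISDestroy(&isp));
PetscCall(MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY));
PetscCall(MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY));
```

The same insertion loop could assemble a free-standing P instead, which is what lets the monolithic and block variants share assembly code.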
From edoardo.alinovi at gmail.com Thu Apr 20 10:06:48 2023
From: edoardo.alinovi at gmail.com (Edoardo Alinovi)
Date: Thu, 20 Apr 2023 17:06:48 +0200
Subject: [petsc-users] issues with VecSetValues in petsc 3.19
Message-ID:

Hi Matt, thanks for sharing the literature. Would you suggest any monolithic approach for the mpiaij/mpibaij matrix instead of fieldsplit? I did some blind searching using gamg/hypre and they look terrible. I guess I am missing a trick; probably they are not the way to go?

Thanks!

On Mon, 17 Apr 2023 at 13:50, Matthew Knepley wrote:
> On Mon, Apr 17, 2023 at 6:37 AM Edoardo Alinovi wrote:
>> Sure thing, the solver I am working on is this one: https://gitlab.com/alie89/flubio-code-fvm.
>>
>> It is a 3D, collocated, unstructured, finite volume solver for the incompressible NS. I can run steady and unsteady, and I can use SIMPLE, PISO and fractional step methods (both explicit and fully implicit momentum). I can also solve for turbulence (k-omega, BSL, SST, Spalart-Allmaras, LES). I have also implemented some kind of immersed boundary (2D/3D) that I need to resume at some point.
>>
>> Hot topic of the moment: I am developing a fully coupled pressure-based solver using field-split. What I have now is working ok, I have validated it on a lot of 2D problems and am going on with 3D right now. If all the tests are passed, I'll focus on tuning the field splitting, which looks to be a quite interesting topic!
>
> I think a very good discussion of the issues from the point of view of FEM is
>
> https://arxiv.org/abs/1810.03315
>
> There should be a similar analysis from the FVM side, although it might not be possible to find a pressure discretization compatible with the FVM velocity for this purpose.
>
> Thanks,
>
> Matt
>
>> Flubio is a project I have been carrying on since my PhD days. The implementation is 99% on my shoulders, despite the fact that I am collaborating with some people around. I am coding evenings and weekends/free time; it gives me a lot of satisfaction and also a lot of insights!

From hzhang at mcs.anl.gov Thu Apr 20 10:47:10 2023
From: hzhang at mcs.anl.gov (Zhang, Hong)
Date: Thu, 20 Apr 2023 15:47:10 +0000
Subject: [petsc-users] question about MatSetLocalToGlobalMapping
Message-ID:

Karthik,
We built a KKT matrix in TaoSNESJacobian_PDIPM() (see petsc/src/tao/constrained/impls/ipm/pdipm.c), which assembles several small matrices into a large KKT matrix in mpiaij format. You could take the same approach to insert P and P^T into your K.
FYI, I attached our paper.
Hong

________________________________
From: petsc-users on behalf of Matthew Knepley
Sent: Thursday, April 20, 2023 5:37 AM
To: Karthikeyan Chockalingam - STFC UKRI
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] question about MatSetLocalToGlobalMapping

On Thu, Apr 20, 2023 at 6:13 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote:

Hello,

I created a new thread, as I thought it would be more appropriate (it is a continuation of my previous post). I want to construct the below K matrix (which is composed of submatrices):

K = [A P^T
     P 0]

where K is of type MATMPIAIJ.
I first constructed the top left [A] using MatSetValues(). Now, I would like to construct the bottom left [p] and top right [p^T] using MatSetValuesLocal(). To use MatSetValuesLocal(), I first have to create a local-to-global mapping using ISLocalToGlobalMappingCreate. I have created two mapping row_mapping and column_mapping. I do not understand why they are not the same map. Maybe I was unclear before. It looks like you have two fields, say phi and lambda, where lambda is a Lagrange multiplier imposing some constraint. Then you get a saddle point like this. You can imagine matrices (phi, phi) --> A (phi, lambda) --> P^T (lambda, phi) --> P So you make a L2G map for the phi field and the lambda field. Oh, you are calling them row and col map, but they are my phi and lambda maps. I do not like the row and col names since in P they reverse. Q1) At what point should I declare MatSetLocalToGlobalMapping ? is it just before I use MatSetValuesLocal()? Okay, it is good you are asking this because my thinking was somewhat confused. I think the precise steps are: 1) Create the large saddle point matrix K 1a) We must call https://petsc.org/main/manualpages/Mat/MatSetLocalToGlobalMapping/ on it. In the simplest case, this just maps the local rows numbers [0, Nrows) to the global rows numbers [rowStart, rowStart + Nrows). 2) To form each piece: 2a) Extract that block using https://petsc.org/main/manualpages/Mat/MatGetLocalSubMatrix/ This gives back a Mat object that you subsequently restore using https://petsc.org/main/manualpages/Mat/MatRestoreLocalSubMatrix/ 2b) Insert values using https://petsc.org/main/manualpages/Mat/MatSetValuesLocal/ The local indices used for insertion here are indices relative to the block itself, and the L2G map for this matrix has been rewritten to insert into that block in the larger matrix. Thus this looks like just calling MatSetValuesLocal() on the smaller matrix block, but inserts correctly into the larger matrix. Therefore, the code you write code in 2) could work equally well making the large matrix from 1), or independent smaller matrix blocks. Does this make sense? Thanks, Matt I will use MatSetLocalToGlobalMapping(K, row_mapping, column_mapping) to build the bottom left [P]. Q2) Can now I reset the mapping as MatSetLocalToGlobalMapping(K, column_mapping, row_mapping) to build the top right [P^T]? Many thanks! Kind regards, Karthik. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: IET Generation Trans Dist - 2022 - Sundermann - Parallel primal?dual interior point method for the solution of dynamic.pdf Type: application/pdf Size: 998055 bytes Desc: IET Generation Trans Dist - 2022 - Sundermann - Parallel primal?dual interior point method for the solution of dynamic.pdf URL: From pierre.jolivet at lip6.fr Thu Apr 20 11:58:35 2023 From: pierre.jolivet at lip6.fr (Pierre Jolivet) Date: Thu, 20 Apr 2023 18:58:35 +0200 Subject: [petsc-users] question about MatSetLocalToGlobalMapping In-Reply-To: References: Message-ID: <2B2EFEAE-41BB-444A-864A-8129DEF8C7B4@lip6.fr> Hong, 1) Is there any hope to get PDIPDM to use a MatNest? 2) Is this fixed https://lists.mcs.anl.gov/pipermail/petsc-dev/2020-September/026398.html ? 
I cannot get users to transition away from Ipopt because of these two missing features. Thanks, Pierre > On 20 Apr 2023, at 5:47 PM, Zhang, Hong via petsc-users wrote: > > Karthik, > We built a KKT matrix in TaoSNESJacobian_PDIPM() (see petsc/src/tao/constrained/impls/ipm/pdipm.c) which assembles several small matrices into a large KKT matrix in mpiaij format. You could take the same approach to insert P and P^T into your K. > FYI, I attached our paper. > Hong > From: petsc-users on behalf of Matthew Knepley > Sent: Thursday, April 20, 2023 5:37 AM > To: Karthikeyan Chockalingam - STFC UKRI > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] question about MatSetLocalToGlobalMapping > > On Thu, Apr 20, 2023 at 6:13?AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: > Hello, > > I created a new thread, thought would it be more appropriate (and is a continuation of my previous post). I want to construct the below K matrix (which is composed of submatrices) > > K = [A P^T > P 0] > > Where K is of type MatMPIAIJ. I first constructed the top left [A] using MatSetValues(). > > Now, I would like to construct the bottom left [p] and top right [p^T] using MatSetValuesLocal(). > > To use MatSetValuesLocal(), I first have to create a local-to-global mapping using ISLocalToGlobalMappingCreate. I have created two mapping row_mapping and column_mapping. > > I do not understand why they are not the same map. Maybe I was unclear before. It looks like you have two fields, say phi and lambda, where lambda is a Lagrange multiplier imposing some constraint. Then you get a saddle point like this. You can imagine matrices > > (phi, phi) --> A > (phi, lambda) --> P^T > (lambda, phi) --> P > > So you make a L2G map for the phi field and the lambda field. Oh, you are calling them row and col map, but they are my phi and lambda > maps. I do not like the row and col names since in P they reverse. > > Q1) At what point should I declare MatSetLocalToGlobalMapping ? is it just before I use MatSetValuesLocal()? > > Okay, it is good you are asking this because my thinking was somewhat confused. I think the precise steps are: > > 1) Create the large saddle point matrix K > > 1a) We must call https://petsc.org/main/manualpages/Mat/MatSetLocalToGlobalMapping/ on it. In the simplest case, this just maps > the local rows numbers [0, Nrows) to the global rows numbers [rowStart, rowStart + Nrows). > > 2) To form each piece: > > 2a) Extract that block using https://petsc.org/main/manualpages/Mat/MatGetLocalSubMatrix/ > > This gives back a Mat object that you subsequently restore using https://petsc.org/main/manualpages/Mat/MatRestoreLocalSubMatrix/ > > 2b) Insert values using https://petsc.org/main/manualpages/Mat/MatSetValuesLocal/ > > The local indices used for insertion here are indices relative to the block itself, and the L2G map for this matrix > has been rewritten to insert into that block in the larger matrix. Thus this looks like just calling MatSetValuesLocal() > on the smaller matrix block, but inserts correctly into the larger matrix. > > Therefore, the code you write code in 2) could work equally well making the large matrix from 1), or independent smaller matrix blocks. > > Does this make sense? > > Thanks, > > Matt > > I will use MatSetLocalToGlobalMapping(K, row_mapping, column_mapping) to build the bottom left [P]. > > > Q2) Can now I reset the mapping as MatSetLocalToGlobalMapping(K, column_mapping, row_mapping) to build the top right [P^T]? > > > Many thanks! 
> > Kind regards, > Karthik. > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Thu Apr 20 15:28:29 2023 From: hzhang at mcs.anl.gov (Zhang, Hong) Date: Thu, 20 Apr 2023 20:28:29 +0000 Subject: [petsc-users] question about MatSetLocalToGlobalMapping In-Reply-To: <2B2EFEAE-41BB-444A-864A-8129DEF8C7B4@lip6.fr> References: <2B2EFEAE-41BB-444A-864A-8129DEF8C7B4@lip6.fr> Message-ID: Pierre, 1) Is there any hope to get PDIPDM to use a MatNest? KKT matrix is indefinite and ill-conditioned, which must be solved using a direct matrix factorization method. For the current implementation, we use MUMPS Cholesky as default. To use MatNest, what direct solver to use, SCHUR_FACTOR? I do not know how to get it work. 2) Is this fixed https://lists.mcs.anl.gov/pipermail/petsc-dev/2020-September/026398.html ? I cannot get users to transition away from Ipopt because of these two missing features. The existing pdipm is the result of a MS student intern project. None of us involved are experts on the optimization solvers. We made a straightforward parallelization of Ipopt. It indeed needs further work, e.g., more features, better matrix storage, convergence criteria... To our knowledge, parallel pdipm is not available other than our pdipm. We should improve our pdipm. Hong On 20 Apr 2023, at 5:47 PM, Zhang, Hong via petsc-users wrote: Karthik, We built a KKT matrix in TaoSNESJacobian_PDIPM() (see petsc/src/tao/constrained/impls/ipm/pdipm.c) which assembles several small matrices into a large KKT matrix in mpiaij format. You could take the same approach to insert P and P^T into your K. FYI, I attached our paper. Hong ________________________________ From: petsc-users on behalf of Matthew Knepley Sent: Thursday, April 20, 2023 5:37 AM To: Karthikeyan Chockalingam - STFC UKRI Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] question about MatSetLocalToGlobalMapping On Thu, Apr 20, 2023 at 6:13?AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: Hello, I created a new thread, thought would it be more appropriate (and is a continuation of my previous post). I want to construct the below K matrix (which is composed of submatrices) K = [A P^T P 0] Where K is of type MatMPIAIJ. I first constructed the top left [A] using MatSetValues(). Now, I would like to construct the bottom left [p] and top right [p^T] using MatSetValuesLocal(). To use MatSetValuesLocal(), I first have to create a local-to-global mapping using ISLocalToGlobalMappingCreate. I have created two mapping row_mapping and column_mapping. I do not understand why they are not the same map. Maybe I was unclear before. It looks like you have two fields, say phi and lambda, where lambda is a Lagrange multiplier imposing some constraint. Then you get a saddle point like this. You can imagine matrices (phi, phi) --> A (phi, lambda) --> P^T (lambda, phi) --> P So you make a L2G map for the phi field and the lambda field. Oh, you are calling them row and col map, but they are my phi and lambda maps. I do not like the row and col names since in P they reverse. Q1) At what point should I declare MatSetLocalToGlobalMapping ? is it just before I use MatSetValuesLocal()? 
Okay, it is good you are asking this because my thinking was somewhat confused. I think the precise steps are: 1) Create the large saddle point matrix K 1a) We must call https://petsc.org/main/manualpages/Mat/MatSetLocalToGlobalMapping/ on it. In the simplest case, this just maps the local rows numbers [0, Nrows) to the global rows numbers [rowStart, rowStart + Nrows). 2) To form each piece: 2a) Extract that block using https://petsc.org/main/manualpages/Mat/MatGetLocalSubMatrix/ This gives back a Mat object that you subsequently restore using https://petsc.org/main/manualpages/Mat/MatRestoreLocalSubMatrix/ 2b) Insert values using https://petsc.org/main/manualpages/Mat/MatSetValuesLocal/ The local indices used for insertion here are indices relative to the block itself, and the L2G map for this matrix has been rewritten to insert into that block in the larger matrix. Thus this looks like just calling MatSetValuesLocal() on the smaller matrix block, but inserts correctly into the larger matrix. Therefore, the code you write code in 2) could work equally well making the large matrix from 1), or independent smaller matrix blocks. Does this make sense? Thanks, Matt I will use MatSetLocalToGlobalMapping(K, row_mapping, column_mapping) to build the bottom left [P]. Q2) Can now I reset the mapping as MatSetLocalToGlobalMapping(K, column_mapping, row_mapping) to build the top right [P^T]? Many thanks! Kind regards, Karthik. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pierre.jolivet at lip6.fr Fri Apr 21 00:18:46 2023 From: pierre.jolivet at lip6.fr (Pierre Jolivet) Date: Fri, 21 Apr 2023 07:18:46 +0200 Subject: [petsc-users] question about MatSetLocalToGlobalMapping In-Reply-To: References: <2B2EFEAE-41BB-444A-864A-8129DEF8C7B4@lip6.fr> Message-ID: <2D91313B-426F-45AD-A1B2-C3BBAC4394CB@lip6.fr> > On 20 Apr 2023, at 10:28 PM, Zhang, Hong wrote: > > Pierre, > 1) Is there any hope to get PDIPDM to use a MatNest? > > KKT matrix is indefinite and ill-conditioned, which must be solved using a direct matrix factorization method. But you are using PCBJACOBI in the paper you attached? In any case, there are many such systems, e.g., a discretization of Stokes equations, that can be solved with something else than a direct factorization. > For the current implementation, we use MUMPS Cholesky as default. To use MatNest, what direct solver to use, SCHUR_FACTOR? I do not know how to get it work. On the one hand, MatNest can efficiently convert to AIJ or SBAIJ if you want to stick to PCLU or PCCHOLESKY. On the other hand, it allows to easily switch to PCFIELDSPLIT which can be used to solve saddle-point problems. > 2) Is this fixed https://lists.mcs.anl.gov/pipermail/petsc-dev/2020-September/026398.html ? > I cannot get users to transition away from Ipopt because of these two missing features. > > The existing pdipm is the result of a MS student intern project. None of us involved are experts on the optimization solvers. We made a straightforward parallelization of Ipopt. It indeed needs further work, e.g., more features, better matrix storage, convergence criteria... To our knowledge, parallel pdipm is not available other than our pdipm. Ipopt can use MUMPS and PARDISO internally, so it?s in some sense parallel (using shared memory). 
Also, this is not a very potent selling point. My users that are satisfied with Ipopt as a "non-parallel" black box don?t want to have to touch part of their code just to stick it in a parallel black box which is limited to the same kind of linear solver and which has severe limitations with respect to Hessian/Jacobian/constraint distributions. Thanks, Pierre > We should improve our pdipm. > Hong > >> On 20 Apr 2023, at 5:47 PM, Zhang, Hong via petsc-users > wrote: >> >> Karthik, >> We built a KKT matrix in TaoSNESJacobian_PDIPM() (see petsc/src/tao/constrained/impls/ipm/pdipm.c) which assembles several small matrices into a large KKT matrix in mpiaij format. You could take the same approach to insert P and P^T into your K. >> FYI, I attached our paper. >> Hong >> >> From: petsc-users > on behalf of Matthew Knepley > >> Sent: Thursday, April 20, 2023 5:37 AM >> To: Karthikeyan Chockalingam - STFC UKRI > >> Cc: petsc-users at mcs.anl.gov > >> Subject: Re: [petsc-users] question about MatSetLocalToGlobalMapping >> >> On Thu, Apr 20, 2023 at 6:13?AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: >> Hello, >> >> I created a new thread, thought would it be more appropriate (and is a continuation of my previous post). I want to construct the below K matrix (which is composed of submatrices) >> >> K = [A P^T >> P 0] >> >> Where K is of type MatMPIAIJ. I first constructed the top left [A] using MatSetValues(). >> >> Now, I would like to construct the bottom left [p] and top right [p^T] using MatSetValuesLocal(). >> >> To use MatSetValuesLocal(), I first have to create a local-to-global mapping using ISLocalToGlobalMappingCreate. I have created two mapping row_mapping and column_mapping. >> >> I do not understand why they are not the same map. Maybe I was unclear before. It looks like you have two fields, say phi and lambda, where lambda is a Lagrange multiplier imposing some constraint. Then you get a saddle point like this. You can imagine matrices >> >> (phi, phi) --> A >> (phi, lambda) --> P^T >> (lambda, phi) --> P >> >> So you make a L2G map for the phi field and the lambda field. Oh, you are calling them row and col map, but they are my phi and lambda >> maps. I do not like the row and col names since in P they reverse. >> >> Q1) At what point should I declare MatSetLocalToGlobalMapping ? is it just before I use MatSetValuesLocal()? >> >> Okay, it is good you are asking this because my thinking was somewhat confused. I think the precise steps are: >> >> 1) Create the large saddle point matrix K >> >> 1a) We must call https://petsc.org/main/manualpages/Mat/MatSetLocalToGlobalMapping/ on it. In the simplest case, this just maps >> the local rows numbers [0, Nrows) to the global rows numbers [rowStart, rowStart + Nrows). >> >> 2) To form each piece: >> >> 2a) Extract that block using https://petsc.org/main/manualpages/Mat/MatGetLocalSubMatrix/ >> >> This gives back a Mat object that you subsequently restore using https://petsc.org/main/manualpages/Mat/MatRestoreLocalSubMatrix/ >> >> 2b) Insert values using https://petsc.org/main/manualpages/Mat/MatSetValuesLocal/ >> >> The local indices used for insertion here are indices relative to the block itself, and the L2G map for this matrix >> has been rewritten to insert into that block in the larger matrix. Thus this looks like just calling MatSetValuesLocal() >> on the smaller matrix block, but inserts correctly into the larger matrix. 
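A minimal sketch of the workflow described in steps 1)-2b) above may help; it is an editorial illustration, not code from the thread: the sizes (5 phi rows plus 4 lambda rows), the identity local-to-global map, and the single inserted value are all invented, and it is written to run on a single MPI rank.

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat                    K, P;
  ISLocalToGlobalMapping l2g;
  IS                     isrow, iscol;
  PetscInt               n = 9, idx[9] = {0, 1, 2, 3, 4, 5, 6, 7, 8}, zero = 0;
  PetscScalar            v = 1.0;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* 1) create the large saddle-point matrix K (9 rows = 5 phi + 4 lambda, invented sizes) */
  PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, n, NULL, n, NULL, &K));
  /* 1a) identity local-to-global mapping (on one rank, local == global) */
  PetscCall(ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, n, idx, PETSC_COPY_VALUES, &l2g));
  PetscCall(MatSetLocalToGlobalMapping(K, l2g, l2g));
  /* 2a) extract the (lambda, phi) block P: local rows 5..8, local columns 0..4 */
  PetscCall(ISCreateStride(PETSC_COMM_WORLD, 4, 5, 1, &isrow));
  PetscCall(ISCreateStride(PETSC_COMM_WORLD, 5, 0, 1, &iscol));
  PetscCall(MatGetLocalSubMatrix(K, isrow, iscol, &P));
  /* 2b) indices here are local to the block; the value lands in K itself */
  PetscCall(MatSetValuesLocal(P, 1, &zero, 1, &zero, &v, INSERT_VALUES));
  PetscCall(MatRestoreLocalSubMatrix(K, isrow, iscol, &P));
  PetscCall(MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY));
  PetscCall(MatView(K, PETSC_VIEWER_STDOUT_WORLD));
  PetscCall(ISDestroy(&isrow));
  PetscCall(ISDestroy(&iscol));
  PetscCall(ISLocalToGlobalMappingDestroy(&l2g));
  PetscCall(MatDestroy(&K));
  PetscCall(PetscFinalize());
  return 0;
}

The same code would assemble an independent smaller matrix if P were created directly instead of extracted from K, which is the interchangeability being described below.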
>> >> Therefore, the code you write in 2) could work equally well making the large matrix from 1), or independent smaller matrix blocks. >> >> Does this make sense? >> >> Thanks, >> >> Matt >> >> I will use MatSetLocalToGlobalMapping(K, row_mapping, column_mapping) to build the bottom left [P]. >> >> >> Q2) Can I now reset the mapping as MatSetLocalToGlobalMapping(K, column_mapping, row_mapping) to build the top right [P^T]? >> >> >> Many thanks! >> >> Kind regards, >> Karthik. >> >> -- >> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> -------------- next part -------------- An HTML attachment was scrubbed... URL:
From karthikeyan.chockalingam at stfc.ac.uk Fri Apr 21 03:49:33 2023 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Fri, 21 Apr 2023 08:49:33 +0000 Subject: [petsc-users] question about MatSetLocalToGlobalMapping In-Reply-To: References: Message-ID: Thank you, Matt. It took me a while but I understand how to work with submatrices. (Below is a simple stand-alone code which assembles two submatrices into a matrix of 5 x 5, which might be of use to someone else) Q1) It looks like I don't need to call MatAssemblyBegin/End on submatrices? Q2) Though I have preallocated A for 5 x 5, why do I still need to call MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE)? Otherwise I get an error while assembling using MatSetValuesLocal(). Q3) Am I calling MatRestoreLocalSubMatrix() at the right place and what does it do? Q4) When assembling a large submatrix, can I call MatGetOwnershipRange on the submatrix? Best, Karthik.

#include <petsc.h> /* the header name was eaten by the list archiver; <petsc.h> is a reconstruction */
Mat A; /* linear system matrix */
PetscInt i,j,m = 3,n = 3, P = 5, Q = 5;
PetscErrorCode ierr;
PetscScalar v;
// create matrix A
ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr);
ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,P,Q);CHKERRQ(ierr);
ierr = MatSetFromOptions(A);CHKERRQ(ierr);
ierr = MatMPIAIJSetPreallocation(A,5,NULL,5,NULL);CHKERRQ(ierr);
//ierr = MatSeqAIJSetPreallocation(A,5,NULL);CHKERRQ(ierr);
// local indices is always 0 1 2
PetscInt c = 3, global_col_indices[] = {0, 1, 2};
PetscInt r = 2, global_row_indices[] = {3, 4};
ISLocalToGlobalMapping col_mapping, row_mapping;
ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, c, global_col_indices, PETSC_COPY_VALUES, &col_mapping);
ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, r, global_row_indices, PETSC_COPY_VALUES, &row_mapping);
MatSetLocalToGlobalMapping(A,row_mapping, col_mapping);
for (i=0; i<m; i++) /* [the loop bound and everything from here down to the quoted reply below were lost when the HTML attachment was scrubbed; per Q1-Q3 the body presumably used MatGetLocalSubMatrix(), MatSetValuesLocal(), and MatRestoreLocalSubMatrix()] */

From: Matthew Knepley Date: Thursday, 20 April 2023 at 11:37 To: Chockalingam, Karthikeyan (STFC,DL,HC) Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] question about MatSetLocalToGlobalMapping On Thu, Apr 20, 2023 at 6:13 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: Hello, I created a new thread, thought it would be more appropriate (and is a continuation of my previous post). I want to construct the below K matrix (which is composed of submatrices) K = [A P^T P 0] Where K is of type MatMPIAIJ. I first constructed the top left [A] using MatSetValues(). Now, I would like to construct the bottom left [p] and top right [p^T] using MatSetValuesLocal(). To use MatSetValuesLocal(), I first have to create a local-to-global mapping using ISLocalToGlobalMappingCreate.
I have created two mapping row_mapping and column_mapping. I do not understand why they are not the same map. Maybe I was unclear before. It looks like you have two fields, say phi and lambda, where lambda is a Lagrange multiplier imposing some constraint. Then you get a saddle point like this. You can imagine matrices (phi, phi) --> A (phi, lambda) --> P^T (lambda, phi) --> P So you make a L2G map for the phi field and the lambda field. Oh, you are calling them row and col map, but they are my phi and lambda maps. I do not like the row and col names since in P they reverse. Q1) At what point should I declare MatSetLocalToGlobalMapping ? is it just before I use MatSetValuesLocal()? Okay, it is good you are asking this because my thinking was somewhat confused. I think the precise steps are: 1) Create the large saddle point matrix K 1a) We must call https://petsc.org/main/manualpages/Mat/MatSetLocalToGlobalMapping/ on it. In the simplest case, this just maps the local rows numbers [0, Nrows) to the global rows numbers [rowStart, rowStart + Nrows). 2) To form each piece: 2a) Extract that block using https://petsc.org/main/manualpages/Mat/MatGetLocalSubMatrix/ This gives back a Mat object that you subsequently restore using https://petsc.org/main/manualpages/Mat/MatRestoreLocalSubMatrix/ 2b) Insert values using https://petsc.org/main/manualpages/Mat/MatSetValuesLocal/ The local indices used for insertion here are indices relative to the block itself, and the L2G map for this matrix has been rewritten to insert into that block in the larger matrix. Thus this looks like just calling MatSetValuesLocal() on the smaller matrix block, but inserts correctly into the larger matrix. Therefore, the code you write code in 2) could work equally well making the large matrix from 1), or independent smaller matrix blocks. Does this make sense? Thanks, Matt I will use MatSetLocalToGlobalMapping(K, row_mapping, column_mapping) to build the bottom left [P]. Q2) Can now I reset the mapping as MatSetLocalToGlobalMapping(K, column_mapping, row_mapping) to build the top right [P^T]? Many thanks! Kind regards, Karthik. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From karthikeyan.chockalingam at stfc.ac.uk Fri Apr 21 03:52:14 2023 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Fri, 21 Apr 2023 08:52:14 +0000 Subject: [petsc-users] question about MatSetLocalToGlobalMapping In-Reply-To: References: Message-ID: Thank you Hong. I will look into it. Best, Karthik. From: Zhang, Hong Date: Thursday, 20 April 2023 at 16:47 To: Matthew Knepley , Chockalingam, Karthikeyan (STFC,DL,HC) Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] question about MatSetLocalToGlobalMapping Karthik, We built a KKT matrix in TaoSNESJacobian_PDIPM() (see petsc/src/tao/constrained/impls/ipm/pdipm.c) which assembles several small matrices into a large KKT matrix in mpiaij format. You could take the same approach to insert P and P^T into your K. FYI, I attached our paper. 
Hong ________________________________ From: petsc-users on behalf of Matthew Knepley Sent: Thursday, April 20, 2023 5:37 AM To: Karthikeyan Chockalingam - STFC UKRI Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] question about MatSetLocalToGlobalMapping On Thu, Apr 20, 2023 at 6:13?AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: Hello, I created a new thread, thought would it be more appropriate (and is a continuation of my previous post). I want to construct the below K matrix (which is composed of submatrices) K = [A P^T P 0] Where K is of type MatMPIAIJ. I first constructed the top left [A] using MatSetValues(). Now, I would like to construct the bottom left [p] and top right [p^T] using MatSetValuesLocal(). To use MatSetValuesLocal(), I first have to create a local-to-global mapping using ISLocalToGlobalMappingCreate. I have created two mapping row_mapping and column_mapping. I do not understand why they are not the same map. Maybe I was unclear before. It looks like you have two fields, say phi and lambda, where lambda is a Lagrange multiplier imposing some constraint. Then you get a saddle point like this. You can imagine matrices (phi, phi) --> A (phi, lambda) --> P^T (lambda, phi) --> P So you make a L2G map for the phi field and the lambda field. Oh, you are calling them row and col map, but they are my phi and lambda maps. I do not like the row and col names since in P they reverse. Q1) At what point should I declare MatSetLocalToGlobalMapping ? is it just before I use MatSetValuesLocal()? Okay, it is good you are asking this because my thinking was somewhat confused. I think the precise steps are: 1) Create the large saddle point matrix K 1a) We must call https://petsc.org/main/manualpages/Mat/MatSetLocalToGlobalMapping/ on it. In the simplest case, this just maps the local rows numbers [0, Nrows) to the global rows numbers [rowStart, rowStart + Nrows). 2) To form each piece: 2a) Extract that block using https://petsc.org/main/manualpages/Mat/MatGetLocalSubMatrix/ This gives back a Mat object that you subsequently restore using https://petsc.org/main/manualpages/Mat/MatRestoreLocalSubMatrix/ 2b) Insert values using https://petsc.org/main/manualpages/Mat/MatSetValuesLocal/ The local indices used for insertion here are indices relative to the block itself, and the L2G map for this matrix has been rewritten to insert into that block in the larger matrix. Thus this looks like just calling MatSetValuesLocal() on the smaller matrix block, but inserts correctly into the larger matrix. Therefore, the code you write code in 2) could work equally well making the large matrix from 1), or independent smaller matrix blocks. Does this make sense? Thanks, Matt I will use MatSetLocalToGlobalMapping(K, row_mapping, column_mapping) to build the bottom left [P]. Q2) Can now I reset the mapping as MatSetLocalToGlobalMapping(K, column_mapping, row_mapping) to build the top right [P^T]? Many thanks! Kind regards, Karthik. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Fri Apr 21 08:14:31 2023 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 21 Apr 2023 09:14:31 -0400 Subject: Re: [petsc-users] issues with VecSetValues in petsc 3.19 In-Reply-To: References: Message-ID: On Thu, Apr 20, 2023 at 11:07 AM Edoardo alinovi wrote: > Hi Matt, > > thanks for sharing the literature. > > Would you suggest any monolithic approach for the mpiaij/mpibaij matrix > instead of fieldsplit? > I have not seen them be better, but Vanka-type smoothers can work for this system. > I did some blind search using gamg/hypre and they look terrible. I guess > I am missing a trick; probably they are not the way to go? > I believe that trick is that the patches you use have to be very specific. Thanks, Matt > Thanks! > > On Mon, 17 Apr 2023, 13:50 Matthew Knepley wrote: > >> On Mon, Apr 17, 2023 at 6:37 AM Edoardo alinovi < >> edoardo.alinovi at gmail.com> wrote: >> >>> Sure thing, the solver I am working on is this one: >>> https://gitlab.com/alie89/flubio-code-fvm. >>> >>> It is a 3D, collocated, unstructured, finite volume solver for >>> incompressible NS. I can run steady, unsteady and I can use SIMPLE, PISO >>> and Fractional step method (both explicit and fully implicit momentum). I >>> can also solve for turbulence (k-omega, BSL, SST, Spalart-Allmaras, LES). I >>> have also implemented some kind of Immersed boundary (2D/3D) that I need to >>> resume at some point. >>> >>> Hot topic of the moment, I am developing a fully coupled pressure based >>> solver using field-split. What I have now is working ok, I have validated >>> it on a lot of 2D problems and going on with 3D right now. If all the >>> tests are passed, I'll focus on tuning the field splitting which looks to >>> be a quite interesting topic! >>> >> >> I think a very good discussion of the issues from the point of view of >> FEM is >> >> https://arxiv.org/abs/1810.03315 >> >> There should be a similar analysis from the FVM side, although it might >> not be possible to >> find a pressure discretization compatible with the FVM velocity for this >> purpose. >> >> Thanks, >> >> Matt >> >> >>> Flubio is a project I have been carrying on since PhD days. The >>> implementation is 99% on my shoulders, despite the fact I am >>> collaborating with some people around. I am coding evenings and >>> weekends/free time, it gives me a lot of satisfaction and also a lot of >>> insights! >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From liufield at gmail.com Fri Apr 21 08:40:37 2023 From: liufield at gmail.com (neil liu) Date: Fri, 21 Apr 2023 09:40:37 -0400 Subject: [petsc-users] Fwd: Inquiry about the dual space (PetscFECreateTabulation_Basic) In-Reply-To: References: Message-ID: Hello, Petsc group, I am learning the FE structure in Petsc by running case https://petsc.org/main/src/snes/tutorials/ex12.c.html with -run_type test -bc_type dirichlet -dm_plex_interpolate 0 -petscspace_degree 1 -show_initial -dm_plex_print_fem 1 When I check the subroutine PetscFECreateTabulation_Basic, I cannot understand some parameters there. For the following lines in the file ( https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC ) 135: PetscCall(PetscDualSpaceGetDimension(fem->dualSpace, &pdim)); 136: PetscCall(PetscFEGetNumComponents(fem, &Nc)); Here, Nc = 2, pdim = 6. I am running a scalar case with degree of 1, I expect Nc = 1 and pdim = 3. Could you please explain this? In addition, Thanks, Xiaodong -------------- next part -------------- An HTML attachment was scrubbed... URL:
From knepley at gmail.com Fri Apr 21 09:05:08 2023 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 21 Apr 2023 10:05:08 -0400 Subject: Re: [petsc-users] Fwd: Inquiry about the dual space (PetscFECreateTabulation_Basic) In-Reply-To: References: Message-ID: On Fri, Apr 21, 2023 at 10:02 AM neil liu wrote: > Hello, Petsc group, > > I am learning the FE structure in Petsc by running case > https://petsc.org/main/src/snes/tutorials/ex12.c.html with -run_type test > -bc_type dirichlet -dm_plex_interpolate 0 -petscspace_degree 1 > -show_initial -dm_plex_print_fem 1 > -dm_plex_print_fem 5 will print much more > When I check the subroutine PetscFECreateTabulation_Basic, I cannot > understand some parameters there. > > For the following lines in the file ( > https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC > ) > > 135: PetscCall(PetscDualSpaceGetDimension(fem->dualSpace, &pdim)); 136: PetscCall(PetscFEGetNumComponents(fem, &Nc)); > > Here, Nc = 2, pdim = 6. I am running a scalar case with degree of 1, > > I expect Nc = 1 and pdim = 3. Could you please explain this? In addition, > > Sure. I am guessing that you are looking at the tabulation for the coordinate space. Here you are in 2 dimensions, so the coordinate space has Nc = 2. For multicomponent spaces, we currently do not represent it as a tensor product over the scalar space, so we see 6 basis vectors. Thanks, Matt > Thanks, > > Xiaodong > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
From liufield at gmail.com Fri Apr 21 09:36:34 2023 From: liufield at gmail.com (neil liu) Date: Fri, 21 Apr 2023 10:36:34 -0400 Subject: Re: [petsc-users] Fwd: Inquiry about the dual space (PetscFECreateTabulation_Basic) In-Reply-To: References: Message-ID: When you say "For multicomponent spaces, we currently do not represent it as a tensor product over the scalar space, so we see 6 basis vectors." Here, multicomponent = two dimensional? I am a little confused about the dimensions of the basis functions here.
From https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC 144: /* B[npoints, nodes, Nc] = tmpB[npoints, prime, Nc] * invV[prime, nodes] */ How do you define tmpB here (npoints = 3, prime = 6, Nc = 2)? I can get tmpB from PetscSpaceEvaluate_Polynomial, where tmpB (1x9) is (the prime polynomials being 1, x, y) [ 1 -0.6667 -0.6667 1 -0.6667 0.3333 1 0.3333 -0.6666]. How do you transform from this 1x9 to 3x6x2 there? Thanks, Xiaodong On Fri, Apr 21, 2023 at 10:05 AM Matthew Knepley wrote: > On Fri, Apr 21, 2023 at 10:02 AM neil liu wrote: > >> Hello, Petsc group, >> >> I am learning the FE structure in Petsc by running case >> https://petsc.org/main/src/snes/tutorials/ex12.c.html with -run_type test >> -bc_type dirichlet -dm_plex_interpolate 0 -petscspace_degree 1 >> -show_initial -dm_plex_print_fem 1 >> > > -dm_plex_print_fem 5 will print much more > > >> When I check the subroutine PetscFECreateTabulation_Basic, I cannot >> understand some parameters there. >> >> For the following lines in the file ( >> https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC >> ) >> >> 135: PetscCall(PetscDualSpaceGetDimension(fem->dualSpace, &pdim)); 136: PetscCall(PetscFEGetNumComponents(fem, &Nc)); >> >> Here, Nc = 2, pdim = 6. I am running a scalar case with degree of 1, >> >> I expect Nc = 1 and pdim = 3. Could you please explain this? In addition, >> >> Sure. I am guessing that you are looking at the tabulation for the > coordinate space. Here you are in 2 dimensions, so the > coordinate space has Nc = 2. For multicomponent spaces, we currently do > not represent it as a tensor product over the > scalar space, so we see 6 basis vectors. > > Thanks, > > Matt > >> Thanks, >> >> Xiaodong >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL:
From knepley at gmail.com Fri Apr 21 09:57:34 2023 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 21 Apr 2023 10:57:34 -0400 Subject: Re: [petsc-users] Fwd: Inquiry about the dual space (PetscFECreateTabulation_Basic) In-Reply-To: References: Message-ID: On Fri, Apr 21, 2023 at 10:36 AM neil liu wrote: > When you say "For multicomponent spaces, we currently do not represent it > as a tensor product over the scalar space, so we see 6 basis vectors." > Here, multicomponent = two dimensional? > If you have a vector in a two-dimensional space, it has 2 components, like our coordinate vector. > >> I am a little confused about the dimensions of the basis functions here. >> From >> https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC >> >> 144: /* B[npoints, nodes, Nc] = tmpB[npoints, prime, Nc] * invV[prime, nodes] */ >> >> How do you define tmpB here (npoints = 3, prime = 6, Nc = 2)? I can get tmpB from >> >> PetscSpaceEvaluate_Polynomial, where tmpB (1x9) is (the prime polynomials being 1, x, y) >> >> [ 1 -0.6667 -0.6667 1 -0.6667 0.3333 1 0.3333 -0.6666]. How do you transform from this 1x9 to 3x6x2 there? >> >> > npoints is the number of quadrature points at which to evaluate > > nodes (pdim) is the number of functions in the space > > Nc is the number of components for each function. > > So a P1 basis for vectors looks like > > / 1 \ / 0 \ / x \ / 0 \ / y \ / 0 \
> \ 0 / \ 1 / \ 0 / \ x / \ 0 / \ y /
> > six vectors with 2 components each.
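To make the B[npoints, nodes, Nc] layout concrete, here is a short editorial sketch (not from the thread) of walking the cell tabulation of an already-created PetscFE named fe; the PetscTabulation fields Np, Nb, Nc and the T[0] = B convention are assumed as documented in PETSc's petscdt.h, so verify against your version:

/* Sketch: print every entry of the basis tabulation B of an existing PetscFE.
   T is borrowed from fe, so it must not be destroyed by the caller. */
PetscTabulation T;
PetscCall(PetscFEGetCellTabulation(fe, 1, &T)); /* tabulate values (B) and first derivatives (D) */
for (PetscInt p = 0; p < T->Np; ++p) {          /* quadrature points (npoints) */
  for (PetscInt b = 0; b < T->Nb; ++b) {        /* basis functions (pdim/nodes) */
    for (PetscInt c = 0; c < T->Nc; ++c) {      /* components */
      PetscReal Bval = T->T[0][(p*T->Nb + b)*T->Nc + c];
      PetscCall(PetscPrintf(PETSC_COMM_SELF, "B[%" PetscInt_FMT "][%" PetscInt_FMT "][%" PetscInt_FMT "] = %g\n", p, b, c, (double)Bval));
    }
  }
}

With the P1 vector basis above, T->Nb would be 6 and T->Nc would be 2, matching the six two-component vectors.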
Thanks, Matt > Thanks, > > Xiaodong > > > > > > > On Fri, Apr 21, 2023 at 10:05?AM Matthew Knepley > wrote: > >> On Fri, Apr 21, 2023 at 10:02?AM neil liu wrote: >> >>> Hello, Petsc group, >>> >>> I am learning the FE structure in Petsc by running case >>> https://petsc.org/main/src/snes/tutorials/ex12.c.html with -run_type >>> test -bc_type dirichlet -dm_plex_interpolate 0 -petscspace_degree 1 >>> -show_initial -dm_plex_print_fem 1 >>> >> >> -dm_plex_print_fem 5 will print much more >> >> >>> When I check the subroutine PetscFECreateTabulation_Basic, I can not >>> understand some parameters there. >>> >>> For the following lines in the file ( >>> https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC >>> ) >>> >>> 135: PetscCall (PetscDualSpaceGetDimension (fem->dualSpace, &pdim));136: PetscCall (PetscFEGetNumComponents (fem, &Nc)); >>> >>> Here, Nc = 2, pdim =6. I am running a scalar case with degree of 1, >>> >>> I expect Nc = 1 and pdim =3. Could you please explain this? In addition, >>> >>> Sure. I am guessing that you are looking at the tabulation for the >> coordinate space. Here you are in 2 dimensions, so the >> coordinate space has Nc = 2. For multicomponent spaces, we currently do >> not represent it as a tensor product over the >> scalar space, so we see 6 basis vectors. >> >> Thanks, >> >> Matt >> >>> Thanks, >>> >>> Xiaodong >>> >>> >>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Fri Apr 21 10:29:21 2023 From: hzhang at mcs.anl.gov (Zhang, Hong) Date: Fri, 21 Apr 2023 15:29:21 +0000 Subject: [petsc-users] question about MatSetLocalToGlobalMapping In-Reply-To: <2D91313B-426F-45AD-A1B2-C3BBAC4394CB@lip6.fr> References: <2B2EFEAE-41BB-444A-864A-8129DEF8C7B4@lip6.fr> <2D91313B-426F-45AD-A1B2-C3BBAC4394CB@lip6.fr> Message-ID: Pierre, ________________________________ 1) Is there any hope to get PDIPDM to use a MatNest? KKT matrix is indefinite and ill-conditioned, which must be solved using a direct matrix factorization method. But you are using PCBJACOBI in the paper you attached? In any case, there are many such systems, e.g., a discretization of Stokes equations, that can be solved with something else than a direct factorization. The KKT in this paper is very special, each block represents a time step with loose and weak couplings. We tested larger and slightly varying physical parameters, PCZBJACOBI fails convergence while CHOLESKY encounters out of memory. For the current implementation, we use MUMPS Cholesky as default. To use MatNest, what direct solver to use, SCHUR_FACTOR? I do not know how to get it work. On the one hand, MatNest can efficiently convert to AIJ or SBAIJ if you want to stick to PCLU or PCCHOLESKY. On the other hand, it allows to easily switch to PCFIELDSPLIT which can be used to solve saddle-point problems. MatNest would be a natural way for these type of applications. PETSc has MatConvert_Nest_AIJ(), which could enable users to assemble their matrices in MatNest, and apply mumps/superlu direct solvers. 
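As a concrete sketch of the route Hong describes (illustrative only; the already-assembled block matrices A00, A01, A10, A11 are assumed here and are not from the thread):

/* Assemble a 2x2 MatNest and convert it to AIJ so a direct solver can be used. */
Mat blocks[4] = {A00, A01, A10, A11}; /* row-major 2x2 layout; NULL marks an empty block */
Mat Knest, Kaij;
PetscCall(MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &Knest));
PetscCall(MatConvert(Knest, MATAIJ, MAT_INITIAL_MATRIX, &Kaij));
/* Kaij can now go to a KSP with, e.g., -pc_type cholesky -pc_factor_mat_solver_type mumps */

The conversion copies the values into one assembled AIJ matrix, after which any AIJ-based factorization package applies, at a memory cost that is usually small next to the factors themselves.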
I'm not concerned about the overhead of matrix conversion, because matrix factorization dominates computation in general. 2) Is this fixed https://lists.mcs.anl.gov/pipermail/petsc-dev/2020-September/026398.html ? I cannot get users to transition away from Ipopt because of these two missing features. The existing pdipm is the result of a MS student intern project. None of us involved are experts on the optimization solvers. We made a straightforward parallelization of Ipopt. It indeed needs further work, e.g., more features, better matrix storage, convergence criteria... To our knowledge, parallel pdipm is not available other than our pdipm. Ipopt can use MUMPS and PARDISO internally, so it?s in some sense parallel (using shared memory). Also, this is not a very potent selling point. My users that are satisfied with Ipopt as a "non-parallel" black box don?t want to have to touch part of their code just to stick it in a parallel black box which is limited to the same kind of linear solver and which has severe limitations with respect to Hessian/Jacobian/constraint distributions. We saw exiting 'parallel' pdipm papers, which conduct sequential computation and send KKT matrices to multiple processors, then they show scaling results on matrix computation only. Again, I wish someone can help to improve tao/pdipm. This is a useful solver. We should improve our pdipm. Hong On 20 Apr 2023, at 5:47 PM, Zhang, Hong via petsc-users > wrote: Karthik, We built a KKT matrix in TaoSNESJacobian_PDIPM() (see petsc/src/tao/constrained/impls/ipm/pdipm.c) which assembles several small matrices into a large KKT matrix in mpiaij format. You could take the same approach to insert P and P^T into your K. FYI, I attached our paper. Hong ________________________________ From: petsc-users > on behalf of Matthew Knepley > Sent: Thursday, April 20, 2023 5:37 AM To: Karthikeyan Chockalingam - STFC UKRI > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] question about MatSetLocalToGlobalMapping On Thu, Apr 20, 2023 at 6:13?AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: Hello, I created a new thread, thought would it be more appropriate (and is a continuation of my previous post). I want to construct the below K matrix (which is composed of submatrices) K = [A P^T P 0] Where K is of type MatMPIAIJ. I first constructed the top left [A] using MatSetValues(). Now, I would like to construct the bottom left [p] and top right [p^T] using MatSetValuesLocal(). To use MatSetValuesLocal(), I first have to create a local-to-global mapping using ISLocalToGlobalMappingCreate. I have created two mapping row_mapping and column_mapping. I do not understand why they are not the same map. Maybe I was unclear before. It looks like you have two fields, say phi and lambda, where lambda is a Lagrange multiplier imposing some constraint. Then you get a saddle point like this. You can imagine matrices (phi, phi) --> A (phi, lambda) --> P^T (lambda, phi) --> P So you make a L2G map for the phi field and the lambda field. Oh, you are calling them row and col map, but they are my phi and lambda maps. I do not like the row and col names since in P they reverse. Q1) At what point should I declare MatSetLocalToGlobalMapping ? is it just before I use MatSetValuesLocal()? Okay, it is good you are asking this because my thinking was somewhat confused. I think the precise steps are: 1) Create the large saddle point matrix K 1a) We must call https://petsc.org/main/manualpages/Mat/MatSetLocalToGlobalMapping/ on it. 
In the simplest case, this just maps the local rows numbers [0, Nrows) to the global rows numbers [rowStart, rowStart + Nrows). 2) To form each piece: 2a) Extract that block using https://petsc.org/main/manualpages/Mat/MatGetLocalSubMatrix/ This gives back a Mat object that you subsequently restore using https://petsc.org/main/manualpages/Mat/MatRestoreLocalSubMatrix/ 2b) Insert values using https://petsc.org/main/manualpages/Mat/MatSetValuesLocal/ The local indices used for insertion here are indices relative to the block itself, and the L2G map for this matrix has been rewritten to insert into that block in the larger matrix. Thus this looks like just calling MatSetValuesLocal() on the smaller matrix block, but inserts correctly into the larger matrix. Therefore, the code you write code in 2) could work equally well making the large matrix from 1), or independent smaller matrix blocks. Does this make sense? Thanks, Matt I will use MatSetLocalToGlobalMapping(K, row_mapping, column_mapping) to build the bottom left [P]. Q2) Can now I reset the mapping as MatSetLocalToGlobalMapping(K, column_mapping, row_mapping) to build the top right [P^T]? Many thanks! Kind regards, Karthik. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From liufield at gmail.com Fri Apr 21 11:37:50 2023 From: liufield at gmail.com (neil liu) Date: Fri, 21 Apr 2023 12:37:50 -0400 Subject: [petsc-users] Fwd: Inquiry about the dual space (PetscFECreateTabulation_Basic) In-Reply-To: References: Message-ID: Thanks a lot. Very helpful. On Fri, Apr 21, 2023 at 10:57?AM Matthew Knepley wrote: > On Fri, Apr 21, 2023 at 10:36?AM neil liu wrote: > >> When you say "For multicomponent spaces, we currently do not represent it >> as a tensor product over the scalar space, so we see 6 basis vectors." >> Here, muticomponent = two dimensional ? >> > > If you have a vector in a two-dimensional space, it has 2 components, like > our coordinate vector. > > >> I am a little confused about the dimensions of the basis functions here. >> From >> https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC >> >> 144: /* B[npoints, nodes, Nc] = tmpB[npoints, prime, Nc] * invV[prime, nodes] */ >> >> How do you define tmpB here (npoints =3, prime =6, Nc =2)? I can get tmpB from >> >> PetscSpaceEvaluate_Polynomial, where, tmpB (1x9) is (the prime polynomial is defined by 1 x y)) >> >> [ 1 -0.6667 -0.6667 1 -0.6667 0.3333 1 0.3333 -0.6666]. How do you transform from this 1x9 to 3x6x2 there. >> >> > npoints is the number of quadrature points at which to evaluate > > nodes (pdim) is the number of functions in the space > > Nc is the number of components for each function. > > So a P1 basis for vectors looks like > > / 1 \ / 0 \ / x \ / 0 \ / y \ / 0 \ > \ 0 / \ 1 / \ 0 / \ x / \ 0 / \ y / > > six vectors with 2 components each. 
> > Thanks, > > Matt > >> Thanks, >> >> Xiaodong >> >> >> >> >> >> >> On Fri, Apr 21, 2023 at 10:05?AM Matthew Knepley >> wrote: >> >>> On Fri, Apr 21, 2023 at 10:02?AM neil liu wrote: >>> >>>> Hello, Petsc group, >>>> >>>> I am learning the FE structure in Petsc by running case >>>> https://petsc.org/main/src/snes/tutorials/ex12.c.html with -run_type >>>> test -bc_type dirichlet -dm_plex_interpolate 0 -petscspace_degree 1 >>>> -show_initial -dm_plex_print_fem 1 >>>> >>> >>> -dm_plex_print_fem 5 will print much more >>> >>> >>>> When I check the subroutine PetscFECreateTabulation_Basic, I can not >>>> understand some parameters there. >>>> >>>> For the following lines in the file ( >>>> https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC >>>> ) >>>> >>>> 135: PetscCall (PetscDualSpaceGetDimension (fem->dualSpace, &pdim));136: PetscCall (PetscFEGetNumComponents (fem, &Nc)); >>>> >>>> Here, Nc = 2, pdim =6. I am running a scalar case with degree of 1, >>>> >>>> I expect Nc = 1 and pdim =3. Could you please explain this? In addition, >>>> >>>> Sure. I am guessing that you are looking at the tabulation for the >>> coordinate space. Here you are in 2 dimensions, so the >>> coordinate space has Nc = 2. For multicomponent spaces, we currently do >>> not represent it as a tensor product over the >>> scalar space, so we see 6 basis vectors. >>> >>> Thanks, >>> >>> Matt >>> >>>> Thanks, >>>> >>>> Xiaodong >>>> >>>> >>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From frye.roger at gmail.com Fri Apr 21 12:44:00 2023 From: frye.roger at gmail.com (Roger Frye) Date: Fri, 21 Apr 2023 11:44:00 -0600 Subject: [petsc-users] Configure GPU for Apple M1 chip Message-ID: <889F4090-8E99-42C0-BFE1-EF657C91910E@gmail.com> How do I configure PETSc to use gpu on buy Mac powerbook where the gpu is integrated into the Apple M1 chip? Do I need to download the Apple developer Metal toolkit, and I do I need to install a clang compiler? -Roger From bsmith at petsc.dev Fri Apr 21 14:50:06 2023 From: bsmith at petsc.dev (Barry Smith) Date: Fri, 21 Apr 2023 15:50:06 -0400 Subject: [petsc-users] Configure GPU for Apple M1 chip In-Reply-To: <889F4090-8E99-42C0-BFE1-EF657C91910E@gmail.com> References: <889F4090-8E99-42C0-BFE1-EF657C91910E@gmail.com> Message-ID: <44E36AC1-994C-4814-929A-523B549FE16D@petsc.dev> PETSc has no capability to use the Apple GPUs. We do welcome merge requests, but I suspect it requires a great deal of work. Barry > On Apr 21, 2023, at 1:44 PM, Roger Frye wrote: > > How do I configure PETSc to use gpu on buy Mac powerbook where the gpu is integrated into the Apple M1 chip? > Do I need to download the Apple developer Metal toolkit, and I do I need to install a clang compiler? 
> -Roger > From jacob.fai at gmail.com Fri Apr 21 14:53:22 2023 From: jacob.fai at gmail.com (Jacob Faibussowitsch) Date: Fri, 21 Apr 2023 15:53:22 -0400 Subject: [petsc-users] Configure GPU for Apple M1 chip In-Reply-To: <889F4090-8E99-42C0-BFE1-EF657C91910E@gmail.com> References: <889F4090-8E99-42C0-BFE1-EF657C91910E@gmail.com> Message-ID: <6AA3D51F-1A61-45CF-A0D0-02F9119987B9@gmail.com> > How do I configure PETSc to use gpu on buy Mac powerbook where the gpu is integrated into the Apple M1 chip? PETSc does not currently support metal. Our GPU support is currently limited to CUDA, HIP, and SYCL. > I do I need to install a clang compiler? macOS comes with a clang compiler installed, but if you want to ensure one is installed you can run `xcode-select --install`. Best regards, Jacob Faibussowitsch (Jacob Fai - booss - oh - vitch) > On Apr 21, 2023, at 13:44, Roger Frye wrote: > > How do I configure PETSc to use gpu on buy Mac powerbook where the gpu is integrated into the Apple M1 chip? > Do I need to download the Apple developer Metal toolkit, and I do I need to install a clang compiler? > -Roger > From bsmith at petsc.dev Sun Apr 23 17:48:43 2023 From: bsmith at petsc.dev (Barry Smith) Date: Sun, 23 Apr 2023 18:48:43 -0400 Subject: [petsc-users] Fieldsplit with redistribute In-Reply-To: References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> Message-ID: <0DB90ED8-6BD0-431E-B592-37B89DE3DE28@petsc.dev> I have added support for PCREDISTRIBUTE to propogate your PCFieldSplitSetIS() down to an inner PCFIELDSPLIT. You can access it with git fetch git checkout barry/2023-04-22/fieldsplit-fields-propogate ./configure make all check Take a look at src/ksp/ksp/tutorials/ex84.c and run with the options at the bottom of the file. Please let us know at https://gitlab.com/petsc/petsc/-/merge_requests/6366 if it works for you or you have any difficulties. Barry > On Apr 20, 2023, at 10:14 AM, Carl-Johan Thore wrote: > > Great, thanks! I?ve attached the code, a makefile, and a 1-page power-point which hopefully explains > what I?m trying to do on this little toy-problem. There is obviously (?) something I need to add around > line 327 in the code in order to move the indices to the correct rank. > > Output should be something like this when running: > ? > > Let me know if you need any more info, or if the code is incomprehensible or so > (it?s long because I?ve copied a lot from redistribute.c) > > Kind regards, > Carl-Johan > > From: Barry Smith > > Sent: Thursday, April 20, 2023 3:17 PM > To: Carl-Johan Thore > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > Sure > > > On Apr 20, 2023, at 4:09 AM, Carl-Johan Thore > wrote: > > Hi Barry, > > In the conversation below you mentioned that I could send code to you to take a look. I?ve written > up what I think is a minimally working example for this. It?s almost there in the sense of distributing > the correct number of indices to the ranks to match the reduced matrix, but it?s the wrong indices. > Would it be okay if I sent you the code to have look? > > Kind regards, > Carl-Johan > > From: Barry Smith > > Sent: Sunday, April 16, 2023 10:31 PM > To: Carl-Johan Thore > > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > The manual page for ISEmbed is incomprehensible to me. Anyways no matter what, you need to know what degrees of freedom are removed by PCDistribute() in order to produce the reduced IS which is why I think you need information only available inside PCSetUp_Redistribute(). 
(Sorry it is PCSetUp_Redistribute() not PCApply_Redistribute()) > > Barry > > > > > On Apr 16, 2023, at 3:36 PM, Carl-Johan Thore > wrote: > > Thanks for the quick reply Barry! > I have not tried the version with PCApply_Redistribute that you suggest, but I have a code that does roughly what you describe. It works when running on one rank, but fails on multiple ranks. I suspect the issue is with the use of ISEmbed as, quoting the PETSc-manual, "the resulting IS is sequential, since the index substitution it encodes is purely local" (admittedly I don't fully understand what that means). If you think using ISEmbed is not a good idea, I'll try PCApply_Redistribute() > From: Barry Smith > > Sent: 16 April 2023 21:11:18 > To: Carl-Johan Thore > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > There is no code to do this currently. > > I would start by building your IS for each split before the PCRedistribute and then adding to the PCApply_Redistribute() code that "fixes" these IS by "removing" the entries of the IS associated with removed degrees of freedom and then shifting the entries indices of the IS by taking into account the removed indices. But you have probably already been trying this? It does require digging directly into the PCApply_Redistribute() to get the needed information (which degrees of freedom are removed by the redistribute code), plus it requires shifting the MPI rank ownership of the entries of the IS in the same way the MPI rank ownership of the degrees of freedom of the vector are moved. > > If you have some code that you think should be doing this but doesn't work feel free to send it to us and we may be able to fix it. > > Barry > > > > On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users > wrote: > > > > Hello, > > I'm solving a blocksystem > > [A C; > > C' D], > > where D is not zero, using the PCFIELDSPLIT preconditioner and set the split using PetscFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS-routines). Can PETSc do this automatically? Or else, any hints? > > Kind regards, > > Carl-Johan > > ??? -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.png Type: image/png Size: 25214 bytes Desc: not available URL: -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: explanation.pptx Type: application/vnd.openxmlformats-officedocument.presentationml.presentation Size: 515540 bytes Desc: not available URL: -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: IStest.cc Type: application/octet-stream Size: 15346 bytes Desc: not available URL: -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: makefile Type: application/octet-stream Size: 451 bytes Desc: not available URL: -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jixingzhou918 at gmail.com Mon Apr 24 06:32:35 2023 From: jixingzhou918 at gmail.com (=?UTF-8?B?5ZCJ5YW05rSy?=) Date: Mon, 24 Apr 2023 19:32:35 +0800 Subject: [petsc-users] Reading user generated Cartesian grid into DMXX Message-ID: Dear all, I'm solving a fluid problem and it has multiple square cylinders in the flow area. Unfortunately, I have to use a Cartesian grid (nonuniform) which can't generated by Gmsh. I have noticed that the object *DMDA* and *DMFOREST* typically have there own intrinsic methods such as *DMDACreate2d *to generate a structured grid or index. But in my case (multiple cylinders in the whole area), it is quite complicated or maybe impossible to use these intrinsic methods generating a Cartesian grid, if I'm not missing something in the *DM * tutorial. So I wonder if I can somehow define my own DMXX to directly read my generated grid. Or is there any other way to deal with (meaning reading the topology and geometry) this "complex" Cartesian grid? All the best, Xingzhou -------------- next part -------------- An HTML attachment was scrubbed... URL: From curfman at anl.gov Mon Apr 24 07:57:18 2023 From: curfman at anl.gov (McInnes, Lois Curfman) Date: Mon, 24 Apr 2023 12:57:18 +0000 Subject: [petsc-users] Celebrate Supercomputing Successes! Nominate colleagues for SIAG/Supercomputing Awards In-Reply-To: <6286350E-CFA1-4208-9B23-BC58D7A9579B@anl.gov> References: <6286350E-CFA1-4208-9B23-BC58D7A9579B@anl.gov> Message-ID: <5158EFA7-32E0-4B71-8B0B-4B42FCECFC36@anl.gov> Dear all -- Please consider nominating colleagues for these 3 SIAG/Supercomputing Awards ? and spread the word about these opportunities. The nomination deadline is July 31. We especially seek candidates for the SIAG/Supercomputing Early Career Prize: Prize Description The SIAM Activity Group on Supercomputing Early Career Prize (SIAG/SC Early Career Prize) is awarded every two years to one individual in their early career for outstanding research contributions in the field of algorithms research and development for parallel scientific and engineering computing in the three calendar years prior to the award year. ________________________________ Eligibility Criteria The recipient's work must be a significant research contribution to algorithms for parallel computing in science and engineering. At least one of the papers containing this work must be published in English in a peer-reviewed journal or conference proceedings bearing a publication date within the three calendar years prior to the year of the award, though a body of papers may be discussed in the nomination. Moreover, either the recipient must be a graduate student or the paper's publication date must be no more than three (3) calendar years later than the year in which the author received the PhD or equivalent degree. The committee may consider exceptions to the three-years-from PhD rule, for career interruptions or delays occurring, e.g., for childbearing, child rearing, or elder care. The award can be received only once in a lifetime. For the 2024 award, the paper must have been published between the dates of January 1, 2021 ? December 31, 2023. The candidate must have been awarded their PhD no earlier than 2018 or may be a current graduate student. ________________________________ Required Materials ? Nominator?s letter of recommendation for candidate ? Candidate's CV ? Bibliographic citation for candidate?s key contributing paper ? 
- Two or three letters of support from experts in the field

More info is below --

Thanks,
The 2022-2023 SIAG/Supercomputing officers
Lois Curfman McInnes (chair)
Hatem Ltaief (vice chair)
Michael Bader (program director)
Rio Yokota (secretary)

From: SIAM
Date: Friday, March 24, 2023 at 12:32 AM
Subject: SIAG on Supercomputing Community Digest for Thursday March 23, 2023

Celebrate Supercomputing Successes! Nominate colleagues for SIAG/Supercomputing Awards
Rio Yokota, Mar 23, 2023 8:31 AM

* SIAG/Supercomputing Early Career Prize: Established in 2009, the prize is awarded to an individual in their early career for contributions in the field of algorithms research and development for parallel scientific and engineering computing.
* SIAG/Supercomputing Career Prize: Established in 2009, the prize is awarded to a senior researcher for broad and distinguished contributions to the field of algorithms research and development for parallel scientific and engineering computing.
* SIAG/Supercomputing Best Paper Prize: Established in 2015, the prize is awarded to the author or authors of the most outstanding paper in the field of parallel scientific and engineering computing published in English in a peer-reviewed journal.

More info, including eligibility criteria and required nomination materials: https://siag-sc.org/siagsc-award-nominations.html
Nominations deadline: July 31, 2023
Prizes will be awarded at the 2024 SIAM Conference on Parallel Processing for Scientific Computing (PP24).

Rio Yokota
Tokyo Institute of Technology, Tokyo

From knepley at gmail.com Mon Apr 24 08:47:58 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 24 Apr 2023 09:47:58 -0400
Subject: [petsc-users] Reading user generated Cartesian grid into DMXX
In-Reply-To:
References:
Message-ID:

On Mon, Apr 24, 2023 at 7:33 AM 吉兴洲 wrote:

> Dear all,
>
> I'm solving a fluid problem that has multiple square cylinders in the flow area. Unfortunately, I have to use a (nonuniform) Cartesian grid, which can't be generated by Gmsh.
>
> I have noticed that objects such as DMDA and DMFOREST typically have their own intrinsic methods, such as DMDACreate2d, to generate a structured grid or index. But in my case (multiple cylinders in the whole area), it is quite complicated, or maybe impossible, to use these intrinsic methods to generate a Cartesian grid, if I'm not missing something in the DM tutorial.
>
> So I wonder if I can somehow define my own DMXX to directly read my generated grid. Or is there any other way to deal with (meaning reading the topology and geometry of) this "complex" Cartesian grid?

By "Cartesian grid", we generally mean a regular rectangular grid, where we have an m x n grid of cells. This is also often called a "structured grid". If you mean an irregular arrangement of square cells, then this is an "unstructured grid" using "quadrilateral cells".
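If it is the former (a logically rectangular grid whose grid lines are merely unevenly spaced), a DMDA can still represent it: create the DMDA as usual and then overwrite its coordinates with the grid-line positions. A minimal sketch, where the arrays xs[] and ys[] of grid-line coordinates are assumed to come from the user's own mesh generator:

  #include <petscdmda.h>

  /* Sketch: an mx x my rectilinear grid as a DMDA. xs[] and ys[] are
     hypothetical arrays of grid-line positions supplied by the user. */
  PetscErrorCode CreateRectilinearDA(MPI_Comm comm, PetscInt mx, PetscInt my,
                                     const PetscReal xs[], const PetscReal ys[], DM *da)
  {
    DM           cda;
    Vec          xy;
    DMDACoor2d **coords;
    PetscInt     i, j, ci, cj, cm, cn;

    PetscFunctionBeginUser;
    PetscCall(DMDACreate2d(comm, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DMDA_STENCIL_STAR,
                           mx, my, PETSC_DECIDE, PETSC_DECIDE, 1, 1, NULL, NULL, da));
    PetscCall(DMSetUp(*da));
    PetscCall(DMDASetUniformCoordinates(*da, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0)); /* allocates the coordinate vector */
    PetscCall(DMGetCoordinateDM(*da, &cda));
    PetscCall(DMGetCoordinates(*da, &xy));
    PetscCall(DMDAVecGetArray(cda, xy, &coords));
    PetscCall(DMDAGetCorners(cda, &ci, &cj, NULL, &cm, &cn, NULL));
    for (j = cj; j < cj + cn; j++) {
      for (i = ci; i < ci + cm; i++) {
        coords[j][i].x = xs[i]; /* replace the uniform spacing with the user's grid lines */
        coords[j][i].y = ys[j];
      }
    }
    PetscCall(DMDAVecRestoreArray(cda, xy, &coords));
    PetscFunctionReturn(0);
  }

The square cylinders themselves would then be handled in the discretization (e.g., by masking cells inside the obstacles), since the DMDA only knows about the logically rectangular index space.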
DMForest deals with "structured adaptive" grids, meaning structured grids where we allow cells to be divided without dividing their neighbors. This creates hanging nodes, which must be dealt with somehow in the discretization.

What kind of mesh are you talking about?

  Thanks,
     Matt

> All the best,
> Xingzhou

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From karthikeyan.chockalingam at stfc.ac.uk Mon Apr 24 09:21:57 2023
From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI)
Date: Mon, 24 Apr 2023 14:21:57 +0000
Subject: [petsc-users] Setting up a matrix for Lagrange multiplier
In-Reply-To:
References:
Message-ID:

Hello,

I was able to construct the below K matrix (using submatrices P and P^T), which is of type MATAIJ

K = [A P^T
     P 0]

and solved it using a direct solver. However, I was reading online that this is a saddle point problem and I should be employing PCFIELDSPLIT. Since I have one monolithic matrix K, I was not sure how to split the fields.

Best regards,
Karthik.

From: Chockalingam, Karthikeyan (STFC,DL,HC)
Date: Wednesday, 19 April 2023 at 17:52
To: Matthew Knepley
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier

I have declared the mapping

ISLocalToGlobalMapping mapping;
ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, n, nindices, PETSC_COPY_VALUES, &mapping);

But when I use MatSetValuesLocal(), how do I know the above mapping is employed, because it is not one of the parameters passed to the function?

Thank you.

Kind regards,
Karthik.

From: Matthew Knepley
Date: Tuesday, 18 April 2023 at 16:21
To: Chockalingam, Karthikeyan (STFC,DL,HC)
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier

On Tue, Apr 18, 2023 at 11:16 AM Karthikeyan Chockalingam - STFC UKRI wrote:

Thank you for your response. I spent some time understanding how MatSetValuesLocal and ISLocalToGlobalMappingCreate work.

You can look at SNES ex28 where we do this with DMCOMPOSITE.

Q1) Will the matrix K be of type MATMPIAIJ or MATIS?
K = [A P^T
     P 0]

I assume MPIAIJ since IS is only used for Neumann-Neumann decompositions.

Q2) Can I use both MatSetValues() and MatSetValuesLocal() to populate K? Since I have already used MatSetValues() to construct A.

You can, and there would be no changes in serial if K is exactly the upper left block, but in parallel the global indices would change.

Q3) What are the advantages of using MatSetValuesLocal()? Is it that I can construct P directly using local indices and map the entries to the global index in K?

You have a monolithic K, so that you can use sparse direct solvers to check things. This is impossible with separate storage.

Q4) I probably don't have to construct an independent P matrix

You wouldn't in this case.

  Thanks,
     Matt

Best regards,
Karthik.

From: Matthew Knepley
Date: Tuesday, 18 April 2023 at 11:08
To: Chockalingam, Karthikeyan (STFC,DL,HC)
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier

On Tue, Apr 18, 2023 at 5:24 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote:

Hello,

I'm solving a problem using the Lagrange multiplier; the matrix has the form

K = [A P^T
     P 0]

I am familiar with constructing K using MATMPIAIJ.
However, I would like to know: if I have [A], can I augment it with [P], [P^T] and [0] of type MATMPIAIJ? Likewise for vectors as well.

Can you please point me to the right resource, if it is a common operation in PETSc?

You can do this in at least 2 ways:

1) Assemble your submatrices directly into the larger matrix by constructing local-to-global maps for the emplacement, so that you do not change your assembly code, except to change MatSetValues() to MatSetValuesLocal(). This is usually preferable.

2) Use MATNEST and VecNEST to put pointers to submatrices and subvectors directly in.

  Thanks,
     Matt

Many thanks.

Kind regards,
Karthik.

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From bsmith at petsc.dev Mon Apr 24 09:26:00 2023
From: bsmith at petsc.dev (Barry Smith)
Date: Mon, 24 Apr 2023 10:26:00 -0400
Subject: [petsc-users] Fieldsplit with redistribute
In-Reply-To: References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> <0DB90ED8-6BD0-431E-B592-37B89DE3DE28@petsc.dev>
Message-ID: <88F183A6-6553-4288-89A5-DD1903EA9956@petsc.dev>

The bug was mine; I was freeing the map object outside of the if () instead of inside. You can do

git pull
make all

and then try again.

  Barry

> On Apr 24, 2023, at 5:39 AM, Carl-Johan Thore wrote:
>
> Hi Barry!
>
> First of all, thank you very very much for this! I was expecting maybe a few hints and pointers on how to proceed with my work, but then you did a complete implementation :)
>
> Your code ran fine with ex84.cc. Unfortunately it crashed when running on my main code (a mixed Stokes solver). When running on 1 core I get a crash which is maybe related to your code, so I've attached the error message for that case. However, on multiple cores I think the issue is mainly that I'm not constructing the original IS correctly, so I'll look into that myself.
>
> Regarding reporting to https://gitlab.com/petsc/petsc/-/merge_requests/6366, should it be done there?
>
> By the way, I managed yesterday to make a working implementation of my own example and was planning to send it after cleaning it up and maybe optimizing a bit. I've attached it if you're curious (or just want to have a good laugh :))
>
> Kind regards,
> Carl-Johan
>
> From: Barry Smith
> Sent: Monday, April 24, 2023 12:49 AM
> To: Carl-Johan Thore
> Cc: PETSc
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>
> I have added support for PCREDISTRIBUTE to propagate your PCFieldSplitSetIS() down to an inner PCFIELDSPLIT. You can access it with
>
> git fetch
> git checkout barry/2023-04-22/fieldsplit-fields-propogate
> ./configure
> make all check
>
> Take a look at src/ksp/ksp/tutorials/ex84.c and run with the options at the bottom of the file.
>
> Please let us know at https://gitlab.com/petsc/petsc/-/merge_requests/6366 if it works for you or you have any difficulties.
>
> Barry
>
>> On Apr 20, 2023, at 10:14 AM, Carl-Johan Thore wrote:
>>
>> Great, thanks! I've attached the code, a makefile, and a 1-page PowerPoint which hopefully explains what I'm trying to do on this little toy problem. There is obviously (?)
>> something I need to add around line 327 in the code in order to move the indices to the correct rank.
>>
>> Output should be something like this when running: [image scrubbed]
>>
>> Let me know if you need any more info, or if the code is incomprehensible or so (it's long because I've copied a lot from redistribute.c)
>>
>> Kind regards,
>> Carl-Johan
>>
>> From: Barry Smith
>> Sent: Thursday, April 20, 2023 3:17 PM
>> To: Carl-Johan Thore
>> Subject: Re: [petsc-users] Fieldsplit with redistribute
>>
>> Sure
>>
>>> On Apr 20, 2023, at 4:09 AM, Carl-Johan Thore wrote:
>>>
>>> Hi Barry,
>>>
>>> In the conversation below you mentioned that I could send code to you to take a look. I've written up what I think is a minimally working example for this. It's almost there in the sense of distributing the correct number of indices to the ranks to match the reduced matrix, but it's the wrong indices. Would it be okay if I sent you the code to have a look?
>>>
>>> Kind regards,
>>> Carl-Johan
>>>
>>> From: Barry Smith
>>> Sent: Sunday, April 16, 2023 10:31 PM
>>> To: Carl-Johan Thore
>>> Cc: petsc-users at mcs.anl.gov
>>> Subject: Re: [petsc-users] Fieldsplit with redistribute
>>>
>>> The manual page for ISEmbed is incomprehensible to me. Anyway, no matter what, you need to know which degrees of freedom are removed by PCREDISTRIBUTE in order to produce the reduced IS, which is why I think you need information only available inside PCSetUp_Redistribute(). (Sorry, it is PCSetUp_Redistribute(), not PCApply_Redistribute())
>>>
>>> Barry
>>>
>>> [...]
From knepley at gmail.com Mon Apr 24 11:56:39 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 24 Apr 2023 12:56:39 -0400
Subject: [petsc-users] Setting up a matrix for Lagrange multiplier
In-Reply-To:
References:
Message-ID:

On Mon, Apr 24, 2023 at 10:22 AM Karthikeyan Chockalingam - STFC UKRI wrote:

> Hello,
>
> I was able to construct the below K matrix (using submatrices P and P^T), which is of type MATAIJ
>
> K = [A P^T
>      P 0]
>
> and solved it using a direct solver.

I modified your example to create either AIJ or Nest matrices, but use the same assembly code:

https://gitlab.com/petsc/petsc/-/merge_requests/6368

> However, I was reading online that this is a saddle point problem and I should be employing PCFIELDSPLIT.
> Since I have one monolithic matrix K, I was not sure how to split the fields.

With this particular matrix, you can use

  -pc_fieldsplit_detect_saddle_point

and it will split it automatically.

  Thanks,
     Matt

> Best regards,
> Karthik.
>
> [...]

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From karthikeyan.chockalingam at stfc.ac.uk Mon Apr 24 12:37:04 2023
From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI)
Date: Mon, 24 Apr 2023 17:37:04 +0000
Subject: [petsc-users] Setting up a matrix for Lagrange multiplier
In-Reply-To:
References:
Message-ID:

Great to know there is an example now.

Yes, -pc_fieldsplit_detect_saddle_point worked.

But I would like to understand what I did wrong (and why it didn't work).
After reading many posts, I learned I needed to use PCFieldSplitSetIS to split the fields. I set the first N indices to the field phi and the next C indices (starting from N) to the field lambda. Please tell me what I did wrong and how I can fix the lines which are now commented?

PetscErrorCode ierr;
KSP ksp;
PC pc;

KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetType(ksp, KSPGMRES);
KSPSetOperators(ksp, A[level], A[level]);
ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr);

.....

for (int i = 0; i < N; i++)
  vec_zero_field_index[i] = i;

PetscInt *vec_first_field_index = NULL;
PetscMalloc(n * sizeof(PetscInt), &vec_first_field_index); /* note: allocates n entries, while the loop below fills C */

for (int i = 0; i < C; i++)
  vec_first_field_index[i] = N + i;

IS zero_field_isx, first_field_isx;
CHKERRQ(ISCreateGeneral(PETSC_COMM_WORLD, N, vec_zero_field_index, PETSC_COPY_VALUES, &zero_field_isx));
CHKERRQ(ISCreateGeneral(PETSC_COMM_WORLD, C, vec_first_field_index, PETSC_COPY_VALUES, &first_field_isx));

ierr = PCFieldSplitSetIS(pc,"0",zero_field_isx); CHKERRQ(ierr);
ierr = PCFieldSplitSetIS(pc,"1",first_field_isx); CHKERRQ(ierr);

ierr = PetscOptionsSetValue(NULL,"-ksp_type", "fgmres"); CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL,"-pc_type", "fieldsplit"); CHKERRQ(ierr);

/*
ierr = PetscOptionsSetValue(NULL,"-fieldsplit_0_ksp_type", "gmres"); CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL,"-fieldsplit_0_pc_type", "jacobi"); CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL,"-fieldsplit_1_ksp_type", "preonly"); CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL,"-fieldsplit_1_pc_type", "lu"); CHKERRQ(ierr);
*/

ierr = PetscOptionsSetValue(NULL, "-pc_fieldsplit_detect_saddle_point", NULL);CHKERRQ(ierr);
ierr = PetscOptionsSetValue(NULL, "-options_left", NULL);CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
KSPSolve(ksp, b, x);

Best,
Karthik.

From: Matthew Knepley
Date: Monday, 24 April 2023 at 17:57
To: Chockalingam, Karthikeyan (STFC,DL,HC)
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier

[...]
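The ISes above are built with the full global index ranges 0..N-1 and N..N+C-1 on every rank, which is only correct on one process. A sketch of an owner-computes variant for parallel runs, assuming the monolithic matrix (A[level] above) numbers all N phi dofs first and the C lambda dofs last; the variable names are illustrative:

  /* Sketch: each rank puts only the global rows it owns into the split
     ISes, instead of the full 0..N+C-1 ranges. */
  PetscInt rstart, rend, i, nphi = 0, nlam = 0, *iphi, *ilam;
  IS       is_phi, is_lambda;

  PetscCall(MatGetOwnershipRange(A[level], &rstart, &rend));
  PetscCall(PetscMalloc1(rend - rstart, &iphi));
  PetscCall(PetscMalloc1(rend - rstart, &ilam));
  for (i = rstart; i < rend; i++) {
    if (i < N) iphi[nphi++] = i;   /* global rows 0..N-1 are phi      */
    else       ilam[nlam++] = i;   /* global rows N..N+C-1 are lambda */
  }
  PetscCall(ISCreateGeneral(PETSC_COMM_WORLD, nphi, iphi, PETSC_OWN_POINTER, &is_phi));
  PetscCall(ISCreateGeneral(PETSC_COMM_WORLD, nlam, ilam, PETSC_OWN_POINTER, &is_lambda));
  PetscCall(PCFieldSplitSetIS(pc, NULL, is_phi));
  PetscCall(PCFieldSplitSetIS(pc, NULL, is_lambda));
  PetscCall(ISDestroy(&is_phi));
  PetscCall(ISDestroy(&is_lambda));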
From knepley at gmail.com Mon Apr 24 12:41:50 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 24 Apr 2023 13:41:50 -0400
Subject: [petsc-users] Setting up a matrix for Lagrange multiplier
In-Reply-To:
References:
Message-ID:

On Mon, Apr 24, 2023 at 1:37 PM Karthikeyan Chockalingam - STFC UKRI wrote:

> Great to know there is an example now.
> Yes, -pc_fieldsplit_detect_saddle_point worked.
>
> But I would like to understand what I did wrong (and why it didn't work).

Can you show the error? Or give something self-contained that I can run.

> After reading many posts, I learned I needed to use PCFieldSplitSetIS to split the fields.

Yes. I would give NULL for the name here.

> I set the first N indices to the field phi and the next C indices (starting from N) to the field lambda.

This looks fine. However, these are global indices, so this would not work in parallel. You would have to offset by the first index on this process.

  Thanks,
     Matt

> Please tell me what I did wrong and how I can fix the lines which are now commented?
>
> [...]

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

From karthikeyan.chockalingam at stfc.ac.uk Mon Apr 24 12:58:16 2023
From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI)
Date: Mon, 24 Apr 2023 17:58:16 +0000
Subject: [petsc-users] Setting up a matrix for Lagrange multiplier
In-Reply-To:
References:
Message-ID:

Changing the names to NULL produced the same error. Don't I have to give the fields a name?

The problem solution didn't change from one time step to the next.

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: Unsupported viewer gmres
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-fieldsplit_1_ksp_type value: preonly source: code
[0]PETSC ERROR: Option left: name:-fieldsplit_1_ksp_view value: gmres source: code
[0]PETSC ERROR: Option left: name:-fieldsplit_1_pc_type value: lu source: code
[0]PETSC ERROR: Option left: name:-options_left (no value) source: code
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.4-529-g995ec06f92 GIT Date: 2023-02-03 18:41:48 +0000
[0]PETSC ERROR: /Users/karthikeyan.chockalingam/AMReX/amrFEM/build/Debug/amrFEM on a named HC20210312 by karthikeyan.chockalingam Mon Apr 24 18:51:25 2023
[0]PETSC ERROR: Configure options --with-debugging=0 --prefix=/Users/karthikeyan.chockalingam/AMReX/petsc --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes --with-hypre-dir=/Users/karthikeyan.chockalingam/AMReX/hypre/src/hypre
[0]PETSC ERROR: #1 PetscOptionsGetViewer() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/sys/classes/viewer/interface/viewreg.c:309
[0]PETSC ERROR: #2 KSPSetFromOptions() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itcl.c:522
[0]PETSC ERROR: #3 PCSetUp_FieldSplit() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1054
[0]PETSC ERROR: #4 PCSetUp() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/pc/interface/precon.c:994
[0]PETSC ERROR: #5 KSPSetUp() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:405
[0]PETSC ERROR: #6 KSPSolve_Private() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:824
[0]PETSC ERROR: #7 KSPSolve() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:1070
End of program
solve time 0.03416619 seconds

It will be a bit difficult for me to produce a stand-alone code.

Thank you for your help.

Best,
Karthik.

From: Matthew Knepley
Date: Monday, 24 April 2023 at 18:42
To: Chockalingam, Karthikeyan (STFC,DL,HC)
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier

[...]

From knepley at gmail.com Mon Apr 24 13:24:19 2023
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 24 Apr 2023 14:24:19 -0400
Subject: [petsc-users] Setting up a matrix for Lagrange multiplier
In-Reply-To:
References:
Message-ID:

On Mon, Apr 24, 2023 at 1:58 PM Karthikeyan Chockalingam - STFC UKRI wrote:

> Changing the names to NULL produced the same error.

This is just to make the code simpler.

> Don't I have to give the fields a name?

They get the default name, which is the numbering you used.

> The problem solution didn't change from one time step to the next.
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type
> [0]PETSC ERROR: Unsupported viewer gmres

You are not completely catching the error, because it should print out the entire options database on termination. Here it says you gave "gmres" as a Viewer, which is incorrect, but I cannot see all the options you used.

  Thanks,
     Matt

> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
> [0]PETSC ERROR: Option left: name:-fieldsplit_1_ksp_type value: preonly source: code
> [0]PETSC ERROR: Option left: name:-fieldsplit_1_ksp_view value: gmres source: code
> [0]PETSC ERROR: Option left: name:-fieldsplit_1_pc_type value: lu source: code
> [0]PETSC ERROR: Option left: name:-options_left (no value) source: code
>
> [...]
> > > > > > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Mon Apr 24 13:26:47 2023 From: bsmith at petsc.dev (Barry Smith) Date: Mon, 24 Apr 2023 14:26:47 -0400 Subject: [petsc-users] Fieldsplit with redistribute In-Reply-To: References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> <0DB90ED8-6BD0-431E-B592-37B89DE3DE28@petsc.dev> <88F183A6-6553-4288-89A5-DD1903EA9956@petsc.dev> Message-ID: PCREDISTRIBUTE looks like it is not yet GPU friendly. This needs to be fixed, but it should be a separate fix and MR from my current one. Please just check if the PCREDISTRIBUTE followed by PCFIELDSPLIT just works on CPUs for your code. Barry > On Apr 24, 2023, at 2:23 PM, Carl-Johan Thore wrote: > > I wasn?t sure if I was going to bother you again with this, but since it looks like you plan to merge this with the > main branch (?) I thought it might be interesting to know that I?ve tried this with my code running with CUDA but got the > attached error. I suspect it?s related to the RHS red->b but I?m not sure. If I switch of redistribute my code runs fine > with CUDA. > > Kind regards, > Carl-Johan > > > > From: Carl-Johan Thore > Sent: Monday, April 24, 2023 5:08 PM > To: Barry Smith > > Subject: RE: [petsc-users] Fieldsplit with redistribute > > Ok, that worked great with my code on 1 core! (I haven?t been able to try the multi-core case yet due to issues with my own code > mentioned below) > > I?m not sure if you forgot to remove the freeing of the map object > outside or if I messed up with the pull somehow, but I had to outcomment that line manually: > ? > > /Carl-Johan > > > > From: Barry Smith > > Sent: Monday, April 24, 2023 4:26 PM > To: Carl-Johan Thore > > Cc: PETSc > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > The bug was mine; I was freeing the map object outside of the if () instead of inside. You can do > > git pull > make all > > and then try again. > > Barry > > > > On Apr 24, 2023, at 5:39 AM, Carl-Johan Thore > wrote: > > Hi Barry! > > First of all, thank you very very much for this! I was expecting maybe a few hints and pointers on how to proceed > with my work, but then you did a complete implementation ? > > Your code ran fine with ex84.cc . 
Unfortunately it crashed when running on my main code (a mixed Stokes solver).
> When running on 1 core I get a crash which may be related to your code, so I've attached the error message for that
> case. However, on multiple cores I think the issue is mainly that I'm not constructing the original IS correctly, so I'll
> look into that myself.
>
> Regarding reporting to https://gitlab.com/petsc/petsc/-/merge_requests/6366, should it be done there?
>
> By the way, I managed yesterday to make a working implementation of my own example and was planning to send it
> after cleaning it up and maybe optimizing a bit. I've attached it if you're curious (or just want to have a good laugh :))
>
> Kind regards,
> Carl-Johan
>
> From: Barry Smith
> Sent: Monday, April 24, 2023 12:49 AM
> To: Carl-Johan Thore
> Cc: PETSc
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>
> I have added support for PCREDISTRIBUTE to propagate your PCFieldSplitSetIS() down to an inner PCFIELDSPLIT. You can
> access it with
>
> git fetch
> git checkout barry/2023-04-22/fieldsplit-fields-propogate
> ./configure
> make all check
>
> Take a look at src/ksp/ksp/tutorials/ex84.c and run with the options at the bottom of the file.
>
> Please let us know at https://gitlab.com/petsc/petsc/-/merge_requests/6366 if it works for you or you have any difficulties.
>
>   Barry
>
> On Apr 20, 2023, at 10:14 AM, Carl-Johan Thore wrote:
>
> Great, thanks! I've attached the code, a makefile, and a 1-page PowerPoint which hopefully explains
> what I'm trying to do on this little toy problem. There is obviously (?) something I need to add around
> line 327 in the code in order to move the indices to the correct rank.
>
> Output should be something like this when running (screenshot not shown).
>
> Let me know if you need any more info, or if the code is incomprehensible or so
> (it's long because I've copied a lot from redistribute.c)
>
> Kind regards,
> Carl-Johan
>
> From: Barry Smith
> Sent: Thursday, April 20, 2023 3:17 PM
> To: Carl-Johan Thore
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>
>   Sure
>
> On Apr 20, 2023, at 4:09 AM, Carl-Johan Thore wrote:
>
> Hi Barry,
>
> In the conversation below you mentioned that I could send code to you to take a look. I've written
> up what I think is a minimally working example for this. It's almost there, in the sense of distributing
> the correct number of indices to the ranks to match the reduced matrix, but it's the wrong indices.
> Would it be okay if I sent you the code to have a look?
>
> Kind regards,
> Carl-Johan
>
> From: Barry Smith
> Sent: Sunday, April 16, 2023 10:31 PM
> To: Carl-Johan Thore
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>
> The manual page for ISEmbed is incomprehensible to me. Anyways, no matter what, you need to know which degrees of freedom are removed by PCREDISTRIBUTE in order to produce the reduced IS, which is why I think you need information only available inside PCSetUp_Redistribute(). (Sorry, it is PCSetUp_Redistribute(), not PCApply_Redistribute())
>
>   Barry
>
> On Apr 16, 2023, at 3:36 PM, Carl-Johan Thore wrote:
>
> Thanks for the quick reply Barry!
> I have not tried the version with PCApply_Redistribute that you suggest, but I have a code that does roughly what you describe. It works when running on one rank, but fails on multiple ranks.
I suspect the issue is with the use of ISEmbed since, quoting the PETSc manual, "the resulting IS is sequential, since the index substitution it encodes is purely local" (admittedly I don't fully understand what that means). If you think using ISEmbed is not a good idea, I'll try PCApply_Redistribute()

> From: Barry Smith
> Sent: 16 April 2023 21:11:18
> To: Carl-Johan Thore
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>
> There is no code to do this currently.
>
> I would start by building your IS for each split before the PCREDISTRIBUTE, and then adding to the PCApply_Redistribute() code something that "fixes" these IS by removing the entries of the IS associated with removed degrees of freedom and then shifting the remaining indices to take the removed ones into account. But you have probably already been trying this? It does require digging directly into PCApply_Redistribute() to get the needed information (which degrees of freedom are removed by the redistribute code), plus it requires shifting the MPI rank ownership of the entries of the IS in the same way the MPI rank ownership of the degrees of freedom of the vector is moved.
>
> If you have some code that you think should be doing this but doesn't work, feel free to send it to us and we may be able to fix it.
>
>   Barry
>
> On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users wrote:
>
> Hello,
>
> I'm solving a block system
>
> [A C;
>  C' D],
>
> where D is not zero, using the PCFIELDSPLIT preconditioner, setting the split with PCFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS routines). Can PETSc do this automatically? Or else, any hints?
>
> Kind regards,
> Carl-Johan
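As background to the thread above: without the new branch, a solver nested inside PCREDISTRIBUTE is configured through its redistribute_ options prefix; what the merge request adds is the propagation of PCFieldSplitSetIS() index sets down to that inner PC. A hedged sketch of the resulting option names (the fieldsplit sub-option shown last is illustrative, not taken from ex84.c):

    -ksp_type preonly -pc_type redistribute
    -redistribute_ksp_type gmres
    -redistribute_pc_type fieldsplit
    -redistribute_pc_fieldsplit_type schur

-------------- next part --------------
An HTML attachment was scrubbed...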
URL: From liufield at gmail.com Mon Apr 24 15:00:37 2023 From: liufield at gmail.com (neil liu) Date: Mon, 24 Apr 2023 16:00:37 -0400 Subject: [petsc-users] Fwd: Inquiry about the dual space (PetscFECreateTabulation_Basic) In-Reply-To: References: Message-ID: I try to find the source code, that transforms the scalar basis <1 x y> to a vectors basis / 1 \ / 0 \ / x \ / 0 \ / y \ / 0 \ \ 0 / \ 1 / \ 0 / \ x / \ 0 / \ y / It seems it is processed by line 856 https://gitlab.com/petsc/petsc/-/blob/main/include/petsc/private/petscimpl.h Could you please direct me to the exact location where the source code has been defined to do the transformation? On Fri, Apr 21, 2023 at 12:37?PM neil liu wrote: > Thanks a lot. Very helpful. > > On Fri, Apr 21, 2023 at 10:57?AM Matthew Knepley > wrote: > >> On Fri, Apr 21, 2023 at 10:36?AM neil liu wrote: >> >>> When you say "For multicomponent spaces, we currently do not represent >>> it as a tensor product over the scalar space, so we see 6 basis vectors." >>> Here, muticomponent = two dimensional ? >>> >> >> If you have a vector in a two-dimensional space, it has 2 components, >> like our coordinate vector. >> >> >>> I am a little confused about the dimensions of the basis functions here. >>> From >>> https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC >>> >>> 144: /* B[npoints, nodes, Nc] = tmpB[npoints, prime, Nc] * invV[prime, nodes] */ >>> >>> How do you define tmpB here (npoints =3, prime =6, Nc =2)? I can get tmpB from >>> >>> PetscSpaceEvaluate_Polynomial, where, tmpB (1x9) is (the prime polynomial is defined by 1 x y)) >>> >>> [ 1 -0.6667 -0.6667 1 -0.6667 0.3333 1 0.3333 -0.6666]. How do you transform from this 1x9 to 3x6x2 there. >>> >>> >> npoints is the number of quadrature points at which to evaluate >> >> nodes (pdim) is the number of functions in the space >> >> Nc is the number of components for each function. >> >> So a P1 basis for vectors looks like >> >> / 1 \ / 0 \ / x \ / 0 \ / y \ / 0 \ >> \ 0 / \ 1 / \ 0 / \ x / \ 0 / \ y / >> >> six vectors with 2 components each. >> >> Thanks, >> >> Matt >> >>> Thanks, >>> >>> Xiaodong >>> >>> >>> >>> >>> >>> >>> On Fri, Apr 21, 2023 at 10:05?AM Matthew Knepley >>> wrote: >>> >>>> On Fri, Apr 21, 2023 at 10:02?AM neil liu wrote: >>>> >>>>> Hello, Petsc group, >>>>> >>>>> I am learning the FE structure in Petsc by running case >>>>> https://petsc.org/main/src/snes/tutorials/ex12.c.html with -run_type >>>>> test -bc_type dirichlet -dm_plex_interpolate 0 -petscspace_degree 1 >>>>> -show_initial -dm_plex_print_fem 1 >>>>> >>>> >>>> -dm_plex_print_fem 5 will print much more >>>> >>>> >>>>> When I check the subroutine PetscFECreateTabulation_Basic, I can not >>>>> understand some parameters there. >>>>> >>>>> For the following lines in the file ( >>>>> https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC >>>>> ) >>>>> >>>>> 135: PetscCall (PetscDualSpaceGetDimension (fem->dualSpace, &pdim));136: PetscCall (PetscFEGetNumComponents (fem, &Nc)); >>>>> >>>>> Here, Nc = 2, pdim =6. I am running a scalar case with degree of 1, >>>>> >>>>> I expect Nc = 1 and pdim =3. Could you please explain this? In addition, >>>>> >>>>> Sure. I am guessing that you are looking at the tabulation for the >>>> coordinate space. Here you are in 2 dimensions, so the >>>> coordinate space has Nc = 2. For multicomponent spaces, we currently do >>>> not represent it as a tensor product over the >>>> scalar space, so we see 6 basis vectors. 
>>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>>> Thanks, >>>>> >>>>> Xiaodong >>>>> >>>>> >>>>> >>>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>>> https://www.cse.buffalo.edu/~knepley/ >>>> >>>> >>> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Apr 24 19:12:06 2023 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 24 Apr 2023 20:12:06 -0400 Subject: [petsc-users] Fwd: Inquiry about the dual space (PetscFECreateTabulation_Basic) In-Reply-To: References: Message-ID: On Mon, Apr 24, 2023 at 4:00?PM neil liu wrote: > I try to find the source code, that transforms the scalar basis <1 x y> to > a vectors basis > > / 1 \ / 0 \ / x \ / 0 \ / y \ / 0 \ > \ 0 / \ 1 / \ 0 / \ x / \ 0 / \ y / > > It seems it is processed by line 856 > > https://gitlab.com/petsc/petsc/-/blob/main/include/petsc/private/petscimpl.h > > Could you please direct me to the exact location where the source code has > been defined to do the transformation? > We do not represent those functions explicitly. The only action of a PetscSpace object is to evaluate the basis functions at a set of points. For polynomial spaces, we do not use monomials, like 1, x, y, but Legendre polynomials. For multiple components, we use the tensor product of the scalar basis, so we replicate evaluations in the right places. This code is all in the PetscSpaceEvaluate() function: https://petsc.org/main/manualpages/SPACE/PetscSpaceEvaluate/ You can see pointers to the implementations at the bottom of that page. Thanks, Matt > On Fri, Apr 21, 2023 at 12:37?PM neil liu wrote: > >> Thanks a lot. Very helpful. >> >> On Fri, Apr 21, 2023 at 10:57?AM Matthew Knepley >> wrote: >> >>> On Fri, Apr 21, 2023 at 10:36?AM neil liu wrote: >>> >>>> When you say "For multicomponent spaces, we currently do not represent >>>> it as a tensor product over the scalar space, so we see 6 basis vectors." >>>> Here, muticomponent = two dimensional ? >>>> >>> >>> If you have a vector in a two-dimensional space, it has 2 components, >>> like our coordinate vector. >>> >>> >>>> I am a little confused about the dimensions of the basis functions >>>> here. From >>>> https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC >>>> >>>> 144: /* B[npoints, nodes, Nc] = tmpB[npoints, prime, Nc] * invV[prime, nodes] */ >>>> >>>> How do you define tmpB here (npoints =3, prime =6, Nc =2)? I can get tmpB from >>>> >>>> PetscSpaceEvaluate_Polynomial, where, tmpB (1x9) is (the prime polynomial is defined by 1 x y)) >>>> >>>> [ 1 -0.6667 -0.6667 1 -0.6667 0.3333 1 0.3333 -0.6666]. How do you transform from this 1x9 to 3x6x2 there. >>>> >>>> >>> npoints is the number of quadrature points at which to evaluate >>> >>> nodes (pdim) is the number of functions in the space >>> >>> Nc is the number of components for each function. >>> >>> So a P1 basis for vectors looks like >>> >>> / 1 \ / 0 \ / x \ / 0 \ / y \ / 0 \ >>> \ 0 / \ 1 / \ 0 / \ x / \ 0 / \ y / >>> >>> six vectors with 2 components each. 
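A minimal sketch of exercising this tabulation directly through PetscSpaceEvaluate(), as discussed above: a degree-1 polynomial space in 2 variables with 2 components, so pdim = 6, matching the six basis vectors. B is laid out as [npoints][pdim][Nc]; this is a sketch, not taken from the thread:

    PetscSpace sp;
    PetscInt   pdim, Nc = 2, dim = 2, npoints = 1;
    PetscReal  point[2] = {-0.5, -0.5};
    PetscReal *B;

    PetscCall(PetscSpaceCreate(PETSC_COMM_SELF, &sp));
    PetscCall(PetscSpaceSetType(sp, PETSCSPACEPOLYNOMIAL));
    PetscCall(PetscSpaceSetNumVariables(sp, dim));
    PetscCall(PetscSpaceSetNumComponents(sp, Nc));
    PetscCall(PetscSpaceSetDegree(sp, 1, PETSC_DETERMINE));
    PetscCall(PetscSpaceSetUp(sp));
    PetscCall(PetscSpaceGetDimension(sp, &pdim));                     /* 6 here */
    PetscCall(PetscMalloc1(npoints * pdim * Nc, &B));
    PetscCall(PetscSpaceEvaluate(sp, npoints, point, B, NULL, NULL)); /* values only, no derivatives */
    /* ... inspect B ... */
    PetscCall(PetscFree(B));
    PetscCall(PetscSpaceDestroy(&sp));

Note that, as explained above, the tabulated values come from Legendre-type polynomials, not the raw monomials 1, x, y.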
>>> >>> Thanks, >>> >>> Matt >>> >>>> Thanks, >>>> >>>> Xiaodong >>>> >>>> >>>> >>>> >>>> >>>> >>>> On Fri, Apr 21, 2023 at 10:05?AM Matthew Knepley >>>> wrote: >>>> >>>>> On Fri, Apr 21, 2023 at 10:02?AM neil liu wrote: >>>>> >>>>>> Hello, Petsc group, >>>>>> >>>>>> I am learning the FE structure in Petsc by running case >>>>>> https://petsc.org/main/src/snes/tutorials/ex12.c.html with -run_type >>>>>> test -bc_type dirichlet -dm_plex_interpolate 0 -petscspace_degree 1 >>>>>> -show_initial -dm_plex_print_fem 1 >>>>>> >>>>> >>>>> -dm_plex_print_fem 5 will print much more >>>>> >>>>> >>>>>> When I check the subroutine PetscFECreateTabulation_Basic, I can not >>>>>> understand some parameters there. >>>>>> >>>>>> For the following lines in the file ( >>>>>> https://petsc.org/release//src/dm/dt/fe/impls/basic/febasic.c.html#PETSCFEBASIC >>>>>> ) >>>>>> >>>>>> 135: PetscCall (PetscDualSpaceGetDimension (fem->dualSpace, &pdim));136: PetscCall (PetscFEGetNumComponents (fem, &Nc)); >>>>>> >>>>>> Here, Nc = 2, pdim =6. I am running a scalar case with degree of 1, >>>>>> >>>>>> I expect Nc = 1 and pdim =3. Could you please explain this? In addition, >>>>>> >>>>>> Sure. I am guessing that you are looking at the tabulation for the >>>>> coordinate space. Here you are in 2 dimensions, so the >>>>> coordinate space has Nc = 2. For multicomponent spaces, we currently >>>>> do not represent it as a tensor product over the >>>>> scalar space, so we see 6 basis vectors. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>>> Thanks, >>>>>> >>>>>> Xiaodong >>>>>> >>>>>> >>>>>> >>>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>>> https://www.cse.buffalo.edu/~knepley/ >>>>> >>>>> >>>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> https://www.cse.buffalo.edu/~knepley/ >>> >>> >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ksl7912 at snu.ac.kr Mon Apr 24 22:27:57 2023 From: ksl7912 at snu.ac.kr (=?UTF-8?B?wq3qtozsirnrpqwgLyDtlZnsg50gLyDtla3qs7XsmrDso7zqs7XtlZnqs7w=?=) Date: Tue, 25 Apr 2023 12:27:57 +0900 Subject: [petsc-users] Question about linking LAPACK library Message-ID: Dear all Hello. I want to make an inverse matrix like inv(A) in MATLAB. Are there some methods to inverse matrix in petsc? If not, I want to use the inverse function in the LAPACK library. Then, how to use the LAPACK library in petsc? I use the C language. Best, Seung Lee Kwon -- Seung Lee Kwon, Ph.D.Candidate Aerospace Structures and Materials Laboratory Department of Mechanical and Aerospace Engineering Seoul National University Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 E-mail : ksl7912 at snu.ac.kr Office : +82-2-880-7389 C. P : +82-10-4695-1062 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at petsc.dev Mon Apr 24 22:30:58 2023 From: bsmith at petsc.dev (Barry Smith) Date: Mon, 24 Apr 2023 23:30:58 -0400 Subject: [petsc-users] Question about linking LAPACK library In-Reply-To: References: Message-ID: <2AD13DB8-4023-45CF-8D8D-F2DB320702F8@petsc.dev> How large are the dense matrices you would like to invert? > On Apr 24, 2023, at 11:27 PM, ???? / ?? / ??????? wrote: > > Dear all > > Hello. > I want to make an inverse matrix like inv(A) in MATLAB. > > Are there some methods to inverse matrix in petsc? > > If not, I want to use the inverse function in the LAPACK library. > > Then, how to use the LAPACK library in petsc? I use the C language. > > Best, > > Seung Lee Kwon > > -- > Seung Lee Kwon, Ph.D.Candidate > Aerospace Structures and Materials Laboratory > Department of Mechanical and Aerospace Engineering > Seoul National University > Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > E-mail : ksl7912 at snu.ac.kr > Office : +82-2-880-7389 > C. P : +82-10-4695-1062 -------------- next part -------------- An HTML attachment was scrubbed... URL: From ksl7912 at snu.ac.kr Mon Apr 24 22:46:42 2023 From: ksl7912 at snu.ac.kr (=?UTF-8?B?wq3qtozsirnrpqwgLyDtlZnsg50gLyDtla3qs7XsmrDso7zqs7XtlZnqs7w=?=) Date: Tue, 25 Apr 2023 12:46:42 +0900 Subject: [petsc-users] Question about linking LAPACK library In-Reply-To: <2AD13DB8-4023-45CF-8D8D-F2DB320702F8@petsc.dev> References: <2AD13DB8-4023-45CF-8D8D-F2DB320702F8@petsc.dev> Message-ID: Dear all It depends on the problem. It can have hundreds of thousands of degrees of freedom. best, Seung Lee Kwon 2023? 4? 25? (?) ?? 12:32, Barry Smith ?? ??: > > How large are the dense matrices you would like to invert? > > On Apr 24, 2023, at 11:27 PM, ???? / ?? / ??????? > wrote: > > Dear all > > Hello. > I want to make an inverse matrix like inv(A) in MATLAB. > > Are there some methods to inverse matrix in petsc? > > If not, I want to use the inverse function in the LAPACK library. > > Then, how to use the LAPACK library in petsc? I use the C language. > > Best, > > Seung Lee Kwon > > -- > Seung Lee Kwon, Ph.D.Candidate > Aerospace Structures and Materials Laboratory > Department of Mechanical and Aerospace Engineering > Seoul National University > Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > E-mail : ksl7912 at snu.ac.kr > Office : +82-2-880-7389 > C. P : +82-10-4695-1062 > > > -- Seung Lee Kwon, Ph.D.Candidate Aerospace Structures and Materials Laboratory Department of Mechanical and Aerospace Engineering Seoul National University Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 E-mail : ksl7912 at snu.ac.kr Office : +82-2-880-7389 C. P : +82-10-4695-1062 -------------- next part -------------- An HTML attachment was scrubbed... URL: From karthikeyan.chockalingam at stfc.ac.uk Tue Apr 25 04:38:51 2023 From: karthikeyan.chockalingam at stfc.ac.uk (Karthikeyan Chockalingam - STFC UKRI) Date: Tue, 25 Apr 2023 09:38:51 +0000 Subject: [petsc-users] Setting up a matrix for Lagrange multiplier In-Reply-To: References: Message-ID: Thank you for your response. I will come back to the error later (for now I have it working with -pc_fieldsplit_detect_saddle_point.) I would like to understand how I need to index while running in parallel. (Q1) Index set (IS) for the ISLocalToGlobalMapping. 
for (int i = 0; i < N; i++)
  vec_col_index[i] = i;

for (int i = 0; i < C; i++)
  vec_shifted_row_index[i] = i + N;

ISLocalToGlobalMapping phi_mapping, lamda_mapping;

ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, C, vec_shifted_row_index, PETSC_COPY_VALUES, &lamda_mapping);
ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, N, vec_col_index, PETSC_COPY_VALUES, &phi_mapping);

(i) I believe the above index set is created by all parallel processes; why then would I have to offset it with the first index of this process? Since the mapping will always hold true, and we would only use a subset of this index set while populating the submatrices, I can see that I might have an over-allocated index set, but I can't see why it would be wrong.

(ii) What would be the first index of this process anyway, since at this point I haven't declared any parallel sub-matrix or sub-vector?

(Q2) Index set (IS) while creating the submatrix

for (int i = 0; i < C; i++)
  vec_row_index[i] = i;

Mat SubP;
IS row_isx, col_isx;

ISCreateGeneral(PETSC_COMM_WORLD, C, vec_row_index, PETSC_COPY_VALUES, &row_isx);
ISCreateGeneral(PETSC_COMM_WORLD, N, vec_col_index, PETSC_COPY_VALUES, &col_isx);

MatSetOption(A[level], MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE);

MatSetLocalToGlobalMapping(A, lamda_mapping, phi_mapping);

MatGetLocalSubMatrix(A, row_isx, col_isx, &SubP);

I believe I will have to request a submatrix SubP of global size C x N on all processes, right? Or do you think I can still break row_isx and col_isx into parallel parts?

(Q3) Populating the submatrix

const PetscInt *nindices;
PetscInt count;
ISGetIndices(isout[level], &nindices);
// isout is the global index of the nodes to be constrained.

PetscScalar val = 1;

for (int irow = 0; irow < n; irow++)
{
  int icol = nindices[irow];
  MatSetValuesLocal(SubP, 1, &irow, 1, &icol, &val, ADD_VALUES);
}

I do think that if I have the MatGetOwnershipRange of SubP, I will be able to parallelize the loop. Is MatGetOwnershipRange also available for submatrices?

If I have to parallelize the index sets, maybe I have to reorder the above steps; can you please make a suggestion?

Kind regards,
Karthik.

From: Matthew Knepley
Date: Monday, 24 April 2023 at 19:24
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk>
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier

On Mon, Apr 24, 2023 at 1:58 PM Karthikeyan Chockalingam - STFC UKRI <karthikeyan.chockalingam at stfc.ac.uk> wrote:

> Changing the names to NULL produced the same error.

This is just to make the code simpler.

> Don't I have to give the fields a name?

They get the default name, which is the numbering you used.

> The problem solution didn't change from one time step to the next.
>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type
> [0]PETSC ERROR: Unsupported viewer gmres

You are not completely catching the error, because it should print out the entire options database on termination.

Here it says you gave "gmres" as a Viewer, which is incorrect, but I cannot see all the options you used.

  Thanks,

     Matt

> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-fieldsplit_1_ksp_type value: preonly source: code [0]PETSC ERROR: Option left: name:-fieldsplit_1_ksp_view value: gmres source: code [0]PETSC ERROR: Option left: name:-fieldsplit_1_pc_type value: lu source: code [0]PETSC ERROR: Option left: name:-options_left (no value) source: code [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [0]PETSC ERROR: Petsc Development GIT revision: v3.18.4-529-g995ec06f92 GIT Date: 2023-02-03 18:41:48 +0000 [0]PETSC ERROR: /Users/karthikeyan.chockalingam/AMReX/amrFEM/build/Debug/amrFEM on a named HC20210312 by karthikeyan.chockalingam Mon Apr 24 18:51:25 2023 [0]PETSC ERROR: Configure options --with-debugging=0 --prefix=/Users/karthikeyan.chockalingam/AMReX/petsc --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes --with-hypre-dir=/Users/karthikeyan.chockalingam/AMReX/hypre/src/hypre [0]PETSC ERROR: #1 PetscOptionsGetViewer() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/sys/classes/viewer/interface/viewreg.c:309 [0]PETSC ERROR: #2 KSPSetFromOptions() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itcl.c:522 [0]PETSC ERROR: #3 PCSetUp_FieldSplit() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1054 [0]PETSC ERROR: #4 PCSetUp() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/pc/interface/precon.c:994 [0]PETSC ERROR: #5 KSPSetUp() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:405 [0]PETSC ERROR: #6 KSPSolve_Private() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:824 [0]PETSC ERROR: #7 KSPSolve() at /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:1070 End of program solve time 0.03416619 seconds It will be a bit difficult for me to produce a stand-alone code. Thank you for your help. Best, Karthik. From: Matthew Knepley > Date: Monday, 24 April 2023 at 18:42 To: Chockalingam, Karthikeyan (STFC,DL,HC) > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier On Mon, Apr 24, 2023 at 1:37?PM Karthikeyan Chockalingam - STFC UKRI > wrote: Great to know there is an example now. Yes, -pc_fieldsplit_defect_saddle_point worked. But I would like to understand what I did wrong (and why it didn?t work). Can you show the error? Or give something self-contained that I can run. After reading many posted posts, I needed to create PCFieldSplitSetIS to split the fields. Yes. I would give NULL for the name here. I set the first N indices to the field phi and the next C indices (starting from N) to the field lambda. This looks fine. However, these are global indices, so this would not work in parallel. You would have to offset by the first index on this process. Thanks, Matt Please tell me what I did wrong and how I can fix the lines which are now commented? PetscErrorCode ierr; KSP ksp; PC pc; KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetType(ksp, KSPGMRES); KSPSetOperators(ksp, A[level], A[level]); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr); ??.. 
for (int i = 0; i < N; i++) vec_zero_field_index[i] = i; PetscInt * vec_first_field_index = NULL; PetscMalloc(n * sizeof(PetscInt), &vec_first_field_index); for (int i = 0; i < C; i++) vec_first_field_index[i] = N + i; IS zero_field_isx, first_field_isx; CHKERRQ(ISCreateGeneral(PETSC_COMM_WORLD, N, vec_zero_field_index, PETSC_COPY_VALUES, &zero_field_isx)); CHKERRQ(ISCreateGeneral(PETSC_COMM_WORLD, C, vec_first_field_index, PETSC_COPY_VALUES, &first_field_isx)); ierr = PCFieldSplitSetIS(pc,"0",zero_field_isx); CHKERRQ(ierr); ierr = PCFieldSplitSetIS(pc,"1",first_field_isx); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL,"-ksp_type", "fgmres"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL,"-pc_type", "fieldsplit"); CHKERRQ(ierr); /* ierr = PetscOptionsSetValue(NULL,"-fieldsplit_0_ksp_type", "gmres"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL,"-fieldsplit_0_pc_type", "jacobi"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL,"-fieldsplit_1_ksp_type", "preonly"); CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL,"-fieldsplit_1_pc_type", "lu"); CHKERRQ(ierr);*/ ierr = PetscOptionsSetValue(NULL, "-pc_fieldsplit_detect_saddle_point", NULL);CHKERRQ(ierr); ierr = PetscOptionsSetValue(NULL, "-options_left", NULL);CHKERRQ(ierr); ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); KSPSolve(ksp, b, x); Best, Karthik. From: Matthew Knepley > Date: Monday, 24 April 2023 at 17:57 To: Chockalingam, Karthikeyan (STFC,DL,HC) > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier On Mon, Apr 24, 2023 at 10:22?AM Karthikeyan Chockalingam - STFC UKRI > wrote: Hello, I was able to construct the below K matrix (using submatrices P and P^T), which is of type MATAIJ K = [A P^T P 0] and solved them using a direct solver. I modified your example to create either AIJ or Nest matrices, but use the same assembly code: https://gitlab.com/petsc/petsc/-/merge_requests/6368 However, I was reading online that this is a saddle point problem and I should be employing PCFIELDSPLIT. Since I have one monolithic matrix K, I was not sure how to split the fields. With this particular matrix, you can use -pc_fieldsplit_detect_saddle_point and it will split it automatically. Thanks, Matt Best regards, Karthik. From: Chockalingam, Karthikeyan (STFC,DL,HC) > Date: Wednesday, 19 April 2023 at 17:52 To: Matthew Knepley > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier I have declared the mapping ISLocalToGlobalMapping mapping; ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, n, nindices, PETSC_COPY_VALUES, &mapping); But when I use MatSetValuesLocal(), how do I know the above mapping is employed because it is not one of the parameters passed to the function? Thank you. Kind regards, Karthik. From: Matthew Knepley > Date: Tuesday, 18 April 2023 at 16:21 To: Chockalingam, Karthikeyan (STFC,DL,HC) > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier On Tue, Apr 18, 2023 at 11:16?AM Karthikeyan Chockalingam - STFC UKRI > wrote: Thank you for your response. I spend some time understanding how MatSetValuesLocal and ISLocalToGlobalMappingCreate work. You can look at SNES ex28 where we do this with DMCOMPOSITE. Q1) Will the matrix K be of type MATMPIAIJ or MATIS? K = [A P^T P 0] I assume MPIAIJ since IS is only used for Neumann-Neumann decompositions. Q2) Can I use both MatSetValues() to MatSetValuesLocal() to populate K? Since I have already used MatSetValues() to construct A. 
You can, and there would be no changes in serial if K is exactly the upper left block, but in parallel global indices would change. Q3) What are the advantages of using MatSetValuesLocal()? Is it that I can construct P directly using local indies and map the entrees to the global index in K? You have a monolithic K, so that you can use sparse direct solvers to check things. THis is impossible with separate storage. Q4) I probably don?t have to construct an independent P matrix You wouldn't in this case. Thanks, Matt Best regards, Karthik. From: Matthew Knepley > Date: Tuesday, 18 April 2023 at 11:08 To: Chockalingam, Karthikeyan (STFC,DL,HC) > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Setting up a matrix for Lagrange multiplier On Tue, Apr 18, 2023 at 5:24?AM Karthikeyan Chockalingam - STFC UKRI via petsc-users > wrote: Hello, I'm solving a problem using the Lagrange multiplier, the matrix has the form K = [A P^T P 0] I am familiar with constructing K using MATMPIAIJ. However, I would like to know if had [A], can I augment it with [P], [P^T] and [0] of type MATMPIAIJ? Likewise for vectors as well. Can you please point me to the right resource, if it is a common operation in PETSc? You can do this at least 2 ways: 1) Assemble you submatrices directly into the larger matrix by constructing local-to-global maps for the emplacement. so that you do not change your assembly code, except to change MatSetValues() to MatSetValuesLocal(). This is usually preferable. 2) Use MATNEST and VecNEST to put pointers to submatrices and subvectors directly in. Thanks, Matt Many thanks. Kind regards, Karthik. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Apr 25 04:43:51 2023 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 25 Apr 2023 05:43:51 -0400 Subject: [petsc-users] Question about linking LAPACK library In-Reply-To: References: <2AD13DB8-4023-45CF-8D8D-F2DB320702F8@petsc.dev> Message-ID: On Mon, Apr 24, 2023 at 11:47?PM ???? / ?? / ??????? wrote: > Dear all > > It depends on the problem. It can have hundreds of thousands of degrees of > freedom. > Suppose your matrix was dense and had 1e6 dofs. The work to invert a matrix is O(N^3) with a small constant, so it would take 1e18 = 1 exaflop to invert this matrix and about 10 Terabytes of RAM to store it. Is this available to you? PETSc's supports Elemental and SCALAPACK for this kind of calculation. 
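Spelled out, the arithmetic behind these numbers is:

    storage:  N^2 * 8 bytes = (10^6)^2 * 8 = 8 * 10^12 bytes, about 8 TB for one dense matrix
    work:     one LU factorization costs ~ (2/3) N^3 ~ 6.7 * 10^17 flops, and forming the
              explicit inverse (factorization plus N triangular solve pairs) is roughly 2 N^3 = 2 * 10^18

which is where the exaflop and terabyte estimates above come from.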
If the system is sparse, you could invert it using MUMPS, SuperLU_dist, or Pardiso. Then the work and storage depend on the density. There are good estimates for connectivity based on regular grids of given dimension. The limiting resource here is usually memory, which motivates people to try iterative methods. The convergence of iterative methods depend on detailed properties of your system, like the operator spectrum. Thanks, Matt > best, > > Seung Lee Kwon > > 2023? 4? 25? (?) ?? 12:32, Barry Smith ?? ??: > >> >> How large are the dense matrices you would like to invert? >> >> On Apr 24, 2023, at 11:27 PM, ???? / ?? / ??????? >> wrote: >> >> Dear all >> >> Hello. >> I want to make an inverse matrix like inv(A) in MATLAB. >> >> Are there some methods to inverse matrix in petsc? >> >> If not, I want to use the inverse function in the LAPACK library. >> >> Then, how to use the LAPACK library in petsc? I use the C language. >> >> Best, >> >> Seung Lee Kwon >> >> -- >> Seung Lee Kwon, Ph.D.Candidate >> Aerospace Structures and Materials Laboratory >> Department of Mechanical and Aerospace Engineering >> Seoul National University >> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >> E-mail : ksl7912 at snu.ac.kr >> Office : +82-2-880-7389 >> C. P : +82-10-4695-1062 >> >> >> > > -- > Seung Lee Kwon, Ph.D.Candidate > Aerospace Structures and Materials Laboratory > Department of Mechanical and Aerospace Engineering > Seoul National University > Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > E-mail : ksl7912 at snu.ac.kr > Office : +82-2-880-7389 > C. P : +82-10-4695-1062 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Pierre.Jolivet at lip6.fr Tue Apr 25 04:52:20 2023 From: Pierre.Jolivet at lip6.fr (Pierre Jolivet) Date: Tue, 25 Apr 2023 11:52:20 +0200 Subject: [petsc-users] Question about linking LAPACK library In-Reply-To: References: <2AD13DB8-4023-45CF-8D8D-F2DB320702F8@petsc.dev> Message-ID: <9B4360CB-34CE-4032-A556-7BA6D596FA9B@lip6.fr> > On 25 Apr 2023, at 11:43 AM, Matthew Knepley wrote: > > On Mon, Apr 24, 2023 at 11:47?PM ???? / ?? / ??????? > wrote: >> Dear all >> >> It depends on the problem. It can have hundreds of thousands of degrees of freedom. > > Suppose your matrix was dense and had 1e6 dofs. The work to invert a matrix is O(N^3) with a small > constant, so it would take 1e18 = 1 exaflop to invert this matrix and about 10 Terabytes of RAM to store > it. Is this available to you? PETSc's supports Elemental and SCALAPACK for this kind of calculation. > > If the system is sparse, you could invert it using MUMPS, SuperLU_dist, or Pardiso. Then the work and > storage depend on the density. There are good estimates for connectivity based on regular grids of given > dimension. The limiting resource here is usually memory, which motivates people to try iterative methods. > The convergence of iterative methods depend on detailed properties of your system, like the operator spectrum. And to wrap this up, if your operator is truly dense, e.g., BEM or non-local discretizations, their are available hierarchical formats such as MatH2Opus and MatHtool. 
They have efficient matrix-vector product implementations such that you can solve linear systems without having to invert (or even store) the coefficient matrix explicitly. Thanks, Pierre > Thanks, > > Matt > >> best, >> >> Seung Lee Kwon >> >> 2023? 4? 25? (?) ?? 12:32, Barry Smith >?? ??: >>> >>> How large are the dense matrices you would like to invert? >>> >>>> On Apr 24, 2023, at 11:27 PM, ???? / ?? / ??????? > wrote: >>>> >>>> Dear all >>>> >>>> Hello. >>>> I want to make an inverse matrix like inv(A) in MATLAB. >>>> >>>> Are there some methods to inverse matrix in petsc? >>>> >>>> If not, I want to use the inverse function in the LAPACK library. >>>> >>>> Then, how to use the LAPACK library in petsc? I use the C language. >>>> >>>> Best, >>>> >>>> Seung Lee Kwon >>>> >>>> -- >>>> Seung Lee Kwon, Ph.D.Candidate >>>> Aerospace Structures and Materials Laboratory >>>> Department of Mechanical and Aerospace Engineering >>>> Seoul National University >>>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >>>> E-mail : ksl7912 at snu.ac.kr >>>> Office : +82-2-880-7389 >>>> C. P : +82-10-4695-1062 >>> >> >> >> -- >> Seung Lee Kwon, Ph.D.Candidate >> Aerospace Structures and Materials Laboratory >> Department of Mechanical and Aerospace Engineering >> Seoul National University >> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >> E-mail : ksl7912 at snu.ac.kr >> Office : +82-2-880-7389 >> C. P : +82-10-4695-1062 > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Apr 25 04:53:53 2023 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 25 Apr 2023 05:53:53 -0400 Subject: [petsc-users] Setting up a matrix for Lagrange multiplier In-Reply-To: References: Message-ID: On Tue, Apr 25, 2023 at 5:38?AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalingam at stfc.ac.uk> wrote: > Thank you for your response. I will come back to the error later (for now > I have it working with -pc_fieldsplit_detect_saddle_point.) > > > > I would like to understand how I need to index while running in parallel. > > > > (Q1)* Index set (IS) for the ISLocalToGlobalMapping.* > > > > > > *for* (*int* i = 0; i < N; i++) > > vec_col_index[i] = i; > > > > *for* (*int* i = 0; i < C; i++) > > vec_shifted_row_index[i] = i + N; > > > > > > ISLocalToGlobalMapping phi_mapping, lamda_mapping; > > > > ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, C, > vec_shifted_row_index, PETSC_COPY_VALUES, &lamda_mapping); > > ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, N, vec_col_index, > PETSC_COPY_VALUES, &phi_mapping); > > > > (i) I believe the above index set is created by all > parallel processes ? why then I would have to offset it with the first > index of this process? > > Since the mapping will always hold true and we would only use a *subset* > of this index set while populating the submatrices. I can see that I might > have an over-allocated index set than necessary but I can?t see why it can > be wrong. > Oh, N is the global size of your matrix. Then what your L2G map says above is "replicate this matrix on every process", which is not a scalable solution. A scalable solution has only part of the rows on each process, which is the default in PETSc. 
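A small sketch of the layout calculation this implies, assuming the global problem has N + C rows and PETSc is left to pick the local sizes (names illustrative):

    PetscInt nlocal = PETSC_DECIDE, Nglobal = N + C, rstart;

    PetscCall(PetscSplitOwnership(PETSC_COMM_WORLD, &nlocal, &Nglobal));
    /* prefix sum, then subtract, gives this rank's first owned global row */
    PetscCallMPI(MPI_Scan(&nlocal, &rstart, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD));
    rstart -= nlocal;

Equivalently, once the Mat exists, MatGetOwnershipRange() returns the same rstart directly.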
In my example, I just put the owned rows/cols on each process. You could choose to have an overlap, so that some shared rows also map to the local representation, but I did not do that in my example. > (ii) What would be the first index of this process anyway, > since at this point I haven?t declared any parallel sub-matrix or > sub-vector? > Either you know the parallel layout of your matrix, or if you let PETSc decide for you, then this function determines it https://petsc.org/main/manualpages/Sys/PetscSplitOwnership/ > (Q2)* Index set (IS) while creating the submatrix* > > > > *for* (*int* i = 0; i < C; i++) > > vec_row_index[i] = i; > > > > Mat SubP; > > IS row_isx, col_isx; > > > > ISCreateGeneral(PETSC_COMM_WORLD, C, vec_row_index, PETSC_COPY_VALUES, > &row_isx); > > ISCreateGeneral(PETSC_COMM_WORLD, N, vec_col_index, PETSC_COPY_VALUES, > &col_isx); > > > > MatSetOption(A[level], MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE); > > > > MatSetLocalToGlobalMapping(A, lamda_mapping, phi_mapping); > > > > MatGetLocalSubMatrix(A, row_isx, col_isx, &SubP); > > > > I believe I will have to request a submatrix SubP of global size C x N on > all processes ? right? Or do you think I can still break row_isx and > col_isx into parallel parts? > The submatrix is local to this process, "LocalSubmatrix", and thus should have a local number of rows and columns. I have chosen the number of owned rows/cols since I did not need an overlap. > (Q3)* Populating the submatrix* > > > > * const* PetscInt *nindices; > > PetscInt count; > > ISGetIndices(isout[level], &nindices); > > //isout is the global index of the nodes to be constrained. > > > > PetscScalar val = 1; > > > > *for* (*int* irow=0; irow < n; irow++) > > { > > *int* icol = nindices[irow]; > > MatSetValuesLocal(SubP,1,&irow,1,&icol,&val,ADD_VALUES); > > } > > > > I do think if I have the MatGetOwnerShipRange of SubP; I will be able to > parallelize the loop. Is MatGetOwnerShipRange also available for > submatrices as well? > It is a local submatrix, so I would only run over local things (parallelization is implicit). I think I have shown all these operations in the example. THanks, Matt > If I have to parallelize the index sets maybe I have to reorder the above > steps, can you please make a suggestion? > > > > Kind regards, > > Karthik. > > > > > > *From: *Matthew Knepley > *Date: *Monday, 24 April 2023 at 19:24 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *petsc-users at mcs.anl.gov > *Subject: *Re: [petsc-users] Setting up a matrix for Lagrange multiplier > > On Mon, Apr 24, 2023 at 1:58?PM Karthikeyan Chockalingam - STFC UKRI < > karthikeyan.chockalingam at stfc.ac.uk> wrote: > > Changing the names to NULL produced the same error. > > > > This is just to make the code simpler. > > > > Don?t I have to give the fields a name? > > > > They get the default name, which is the numbering you used. > > > > The problem solution didn?t change from one time step to the next. > > > > *[1;31m[0]PETSC ERROR: --------------------- Error Message > --------------------------------------------------------------* > > *[0;39m[0;49m[0]PETSC ERROR: No support for this operation for this object > type* > > *[0]PETSC ERROR: Unsupported viewer gmres* > > > > You are not completely catching the error, because it should print out the > entire options database on termination. > > > > Here it says you gave "gmres" as a Viewer, which is incorrect, but I > cannot see all the options you used. 
> > > > Thanks, > > > > Matt > > > > *[0]PETSC ERROR: WARNING! There are option(s) set that were not used! > Could be the program crashed before they were used or a spelling mistake, > etc!* > > *[0]PETSC ERROR: Option left: name:-fieldsplit_1_ksp_type value: preonly > source: code* > > *[0]PETSC ERROR: Option left: name:-fieldsplit_1_ksp_view value: gmres > source: code* > > *[0]PETSC ERROR: Option left: name:-fieldsplit_1_pc_type value: lu source: > code* > > *[0]PETSC ERROR: Option left: name:-options_left (no value) source: code* > > *[0]PETSC ERROR: See **https://petsc.org/release/faq/* > * for trouble shooting.* > > *[0]PETSC ERROR: Petsc Development GIT revision: v3.18.4-529-g995ec06f92 > GIT Date: 2023-02-03 18:41:48 +0000* > > *[0]PETSC ERROR: > /Users/karthikeyan.chockalingam/AMReX/amrFEM/build/Debug/amrFEM on a named > HC20210312 by karthikeyan.chockalingam Mon Apr 24 18:51:25 2023* > > *[0]PETSC ERROR: Configure options --with-debugging=0 > --prefix=/Users/karthikeyan.chockalingam/AMReX/petsc > --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes > --with-hypre-dir=/Users/karthikeyan.chockalingam/AMReX/hypre/src/hypre* > > *[0]PETSC ERROR: #1 PetscOptionsGetViewer() at > /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/sys/classes/viewer/interface/viewreg.c:309* > > *[0]PETSC ERROR: #2 KSPSetFromOptions() at > /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itcl.c:522* > > *[0]PETSC ERROR: #3 PCSetUp_FieldSplit() at > /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1054* > > *[0]PETSC ERROR: #4 PCSetUp() at > /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/pc/interface/precon.c:994* > > *[0]PETSC ERROR: #5 KSPSetUp() at > /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:405* > > *[0]PETSC ERROR: #6 KSPSolve_Private() at > /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:824* > > *[0]PETSC ERROR: #7 KSPSolve() at > /Users/karthikeyan.chockalingam/AMReX/SRC_PKG/petsc/src/ksp/ksp/interface/itfunc.c:1070* > > *End of program * > > *solve time 0.03416619 seconds * > > > > > > It will be a bit difficult for me to produce a stand-alone code. > > Thank you for your help. > > > Best, > Karthik. > > > > *From: *Matthew Knepley > *Date: *Monday, 24 April 2023 at 18:42 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *petsc-users at mcs.anl.gov > *Subject: *Re: [petsc-users] Setting up a matrix for Lagrange multiplier > > On Mon, Apr 24, 2023 at 1:37?PM Karthikeyan Chockalingam - STFC UKRI < > karthikeyan.chockalingam at stfc.ac.uk> wrote: > > Great to know there is an example now. > > > > Yes, -pc_fieldsplit_defect_saddle_point worked. > > > > But I would like to understand what I did wrong (and why it didn?t work). > > > > Can you show the error? Or give something self-contained that I can run. > > > > After reading many posted posts, I needed to create PCFieldSplitSetIS to > split the fields. > > > > Yes. I would give NULL for the name here. > > > > I set the first N indices to the field phi and the next C indices > (starting from N) to the field lambda. > > > > This looks fine. However, these are global indices, so this would not work > in parallel. You would have to offset by > > the first index on this process. > > > > Thanks, > > > > Matt > > > > Please tell me what I did wrong and how I can fix the lines which are now > commented? 
> > > > PetscErrorCode ierr; > > KSP ksp; > > PC pc; > > > > KSPCreate(PETSC_COMM_WORLD, &ksp); > > KSPSetType(ksp, KSPGMRES); > > KSPSetOperators(ksp, A[level], A[level]); > > ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); > > ierr = PCSetType(pc,PCFIELDSPLIT);CHKERRQ(ierr); > > > > ??.. > > > > * for* (*int* i = 0; i < N; i++) > > vec_zero_field_index[i] = i; > > > > PetscInt * vec_first_field_index = *NULL*; > > PetscMalloc(n * *sizeof*(PetscInt), &vec_first_field_index); > > > > *for* (*int* i = 0; i < C; i++) > > vec_first_field_index[i] = N + i; > > > > IS zero_field_isx, first_field_isx; > > CHKERRQ(ISCreateGeneral(PETSC_COMM_WORLD, N, vec_zero_field_index, > PETSC_COPY_VALUES, &zero_field_isx)); > > CHKERRQ(ISCreateGeneral(PETSC_COMM_WORLD, C, vec_first_field_index, > PETSC_COPY_VALUES, &first_field_isx)); > > > > ierr = PCFieldSplitSetIS(pc,"0",zero_field_isx); CHKERRQ(ierr); > > ierr = PCFieldSplitSetIS(pc,"1",first_field_isx); CHKERRQ(ierr); > > > > ierr = PetscOptionsSetValue(*NULL*,"-ksp_type", "fgmres"); CHKERRQ > (ierr); > > ierr = PetscOptionsSetValue(*NULL*,"-pc_type", "fieldsplit"); CHKERRQ > (ierr); > > > > /* > > ierr = PetscOptionsSetValue(NULL,"-fieldsplit_0_ksp_type", "gmres"); > CHKERRQ(ierr); > > ierr = PetscOptionsSetValue(NULL,"-fieldsplit_0_pc_type", "jacobi"); > CHKERRQ(ierr); > > ierr = PetscOptionsSetValue(NULL,"-fieldsplit_1_ksp_type", > "preonly"); CHKERRQ(ierr); > > ierr = PetscOptionsSetValue(NULL,"-fieldsplit_1_pc_type", "lu"); > CHKERRQ(ierr);*/ > > > > ierr = PetscOptionsSetValue(*NULL*, > "-pc_fieldsplit_detect_saddle_point", *NULL*);CHKERRQ(ierr); > > ierr = PetscOptionsSetValue(*NULL*, "-options_left", *NULL*);CHKERRQ > (ierr); > > ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); > > KSPSolve(ksp, b, x); > > > > Best, > > Karthik. > > *From: *Matthew Knepley > *Date: *Monday, 24 April 2023 at 17:57 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *petsc-users at mcs.anl.gov > *Subject: *Re: [petsc-users] Setting up a matrix for Lagrange multiplier > > On Mon, Apr 24, 2023 at 10:22?AM Karthikeyan Chockalingam - STFC UKRI < > karthikeyan.chockalingam at stfc.ac.uk> wrote: > > Hello, > > > > I was able to construct the below K matrix (using submatrices P and P^T), > which is of type MATAIJ > > K = [A P^T > > P 0] > > and solved them using a direct solver. > > > > I modified your example to create either AIJ or Nest matrices, but use the > same assembly code: > > > > https://gitlab.com/petsc/petsc/-/merge_requests/6368 > > > > However, I was reading online that this is a saddle point problem and I > should be employing PCFIELDSPLIT. > > Since I have one monolithic matrix K, I was not sure how to split the > fields. > > > > With this particular matrix, you can use > > > > -pc_fieldsplit_detect_saddle_point > > > > and it will split it automatically. > > > > Thanks, > > > > Matt > > > > Best regards, > > Karthik. > > > > > > *From: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Date: *Wednesday, 19 April 2023 at 17:52 > *To: *Matthew Knepley > *Cc: *petsc-users at mcs.anl.gov > *Subject: *Re: [petsc-users] Setting up a matrix for Lagrange multiplier > > I have declared the mapping > > > > ISLocalToGlobalMapping mapping; > > ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, n, nindices, > PETSC_COPY_VALUES, &mapping); > > > > But when I use MatSetValuesLocal(), how do I know the above mapping is > employed because it is not one of the parameters passed to the function? 
> > > > Thank you. > > > > Kind regards, > > Karthik. > > > > > > *From: *Matthew Knepley > *Date: *Tuesday, 18 April 2023 at 16:21 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *petsc-users at mcs.anl.gov > *Subject: *Re: [petsc-users] Setting up a matrix for Lagrange multiplier > > On Tue, Apr 18, 2023 at 11:16?AM Karthikeyan Chockalingam - STFC UKRI < > karthikeyan.chockalingam at stfc.ac.uk> wrote: > > Thank you for your response. I spend some time understanding how > > MatSetValuesLocal and ISLocalToGlobalMappingCreate work. > > > > You can look at SNES ex28 where we do this with DMCOMPOSITE. > > > > Q1) Will the matrix K be of type MATMPIAIJ or MATIS? > > K = [A P^T > > P 0] > > > > I assume MPIAIJ since IS is only used for Neumann-Neumann decompositions. > > > > Q2) Can I use both MatSetValues() to MatSetValuesLocal() to populate K? > Since I have already used MatSetValues() to construct A. > > > > You can, and there would be no changes in serial if K is exactly the upper > left block, but in parallel global indices would change. > > > > Q3) What are the advantages of using MatSetValuesLocal()? Is it that I can > construct P directly using local indies and map the entrees to the global > index in K? > > > > You have a monolithic K, so that you can use sparse direct solvers to > check things. THis is impossible with separate storage. > > > > Q4) I probably don?t have to construct an independent P matrix > > > > You wouldn't in this case. > > > > Thanks, > > > > Matt > > > > Best regards, > > Karthik. > > > > > > > > *From: *Matthew Knepley > *Date: *Tuesday, 18 April 2023 at 11:08 > *To: *Chockalingam, Karthikeyan (STFC,DL,HC) < > karthikeyan.chockalingam at stfc.ac.uk> > *Cc: *petsc-users at mcs.anl.gov > *Subject: *Re: [petsc-users] Setting up a matrix for Lagrange multiplier > > On Tue, Apr 18, 2023 at 5:24?AM Karthikeyan Chockalingam - STFC UKRI via > petsc-users wrote: > > Hello, > > > > I'm solving a problem using the Lagrange multiplier, the matrix has the > form > > > > K = [A P^T > > P 0] > > > > I am familiar with constructing K using MATMPIAIJ. However, I would like > to know if had [A], can I augment it with [P], [P^T] and [0] of type > MATMPIAIJ? Likewise for vectors as well. > > > > Can you please point me to the right resource, if it is a common operation > in PETSc? > > > > You can do this at least 2 ways: > > > > 1) Assemble you submatrices directly into the larger matrix by > constructing local-to-global maps for the emplacement. so that you do > > not change your assembly code, except to change MatSetValues() to > MatSetValuesLocal(). This is usually preferable. > > > > 2) Use MATNEST and VecNEST to put pointers to submatrices and subvectors > directly in. > > > > Thanks, > > > > Matt > > > > Many thanks. > > > > Kind regards, > > Karthik. > > > > > > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
From ksl7912 at snu.ac.kr Tue Apr 25 05:03:18 2023
From: ksl7912 at snu.ac.kr (Seung Lee Kwon)
Date: Tue, 25 Apr 2023 19:03:18 +0900
Subject: Re: [petsc-users] Question about linking LAPACK library
In-Reply-To: References: <2AD13DB8-4023-45CF-8D8D-F2DB320702F8@petsc.dev>
Message-ID:

Thank you for your reply.

I think I gave an example of an unrealistic problem.

I just wanted to know how to compute the inverse matrix, so I was wondering if there is an example of computing the inverse matrix in PETSc.

Alternatively, I want to know how to link the LAPACK library.

best,

Seung Lee Kwon

On Tue, Apr 25, 2023 at 6:44 PM, Matthew Knepley wrote:

> On Mon, Apr 24, 2023 at 11:47 PM Seung Lee Kwon wrote:
>
>> Dear all
>>
>> It depends on the problem. It can have hundreds of thousands of degrees of freedom.
>
> Suppose your matrix was dense and had 1e6 dofs. The work to invert a matrix is O(N^3) with a small constant, so it would take 1e18 flops (an exaflop) to invert this matrix and about 10 terabytes of RAM to store it. Is this available to you? PETSc supports Elemental and SCALAPACK for this kind of calculation.
>
> If the system is sparse, you could invert it using MUMPS, SuperLU_dist, or Pardiso. Then the work and storage depend on the density. There are good estimates for connectivity based on regular grids of given dimension. The limiting resource here is usually memory, which motivates people to try iterative methods. The convergence of iterative methods depends on detailed properties of your system, like the operator spectrum.
>
> Thanks,
>
> Matt
>
>> best,
>>
>> Seung Lee Kwon
>>
>> On Tue, Apr 25, 2023 at 12:32 PM, Barry Smith wrote:
>>
>>> How large are the dense matrices you would like to invert?
>>>
>>> On Apr 24, 2023, at 11:27 PM, Seung Lee Kwon wrote:
>>>
>>> Dear all
>>>
>>> Hello.
>>> I want to make an inverse matrix like inv(A) in MATLAB.
>>>
>>> Are there some methods to invert a matrix in petsc?
>>>
>>> If not, I want to use the inverse function in the LAPACK library.
>>>
>>> Then, how to use the LAPACK library in petsc? I use the C language.
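(On the direct-LAPACK route in the quoted question: since LAPACK is always linked with PETSc, as the replies note, the thin wrappers in petscblaslapack.h can be called directly. A minimal sketch for a small dense inverse follows — real scalars and a column-major array are assumed, the function name is illustrative, and the PetscCallBLAS style mirrors PETSc's own dense.c; older releases use ierr = ...; CHKERRQ(ierr) instead.)

```c
#include <petscsys.h>
#include <petscblaslapack.h>

/* Sketch: invert a small dense column-major n-by-n array using the
   LAPACK LU routines PETSc links by default: factor with getrf, then
   solve A X = I with getrs. ainv must hold n*n entries. */
static PetscErrorCode DenseInverse(PetscBLASInt n, PetscScalar *a, PetscScalar *ainv)
{
  PetscBLASInt *pivots, info, i;

  PetscFunctionBegin;
  PetscCall(PetscMalloc1(n, &pivots));
  PetscCallBLAS("LAPACKgetrf", LAPACKgetrf_(&n, &n, a, &n, pivots, &info));
  PetscCheck(info == 0, PETSC_COMM_SELF, PETSC_ERR_LIB, "LU factorization failed");
  for (i = 0; i < n * n; i++) ainv[i] = 0.0;  /* ainv = identity */
  for (i = 0; i < n; i++) ainv[i + n * i] = 1.0;
  PetscCallBLAS("LAPACKgetrs", LAPACKgetrs_("N", &n, &n, a, &n, pivots, ainv, &n, &info));
  PetscCheck(info == 0, PETSC_COMM_SELF, PETSC_ERR_LIB, "Triangular solves failed");
  PetscCall(PetscFree(pivots));
  PetscFunctionReturn(0);
}
```

For anything beyond a handful of unknowns, the KSP route in the replies (or a factored MatMatSolve() against an identity) is the better fit.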
>>> >>> Best, >>> >>> Seung Lee Kwon >>> >>> -- >>> Seung Lee Kwon, Ph.D.Candidate >>> Aerospace Structures and Materials Laboratory >>> Department of Mechanical and Aerospace Engineering >>> Seoul National University >>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >>> E-mail : ksl7912 at snu.ac.kr >>> Office : +82-2-880-7389 >>> C. P : +82-10-4695-1062 >>> >>> >>> >> >> -- >> Seung Lee Kwon, Ph.D.Candidate >> Aerospace Structures and Materials Laboratory >> Department of Mechanical and Aerospace Engineering >> Seoul National University >> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >> E-mail : ksl7912 at snu.ac.kr >> Office : +82-2-880-7389 >> C. P : +82-10-4695-1062 >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- Seung Lee Kwon, Ph.D.Candidate Aerospace Structures and Materials Laboratory Department of Mechanical and Aerospace Engineering Seoul National University Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 E-mail : ksl7912 at snu.ac.kr Office : +82-2-880-7389 C. P : +82-10-4695-1062 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Apr 25 05:59:52 2023 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 25 Apr 2023 06:59:52 -0400 Subject: [petsc-users] Question about linking LAPACK library In-Reply-To: References: <2AD13DB8-4023-45CF-8D8D-F2DB320702F8@petsc.dev> Message-ID: On Tue, Apr 25, 2023 at 6:03?AM ???? / ?? / ??????? wrote: > Thank you for your reply. > > I think I gave an example of an unrealistic problem. > > I just wanted to know how to compute the inverse matrix, so I was > wondering if there is an example of computing the inverse matrix in PETSc. > You just use KSPCreate(PETSC_COMM_WORLD, &ksp); KSPSetOperator(ksp, A, A); KSPSetFromOptions(ksp); KSPSolve(ksp, b, x); This solves the system. If you want an exact solution use -pc_type lu There is a manual chapter on linear solves such as there. Thanks, Matt > Alternatively, I want to know how to link the LAPACK library. > > best, > > Seung Lee Kwon > > 2023? 4? 25? (?) ?? 6:44, Matthew Knepley ?? ??: > >> On Mon, Apr 24, 2023 at 11:47?PM ???? / ?? / ??????? >> wrote: >> >>> Dear all >>> >>> It depends on the problem. It can have hundreds of thousands of degrees >>> of freedom. >>> >> >> Suppose your matrix was dense and had 1e6 dofs. The work to invert a >> matrix is O(N^3) with a small >> constant, so it would take 1e18 = 1 exaflop to invert this matrix and >> about 10 Terabytes of RAM to store >> it. Is this available to you? PETSc's supports Elemental and SCALAPACK >> for this kind of calculation. >> >> If the system is sparse, you could invert it using MUMPS, SuperLU_dist, >> or Pardiso. Then the work and >> storage depend on the density. There are good estimates for connectivity >> based on regular grids of given >> dimension. The limiting resource here is usually memory, which motivates >> people to try iterative methods. >> The convergence of iterative methods depend on detailed properties of >> your system, like the operator spectrum. >> >> Thanks, >> >> Matt >> >> >>> best, >>> >>> Seung Lee Kwon >>> >>> 2023? 4? 25? (?) ?? 12:32, Barry Smith ?? ??: >>> >>>> >>>> How large are the dense matrices you would like to invert? >>>> >>>> On Apr 24, 2023, at 11:27 PM, ???? / ?? / ??????? 
>>>> wrote: >>>> >>>> Dear all >>>> >>>> Hello. >>>> I want to make an inverse matrix like inv(A) in MATLAB. >>>> >>>> Are there some methods to inverse matrix in petsc? >>>> >>>> If not, I want to use the inverse function in the LAPACK library. >>>> >>>> Then, how to use the LAPACK library in petsc? I use the C language. >>>> >>>> Best, >>>> >>>> Seung Lee Kwon >>>> >>>> -- >>>> Seung Lee Kwon, Ph.D.Candidate >>>> Aerospace Structures and Materials Laboratory >>>> Department of Mechanical and Aerospace Engineering >>>> Seoul National University >>>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >>>> E-mail : ksl7912 at snu.ac.kr >>>> Office : +82-2-880-7389 >>>> C. P : +82-10-4695-1062 >>>> >>>> >>>> >>> >>> -- >>> Seung Lee Kwon, Ph.D.Candidate >>> Aerospace Structures and Materials Laboratory >>> Department of Mechanical and Aerospace Engineering >>> Seoul National University >>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >>> E-mail : ksl7912 at snu.ac.kr >>> Office : +82-2-880-7389 >>> C. P : +82-10-4695-1062 >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > > > -- > Seung Lee Kwon, Ph.D.Candidate > Aerospace Structures and Materials Laboratory > Department of Mechanical and Aerospace Engineering > Seoul National University > Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > E-mail : ksl7912 at snu.ac.kr > Office : +82-2-880-7389 > C. P : +82-10-4695-1062 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.carli at kuleuven.be Tue Apr 25 03:15:35 2023 From: stefano.carli at kuleuven.be (Stefano Carli) Date: Tue, 25 Apr 2023 08:15:35 +0000 Subject: [petsc-users] obtaining estimated Hessian in BQNLS Message-ID: <05357d86ee454f49a8499739c3654b3a@ICTS-S-EXMBX23.luna.kuleuven.be> Dear PETSc developers, I'm using PETSc version 3.14.1 coupled to a Fortran code, and I was wondering if there is a way of obtaining in output, possibly at each iteration, the estimated Hessian matrix for the BQNLS method. Thank you in advance and best regards, Stefano Carli -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Apr 25 08:10:57 2023 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 25 Apr 2023 08:10:57 -0500 (CDT) Subject: [petsc-users] Question about linking LAPACK library In-Reply-To: References: <2AD13DB8-4023-45CF-8D8D-F2DB320702F8@petsc.dev> Message-ID: <0afa8bc9-96a6-a25e-3078-8d476be52a48@mcs.anl.gov> On Tue, 25 Apr 2023, Matthew Knepley wrote: > On Tue, Apr 25, 2023 at 6:03?AM ???? / ?? / ??????? > wrote: > > > Thank you for your reply. > > > > I think I gave an example of an unrealistic problem. > > > > I just wanted to know how to compute the inverse matrix, so I was > > wondering if there is an example of computing the inverse matrix in PETSc. > > > > You just use > > KSPCreate(PETSC_COMM_WORLD, &ksp); > KSPSetOperator(ksp, A, A); > KSPSetFromOptions(ksp); > KSPSolve(ksp, b, x); > > This solves the system. 
If you want an exact solution use > > -pc_type lu > > There is a manual chapter on linear solves such as there. > > Thanks, > > Matt > > > > Alternatively, I want to know how to link the LAPACK library. You might want to check: src/mat/impls/dense/seq/dense.c: PetscCallBLAS("LAPACKgetrf", LAPACKgetrf_(&m, &n, mat->v, &mat->lda, mat->pivots, &info)); Satish > > > > best, > > > > Seung Lee Kwon > > > > 2023? 4? 25? (?) ?? 6:44, Matthew Knepley ?? ??: > > > >> On Mon, Apr 24, 2023 at 11:47?PM ???? / ?? / ??????? > >> wrote: > >> > >>> Dear all > >>> > >>> It depends on the problem. It can have hundreds of thousands of degrees > >>> of freedom. > >>> > >> > >> Suppose your matrix was dense and had 1e6 dofs. The work to invert a > >> matrix is O(N^3) with a small > >> constant, so it would take 1e18 = 1 exaflop to invert this matrix and > >> about 10 Terabytes of RAM to store > >> it. Is this available to you? PETSc's supports Elemental and SCALAPACK > >> for this kind of calculation. > >> > >> If the system is sparse, you could invert it using MUMPS, SuperLU_dist, > >> or Pardiso. Then the work and > >> storage depend on the density. There are good estimates for connectivity > >> based on regular grids of given > >> dimension. The limiting resource here is usually memory, which motivates > >> people to try iterative methods. > >> The convergence of iterative methods depend on detailed properties of > >> your system, like the operator spectrum. > >> > >> Thanks, > >> > >> Matt > >> > >> > >>> best, > >>> > >>> Seung Lee Kwon > >>> > >>> 2023? 4? 25? (?) ?? 12:32, Barry Smith ?? ??: > >>> > >>>> > >>>> How large are the dense matrices you would like to invert? > >>>> > >>>> On Apr 24, 2023, at 11:27 PM, ???? / ?? / ??????? > >>>> wrote: > >>>> > >>>> Dear all > >>>> > >>>> Hello. > >>>> I want to make an inverse matrix like inv(A) in MATLAB. > >>>> > >>>> Are there some methods to inverse matrix in petsc? > >>>> > >>>> If not, I want to use the inverse function in the LAPACK library. > >>>> > >>>> Then, how to use the LAPACK library in petsc? I use the C language. > >>>> > >>>> Best, > >>>> > >>>> Seung Lee Kwon > >>>> > >>>> -- > >>>> Seung Lee Kwon, Ph.D.Candidate > >>>> Aerospace Structures and Materials Laboratory > >>>> Department of Mechanical and Aerospace Engineering > >>>> Seoul National University > >>>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > >>>> E-mail : ksl7912 at snu.ac.kr > >>>> Office : +82-2-880-7389 > >>>> C. P : +82-10-4695-1062 > >>>> > >>>> > >>>> > >>> > >>> -- > >>> Seung Lee Kwon, Ph.D.Candidate > >>> Aerospace Structures and Materials Laboratory > >>> Department of Mechanical and Aerospace Engineering > >>> Seoul National University > >>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > >>> E-mail : ksl7912 at snu.ac.kr > >>> Office : +82-2-880-7389 > >>> C. P : +82-10-4695-1062 > >>> > >> > >> > >> -- > >> What most experimenters take for granted before they begin their > >> experiments is infinitely more interesting than any results to which their > >> experiments lead. > >> -- Norbert Wiener > >> > >> https://www.cse.buffalo.edu/~knepley/ > >> > >> > > > > > > -- > > Seung Lee Kwon, Ph.D.Candidate > > Aerospace Structures and Materials Laboratory > > Department of Mechanical and Aerospace Engineering > > Seoul National University > > Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > > E-mail : ksl7912 at snu.ac.kr > > Office : +82-2-880-7389 > > C. 
P : +82-10-4695-1062 > > > > > From bsmith at petsc.dev Tue Apr 25 10:55:34 2023 From: bsmith at petsc.dev (Barry Smith) Date: Tue, 25 Apr 2023 11:55:34 -0400 Subject: [petsc-users] Question about linking LAPACK library In-Reply-To: References: <2AD13DB8-4023-45CF-8D8D-F2DB320702F8@petsc.dev> Message-ID: <26F0600A-B053-4BC6-904B-617B1F1D12FA@petsc.dev> LAPACK is always linked with PETSc so you can always make direct calls to LAPACK routines from PETSc code.. Barry > On Apr 25, 2023, at 6:03 AM, ???? / ?? / ??????? wrote: > > Thank you for your reply. > > I think I gave an example of an unrealistic problem. > > I just wanted to know how to compute the inverse matrix, so I was wondering if there is an example of computing the inverse matrix in PETSc. > > Alternatively, I want to know how to link the LAPACK library. > > best, > > Seung Lee Kwon > > 2023? 4? 25? (?) ?? 6:44, Matthew Knepley >?? ??: >> On Mon, Apr 24, 2023 at 11:47?PM ???? / ?? / ??????? > wrote: >>> Dear all >>> >>> It depends on the problem. It can have hundreds of thousands of degrees of freedom. >> >> Suppose your matrix was dense and had 1e6 dofs. The work to invert a matrix is O(N^3) with a small >> constant, so it would take 1e18 = 1 exaflop to invert this matrix and about 10 Terabytes of RAM to store >> it. Is this available to you? PETSc's supports Elemental and SCALAPACK for this kind of calculation. >> >> If the system is sparse, you could invert it using MUMPS, SuperLU_dist, or Pardiso. Then the work and >> storage depend on the density. There are good estimates for connectivity based on regular grids of given >> dimension. The limiting resource here is usually memory, which motivates people to try iterative methods. >> The convergence of iterative methods depend on detailed properties of your system, like the operator spectrum. >> >> Thanks, >> >> Matt >> >>> best, >>> >>> Seung Lee Kwon >>> >>> 2023? 4? 25? (?) ?? 12:32, Barry Smith >?? ??: >>>> >>>> How large are the dense matrices you would like to invert? >>>> >>>>> On Apr 24, 2023, at 11:27 PM, ???? / ?? / ??????? > wrote: >>>>> >>>>> Dear all >>>>> >>>>> Hello. >>>>> I want to make an inverse matrix like inv(A) in MATLAB. >>>>> >>>>> Are there some methods to inverse matrix in petsc? >>>>> >>>>> If not, I want to use the inverse function in the LAPACK library. >>>>> >>>>> Then, how to use the LAPACK library in petsc? I use the C language. >>>>> >>>>> Best, >>>>> >>>>> Seung Lee Kwon >>>>> >>>>> -- >>>>> Seung Lee Kwon, Ph.D.Candidate >>>>> Aerospace Structures and Materials Laboratory >>>>> Department of Mechanical and Aerospace Engineering >>>>> Seoul National University >>>>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >>>>> E-mail : ksl7912 at snu.ac.kr >>>>> Office : +82-2-880-7389 >>>>> C. P : +82-10-4695-1062 >>>> >>> >>> >>> -- >>> Seung Lee Kwon, Ph.D.Candidate >>> Aerospace Structures and Materials Laboratory >>> Department of Mechanical and Aerospace Engineering >>> Seoul National University >>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >>> E-mail : ksl7912 at snu.ac.kr >>> Office : +82-2-880-7389 >>> C. P : +82-10-4695-1062 >> >> >> -- >> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/

> --
> Seung Lee Kwon, Ph.D.Candidate
> Aerospace Structures and Materials Laboratory
> Department of Mechanical and Aerospace Engineering
> Seoul National University
> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826
> E-mail : ksl7912 at snu.ac.kr
> Office : +82-2-880-7389
> C. P : +82-10-4695-1062

From hsuh at anl.gov Tue Apr 25 12:56:57 2023
From: hsuh at anl.gov (Suh, Hansol)
Date: Tue, 25 Apr 2023 17:56:57 +0000
Subject: [petsc-users] obtaining estimated Hessian in BQNLS
Message-ID:

PETSc's QN routines use a limited-memory variable metric (LMVM) format, which means an explicit Hessian is not constructed.
Instead of creating an explicit Hessian, TAO uses MatSolve and MatMult to "access" the Hessian. So you can have access to Hx and H^{-1}x, but not the H matrix by itself.
And as of right now, there isn't a way to access such Hx and H^{-1}x.
(Which also means that even if the routines to access Hx and H^{-1}x were there, getting those at each iteration wouldn't really show much information, I think.)

(Sorry for the email reply-chain screw-up..)
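(A sketch of the monitor-based route suggested in the follow-ups below, under the assumption that the LMVM approximation is reachable through the KSP preconditioner, as in the rosenbrock1.c snippet quoted later. Whether the approximation is meaningful in early iterations is method-dependent, and all names here are illustrative; PetscCall-style error checking is used, as in that snippet.)

```c
#include <petsctao.h>

/* Sketch: a TAO monitor that probes the action of the current LMVM
   Hessian approximation once per iteration. Assumes the method (e.g.
   BQNLS) keeps the approximation in a PCLMVM preconditioner. */
static PetscErrorCode ProbeLMVMMonitor(Tao tao, void *ctx)
{
  KSP ksp;
  PC  pc;
  Mat M;
  Vec x, Hx;

  PetscFunctionBegin;
  PetscCall(TaoGetKSP(tao, &ksp));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCLMVMGetMatLMVM(pc, &M));
  PetscCall(TaoGetSolution(tao, &x));
  PetscCall(VecDuplicate(x, &Hx));
  PetscCall(MatMult(M, x, Hx)); /* H*x with the current approximation */
  /* ... inspect or log Hx here ... */
  PetscCall(VecDestroy(&Hx));
  PetscFunctionReturn(0);
}

/* registered once before TaoSolve(); recent releases rename this
   TaoMonitorSet():
   PetscCall(TaoSetMonitor(tao, ProbeLMVMMonitor, NULL, NULL)); */
```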
From: Stefano Carli Subject: [petsc-users] obtaining estimated Hessian in BQNLS Date: April 25, 2023 at 4:15:35 AM EDT To: "petsc-users at mcs.anl.gov" Message-Id: <05357d86ee454f49a8499739c3654b3a at ICTS-S-EXMBX23.luna.kuleuven.be> Dear PETSc developers, I?m using PETSc version 3.14.1 coupled to a Fortran code, and I was wondering if there is a way of obtaining in output, possibly at each iteration, the estimated Hessian matrix for the BQNLS method. Thank you in advance and best regards, Stefano Carli -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Apr 25 17:15:03 2023 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 25 Apr 2023 18:15:03 -0400 Subject: [petsc-users] obtaining estimated Hessian in BQNLS In-Reply-To: <05357d86ee454f49a8499739c3654b3a@ICTS-S-EXMBX23.luna.kuleuven.be> References: <05357d86ee454f49a8499739c3654b3a@ICTS-S-EXMBX23.luna.kuleuven.be> Message-ID: On Tue, Apr 25, 2023 at 8:53?AM Stefano Carli wrote: > Dear PETSc developers, > > > > I?m using PETSc version 3.14.1 coupled to a Fortran code, and I was > wondering if there is a way of obtaining in output, possibly at each > iteration, the estimated Hessian matrix for the BQNLS method. > It is represented as the outer product of sets of vectors. Is that what you want? Thanks Matt > Thank you in advance and best regards, > > Stefano Carli > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Apr 25 17:18:10 2023 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 25 Apr 2023 18:18:10 -0400 Subject: [petsc-users] obtaining estimated Hessian in BQNLS In-Reply-To: References: Message-ID: On Tue, Apr 25, 2023 at 2:21?PM Suh, Hansol via petsc-users < petsc-users at mcs.anl.gov> wrote: > Actually, let me take that back , about inability to access LMVM. > > You can access LMVM Hessian mat after you are done with TaoSolve, but not > within the iteration. > Good catch. You could do this in the iteration with a monitor. Thanks, Matt > See: > > ``` > PetscCall(TaoGetKSP(tao, &ksp)); > PetscCall(KSPGetPC(ksp, &pc)); > PetscCall(PCLMVMGetMatLMVM(pc, &M)); > PetscCall(VecDuplicate(x, &in)); > PetscCall(VecDuplicate(x, &out)); > PetscCall(VecDuplicate(x, &out2)); > PetscCall(VecSet(in, 1.0)); > PetscCall(MatMult(M, in, out)); > PetscCall(MatSolve(M, out, out2)); > PetscCall(VecAXPY(out2, -1.0, in)); > PetscCall(VecNorm(out2, NORM_2, &mult_solve_dist)); > ``` > (tao/unconstrained/tutorials/rosenbrock1.c) > > Hope this helps. > > ------------------------------ > *From:* petsc-users on behalf of Suh, > Hansol via petsc-users > *Sent:* Tuesday, April 25, 2023 12:56 PM > *To:* petsc-users at mcs.anl.gov > *Cc:* Isaac, Toby ; stefano.carli at kuleuven.be < > stefano.carli at kuleuven.be> > *Subject:* Re: [petsc-users] obtaining estimated Hessian in BQNLS > > petsc's QN routines uses limited-memory variable metric format, which > means explicit hessian is not constructed. > Instead of creating explicit hessian, TAO uses MatSolve, and MatMult, to > "access" the hessian. So you can have access to Hx, and H^{-1}x, but not H > matrix by itself. > And as of right now, there isn't a way to access such Hx, and H^{-1}x. 
> (which also means that even if the routines to access Hx, H^{-1}x were > there, getting those at each iteration doesn't really show much > information, I think.) > > > (Sorry for email reply-chain screw up..) > > > *From: *Stefano Carli > *Subject: **[petsc-users] obtaining estimated Hessian in BQNLS* > *Date: *April 25, 2023 at 4:15:35 AM EDT > *To: *"petsc-users at mcs.anl.gov" > *Message-Id: *< > 05357d86ee454f49a8499739c3654b3a at ICTS-S-EXMBX23.luna.kuleuven.be> > > Dear PETSc developers, > > I?m using PETSc version 3.14.1 coupled to a Fortran code, and I was > wondering if there is a way of obtaining in output, possibly at each > iteration, the estimated Hessian matrix for the BQNLS method. > > Thank you in advance and best regards, > Stefano Carli > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ksl7912 at snu.ac.kr Wed Apr 26 05:02:03 2023 From: ksl7912 at snu.ac.kr (=?UTF-8?B?wq3qtozsirnrpqwgLyDtlZnsg50gLyDtla3qs7XsmrDso7zqs7XtlZnqs7w=?=) Date: Wed, 26 Apr 2023 19:02:03 +0900 Subject: [petsc-users] makefile error Message-ID: Dear developers Could you recommend the error messages below? /home/ksl/petsc/arch-linux-c-debug/bin/mpicxx -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector -fvisibility=hidden -g -O0 -I/home/ksl/petsc/include -I/home/ksl/petsc/arch-linux-c-debug/include -o app a1.o a2.o a3.o a4.o a5.o -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib -L/home/ksl/petsc/arch-linux-c-debug/lib -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib -L/home/ksl/petsc/arch-linux-c-debug/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9 -L/usr/lib/gcc/x86_64-linux-gnu/9 -lpetsc -llapack -lblas -lpthread -lm -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl /usr/bin/ld: cannot find -lpetsc collect2: error: ld returned 1 exit status make: *** [makefile:33: app] Error 1 Best regards Seung Lee Kwon -- Seung Lee Kwon, Ph.D.Candidate Aerospace Structures and Materials Laboratory Department of Mechanical and Aerospace Engineering Seoul National University Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 E-mail : ksl7912 at snu.ac.kr Office : +82-2-880-7389 C. P : +82-10-4695-1062 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Apr 26 05:05:30 2023 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 26 Apr 2023 06:05:30 -0400 Subject: [petsc-users] makefile error In-Reply-To: References: Message-ID: On Wed, Apr 26, 2023 at 6:02?AM ???? / ?? / ??????? wrote: > Dear developers > > Could you recommend the error messages below? 
> > /home/ksl/petsc/arch-linux-c-debug/bin/mpicxx -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -Wno-lto-type-mismatch > -fstack-protector -fvisibility=hidden -g -O0 -I/home/ksl/petsc/include > -I/home/ksl/petsc/arch-linux-c-debug/include -o app a1.o a2.o a3.o a4.o > a5.o -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib > -L/home/ksl/petsc/arch-linux-c-debug/lib > -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib > -L/home/ksl/petsc/arch-linux-c-debug/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9 > -L/usr/lib/gcc/x86_64-linux-gnu/9 -lpetsc -llapack -lblas -lpthread -lm > -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s > -lquadmath -lstdc++ -ldl > /usr/bin/ld: cannot find -lpetsc > collect2: error: ld returned 1 exit status > make: *** [makefile:33: app] Error 1 > It could be 1) Your build failed, so libpetsc was not produced 2) arch-linux-c-debug is not the PETSC_ARCH that you built Thanks, Matt > Best regards > Seung Lee Kwon > > -- > Seung Lee Kwon, Ph.D.Candidate > Aerospace Structures and Materials Laboratory > Department of Mechanical and Aerospace Engineering > Seoul National University > Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > E-mail : ksl7912 at snu.ac.kr > Office : +82-2-880-7389 > C. P : +82-10-4695-1062 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ksl7912 at snu.ac.kr Wed Apr 26 05:42:30 2023 From: ksl7912 at snu.ac.kr (=?UTF-8?B?wq3qtozsirnrpqwgLyDtlZnsg50gLyDtla3qs7XsmrDso7zqs7XtlZnqs7w=?=) Date: Wed, 26 Apr 2023 19:42:30 +0900 Subject: [petsc-users] makefile error In-Reply-To: References: Message-ID: Thank you for your reply. This problem occurred after I downloaded mpich. Is there any way to solve this problem? Or Do I have to reinstall PETSc? Best regards Seung Lee Kwon 2023? 4? 26? (?) ?? 7:05, Matthew Knepley ?? ??: > On Wed, Apr 26, 2023 at 6:02?AM ???? / ?? / ??????? > wrote: > >> Dear developers >> >> Could you recommend the error messages below? 
>> >> /home/ksl/petsc/arch-linux-c-debug/bin/mpicxx -Wall -Wwrite-strings >> -Wno-strict-aliasing -Wno-unknown-pragmas -Wno-lto-type-mismatch >> -fstack-protector -fvisibility=hidden -g -O0 -I/home/ksl/petsc/include >> -I/home/ksl/petsc/arch-linux-c-debug/include -o app a1.o a2.o a3.o a4.o >> a5.o -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib >> -L/home/ksl/petsc/arch-linux-c-debug/lib >> -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib >> -L/home/ksl/petsc/arch-linux-c-debug/lib >> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9 >> -L/usr/lib/gcc/x86_64-linux-gnu/9 -lpetsc -llapack -lblas -lpthread -lm >> -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s >> -lquadmath -lstdc++ -ldl >> /usr/bin/ld: cannot find -lpetsc >> collect2: error: ld returned 1 exit status >> make: *** [makefile:33: app] Error 1 >> > > It could be > > 1) Your build failed, so libpetsc was not produced > > 2) arch-linux-c-debug is not the PETSC_ARCH that you built > > Thanks, > > Matt > > >> Best regards >> Seung Lee Kwon >> >> -- >> Seung Lee Kwon, Ph.D.Candidate >> Aerospace Structures and Materials Laboratory >> Department of Mechanical and Aerospace Engineering >> Seoul National University >> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >> E-mail : ksl7912 at snu.ac.kr >> Office : +82-2-880-7389 >> C. P : +82-10-4695-1062 >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > -- Seung Lee Kwon, Ph.D.Candidate Aerospace Structures and Materials Laboratory Department of Mechanical and Aerospace Engineering Seoul National University Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 E-mail : ksl7912 at snu.ac.kr Office : +82-2-880-7389 C. P : +82-10-4695-1062 -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Wed Apr 26 06:05:05 2023 From: mfadams at lbl.gov (Mark Adams) Date: Wed, 26 Apr 2023 07:05:05 -0400 Subject: [petsc-users] makefile error In-Reply-To: References: Message-ID: Remove the "arch" directory to start with a clean build. If the new build fails then send use the configure and make .log files On Wed, Apr 26, 2023 at 6:42?AM ???? / ?? / ??????? wrote: > Thank you for your reply. > > This problem occurred after I downloaded mpich. > > Is there any way to solve this problem? Or Do I have to reinstall PETSc? > > Best regards > Seung Lee Kwon > > 2023? 4? 26? (?) ?? 7:05, Matthew Knepley ?? ??: > >> On Wed, Apr 26, 2023 at 6:02?AM ???? / ?? / ??????? >> wrote: >> >>> Dear developers >>> >>> Could you recommend the error messages below? 
>>> >>> /home/ksl/petsc/arch-linux-c-debug/bin/mpicxx -Wall -Wwrite-strings >>> -Wno-strict-aliasing -Wno-unknown-pragmas -Wno-lto-type-mismatch >>> -fstack-protector -fvisibility=hidden -g -O0 -I/home/ksl/petsc/include >>> -I/home/ksl/petsc/arch-linux-c-debug/include -o app a1.o a2.o a3.o a4.o >>> a5.o -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib >>> -L/home/ksl/petsc/arch-linux-c-debug/lib >>> -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib >>> -L/home/ksl/petsc/arch-linux-c-debug/lib >>> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9 >>> -L/usr/lib/gcc/x86_64-linux-gnu/9 -lpetsc -llapack -lblas -lpthread -lm >>> -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s >>> -lquadmath -lstdc++ -ldl >>> /usr/bin/ld: cannot find -lpetsc >>> collect2: error: ld returned 1 exit status >>> make: *** [makefile:33: app] Error 1 >>> >> >> It could be >> >> 1) Your build failed, so libpetsc was not produced >> >> 2) arch-linux-c-debug is not the PETSC_ARCH that you built >> >> Thanks, >> >> Matt >> >> >>> Best regards >>> Seung Lee Kwon >>> >>> -- >>> Seung Lee Kwon, Ph.D.Candidate >>> Aerospace Structures and Materials Laboratory >>> Department of Mechanical and Aerospace Engineering >>> Seoul National University >>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >>> E-mail : ksl7912 at snu.ac.kr >>> Office : +82-2-880-7389 >>> C. P : +82-10-4695-1062 >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > > > -- > Seung Lee Kwon, Ph.D.Candidate > Aerospace Structures and Materials Laboratory > Department of Mechanical and Aerospace Engineering > Seoul National University > Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > E-mail : ksl7912 at snu.ac.kr > Office : +82-2-880-7389 > C. P : +82-10-4695-1062 > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Apr 26 06:05:34 2023 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 26 Apr 2023 07:05:34 -0400 Subject: [petsc-users] makefile error In-Reply-To: References: Message-ID: On Wed, Apr 26, 2023 at 6:42?AM ???? / ?? / ??????? wrote: > Thank you for your reply. > > This problem occurred after I downloaded mpich. > This is not the reason. The error say "libpetsc is not located in /home/ksl/petsc/arch-linux-c-debug/lib". So there is some reason it is not there. > Is there any way to solve this problem? Or Do I have to reinstall PETSc? > Just rebuild cd $PETSC_DIR make Thanks, Matt > Best regards > Seung Lee Kwon > > 2023? 4? 26? (?) ?? 7:05, Matthew Knepley ?? ??: > >> On Wed, Apr 26, 2023 at 6:02?AM ???? / ?? / ??????? >> wrote: >> >>> Dear developers >>> >>> Could you recommend the error messages below? 
>>> >>> /home/ksl/petsc/arch-linux-c-debug/bin/mpicxx -Wall -Wwrite-strings >>> -Wno-strict-aliasing -Wno-unknown-pragmas -Wno-lto-type-mismatch >>> -fstack-protector -fvisibility=hidden -g -O0 -I/home/ksl/petsc/include >>> -I/home/ksl/petsc/arch-linux-c-debug/include -o app a1.o a2.o a3.o a4.o >>> a5.o -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib >>> -L/home/ksl/petsc/arch-linux-c-debug/lib >>> -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib >>> -L/home/ksl/petsc/arch-linux-c-debug/lib >>> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9 >>> -L/usr/lib/gcc/x86_64-linux-gnu/9 -lpetsc -llapack -lblas -lpthread -lm >>> -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s >>> -lquadmath -lstdc++ -ldl >>> /usr/bin/ld: cannot find -lpetsc >>> collect2: error: ld returned 1 exit status >>> make: *** [makefile:33: app] Error 1 >>> >> >> It could be >> >> 1) Your build failed, so libpetsc was not produced >> >> 2) arch-linux-c-debug is not the PETSC_ARCH that you built >> >> Thanks, >> >> Matt >> >> >>> Best regards >>> Seung Lee Kwon >>> >>> -- >>> Seung Lee Kwon, Ph.D.Candidate >>> Aerospace Structures and Materials Laboratory >>> Department of Mechanical and Aerospace Engineering >>> Seoul National University >>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 >>> E-mail : ksl7912 at snu.ac.kr >>> Office : +82-2-880-7389 >>> C. P : +82-10-4695-1062 >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> https://www.cse.buffalo.edu/~knepley/ >> >> > > > -- > Seung Lee Kwon, Ph.D.Candidate > Aerospace Structures and Materials Laboratory > Department of Mechanical and Aerospace Engineering > Seoul National University > Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826 > E-mail : ksl7912 at snu.ac.kr > Office : +82-2-880-7389 > C. P : +82-10-4695-1062 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at petsc.dev Wed Apr 26 09:10:42 2023 From: bsmith at petsc.dev (Barry Smith) Date: Wed, 26 Apr 2023 10:10:42 -0400 Subject: [petsc-users] Fieldsplit with redistribute In-Reply-To: References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> <0DB90ED8-6BD0-431E-B592-37B89DE3DE28@petsc.dev> <88F183A6-6553-4288-89A5-DD1903EA9956@petsc.dev> Message-ID: What happens if you pass in your IS using directly PCFIELDSPLIT and not using PCREDISTRIBUTE? > On Apr 26, 2023, at 2:27 AM, Carl-Johan Thore wrote: > > Hi again, > > I now think I got my IS in order (it?s just one IS because unlike in your ex84.c I don?t provide the complement of the IS explicitly but let fieldsplit compute it, but ex84.c works fine if I do the same there). > As before, my code works with pcredistribute and pcfieldsplit on 1 core. I then try > with 2 cores. First I check the IS in Matlab, and it looks fine as far as I can tell, with identical content > as in the 1-core case, which it should?. Then I try running the code, but it fails with (the first two lines are mine) > > ? > [1]ISdata: min= 6301, max=10639, freeudofs= 3244. min= 6293, max=10639, freedofs= 4347: row 2629 > [0]ISdata: min= 139, max= 6292, freeudofs= 4459. 
min= 0, max= 6292, freedofs= 6293: row 2629 > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Argument out of range > [0]PETSC ERROR: Index 3748's value 5320 is larger than maximum given 5320 > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > [0]PETSC ERROR: Petsc Development GIT revision: v3.19.0-236-gee39b84cc03 GIT Date: 2023-04-23 18:43:23 -0400 > [0]PETSC ERROR: topopt on a arch-linux-c-debug named win01705 by carlthore Wed Apr 26 08:01:49 2023 > [0]PETSC ERROR: Configure options -f --with-cuda --with-cusp --download-scalapack --download-hdf5 --download-zlib --download-mumps --download-parmetis --download-metis --download-ptscotch --download-hypre --download-spai > [0]PETSC ERROR: #1 ISComplement() at /mnt/c/mathware/petsc/src/vec/is/is/utils/iscoloring.c:804 > [0]PETSC ERROR: #2 PCFieldSplitSetDefaults() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:544 > [0]PETSC ERROR: #3 PCSetUp_FieldSplit() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:587 > [0]PETSC ERROR: #4 PCSetUp() at /mnt/c/mathware/petsc/src/ksp/pc/interface/precon.c:994 > [0]PETSC ERROR: #5 KSPSetUp() at /mnt/c/mathware/petsc/src/ksp/ksp/interface/itfunc.c:406 > [0]PETSC ERROR: #6 PCSetUp_Redistribute() at /mnt/c/mathware/petsc/src/ksp/pc/impls/redistribute/redistribute.c:327 > [0]PETSC ERROR: #7 PCSetUp() at /mnt/c/mathware/petsc/src/ksp/pc/interface/precon.c:994 > [0]PETSC ERROR: #8 KSPSetUp() at /mnt/c/mathware/petsc/src/ksp/ksp/interface/itfunc.c:406 > [0]PETSC ERROR: #9 SetUpSolver() at /mnt/c/TOPet/fts_topopt_in_petsc-master/MixedStokes.cc:2650 > ? > > This is all the errors I see, I?m not sure why the error message is not written out in full. I guess the error is on > my side, with the IS still not being constructed correctly, but what do you think? > > Kind regards, > Carl-Johan > > > From: Barry Smith > > Sent: Monday, April 24, 2023 8:27 PM > To: Carl-Johan Thore > > Cc: PETSc > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > PCREDISTRIBUTE looks like it is not yet GPU friendly. This needs to be fixed, but it should be a separate fix and MR from my current one. > > Please just check if the PCREDISTRIBUTE followed by PCFIELDSPLIT just works on CPUs for your code. > > Barry > > > > On Apr 24, 2023, at 2:23 PM, Carl-Johan Thore > wrote: > > I wasn?t sure if I was going to bother you again with this, but since it looks like you plan to merge this with the > main branch (?) I thought it might be interesting to know that I?ve tried this with my code running with CUDA but got the > attached error. I suspect it?s related to the RHS red->b but I?m not sure. If I switch of redistribute my code runs fine > with CUDA. > > Kind regards, > Carl-Johan > > > > From: Carl-Johan Thore > Sent: Monday, April 24, 2023 5:08 PM > To: Barry Smith > > Subject: RE: [petsc-users] Fieldsplit with redistribute > > Ok, that worked great with my code on 1 core! (I haven?t been able to try the multi-core case yet due to issues with my own code > mentioned below) > > I?m not sure if you forgot to remove the freeing of the map object > outside or if I messed up with the pull somehow, but I had to outcomment that line manually: > ? 
> > /Carl-Johan > > > > From: Barry Smith > > Sent: Monday, April 24, 2023 4:26 PM > To: Carl-Johan Thore > > Cc: PETSc > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > The bug was mine; I was freeing the map object outside of the if () instead of inside. You can do > > git pull > make all > > and then try again. > > Barry > > > > On Apr 24, 2023, at 5:39 AM, Carl-Johan Thore > wrote: > > Hi Barry! > > First of all, thank you very very much for this! I was expecting maybe a few hints and pointers on how to proceed > with my work, but then you did a complete implementation ? > > Your code ran fine with ex84.cc . Unfortunately it crashed when running on my main code (a mixed Stokes solver). > When running on 1 core I get a crash which is maybe related to your code, so I?ve attached the error message for that > case. However, on multiple cores I think the issue is mainly that I?m not constructing the original IS correctly, so I?ll > look into that myself. > > Regarding reporting to https://gitlab.com/petsc/petsc/-/merge_requests/6366, should it be done here?: ? > > By the way, I managed yesterday to make a working implementation of my own example and was planning to send it > after cleaning it up and maybe optimizing a bit. I?ve attached it if your curious (or just want to have a good laugh :)) > > Kind regards, > Carl-Johan > > > From: Barry Smith > > Sent: Monday, April 24, 2023 12:49 AM > To: Carl-Johan Thore > > Cc: PETSc > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > I have added support for PCREDISTRIBUTE to propogate your PCFieldSplitSetIS() down to an inner PCFIELDSPLIT. You can > access it with > > git fetch > git checkout barry/2023-04-22/fieldsplit-fields-propogate > ./configure > make all check > > Take a look at src/ksp/ksp/tutorials/ex84.c and run with the options at the bottom of the file. > > Please let us know at https://gitlab.com/petsc/petsc/-/merge_requests/6366 if it works for you or you have any difficulties. > > Barry > > > > > On Apr 20, 2023, at 10:14 AM, Carl-Johan Thore > wrote: > > Great, thanks! I?ve attached the code, a makefile, and a 1-page power-point which hopefully explains > what I?m trying to do on this little toy-problem. There is obviously (?) something I need to add around > line 327 in the code in order to move the indices to the correct rank. > > Output should be something like this when running: > ? > > Let me know if you need any more info, or if the code is incomprehensible or so > (it?s long because I?ve copied a lot from redistribute.c) > > Kind regards, > Carl-Johan > > From: Barry Smith > > Sent: Thursday, April 20, 2023 3:17 PM > To: Carl-Johan Thore > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > Sure > > > On Apr 20, 2023, at 4:09 AM, Carl-Johan Thore > wrote: > > Hi Barry, > > In the conversation below you mentioned that I could send code to you to take a look. I?ve written > up what I think is a minimally working example for this. It?s almost there in the sense of distributing > the correct number of indices to the ranks to match the reduced matrix, but it?s the wrong indices. > Would it be okay if I sent you the code to have look? > > Kind regards, > Carl-Johan > > From: Barry Smith > > Sent: Sunday, April 16, 2023 10:31 PM > To: Carl-Johan Thore > > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > The manual page for ISEmbed is incomprehensible to me. 
Anyways no matter what, you need to know what degrees of freedom are removed by PCDistribute() in order to produce the reduced IS which is why I think you need information only available inside PCSetUp_Redistribute(). (Sorry it is PCSetUp_Redistribute() not PCApply_Redistribute()) > > Barry > > > > On Apr 16, 2023, at 3:36 PM, Carl-Johan Thore > wrote: > > Thanks for the quick reply Barry! > I have not tried the version with PCApply_Redistribute that you suggest, but I have a code that does roughly what you describe. It works when running on one rank, but fails on multiple ranks. I suspect the issue is with the use of ISEmbed as, quoting the PETSc-manual, "the resulting IS is sequential, since the index substitution it encodes is purely local" (admittedly I don't fully understand what that means). If you think using ISEmbed is not a good idea, I'll try PCApply_Redistribute() > From: Barry Smith > > Sent: 16 April 2023 21:11:18 > To: Carl-Johan Thore > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > There is no code to do this currently. > > I would start by building your IS for each split before the PCRedistribute and then adding to the PCApply_Redistribute() code that "fixes" these IS by "removing" the entries of the IS associated with removed degrees of freedom and then shifting the entries indices of the IS by taking into account the removed indices. But you have probably already been trying this? It does require digging directly into the PCApply_Redistribute() to get the needed information (which degrees of freedom are removed by the redistribute code), plus it requires shifting the MPI rank ownership of the entries of the IS in the same way the MPI rank ownership of the degrees of freedom of the vector are moved. > > If you have some code that you think should be doing this but doesn't work feel free to send it to us and we may be able to fix it. > > Barry > > > > On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users > wrote: > > > > Hello, > > I'm solving a blocksystem > > [A C; > > C' D], > > where D is not zero, using the PCFIELDSPLIT preconditioner and set the split using PetscFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS-routines). Can PETSc do this automatically? Or else, any hints? > > Kind regards, > > Carl-Johan > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.png Type: image/png Size: 79741 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image002.png Type: image/png Size: 70472 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... 
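(A sequential sketch of the IS "fix-up" described above: dropping the entries of a field IS that correspond to eliminated degrees of freedom, then shifting the surviving indices down. The sorted removed[] input — the global indices PCREDISTRIBUTE eliminates — is assumed given, and the MPI ownership move is not shown; all names are illustrative.)

```c
#include <petscis.h>

/* Sketch: reduce a field IS against a sorted list removed[] of
   nremoved eliminated global rows. Found entries are dropped; the
   rest are renumbered into the reduced system. */
static PetscErrorCode ReduceFieldIS(IS field, PetscInt nremoved, const PetscInt removed[], IS *reduced)
{
  const PetscInt *idx;
  PetscInt        n, nkept = 0, *kept;

  PetscFunctionBegin;
  PetscCall(ISGetLocalSize(field, &n));
  PetscCall(ISGetIndices(field, &idx));
  PetscCall(PetscMalloc1(n, &kept));
  for (PetscInt i = 0; i < n; i++) {
    PetscInt loc, shift;
    PetscCall(PetscFindInt(idx[i], nremoved, removed, &loc));
    if (loc >= 0) continue;         /* this dof was eliminated: drop it */
    shift = -(loc + 1);             /* number of removed dofs below idx[i] */
    kept[nkept++] = idx[i] - shift; /* renumber into the reduced system */
  }
  PetscCall(ISRestoreIndices(field, &idx));
  PetscCall(ISCreateGeneral(PETSC_COMM_SELF, nkept, kept, PETSC_OWN_POINTER, reduced));
  PetscFunctionReturn(0);
}
```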
From bsmith at petsc.dev Wed Apr 26 09:46:49 2023
From: bsmith at petsc.dev (Barry Smith)
Date: Wed, 26 Apr 2023 10:46:49 -0400
Subject: Re: [petsc-users] Fieldsplit with redistribute
In-Reply-To: References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> <0DB90ED8-6BD0-431E-B592-37B89DE3DE28@petsc.dev> <88F183A6-6553-4288-89A5-DD1903EA9956@petsc.dev>
Message-ID: <6A7D360F-999F-4222-8602-3BC1B7915323@petsc.dev>

> On Apr 26, 2023, at 10:32 AM, Carl-Johan Thore wrote:
>
> On 1 core there are no errors, but the solution to the linear system is wrong as expected.

Why is it expected to be wrong? Are you still using the MatZeroRowsColumns()? You should.

> On 2 cores I get this:
>
> [0]ISdata: min= 139, max= 6292, freeudofs= 4459.
min= 0, max= 6292, freedofs= 6293: row 2629 > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Argument out of range > [0]PETSC ERROR: Index 3748's value 5320 is larger than maximum given 5320 > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > [0]PETSC ERROR: Petsc Development GIT revision: v3.19.0-236-gee39b84cc03 GIT Date: 2023-04-23 18:43:23 -0400 > [0]PETSC ERROR: topopt on a arch-linux-c-debug named win01705 by carlthore Wed Apr 26 08:01:49 2023 > [0]PETSC ERROR: Configure options -f --with-cuda --with-cusp --download-scalapack --download-hdf5 --download-zlib --download-mumps --download-parmetis --download-metis --download-ptscotch --download-hypre --download-spai > [0]PETSC ERROR: #1 ISComplement() at /mnt/c/mathware/petsc/src/vec/is/is/utils/iscoloring.c:804 > [0]PETSC ERROR: #2 PCFieldSplitSetDefaults() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:544 > [0]PETSC ERROR: #3 PCSetUp_FieldSplit() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:587 > [0]PETSC ERROR: #4 PCSetUp() at /mnt/c/mathware/petsc/src/ksp/pc/interface/precon.c:994 > [0]PETSC ERROR: #5 KSPSetUp() at /mnt/c/mathware/petsc/src/ksp/ksp/interface/itfunc.c:406 > [0]PETSC ERROR: #6 PCSetUp_Redistribute() at /mnt/c/mathware/petsc/src/ksp/pc/impls/redistribute/redistribute.c:327 > [0]PETSC ERROR: #7 PCSetUp() at /mnt/c/mathware/petsc/src/ksp/pc/interface/precon.c:994 > [0]PETSC ERROR: #8 KSPSetUp() at /mnt/c/mathware/petsc/src/ksp/ksp/interface/itfunc.c:406 > [0]PETSC ERROR: #9 SetUpSolver() at /mnt/c/TOPet/fts_topopt_in_petsc-master/MixedStokes.cc:2650 > ? > > This is all the errors I see, I?m not sure why the error message is not written out in full. I guess the error is on > my side, with the IS still not being constructed correctly, but what do you think? > > Kind regards, > Carl-Johan > > > From: Barry Smith > > Sent: Monday, April 24, 2023 8:27 PM > To: Carl-Johan Thore > > Cc: PETSc > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > PCREDISTRIBUTE looks like it is not yet GPU friendly. This needs to be fixed, but it should be a separate fix and MR from my current one. > > Please just check if the PCREDISTRIBUTE followed by PCFIELDSPLIT just works on CPUs for your code. > > Barry > > > > > On Apr 24, 2023, at 2:23 PM, Carl-Johan Thore > wrote: > > I wasn?t sure if I was going to bother you again with this, but since it looks like you plan to merge this with the > main branch (?) I thought it might be interesting to know that I?ve tried this with my code running with CUDA but got the > attached error. I suspect it?s related to the RHS red->b but I?m not sure. If I switch of redistribute my code runs fine > with CUDA. > > Kind regards, > Carl-Johan > > > > From: Carl-Johan Thore > Sent: Monday, April 24, 2023 5:08 PM > To: Barry Smith > > Subject: RE: [petsc-users] Fieldsplit with redistribute > > Ok, that worked great with my code on 1 core! (I haven?t been able to try the multi-core case yet due to issues with my own code > mentioned below) > > I?m not sure if you forgot to remove the freeing of the map object > outside or if I messed up with the pull somehow, but I had to outcomment that line manually: > ? 
> > /Carl-Johan > > > > From: Barry Smith > > Sent: Monday, April 24, 2023 4:26 PM > To: Carl-Johan Thore > > Cc: PETSc > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > The bug was mine; I was freeing the map object outside of the if () instead of inside. You can do > > git pull > make all > > and then try again. > > Barry > > > > On Apr 24, 2023, at 5:39 AM, Carl-Johan Thore > wrote: > > Hi Barry! > > First of all, thank you very very much for this! I was expecting maybe a few hints and pointers on how to proceed > with my work, but then you did a complete implementation ? > > Your code ran fine with ex84.cc . Unfortunately it crashed when running on my main code (a mixed Stokes solver). > When running on 1 core I get a crash which is maybe related to your code, so I?ve attached the error message for that > case. However, on multiple cores I think the issue is mainly that I?m not constructing the original IS correctly, so I?ll > look into that myself. > > Regarding reporting to https://gitlab.com/petsc/petsc/-/merge_requests/6366, should it be done here?: ? > > By the way, I managed yesterday to make a working implementation of my own example and was planning to send it > after cleaning it up and maybe optimizing a bit. I?ve attached it if your curious (or just want to have a good laugh :)) > > Kind regards, > Carl-Johan > > > From: Barry Smith > > Sent: Monday, April 24, 2023 12:49 AM > To: Carl-Johan Thore > > Cc: PETSc > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > I have added support for PCREDISTRIBUTE to propogate your PCFieldSplitSetIS() down to an inner PCFIELDSPLIT. You can > access it with > > git fetch > git checkout barry/2023-04-22/fieldsplit-fields-propogate > ./configure > make all check > > Take a look at src/ksp/ksp/tutorials/ex84.c and run with the options at the bottom of the file. > > Please let us know at https://gitlab.com/petsc/petsc/-/merge_requests/6366 if it works for you or you have any difficulties. > > Barry > > > > > On Apr 20, 2023, at 10:14 AM, Carl-Johan Thore > wrote: > > Great, thanks! I?ve attached the code, a makefile, and a 1-page power-point which hopefully explains > what I?m trying to do on this little toy-problem. There is obviously (?) something I need to add around > line 327 in the code in order to move the indices to the correct rank. > > Output should be something like this when running: > ? > > Let me know if you need any more info, or if the code is incomprehensible or so > (it?s long because I?ve copied a lot from redistribute.c) > > Kind regards, > Carl-Johan > > From: Barry Smith > > Sent: Thursday, April 20, 2023 3:17 PM > To: Carl-Johan Thore > > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > Sure > > > On Apr 20, 2023, at 4:09 AM, Carl-Johan Thore > wrote: > > Hi Barry, > > In the conversation below you mentioned that I could send code to you to take a look. I?ve written > up what I think is a minimally working example for this. It?s almost there in the sense of distributing > the correct number of indices to the ranks to match the reduced matrix, but it?s the wrong indices. > Would it be okay if I sent you the code to have look? > > Kind regards, > Carl-Johan > > From: Barry Smith > > Sent: Sunday, April 16, 2023 10:31 PM > To: Carl-Johan Thore > > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Fieldsplit with redistribute > > > The manual page for ISEmbed is incomprehensible to me. 
Anyways, no matter what, you need to know which degrees of freedom are removed by PCREDISTRIBUTE() in order to produce the reduced IS, which is why I think you need information only available inside PCSetUp_Redistribute(). (Sorry, it is PCSetUp_Redistribute(), not PCApply_Redistribute().)
>
> Barry
>
> On Apr 16, 2023, at 3:36 PM, Carl-Johan Thore wrote:
>
> Thanks for the quick reply Barry!
> I have not tried the version with PCApply_Redistribute that you suggest, but I have a code that does roughly what you describe. It works when running on one rank, but fails on multiple ranks. I suspect the issue is with the use of ISEmbed as, quoting the PETSc manual, "the resulting IS is sequential, since the index substitution it encodes is purely local" (admittedly I don't fully understand what that means). If you think using ISEmbed is not a good idea, I'll try PCApply_Redistribute().
>
> From: Barry Smith
> Sent: 16 April 2023 21:11:18
> To: Carl-Johan Thore
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>
> There is no code to do this currently.
>
> I would start by building your IS for each split before the PCRedistribute, and then adding to the PCApply_Redistribute() code something that "fixes" these IS by removing the entries of the IS associated with removed degrees of freedom and then shifting the indices of the remaining entries to take the removed indices into account. But you have probably already been trying this? It does require digging directly into PCApply_Redistribute() to get the needed information (which degrees of freedom are removed by the redistribute code), plus it requires shifting the MPI rank ownership of the entries of the IS in the same way the MPI rank ownership of the degrees of freedom of the vector is moved.
>
> If you have some code that you think should be doing this but doesn't work, feel free to send it to us and we may be able to fix it.
>
> Barry
>
> On Apr 16, 2023, at 2:50 PM, Carl-Johan Thore via petsc-users wrote:
>
> Hello,
>
> I'm solving a block system
>
> [A C;
>  C' D],
>
> where D is not zero, using the PCFIELDSPLIT preconditioner, and I set the split using PCFieldSplitSetIS. This works very well until I try PCREDISTRIBUTE (which is attractive as I have many locked DOFs). I suspect something goes wrong when constructing the IS for the split (I've tried various things using the IS routines). Can PETSc do this automatically? Or else, any hints?
>
> Kind regards,
>
> Carl-Johan
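For illustration, the sequential part of the "remove and shift" bookkeeping described in the message above might look like the following sketch in C. This is not PETSc library code: the function name ShrinkAndShiftIS and the sorted removed[] array of dropped global rows are assumptions, and the cross-rank migration of IS entries that Barry mentions is omitted.

#include <petscis.h>

/* Drop the entries of "is" that appear in the sorted array removed[]
   (the rows the redistribution eliminates) and shift each survivor down
   by the number of removed rows below it. */
PetscErrorCode ShrinkAndShiftIS(IS is, PetscInt nremoved, const PetscInt removed[], IS *reduced)
{
  const PetscInt *idx;
  PetscInt        n, i, lo, hi, mid, nkeep = 0, *newidx;

  PetscCall(ISGetLocalSize(is, &n));
  PetscCall(ISGetIndices(is, &idx));
  PetscCall(PetscMalloc1(n, &newidx));
  for (i = 0; i < n; i++) {
    lo = 0; hi = nremoved;                /* binary search: count removed rows below idx[i] */
    while (lo < hi) {
      mid = lo + (hi - lo) / 2;
      if (removed[mid] < idx[i]) lo = mid + 1;
      else hi = mid;
    }
    if (lo < nremoved && removed[lo] == idx[i]) continue; /* idx[i] itself was removed */
    newidx[nkeep++] = idx[i] - lo;        /* shift by the number of removed rows below */
  }
  PetscCall(ISRestoreIndices(is, &idx));
  PetscCall(ISCreateGeneral(PetscObjectComm((PetscObject)is), nkeep, newidx, PETSC_OWN_POINTER, reduced));
  return 0;
}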
From ksl7912 at snu.ac.kr Wed Apr 26 09:47:38 2023
From: ksl7912 at snu.ac.kr (Seung Lee Kwon)
Date: Wed, 26 Apr 2023 23:47:38 +0900
Subject: [petsc-users] makefile error
In-Reply-To: References:
Message-ID:

Thank you. As you said, I ran the command, but I didn't remove the "arch" directory:

cd $PETSC_DIR
make

After that, the makefile works well. But errors occur, as below:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: See https://petsc.org/release/overview/linear_solve_table/ for possible LU and Cholesky solvers
[0]PETSC ERROR: Could not locate solver type mumps for factorization type LU and matrix type seqaij. Perhaps you must ./configure with --download-mumps
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.5, unknown
[0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksl Wed Apr 26 04:13:01 2023
[0]PETSC ERROR: Configure options --download-mpich
[0]PETSC ERROR: #1 MatGetFactor() at /home/ksl/petsc/src/mat/interface/matrix.c:4751
[0]PETSC ERROR: #2 PCFactorSetUpMatSolverType_Factor() at /home/ksl/petsc/src/ksp/pc/impls/factor/factimpl.c:10
[0]PETSC ERROR: #3 PCFactorSetUpMatSolverType() at /home/ksl/petsc/src/ksp/pc/impls/factor/factor.c:99
[0]PETSC ERROR: #4 main() at /home/ksl/Downloads/coding_friction/a1.c:737
[0]PETSC ERROR: No PETSc Option Table entries
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------

Then, should I download MUMPS as the message says?

On Wed, Apr 26, 2023 at 8:05 PM, Matthew Knepley wrote:

> On Wed, Apr 26, 2023 at 6:42 AM, Seung Lee Kwon wrote:
>
>> Thank you for your reply.
>>
>> This problem occurred after I downloaded mpich.
>
> This is not the reason. The error says "libpetsc is not located in /home/ksl/petsc/arch-linux-c-debug/lib". So there is some reason it is not there.
>
>> Is there any way to solve this problem? Or do I have to reinstall PETSc?
>
> Just rebuild
>
> cd $PETSC_DIR
> make
>
> Thanks,
>
> Matt
>
>> Best regards
>> Seung Lee Kwon
>>
>> On Wed, Apr 26, 2023 at 7:05 PM, Matthew Knepley wrote:
>>
>>> On Wed, Apr 26, 2023 at 6:02 AM, Seung Lee Kwon wrote:
>>>
>>>> Dear developers
>>>>
>>>> Could you comment on the error messages below?
>>>>
>>>> /home/ksl/petsc/arch-linux-c-debug/bin/mpicxx -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -Wno-lto-type-mismatch -fstack-protector -fvisibility=hidden -g -O0 -I/home/ksl/petsc/include -I/home/ksl/petsc/arch-linux-c-debug/include -o app a1.o a2.o a3.o a4.o a5.o -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib -L/home/ksl/petsc/arch-linux-c-debug/lib -Wl,-rpath,/home/ksl/petsc/arch-linux-c-debug/lib -L/home/ksl/petsc/arch-linux-c-debug/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/9 -L/usr/lib/gcc/x86_64-linux-gnu/9 -lpetsc -llapack -lblas -lpthread -lm -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl
>>>> /usr/bin/ld: cannot find -lpetsc
>>>> collect2: error: ld returned 1 exit status
>>>> make: *** [makefile:33: app] Error 1
>>>
>>> It could be
>>>
>>> 1) Your build failed, so libpetsc was not produced
>>>
>>> 2) arch-linux-c-debug is not the PETSC_ARCH that you built
>>>
>>> Thanks,
>>>
>>> Matt
>>>
>>>> Best regards
>>>> Seung Lee Kwon
>>>>
>>>> Seung Lee Kwon, Ph.D. Candidate
>>>> Aerospace Structures and Materials Laboratory
>>>> Department of Mechanical and Aerospace Engineering
>>>> Seoul National University
>>>> Building 300 Rm 503, Gwanak-ro 1, Gwanak-gu, Seoul, South Korea, 08826
>>>> E-mail : ksl7912 at snu.ac.kr
>>>> Office : +82-2-880-7389
>>>> C.P : +82-10-4695-1062
>>>
>>> --
>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/

From balay at mcs.anl.gov Wed Apr 26 09:54:06 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 26 Apr 2023 09:54:06 -0500 (CDT)
Subject: [petsc-users] makefile error
In-Reply-To: References:
Message-ID: <181a9ed4-f5fe-c903-65f1-e358fce3b4c8@mcs.anl.gov>

yes - if you need mumps:

./configure --download-mpich --download-mumps --download-scalapack
make
make check

Also we recommend using the latest petsc release - i.e. petsc-3.19.0

Satish

On Wed, 26 Apr 2023, Seung Lee Kwon wrote:
> Thank you. As you said, I ran the command, but I didn't remove the "arch" directory.
> [...]
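For reference, once PETSc has been reconfigured with MUMPS, the factorization that failed above can be requested either with -pc_factor_mat_solver_type mumps on the command line or in code. A minimal sketch (the Mat A and Vecs b, x are assumed to exist already):

KSP ksp;
PC  pc;
PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
PetscCall(KSPSetOperators(ksp, A, A));
PetscCall(KSPSetType(ksp, KSPPREONLY));                  /* direct solve only */
PetscCall(KSPGetPC(ksp, &pc));
PetscCall(PCSetType(pc, PCLU));
PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS)); /* needs --download-mumps */
PetscCall(KSPSolve(ksp, b, x));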
From bsmith at petsc.dev Wed Apr 26 10:21:02 2023
From: bsmith at petsc.dev (Barry Smith)
Date: Wed, 26 Apr 2023 11:21:02 -0400
Subject: [petsc-users] Fieldsplit with redistribute
In-Reply-To: References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> <0DB90ED8-6BD0-431E-B592-37B89DE3DE28@petsc.dev> <88F183A6-6553-4288-89A5-DD1903EA9956@petsc.dev> <6A7D360F-999F-4222-8602-3BC1B7915323@petsc.dev>
Message-ID:

Perhaps there is a misunderstanding. With the code I added you always just provide the IS for the original problem, without any concern for what rows will be zeroed and what rows will be redistributed. PCREDISTRIBUTE manages "fixing" things.

I want you to use PCFIELDSPLIT and not PCREDISTRIBUTE at all (so -pc_type fieldsplit) with your IS, and see if that works correctly sequentially and in parallel.
Then I want you to use -pc_type redistribute -redistribute_pc_type fieldsplit with your IS, sequentially and in parallel (note you do not change your code, or even recompile it, for all the cases).

Barry

> On Apr 26, 2023, at 10:58 AM, Carl-Johan Thore wrote:
>
> Because without redistribute, pcfieldsplit expects numbers pointing to rows in the "big" matrix, whereas when I construct my IS for redistribute it will have indices pointing to rows in the "reduced" matrix. Coming back to my small example below, I construct an IS with 0 2 4, as expected (?), for the reduced matrix. If I pass 0 2 4 to the big matrix I expect the wrong result. If I skip the part in the construction of my IS where 1 3 7 gets converted to 0 2 4, it works without redistribute.
>
> I'm using MatZeroRowsColumns in all cases, yes.
>
> From: Barry Smith
> Sent: Wednesday, April 26, 2023 4:47 PM
> To: Carl-Johan Thore
> Cc: PETSc
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>
> On Apr 26, 2023, at 10:32 AM, Carl-Johan Thore wrote:
>
> On 1 core there are no errors, but the solution to the linear system is wrong, as expected.
>
> Why is it expected to be wrong? Are you still using MatZeroRowsColumns()? You should.
>
> On 2 cores I get this:
>
> [0]ISdata: min= 139, max= 6292, freeudofs= 4459. min= 0, max= 6292, freedofs= 6293: row 2629
> [1]ISdata: min= 6301, max=10639, freeudofs= 3244. min= 6293, max=10639, freedofs= 4347: row 2629
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: Argument out of range
> [1]PETSC ERROR: Index 0's value 6301 is smaller than minimum given 19845
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Development GIT revision: v3.19.0-236-gee39b84cc03 GIT Date: 2023-04-23 18:43:23 -0400
> [1]PETSC ERROR: topopt on a arch-linux-c-debug named win01705 by carlthore Wed Apr 26 16:26:24 2023
> [1]PETSC ERROR: Configure options -f --with-cuda --with-cusp --download-scalapack --download-hdf5 --download-zlib --download-mumps --download-parmetis --download-metis --download-ptscotch --download-hypre --download-spai
> [1]PETSC ERROR: #1 ISComplement() at /mnt/c/mathware/petsc/src/vec/is/is/utils/iscoloring.c:803
> [1]PETSC ERROR: #2 PCFieldSplitSetDefaults() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:544
> [1]PETSC ERROR: #3 PCSetUp_FieldSplit() at /mnt/c/mathware/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c:587
> [1]PETSC ERROR: #4 PCSetUp() at /mnt/c/mathware/petsc/src/ksp/pc/interface/precon.c:994
> [1]PETSC ERROR: #5 KSPSetUp() at /mnt/c/mathware/petsc/src/ksp/ksp/interface/itfunc.c:406
> [1]PETSC ERROR: #6 SetUpSolver() at /mnt/c/TOPet/fts_topopt_in_petsc-master/MixedStokes.cc:2650
>
> From: Barry Smith
> Sent: Wednesday, April 26, 2023 4:11 PM
> To: Carl-Johan Thore
> Cc: PETSc
> Subject: Re: [petsc-users] Fieldsplit with redistribute
>
> What happens if you pass in your IS using PCFIELDSPLIT directly and not using PCREDISTRIBUTE?
>
> On Apr 26, 2023, at 2:27 AM, Carl-Johan Thore wrote:
>
> Hi again,
>
> I now think I got my IS in order (it's just one IS because, unlike in your ex84.c, I don't provide the complement of the IS explicitly but let fieldsplit compute it; ex84.c works fine if I do the same there). As before, my code works with pcredistribute and pcfieldsplit on 1 core. I then try with 2 cores. [...]
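In code, the experiment Barry is asking for is roughly the following sketch (isu is the velocity IS in the original, un-reduced numbering, as in the thread; ksp, b, x are assumed). Only the command line changes between the two runs; with Barry's branch, the IS set on the outer PC is propagated down to the inner fieldsplit:

/* run 1: -pc_type fieldsplit
   run 2: -pc_type redistribute -redistribute_pc_type fieldsplit */
PC pc;
PetscCall(KSPGetPC(ksp, &pc));
PetscCall(PCFieldSplitSetIS(pc, "0", isu)); /* split "1" is computed as the complement */
PetscCall(KSPSetFromOptions(ksp));
PetscCall(KSPSolve(ksp, b, x));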
From bsmith at petsc.dev Wed Apr 26 11:14:40 2023
From: bsmith at petsc.dev (Barry Smith)
Date: Wed, 26 Apr 2023 12:14:40 -0400
Subject: [petsc-users] Fieldsplit with redistribute
In-Reply-To: References: <35C00A41-36BA-45E6-AC79-7DDDDFCFED6C@petsc.dev> <0DB90ED8-6BD0-431E-B592-37B89DE3DE28@petsc.dev> <88F183A6-6553-4288-89A5-DD1903EA9956@petsc.dev> <6A7D360F-999F-4222-8602-3BC1B7915323@petsc.dev>
Message-ID: <9F35FD35-FCB5-4A53-8122-0D4349606FA0@petsc.dev>

Thanks, it seems we are moving to the same page.

> On Apr 26, 2023, at 11:55 AM, Carl-Johan Thore wrote:
>
> Ok yes, I thought one was supposed to provide the "reduced" indices. When I did that, the sequential case worked. But it is clearly much more convenient to only have to provide the original indices.
>
> Just to be very clear: to me, "the IS for the original problem without any concern ..." is an IS containing ALL velocity DOFs, including locked ones, and pcfieldsplit should compute the pressure DOFs automatically as the complement.

Yes

> If I provide this IS with pcfieldsplit WITHOUT PCREDISTRIBUTE, everything works fine both sequentially and in parallel.
>
> If I provide this IS with pcfieldsplit WITH PCREDISTRIBUTE, neither the sequential nor the parallel case works. The attached error message comes from the sequential case.

I'll take a look and see what might be causing this. It seems more likely a problem on our end of things.

> Not sure it is of any use, but here is how I construct the original IS, called isu, and pass it to pcfieldsplit:
>
> DMStagStencil stencil[24];
> for (PetscInt i=0; i<8; i++) {
>   for (PetscInt j=0; j<3; j++) {
>     stencil[j+i*3].loc = enodes[i];
>     stencil[j+i*3].c = j;
>     stencil[j+i*3].i = 0;
>     stencil[j+i*3].j = 0;
>     stencil[j+i*3].k = 0;
>   }
> }
> ierr = DMStagCreateISFromStencils(dm_state,24,stencil,&isu); CHKERRQ(ierr);
> ierr = PCFieldSplitSetIS(ipc,"0",isu); CHKERRQ(ierr);
> [...]
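As context for the tracebacks in this thread: the complement that PCFieldSplitSetDefaults computes internally goes through ISComplement, which requires a sorted IS whose entries all lie inside [nmin, nmax); an entry equal to nmax produces exactly the "larger than maximum given" message seen above. A sketch of computing the pressure IS by hand under those preconditions (variable names assumed):

IS       isu_sorted, isp;
PetscInt rstart, rend;
PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
PetscCall(ISDuplicate(isu, &isu_sorted));
PetscCall(ISSort(isu_sorted));                            /* ISComplement needs sorted input */
PetscCall(ISComplement(isu_sorted, rstart, rend, &isp));  /* local rows not in isu */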
From grezende.oliv at gmail.com Wed Apr 26 12:05:32 2023
From: grezende.oliv at gmail.com (Guilherme Lima)
Date: Wed, 26 Apr 2023 14:05:32 -0300
Subject: [petsc-users] GLvis
Message-ID:

I'm trying to use GLvis with MFEM and PETSc. I already know the OpenGL directory, and configuring PETSc with "--download-glvis" requires me to use "--with-opengl-include=directory".
But it answers with "UNABLE to CONFIGURE with GIVEN OPTIONS". Am I doing this wrong, or is there a different way to use these three together correctly?

From balay at mcs.anl.gov Wed Apr 26 12:32:49 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 26 Apr 2023 12:32:49 -0500 (CDT)
Subject: [petsc-users] GLvis
In-Reply-To: References:
Message-ID: <00d9a9e2-7852-9784-1283-cc01d4c823d7@mcs.anl.gov>

> --with-opengl-dir=/usr/include/GL

Try: --with-opengl=1

Assuming you have these installed

balay at petsc-gpu-01:~$ dpkg -S /usr/lib/x86_64-linux-gnu/libGL.so /usr/lib/x86_64-linux-gnu/libGLU.so
libgl-dev:amd64: /usr/lib/x86_64-linux-gnu/libGL.so
libglu1-mesa-dev:amd64: /usr/lib/x86_64-linux-gnu/libGLU.so

Satish

On Wed, 26 Apr 2023, Guilherme Lima wrote:
> I'm trying to use GLvis with MFEM and PETSc. [...]

From maxime.bouyges at gmail.com Wed Apr 26 15:07:01 2023
From: maxime.bouyges at gmail.com (Maxime Bouyges)
Date: Wed, 26 Apr 2023 22:07:01 +0200
Subject: [petsc-users] MatSetValuesCOO after MatDuplicate
Message-ID:

Dear PETSc developers,

I am trying to use the MatSetValuesCOO function (very appropriate and performant for my case), but I am encountering a problem when I use it on a Mat obtained with MatDuplicate. It seems that the non-zero pattern is preserved by MatDuplicate, but not the "COO information". Here are the few steps to reproduce the problem:

MatCreate(comm, A)
MatSetUp(A)
MatSetPreallocationCOO(A, ncoo, coo_i, coo_j)
MatSetValuesCOO(A, coo_v, INSERT_VALUES)   # -> works ok
MatDuplicate(A, MAT_DO_NOT_COPY_VALUES, B)
MatSetValuesCOO(B, coo_v, INSERT_VALUES)   # -> seg-fault

Is this an expected behaviour? Of course, if I call MatSetPreallocationCOO again on the duplicated matrix it's ok, but the performance is lost in my case. Is there a way to fix it? Thank you in advance.

Best regards,

Maxime Bouyges
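A sketch of the workaround mentioned above, for readers hitting the same thing: MatDuplicate() does not carry the COO mapping over, so repeat the preallocation call on the duplicate before setting values (ncoo, coo_i, coo_j, coo_v as in the report):

Mat B;
PetscCall(MatDuplicate(A, MAT_DO_NOT_COPY_VALUES, &B));
PetscCall(MatSetPreallocationCOO(B, ncoo, coo_i, coo_j)); /* rebuild the COO mapping */
PetscCall(MatSetValuesCOO(B, coo_v, INSERT_VALUES));      /* now safe */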
From bsmith at petsc.dev Wed Apr 26 15:58:42 2023
From: bsmith at petsc.dev (Barry Smith)
Date: Wed, 26 Apr 2023 16:58:42 -0400
Subject: [petsc-users] MatSetValuesCOO after MatDuplicate
In-Reply-To: References:
Message-ID: <4D64AB6B-D7BC-4256-A714-29907922A728@petsc.dev>

Yes, it looks like a bug since no one tested this situation.

MatSetPreallocationCOO() is pretty heavy memory-wise. It essentially keeps a copy of all the coo_i, coo_j indices within the Mat as well as the usual matrix information. So in your scenario, you will have two copies of all this stuff; lots of memory. Is this really what you need?

> On Apr 26, 2023, at 4:07 PM, Maxime Bouyges wrote:
>
> Dear PETSc developers,
> [...]

From junchao.zhang at gmail.com Wed Apr 26 16:26:45 2023
From: junchao.zhang at gmail.com (Junchao Zhang)
Date: Wed, 26 Apr 2023 16:26:45 -0500
Subject: [petsc-users] MatSetValuesCOO after MatDuplicate
In-Reply-To: <4D64AB6B-D7BC-4256-A714-29907922A728@petsc.dev>
References: <4D64AB6B-D7BC-4256-A714-29907922A728@petsc.dev>
Message-ID:

It sounds like we should do reference counting on the internal data structures used by COO.
--Junchao Zhang

On Wed, Apr 26, 2023 at 3:59 PM Barry Smith wrote:
> Yes, it looks like a bug since no one tested this situation.
> [...]
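A sketch of the reference-counting idea above, purely illustrative (all names are invented; this is not the actual PETSc internal layout):

#include <stdlib.h>

typedef struct {
  int     refct;          /* number of matrices sharing this mapping */
  size_t  ncoo;
  int    *coo_i, *coo_j;  /* cached COO row/column indices */
} CooMapping;

static CooMapping *coo_share(CooMapping *m) { m->refct++; return m; }

static void coo_release(CooMapping *m)
{
  if (m && --m->refct == 0) {
    free(m->coo_i);
    free(m->coo_j);
    free(m);
  }
}

/* a duplicate would then do  B->coo = coo_share(A->coo);
   and a destroy would do     coo_release(mat->coo);      */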
From balay at mcs.anl.gov Wed Apr 26 17:18:23 2023
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 26 Apr 2023 17:18:23 -0500 (CDT)
Subject: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant
In-Reply-To: References: <776fdd8e-c22a-0b65-783e-364e43a302c5@mcs.anl.gov>
Message-ID: <5ee45055-30b2-f1e0-1040-2d1a01eb1578@mcs.anl.gov>

Change at https://gitlab.com/petsc/petsc/-/merge_requests/6382

Satish

On Tue, 18 Apr 2023, Satish Balay via petsc-users wrote:

> I think it's best if configure can handle this automatically (check for broken compilers). Until then - perhaps we should use:
>
> diff --git a/include/petsc/private/vecimpl.h b/include/petsc/private/vecimpl.h
> index dd75dbbc00b..dd9ef6791c5 100644
> --- a/include/petsc/private/vecimpl.h
> +++ b/include/petsc/private/vecimpl.h
> @@ -110,12 +110,7 @@ struct _VecOps {
>    PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode);
>  };
>
> -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11))
> -  #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23)
> -    // static_assert() is a keyword since C23, before that defined as macro in assert.h
> -    #include <assert.h>
> -  #endif
> -
> +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17))
>  static_assert(offsetof(struct _VecOps, duplicate) == sizeof(void (*)(void)) * VECOP_DUPLICATE, "");
>  static_assert(offsetof(struct _VecOps, set) == sizeof(void (*)(void)) * VECOP_SET, "");
>  static_assert(offsetof(struct _VecOps, view) == sizeof(void (*)(void)) * VECOP_VIEW, "");
>
> Or just:
>
> +#if defined(offsetof) && defined(__cplusplus)
>
> Satish
>
> On Tue, 18 Apr 2023, Jacob Faibussowitsch wrote:
>
>> This is a bug in GCC 9. Can you try the following:
>>
>> $ make clean
>> $ make CFLAGS+='-std=gnu11'
>>
>> Best regards,
>>
>> Jacob Faibussowitsch
>> (Jacob Fai - booss - oh - vitch)
>>
>>> On Apr 18, 2023, at 10:07, Zongze Yang wrote:
>>>
>>> No, it doesn't. It has the same problem. I just did `make clean` and then `make`. Do I need to reconfigure?
>>>
>>> Best wishes,
>>> Zongze
>>>
>>> On Tue, 18 Apr 2023 at 21:09, Satish Balay wrote:
>>> Does this change work?
>>>
>>> diff --git a/include/petsc/private/vecimpl.h b/include/petsc/private/vecimpl.h
>>> index dd75dbbc00b..168540b546e 100644
>>> --- a/include/petsc/private/vecimpl.h
>>> +++ b/include/petsc/private/vecimpl.h
>>> @@ -110,7 +110,7 @@ struct _VecOps {
>>>    PetscErrorCode (*setvaluescoo)(Vec, const PetscScalar[], InsertMode);
>>>  };
>>>
>>> -#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 11))
>>> +#if defined(offsetof) && (defined(__cplusplus) || (PETSC_C_VERSION >= 17))
>>>  #if (PETSC_C_VERSION >= 11) && (PETSC_C_VERSION < 23)
>>>  // static_assert() is a keyword since C23, before that defined as macro in assert.h
>>>  #include <assert.h>
>>>
>>> Satish
>>>
>>> On Tue, 18 Apr 2023, Zongze Yang wrote:
>>>
>>>> Hi, I am building petsc with gcc@9.5.0 and found the following error:
>>>>
>>>> In file included from /usr/include/alloca.h:25,
>>>>                  from /usr/include/stdlib.h:497,
>>>>                  from /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsys.h:1395,
>>>>                  from /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petscsf.h:7,
>>>>                  from /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:1:
>>>> /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:15: error: expected declaration specifiers or '...' before '__builtin_offsetof'
>>>>   124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void (*)(void)) * VECOP_LOADNATIVE, "");
>>>>       |               ^~~~~~~~
>>>> In file included from /home/lrtfm/opt/firedrake/complex-int32/petsc/src/vec/is/sf/interface/vscat.c:7:
>>>> /home/lrtfm/opt/firedrake/complex-int32/petsc/include/petsc/private/vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant
>>>>   124 | static_assert(offsetof(struct _VecOps, loadnative) == sizeof(void (*)(void)) * VECOP_LOADNATIVE, "");
>>>>       |                                                                                                  ^~
>>>>
>>>> Could someone give me some hints to fix it? The configure.log and make.log are attached.
>>>>
>>>> Best wishes,
>>>> Zongze
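In isolation, the guarded construct looks like the standalone sketch below (not the PETSc source): a file-scope compile-time check that a function-pointer table stays in sync with a table of slot numbers. It is valid C11 (static_assert comes from assert.h before C23) and C++, but as the thread shows, some compiler setups choke on it at file scope, hence narrowing the guard to C17/C++.

#include <assert.h>  /* defines static_assert as a macro before C23 */
#include <stddef.h>  /* offsetof */

struct ops {
  int (*duplicate)(void); /* slot 0 */
  int (*set)(void);       /* slot 1 */
};

static_assert(offsetof(struct ops, set) == sizeof(int (*)(void)) * 1, "ops table out of sync");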
From maxime.bouyges at gmail.com Thu Apr 27 14:15:53 2023
From: maxime.bouyges at gmail.com (Maxime Bouyges)
Date: Thu, 27 Apr 2023 21:15:53 +0200
Subject: [petsc-users] MatSetValuesCOO after MatDuplicate
In-Reply-To: References: <4D64AB6B-D7BC-4256-A714-29907922A728@petsc.dev>
Message-ID:

Thanks for the prompt confirmation! I have to admit that I hadn't yet tested the memory performance of using COO instead of the classical MatSetValues. I will keep both approaches in my code and do more performance checks (CPU and memory) before making a decision. In any case, calling MatSetPreallocationCOO "again" after MatDuplicate works, so I can live with that. I just wanted to be sure that it was a "bug" (or a missing feature, I would say) and not a misuse on my part.

If you are curious, here is the context. I am using the Julia language to solve an ODE with the DifferentialEquations.jl package (https://github.com/SciML/DifferentialEquations.jl). The system jacobian matrix is a Julia SparseMatrix (https://docs.julialang.org/en/v1/stdlib/SparseArrays/) with CSC format. I am using PETSc as a backend for the linear algebra (with https://github.com/bmxam/PetscWrap.jl). So at some point I have to fill a PETSc matrix with the values of a Julia CSC sparse matrix. Recovering the COO information from the Julia matrix is trivial, and using MatSetValuesCOO with this information seems very efficient. However, the ODE solver does several matrix duplications (wrapped as MatDuplicate in my case), and that's why I stumbled across this bug. But as explained above, 1) I can call MatSetPreallocationCOO each time MatDuplicate is called, and 2) I can keep the classical MatSetValues and use another way to fill the PETSc matrix.

Thanks again for your quick answer (and for the great library ;))!

Best regards,

Maxime Bouyges

On 26/04/2023 23:26, Junchao Zhang wrote:
> It sounds like we should do reference counting on the internal data structures used by COO.
> --Junchao Zhang
> [...]
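The CSC-to-COO expansion Maxime calls trivial is indeed just two loops; a sketch in C (0-based CSC arrays colptr/rowval/nzval are assumed; Julia stores 1-based indices, so a wrapper must subtract 1):

#include <petscsys.h> /* PetscInt, PetscScalar */

/* Expand an n-column CSC matrix into COO triplets for MatSetPreallocationCOO;
   nnz = colptr[n], and the coo_* arrays must have room for nnz entries. */
void csc_to_coo(PetscInt n, const PetscInt colptr[], const PetscInt rowval[],
                const PetscScalar nzval[],
                PetscInt coo_i[], PetscInt coo_j[], PetscScalar coo_v[])
{
  for (PetscInt j = 0; j < n; j++) {
    for (PetscInt k = colptr[j]; k < colptr[j + 1]; k++) {
      coo_i[k] = rowval[k]; /* row of the k-th stored entry */
      coo_j[k] = j;         /* its column */
      coo_v[k] = nzval[k];
    }
  }
}

Since the CSC storage order never changes, coo_i/coo_j need to be built only once; for subsequent MatSetValuesCOO calls the nzval array can be passed directly as the values.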
> > Of course if I call again MatSetPreallocationCOO on the duplicated matrix it's ok, but the performance is lost in my case. Is there a way to fix it? Thank you in advance. > > Best regards, > > Maxime Bouyges

From leonardo.mutti01 at universitadipavia.it Fri Apr 28 11:07:23 2023 From: leonardo.mutti01 at universitadipavia.it (LEONARDO MUTTI) Date: Fri, 28 Apr 2023 18:07:23 +0200 Subject: [petsc-users] Understanding index sets for PCGASM Message-ID: Hello. I am having a hard time understanding the index sets to feed PCGASMSetSubdomains, and I am working in Fortran (as a PETSc novice). To get more intuition on how the IS objects behave I tried the following minimal (non) working example, which should tile a 16x16 matrix into 16 square, non-overlapping submatrices:

#include <petsc/finclude/petscmat.h>
#include <petsc/finclude/petscksp.h>
#include <petsc/finclude/petscpc.h>
      USE petscmat
      USE petscksp
      USE petscpc

      Mat :: A
      PetscInt :: M, NSubx, dof, overlap, NSub
      INTEGER :: I,J
      PetscErrorCode :: ierr
      PetscScalar :: v
      KSP :: ksp
      PC :: pc
      IS :: subdomains_IS, inflated_IS

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

!-----Create a dummy matrix
      M = 16
      call MatCreateAIJ(MPI_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
     &                  M, M,
     &                  PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
     &                  PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER,
     &                  A, ierr)

      DO I=1,M
         DO J=1,M
            v = I*J
            CALL MatSetValue(A, I-1, J-1, v,
     &                       INSERT_VALUES, ierr)
         END DO
      END DO

      call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
      call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)

!-----Create KSP and PC
      call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
      call KSPSetOperators(ksp, A, A, ierr)
      call KSPSetType(ksp, "bcgs", ierr)
      call KSPGetPC(ksp, pc, ierr)
      call KSPSetUp(ksp, ierr)
      call PCSetType(pc, PCGASM, ierr)
      call PCSetUp(pc, ierr)

!-----GASM setup
      NSubx = 4
      dof = 1
      overlap = 0

      call PCGASMCreateSubdomains2D(pc,
     &                              M, M,
     &                              NSubx, NSubx,
     &                              dof, overlap,
     &                              NSub, subdomains_IS, inflated_IS, ierr)

      call ISView(subdomains_IS, PETSC_VIEWER_STDOUT_WORLD, ierr)

      call KSPDestroy(ksp, ierr)
      call PetscFinalize(ierr)

Running this on one processor, I get NSub = 4. If PCASM and PCASMCreateSubdomains2D are used instead, I get NSub = 16 as expected. Moreover, at the end I get "forrtl: severe (157): Program Exception - access violation". So: 1) why do I get two different results with ASM and GASM? 2) why do I get an access violation and how can I solve this? In fact, in C, subdomains_IS and inflated_IS should be pointers to IS objects. As I see on the Fortran interface, the arguments to PCGASMCreateSubdomains2D are IS objects:

      subroutine PCGASMCreateSubdomains2D(a,b,c,d,e,f,g,h,i,j,z)
        import tPC,tIS
        PC a ! PC
        PetscInt b ! PetscInt
        PetscInt c ! PetscInt
        PetscInt d ! PetscInt
        PetscInt e ! PetscInt
        PetscInt f ! PetscInt
        PetscInt g ! PetscInt
        PetscInt h ! PetscInt
        IS i ! IS
        IS j ! IS
        PetscErrorCode z
      end subroutine PCGASMCreateSubdomains2D

Thus: 3) what should be inside e.g. subdomains_IS? I expect it to contain, for every created subdomain, the list of rows and columns defining the subblock in the matrix, am I right? Context: I have a block-tridiagonal system arising from space-time finite elements, and I want to solve it with a GMRES+PCGASM preconditioner, where each overlapping submatrix is on the diagonal and of size 3x3 blocks (and spanning multiple processes). This is PETSc 3.17.1 on Windows. Thanks in advance, Leonardo
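On question 3: for PCGASM, each entry of subdomains_IS is a single IS listing the global degree-of-freedom indices that belong to one subdomain, not separate row and column lists. As a hedged C sketch only (the helper name and the row-major numbering of the 16x16 grid are illustrative assumptions, not something taken from this thread), one non-overlapping tile could be assembled like this:

#include <petsc.h>

/* Collect the global dof indices of the w x h tile whose lower-left grid
   point is (ox,oy), assuming one dof per point and row-major numbering
   of an M x M structured grid. */
static PetscErrorCode BuildTileIS(MPI_Comm comm, PetscInt M, PetscInt ox, PetscInt oy,
                                  PetscInt w, PetscInt h, IS *tile)
{
  PetscInt *idx, k = 0;

  PetscFunctionBeginUser;
  PetscCall(PetscMalloc1(w * h, &idx));
  for (PetscInt j = oy; j < oy + h; ++j)
    for (PetscInt i = ox; i < ox + w; ++i) idx[k++] = j * M + i;
  /* the IS takes ownership of idx and frees it on ISDestroy() */
  PetscCall(ISCreateGeneral(comm, w * h, idx, PETSC_OWN_POINTER, tile));
  PetscFunctionReturn(0);
}

Sixteen such index sets, one per 4x4 tile, passed to PCGASMSetSubdomains() would describe the tiling attempted above; with zero overlap the outer (inflated) index sets can simply repeat the inner ones.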
From samar.khatiwala at earth.ox.ac.uk Fri Apr 28 11:43:44 2023 From: samar.khatiwala at earth.ox.ac.uk (Samar Khatiwala) Date: Fri, 28 Apr 2023 16:43:44 +0000 Subject: [petsc-users] PETSc build asks for network connections In-Reply-To: References: Message-ID: <851A0B82-83E7-40F1-BC7F-60BB7AB29B00@earth.ox.ac.uk> Hi, I realize this is an old thread but I have some recent experience based on setting up an M2 Mac that might be relevant. I was dreading moving to Apple Silicon Macs because of issues like these but I actually did not run into this particular problem. While I can't be certain, I think it is because in the process of installing another piece of software I had to modify Apple's security restrictions to make them more permissive. Details of how to do this are in the following and it takes only a minute to implement: https://rogueamoeba.com/support/knowledgebase/?showArticle=ACE-StepByStep&product=Audio+Hijack Incidentally, I built mpich from source followed by PETSc in the usual way. Something else that might be helpful for others is my experience getting ifort to work. (My needs were somewhat specific: mixed fortran/C code, preferably ifort, and avoid package managers.) The intel OneAPI installer ran smoothly (via rosetta) but when building mpich (or PETSc) I ran into an obvious problem: clang produces arm64 object files while ifort produces x86 ones. I couldn't manage to set the correct CFLAGS to tell clang to target x86. Instead, the (simpler) solution turned out to be (1) the fact that all the executables in Apple's toolchain are universal binaries, and (2) the 'arch' command can let you run programs for either of the two architectures. Specifically, executing in the terminal:

arch -x86_64 bash

starts a bash shell and *every* program that is then run from that shell is automatically the x86 version. So I could then do:

FC=ifort
./configure --prefix=/usr/local/mpichx86 --enable-two-level-namespace
make
sudo make install

and get an x86 build of mpich which I could then use (from the same shell or a new one started as above) to build [x86] PETSc. Except for some annoying warnings from MKL (I think because it is confused about what architecture it is running on) everything runs smoothly and - even in emulation - surprisingly fast. Sorry if this is all well known and already documented on PETSc's install page. Samar

On Mar 20, 2023, at 6:39 AM, Pierre Jolivet wrote: On 20 Mar 2023, at 2:45 AM, Barry Smith wrote: I found a bit more information in gmakefile.test which has the magic sauce used by make test to stop the firewall popups while running the test suite.

# MACOS FIREWALL HANDLING
# - if run with MACOS_FIREWALL=1
#   (automatically set in $PETSC_ARCH/lib/petsc/conf/petscvariables if configured --with-macos-firewall-rules),
#   ensure mpiexec and test executable is on firewall list
#
ifeq ($(MACOS_FIREWALL),1)
FW := /usr/libexec/ApplicationFirewall/socketfilterfw
# There is no reliable realpath command in macOS without need for 3rd party tools like homebrew coreutils
# Using Python's realpath seems like the most robust way here
realpath-py = $(shell $(PYTHON) -c 'import os, sys; print(os.path.realpath(sys.argv[1]))' $(1))
#
define macos-firewall-register
  @APP=$(call realpath-py, $(1)); \
  if ! sudo -n true 2>/dev/null; then printf "Asking for sudo password to add new firewall rule for\n  $$APP\n"; fi; \
  sudo $(FW) --remove $$APP --add $$APP --blockapp $$APP
endef
endif

and below. When building each executable it automatically calls socketfilterfw on that executable so it won't pop up. 
From this I think you can reverse engineer how to turn it off for your executables. Perhaps PETSc's make ex1 etc should also apply this magic sauce, Pierre? This configure option was added in https://gitlab.com/petsc/petsc/-/merge_requests/3131 but it never worked on my machines. I just tried again this morning a make check with MACOS_FIREWALL=1; it's asking for my password to register MPICH in the firewall, but the popups are still appearing afterwards. That's why I've never used that configure option and why I'm not sure if I can trust this code from makefile.test, but I'm probably being paranoid. Prior to Ventura, when I was running the test suite, I manually disabled the firewall https://support.apple.com/en-gb/guide/mac-help/mh11783/12.0/mac/12.0 Apple has done yet again Apple things, and even if you disable the firewall on Ventura (https://support.apple.com/en-gb/guide/mac-help/mh11783/13.0/mac/13.0), the popups are still appearing. Right now, I don't have a solution, except for not using my machine while the test suite runs... I don't recall whether this has been mentioned by any of the other devs, but this is a completely harmless (though frustrating) message: MPI and/or PETSc cannot be used without an action from the user to allow others to get access to your machine. Thanks, Pierre On Mar 19, 2023, at 8:10 PM, Amneet Bhalla wrote: This helped only during the configure stage, and not during the check stage and during executing the application built on PETSc. Do you think it is because I built mpich locally and not with PETSc? On Sun, Mar 19, 2023 at 3:51 PM Barry Smith wrote: ./configure option with-macos-firewall-rules On Mar 19, 2023, at 5:25 PM, Amneet Bhalla wrote: Yes, this is MPI that is triggering the Apple firewall. If I allow it, it gets added to the allowed list (see the screenshot) and it does not trigger the firewall again. However, this needs to be done for all executables (there will be several main2d's in the list). Any way to suppress it for all executables linked to mpi in the first place? On Sun, Mar 19, 2023 at 11:01 AM Matthew Knepley wrote: On Sun, Mar 19, 2023 at 1:59 PM Amneet Bhalla wrote: I'm building PETSc without mpi (I built mpich v4.1.1 locally). Here is the configure command line that I used: ./configure --CC=mpicc --CXX=mpicxx --FC=mpif90 --PETSC_ARCH=darwin-dbg --with-debugging=1 --download-hypre=1 --with-x=0 No, this uses MPI, it just does not build it. Configuring with --with-mpi=0 will shut off any use of MPI, which is what Satish thinks is bugging the firewall. Thanks, Matt On Sun, Mar 19, 2023 at 10:56 AM Satish Balay wrote: I think it's due to some of the system calls from MPI. You can verify this with a '--with-mpi=0' build. I wonder if there is a way to build mpich or openmpi that doesn't trigger Apple's firewall. Satish On Sun, 19 Mar 2023, Amneet Bhalla wrote: > Hi Folks, > > I'm trying to build PETSc on MacOS Ventura (Apple M2) with hypre. I'm using > the latest version (v3.18.5). During the configure and make check stage I > get a request about accepting network connections. The configure and check > proceeds without my input but the dialog box stays in place. Please see the > screenshot. I'm wondering if it is benign or something to be concerned > about? Do I need to accept any network certificate to not see this dialog > box? > > Thanks, > > -- --Amneet -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -- --Amneet From balay at mcs.anl.gov Fri Apr 28 12:04:03 2023 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 28 Apr 2023 12:04:03 -0500 (CDT) Subject: [petsc-users] PETSc build asks for network connections In-Reply-To: <851A0B82-83E7-40F1-BC7F-60BB7AB29B00@earth.ox.ac.uk> References: <851A0B82-83E7-40F1-BC7F-60BB7AB29B00@earth.ox.ac.uk> Message-ID: <652175c5-a204-87a4-d2ec-dc1872b4e3e3@mcs.anl.gov> Thanks for these notes. FWIW - I don't see this issue on the CI MacOS boxes [2 Intel (default install of Catalina, upgraded to Ventura), 1 M1 (Monterey, managed by admins)]. And I think it should be preferable to avoid these nasty scripts that keep modifying the firewall rules [per petsc binary, with sudo for each] - so if tweaking 'security' settings can accomplish that, we should recommend it. Satish On Fri, 28 Apr 2023, Samar Khatiwala wrote: > Hi, > > I realize this is an old thread but I have some recent experience based on setting up an M2 Mac that might be relevant. > > I was dreading moving to Apple Silicon Macs because of issues like these but I actually did not run into this particular problem. > While I can't be certain, I think it is because in the process of installing another piece of software I had to modify Apple's security > restrictions to make them more permissive. Details of how to do this are in the following and it takes only a minute to implement: > > https://rogueamoeba.com/support/knowledgebase/?showArticle=ACE-StepByStep&product=Audio+Hijack > > Incidentally, I built mpich from source followed by PETSc in the usual way. > > Something else that might be helpful for others is my experience getting ifort to work. (My needs were somewhat specific: mixed > fortran/C code, preferably ifort, and avoid package managers.) The intel OneAPI installer ran smoothly (via rosetta) but when > building mpich (or PETSc) I ran into an obvious problem: clang produces arm64 object files while ifort produces x86 ones. I couldn't > manage to set the correct CFLAGS to tell clang to target x86. Instead, the (simpler) solution turned out to be (1) the fact that all the > executables in Apple's toolchain are universal binaries, and (2) the 'arch' command can let you run programs for either of the two > architectures. Specifically, executing in the terminal: > > arch -x86_64 bash > > starts a bash shell and *every* program that is then run from that shell is automatically the x86 version. So I could then do: > FC=ifort > ./configure --prefix=/usr/local/mpichx86 --enable-two-level-namespace > make > sudo make install > > and get an x86 build of mpich which I could then use (from the same shell or a new one started as above) to build [x86] PETSc. > Except for some annoying warnings from MKL (I think because it is confused about what architecture it is running on) everything runs > smoothly and - even in emulation - surprisingly fast. > > Sorry if this is all well known and already documented on PETSc's install page. > > Samar > > On Mar 20, 2023, at 6:39 AM, Pierre Jolivet wrote: > > On 20 Mar 2023, at 2:45 AM, Barry Smith wrote: > > I found a bit more information in gmakefile.test which has the magic sauce used by make test to stop the firewall popups while running the test suite. 
> > # MACOS FIREWALL HANDLING > # - if run with MACOS_FIREWALL=1 > # (automatically set in $PETSC_ARCH/lib/petsc/conf/petscvariables if configured --with-macos-firewall-rules), > # ensure mpiexec and test executable is on firewall list > # > ifeq ($(MACOS_FIREWALL),1) > FW := /usr/libexec/ApplicationFirewall/socketfilterfw > # There is no reliable realpath command in macOS without need for 3rd party tools like homebrew coreutils > # Using Python's realpath seems like the most robust way here > realpath-py = $(shell $(PYTHON) -c 'import os, sys; print(os.path.realpath(sys.argv[1]))' $(1)) > # > define macos-firewall-register > @APP=$(call realpath-py, $(1)); \ > if ! sudo -n true 2>/dev/null; then printf "Asking for sudo password to add new firewall rule for\n $$APP\n"; fi; \ > sudo $(FW) --remove $$APP --add $$APP --blockapp $$APP > endef > endif > > and below. When building each executable it automatically calls socketfilterfw on that executable so it won't popup. > > From this I think you can reverse engineer how to turn it off for your executables. > > Perhaps PETSc's make ex1 etc should also apply this magic sauce, Pierre? > > This configure option was added in https://gitlab.com/petsc/petsc/-/merge_requests/3131 but it never worked on my machines. > I just tried again this morning a make check with MACOS_FIREWALL=1, it?s asking for my password to register MPICH in the firewall, but the popups are still appearing afterwards. > That?s why I?ve never used that configure option and why I?m not sure if I can trust this code from makefile.test, but I?m probably being paranoid. > Prior to Ventura, when I was running the test suite, I manually disabled the firewall https://support.apple.com/en-gb/guide/mac-help/mh11783/12.0/mac/12.0 > Apple has done yet again Apple things, and even if you disable the firewall on Ventura (https://support.apple.com/en-gb/guide/mac-help/mh11783/13.0/mac/13.0), the popups are still appearing. > Right now, I don?t have a solution, except for not using my machine while the test suite runs? > I don?t recall whether this has been mentioned by any of the other devs, but this is a completely harmless (though frustrating) message: MPI and/or PETSc cannot be used without an action from the user to allow others to get access to your machine. > > Thanks, > Pierre > > On Mar 19, 2023, at 8:10 PM, Amneet Bhalla > wrote: > > This helped only during the configure stage, and not during the check stage and during executing the application built on PETSc. Do you think it is because I built mpich locally and not with PETSc? > > On Sun, Mar 19, 2023 at 3:51?PM Barry Smith > wrote: > > ./configure option with-macos-firewall-rules > > > On Mar 19, 2023, at 5:25 PM, Amneet Bhalla > wrote: > > Yes, this is MPI that is triggering the apple firewall. If I allow it it gets added to the allowed list (see the screenshot) and it does not trigger the firewall again. However, this needs to be done for all executables (there will be several main2d's in the list). Any way to suppress it for all executables linked to mpi in the first place? > > > > On Sun, Mar 19, 2023 at 11:01?AM Matthew Knepley > wrote: > On Sun, Mar 19, 2023 at 1:59?PM Amneet Bhalla > wrote: > I'm building PETSc without mpi (I built mpich v 4.1.1 locally). Here is the configure command line that I used: > > ./configure --CC=mpicc --CXX=mpicxx --FC=mpif90 --PETSC_ARCH=darwin-dbg --with-debugging=1 --download-hypre=1 --with-x=0 > > > No, this uses MPI, it just does not built it. 
Configuring with --with-mpi=0 will shut off any use of MPI, which is what Satish thinks is bugging the firewall. > > Thanks, > > Matt > > On Sun, Mar 19, 2023 at 10:56?AM Satish Balay > wrote: > I think its due to some of the system calls from MPI. > > You can verify this with a '--with-mpi=0' build. > > I wonder if there is a way to build mpich or openmpi - that doesn't trigger Apple's firewall.. > > Satish > > On Sun, 19 Mar 2023, Amneet Bhalla wrote: > > > Hi Folks, > > > > I'm trying to build PETSc on MacOS Ventura (Apple M2) with hypre. I'm using > > the latest version (v3.18.5). During the configure and make check stage I > > get a request about accepting network connections. The configure and check > > proceeds without my input but the dialog box stays in place. Please see the > > screenshot. I'm wondering if it is benign or something to be concerned > > about? Do I need to accept any network certificate to not see this dialog > > box? > > > > Thanks, > > > > > > > > -- > --Amneet > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > https://www.cse.buffalo.edu/~knepley/ > > > -- > --Amneet > > > > > > > -- > --Amneet > > From junchao.zhang at gmail.com Fri Apr 28 14:52:04 2023 From: junchao.zhang at gmail.com (Junchao Zhang) Date: Fri, 28 Apr 2023 14:52:04 -0500 Subject: [petsc-users] MatSetValuesCOO after MatDuplicate In-Reply-To: References: <4D64AB6B-D7BC-4256-A714-29907922A728@petsc.dev> Message-ID: Hi, Maxime, Thanks for the introduction. It sounds to me it makes sense to share the COO information in MatDuplicate. The data structure consumes a lot of memory, making it more justified. I will add it. --Junchao Zhang On Thu, Apr 27, 2023 at 2:15?PM Maxime Bouyges wrote: > Thanks for the prompt confirmation! I have to admit that I hadn't tested > yet the memory performance of using COO instead of the classical > MatSetValues. I will keep both approach in my code and do more performance > checks (CPU and memory) before I take a decision. In any case, calling > "again" MatSetPreallocationCOO after MatDuplicate is working so I can live > with that. I just wanted to be sure that it was a "bug" (or a missing > feature I would say) and not a misuse from me. > > If you are curious, here is the context. I am using the Julia langage to > solve an ODE using the DifferentialEquations.jl package ( > https://github.com/SciML/DifferentialEquations.jl). The system jacobian > matrix is a Julia SparseMatrix ( > https://docs.julialang.org/en/v1/stdlib/SparseArrays/) with CSC format. I > am using PETSc as a backend for the linear algebra (with > https://github.com/bmxam/PetscWrap.jl). So at some point I have to fill a > PETSc matrix with the values of a Julia CSC sparse matrix. Recovering the > COO information from the Julia matrix is trivial, and using MatSetValuesCOO > with this information seems very efficient. However, the ODE solver does > several matrix duplication (wrapped as MatDuplicate in my case) and that's > why I stumbled accross this bug. But as explained above 1) I can call > MatSetPreallocationCOO each time MatDuplicate is called and 2) I can keep > the classical MatSetValues and use an other way to fill the PETSc matrix. > > Thanks again for your quick answer (and for the great library ;)) ! 
> > Best regards, > > Maxime Bouyges > On 26/04/2023 23:26, Junchao Zhang wrote: > > It sounds like we should do reference counting on the internal data > structures used by COO. > --Junchao Zhang > > > On Wed, Apr 26, 2023 at 3:59 PM Barry Smith wrote: > >> >> Yes, it looks like a bug since no one tested this situation. >> >> MatSetPreallocationCOO() is pretty heavy memory-wise. It essentially >> keeps a copy of all the coo_i, coo_j indices within the Mat as well as the >> usual matrix information. So in your scenario, you will have two copies of >> all this stuff; lots of memory. Is this really what you need? >> >> >> >> > On Apr 26, 2023, at 4:07 PM, Maxime Bouyges >> wrote: >> > >> > Dear PETSc developers, >> > >> > I am trying to use the MatSetValuesCOO function (very appropriate and >> performant for my case) but I am encountering a problem when I use it on a >> Mat obtained with MatDuplicate. It seems that the non-zero pattern is >> preserved by MatDuplicate, but not the "COO information". Here are the few >> steps to reproduce the problem: >> > MatCreate(comm, A) >> > MatSetUp(A) >> > MatSetPreallocationCOO(A, ncoo, coo_i, coo_j) >> > MatSetValuesCOO(A, coo_v, INSERT_VALUES) # -> works ok >> > MatDuplicate(A, MAT_DO_NOT_COPY_VALUES, B) >> > MatSetValuesCOO(B, coo_v, INSERT_VALUES) # -> seg-fault >> > >> > Is this an expected behaviour? Of course if I call again >> MatSetPreallocationCOO on the duplicated matrix it's ok but the performance >> is lost in my case. Is there a way to fix it? Thank you in advance. >> > >> > Best regards, >> > >> > Maxime Bouyges >> > >> From maxime.bouyges at gmail.com Fri Apr 28 15:10:39 2023 From: maxime.bouyges at gmail.com (maxime.bouyges) Date: Fri, 28 Apr 2023 22:10:39 +0200 Subject: [petsc-users] MatSetValuesCOO after MatDuplicate In-Reply-To: Message-ID: <644c283f.5d0a0220.be850.f1bf@mx.google.com> Hi Junchao, I share your pov, but I don't know what PETSc looks like deep down. I was afraid to suggest that because I guessed handling the case where A is destroyed while B remains pointing to the COO information of A could be tricky... But if you can do it, of course, that will be nice (for me at least!). I will watch closely the next PETSc changelog ;). Maxime -------- Original message -------- From: Junchao Zhang Date: 28/04/2023 21:52 (GMT+01:00) To: Maxime Bouyges Cc: Barry Smith, petsc-users at mcs.anl.gov Subject: Re: [petsc-users] MatSetValuesCOO after MatDuplicate Hi, Maxime, Thanks for the introduction. It sounds to me it makes sense to share the COO information in MatDuplicate. The data structure consumes a lot of memory, making it more justified. I will add it. --Junchao Zhang On Thu, Apr 27, 2023 at 2:15 PM Maxime Bouyges wrote: Thanks for the prompt confirmation! I have to admit that I hadn't yet tested the memory performance of using COO instead of the classical MatSetValues. I will keep both approaches in my code and do more performance checks (CPU and memory) before I make a decision. In any case, calling "again" MatSetPreallocationCOO after MatDuplicate is working so I can live with that. I just wanted to be sure that it was a "bug" (or a missing feature I would say) and not a misuse on my part. If you are curious, here is the context. I am using the Julia language to solve an ODE using the DifferentialEquations.jl package (https://github.com/SciML/DifferentialEquations.jl). 
The system jacobian matrix is a Julia SparseMatrix (https://docs.julialang.org/en/v1/stdlib/SparseArrays/) with CSC format. I am using PETSc as a backend for the linear algebra (with https://github.com/bmxam/PetscWrap.jl). So at some point I have to fill a PETSc matrix with the values of a Julia CSC sparse matrix. Recovering the COO information from the Julia matrix is trivial, and using MatSetValuesCOO with this information seems very efficient. However, the ODE solver does several matrix duplications (wrapped as MatDuplicate in my case) and that's why I stumbled across this bug. But as explained above 1) I can call MatSetPreallocationCOO each time MatDuplicate is called and 2) I can keep the classical MatSetValues and use another way to fill the PETSc matrix. Thanks again for your quick answer (and for the great library ;)) ! Best regards, Maxime Bouyges From bsmith at petsc.dev Sat Apr 29 13:30:43 2023 From: bsmith at petsc.dev (Barry Smith) Date: Sat, 29 Apr 2023 14:30:43 -0400 Subject: [petsc-users] Understanding index sets for PCGASM In-Reply-To: References: Message-ID: Thank you for the test code. I have a fix in the branch barry/2023-04-29/fix-pcasmcreatesubdomains2d with merge request https://gitlab.com/petsc/petsc/-/merge_requests/6394 The functions did not have proper Fortran stubs and interfaces so I had to provide them manually in the new branch. Use:

git fetch
git checkout barry/2023-04-29/fix-pcasmcreatesubdomains2d
./configure etc

Your now working test code is in src/ksp/ksp/tests/ex71f.F90 I had to change things slightly and I updated the error handling for the latest version. Please let us know if you have any further questions. Barry > On Apr 28, 2023, at 12:07 PM, LEONARDO MUTTI wrote: > > Hello. I am having a hard time understanding the index sets to feed PCGASMSetSubdomains, and I am working in Fortran (as a PETSc novice). 
To get more intuition on how the IS objects behave I tried the following minimal (non) working example, which should tile a 16x16 matrix into 16 square, non-overlapping submatrices: > > #include > #include > #include > USE petscmat > USE petscksp > USE petscpc > > Mat :: A > PetscInt :: M, NSubx, dof, overlap, NSub > INTEGER :: I,J > PetscErrorCode :: ierr > PetscScalar :: v > KSP :: ksp > PC :: pc > IS :: subdomains_IS, inflated_IS > > call PetscInitialize(PETSC_NULL_CHARACTER , ierr) > > !-----Create a dummy matrix > M = 16 > call MatCreateAIJ(MPI_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, > & M, M, > & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER, > & PETSC_DEFAULT_INTEGER, PETSC_NULL_INTEGER, > & A, ierr) > > DO I=1,M > DO J=1,M > v = I*J > CALL MatSetValue (A,I-1,J-1,v, > & INSERT_VALUES , ierr) > END DO > END DO > > call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY , ierr) > call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY , ierr) > > !-----Create KSP and PC > call KSPCreate(PETSC_COMM_WORLD,ksp, ierr) > call KSPSetOperators(ksp,A,A, ierr) > call KSPSetType(ksp,"bcgs",ierr) > call KSPGetPC(ksp,pc,ierr) > call KSPSetUp(ksp, ierr) > call PCSetType(pc,PCGASM, ierr) > call PCSetUp(pc , ierr) > > !-----GASM setup > NSubx = 4 > dof = 1 > overlap = 0 > > call PCGASMCreateSubdomains2D(pc, > & M, M, > & NSubx, NSubx, > & dof, overlap, > & NSub, subdomains_IS, inflated_IS, ierr) > > call ISView(subdomains_IS, PETSC_VIEWER_STDOUT_WORLD, ierr) > > call KSPDestroy(ksp, ierr) > call PetscFinalize(ierr) > > Running this on one processor, I get NSub = 4. > If PCASM and PCASMCreateSubdomains2D are used instead, I get NSub = 16 as expected. > Moreover, I get in the end "forrtl: severe (157): Program Exception - access violation". So: > 1) why do I get two different results with ASM, and GASM? > 2) why do I get access violation and how can I solve this? > In fact, in C, subdomains_IS, inflated_IS should pointers to IS objects. As I see on the Fortran interface, the arguments to PCGASMCreateSubdomains2D are IS objects: > > subroutine PCGASMCreateSubdomains2D(a,b,c,d,e,f,g,h,i,j,z) > import tPC,tIS > PC a ! PC > PetscInt b ! PetscInt > PetscInt c ! PetscInt > PetscInt d ! PetscInt > PetscInt e ! PetscInt > PetscInt f ! PetscInt > PetscInt g ! PetscInt > PetscInt h ! PetscInt > IS i ! IS > IS j ! IS > PetscErrorCode z > end subroutine PCGASMCreateSubdomains2D > Thus: > 3) what should be inside e.g., subdomains_IS? I expect it to contain, for every created subdomain, the list of rows and columns defining the subblock in the matrix, am I right? > > Context: I have a block-tridiagonal system arising from space-time finite elements, and I want to solve it with GMRES+PCGASM preconditioner, where each overlapping submatrix is on the diagonal and of size 3x3 blocks (and spanning multiple processes). This is PETSc 3.17.1 on Windows. > > Thanks in advance, > Leonardo -------------- next part -------------- An HTML attachment was scrubbed... URL: From danyang.su at gmail.com Sun Apr 30 01:57:43 2023 From: danyang.su at gmail.com (Danyang Su) Date: Sat, 29 Apr 2023 23:57:43 -0700 Subject: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox? In-Reply-To: <001601d958fa$a6b83a60$f428af20$@gmail.com> References: <00ab01d94e31$51fdc590$f5f950b0$@gmail.com> <8426FD29-CAD9-4B7B-8937-C03D1EF9C831@gmail.com> <001601d958fa$a6b83a60$f428af20$@gmail.com> Message-ID: <9FE6A728-CA51-43F9-8F50-B4C6F86629DB@gmail.com> Hi Matt, Just let you know that the error problem in DMGetLocalBoundingBox seems fixed in the latest dev version. 
I didn't catch the error information any more. Regards, Danyang From: Date: Friday, March 17, 2023 at 11:02 AM To: 'Matthew Knepley' Cc: Subject: RE: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox? Hi Matt, I am following up to check if you can reproduce the problem on your side. Thanks and have a great weekend, Danyang From: Danyang Su Sent: March 4, 2023 4:38 PM To: Matthew Knepley Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox? Hi Matt, Attached is the source code and example. I have deleted most of the unused source code but it is still a bit lengthy. Sorry about that. The errors come after DMGetLocalBoundingBox and DMGetBoundingBox. -> To compile the code Please type 'make exe' and the executable file petsc_bounding will be created under the same folder. 
Thanks, Matt !Check coordinates call DMGetCoordinateDM(dmda_flow%da,cda,ierr) CHKERRQ(ierr) call DMGetCoordinates(dmda_flow%da,gc,ierr) CHKERRQ(ierr) call DMGetLocalBoundingBox(dmda_flow%da,lmin,lmax,ierr) CHKERRQ(ierr) call DMGetBoundingBox(dmda_flow%da,gmin,gmax,ierr) CHKERRQ(ierr) [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Corrupt argument: https://petsc.org/release/faq/#valgrind [0]PETSC ERROR: Object already free: Parameter # 1 [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.18.3, Dec 28, 2022 [0]PETSC ERROR: ../min3p-hpc-mpi-petsc-3.18.3 on a linux-gnu-dbg named starblazer by dsu Fri Mar 3 16:26:03 2023 [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-scalapack --download-parmetis --download-metis --download-mumps --download-ptscotch --download-chaco --download-fblaslapack --download-hypre --download-superlu_dist --download-hdf5=yes --download-ctetgen --download-zlib --download-pnetcdf --download-cmake --with-hdf5-fortran-bindings --with-debugging=1 [0]PETSC ERROR: #1 VecGetArrayRead() at /home/dsu/Soft/petsc/petsc-3.18.3/src/vec/vec/interface/rvector.c:1928 [0]PETSC ERROR: #2 DMGetLocalBoundingBox() at /home/dsu/Soft/petsc/petsc-3.18.3/src/dm/interface/dmcoordinates.c:897 [0]PETSC ERROR: #3 /home/dsu/Work/min3p-dbs-backup/src/project/makefile_p/../../solver/solver_ddmethod.F90:2140 Any suggestion on this? Thanks, Danyang -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Apr 30 09:40:59 2023 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 30 Apr 2023 10:40:59 -0400 Subject: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox? In-Reply-To: <9FE6A728-CA51-43F9-8F50-B4C6F86629DB@gmail.com> References: <00ab01d94e31$51fdc590$f5f950b0$@gmail.com> <8426FD29-CAD9-4B7B-8937-C03D1EF9C831@gmail.com> <001601d958fa$a6b83a60$f428af20$@gmail.com> <9FE6A728-CA51-43F9-8F50-B4C6F86629DB@gmail.com> Message-ID: On Sun, Apr 30, 2023 at 2:57?AM Danyang Su wrote: > Hi Matt, > > > > Just let you know that the error problem in DMGetLocalBoundingBox seems > fixed in the latest dev version. I didn?t catch the error information any > more. > Sorry, I forgot to mail back. Thanks for reporting that. Matt > > > Regards, > > > > Danyang > > > > *From: * > *Date: *Friday, March 17, 2023 at 11:02 AM > *To: *'Matthew Knepley' > *Cc: * > *Subject: *RE: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox? > > > > Hi Matt, > > > > I am following up to check if you can reproduce the problem on your side. > > > > Thanks and have a great weekend, > > > > Danyang > > > > *From:* Danyang Su > *Sent:* March 4, 2023 4:38 PM > *To:* Matthew Knepley > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox? > > > > Hi Matt, > > > > Attached is the source code and example. I have deleted most of the unused > source code but it is still a bit length. Sorry about that. The errors come after > DMGetLocalBoundingBox and DMGetBoundingBox. > > > > -> To compile the code > > Please type 'make exe' and the executable file petsc_bounding will be > created under the same folder. 
> > > > > > -> To test the code > > Please go to fold 'test' and type 'mpiexec -n 1 ../petsc_bounding'. > > > > > > -> The output from PETSc 3.18, error information > > input file: stedvs.dat > > > > ------------------------------------------------------------------------ > > global control parameters > > ------------------------------------------------------------------------ > > > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > [0]PETSC ERROR: Corrupt argument: https://petsc.org/release/faq/#valgrind > > [0]PETSC ERROR: Object already free: Parameter # 1 > > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.18.3, Dec 28, 2022 > > [0]PETSC ERROR: ../petsc_bounding on a linux-gnu-dbg named starblazer by > dsu Sat Mar 4 16:20:51 2023 > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > --with-fc=gfortran --download-mpich --download-scalapack > --download-parmetis --download-metis --download-mumps --download-ptscotch > --download-chaco --download-fblaslapack --download-hypre > --download-superlu_dist --download-hdf5=yes --download-ctetgen > --download-zlib --download-pnetcdf --download-cmake > --with-hdf5-fortran-bindings --with-debugging=1 > > [0]PETSC ERROR: #1 VecGetArrayRead() at > /home/dsu/Soft/petsc/petsc-3.18.3/src/vec/vec/interface/rvector.c:1928 > > [0]PETSC ERROR: #2 DMGetLocalBoundingBox() at > /home/dsu/Soft/petsc/petsc-3.18.3/src/dm/interface/dmcoordinates.c:897 > > [0]PETSC ERROR: #3 > /home/dsu/Work/bug-check/petsc_bounding/src/solver_ddmethod.F90:1920 > > Total volume of simulation domain 0.20000000E+01 > > Total volume of simulation domain 0.20000000E+01 > > > > > > -> The output from PETSc 3.17 and earlier, no error > > input file: stedvs.dat > > > > ------------------------------------------------------------------------ > > global control parameters > > ------------------------------------------------------------------------ > > > > Total volume of simulation domain 0.20000000E+01 > > Total volume of simulation domain 0.20000000E+01 > > > > > > Thanks, > > > > Danyang > > *From: *Matthew Knepley > *Date: *Friday, March 3, 2023 at 8:58 PM > *To: * > *Cc: * > *Subject: *Re: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox? > > > > On Sat, Mar 4, 2023 at 1:35?AM wrote: > > Hi All, > > > > I get a very strange error after upgrading PETSc version to 3.18.3, > indicating some object is already free. The error is begin and does not > crash the code. There is no error before PETSc 3.17.5 versions. > > > > We have changed the way coordinates are handled in order to support higher > order coordinate fields. Is it possible > > to send something that we can run that has this error? It could be on our > end, but it could also be that you are > > destroying a coordinate vector accidentally. 
> > > > Thanks, > > > > Matt > > > > > > !Check coordinates > > call DMGetCoordinateDM(dmda_flow%da,cda,ierr) > > CHKERRQ(ierr) > > call DMGetCoordinates(dmda_flow%da,gc,ierr) > > CHKERRQ(ierr) > > call DMGetLocalBoundingBox(dmda_flow%da,lmin,lmax,ierr) > > CHKERRQ(ierr) > > call DMGetBoundingBox(dmda_flow%da,gmin,gmax,ierr) > > CHKERRQ(ierr) > > > > > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > [0]PETSC ERROR: Corrupt argument: https://petsc.org/release/faq/#valgrind > > [0]PETSC ERROR: Object already free: Parameter # 1 > > [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. > > [0]PETSC ERROR: Petsc Release Version 3.18.3, Dec 28, 2022 > > [0]PETSC ERROR: ../min3p-hpc-mpi-petsc-3.18.3 on a linux-gnu-dbg named > starblazer by dsu Fri Mar 3 16:26:03 2023 > > [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > --with-fc=gfortran --download-mpich --download-scalapack > --download-parmetis --download-metis --download-mumps --download-ptscotch > --download-chaco --download-fblaslapack --download-hypre > --download-superlu_dist --download-hdf5=yes --download-ctetgen > --download-zlib --download-pnetcdf --download-cmake > --with-hdf5-fortran-bindings --with-debugging=1 > > [0]PETSC ERROR: #1 VecGetArrayRead() at > /home/dsu/Soft/petsc/petsc-3.18.3/src/vec/vec/interface/rvector.c:1928 > > [0]PETSC ERROR: #2 DMGetLocalBoundingBox() at > /home/dsu/Soft/petsc/petsc-3.18.3/src/dm/interface/dmcoordinates.c:897 > > [0]PETSC ERROR: #3 > /home/dsu/Work/min3p-dbs-backup/src/project/makefile_p/../../solver/solver_ddmethod.F90:2140 > > > > Any suggestion on this? > > > > Thanks, > > > > Danyang > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > https://www.cse.buffalo.edu/~knepley/ > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From myoung.space.science at gmail.com Sun Apr 30 12:12:00 2023 From: myoung.space.science at gmail.com (Matthew Young) Date: Sun, 30 Apr 2023 13:12:00 -0400 Subject: [petsc-users] DMSWARM with DMDA and KSP Message-ID: Hi all, I am developing a particle-in-cell code that models ions as particles and electrons as an inertialess fluid. I use a PIC DMSWARM for the ions, which I gather into density and flux before solving a linear system for the electrostatic potential (phi). I currently have one DMDA with 5 degrees of freedom -- one each for density, 3 flux components, and phi. When setting up the linear system to solve for phi, I've been following examples like KSP ex34.c and ex42.c when writing the KSP operator and RHS functions but I'm not sure I have the right approach, since 4 of the DOFs are known and 1 is unknown. I saw this thread that recommended using DMDAGetReducedDMDA, which I gather has been deprecated in favor of DMDACreateCompatibleDMDA. Is that a good approach for managing a regular grid with known and unknown quantities on each node? Could a composite DM be useful? Has anyone else worked on a problem like this? 
--Matt ========================== Matthew Young, PhD (he/him) Research Scientist II Space Science Center University of New Hampshire Matthew.Young at unh.edu ========================== -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Apr 30 12:52:26 2023 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 30 Apr 2023 13:52:26 -0400 Subject: [petsc-users] DMSWARM with DMDA and KSP In-Reply-To: References: Message-ID: On Sun, Apr 30, 2023 at 1:12?PM Matthew Young < myoung.space.science at gmail.com> wrote: > Hi all, > > I am developing a particle-in-cell code that models ions as particles and > electrons as an inertialess fluid. I use a PIC DMSWARM for the ions, which > I gather into density and flux before solving a linear system for the > electrostatic potential (phi). I currently have one DMDA with 5 degrees of > freedom -- one each for density, 3 flux components, and phi. > > When setting up the linear system to solve for phi, I've been following > examples like KSP ex34.c and ex42.c when writing the KSP operator and RHS > functions but I'm not sure I have the right approach, since 4 of the DOFs > are known and 1 is unknown. > > I saw this thread > > that recommended using DMDAGetReducedDMDA, which I gather has been > deprecated in favor of DMDACreateCompatibleDMDA. Is that a good approach > for managing a regular grid with known and unknown quantities on each node? > Could a composite DM be useful? Has anyone else worked on a problem like > this? > I recommend making a different DM for each kind of solve you want. DMDACreateCompatibleDMDA() should be the implementation of DMClone(), but we have yet to harmonize all things for all DMs. I would create one DM for your Vlasov components and one for the Poisson. We follow this strategy in our Vlasov-Poisson test for Landau damping: https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/swarm/tests/ex9.c Thanks, Matt > --Matt > ========================== > Matthew Young, PhD (he/him) > Research Scientist II > Space Science Center > University of New Hampshire > Matthew.Young at unh.edu > ========================== > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/ -------------- next part -------------- An HTML attachment was scrubbed... URL:
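For readers searching the archive later, here is a compact C sketch of the two-DM layout recommended above; the 3D grid sizes and the packing of density plus flux into 4 dofs are illustrative assumptions, not Matthew's actual code:

#include <petsc.h>

int main(int argc, char **argv)
{
  DM  vdm, phidm;
  Vec moments, phi;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* 4 dofs per node: density + 3 flux components gathered from the swarm */
  PetscCall(DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX, 16, 16, 16, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                         4, 1, NULL, NULL, NULL, &vdm));
  PetscCall(DMSetUp(vdm));
  /* same grid partitioning, but 1 dof per node: the Poisson unknown phi */
  PetscCall(DMDACreateCompatibleDMDA(vdm, 1, &phidm));
  PetscCall(DMCreateGlobalVector(vdm, &moments));
  PetscCall(DMCreateGlobalVector(phidm, &phi));
  /* ... gather swarm moments into 'moments', assemble the operator and RHS
     on phidm, and KSPSolve for 'phi'; values can be moved between the two
     DMs node-by-node since their decompositions match ... */
  PetscCall(VecDestroy(&moments));
  PetscCall(VecDestroy(&phi));
  PetscCall(DMDestroy(&phidm));
  PetscCall(DMDestroy(&vdm));
  PetscCall(PetscFinalize());
  return 0;
}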