From asmund.ervik at ntnu.no Sat Mar 1 04:56:00 2014 From: asmund.ervik at ntnu.no (=?iso-8859-1?Q?=C5smund_Ervik?=) Date: Sat, 1 Mar 2014 10:56:00 +0000 Subject: [petsc-users] Using DM for a refined level-set grid In-Reply-To: <87zjlkc6mv.fsf@jedbrown.org> References: , <87zjlkc6mv.fsf@jedbrown.org> Message-ID: <0E576811AB298343AC632BBCAAEFC37945BD4992@WAREHOUSE08.win.ntnu.no> Hi Jed, Thanks for the answer, sorry for the long delay here, I got knocked out by the flu. Oh well. >Fra: Jed Brown [jed at jedbrown.org] >Sendt: 22. februar 2014 01:50 >Til: ?smund Ervik; petsc-users at mcs.anl.gov >Emne: Re: [petsc-users] Using DM for a refined level-set grid > > >Is this 2D or 3D, what fraction of the domain is likely to be within >5*dxCoarse of the interface, and how large do you intend your subdomains >to be? This is 3D. Of the global domain only a small fraction would be within 5*dxCoarse of the interface; we're not studying atomization or anything extreme like that. I guess it's a little tricky to say how large the subdomains will be, but I'd say around 300 subdomains would be a reasonable guess at the number of them. Then the size would be roughly 1/7th of the global domain in each dimension, so for a typical case around 100*dxCoarse in each dimension. So still the subdomains should not be nearly-full refined. >My worry is that even if you implement the dynamic algorithm you're >thinking of (which is nontrivial to do well), some subdomains will have >nearly-full refined sections which will limit your max problem size and >performance. If you can't load balance dynamically (in terms of peak >memory and time) at the needed granularity, there is no point trying to >save some storage in benign regions of your domain. This is a good point. It's probably worth it to implement just a plain refined grid (without any "sparse allocation") and see if that will be sufficiently fast. Should be quick to do. >What sort of methods will you be using for your CFD? Does it involve >assembled matrices? We're solving a pressure Poisson equation, so yes, it involves assembled matrices. >From your questions I now realize that perhaps the worst-scaling part of our code will be velocity extrapolation for level-set advection. This takes some time (not as much as the Poisson equation, maybe 5% of that) and only happens in a band around the interface. I guess we should look into alternatives to velocity extrapolation. Regards, ?smund From song.gao.mcgill at gmail.com Sat Mar 1 09:10:55 2014 From: song.gao.mcgill at gmail.com (Song Gao) Date: Sat, 1 Mar 2014 10:10:55 -0500 Subject: [petsc-users] SNESSetFunction and MatMFFDSetFunction In-Reply-To: References: <10F9F249-BD2C-4A5A-B0E5-3B86FF831C50@mcs.anl.gov> <87fvp6wgu2.fsf@jedbrown.org> <52C6FEE7.8060005@newmerical.com> <4CAC0C20-40D3-408C-A98C-2162AD378ACB@mcs.anl.gov> Message-ID: For the record, I did as Barry suggested, and it worked well for our problem. Thank you very much. On Fri, Jan 17, 2014 at 4:47 PM, Barry Smith wrote: > > On Jan 16, 2014, at 10:49 AM, Song Gao wrote: > > > I was looking at the example of MatMFFDSetFunction on website. > > > http://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex22.c.html > > I think, the line 312, the last snes should be ctx. > > 312: MatMFFDSetFunction(*A,(PetscErrorCode > (*)(void*,Vec,Vec))SNESComputeFunction > > ,snes); > > No the code as in the example is correct. 
The first argument to > SNESComputeFunction() is the SNES object > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESComputeFunction.html#SNESComputeFunction > > > > > > > > > > > > On Thu, Jan 16, 2014 at 11:04 AM, Song Gao > wrote: > > Thank you. > > > > I have found the bug in my codes. SNESSetFunction and MatMFFDSetFunction > expect functions with different interfaces, but I passed the same function > to them. > > > > > > On Wed, Jan 8, 2014 at 11:52 PM, Barry Smith wrote: > > > > I suspect the problem is here: > > > > > > > > call MatMFFDSetBase(myJctx%mf, pet_solu_snes, PETSC_NULL_INTEGER, > > > > > > @ ierrpet) > > > > In fact I am surprised it didn't crash at this line, since we don't > have code to handle the PETSC_NULL_INTEGER > > > > Try adding the > > > > > call > SNESGetFunction(snes,f,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,ierrpet); > > > > before the > > > > > MatMFFDSetBase(myJctx%mf,x,f,ierrpet); > > > > that I suggested before. > > > > Does that change anything? > > > > If you still get different values here is how I would debug it next > > > > 1) 1 process > > > > 2) run each version separately in the debugger (you can use the options > -start_in_debugger noxterm ) > > > > 3) put a break point in MatMult(). In most debuggers you just use > > > > b MatMult > > > > 4) then type c to continue > > > > 5) when it stops in MatMult do > > > > VecView(x,0) > > > > 6) make sure both versions produce the exact same numbers (do they?) > > > > 7) then type next several times until it gets to the > PetscFunctionReturn(0) line > > > > 8) do > > > > VecView(y,0) > > > > again are both answers identical? By all logic they will be different > since the norms you print are different. > > > > If (6) produces the same numbers but (8) produces different ones then > put a break point in MatMult_MFFD() > > and call VecView() on ctx->current_u ctx->current_f and a. For both > versions they should be the same. Are they? > > > > Barry > > > > > > > > On Jan 8, 2014, at 10:47 AM, Song Gao wrote: > > > > > Dear Barry, > > > > > > Thanks for reply. I basically implemented your codes. Then I have two > > > questions. > > > > > > The first is I'm working on Fortran. So I can't use MatShellSetContext > to > > > set the structure. Therefore I let the variable I want to set, MyJctx, > to > > > be global. Is there other way to do that? > > > > > > The second question is I did some tests. let the D vec to be zero, I > > > expect the code which I explicit set the matrix-free jacobian and the > code > > > which I use runtime option -snes_mf give the same residual history. > But it > > > doesn't. 
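Before the residual histories below, here is the debugging sequence Barry outlines above in condensed form, as it might look on one process (the executable name and its options are placeholders; run each of the two versions this way with its own usual options; the variable names x, y, ctx->current_u, ctx->current_f and a are the ones referred to above):

    $ ./yourapp <usual options> -start_in_debugger noxterm
    (gdb) break MatMult
    (gdb) continue
    (gdb) call VecView(x,0)
    ... type "next" until the PetscFunctionReturn(0) line, then ...
    (gdb) call VecView(y,0)
    ... if the inputs in (6) match but the outputs in (8) differ ...
    (gdb) break MatMult_MFFD
    (gdb) continue
    (gdb) call VecView(ctx->current_u,0)
    (gdb) call VecView(ctx->current_f,0)
    (gdb) call VecView(a,0)

Comparing the printed vectors from the two runs at each stop is what steps (1)-(8) above amount to.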
> > > > > > Here is the histories for > > > > > > -snes_monitor -ksp_max_it 5 -snes_converged_reason -snes_max_it 2 > -ksp_converged_reason -ksp_monitor -snes_max_linear_solve_fail 300 -pc_type > none -snes_view -snes_linesearch_type basic > > > > > > 0 SNES Function norm 4.272952196300e-02 > > > > > > 0 KSP Residual norm 4.272952196300e-02 > > > 1 KSP Residual norm 4.234712668718e-02 > > > 2 KSP Residual norm 3.683301946690e-02 > > > > > > 3 KSP Residual norm 3.465586805169e-02 > > > > > > 4 KSP Residual norm 3.452667066800e-02 > > > 5 KSP Residual norm 3.451739518719e-02 > > > Linear solve did not converge due to DIVERGED_ITS iterations 5 > > > 1 SNES Function norm 4.203973403992e-02 > > > > > > 0 KSP Residual norm 4.203973403992e-02 > > > > > > 1 KSP Residual norm 4.203070641961e-02 > > > 2 KSP Residual norm 4.202387940443e-02 > > > 3 KSP Residual norm 4.183739347023e-02 > > > 4 KSP Residual norm 4.183629424897e-02 > > > > > > 5 KSP Residual norm 4.159456024825e-02 > > > > > > Linear solve did not converge due to DIVERGED_ITS iterations 5 > > > 2 SNES Function norm 4.200901009970e-02 > > > Nonlinear solve did not converge due to DIVERGED_MAX_IT iterations 2 > > > > > > > > > Here is the histories for > > > -snes_mf -snes_monitor -ksp_max_it 5 -snes_converged_reason > -snes_max_it 2 -ksp_converged_reason -ksp_monitor > -snes_max_linear_solve_fail 300 -pc_type none -snes_view > -snes_linesearch_type basic > > > > > > > > > 0 SNES Function norm 4.272952196300e-02 > > > 0 KSP Residual norm 4.272952196300e-02 > > > 1 KSP Residual norm 4.270267664569e-02 > > > 2 KSP Residual norm 3.690026921954e-02 > > > > > > 3 KSP Residual norm 3.681740616743e-02 > > > > > > 4 KSP Residual norm 3.464377294985e-02 > > > 5 KSP Residual norm 3.464376048536e-02 > > > Linear solve did not converge due to DIVERGED_ITS iterations 5 > > > 1 SNES Function norm 3.461633424373e-02 > > > > > > 0 KSP Residual norm 3.461633424373e-02 > > > > > > 1 KSP Residual norm 3.461632119472e-02 > > > 2 KSP Residual norm 3.406130197963e-02 > > > 3 KSP Residual norm 3.406122155751e-02 > > > 4 KSP Residual norm 3.403393397001e-02 > > > > > > 5 KSP Residual norm 3.403367748538e-02 > > > > > > Linear solve did not converge due to DIVERGED_ITS iterations 5 > > > 2 SNES Function norm 3.403367847002e-02 > > > Nonlinear solve did not converge due to DIVERGED_MAX_IT iterations 2 > > > > > > > > > We can see that at 0 SNES 1 KSP step, the residual norms are > different. Did I do something wrong here? 
> > > > > > The codes are like > > > > > > type MyJContext > > > Mat mf > > > Vec D > > > Vec work > > > end type MyJContext > > > > > > c > > > > > > type(MyJContext) myJctx > > > > > > > -------------------------------------------------------------------------- > > > call SNESCreate(PETSC_COMM_WORLD, snes, ierpetsc) > > > call SNESSetFunction(snes, pet_rhs_snes, flowsolrhs, ctx, > > > > > > @ ierpetsc) > > > > > > c > > > call MatCreateSNESMF(snes, myJctx%mf, ierpetsc) > > > call MatMFFDSetFunction(myJctx%mf, flowsolrhs, ctx, ierpetsc) > > > call VecDuplicate(pet_solu_snes, myJctx%D, ierpetsc) > > > > > > call VecDuplicate(pet_solu_snes, myJctx%work, ierpetsc) > > > > > > call VecSet(myJctx%D, 0.0D-3, ierpetsc) > > > call MatCreateShell(PETSC_COMM_WORLD, pet_nfff, pet_nfff, > > > @ PETSC_DETERMINE, PETSC_DETERMINE, ctx, myJ, ierpetsc) > > > > > > call MatShellSetOperation(myJ, MATOP_MULT, mymultply, > > > > > > @ ierpetsc) > > > call SNESSetJacobian(snes, myJ, pet_mat_pre, > > > @ flowsoljac, ctx, ierpetsc) > > > > > > > -------------------------------------------------------------------------- > > > > > > > > > subroutine mymultply ( A, x, y, ierpet) > > > Mat :: A > > > Vec :: x, y > > > PetscErrorCode :: ierpet > > > c > > > call MatMult(myJctx%mf,x,y, ierpet) > > > c > > > end > > > > -------------------------------------------------------------------------- > > > > > > > > > subroutine flowsoljac ( snes, pet_solu_snes, pet_mat_snes, > > > @ pet_mat_pre, flag, ctxx, ierrpet ) > > > > > > c explicitly assemble pet_mat_pre matrix here > > > c ......... > > > c ......... > > > > > > call MatMFFDSetBase(myJctx%mf, pet_solu_snes, PETSC_NULL_INTEGER, > > > > > > @ ierrpet) > > > > > > end > > > > > > > > > > > > > > > On Fri, Jan 3, 2014 at 6:46 PM, Barry Smith > wrote: > > > > > > Dario, > > > > > > Your discussion below (SOR, ILU(200)) seems to imply that you are > providing some actual explicit representation of the Jacobian, not just > doing something completely matrix free. Is this correct? But the PETSc > MatMFFD() is completely matrix free, it provides only a matrix-vector > product and no access to the matrix entries, hence I am slightly confused. > > > > > > If you wish to use for the Jacobian something like D + J and do it > completely matrix free then rather than than monkeying with "changing the > function" I would > > > use the "correct" function to compute J x using matrix free multiply > and then apply the D to as an additional operation. Hence you would do > something like > > > > > > typedef struct { /* data structure to store the usual matrix > free matrix and the additional diagonal matrix */ > > > Mat mf; > > > Vec D; > > > Vec work; > > > } MyJContext; > > > > > > MyJContext myJctx; > > > > > > MatCreateSNESMF(SNES,&myJctx.mf); /* create the usual MFFD > matrix using the real nonlinear function */ > > > > MatMFFDSetFunction(myJctx.mf,yournonlinearfunction,nonlinearfunctionctx); > > > VecCreate(comm,&myJctx.D); > > > /* set the correct sizes for D and fill up with your diagonal > matrix entries */ > > > VecDuplicate(&myJctx.D,&myJCtx.work); > > > MatCreateShell(comm,.... 
&myJ); > > > MatShellSetOperation(myJ,MATOP_MULT, mymultiply); > > > MatShellSetContext(myJ,&myJctx); > > > SNESSetJacobian(snes,myJ,myJ, myJFunction,NULL); > > > > > > where > > > > > > PetscErrorCode mymultply(Mat A,Vec x,Vec y) /* computes y = J x > + D x > > > { > > > MyJContext *myJctx; > > > > > > MatShellGetContext(A,&myJctx); > > > MatMult(myJctx->mf,x,y); > > > VecPointwiseMult(myJctx->D,x,myJctx->work); > > > VecAXPY(y,1.myJctx->work); > > > } > > > > > > and > > > > > > PetscErrorCode myJFunction(SNES snes,Vec x,Mat *A,Mat > *B,MatStructure *str,void* ctx) > > > > > > /* this is called for each new "Jacobian" to set the point at > which it is computed */ > > > { > > > MyJContext *myJctx; > > > Vec f; > > > MatShellGetContext(*A,&myJctx); > > > SNESGetFunction(snes,&f); > > > MatMFFDSetBase(myJctx->mf,x,f); > > > > > > /* change the D entries if they depend on the current > solution etc */ > > > return 0; > > > } > > > > > > Sorry now that I have typed it out it looks a bit more > complicated then it really is. It does what you want but without any > trickery or confusing code. > > > > > > But, of course, since it is completely matrix free you cannot > use SOR with it. Of course by making D suitably large you can make it as > well conditioned as you want and thus get rapid linear convergence (though > that may slow down or ruin the nonlinear convergence). > > > > > > Hope this helps, > > > > > > Barry > > > > > > > > > > > > > > > > > > On Jan 3, 2014, at 12:18 PM, Dario Isola > wrote: > > > > > > > Dear, Barry and Jed, > > > > > > > > Thanks for your replies. > > > > > > > > We understand your doubts, so let me to put our question into > context. In CFD it is standard practice to solve non-linear equations of > conservation for steady flows by means of a inexact Newton method. The > original Jacobian matrix is modified by adding terms on the diagonal which > are proportional to the Courant number and to the lumped mass matrix. This > allows us to obtain two things, "relax" the solution update and increase > the diagonal dominance of the matrix itself. > > > > > > > > The latter is key when simple preconditioners are adopted, in our > case point Jacobi or SOR. Indeed, if the original matrix was to be used, > the GMRES method would converge only on very regular meshes and only when > adopting ILU preconditioners with a very high level of fill-in. As result a > higher number of non-linear iterations is traded with a simpler linear > system to be solved. > > > > > > > > While exploring the SNES+MF capabilities we found out that we could > successfully solve the linear system only with ILU(200) or so. Of course we > do not want to touch the function used to evaluate the residual, which > determines the final solution. However we think that a suitable > modification of the function that Petsc differences to compute the matrix > vector product would allow us to obtain a behavior similar to the inexact > Newton method. > > > > > > > > Best regards, > > > > Dario > > > > > > > > > > > > On 01/03/2014 12:32 PM, Song Gao wrote: > > > >> > > > >> > > > >> ---------- Forwarded message ---------- > > > >> From: Jed Brown > > > >> Date: Thu, Jan 2, 2014 at 10:20 AM > > > >> Subject: Re: [petsc-users] SNESSetFunction and MatMFFDSetFunction > > > >> To: Song Gao , Barry Smith < > bsmith at mcs.anl.gov> > > > >> Cc: petsc-users > > > >> > > > >> > > > >> Song Gao writes: > > > >> > > > >> > Thanks, Barry. 
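Putting the pieces of Barry's outline above together, a minimal compilable version of the J*x + D*x shell might look like the following. This is only a sketch: it uses the PETSc 3.4-era calling sequences that appear in this thread, the names (MyMult, MyJacobian, SetupShellJacobian, rhsfunc, rhsctx, X, nlocal) are placeholders, and error handling is limited to CHKERRQ.

    #include <petscsnes.h>

    typedef struct {   /* matrix-free Jacobian J plus a diagonal shift D */
      Mat mf;
      Vec D;
      Vec work;
    } MyJContext;

    /* y = J*x + D*x */
    static PetscErrorCode MyMult(Mat A, Vec x, Vec y)
    {
      MyJContext     *jctx;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = MatShellGetContext(A, &jctx);CHKERRQ(ierr);
      ierr = MatMult(jctx->mf, x, y);CHKERRQ(ierr);                  /* y    = J*x    */
      ierr = VecPointwiseMult(jctx->work, jctx->D, x);CHKERRQ(ierr); /* work = D .* x */
      ierr = VecAXPY(y, 1.0, jctx->work);CHKERRQ(ierr);              /* y   += work   */
      PetscFunctionReturn(0);
    }

    /* called at each Newton step to set the base point for differencing */
    static PetscErrorCode MyJacobian(SNES snes, Vec x, Mat *A, Mat *B, MatStructure *str, void *dummy)
    {
      MyJContext     *jctx;
      Vec            f;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = MatShellGetContext(*A, &jctx);CHKERRQ(ierr);
      ierr = SNESGetFunction(snes, &f, NULL, NULL);CHKERRQ(ierr);
      ierr = MatMFFDSetBase(jctx->mf, x, f);CHKERRQ(ierr);
      /* update jctx->D here if it depends on the current solution x */
      *str = SAME_NONZERO_PATTERN;
      PetscFunctionReturn(0);
    }

    /* call once after SNESSetFunction(); X is any vector with the solution
       layout, nlocal its local size, and (rhsfunc, rhsctx) the residual
       evaluation -- note that MatMFFDSetFunction expects the (void*,Vec,Vec)
       calling sequence, not the SNESSetFunction one.  jctx must outlive the
       solve, since the shell matrix only stores a pointer to it. */
    static PetscErrorCode SetupShellJacobian(SNES snes, Vec X, PetscInt nlocal,
                                             PetscErrorCode (*rhsfunc)(void*,Vec,Vec),
                                             void *rhsctx, MyJContext *jctx, Mat *J)
    {
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = MatCreateSNESMF(snes, &jctx->mf);CHKERRQ(ierr);
      ierr = MatMFFDSetFunction(jctx->mf, rhsfunc, rhsctx);CHKERRQ(ierr);
      ierr = VecDuplicate(X, &jctx->D);CHKERRQ(ierr);     /* then fill D with the diagonal entries */
      ierr = VecDuplicate(X, &jctx->work);CHKERRQ(ierr);
      ierr = MatCreateShell(PETSC_COMM_WORLD, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE, jctx, J);CHKERRQ(ierr);
      ierr = MatShellSetOperation(*J, MATOP_MULT, (void (*)(void))MyMult);CHKERRQ(ierr);
      ierr = SNESSetJacobian(snes, *J, *J, MyJacobian, NULL);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

Keeping the residual function untouched and folding the Courant-number/mass-matrix shift entirely into D is what lets the differenced J remain the true Jacobian of the function given to SNESSetFunction.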
> > > >> > > > > >> > I mean 2) providing a function that I want PETSc to difference to > evaluate > > > >> > the matrix vector product. > > > >> > > > > >> > I want to make a slight modification of the matrix after PETSc > evaluate the > > > >> > matrix vector product. > > > >> > > > >> Performing a matrix-vector product is not supposed to modify the > matrix. > > > >> It's unlikely that you really want this. > > > >> > > > >> > > > >> On Wed, Jan 1, 2014 at 3:01 PM, Barry Smith > wrote: > > > >> > > > >> On Jan 1, 2014, at 11:09 AM, Song Gao > wrote: > > > >> > > > >> > Dear all, > > > >> > > > > >> > Happy new year! > > > >> > > > > >> > I'm using the matrix-free method to solve NS equations. I call > the SNESSetFunction to set the RHS function. I think SNES also uses that > RHS function to evaluate the matrix vector product. > > > >> > > > >> Yes, PETSc differences this function to evaluate the matrix > vector product. > > > >> > > > >> > > > > >> > But I want to set one function to evaluate the residual, and > another different function to evaluate the matrix vector product. > > > >> > > > >> Are you providing a function that > > > >> > > > >> 1) actually evaluates the matrix vector product or are you > > > >> > > > >> 2) providing a function that you want PETSc to difference to > evaluate the matrix vector product? > > > >> > > > >> > How can I do that? Does MatMFFDSetFunction do this job? > > > >> > > > >> For 2) yes, but if the function you provide is different than > the function provided with SNESSetFunction then the matrix-vector product > obtained from differencing it will not be "correct" Jacobian for the > SNESSetFunction() you are providing so I don't see why you would do it. > > > >> > > > >> For 1) you should use MatCreateShell() and > MatShellSetOperation(mat,MATOP_MULT, ....) and then pass that matrix to > SNESSetJacobian(), then PETSc will use that "matrix" to do its > matrix-vector products. > > > >> > > > >> Barry > > > >> > > > >> > > > >> > > > >> > > > > >> > Any suggestion is appreciated. Thank you. > > > >> > > > > >> > > > > >> > Song > > > >> > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Sat Mar 1 18:49:55 2014 From: jed at jedbrown.org (Jed Brown) Date: Sat, 01 Mar 2014 18:49:55 -0600 Subject: [petsc-users] Using DM for a refined level-set grid In-Reply-To: <0E576811AB298343AC632BBCAAEFC37945BD4992@WAREHOUSE08.win.ntnu.no> References: <87zjlkc6mv.fsf@jedbrown.org> <0E576811AB298343AC632BBCAAEFC37945BD4992@WAREHOUSE08.win.ntnu.no> Message-ID: <87a9d9qv9o.fsf@jedbrown.org> ?smund Ervik writes: > Thanks for the answer, sorry for the long delay here, I got knocked > out by the flu. Oh well. And I'm typing with one hand... > From your questions I now realize that perhaps the worst-scaling part > of our code will be velocity extrapolation for level-set > advection. This takes some time (not as much as the Poisson equation, > maybe 5% of that) and only happens in a band around the interface. I > guess we should look into alternatives to velocity extrapolation. And it sounds like this part won't be load-balanced without significant concessions and effort. -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From asmund.ervik at ntnu.no Sun Mar 2 02:52:19 2014 From: asmund.ervik at ntnu.no (=?iso-8859-1?Q?=C5smund_Ervik?=) Date: Sun, 2 Mar 2014 08:52:19 +0000 Subject: [petsc-users] Using DM for a refined level-set grid In-Reply-To: <87a9d9qv9o.fsf@jedbrown.org> References: <87zjlkc6mv.fsf@jedbrown.org> <0E576811AB298343AC632BBCAAEFC37945BD4992@WAREHOUSE08.win.ntnu.no>, <87a9d9qv9o.fsf@jedbrown.org> Message-ID: I think there are options to velocity extrapolation that we can try before tackling load-balancing that. Level-set methods have so many different little tricks it's not even funny. - ?smund Jed Brown skrev: ?smund Ervik writes: > Thanks for the answer, sorry for the long delay here, I got knocked > out by the flu. Oh well. And I'm typing with one hand... > From your questions I now realize that perhaps the worst-scaling part > of our code will be velocity extrapolation for level-set > advection. This takes some time (not as much as the Poisson equation, > maybe 5% of that) and only happens in a band around the interface. I > guess we should look into alternatives to velocity extrapolation. And it sounds like this part won't be load-balanced without significant concessions and effort. -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Mar 2 13:33:06 2014 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 2 Mar 2014 13:33:06 -0600 Subject: [petsc-users] Fieldsplit schur applied on A00 of a fieldsplit schur In-Reply-To: <4CFC2209-79EB-42C6-9972-1467B8AE1A58@columbia.edu> References: <4CFC2209-79EB-42C6-9972-1467B8AE1A58@columbia.edu> Message-ID: On Fri, Feb 28, 2014 at 8:18 PM, Luc Berger-Vergiat < luc.berger.vergiat at gmail.com> wrote: > Hi all, > sorry for the cryptic title but this is a little complex. > Here is what I am doing: > I created a DMShell that gets four fields passed from a PetscSection. > Now I am doing this because I want to apply a schur complement to my > problem. > In order to do so I pass the following arguments to my code: > > -ksp_type gmres > -pc_type fieldsplit > -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_0_fields 2,3 <--- This define A00 for my > schur > -pc_fieldsplit_1_fields 0,1 > > Up to here everything works fine and as expected (I actually do a > -ksp_view to make sure that everything makes sense). > Now things get tricky, I would like to compute A00^-1 using another schur > decomposition so here are the commands I issue: > > -fieldsplit_0_ksp_type preonly > -fieldsplit_0_pc_type fieldsplit > -fieldsplit_0_pc_fieldsplit_type schur > -fieldsplit_0_pc_fieldsplit_schur_factorization_type full > -fieldsplit_0_pc_fieldsplit_0_fields 2 > -fieldsplit_0_pc_fieldsplit_1_fields 3 > > I am almost sure that the 4 first commands are correct, I am not however > sure that the last two are understood by PETSc. > Actually I am worried that the DMShell that I created for the first level > schur is not passed on the second level schur. > Here is the error message I get when I run my code: > Sorry, I am really bogged down at the moment. Can you try this: 1) You do not need to specify 2,3 for the inner fields since it will use them automatically 2) Can you try changing src/dm/impls/shell/dmshell.c:664 to include DMSetUp(*subdm); ? Thanks, Matt > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Object is in wrong state! 
> [0]PETSC ERROR: Decomposition defined only after DMSetUp! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4597-g3edecfd GIT > Date: 2014-02-20 20:43:18 -0600 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: > /home/luc/research/feap_repo/ShearBands/parfeap-petsc34/feap on a > arch-linux2-c-opt named euler by luc Fri Feb 28 20:07:18 2014 > [0]PETSC ERROR: Libraries linked from > /home/luc/research/petsc/arch-linux2-c-opt/lib > [0]PETSC ERROR: Configure run at Fri Feb 21 17:31:31 2014 > [0]PETSC ERROR: Configure options --download-cmake --download-hypre > --download-metis --download-mpich --download-parmetis --with-debugging=0 > --with-share-libraries=0 > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: DMCreateFieldDecomposition() line 1262 in > /home/luc/research/petsc/src/dm/interface/dm.c > [0]PETSC ERROR: PCFieldSplitSetDefaults() line 336 in > /home/luc/research/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCSetUp_FieldSplit() line 485 in > /home/luc/research/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCSetUp() line 888 in > /home/luc/research/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUp() line 278 in > /home/luc/research/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: KSPSolve() line 390 in > /home/luc/research/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCApply_FieldSplit_Schur() line 859 in > /home/luc/research/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCApply() line 440 in > /home/luc/research/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSP_PCApply() line 227 in > /home/luc/research/petsc/include/petsc-private/kspimpl.h > [0]PETSC ERROR: KSPInitialResidual() line 64 in > /home/luc/research/petsc/src/ksp/ksp/interface/itres.c > [0]PETSC ERROR: KSPSolve_GMRES() line 234 in > /home/luc/research/petsc/src/ksp/ksp/impls/gmres/gmres.c > [0]PETSC ERROR: KSPSolve() line 432 in > /home/luc/research/petsc/src/ksp/ksp/interface/itfunc.c > > Let me know if I'm doing something wrong or misunderstood something. > > Best, > Luc > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From matt.landreman at gmail.com Sun Mar 2 16:23:46 2014 From: matt.landreman at gmail.com (Matt Landreman) Date: Sun, 2 Mar 2014 17:23:46 -0500 Subject: [petsc-users] Error using MUMPS to solve large linear system In-Reply-To: References: <0a26f0bf3e454a7cb284fd136453e7e5@NAGURSKI.anl.gov> <3f57971afae8492fb66dd34c8e2f1b03@NAGURSKI.anl.gov> <3FB64BEE-FFF6-4A1C-943D-52900613AF70@ldeo.columbia.edu> <51132AD1-BCA6-4865-8147-BE5F77A90DC7@mcs.anl.gov> <9D66FB28-3132-4E5E-B207-22A16517E28B@ldeo.columbia.edu> Message-ID: Hi, I'm having some problems with my PETSc application similar to the ones discussed in this thread, so perhaps one of you can help. 
In my application I factorize a preconditioner matrix with mumps or superlu_dist, using this factorized preconditioner to accelerate gmres on a matrix that is denser than the preconditioner. I've been running on edison at nersc. My program works reliably for problem sizes below about 1 million x 1 million, but above this size, the factorization step fails in one of many possible ways, depending on the compiler, # of nodes, # of procs/node, etc: When I use superlu_dist, I get 1 of 2 failure modes: (1) the first step of KSP returns "0 KSP residual norm -nan" and ksp then returns KSPConvergedReason = -9, or (2) the factorization completes, but GMRES then converges excruciatingly slowly or not at all, even if I choose the "real" matrix to be identical to the preconditioner matrix so KSP ought to converge in 1 step (which it does for smaller matrices). For mumps, the factorization can fail in many different ways: (3) With the intel compiler I usually get "Caught signal number 11 SEGV: Segmentation Violation" (4) Sometimes with the intel compiler I get "Caught signal number 7 BUS: Bus Error" (5) With the gnu compiler I often get a bunch of lines like "problem with NIV2_FLOPS message -5.9604644775390625E-008 0 -227464733.99999997" (6) Other times with gnu I get a mumps error with INFO(1)=-9 or INFO(1)=-17. The mumps documentation suggests I should increase icntl(14), but what is an appropriate value? 50? 10000? (7) With the Cray compiler I consistently get this cryptic error: Fatal error in PMPI_Test: Invalid MPI_Request, error stack: PMPI_Test(166): MPI_Test(request=0xb228dbf3c, flag=0x7ffffffe097c, status=0x7ffffffe0a00) failed PMPI_Test(121): Invalid MPI_Request _pmiu_daemon(SIGCHLD): [NID 02784] [c6-1c1s8n0] [Sun Mar 2 10:35:20 2014] PE RANK 0 exit signal Aborted [NID 02784] 2014-03-02 10:35:20 Apid 3374579: initiated application termination Application 3374579 exit codes: 134 For linear systems smaller than around 1 million^2, my application is very robust, working consistently with both mumps & superlu_dist, working for a wide range of # of nodes and # of procs/node, and working with all 3 available compilers on edison (intel, gnu, cray). By the way, mumps failed for much smaller problems until I tried -mat_mumps_icntl_7 2 (inspired by your conversation last week). I tried all the other options for icntl(7), icntl(28), and icntl(29), finding icntl(7)=2 works best by far. I tried the flags that worked for Samar (-mat_superlu_dist_colperm PARMETIS -mat_superlu_dist_parsymbfact 1) with superlu_dist, but they did not appear to change anything in my case. Can you recommend any other parameters of petsc, superlu_dist, or mumps that I should try changing? I don't care in the end whether I use superlu_dist or mumps. Thanks! Matt Landreman On Tue, Feb 25, 2014 at 3:50 PM, Xiaoye S. Li wrote: > Very good! Thanks for the update. > I guess you are using all 16 cores per node? Since superlu_dist currently > is MPI-only, if you generate 16 MPI tasks, serial symbolic factorization > only has less than 2 GB memory to work with. > > Sherry > > > On Tue, Feb 25, 2014 at 12:22 PM, Samar Khatiwala wrote: > >> Hi Sherry, >> >> Thanks! I tried your suggestions and it worked! 
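For reference, here is how the settings mentioned in this thread look as PETSc runtime options, collected in one place. The process count and the ICNTL(14) percentage below are only illustrative (suitable values are problem-dependent), and the solver-package options can equally be attached to whatever outer KSP/PC combination is actually in use:

    # superlu_dist with ParMETIS column ordering and parallel symbolic factorization
    mpiexec -n 128 ./yourapp -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package superlu_dist \
        -mat_superlu_dist_colperm PARMETIS -mat_superlu_dist_parsymbfact 1 \
        -mat_superlu_dist_statprint

    # MUMPS with ordering ICNTL(7)=2, verbose output ICNTL(4)=3,
    # and a larger workspace margin ICNTL(14)
    mpiexec -n 128 ./yourapp -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package mumps \
        -mat_mumps_icntl_7 2 -mat_mumps_icntl_4 3 -mat_mumps_icntl_14 100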
>> >> For the record I added these flags: -mat_superlu_dist_colperm PARMETIS >> -mat_superlu_dist_parsymbfact 1 >> >> Also, for completeness and since you asked: >> >> size: 2346346 x 2346346 >> nnz: 60856894 >> unsymmetric >> >> The hardware (http://www2.cisl.ucar.edu/resources/yellowstone/hardware) >> specs are: 2 GB/core, 32 GB/node (27 GB usable), (16 cores per node) >> I've been running on 8 nodes (so 8 x 27 ~ 216 GB). >> >> Thanks again for your help! >> >> Samar >> >> On Feb 25, 2014, at 1:00 PM, "Xiaoye S. Li" wrote: >> >> I didn't follow the discussion thread closely ... How large is your >> matrix dimension, and number of nonzeros? >> How large is the memory per core (or per node)? >> >> The default setting in superlu_dist is to use serial symbolic >> factorization. You can turn on parallel symbolic factorization by: >> >> options.ParSymbFact = YES; >> options.ColPerm = PARMETIS; >> >> Is your matrix symmetric? if so, you need to give both upper and lower >> half of matrix A to superlu, which doesn't exploit symmetry. >> >> Do you know whether you need numerical pivoting? If not, you can turn >> off pivoting by: >> >> options.RowPerm = NATURAL; >> >> This avoids some other serial bottleneck. >> >> All these options can be turned on in the petsc interface. Please check >> out the syntax there. >> >> >> Sherry >> >> >> >> On Tue, Feb 25, 2014 at 8:07 AM, Samar Khatiwala wrote: >> >>> Hi Barry, >>> >>> You're probably right. I note that the error occurs almost instantly and >>> I've tried increasing the number of CPUs >>> (as many as ~1000 on Yellowstone) to no avail. I know this is a big >>> problem but I didn't think it was that big! >>> >>> Sherry: Is there any way to write out more diagnostic info? E.g.,how >>> much memory superlu thinks it needs/is attempting >>> to allocate. >>> >>> Thanks, >>> >>> Samar >>> >>> On Feb 25, 2014, at 10:57 AM, Barry Smith wrote: >>> > >>> >> >>> >> I tried superlu_dist again and it crashes even more quickly than >>> MUMPS with just the following error: >>> >> >>> >> ERROR: 0031-250 task 128: Killed >>> > >>> > This is usually a symptom of running out of memory. >>> > >>> >> >>> >> Absolutely nothing else is written out to either stderr or stdout. >>> This is with -mat_superlu_dist_statprint. >>> >> The program works fine on a smaller matrix. >>> >> >>> >> This is the sequence of calls: >>> >> >>> >> KSPSetType(ksp,KSPPREONLY); >>> >> PCSetType(pc,PCLU); >>> >> PCFactorSetMatSolverPackage(pc,MATSOLVERSUPERLU_DIST); >>> >> KSPSetFromOptions(ksp); >>> >> PCSetFromOptions(pc); >>> >> KSPSolve(ksp,b,x); >>> >> >>> >> All of these successfully return *except* the very last one to >>> KSPSolve. >>> >> >>> >> Any help would be appreciated. Thanks! >>> >> >>> >> Samar >>> >> >>> >> On Feb 24, 2014, at 3:58 PM, Xiaoye S. Li wrote: >>> >> >>> >>> Samar: >>> >>> If you include the error message while crashing using superlu_dist, >>> I probably know the reason. (better yet, include the printout before the >>> crash. ) >>> >>> >>> >>> Sherry >>> >>> >>> >>> >>> >>> On Mon, Feb 24, 2014 at 9:56 AM, Hong Zhang >>> wrote: >>> >>> Samar : >>> >>> There are limitations for direct solvers. >>> >>> Do not expect any solver can be used on arbitrarily large problems. >>> >>> Since superlu_dist also crashes, direct solvers may not be able to >>> work on your application. >>> >>> This is why I suggest to increase size incrementally. >>> >>> You may have to experiment other type of solvers. 
>>> >>> >>> >>> Hong >>> >>> >>> >>> Hi Hong and Jed, >>> >>> >>> >>> Many thanks for replying. It would indeed be nice if the error >>> messages from MUMPS were less cryptic! >>> >>> >>> >>> 1) I have tried smaller matrices although given how my problem is >>> set up a jump is difficult to avoid. But a good idea >>> >>> that I will try. >>> >>> >>> >>> 2) I did try various ordering but not the one you suggested. >>> >>> >>> >>> 3) Tracing the error through the MUMPS code suggest a rather abrupt >>> termination of the program (there should be more >>> >>> error messages if, for example, memory was a problem). I therefore >>> thought it might be an interface problem rather than >>> >>> one with mumps and turned to the petsc-users group first. >>> >>> >>> >>> 4) I've tried superlu_dist but it also crashes (also unclear as to >>> why) at which point I decided to try mumps. The fact that both >>> >>> crash would again indicate a common (memory?) problem. >>> >>> >>> >>> I'll try a few more things before asking the MUMPS developers. >>> >>> >>> >>> Thanks again for your help! >>> >>> >>> >>> Samar >>> >>> >>> >>> On Feb 24, 2014, at 11:47 AM, Hong Zhang wrote: >>> >>> >>> >>>> Samar: >>> >>>> The crash occurs in >>> >>>> ... >>> >>>> [161]PETSC ERROR: Error in external library! >>> >>>> [161]PETSC ERROR: Error reported by MUMPS in numerical >>> factorization phase: INFO(1)=-1, INFO(2)=48 >>> >>>> >>> >>>> for very large matrix, likely memory problem as you suspected. >>> >>>> I would suggest >>> >>>> 1. run problems with increased sizes (not jump from a small one to >>> a very large one) and observe memory usage using >>> >>>> '-ksp_view'. >>> >>>> I see you use '-mat_mumps_icntl_14 1000', i.e., percentage of >>> estimated workspace increase. Is it too large? >>> >>>> Anyway, this input should not cause the crash, I guess. >>> >>>> 2. experimenting with different matrix ordering -mat_mumps_icntl_7 >>> <> (I usually use sequential ordering 2) >>> >>>> I see you use parallel ordering -mat_mumps_icntl_29 2. >>> >>>> 3. send bug report to mumps developers for their suggestion. >>> >>>> >>> >>>> 4. try other direct solvers, e.g., superlu_dist. >>> >>>> >>> >>>> ... >>> >>>> >>> >>>> etc etc. The above error I can tell has something to do with >>> processor 48 (INFO(2)) and so forth but not the previous one. >>> >>>> >>> >>>> The full output enabled with -mat_mumps_icntl_4 3 looks as in the >>> attached file. Any hints as to what could be giving this >>> >>>> error would be very much appreciated. >>> >>>> >>> >>>> I do not know how to interpret this output file. mumps developer >>> would give you better suggestion on it. >>> >>>> I would appreciate to learn as well :-) >>> >>>> >>> >>>> Hong >>> >>> >>> >>> >>> >>> >>> >> >>> > >>> >>> >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From luc.berger.vergiat at gmail.com Sun Mar 2 16:39:51 2014 From: luc.berger.vergiat at gmail.com (Luc) Date: Sun, 02 Mar 2014 17:39:51 -0500 Subject: [petsc-users] Fieldsplit schur applied on A00 of a fieldsplit schur In-Reply-To: References: <4CFC2209-79EB-42C6-9972-1467B8AE1A58@columbia.edu> Message-ID: <5313B337.1080209@gmail.com> Hi Matt, I just got your answer. Thanks for confirming my diagnostic. I will look at the code and do my best : ) I'll keep you posted on my progresses and issues. 
Best, Luc On 03/02/2014 02:33 PM, Matthew Knepley wrote: > On Fri, Feb 28, 2014 at 8:18 PM, Luc Berger-Vergiat > > > wrote: > > Hi all, > sorry for the cryptic title but this is a little complex. > Here is what I am doing: > I created a DMShell that gets four fields passed from a PetscSection. > Now I am doing this because I want to apply a schur complement to > my problem. > In order to do so I pass the following arguments to my code: > > -ksp_type gmres > -pc_type fieldsplit > -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_0_fields 2,3 <--- This define > A00 for my schur > -pc_fieldsplit_1_fields 0,1 > > Up to here everything works fine and as expected (I actually do a > -ksp_view to make sure that everything makes sense). > Now things get tricky, I would like to compute A00^-1 using > another schur decomposition so here are the commands I issue: > > -fieldsplit_0_ksp_type preonly > -fieldsplit_0_pc_type fieldsplit > -fieldsplit_0_pc_fieldsplit_type schur > -fieldsplit_0_pc_fieldsplit_schur_factorization_type full > -fieldsplit_0_pc_fieldsplit_0_fields 2 > -fieldsplit_0_pc_fieldsplit_1_fields 3 > > I am almost sure that the 4 first commands are correct, I am not > however sure that the last two are understood by PETSc. > Actually I am worried that the DMShell that I created for the > first level schur is not passed on the second level schur. > Here is the error message I get when I run my code: > > > Sorry, I am really bogged down at the moment. Can you try this: > > 1) You do not need to specify 2,3 for the inner fields since it will > use them automatically > > 2) Can you try changing src/dm/impls/shell/dmshell.c:664 to include > DMSetUp(*subdm); ? > > Thanks, > > Matt > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Object is in wrong state! > [0]PETSC ERROR: Decomposition defined only after DMSetUp! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Development GIT revision: > v3.4.3-4597-g3edecfd GIT Date: 2014-02-20 20:43:18 -0600 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: > /home/luc/research/feap_repo/ShearBands/parfeap-petsc34/feap on a > arch-linux2-c-opt named euler by luc Fri Feb 28 20:07:18 2014 > [0]PETSC ERROR: Libraries linked from > /home/luc/research/petsc/arch-linux2-c-opt/lib > [0]PETSC ERROR: Configure run at Fri Feb 21 17:31:31 2014 > [0]PETSC ERROR: Configure options --download-cmake > --download-hypre --download-metis --download-mpich > --download-parmetis --with-debugging=0 --with-share-libraries=0 > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: DMCreateFieldDecomposition() line 1262 in > /home/luc/research/petsc/src/dm/interface/dm.c > [0]PETSC ERROR: PCFieldSplitSetDefaults() line 336 in > /home/luc/research/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCSetUp_FieldSplit() line 485 in > /home/luc/research/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCSetUp() line 888 in > /home/luc/research/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUp() line 278 in > /home/luc/research/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: KSPSolve() line 390 in > /home/luc/research/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCApply_FieldSplit_Schur() line 859 in > /home/luc/research/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCApply() line 440 in > /home/luc/research/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSP_PCApply() line 227 in > /home/luc/research/petsc/include/petsc-private/kspimpl.h > [0]PETSC ERROR: KSPInitialResidual() line 64 in > /home/luc/research/petsc/src/ksp/ksp/interface/itres.c > [0]PETSC ERROR: KSPSolve_GMRES() line 234 in > /home/luc/research/petsc/src/ksp/ksp/impls/gmres/gmres.c > [0]PETSC ERROR: KSPSolve() line 432 in > /home/luc/research/petsc/src/ksp/ksp/interface/itfunc.c > > Let me know if I'm doing something wrong or misunderstood something. > > Best, > Luc > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From salazardetroya at gmail.com Sun Mar 2 18:54:50 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Sun, 2 Mar 2014 18:54:50 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: Hi everybody I am trying to run example ex12.c without much success. 
I specifically run it with the command options: ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 And I get this output Local function: Vec Object: 1 MPI processes type: seq 0 1 1 2 1 2 2 3 Initial guess Vec Object: 1 MPI processes type: seq L_2 Error: 0.625 Residual: Vec Object: 1 MPI processes type: seq 0 0 0 0 0 0 0 0 Initial Residual Vec Object: 1 MPI processes type: seq L_2 Residual: 0 [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [0]PETSC ERROR: likely location of problem given in stack below [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [0]PETSC ERROR: INSTEAD the line number of the start of the function [0]PETSC ERROR: is given. [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 /home/salaza11/petsc/src/dm/impls/plex/plexfem.c [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 /home/salaza11/petsc/src/snes/interface/snes.c [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 /home/salaza11/petsc/src/snes/interface/snes.c [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Signal received! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sun Mar 2 17:00:09 2014 [0]PETSC ERROR: Libraries linked from /home/salaza11/petsc/linux-gnu-c-debug/lib [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 [0]PETSC ERROR: Configure options --download-mpich --download-scientificpython --download-triangle --download-ctetgen --download-chaco --with-c2html=0 [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: User provided function() line 0 in unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 [unset]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 Probably my problems could be on my configuration. I attach the configure.log. I ran ./configure like this ./configure --download-mpich --download-scientificpython --download-triangle --download-ctetgen --download-chaco --with-c2html=0 Thanks a lot in advance. 
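One way to rule out a stale or mismatched build before digging further is to update to the current development ('next') branch, reconfigure with the same options used above, and rerun the same test. This is only a possible sequence and assumes PETSC_DIR and PETSC_ARCH are set for this build:

    cd $PETSC_DIR
    git checkout next && git pull
    ./configure --download-mpich --download-scientificpython --download-triangle \
        --download-ctetgen --download-chaco --with-c2html=0
    make all
    cd src/snes/examples/tutorials && make ex12
    ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet \
        -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1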
On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley wrote: > On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra wrote: > >> >> If >> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >> -variable_coefficient field -interpolate 1 -petscspace_order 2 >> -show_initial -dm_plex_print_fem >> >> is for serial, any chance we can get the options to run in parallel? >> > > Just use mpiexec -n > > Matt > > >> >> Regards >> Yaakoub El Khamra >> >> >> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley wrote: >> >>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>> MAJones2 at mdanderson.org> wrote: >>> >>>> >>>> ------------------------------ >>>> *From:* Matthew Knepley [knepley at gmail.com] >>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>> *To:* Jones,Martin Alexander >>>> *Cc:* petsc-users at mcs.anl.gov >>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>> >>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>> MAJones2 at mdanderson.org> wrote: >>>> >>>>> These examples all seem to run excepting the following command, >>>>> >>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>> -show_initial -dm_plex_print_fem >>>>> >>>>> I get the following ouput: >>>>> >>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>> -show_initial -dm_plex_print_fem >>>>> Local function: >>>>> ./ex12: symbol lookup error: >>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>> symbol: omp_get_num_procs >>>>> >>>> >>>> This is a build problem, but it should affect all the runs. Is this >>>> reproducible? Can you send configure.log? MKL is the worst. If this >>>> persists, I would just switch to --download-f-blas-lapack. >>>> >>> >>> Thanks. I have some advice on options >>> >>> --with-precision=single # I would not use this unless you are doing >>> something special, like CUDA >>> --with-clanguage=C++ # I would recommend switching to C, the build is >>> much faster >>> --with-mpi-dir=/usr --with-mpi4py=0 >>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 --with-fc=0 >>> --with-etags=1 # This is unnecessary >>> >>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>> # Here is the problem, see below >>> --download-metis >>> --download-fiat=yes --download-generator --download-scientificpython # >>> Get rid of these, they are obsolete >>> >>> Your MKL needs another library for the OpenMP symbols. I would recommend >>> switching to --download-f2cblaslapack, >>> or you can try and find that library. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> ------------------------------ >>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>> *To:* Jones,Martin Alexander >>>>> *Cc:* petsc-users at mcs.anl.gov >>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>> >>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>> MAJones2 at mdanderson.org> wrote: >>>>> >>>>>> Hi, This is the next error message after configuring and building >>>>>> with the triangle package when trying to run ex12 >>>>>> >>>>> >>>>> This is my fault for bad defaults. I will fix. 
Try running >>>>> >>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >>>>> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>> >>>>> for a representative run. Then you could try 3D >>>>> >>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>> -show_initial -dm_plex_print_fem >>>>> >>>>> or a full run >>>>> >>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>> -petscspace_order 1 >>>>> >>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>> -petscspace_order 2 >>>>> >>>>> Let me know if those work. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> ./ex12 >>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>> Exception,probably divide by zero >>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>> -on_error_attach_debugger >>>>>> [0]PETSC ERROR: or see >>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>>> corruption errors >>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>> ------------------------------------ >>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>> available, >>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>> function >>>>>> [0]PETSC ERROR: is given. >>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>> ------------------------------------ >>>>>> [0]PETSC ERROR: Signal received! >>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-2317-gcd0e7f7 >>>>>> GIT Date: 2014-01-15 20:33:42 -0600 >>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>> mjonesa Thu Jan 16 17:41:23 2014 >>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>> --download-triangle >>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>> ------------------------------ >>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>> *To:* Jones,Martin Alexander >>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>> >>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>> MAJones2 at mdanderson.org> wrote: >>>>>> >>>>>>> Hi, I have downloaded and built the dev version you suggested. I >>>>>>> think I need the triangle package to run this particular case. Is there any >>>>>>> thing else that appears wrong in what I have done from the error messages >>>>>>> below: >>>>>>> >>>>>> >>>>>> Great! Its running. You can reconfigure like this: >>>>>> >>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>> --download-triangle >>>>>> >>>>>> and then rebuild >>>>>> >>>>>> make >>>>>> >>>>>> and then rerun. You can load meshes, but its much easier to have >>>>>> triangle create them. >>>>>> >>>>>> Thanks for being patient, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>> ------------------------------------ >>>>>>> [0]PETSC ERROR: No support for this operation for this object type! >>>>>>> [0]PETSC ERROR: Mesh generation needs external package support. >>>>>>> Please reconfigure with --download-triangle.! >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>> mjonesa Thu Jan 16 16:28:20 2014 >>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>> --with-clanguage=c++ --with-c2html=0 >>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>> ------------------------------ >>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>> *To:* Jones,Martin Alexander >>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>> >>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander < >>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>> >>>>>>>> Hi. I changed the ENV variable to the correct entry. when I type >>>>>>>> make ex12 I get this: >>>>>>>> >>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>> make ex12 >>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings -Wno-strict-aliasing >>>>>>>> -Wno-unknown-pragmas -g -fPIC >>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or directory >>>>>>>> compilation terminated. >>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>> >>>>>>>> Any help of yours is very much appreciated. >>>>>>>> >>>>>>> >>>>>>> Yes, this relates to my 3). This is not going to work for you with >>>>>>> the release. Please see the link I sent. >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> ------------------------------ >>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>> *To:* Jones,Martin Alexander >>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>> >>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin Alexander < >>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>> >>>>>>>>> Thanks! 
>>>>>>>>> >>>>>>>> >>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> ------------------------------ >>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>> >>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin Alexander < >>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>> >>>>>>>>>> Now I went to the directory where ex12.c sits and just did a >>>>>>>>>> 'make ex12.c' with the following error if this helps? : >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>> make ex12.c >>>>>>>>>> >>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>> such file or directory >>>>>>>>>> >>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>> such file or directory >>>>>>>>>> >>>>>>>>>> make: *** No rule to make target >>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>> Stop. >>>>>>>>>> >>>>>>>>> >>>>>>>>> 1) You would type 'make ex12' >>>>>>>>> >>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) or >>>>>>>>> PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>> do not match what you built. Please send configure.log and >>>>>>>>> make.log >>>>>>>>> >>>>>>>>> 3) Since it was only recently added, if you want to use the FEM >>>>>>>>> functionality, you must use the development version: >>>>>>>>> >>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>> >>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>> but getting the following error: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order dim 1 >>>>>>>>>> laplacian dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>> Traceback (most recent call last): >>>>>>>>>> File "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line >>>>>>>>>> 15, in >>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> I have removed the requirement of generating the header file (its >>>>>>>>>> now all handled in C). I thought >>>>>>>>>> >>>>>>>>>> I changed the documentation everywhere (including the latest >>>>>>>>>> tutorial slides). Can you try running >>>>>>>>>> >>>>>>>>>> with 'master' (or 'next'), and point me toward the old docs? 
>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: text/x-log Size: 2214750 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: make.log Type: text/x-log Size: 86209 bytes Desc: not available URL: From knepley at gmail.com Sun Mar 2 19:11:57 2014 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 2 Mar 2014 19:11:57 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > Hi everybody > > I am trying to run example ex12.c without much success. I specifically run > it with the command options: > We need to start narrowing down differences, because it runs for me and our nightly tests. So, first can you confirm that you are using the latest 'next' branch? Thanks, Matt > ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet > -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 > > And I get this output > > Local function: > Vec Object: 1 MPI processes > type: seq > 0 > 1 > 1 > 2 > 1 > 2 > 2 > 3 > Initial guess > Vec Object: 1 MPI processes > type: seq > L_2 Error: 0.625 > Residual: > Vec Object: 1 MPI processes > type: seq > 0 > 0 > 0 > 0 > 0 > 0 > 0 > 0 > Initial Residual > Vec Object: 1 MPI processes > type: seq > L_2 Residual: 0 > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try > http://valgrind.org on GNU/linux and Apple Mac OS X to find memory > corruption errors > [0]PETSC ERROR: likely location of problem given in stack below > [0]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > [0]PETSC ERROR: INSTEAD the line number of the start of the function > [0]PETSC ERROR: is given. > [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 > /home/salaza11/petsc/src/dm/impls/plex/plexfem.c > [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 > /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c > [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 > /home/salaza11/petsc/src/snes/interface/snes.c > [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 > /home/salaza11/petsc/src/snes/interface/snes.c > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Signal received! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-3453-g0a94005 GIT > Date: 2014-03-02 13:12:04 -0600 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sun > Mar 2 17:00:09 2014 > [0]PETSC ERROR: Libraries linked from > /home/salaza11/petsc/linux-gnu-c-debug/lib > [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 > [0]PETSC ERROR: Configure options --download-mpich > --download-scientificpython --download-triangle --download-ctetgen > --download-chaco --with-c2html=0 > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: User provided function() line 0 in unknown file > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > [unset]: aborting job: > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > > > Probably my problems could be on my configuration. I attach the > configure.log. I ran ./configure like this > > ./configure --download-mpich --download-scientificpython > --download-triangle --download-ctetgen --download-chaco --with-c2html=0 > > Thanks a lot in advance. > > > > > > On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley wrote: > >> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra wrote: >> >>> >>> If >>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>> -show_initial -dm_plex_print_fem >>> >>> is for serial, any chance we can get the options to run in parallel? >>> >> >> Just use mpiexec -n >> >> Matt >> >> >>> >>> Regards >>> Yaakoub El Khamra >>> >>> >>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley wrote: >>> >>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>> MAJones2 at mdanderson.org> wrote: >>>> >>>>> >>>>> ------------------------------ >>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>> *To:* Jones,Martin Alexander >>>>> *Cc:* petsc-users at mcs.anl.gov >>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>> >>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>> MAJones2 at mdanderson.org> wrote: >>>>> >>>>>> These examples all seem to run excepting the following command, >>>>>> >>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>> -show_initial -dm_plex_print_fem >>>>>> >>>>>> I get the following ouput: >>>>>> >>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>> -show_initial -dm_plex_print_fem >>>>>> Local function: >>>>>> ./ex12: symbol lookup error: >>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>> symbol: omp_get_num_procs >>>>>> >>>>> >>>>> This is a build problem, but it should affect all the runs. Is this >>>>> reproducible? Can you send configure.log? MKL is the worst. If this >>>>> persists, I would just switch to --download-f-blas-lapack. >>>>> >>>> >>>> Thanks. 
I have some advice on options >>>> >>>> --with-precision=single # I would not use this unless you are doing >>>> something special, like CUDA >>>> --with-clanguage=C++ # I would recommend switching to C, the build >>>> is much faster >>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 --with-fc=0 >>>> --with-etags=1 # This is unnecessary >>>> >>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>> # Here is the problem, see below >>>> --download-metis >>>> --download-fiat=yes --download-generator --download-scientificpython >>>> # Get rid of these, they are obsolete >>>> >>>> Your MKL needs another library for the OpenMP symbols. I would >>>> recommend switching to --download-f2cblaslapack, >>>> or you can try and find that library. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> ------------------------------ >>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>> *To:* Jones,Martin Alexander >>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>> >>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>> MAJones2 at mdanderson.org> wrote: >>>>>> >>>>>>> Hi, This is the next error message after configuring and building >>>>>>> with the triangle package when trying to run ex12 >>>>>>> >>>>>> >>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>> >>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >>>>>> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>> >>>>>> for a representative run. Then you could try 3D >>>>>> >>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>> -show_initial -dm_plex_print_fem >>>>>> >>>>>> or a full run >>>>>> >>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>> -petscspace_order 1 >>>>>> >>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>> -petscspace_order 2 >>>>>> >>>>>> Let me know if those work. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> ./ex12 >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>> Exception,probably divide by zero >>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>> -on_error_attach_debugger >>>>>>> [0]PETSC ERROR: or see >>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>>>> corruption errors >>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>> ------------------------------------ >>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>> available, >>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>> function >>>>>>> [0]PETSC ERROR: is given. 
>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>> ------------------------------------ >>>>>>> [0]PETSC ERROR: Signal received! >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-2317-gcd0e7f7 >>>>>>> GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>> mjonesa Thu Jan 16 17:41:23 2014 >>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>> --download-triangle >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>> ------------------------------ >>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>> *To:* Jones,Martin Alexander >>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>> >>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>> >>>>>>>> Hi, I have downloaded and built the dev version you suggested. I >>>>>>>> think I need the triangle package to run this particular case. Is there any >>>>>>>> thing else that appears wrong in what I have done from the error messages >>>>>>>> below: >>>>>>>> >>>>>>> >>>>>>> Great! Its running. You can reconfigure like this: >>>>>>> >>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>> --download-triangle >>>>>>> >>>>>>> and then rebuild >>>>>>> >>>>>>> make >>>>>>> >>>>>>> and then rerun. You can load meshes, but its much easier to have >>>>>>> triangle create them. 
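Collected in one place, the reconfigure-and-rerun cycle Matt describes here amounts to roughly the following; the paths assume the PETSC_DIR/PETSC_ARCH environment discussed earlier in the thread is already set, and the ex12 options are the representative ones quoted above.

    cd $PETSC_DIR
    $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle
    make
    cd src/snes/examples/tutorials
    make ex12
    ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet \
           -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1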
>>>>>>> >>>>>>> Thanks for being patient, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>> ------------------------------------ >>>>>>>> [0]PETSC ERROR: No support for this operation for this object type! >>>>>>>> [0]PETSC ERROR: Mesh generation needs external package support. >>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>>> mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>> --with-clanguage=c++ --with-c2html=0 >>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>>> ------------------------------ >>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>> *To:* Jones,Martin Alexander >>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>> >>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander < >>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>> >>>>>>>>> Hi. I changed the ENV variable to the correct entry. when I type >>>>>>>>> make ex12 I get this: >>>>>>>>> >>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>> make ex12 >>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings -Wno-strict-aliasing >>>>>>>>> -Wno-unknown-pragmas -g -fPIC >>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or directory >>>>>>>>> compilation terminated. >>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>> >>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>> >>>>>>>> >>>>>>>> Yes, this relates to my 3). This is not going to work for you >>>>>>>> with the release. 
Please see the link I sent. >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> ------------------------------ >>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>> >>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin Alexander < >>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>> >>>>>>>>>> Thanks! >>>>>>>>>> >>>>>>>>> >>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> ------------------------------ >>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>> >>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin Alexander < >>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>> >>>>>>>>>>> Now I went to the directory where ex12.c sits and just did a >>>>>>>>>>> 'make ex12.c' with the following error if this helps? : >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>> make ex12.c >>>>>>>>>>> >>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>> such file or directory >>>>>>>>>>> >>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>> such file or directory >>>>>>>>>>> >>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>> Stop. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>> >>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) or >>>>>>>>>> PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>>> do not match what you built. 
Please send configure.log and >>>>>>>>>> make.log >>>>>>>>>> >>>>>>>>>> 3) Since it was only recently added, if you want to use the FEM >>>>>>>>>> functionality, you must use the development version: >>>>>>>>>> >>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>> >>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>> but getting the following error: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order dim >>>>>>>>>>> 1 laplacian dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>> File "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line >>>>>>>>>>> 15, in >>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> I have removed the requirement of generating the header file >>>>>>>>>>> (its now all handled in C). I thought >>>>>>>>>>> >>>>>>>>>>> I changed the documentation everywhere (including the latest >>>>>>>>>>> tutorial slides). Can you try running >>>>>>>>>>> >>>>>>>>>>> with 'master' (or 'next'), and point me toward the old docs? >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. 
>>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > *Miguel Angel Salazar de Troya* > Graduate Research Assistant > Department of Mechanical Science and Engineering > University of Illinois at Urbana-Champaign > (217) 550-2360 > salaza11 at illinois.edu > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From luc.berger.vergiat at gmail.com Sun Mar 2 19:41:27 2014 From: luc.berger.vergiat at gmail.com (Luc) Date: Sun, 02 Mar 2014 20:41:27 -0500 Subject: [petsc-users] Fieldsplit schur applied on A00 of a fieldsplit schur In-Reply-To: References: <4CFC2209-79EB-42C6-9972-1467B8AE1A58@columbia.edu> Message-ID: <5313DDC7.5070801@gmail.com> So I did the modification as you suggested and it works fine. I tried it on an example where I do two schur complements (on the global system and then on the A00 term of the global schur) and solve the A00 and S blocks of the local schur with jacobi sweeps. I attached the -ksp_view output for info. Best, Luc On 03/02/2014 02:33 PM, Matthew Knepley wrote: > On Fri, Feb 28, 2014 at 8:18 PM, Luc Berger-Vergiat > > > wrote: > > Hi all, > sorry for the cryptic title but this is a little complex. > Here is what I am doing: > I created a DMShell that gets four fields passed from a PetscSection. > Now I am doing this because I want to apply a schur complement to > my problem. > In order to do so I pass the following arguments to my code: > > -ksp_type gmres > -pc_type fieldsplit > -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_0_fields 2,3 <--- This define > A00 for my schur > -pc_fieldsplit_1_fields 0,1 > > Up to here everything works fine and as expected (I actually do a > -ksp_view to make sure that everything makes sense). > Now things get tricky, I would like to compute A00^-1 using > another schur decomposition so here are the commands I issue: > > -fieldsplit_0_ksp_type preonly > -fieldsplit_0_pc_type fieldsplit > -fieldsplit_0_pc_fieldsplit_type schur > -fieldsplit_0_pc_fieldsplit_schur_factorization_type full > -fieldsplit_0_pc_fieldsplit_0_fields 2 > -fieldsplit_0_pc_fieldsplit_1_fields 3 > > I am almost sure that the 4 first commands are correct, I am not > however sure that the last two are understood by PETSc. > Actually I am worried that the DMShell that I created for the > first level schur is not passed on the second level schur. 
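As an aside for readers of the archive, the same two-level split can also be reached from code once the outer split exists, which makes it easier to see which objects the inner '-fieldsplit_0_*' options act on. The sketch below is an illustration under assumed names ('ksp' is the outer solver already attached to the DMShell) with error checking omitted; it is not code from this thread, and configuring everything through option prefixes as above is equally valid.

    PC       pc, pcA00;
    KSP     *subksp;
    PetscInt nsplits;

    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCFIELDSPLIT);
    PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);
    /* which fields form A00 (2,3) and the Schur block (0,1) comes from the
       options / the DM field decomposition, as in the command line above */
    KSPSetUp(ksp);                                /* builds the outer splits */
    PCFieldSplitGetSubKSP(pc, &nsplits, &subksp); /* subksp[0]: A00 solver, subksp[1]: Schur solver */
    KSPGetPC(subksp[0], &pcA00);
    PCSetType(pcA00, PCFIELDSPLIT);               /* inner Schur split on A00 */
    PCFieldSplitSetType(pcA00, PC_COMPOSITE_SCHUR);
    PetscFree(subksp);

Whether the field decomposition carried by the DMShell actually reaches that inner split is exactly the question the rest of this message deals with.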
> Here is the error message I get when I run my code: > > > Sorry, I am really bogged down at the moment. Can you try this: > > 1) You do not need to specify 2,3 for the inner fields since it will > use them automatically > > 2) Can you try changing src/dm/impls/shell/dmshell.c:664 to include > DMSetUp(*subdm); ? > > Thanks, > > Matt > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Object is in wrong state! > [0]PETSC ERROR: Decomposition defined only after DMSetUp! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Development GIT revision: > v3.4.3-4597-g3edecfd GIT Date: 2014-02-20 20:43:18 -0600 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: > /home/luc/research/feap_repo/ShearBands/parfeap-petsc34/feap on a > arch-linux2-c-opt named euler by luc Fri Feb 28 20:07:18 2014 > [0]PETSC ERROR: Libraries linked from > /home/luc/research/petsc/arch-linux2-c-opt/lib > [0]PETSC ERROR: Configure run at Fri Feb 21 17:31:31 2014 > [0]PETSC ERROR: Configure options --download-cmake > --download-hypre --download-metis --download-mpich > --download-parmetis --with-debugging=0 --with-share-libraries=0 > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: DMCreateFieldDecomposition() line 1262 in > /home/luc/research/petsc/src/dm/interface/dm.c > [0]PETSC ERROR: PCFieldSplitSetDefaults() line 336 in > /home/luc/research/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCSetUp_FieldSplit() line 485 in > /home/luc/research/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCSetUp() line 888 in > /home/luc/research/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUp() line 278 in > /home/luc/research/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: KSPSolve() line 390 in > /home/luc/research/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCApply_FieldSplit_Schur() line 859 in > /home/luc/research/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCApply() line 440 in > /home/luc/research/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSP_PCApply() line 227 in > /home/luc/research/petsc/include/petsc-private/kspimpl.h > [0]PETSC ERROR: KSPInitialResidual() line 64 in > /home/luc/research/petsc/src/ksp/ksp/interface/itres.c > [0]PETSC ERROR: KSPSolve_GMRES() line 234 in > /home/luc/research/petsc/src/ksp/ksp/impls/gmres/gmres.c > [0]PETSC ERROR: KSPSolve() line 432 in > /home/luc/research/petsc/src/ksp/ksp/interface/itfunc.c > > Let me know if I'm doing something wrong or misunderstood something. > > Best, > Luc > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- KSP Object: 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from A11 Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from A11 Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: jacobi linear system matrix = precond matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: jacobi linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: schurcomplement rows=4, cols=4 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: jacobi linear system matrix = precond matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during 
MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=8, cols=8 total: nonzeros=64, allocated nonzeros=64 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_1_) 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_1_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot [INBLOCKS] matrix ordering: natural factor fill ratio given 1, needed 1 Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=8, cols=8 package used to perform factorization: petsc total: nonzeros=64, allocated nonzeros=64 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_1_) 1 MPI processes type: schurcomplement rows=8, cols=8 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_1_) 1 MPI processes type: seqaij rows=8, cols=8 total: nonzeros=64, allocated nonzeros=64 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=8, cols=8 total: nonzeros=64, allocated nonzeros=64 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_0_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from A11 Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: jacobi linear system matrix = precond matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE 
norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: jacobi linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: schurcomplement rows=4, cols=4 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: jacobi linear system matrix = precond matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=16, allocated nonzeros=16 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 1 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=8, cols=8 total: nonzeros=64, allocated nonzeros=64 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=8, cols=8 total: nonzeros=64, allocated nonzeros=64 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 Mat Object: (fieldsplit_1_) 1 MPI processes type: seqaij rows=8, cols=8 total: nonzeros=64, allocated nonzeros=64 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=16, cols=16 total: nonzeros=256, allocated nonzeros=320 total number of mallocs used during MatSetValues calls =16 using I-node routines: found 4 nodes, limit used is 5 -------------- next part -------------- ./ex1 -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type fieldsplit -fieldsplit_0_pc_fieldsplit_type schur -fieldsplit_0_pc_fieldsplit_schur_factorization_type full -fieldsplit_0_fieldsplit_Field_2_ksp_type preonly -fieldsplit_0_fieldsplit_Field_2_pc_type jacobi -fieldsplit_0_fieldsplit_Field_3_ksp_type preonly -fieldsplit_0_fieldsplit_Field_3_pc_type jacobi -ksp_view -ksp_monitor From epscodes at gmail.com Mon Mar 3 11:20:07 2014 From: epscodes at gmail.com (Xiangdong) Date: Mon, 3 Mar 2014 12:20:07 -0500 Subject: [petsc-users] DMDA 
questions In-Reply-To: References: <87zjlfw4al.fsf@jedbrown.org> <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> Message-ID: I have a question about using KSPSetComputeOperator with user provided function ComputeMatrix(). In order to use multigrid, this function ComputeMatrix() should be able to generate the matrix for any grid level. However, the matrix I am trying to generate depending on a vector u (the solution at previous time step). It is okay for ComputeMatrix at finest level, because I have the vector u at the finest level. However, if the ComputeMatrix tries to compute the matrix at a coarse level, I do not have the vector u at the coarse level. Then I see some error messages about nonconfirming sizes. Should I manually restrict the vector u at a coarse level or petsc has some functions on this? In other words, there is a vector u in ctx struct passing to the ComputeMatrix at finest level, how can I interpolate/restrict it to the correct grid level for multigrid applications? Thank you. Xiangdong On Fri, Feb 28, 2014 at 5:16 PM, Matthew Knepley wrote: > On Fri, Feb 28, 2014 at 2:14 PM, Xiangdong wrote: >> >> On Fri, Feb 28, 2014 at 5:11 PM, Matthew Knepley wrote: >> >>> On Fri, Feb 28, 2014 at 2:10 PM, Xiangdong wrote: >>> >>>> >>>> >>>> >>>> On Fri, Feb 28, 2014 at 4:48 PM, Barry Smith wrote: >>>> >>>>> >>>>> On Feb 28, 2014, at 3:27 PM, Xiangdong wrote: >>>>> >>>>> > >>>>> > >>>>> > >>>>> > On Fri, Feb 28, 2014 at 4:21 PM, Matthew Knepley >>>>> wrote: >>>>> > On Fri, Feb 28, 2014 at 1:16 PM, Xiangdong >>>>> wrote: >>>>> > If I assembly the matrix with MatSetValuesStencil and use >>>>> KspSetOpreators for Ksp, do I need to call KspSetDM first? >>>>> > >>>>> > No >>>>> > >>>>> > What is the key difference or advantage of using >>>>> KspSetComputeOperators against KspSetOperators? >>>>> > >>>>> > With the later, you have to manage creating and preallocating the >>>>> matrix. >>>>> > >>>>> > If I use DMCreateMatrix and MatSetValuesStencil, I do not need to >>>>> preallocate the matrix. >>>>> >>>>> Correct. >>>>> >>>>> The SetComputeOperators and KSPSetDM is useful if you wish to use >>>>> multigrid on system since PCMG will call your compute operators function on >>>>> each level for you automatically. See src/ksp/ksp/examples/tutorials/ >>>>> ex25.c ex28.c ex29.c ex31.c ex32.c ex34.c ex45.c ex50.c If you are just >>>>> solving the one system and not using multigrid then there is no particular >>>>> advantage in SetComputeOperators and KSPSetDM >>>>> >>>> >>>> If I want to use multigrid as a preconditioner, I have to call >>>> KspSetComputeOperators. Is this true? >>>> >>> >>> No, you can always do everything by hand, but that call simplifies the >>> job. >>> >> >> Does "by hand" mean providing the routines for assembling the matrix at >> different level/size manually? >> > > It means providing the matrices directly, or using the Galerkin process. > There is a manual section on it. > > Matt > > >> Xiangdong >> >> >>> >>> Matt >>> >>> >>>> Thank you. >>>> >>>> Xiangdong >>>> >>>> >>>> >>>> >>>>> >>>>> Barry >>>>> >>>>> > >>>>> > Xiangdong >>>>> > >>>>> > >>>>> > >>>>> > >>>>> > Matt >>>>> > >>>>> > Thanks you. >>>>> > >>>>> > Xiangdong >>>>> > >>>>> > >>>>> > On Tue, Feb 25, 2014 at 4:24 PM, Jed Brown wrote: >>>>> > Xiangdong writes: >>>>> > > I am not clear about this. In this example, where is the format >>>>> declaration >>>>> > > (e.g.,mpiaij) of Mat jac? 
Why is this mat jac always distributed in >>>>> > > a compatible way as the DM vectors? >>>>> > >>>>> > src/ksp/ksp/examples/tutorials/ex25.c calls KSPSetDM and sets >>>>> functions >>>>> > to assemble the matrices and right hand side. Those objects are >>>>> created >>>>> > internally (you can use -dm_mat_type sbaij if you like) and the user >>>>> > doesn't have to see them. >>>>> > >>>>> > >>>>> > >>>>> > >>>>> > -- >>>>> > What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> > -- Norbert Wiener >>>>> > >>>>> >>>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Mon Mar 3 11:35:19 2014 From: jed at jedbrown.org (Jed Brown) Date: Mon, 03 Mar 2014 11:35:19 -0600 Subject: [petsc-users] DMDA questions In-Reply-To: References: <87zjlfw4al.fsf@jedbrown.org> <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> Message-ID: <877g8bnq20.fsf@jedbrown.org> Xiangdong writes: > I have a question about using KSPSetComputeOperator with user provided > function ComputeMatrix(). > > In order to use multigrid, this function ComputeMatrix() should be able to > generate the matrix for any grid level. However, the matrix I am trying to > generate depending on a vector u (the solution at previous time step). It > is okay for ComputeMatrix at finest level, because I have the vector u at > the finest level. However, if the ComputeMatrix tries to compute the matrix > at a coarse level, I do not have the vector u at the coarse level. Then I > see some error messages about nonconfirming sizes. Should I manually > restrict the vector u at a coarse level or petsc has some functions on this? > > In other words, there is a vector u in ctx struct passing to the > ComputeMatrix at finest level, how can I interpolate/restrict it to the > correct grid level for multigrid applications? I encourage you to use TS because it can manage this for you, but if you want to manage it yourself, use DMCoarsenHookAdd to set callbacks that are called as the coarse grids are constructed and set up and PCMG. Use DMGetNamedGlobalVector to store your coarsened state and access it from your compute matrix function. See the TS implementations if you want an example to follow. -------------- next part -------------- A non-text attachment was scrubbed... 
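To make the DMCoarsenHookAdd / DMGetNamedGlobalVector pattern a bit more concrete, here is a rough sketch of carrying the previous-time-step vector down to the coarser levels so that a KSPSetComputeOperators callback can use it on every level. The hook names, the "previous-u" label, and the surrounding setup are assumptions for illustration; the TS implementations mentioned above are the authoritative reference, and error checking is omitted.

    static PetscErrorCode MyCoarsenHook(DM fine, DM coarse, void *ctx)
    {
      return 0;   /* nothing extra to set up per level in this sketch */
    }

    static PetscErrorCode MyRestrictHook(DM fine, Mat restrct, Vec rscale, Mat inject, DM coarse, void *ctx)
    {
      Vec uf, uc;
      DMGetNamedGlobalVector(fine,   "previous-u", &uf);
      DMGetNamedGlobalVector(coarse, "previous-u", &uc);
      MatRestrict(restrct, uf, uc);        /* restrict the state one level down */
      VecPointwiseMult(uc, rscale, uc);    /* rescale, as the TS restrict hooks do */
      DMRestoreNamedGlobalVector(coarse, "previous-u", &uc);
      DMRestoreNamedGlobalVector(fine,   "previous-u", &uf);
      return 0;
    }

    /* after creating the fine DMDA 'da' and filling its "previous-u" named
       vector with the previous solution, and before KSPSolve():
         DMCoarsenHookAdd(da, MyCoarsenHook, MyRestrictHook, NULL);
       inside the ComputeMatrix() callback, fetch the level-appropriate copy:
         KSPGetDM(ksp, &dm);
         DMGetNamedGlobalVector(dm, "previous-u", &u);
         ... assemble using u ...
         DMRestoreNamedGlobalVector(dm, "previous-u", &u);                     */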
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From salazardetroya at gmail.com Mon Mar 3 12:39:45 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Mon, 3 Mar 2014 12:39:45 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: Thanks for your response. Sorry I did not have the "next" version, but the "master" version. I still have an error though. I followed the steps given here (https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the next version, I configured petsc as above and ran ex12 as above as well, getting this error: [salaza11 at maya tutorials]$ ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 Local function: Vec Object: 1 MPI processes type: seq 0 0.25 1 0.25 0.5 1.25 1 1.25 2 Initial guess Vec Object: 1 MPI processes type: seq 0.5 L_2 Error: 0.111111 Residual: Vec Object: 1 MPI processes type: seq 0 0 0 0 0 0 0 0 0 Initial Residual Vec Object: 1 MPI processes type: seq 0 L_2 Residual: 0 [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [0]PETSC ERROR: likely location of problem given in stack below [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [0]PETSC ERROR: INSTEAD the line number of the start of the function [0]PETSC ERROR: is given. [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 /home/salaza11/petsc/src/dm/impls/plex/plexfem.c [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 /home/salaza11/petsc/src/snes/interface/snes.c [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 /home/salaza11/petsc/src/snes/interface/snes.c [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Signal received [0]PETSC ERROR: See http:// http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Mon Mar 3 11:49:15 2014 [0]PETSC ERROR: Configure options --download-mpich --download-scientificpython --download-triangle --download-ctetgen --download-chaco --with-c2html=0 [0]PETSC ERROR: #1 User provided function() line 0 in unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 [unset]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley wrote: > On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < > salazardetroya at gmail.com> wrote: > >> Hi everybody >> >> I am trying to run example ex12.c without much success. I specifically >> run it with the command options: >> > > We need to start narrowing down differences, because it runs for me and > our nightly tests. So, first can > you confirm that you are using the latest 'next' branch? > > Thanks, > > Matt > > >> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >> >> And I get this output >> >> Local function: >> Vec Object: 1 MPI processes >> type: seq >> 0 >> 1 >> 1 >> 2 >> 1 >> 2 >> 2 >> 3 >> Initial guess >> Vec Object: 1 MPI processes >> type: seq >> L_2 Error: 0.625 >> Residual: >> Vec Object: 1 MPI processes >> type: seq >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> Initial Residual >> Vec Object: 1 MPI processes >> type: seq >> L_2 Residual: 0 >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >> probably memory access out of range >> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger >> [0]PETSC ERROR: or see >> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >> corruption errors >> [0]PETSC ERROR: likely location of problem given in stack below >> [0]PETSC ERROR: --------------------- Stack Frames >> ------------------------------------ >> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >> available, >> [0]PETSC ERROR: INSTEAD the line number of the start of the function >> [0]PETSC ERROR: is given. >> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >> /home/salaza11/petsc/src/snes/interface/snes.c >> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >> /home/salaza11/petsc/src/snes/interface/snes.c >> [0]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [0]PETSC ERROR: Signal received! >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-3453-g0a94005 GIT >> Date: 2014-03-02 13:12:04 -0600 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. 
>> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sun >> Mar 2 17:00:09 2014 >> [0]PETSC ERROR: Libraries linked from >> /home/salaza11/petsc/linux-gnu-c-debug/lib >> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >> [0]PETSC ERROR: Configure options --download-mpich >> --download-scientificpython --download-triangle --download-ctetgen >> --download-chaco --with-c2html=0 >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: User provided function() line 0 in unknown file >> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >> [unset]: aborting job: >> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >> >> >> Probably my problems could be on my configuration. I attach the >> configure.log. I ran ./configure like this >> >> ./configure --download-mpich --download-scientificpython >> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >> >> Thanks a lot in advance. >> >> >> >> >> >> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley wrote: >> >>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra >> > wrote: >>> >>>> >>>> If >>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>> -show_initial -dm_plex_print_fem >>>> >>>> is for serial, any chance we can get the options to run in parallel? >>>> >>> >>> Just use mpiexec -n >>> >>> Matt >>> >>> >>>> >>>> Regards >>>> Yaakoub El Khamra >>>> >>>> >>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley wrote: >>>> >>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>> MAJones2 at mdanderson.org> wrote: >>>>> >>>>>> >>>>>> ------------------------------ >>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>> *To:* Jones,Martin Alexander >>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>> >>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>>> MAJones2 at mdanderson.org> wrote: >>>>>> >>>>>>> These examples all seem to run excepting the following command, >>>>>>> >>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>> -show_initial -dm_plex_print_fem >>>>>>> >>>>>>> I get the following ouput: >>>>>>> >>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>> -show_initial -dm_plex_print_fem >>>>>>> Local function: >>>>>>> ./ex12: symbol lookup error: >>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>> symbol: omp_get_num_procs >>>>>>> >>>>>> >>>>>> This is a build problem, but it should affect all the runs. Is this >>>>>> reproducible? Can you send configure.log? MKL is the worst. If this >>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>> >>>>> >>>>> Thanks. 
I have some advice on options >>>>> >>>>> --with-precision=single # I would not use this unless you are doing >>>>> something special, like CUDA >>>>> --with-clanguage=C++ # I would recommend switching to C, the build >>>>> is much faster >>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 --with-fc=0 >>>>> --with-etags=1 # This is unnecessary >>>>> >>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>> # Here is the problem, see below >>>>> --download-metis >>>>> --download-fiat=yes --download-generator --download-scientificpython >>>>> # Get rid of these, they are obsolete >>>>> >>>>> Your MKL needs another library for the OpenMP symbols. I would >>>>> recommend switching to --download-f2cblaslapack, >>>>> or you can try and find that library. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> ------------------------------ >>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>> *To:* Jones,Martin Alexander >>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>> >>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>> >>>>>>>> Hi, This is the next error message after configuring and building >>>>>>>> with the triangle package when trying to run ex12 >>>>>>>> >>>>>>> >>>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>>> >>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>> -dm_plex_print_fem 1 >>>>>>> >>>>>>> for a representative run. Then you could try 3D >>>>>>> >>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>> -show_initial -dm_plex_print_fem >>>>>>> >>>>>>> or a full run >>>>>>> >>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>> -petscspace_order 1 >>>>>>> >>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>> -petscspace_order 2 >>>>>>> >>>>>>> Let me know if those work. >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> ./ex12 >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>> Exception,probably divide by zero >>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>> -on_error_attach_debugger >>>>>>>> [0]PETSC ERROR: or see >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>>>>> corruption errors >>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>> ------------------------------------ >>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>> available, >>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>> function >>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>> ------------------------------------ >>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>>> mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>> --download-triangle >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>> ------------------------------ >>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>> *To:* Jones,Martin Alexander >>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>> >>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>> >>>>>>>>> Hi, I have downloaded and built the dev version you suggested. I >>>>>>>>> think I need the triangle package to run this particular case. Is there any >>>>>>>>> thing else that appears wrong in what I have done from the error messages >>>>>>>>> below: >>>>>>>>> >>>>>>>> >>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>> >>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>> --download-triangle >>>>>>>> >>>>>>>> and then rebuild >>>>>>>> >>>>>>>> make >>>>>>>> >>>>>>>> and then rerun. You can load meshes, but its much easier to have >>>>>>>> triangle create them. 
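For reference, the reconfigure-and-rebuild cycle suggested above would look roughly like the sketch below. It assumes PETSC_DIR and PETSC_ARCH are set to the values of the original build (arch-linux2-cxx-debug in this thread) so that the saved reconfigure script exists under $PETSC_DIR/$PETSC_ARCH/conf/; the run options on the last line are the representative test run suggested elsewhere in the thread.

  cd $PETSC_DIR
  # re-run the saved configure with the extra package added
  $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle
  # rebuild the PETSc libraries
  make
  # rebuild and rerun the example
  cd src/snes/examples/tutorials
  make ex12
  ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1
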
>>>>>>>> >>>>>>>> Thanks for being patient, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>> ------------------------------------ >>>>>>>>> [0]PETSC ERROR: No support for this operation for this object type! >>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package support. >>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>>>> mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>> --with-clanguage=c++ --with-c2html=0 >>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>>>> ------------------------------ >>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>> >>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander < >>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>> >>>>>>>>>> Hi. I changed the ENV variable to the correct entry. when I >>>>>>>>>> type make ex12 I get this: >>>>>>>>>> >>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>> make ex12 >>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings -Wno-strict-aliasing >>>>>>>>>> -Wno-unknown-pragmas -g -fPIC >>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or directory >>>>>>>>>> compilation terminated. >>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>> >>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>> >>>>>>>>> >>>>>>>>> Yes, this relates to my 3). 
This is not going to work for you >>>>>>>>> with the release. Please see the link I sent. >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> ------------------------------ >>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>> >>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin Alexander < >>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>> >>>>>>>>>>> Thanks! >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> ------------------------------ >>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>> >>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin Alexander < >>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>> >>>>>>>>>>>> Now I went to the directory where ex12.c sits and just did a >>>>>>>>>>>> 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>> make ex12.c >>>>>>>>>>>> >>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>> such file or directory >>>>>>>>>>>> >>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>> such file or directory >>>>>>>>>>>> >>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>> Stop. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>> >>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) or >>>>>>>>>>> PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>>>> do not match what you built. 
Please send configure.log and >>>>>>>>>>> make.log >>>>>>>>>>> >>>>>>>>>>> 3) Since it was only recently added, if you want to use the >>>>>>>>>>> FEM functionality, you must use the development version: >>>>>>>>>>> >>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>> >>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>> but getting the following error: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order dim >>>>>>>>>>>> 1 laplacian dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>> File "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line >>>>>>>>>>>> 15, in >>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> I have removed the requirement of generating the header file >>>>>>>>>>>> (its now all handled in C). I thought >>>>>>>>>>>> >>>>>>>>>>>> I changed the documentation everywhere (including the latest >>>>>>>>>>>> tutorial slides). Can you try running >>>>>>>>>>>> >>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old docs? >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. >>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. 
>>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> >> >> -- >> *Miguel Angel Salazar de Troya* >> Graduate Research Assistant >> Department of Mechanical Science and Engineering >> University of Illinois at Urbana-Champaign >> (217) 550-2360 >> salaza11 at illinois.edu >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Mar 3 12:48:56 2014 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 3 Mar 2014 12:48:56 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > Thanks for your response. Sorry I did not have the "next" version, but the > "master" version. I still have an error though. I followed the steps given > here (https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the next > version, I configured petsc as above and ran ex12 as above as well, getting > this error: > > [salaza11 at maya tutorials]$ ./ex12 -run_type test -refinement_limit 0.0 > -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial > -dm_plex_print_fem 1 > Local function: > Vec Object: 1 MPI processes > type: seq > 0 > 0.25 > 1 > 0.25 > 0.5 > 1.25 > 1 > 1.25 > 2 > Initial guess > Vec Object: 1 MPI processes > type: seq > 0.5 > L_2 Error: 0.111111 > Residual: > Vec Object: 1 MPI processes > type: seq > 0 > 0 > 0 > 0 > 0 > 0 > 0 > 0 > 0 > Initial Residual > Vec Object: 1 MPI processes > type: seq > 0 > L_2 Residual: 0 > Okay, now run with -start_in_debugger, and give me a stack trace using 'where'. 
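For example, a session along these lines should produce the requested trace (a sketch only; the exact gdb prompts differ by machine, and the 'noxterm' argument simply keeps gdb in the current terminal instead of opening a new xterm):

  ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet \
         -interpolate 0 -petscspace_order 1 -start_in_debugger noxterm
  # gdb attaches while the program is paused inside PetscInitialize
  (gdb) c          # continue until the SEGV is hit
  (gdb) where      # print the stack trace
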
Thanks, Matt > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try > http://valgrind.org on GNU/linux and Apple Mac OS X to find memory > corruption errors > [0]PETSC ERROR: likely location of problem given in stack below > [0]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > [0]PETSC ERROR: INSTEAD the line number of the start of the function > [0]PETSC ERROR: is given. > [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 > /home/salaza11/petsc/src/dm/impls/plex/plexfem.c > [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 > /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c > [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 > /home/salaza11/petsc/src/snes/interface/snes.c > [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 > /home/salaza11/petsc/src/snes/interface/snes.c > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Signal received > [0]PETSC ERROR: See http:// > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4705-gfb6b3bc GIT > Date: 2014-03-03 08:23:43 -0600 > [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Mon > Mar 3 11:49:15 2014 > [0]PETSC ERROR: Configure options --download-mpich > --download-scientificpython --download-triangle --download-ctetgen > --download-chaco --with-c2html=0 > [0]PETSC ERROR: #1 User provided function() line 0 in unknown file > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > [unset]: aborting job: > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > > > > On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley wrote: > >> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> Hi everybody >>> >>> I am trying to run example ex12.c without much success. I specifically >>> run it with the command options: >>> >> >> We need to start narrowing down differences, because it runs for me and >> our nightly tests. So, first can >> you confirm that you are using the latest 'next' branch? 
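A quick way to check which branch and revision a PETSc clone is on (a sketch; it assumes the source tree was obtained as a git clone, as described on the wiki page linked earlier in the thread):

  cd $PETSC_DIR
  git branch          # the current branch is marked with '*'
  git checkout next   # or 'master', whichever is being tested
  git pull
  git describe        # prints a revision string of the same form as the one in the PETSc error banner

After switching branches the library has to be rebuilt (and usually reconfigured) before rerunning the example.
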
>> >> Thanks, >> >> Matt >> >> >>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >>> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>> >>> And I get this output >>> >>> Local function: >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> 1 >>> 1 >>> 2 >>> 1 >>> 2 >>> 2 >>> 3 >>> Initial guess >>> Vec Object: 1 MPI processes >>> type: seq >>> L_2 Error: 0.625 >>> Residual: >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> Initial Residual >>> Vec Object: 1 MPI processes >>> type: seq >>> L_2 Residual: 0 >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>> probably memory access out of range >>> [0]PETSC ERROR: Try option -start_in_debugger or >>> -on_error_attach_debugger >>> [0]PETSC ERROR: or see >>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>> corruption errors >>> [0]PETSC ERROR: likely location of problem given in stack below >>> [0]PETSC ERROR: --------------------- Stack Frames >>> ------------------------------------ >>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>> available, >>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>> function >>> [0]PETSC ERROR: is given. >>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>> /home/salaza11/petsc/src/snes/interface/snes.c >>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>> /home/salaza11/petsc/src/snes/interface/snes.c >>> [0]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [0]PETSC ERROR: Signal received! >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-3453-g0a94005 >>> GIT Date: 2014-03-02 13:12:04 -0600 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sun >>> Mar 2 17:00:09 2014 >>> [0]PETSC ERROR: Libraries linked from >>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>> [0]PETSC ERROR: Configure options --download-mpich >>> --download-scientificpython --download-triangle --download-ctetgen >>> --download-chaco --with-c2html=0 >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>> [unset]: aborting job: >>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>> >>> >>> Probably my problems could be on my configuration. I attach the >>> configure.log. 
I ran ./configure like this >>> >>> ./configure --download-mpich --download-scientificpython >>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>> >>> Thanks a lot in advance. >>> >>> >>> >>> >>> >>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley wrote: >>> >>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>> yelkhamra at gmail.com> wrote: >>>> >>>>> >>>>> If >>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>> -show_initial -dm_plex_print_fem >>>>> >>>>> is for serial, any chance we can get the options to run in parallel? >>>>> >>>> >>>> Just use mpiexec -n >>>> >>>> Matt >>>> >>>> >>>>> >>>>> Regards >>>>> Yaakoub El Khamra >>>>> >>>>> >>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley wrote: >>>>> >>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>> MAJones2 at mdanderson.org> wrote: >>>>>> >>>>>>> >>>>>>> ------------------------------ >>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>> *To:* Jones,Martin Alexander >>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>> >>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>> >>>>>>>> These examples all seem to run excepting the following command, >>>>>>>> >>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>> >>>>>>>> I get the following ouput: >>>>>>>> >>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>> Local function: >>>>>>>> ./ex12: symbol lookup error: >>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>> symbol: omp_get_num_procs >>>>>>>> >>>>>>> >>>>>>> This is a build problem, but it should affect all the runs. Is >>>>>>> this reproducible? Can you send configure.log? MKL is the worst. If this >>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>> >>>>>> >>>>>> Thanks. I have some advice on options >>>>>> >>>>>> --with-precision=single # I would not use this unless you are doing >>>>>> something special, like CUDA >>>>>> --with-clanguage=C++ # I would recommend switching to C, the build >>>>>> is much faster >>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 --with-fc=0 >>>>>> --with-etags=1 # This is unnecessary >>>>>> >>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>> # Here is the problem, see below >>>>>> --download-metis >>>>>> --download-fiat=yes --download-generator >>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>> >>>>>> Your MKL needs another library for the OpenMP symbols. I would >>>>>> recommend switching to --download-f2cblaslapack, >>>>>> or you can try and find that library. 
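A sketch of what that configure line could look like with the MKL entries dropped in favor of a PETSc-built BLAS/LAPACK; the remaining options are taken from the configure shown in the error output above, and the prefix and arch name are that user's and would need adjusting:

  ./configure --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 \
              --download-f2cblaslapack --download-triangle \
              PETSC_ARCH=arch-linux2-cxx-debug
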
>>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> ------------------------------ >>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>> *To:* Jones,Martin Alexander >>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>> >>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>> >>>>>>>>> Hi, This is the next error message after configuring and >>>>>>>>> building with the triangle package when trying to run ex12 >>>>>>>>> >>>>>>>> >>>>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>>>> >>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>> -dm_plex_print_fem 1 >>>>>>>> >>>>>>>> for a representative run. Then you could try 3D >>>>>>>> >>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>> >>>>>>>> or a full run >>>>>>>> >>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>> -petscspace_order 1 >>>>>>>> >>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>> -petscspace_order 2 >>>>>>>> >>>>>>>> Let me know if those work. >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> ./ex12 >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>>> Exception,probably divide by zero >>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>> -on_error_attach_debugger >>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>> memory corruption errors >>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>> ------------------------------------ >>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>>> available, >>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>>> function >>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>> ------------------------------------ >>>>>>>>> [0]PETSC ERROR: Signal received! 
>>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>>>> mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>> --download-triangle >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>> ------------------------------ >>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>> >>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>> >>>>>>>>>> Hi, I have downloaded and built the dev version you suggested. >>>>>>>>>> I think I need the triangle package to run this particular case. Is there >>>>>>>>>> any thing else that appears wrong in what I have done from the error >>>>>>>>>> messages below: >>>>>>>>>> >>>>>>>>> >>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>> >>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>>> --download-triangle >>>>>>>>> >>>>>>>>> and then rebuild >>>>>>>>> >>>>>>>>> make >>>>>>>>> >>>>>>>>> and then rerun. You can load meshes, but its much easier to have >>>>>>>>> triangle create them. >>>>>>>>> >>>>>>>>> Thanks for being patient, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>> ------------------------------------ >>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>> type! >>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package support. >>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>> shooting. >>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>>>>> mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>>> --with-clanguage=c++ --with-c2html=0 >>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>>>>> ------------------------------ >>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>> >>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander < >>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>> >>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. when I >>>>>>>>>>> type make ex12 I get this: >>>>>>>>>>> >>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>> make ex12 >>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings -Wno-strict-aliasing >>>>>>>>>>> -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or directory >>>>>>>>>>> compilation terminated. >>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>> >>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Yes, this relates to my 3). This is not going to work for you >>>>>>>>>> with the release. Please see the link I sent. >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> ------------------------------ >>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>> >>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin Alexander < >>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>> >>>>>>>>>>>> Thanks! 
>>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> ------------------------------ >>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>> >>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin Alexander < >>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Now I went to the directory where ex12.c sits and just did a >>>>>>>>>>>>> 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>> >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>> such file or directory >>>>>>>>>>>>> >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>> such file or directory >>>>>>>>>>>>> >>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>> Stop. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>> >>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) or >>>>>>>>>>>> PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>>>>> do not match what you built. Please send configure.log and >>>>>>>>>>>> make.log >>>>>>>>>>>> >>>>>>>>>>>> 3) Since it was only recently added, if you want to use the >>>>>>>>>>>> FEM functionality, you must use the development version: >>>>>>>>>>>> >>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order >>>>>>>>>>>>> dim 1 laplacian dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>> File "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line >>>>>>>>>>>>> 15, in >>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> I have removed the requirement of generating the header file >>>>>>>>>>>>> (its 
now all handled in C). I thought >>>>>>>>>>>>> >>>>>>>>>>>>> I changed the documentation everywhere (including the latest >>>>>>>>>>>>> tutorial slides). Can you try running >>>>>>>>>>>>> >>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old docs? >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -- >>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. >>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> *Miguel Angel Salazar de Troya* >>> Graduate Research Assistant >>> Department of Mechanical Science and Engineering >>> University of Illinois at Urbana-Champaign >>> (217) 550-2360 >>> salaza11 at illinois.edu >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. 
>> -- Norbert Wiener >> > > > > -- > *Miguel Angel Salazar de Troya* > Graduate Research Assistant > Department of Mechanical Science and Engineering > University of Illinois at Urbana-Champaign > (217) 550-2360 > salaza11 at illinois.edu > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From epscodes at gmail.com Mon Mar 3 13:06:04 2014 From: epscodes at gmail.com (Xiangdong) Date: Mon, 3 Mar 2014 14:06:04 -0500 Subject: [petsc-users] DMDA questions In-Reply-To: <877g8bnq20.fsf@jedbrown.org> References: <87zjlfw4al.fsf@jedbrown.org> <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> <877g8bnq20.fsf@jedbrown.org> Message-ID: On Mon, Mar 3, 2014 at 12:35 PM, Jed Brown wrote: > Xiangdong writes: > > > I have a question about using KSPSetComputeOperator with user provided > > function ComputeMatrix(). > > > > In order to use multigrid, this function ComputeMatrix() should be able > to > > generate the matrix for any grid level. However, the matrix I am trying > to > > generate depending on a vector u (the solution at previous time step). It > > is okay for ComputeMatrix at finest level, because I have the vector u at > > the finest level. However, if the ComputeMatrix tries to compute the > matrix > > at a coarse level, I do not have the vector u at the coarse level. Then I > > see some error messages about nonconfirming sizes. Should I manually > > restrict the vector u at a coarse level or petsc has some functions on > this? > > > > In other words, there is a vector u in ctx struct passing to the > > ComputeMatrix at finest level, how can I interpolate/restrict it to the > > correct grid level for multigrid applications? > > I encourage you to use TS because it can manage this for you, but if you > want to manage it yourself, use DMCoarsenHookAdd to set callbacks that > are called as the coarse grids are constructed and set up and PCMG. Use > DMGetNamedGlobalVector to store your coarsened state and access it from > your compute matrix function. See the TS implementations if you want an > example to follow. > Could you please expand it a little more on using DMCoarsenHookAdd to restrict a fine vector on a coarse grid? The only example I can find is ex48 in snes. It is not clear how the coarsen vector are generated from that example. thank you. Xiangdong -------------- next part -------------- An HTML attachment was scrubbed... URL: From salazardetroya at gmail.com Mon Mar 3 13:39:11 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Mon, 3 Mar 2014 13:39:11 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: This is what I get at gdb when I type 'where'. 
#0 0x000000310e0aa860 in __nanosleep_nocancel () from /lib64/libc.so.6 #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 #2 0x00007fd83a00a8be in PetscSleep (s=10) at /home/salaza11/petsc/src/sys/utils/psleep.c:52 #3 0x00007fd83a06f331 in PetscAttachDebugger () at /home/salaza11/petsc/src/sys/error/adebug.c:397 #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () at /home/salaza11/petsc/src/sys/objects/init.c:444 #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, args=0x7fff5cd8df20, file=0x0, help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial finite elements.\nWe solve the Poisson problem in a rectangular\ndomain, using a parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") at /home/salaza11/petsc/src/sys/objects/pinit.c:876 #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 The rest of the gdb output is attached. I am a bit ignorant with gdb, I apologize for that. On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley wrote: > On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < > salazardetroya at gmail.com> wrote: > >> Thanks for your response. Sorry I did not have the "next" version, but >> the "master" version. I still have an error though. I followed the steps >> given here (https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the >> next version, I configured petsc as above and ran ex12 as above as well, >> getting this error: >> >> [salaza11 at maya tutorials]$ ./ex12 -run_type test -refinement_limit 0.0 >> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >> -dm_plex_print_fem 1 >> Local function: >> Vec Object: 1 MPI processes >> type: seq >> 0 >> 0.25 >> 1 >> 0.25 >> 0.5 >> 1.25 >> 1 >> 1.25 >> 2 >> Initial guess >> Vec Object: 1 MPI processes >> type: seq >> 0.5 >> L_2 Error: 0.111111 >> Residual: >> Vec Object: 1 MPI processes >> type: seq >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> Initial Residual >> Vec Object: 1 MPI processes >> type: seq >> 0 >> L_2 Residual: 0 >> > > Okay, now run with -start_in_debugger, and give me a stack trace using > 'where'. > > Thanks, > > Matt > > >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >> probably memory access out of range >> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger >> [0]PETSC ERROR: or see >> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >> corruption errors >> [0]PETSC ERROR: likely location of problem given in stack below >> [0]PETSC ERROR: --------------------- Stack Frames >> ------------------------------------ >> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >> available, >> [0]PETSC ERROR: INSTEAD the line number of the start of the function >> [0]PETSC ERROR: is given. 
>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >> /home/salaza11/petsc/src/snes/interface/snes.c >> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >> /home/salaza11/petsc/src/snes/interface/snes.c >> [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> [0]PETSC ERROR: Signal received >> [0]PETSC ERROR: See http:// >> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. >> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4705-gfb6b3bc >> GIT Date: 2014-03-03 08:23:43 -0600 >> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Mon >> Mar 3 11:49:15 2014 >> [0]PETSC ERROR: Configure options --download-mpich >> --download-scientificpython --download-triangle --download-ctetgen >> --download-chaco --with-c2html=0 >> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file >> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >> [unset]: aborting job: >> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >> >> >> >> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley wrote: >> >>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>> salazardetroya at gmail.com> wrote: >>> >>>> Hi everybody >>>> >>>> I am trying to run example ex12.c without much success. I specifically >>>> run it with the command options: >>>> >>> >>> We need to start narrowing down differences, because it runs for me and >>> our nightly tests. So, first can >>> you confirm that you are using the latest 'next' branch? >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >>>> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>> >>>> And I get this output >>>> >>>> Local function: >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> 1 >>>> 1 >>>> 2 >>>> 1 >>>> 2 >>>> 2 >>>> 3 >>>> Initial guess >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> L_2 Error: 0.625 >>>> Residual: >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> Initial Residual >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> L_2 Residual: 0 >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>>> probably memory access out of range >>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>> -on_error_attach_debugger >>>> [0]PETSC ERROR: or see >>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>> corruption errors >>>> [0]PETSC ERROR: likely location of problem given in stack below >>>> [0]PETSC ERROR: --------------------- Stack Frames >>>> ------------------------------------ >>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>> available, >>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>> function >>>> [0]PETSC ERROR: is given. 
>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>> [0]PETSC ERROR: --------------------- Error Message >>>> ------------------------------------ >>>> [0]PETSC ERROR: Signal received! >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-3453-g0a94005 >>>> GIT Date: 2014-03-02 13:12:04 -0600 >>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>> Sun Mar 2 17:00:09 2014 >>>> [0]PETSC ERROR: Libraries linked from >>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>> [0]PETSC ERROR: Configure options --download-mpich >>>> --download-scientificpython --download-triangle --download-ctetgen >>>> --download-chaco --with-c2html=0 >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>> [unset]: aborting job: >>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>> >>>> >>>> Probably my problems could be on my configuration. I attach the >>>> configure.log. I ran ./configure like this >>>> >>>> ./configure --download-mpich --download-scientificpython >>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>> >>>> Thanks a lot in advance. >>>> >>>> >>>> >>>> >>>> >>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley wrote: >>>> >>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>> yelkhamra at gmail.com> wrote: >>>>> >>>>>> >>>>>> If >>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>> -show_initial -dm_plex_print_fem >>>>>> >>>>>> is for serial, any chance we can get the options to run in parallel? 
>>>>>> >>>>> >>>>> Just use mpiexec -n >>>>> >>>>> Matt >>>>> >>>>> >>>>>> >>>>>> Regards >>>>>> Yaakoub El Khamra >>>>>> >>>>>> >>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley wrote: >>>>>> >>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>> >>>>>>>> >>>>>>>> ------------------------------ >>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>> *To:* Jones,Martin Alexander >>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>> >>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>> >>>>>>>>> These examples all seem to run excepting the following command, >>>>>>>>> >>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>> >>>>>>>>> I get the following ouput: >>>>>>>>> >>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>> Local function: >>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>> symbol: omp_get_num_procs >>>>>>>>> >>>>>>>> >>>>>>>> This is a build problem, but it should affect all the runs. Is >>>>>>>> this reproducible? Can you send configure.log? MKL is the worst. If this >>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>> >>>>>>> >>>>>>> Thanks. I have some advice on options >>>>>>> >>>>>>> --with-precision=single # I would not use this unless you are >>>>>>> doing something special, like CUDA >>>>>>> --with-clanguage=C++ # I would recommend switching to C, the >>>>>>> build is much faster >>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 --with-fc=0 >>>>>>> --with-etags=1 # This is unnecessary >>>>>>> >>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>> # Here is the problem, see below >>>>>>> --download-metis >>>>>>> --download-fiat=yes --download-generator >>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>> >>>>>>> Your MKL needs another library for the OpenMP symbols. I would >>>>>>> recommend switching to --download-f2cblaslapack, >>>>>>> or you can try and find that library. >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> ------------------------------ >>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>> >>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>> >>>>>>>>>> Hi, This is the next error message after configuring and >>>>>>>>>> building with the triangle package when trying to run ex12 >>>>>>>>>> >>>>>>>>> >>>>>>>>> This is my fault for bad defaults. I will fix. 
Try running >>>>>>>>> >>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>> >>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>> >>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>> >>>>>>>>> or a full run >>>>>>>>> >>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>>> -petscspace_order 1 >>>>>>>>> >>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>>> -petscspace_order 2 >>>>>>>>> >>>>>>>>> Let me know if those work. >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> ./ex12 >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>>>> Exception,probably divide by zero >>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>> -on_error_attach_debugger >>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>> memory corruption errors >>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>> ------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>>>> available, >>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>>>> function >>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>> ------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>> shooting. >>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>>>>> mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>> --download-triangle >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>> ------------------------------ >>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>> >>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>> >>>>>>>>>>> Hi, I have downloaded and built the dev version you suggested. >>>>>>>>>>> I think I need the triangle package to run this particular case. Is there >>>>>>>>>>> any thing else that appears wrong in what I have done from the error >>>>>>>>>>> messages below: >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>> >>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>>>> --download-triangle >>>>>>>>>> >>>>>>>>>> and then rebuild >>>>>>>>>> >>>>>>>>>> make >>>>>>>>>> >>>>>>>>>> and then rerun. You can load meshes, but its much easier to >>>>>>>>>> have triangle create them. >>>>>>>>>> >>>>>>>>>> Thanks for being patient, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>> ------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>>> type! >>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package support. >>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>> shooting. >>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>>>>>> mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>>>> --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>>>>>> ------------------------------ >>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>> >>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander < >>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>> >>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. when I >>>>>>>>>>>> type make ex12 I get this: >>>>>>>>>>>> >>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>> make ex12 >>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings -Wno-strict-aliasing >>>>>>>>>>>> -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or directory >>>>>>>>>>>> compilation terminated. >>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>> >>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Yes, this relates to my 3). This is not going to work for you >>>>>>>>>>> with the release. Please see the link I sent. >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> ------------------------------ >>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>> >>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin Alexander < >>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Thanks! 
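The numbered advice quoted below traces the make failures above to the PETSC_DIR and PETSC_ARCH environment variables not matching the configured build. A minimal sketch of checking and resetting them before retrying, using the paths that appear in this thread and assuming a bash shell (the missing ex12.h is a separate issue, addressed below by moving to the development version):

    echo $PETSC_DIR $PETSC_ARCH        # should name the tree and arch actually configured
    export PETSC_DIR=/home/mjonesa/PETSc/petsc-3.4.3
    export PETSC_ARCH=arch-linux2-cxx-debug   # the build used this, not linux-gnu-cxx-debug
    cd $PETSC_DIR/src/snes/examples/tutorials
    make ex12                          # 'make ex12', not 'make ex12.c'
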
>>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>> >>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin Alexander < >>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and just did >>>>>>>>>>>>>> a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>> >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>> >>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>> >>>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) or >>>>>>>>>>>>> PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>> do not match what you built. Please send configure.log and >>>>>>>>>>>>> make.log >>>>>>>>>>>>> >>>>>>>>>>>>> 3) Since it was only recently added, if you want to use the >>>>>>>>>>>>> FEM functionality, you must use the development version: >>>>>>>>>>>>> >>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order >>>>>>>>>>>>>> dim 1 laplacian dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>> File "bin/pythonscripts/PetscGenerateFEMQuadrature.py", >>>>>>>>>>>>>> line 15, in >>>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>> >>>>>>>>>>>>>> 
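The reply that follows asks for a run with 'master' (or 'next'). A minimal sketch of switching an existing petsc git clone to one of those branches and rebuilding, assuming the clone was obtained as described on the developers page linked above; the reconfigure script path is the one quoted elsewhere in this thread:

    cd $PETSC_DIR                      # the git clone of petsc
    git pull
    git checkout master                # or: git checkout next
    $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py
    make
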
>>>>>>>>>>>>>> I have removed the requirement of generating the header file >>>>>>>>>>>>>> (its now all handled in C). I thought >>>>>>>>>>>>>> >>>>>>>>>>>>>> I changed the documentation everywhere (including the latest >>>>>>>>>>>>>> tutorial slides). Can you try running >>>>>>>>>>>>>> >>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old docs? >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -- >>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -- >>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. >>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>>> >>>> -- >>>> *Miguel Angel Salazar de Troya* >>>> Graduate Research Assistant >>>> Department of Mechanical Science and Engineering >>>> University of Illinois at Urbana-Champaign >>>> (217) 550-2360 >>>> salaza11 at illinois.edu >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. 
>>> -- Norbert Wiener >>> >> >> >> >> -- >> *Miguel Angel Salazar de Troya* >> Graduate Research Assistant >> Department of Mechanical Science and Engineering >> University of Illinois at Urbana-Champaign >> (217) 550-2360 >> salaza11 at illinois.edu >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: gdb_output.log Type: text/x-log Size: 5303 bytes Desc: not available URL: From knepley at gmail.com Mon Mar 3 13:40:38 2014 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 3 Mar 2014 13:40:38 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > This is what I get at gdb when I type 'where'. > You have to type 'cont', and then when it fails you type 'where'. Matt > #0 0x000000310e0aa860 in __nanosleep_nocancel () from /lib64/libc.so.6 > #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 > #2 0x00007fd83a00a8be in PetscSleep (s=10) > at /home/salaza11/petsc/src/sys/utils/psleep.c:52 > #3 0x00007fd83a06f331 in PetscAttachDebugger () > at /home/salaza11/petsc/src/sys/error/adebug.c:397 > #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () > at /home/salaza11/petsc/src/sys/objects/init.c:444 > #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, > args=0x7fff5cd8df20, file=0x0, > help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial finite > elements.\nWe solve the Poisson problem in a rectangular\ndomain, using a > parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") > at /home/salaza11/petsc/src/sys/objects/pinit.c:876 > #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) > at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 > > The rest of the gdb output is attached. I am a bit ignorant with gdb, I > apologize for that. > > > > On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley wrote: > >> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> Thanks for your response. Sorry I did not have the "next" version, but >>> the "master" version. I still have an error though. 
I followed the steps >>> given here (https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the >>> next version, I configured petsc as above and ran ex12 as above as well, >>> getting this error: >>> >>> [salaza11 at maya tutorials]$ ./ex12 -run_type test -refinement_limit 0.0 >>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>> -dm_plex_print_fem 1 >>> Local function: >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> 0.25 >>> 1 >>> 0.25 >>> 0.5 >>> 1.25 >>> 1 >>> 1.25 >>> 2 >>> Initial guess >>> Vec Object: 1 MPI processes >>> type: seq >>> 0.5 >>> L_2 Error: 0.111111 >>> Residual: >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> Initial Residual >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> L_2 Residual: 0 >>> >> >> Okay, now run with -start_in_debugger, and give me a stack trace using >> 'where'. >> >> Thanks, >> >> Matt >> >> >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>> probably memory access out of range >>> [0]PETSC ERROR: Try option -start_in_debugger or >>> -on_error_attach_debugger >>> [0]PETSC ERROR: or see >>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>> corruption errors >>> [0]PETSC ERROR: likely location of problem given in stack below >>> [0]PETSC ERROR: --------------------- Stack Frames >>> ------------------------------------ >>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>> available, >>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>> function >>> [0]PETSC ERROR: is given. >>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>> /home/salaza11/petsc/src/snes/interface/snes.c >>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>> /home/salaza11/petsc/src/snes/interface/snes.c >>> [0]PETSC ERROR: --------------------- Error Message >>> -------------------------------------------------------------- >>> [0]PETSC ERROR: Signal received >>> [0]PETSC ERROR: See http:// >>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>> shooting. >>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4705-gfb6b3bc >>> GIT Date: 2014-03-03 08:23:43 -0600 >>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Mon >>> Mar 3 11:49:15 2014 >>> [0]PETSC ERROR: Configure options --download-mpich >>> --download-scientificpython --download-triangle --download-ctetgen >>> --download-chaco --with-c2html=0 >>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file >>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>> [unset]: aborting job: >>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>> >>> >>> >>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley wrote: >>> >>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>>> salazardetroya at gmail.com> wrote: >>>> >>>>> Hi everybody >>>>> >>>>> I am trying to run example ex12.c without much success. 
I specifically >>>>> run it with the command options: >>>>> >>>> >>>> We need to start narrowing down differences, because it runs for me and >>>> our nightly tests. So, first can >>>> you confirm that you are using the latest 'next' branch? >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >>>>> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>> >>>>> And I get this output >>>>> >>>>> Local function: >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> 1 >>>>> 1 >>>>> 2 >>>>> 1 >>>>> 2 >>>>> 2 >>>>> 3 >>>>> Initial guess >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> L_2 Error: 0.625 >>>>> Residual: >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> Initial Residual >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> L_2 Residual: 0 >>>>> [0]PETSC ERROR: >>>>> ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>>>> probably memory access out of range >>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>> -on_error_attach_debugger >>>>> [0]PETSC ERROR: or see >>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>> corruption errors >>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>> ------------------------------------ >>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>> available, >>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>> function >>>>> [0]PETSC ERROR: is given. >>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>> [0]PETSC ERROR: --------------------- Error Message >>>>> ------------------------------------ >>>>> [0]PETSC ERROR: Signal received! >>>>> [0]PETSC ERROR: >>>>> ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-3453-g0a94005 >>>>> GIT Date: 2014-03-02 13:12:04 -0600 >>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>> [0]PETSC ERROR: >>>>> ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>> Sun Mar 2 17:00:09 2014 >>>>> [0]PETSC ERROR: Libraries linked from >>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>> --download-chaco --with-c2html=0 >>>>> [0]PETSC ERROR: >>>>> ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>> [unset]: aborting job: >>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>> >>>>> >>>>> Probably my problems could be on my configuration. I attach the >>>>> configure.log. I ran ./configure like this >>>>> >>>>> ./configure --download-mpich --download-scientificpython >>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>> >>>>> Thanks a lot in advance. >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley wrote: >>>>> >>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>> yelkhamra at gmail.com> wrote: >>>>>> >>>>>>> >>>>>>> If >>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>> -show_initial -dm_plex_print_fem >>>>>>> >>>>>>> is for serial, any chance we can get the options to run in parallel? >>>>>>> >>>>>> >>>>>> Just use mpiexec -n >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> >>>>>>> Regards >>>>>>> Yaakoub El Khamra >>>>>>> >>>>>>> >>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley >>>>>> > wrote: >>>>>>> >>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> ------------------------------ >>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>> >>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>> >>>>>>>>>> These examples all seem to run excepting the following command, >>>>>>>>>> >>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>> >>>>>>>>>> I get the following ouput: >>>>>>>>>> >>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>> Local function: >>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>> >>>>>>>>> >>>>>>>>> This is a build problem, but it should affect all the runs. Is >>>>>>>>> this reproducible? Can you send configure.log? MKL is the worst. If this >>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>> >>>>>>>> >>>>>>>> Thanks. 
I have some advice on options >>>>>>>> >>>>>>>> --with-precision=single # I would not use this unless you are >>>>>>>> doing something special, like CUDA >>>>>>>> --with-clanguage=C++ # I would recommend switching to C, the >>>>>>>> build is much faster >>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 --with-fc=0 >>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>> >>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>> # Here is the problem, see below >>>>>>>> --download-metis >>>>>>>> --download-fiat=yes --download-generator >>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>> >>>>>>>> Your MKL needs another library for the OpenMP symbols. I would >>>>>>>> recommend switching to --download-f2cblaslapack, >>>>>>>> or you can try and find that library. >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> ------------------------------ >>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>> >>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>> >>>>>>>>>>> Hi, This is the next error message after configuring and >>>>>>>>>>> building with the triangle package when trying to run ex12 >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>>>>>> >>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>> >>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>> >>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>> >>>>>>>>>> or a full run >>>>>>>>>> >>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>>>> -petscspace_order 1 >>>>>>>>>> >>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>>>> -petscspace_order 2 >>>>>>>>>> >>>>>>>>>> Let me know if those work. 
>>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> ./ex12 >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>>>>> Exception,probably divide by zero >>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>> memory corruption errors >>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>> ------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>> not available, >>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>> the function >>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>> ------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>> shooting. >>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda by >>>>>>>>>>> mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>> --download-triangle >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>> ------------------------------ >>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>> >>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>> >>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. >>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>> error messages below: >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>> >>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>>>>> --download-triangle >>>>>>>>>>> >>>>>>>>>>> and then rebuild >>>>>>>>>>> >>>>>>>>>>> make >>>>>>>>>>> >>>>>>>>>>> and then rerun. You can load meshes, but its much easier to >>>>>>>>>>> have triangle create them. >>>>>>>>>>> >>>>>>>>>>> Thanks for being patient, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>>>> type! >>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package support. >>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>> shooting. >>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda >>>>>>>>>>>> by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>>>>> --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>>>>>>> ------------------------------ >>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>> >>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander < >>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. when I >>>>>>>>>>>>> type make ex12 I get this: >>>>>>>>>>>>> >>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>> make ex12 >>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings -Wno-strict-aliasing >>>>>>>>>>>>> -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or directory >>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>> >>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Yes, this relates to my 3). This is not going to work for you >>>>>>>>>>>> with the release. Please see the link I sent. >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>> >>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin Alexander < >>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks! 
>>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin Alexander < >>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and just did >>>>>>>>>>>>>>> a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>> >>>>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) >>>>>>>>>>>>>> or PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>> do not match what you built. 
Please send configure.log >>>>>>>>>>>>>> and make.log >>>>>>>>>>>>>> >>>>>>>>>>>>>> 3) Since it was only recently added, if you want to use the >>>>>>>>>>>>>> FEM functionality, you must use the development version: >>>>>>>>>>>>>> >>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order >>>>>>>>>>>>>>> dim 1 laplacian dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>> File "bin/pythonscripts/PetscGenerateFEMQuadrature.py", >>>>>>>>>>>>>>> line 15, in >>>>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I have removed the requirement of generating the header file >>>>>>>>>>>>>>> (its now all handled in C). I thought >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I changed the documentation everywhere (including the latest >>>>>>>>>>>>>>> tutorial slides). Can you try running >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old docs? >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -- >>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -- >>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. 
>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> *Miguel Angel Salazar de Troya* >>>>> Graduate Research Assistant >>>>> Department of Mechanical Science and Engineering >>>>> University of Illinois at Urbana-Champaign >>>>> (217) 550-2360 >>>>> salaza11 at illinois.edu >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> *Miguel Angel Salazar de Troya* >>> Graduate Research Assistant >>> Department of Mechanical Science and Engineering >>> University of Illinois at Urbana-Champaign >>> (217) 550-2360 >>> salaza11 at illinois.edu >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > *Miguel Angel Salazar de Troya* > Graduate Research Assistant > Department of Mechanical Science and Engineering > University of Illinois at Urbana-Champaign > (217) 550-2360 > salaza11 at illinois.edu > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From salazardetroya at gmail.com Mon Mar 3 13:44:32 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Mon, 3 Mar 2014 13:44:32 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: Thanks. This is what I get. (gdb) cont Continuing. Program received signal SIGSEGV, Segmentation fault. 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, X=0x168b5b0, Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, user=0x7fd6811be509) at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); (gdb) where #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, X=0x168b5b0, Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, user=0x7fd6811be509) at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal (snes=0x14e9450, X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88) at /home/salaza11/petsc/src/snes/interface/snes.c:2245 #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley wrote: > On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < > salazardetroya at gmail.com> wrote: > >> This is what I get at gdb when I type 'where'. >> > > You have to type 'cont', and then when it fails you type 'where'. > > Matt > > >> #0 0x000000310e0aa860 in __nanosleep_nocancel () from /lib64/libc.so.6 >> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >> #2 0x00007fd83a00a8be in PetscSleep (s=10) >> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >> #3 0x00007fd83a06f331 in PetscAttachDebugger () >> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >> at /home/salaza11/petsc/src/sys/objects/init.c:444 >> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >> args=0x7fff5cd8df20, file=0x0, >> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial finite >> elements.\nWe solve the Poisson problem in a rectangular\ndomain, using a >> parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >> >> The rest of the gdb output is attached. I am a bit ignorant with gdb, I >> apologize for that. >> >> >> >> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley wrote: >> >>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >>> salazardetroya at gmail.com> wrote: >>> >>>> Thanks for your response. Sorry I did not have the "next" version, but >>>> the "master" version. I still have an error though. 
I followed the steps >>>> given here (https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the >>>> next version, I configured petsc as above and ran ex12 as above as well, >>>> getting this error: >>>> >>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test -refinement_limit 0.0 >>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>> -dm_plex_print_fem 1 >>>> Local function: >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> 0.25 >>>> 1 >>>> 0.25 >>>> 0.5 >>>> 1.25 >>>> 1 >>>> 1.25 >>>> 2 >>>> Initial guess >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0.5 >>>> L_2 Error: 0.111111 >>>> Residual: >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> Initial Residual >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> L_2 Residual: 0 >>>> >>> >>> Okay, now run with -start_in_debugger, and give me a stack trace using >>> 'where'. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>>> probably memory access out of range >>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>> -on_error_attach_debugger >>>> [0]PETSC ERROR: or see >>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>> corruption errors >>>> [0]PETSC ERROR: likely location of problem given in stack below >>>> [0]PETSC ERROR: --------------------- Stack Frames >>>> ------------------------------------ >>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>> available, >>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>> function >>>> [0]PETSC ERROR: is given. >>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>> [0]PETSC ERROR: --------------------- Error Message >>>> -------------------------------------------------------------- >>>> [0]PETSC ERROR: Signal received >>>> [0]PETSC ERROR: See http:// >>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>> shooting. >>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4705-gfb6b3bc >>>> GIT Date: 2014-03-03 08:23:43 -0600 >>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>> Mon Mar 3 11:49:15 2014 >>>> [0]PETSC ERROR: Configure options --download-mpich >>>> --download-scientificpython --download-triangle --download-ctetgen >>>> --download-chaco --with-c2html=0 >>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file >>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>> [unset]: aborting job: >>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>> >>>> >>>> >>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley wrote: >>>> >>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>>>> salazardetroya at gmail.com> wrote: >>>>> >>>>>> Hi everybody >>>>>> >>>>>> I am trying to run example ex12.c without much success. 
I >>>>>> specifically run it with the command options: >>>>>> >>>>> >>>>> We need to start narrowing down differences, because it runs for me >>>>> and our nightly tests. So, first can >>>>> you confirm that you are using the latest 'next' branch? >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >>>>>> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>> >>>>>> And I get this output >>>>>> >>>>>> Local function: >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> 1 >>>>>> 1 >>>>>> 2 >>>>>> 1 >>>>>> 2 >>>>>> 2 >>>>>> 3 >>>>>> Initial guess >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> L_2 Error: 0.625 >>>>>> Residual: >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> Initial Residual >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> L_2 Residual: 0 >>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>>>>> probably memory access out of range >>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>> -on_error_attach_debugger >>>>>> [0]PETSC ERROR: or see >>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>>> corruption errors >>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>> ------------------------------------ >>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>> available, >>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>> function >>>>>> [0]PETSC ERROR: is given. >>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>> ------------------------------------ >>>>>> [0]PETSC ERROR: Signal received! >>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-3453-g0a94005 >>>>>> GIT Date: 2014-03-02 13:12:04 -0600 >>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>>> Sun Mar 2 17:00:09 2014 >>>>>> [0]PETSC ERROR: Libraries linked from >>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>> --download-chaco --with-c2html=0 >>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>> [unset]: aborting job: >>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>> >>>>>> >>>>>> Probably my problems could be on my configuration. I attach the >>>>>> configure.log. I ran ./configure like this >>>>>> >>>>>> ./configure --download-mpich --download-scientificpython >>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>> >>>>>> Thanks a lot in advance. >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley wrote: >>>>>> >>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>> yelkhamra at gmail.com> wrote: >>>>>>> >>>>>>>> >>>>>>>> If >>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>> >>>>>>>> is for serial, any chance we can get the options to run in parallel? >>>>>>>> >>>>>>> >>>>>>> Just use mpiexec -n >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> >>>>>>>> Regards >>>>>>>> Yaakoub El Khamra >>>>>>>> >>>>>>>> >>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>> knepley at gmail.com> wrote: >>>>>>>> >>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> ------------------------------ >>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>> >>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>> >>>>>>>>>>> These examples all seem to run excepting the following command, >>>>>>>>>>> >>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>> >>>>>>>>>>> I get the following ouput: >>>>>>>>>>> >>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>> Local function: >>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> This is a build problem, but it should affect all the runs. Is >>>>>>>>>> this reproducible? Can you send configure.log? MKL is the worst. If this >>>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>>> >>>>>>>>> >>>>>>>>> Thanks. 
I have some advice on options >>>>>>>>> >>>>>>>>> --with-precision=single # I would not use this unless you are >>>>>>>>> doing something special, like CUDA >>>>>>>>> --with-clanguage=C++ # I would recommend switching to C, the >>>>>>>>> build is much faster >>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 --with-fc=0 >>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>> >>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>> # Here is the problem, see below >>>>>>>>> --download-metis >>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>> >>>>>>>>> Your MKL needs another library for the OpenMP symbols. I would >>>>>>>>> recommend switching to --download-f2cblaslapack, >>>>>>>>> or you can try and find that library. >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> ------------------------------ >>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>> >>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>> >>>>>>>>>>>> Hi, This is the next error message after configuring and >>>>>>>>>>>> building with the triangle package when trying to run ex12 >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>>>>>>> >>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>> >>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>> >>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>> >>>>>>>>>>> or a full run >>>>>>>>>>> >>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>>>>> -petscspace_order 1 >>>>>>>>>>> >>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>>>>> -petscspace_order 2 >>>>>>>>>>> >>>>>>>>>>> Let me know if those work. 
>>>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> ./ex12 >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>>>>>> Exception,probably divide by zero >>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>> memory corruption errors >>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>>> not available, >>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>>> the function >>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>> shooting. >>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda >>>>>>>>>>>> by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>> --download-triangle >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>> ------------------------------ >>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>> >>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. >>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>> error messages below: >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>>> >>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>>>>>> --download-triangle >>>>>>>>>>>> >>>>>>>>>>>> and then rebuild >>>>>>>>>>>> >>>>>>>>>>>> make >>>>>>>>>>>> >>>>>>>>>>>> and then rerun. You can load meshes, but its much easier to >>>>>>>>>>>> have triangle create them. >>>>>>>>>>>> >>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>>>>> type! >>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package support. >>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>> shooting. >>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda >>>>>>>>>>>>> by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>>>>>> --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>> >>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander < >>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. when I >>>>>>>>>>>>>> type make ex12 I get this: >>>>>>>>>>>>>> >>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings -Wno-strict-aliasing >>>>>>>>>>>>>> -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or directory >>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>> >>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Yes, this relates to my 3). This is not going to work for >>>>>>>>>>>>> you with the release. Please see the link I sent. >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin Alexander < >>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks! 
>>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and just >>>>>>>>>>>>>>>> did a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) >>>>>>>>>>>>>>> or PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>> do not match what you built. 
Please send configure.log >>>>>>>>>>>>>>> and make.log >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> 3) Since it was only recently added, if you want to use >>>>>>>>>>>>>>> the FEM functionality, you must use the development version: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order >>>>>>>>>>>>>>>> dim 1 laplacian dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>> File "bin/pythonscripts/PetscGenerateFEMQuadrature.py", >>>>>>>>>>>>>>>> line 15, in >>>>>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I have removed the requirement of generating the header >>>>>>>>>>>>>>>> file (its now all handled in C). I thought >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I changed the documentation everywhere (including the >>>>>>>>>>>>>>>> latest tutorial slides). Can you try running >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old docs? >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -- >>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>> their experiments lead. 
>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -- >>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. >>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> *Miguel Angel Salazar de Troya* >>>>>> Graduate Research Assistant >>>>>> Department of Mechanical Science and Engineering >>>>>> University of Illinois at Urbana-Champaign >>>>>> (217) 550-2360 >>>>>> salaza11 at illinois.edu >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>>> >>>> -- >>>> *Miguel Angel Salazar de Troya* >>>> Graduate Research Assistant >>>> Department of Mechanical Science and Engineering >>>> University of Illinois at Urbana-Champaign >>>> (217) 550-2360 >>>> salaza11 at illinois.edu >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> >> >> -- >> *Miguel Angel Salazar de Troya* >> Graduate Research Assistant >> Department of Mechanical Science and Engineering >> University of Illinois at Urbana-Champaign >> (217) 550-2360 >> salaza11 at illinois.edu >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dharmareddy84 at gmail.com Mon Mar 3 15:01:49 2014 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Mon, 3 Mar 2014 15:01:49 -0600 Subject: [petsc-users] Pseudo time stepping Message-ID: Hello, I need to solve a nonlinear problem: F(x) = 0 with x = xd specified on the boundary. I have set up an SNES-based solver. I need to modify it to handle the situation described below.

The BC value xd takes a set of values xd = {x1, x2, ..., xn}, xj > xi for j > i.

Parametrising xd(t) = x1 + (xn - x1) * t such that by selecting t between 0 and 1 I get all the xd values mentioned above.

I want the solver to automatically increase or decrease t based on convergence criteria (i.e., go back to the earlier converged step and try again with a smaller step, and of course report divergence if the step size is smaller than a specified minimum step size) and solve the system of equations.

Can I set up this problem as a TS? -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Mon Mar 3 15:06:17 2014 From: jed at jedbrown.org (Jed Brown) Date: Mon, 03 Mar 2014 15:06:17 -0600 Subject: [petsc-users] Pseudo time stepping In-Reply-To: References: Message-ID: <871tyjngae.fsf@jedbrown.org> Dharmendar Reddy writes: > Hello, > I need to solve a nonlinear problem: > F(x) = 0 with x = xd specified on the boundary. I have set up an SNES-based > solver. I need to modify it to handle the situation described below. > > The BC value xd takes a set of values > xd = {x1, x2, ..., xn}, xj > xi for j > i. > > Parametrising xd(t) = x1 + (xn - x1) * t such that by selecting t between 0 > and 1 I get all the xd values mentioned above. > > I want the solver to automatically increase or decrease t based on > convergence criteria (i.e., go back to the earlier converged step and try > again with a smaller step, and of course report divergence if the step size > is smaller than a specified minimum step size) and solve the system of equations. See TSPSEUDO, which uses residual-based step control. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From knepley at gmail.com Mon Mar 3 16:13:58 2014 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 3 Mar 2014 16:13:58 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > Thanks. This is what I get. > Okay, this was broken by a new push to master/next in the last few days. I have pushed a fix; however, next is currently broken due to a failure to check in a file. This should be fixed shortly, and then ex12 will work. I will mail you when it's ready. Thanks for finding this, Matt > (gdb) cont > Continuing. > > Program received signal SIGSEGV, Segmentation fault.
> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, X=0x168b5b0, > Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, > user=0x7fd6811be509) > at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 > 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); > (gdb) where > #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, > X=0x168b5b0, > Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, > user=0x7fd6811be509) > at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 > #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal (snes=0x14e9450, > X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) > at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 > #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, > X=0x1622ad0, > A=0x7fffae6e8a88, B=0x7fffae6e8a88) > at /home/salaza11/petsc/src/snes/interface/snes.c:2245 > #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) > at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 > > > > > On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley wrote: > >> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> This is what I get at gdb when I type 'where'. >>> >> >> You have to type 'cont', and then when it fails you type 'where'. >> >> Matt >> >> >>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from /lib64/libc.so.6 >>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>> args=0x7fff5cd8df20, file=0x0, >>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial finite >>> elements.\nWe solve the Poisson problem in a rectangular\ndomain, using a >>> parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>> >>> The rest of the gdb output is attached. I am a bit ignorant with gdb, I >>> apologize for that. >>> >>> >>> >>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley wrote: >>> >>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >>>> salazardetroya at gmail.com> wrote: >>>> >>>>> Thanks for your response. Sorry I did not have the "next" version, but >>>>> the "master" version. I still have an error though. 
I followed the steps >>>>> given here (https://bitbucket.org/petsc/petsc/wiki/Home) to obtain >>>>> the next version, I configured petsc as above and ran ex12 as above as >>>>> well, getting this error: >>>>> >>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test -refinement_limit >>>>> 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>> -dm_plex_print_fem 1 >>>>> Local function: >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> 0.25 >>>>> 1 >>>>> 0.25 >>>>> 0.5 >>>>> 1.25 >>>>> 1 >>>>> 1.25 >>>>> 2 >>>>> Initial guess >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0.5 >>>>> L_2 Error: 0.111111 >>>>> Residual: >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> Initial Residual >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> L_2 Residual: 0 >>>>> >>>> >>>> Okay, now run with -start_in_debugger, and give me a stack trace using >>>> 'where'. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> [0]PETSC ERROR: >>>>> ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>>>> probably memory access out of range >>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>> -on_error_attach_debugger >>>>> [0]PETSC ERROR: or see >>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>> corruption errors >>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>> ------------------------------------ >>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>> available, >>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>> function >>>>> [0]PETSC ERROR: is given. >>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>> [0]PETSC ERROR: --------------------- Error Message >>>>> -------------------------------------------------------------- >>>>> [0]PETSC ERROR: Signal received >>>>> [0]PETSC ERROR: See http:// >>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>> shooting. 
>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4705-gfb6b3bc >>>>> GIT Date: 2014-03-03 08:23:43 -0600 >>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>> Mon Mar 3 11:49:15 2014 >>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>> --download-chaco --with-c2html=0 >>>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file >>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>> [unset]: aborting job: >>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>> >>>>> >>>>> >>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley wrote: >>>>> >>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>>>>> salazardetroya at gmail.com> wrote: >>>>>> >>>>>>> Hi everybody >>>>>>> >>>>>>> I am trying to run example ex12.c without much success. I >>>>>>> specifically run it with the command options: >>>>>>> >>>>>> >>>>>> We need to start narrowing down differences, because it runs for me >>>>>> and our nightly tests. So, first can >>>>>> you confirm that you are using the latest 'next' branch? >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >>>>>>> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>> >>>>>>> And I get this output >>>>>>> >>>>>>> Local function: >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> 1 >>>>>>> 1 >>>>>>> 2 >>>>>>> 1 >>>>>>> 2 >>>>>>> 2 >>>>>>> 3 >>>>>>> Initial guess >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> L_2 Error: 0.625 >>>>>>> Residual: >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> Initial Residual >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> L_2 Residual: 0 >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>> Violation, probably memory access out of range >>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>> -on_error_attach_debugger >>>>>>> [0]PETSC ERROR: or see >>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>>>> corruption errors >>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>> ------------------------------------ >>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>> available, >>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>> function >>>>>>> [0]PETSC ERROR: is given. >>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>> ------------------------------------ >>>>>>> [0]PETSC ERROR: Signal received! 
>>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-3453-g0a94005 >>>>>>> GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>>>> Sun Mar 2 17:00:09 2014 >>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>> --download-chaco --with-c2html=0 >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>> [unset]: aborting job: >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>> >>>>>>> >>>>>>> Probably my problems could be on my configuration. I attach the >>>>>>> configure.log. I ran ./configure like this >>>>>>> >>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>> >>>>>>> Thanks a lot in advance. >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley >>>>>> > wrote: >>>>>>> >>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> If >>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>> >>>>>>>>> is for serial, any chance we can get the options to run in >>>>>>>>> parallel? 
>>>>>>>>> >>>>>>>> >>>>>>>> Just use mpiexec -n >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>>> Regards >>>>>>>>> Yaakoub El Khamra >>>>>>>>> >>>>>>>>> >>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> ------------------------------ >>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>> >>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>> >>>>>>>>>>>> These examples all seem to run excepting the following >>>>>>>>>>>> command, >>>>>>>>>>>> >>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>> >>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>> >>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>> Local function: >>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> This is a build problem, but it should affect all the runs. Is >>>>>>>>>>> this reproducible? Can you send configure.log? MKL is the worst. If this >>>>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>> >>>>>>>>>> --with-precision=single # I would not use this unless you are >>>>>>>>>> doing something special, like CUDA >>>>>>>>>> --with-clanguage=C++ # I would recommend switching to C, the >>>>>>>>>> build is much faster >>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 --with-fc=0 >>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>> >>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>> # Here is the problem, see below >>>>>>>>>> --download-metis >>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>> >>>>>>>>>> Your MKL needs another library for the OpenMP symbols. I would >>>>>>>>>> recommend switching to --download-f2cblaslapack, >>>>>>>>>> or you can try and find that library. 
>>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> ------------------------------ >>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>> >>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hi, This is the next error message after configuring and >>>>>>>>>>>>> building with the triangle package when trying to run ex12 >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>>>>>>>> >>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>> >>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>> >>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>> >>>>>>>>>>>> or a full run >>>>>>>>>>>> >>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>>>>>> -petscspace_order 1 >>>>>>>>>>>> >>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet -interpolate >>>>>>>>>>>> -petscspace_order 2 >>>>>>>>>>>> >>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>>>>>>> Exception,probably divide by zero >>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>>> memory corruption errors >>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>>>> not available, >>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>>>> the function >>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>> shooting. >>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda >>>>>>>>>>>>> by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>>>>> [0]PETSC ERROR: Configure options --prefix=/home/mjonesa/local >>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown >>>>>>>>>>>>> file >>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>> >>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. >>>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>>> error messages below: >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Great! Its running. 
You can reconfigure like this: >>>>>>>>>>>>> >>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>> >>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>> >>>>>>>>>>>>> make >>>>>>>>>>>>> >>>>>>>>>>>>> and then rerun. You can load meshes, but its much easier to >>>>>>>>>>>>> have triangle create them. >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>>>>>> type! >>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package >>>>>>>>>>>>>> support. >>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>> updates. >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda >>>>>>>>>>>>>> by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander < >>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. 
when >>>>>>>>>>>>>>> I type make ex12 I get this: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings -Wno-strict-aliasing >>>>>>>>>>>>>>> -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or directory >>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to work for >>>>>>>>>>>>>> you with the release. Please see the link I sent. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin Alexander >>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and just >>>>>>>>>>>>>>>>> did a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) >>>>>>>>>>>>>>>> or PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>> do not match what you built. 
Please send configure.log >>>>>>>>>>>>>>>> and make.log >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want to use >>>>>>>>>>>>>>>> the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim >>>>>>>>>>>>>>>>> order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>> File "bin/pythonscripts/PetscGenerateFEMQuadrature.py", >>>>>>>>>>>>>>>>> line 15, in >>>>>>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I have removed the requirement of generating the header >>>>>>>>>>>>>>>>> file (its now all handled in C). I thought >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I changed the documentation everywhere (including the >>>>>>>>>>>>>>>>> latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old >>>>>>>>>>>>>>>>> docs? >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>> their experiments lead. 
>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -- >>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -- >>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. >>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> *Miguel Angel Salazar de Troya* >>>>>>> Graduate Research Assistant >>>>>>> Department of Mechanical Science and Engineering >>>>>>> University of Illinois at Urbana-Champaign >>>>>>> (217) 550-2360 >>>>>>> salaza11 at illinois.edu >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> *Miguel Angel Salazar de Troya* >>>>> Graduate Research Assistant >>>>> Department of Mechanical Science and Engineering >>>>> University of Illinois at Urbana-Champaign >>>>> (217) 550-2360 >>>>> salaza11 at illinois.edu >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> *Miguel Angel Salazar de Troya* >>> Graduate Research Assistant >>> Department of Mechanical Science and Engineering >>> University of Illinois at Urbana-Champaign >>> (217) 550-2360 >>> salaza11 at illinois.edu >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. 
>> -- Norbert Wiener >> > > > > -- > *Miguel Angel Salazar de Troya* > Graduate Research Assistant > Department of Mechanical Science and Engineering > University of Illinois at Urbana-Champaign > (217) 550-2360 > salaza11 at illinois.edu > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From salazardetroya at gmail.com Mon Mar 3 16:59:47 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Mon, 3 Mar 2014 16:59:47 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: You are welcome, thanks for your help. On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley wrote: > On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < > salazardetroya at gmail.com> wrote: > >> Thanks. This is what I get. >> > > Okay, this was broken by a new push to master/next in the last few days. I > have pushed a fix, > however next is currently broken due to a failure to check in a file. This > should be fixed shortly, > and then ex12 will work. I will mail you when its ready. > > Thanks for finding this, > > Matt > > >> (gdb) cont >> Continuing. >> >> Program received signal SIGSEGV, Segmentation fault. >> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >> X=0x168b5b0, >> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >> user=0x7fd6811be509) >> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >> (gdb) where >> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >> X=0x168b5b0, >> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >> user=0x7fd6811be509) >> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal (snes=0x14e9450, >> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) >> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >> X=0x1622ad0, >> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >> >> >> >> >> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley wrote: >> >>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>> salazardetroya at gmail.com> wrote: >>> >>>> This is what I get at gdb when I type 'where'. >>>> >>> >>> You have to type 'cont', and then when it fails you type 'where'. 
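Put together, the debugging recipe being described looks roughly like the sketch below. The run options are the ones used elsewhere in this thread, and exactly how gdb attaches (same terminal or a separate xterm) depends on the local -start_in_debugger setup, so treat this only as an illustration:

$ ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet \
    -interpolate 0 -petscspace_order 1 -start_in_debugger
(gdb) cont     # continue past the attach point and let ex12 run
               # ... execution proceeds until the SIGSEGV is raised ...
(gdb) where    # print the stack trace at the point of failure
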
>>> >>> Matt >>> >>> >>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from /lib64/libc.so.6 >>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>>> args=0x7fff5cd8df20, file=0x0, >>>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial finite >>>> elements.\nWe solve the Poisson problem in a rectangular\ndomain, using a >>>> parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>> >>>> The rest of the gdb output is attached. I am a bit ignorant with gdb, I >>>> apologize for that. >>>> >>>> >>>> >>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley wrote: >>>> >>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >>>>> salazardetroya at gmail.com> wrote: >>>>> >>>>>> Thanks for your response. Sorry I did not have the "next" version, >>>>>> but the "master" version. I still have an error though. I followed the >>>>>> steps given here (https://bitbucket.org/petsc/petsc/wiki/Home) to >>>>>> obtain the next version, I configured petsc as above and ran ex12 as above >>>>>> as well, getting this error: >>>>>> >>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test -refinement_limit >>>>>> 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>> -dm_plex_print_fem 1 >>>>>> Local function: >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> 0.25 >>>>>> 1 >>>>>> 0.25 >>>>>> 0.5 >>>>>> 1.25 >>>>>> 1 >>>>>> 1.25 >>>>>> 2 >>>>>> Initial guess >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0.5 >>>>>> L_2 Error: 0.111111 >>>>>> Residual: >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> Initial Residual >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> L_2 Residual: 0 >>>>>> >>>>> >>>>> Okay, now run with -start_in_debugger, and give me a stack trace using >>>>> 'where'. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>>>>> probably memory access out of range >>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>> -on_error_attach_debugger >>>>>> [0]PETSC ERROR: or see >>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>>> corruption errors >>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>> ------------------------------------ >>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>> available, >>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>> function >>>>>> [0]PETSC ERROR: is given. 
>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>> -------------------------------------------------------------- >>>>>> [0]PETSC ERROR: Signal received >>>>>> [0]PETSC ERROR: See http:// >>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>> shooting. >>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4705-gfb6b3bc >>>>>> GIT Date: 2014-03-03 08:23:43 -0600 >>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>>> Mon Mar 3 11:49:15 2014 >>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>> --download-chaco --with-c2html=0 >>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file >>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>> [unset]: aborting job: >>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>> >>>>>> >>>>>> >>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>>>>>> salazardetroya at gmail.com> wrote: >>>>>>> >>>>>>>> Hi everybody >>>>>>>> >>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>> specifically run it with the command options: >>>>>>>> >>>>>>> >>>>>>> We need to start narrowing down differences, because it runs for me >>>>>>> and our nightly tests. So, first can >>>>>>> you confirm that you are using the latest 'next' branch? 
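Confirming the branch is quick; here is a sketch, assuming the clone lives at the source path that shows up in the stack traces above (adjust the path if it lives elsewhere):

$ cd /home/salaza11/petsc
$ git branch          # the active branch is marked with an asterisk
$ git checkout next   # switch to 'next' if it is not already checked out
$ git pull            # bring it up to date before reconfiguring and rebuilding
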
>>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >>>>>>>> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>> >>>>>>>> And I get this output >>>>>>>> >>>>>>>> Local function: >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> 1 >>>>>>>> 1 >>>>>>>> 2 >>>>>>>> 1 >>>>>>>> 2 >>>>>>>> 2 >>>>>>>> 3 >>>>>>>> Initial guess >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> L_2 Error: 0.625 >>>>>>>> Residual: >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> Initial Residual >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> L_2 Residual: 0 >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>> Violation, probably memory access out of range >>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>> -on_error_attach_debugger >>>>>>>> [0]PETSC ERROR: or see >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>>>>> corruption errors >>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>> ------------------------------------ >>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>> available, >>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>> function >>>>>>>> [0]PETSC ERROR: is given. >>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>> ------------------------------------ >>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>> salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>> --download-chaco --with-c2html=0 >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>> [unset]: aborting job: >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>> >>>>>>>> >>>>>>>> Probably my problems could be on my configuration. I attach the >>>>>>>> configure.log. I ran ./configure like this >>>>>>>> >>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>> >>>>>>>> Thanks a lot in advance. >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>> knepley at gmail.com> wrote: >>>>>>>> >>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> If >>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>> >>>>>>>>>> is for serial, any chance we can get the options to run in >>>>>>>>>> parallel? 
>>>>>>>>>> >>>>>>>>> >>>>>>>>> Just use mpiexec -n >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> >>>>>>>>>> Regards >>>>>>>>>> Yaakoub El Khamra >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> ------------------------------ >>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>> >>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> These examples all seem to run excepting the following >>>>>>>>>>>>> command, >>>>>>>>>>>>> >>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>> >>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>> >>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>> Local function: >>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> This is a build problem, but it should affect all the runs. >>>>>>>>>>>> Is this reproducible? Can you send configure.log? MKL is the worst. If this >>>>>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>>> >>>>>>>>>>> --with-precision=single # I would not use this unless you are >>>>>>>>>>> doing something special, like CUDA >>>>>>>>>>> --with-clanguage=C++ # I would recommend switching to C, the >>>>>>>>>>> build is much faster >>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 --with-fc=0 >>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>> >>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>> --download-metis >>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>> >>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. I would >>>>>>>>>>> recommend switching to --download-f2cblaslapack, >>>>>>>>>>> or you can try and find that library. 
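Acting on that advice, a trimmed configure line might look something like the sketch below. The option set is an educated guess pieced together from the recommendations above (C instead of C++, a downloaded BLAS/LAPACK instead of the MKL library list, triangle for mesh generation), so treat it as a starting point rather than a known-good recipe:

$ cd /home/mjonesa/PETSc/petsc
$ ./configure --with-clanguage=c --download-f2cblaslapack \
      --download-triangle --with-c2html=0
$ make
$ cd src/snes/examples/tutorials && make ex12
$ mpiexec -n 2 ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 \
      -variable_coefficient field -interpolate 1 -petscspace_order 2   # parallel run, per the mpiexec note above
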
>>>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>> >>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Hi, This is the next error message after configuring and >>>>>>>>>>>>>> building with the triangle package when trying to run ex12 >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>>>>>>>>> >>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>> >>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>> >>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>> >>>>>>>>>>>>> or a full run >>>>>>>>>>>>> >>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>> >>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>> >>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>>>>>>>> Exception,probably divide by zero >>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>>>> memory corruption errors >>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>> below >>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>>>>> not available, >>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>>>>> the function >>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>> updates. >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named maeda >>>>>>>>>>>>>> by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown >>>>>>>>>>>>>> file >>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. >>>>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>>>> error messages below: >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Great! Its running. 
You can reconfigure like this: >>>>>>>>>>>>>> >>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>> >>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>> >>>>>>>>>>>>>> make >>>>>>>>>>>>>> >>>>>>>>>>>>>> and then rerun. You can load meshes, but its much easier to >>>>>>>>>>>>>> have triangle create them. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package >>>>>>>>>>>>>>> support. >>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. 
when >>>>>>>>>>>>>>>> I type make ex12 I get this: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings -Wno-strict-aliasing >>>>>>>>>>>>>>>> -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or directory >>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to work for >>>>>>>>>>>>>>> you with the release. Please see the link I sent. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin Alexander >>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and just >>>>>>>>>>>>>>>>>> did a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) >>>>>>>>>>>>>>>>> or PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>>> do not match what you built. 
Please send configure.log >>>>>>>>>>>>>>>>> and make.log >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want to use >>>>>>>>>>>>>>>>> the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim >>>>>>>>>>>>>>>>>> order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>> File "bin/pythonscripts/PetscGenerateFEMQuadrature.py", >>>>>>>>>>>>>>>>>> line 15, in >>>>>>>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I have removed the requirement of generating the header >>>>>>>>>>>>>>>>>> file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I changed the documentation everywhere (including the >>>>>>>>>>>>>>>>>> latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old >>>>>>>>>>>>>>>>>> docs? >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>> their experiments lead. 
>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -- >>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -- >>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. >>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> *Miguel Angel Salazar de Troya* >>>>>>>> Graduate Research Assistant >>>>>>>> Department of Mechanical Science and Engineering >>>>>>>> University of Illinois at Urbana-Champaign >>>>>>>> (217) 550-2360 >>>>>>>> salaza11 at illinois.edu >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> *Miguel Angel Salazar de Troya* >>>>>> Graduate Research Assistant >>>>>> Department of Mechanical Science and Engineering >>>>>> University of Illinois at Urbana-Champaign >>>>>> (217) 550-2360 >>>>>> salaza11 at illinois.edu >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>>> >>>> -- >>>> *Miguel Angel Salazar de Troya* >>>> Graduate Research Assistant >>>> Department of Mechanical Science and Engineering >>>> University of Illinois at Urbana-Champaign >>>> (217) 550-2360 >>>> salaza11 at illinois.edu >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. 
>>> -- Norbert Wiener >>> >> >> >> >> -- >> *Miguel Angel Salazar de Troya* >> Graduate Research Assistant >> Department of Mechanical Science and Engineering >> University of Illinois at Urbana-Champaign >> (217) 550-2360 >> salaza11 at illinois.edu >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: From fd.kong at siat.ac.cn Mon Mar 3 18:42:33 2014 From: fd.kong at siat.ac.cn (Fande Kong) Date: Mon, 3 Mar 2014 17:42:33 -0700 Subject: [petsc-users] questions about PetscSF Message-ID: Hi all, I was wondering mechanisms of the object PetscSF. What are definitions of roots and leaves? Do roots/leaves associate with the data we want to receive/send? For the function Bcast, it seems that we transfer data from roots to leaves. But in another function Reduce, it seems that we move data in the opposite direction (from leaves to roots). These kind of mechanisms possibly make users confused. Fande, -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Mon Mar 3 18:54:22 2014 From: jed at jedbrown.org (Jed Brown) Date: Mon, 03 Mar 2014 18:54:22 -0600 Subject: [petsc-users] questions about PetscSF In-Reply-To: References: Message-ID: <87txben5q9.fsf@jedbrown.org> Fande Kong writes: > Hi all, > > I was wondering mechanisms of the object PetscSF. What are definitions of > roots and leaves? Do roots/leaves associate with the data we want to > receive/send? For the function Bcast, it seems that we transfer data from > roots to leaves. But in another function Reduce, it seems that we move data > in the opposite direction (from leaves to roots). These kind of mechanisms > possibly make users confused. The SF graph is asymmetric so that it can have cleaner semantics. Read my note and reply here if something is still unclear. http://59a2.org/files/StarForest.pdf -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From knepley at gmail.com Mon Mar 3 19:05:53 2014 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 3 Mar 2014 19:05:53 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > You are welcome, thanks for your help. > Okay, I have rebuilt completely clean, and ex12 runs for me. Can you try again after pulling? 
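Spelled out, "try again after pulling" amounts to roughly the following sketch. The paths and configure options are the ones quoted later in this thread; re-running configure after the pull is the cautious choice rather than a strict requirement:

$ cd /home/salaza11/petsc
$ git pull                               # pick up the fix pushed to master/next
$ ./configure --download-mpich --download-scientificpython --download-triangle \
      --download-ctetgen --download-chaco --with-c2html=0
$ make
$ cd src/snes/examples/tutorials && make ex12
$ ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet \
    -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1
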
Thanks, Matt > On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley wrote: > >> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> Thanks. This is what I get. >>> >> >> Okay, this was broken by a new push to master/next in the last few days. >> I have pushed a fix, >> however next is currently broken due to a failure to check in a file. >> This should be fixed shortly, >> and then ex12 will work. I will mail you when its ready. >> >> Thanks for finding this, >> >> Matt >> >> >>> (gdb) cont >>> Continuing. >>> >>> Program received signal SIGSEGV, Segmentation fault. >>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>> X=0x168b5b0, >>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>> user=0x7fd6811be509) >>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >>> (gdb) where >>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>> X=0x168b5b0, >>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>> user=0x7fd6811be509) >>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal (snes=0x14e9450, >>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) >>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>> X=0x1622ad0, >>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>> >>> >>> >>> >>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley wrote: >>> >>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>>> salazardetroya at gmail.com> wrote: >>>> >>>>> This is what I get at gdb when I type 'where'. >>>>> >>>> >>>> You have to type 'cont', and then when it fails you type 'where'. >>>> >>>> Matt >>>> >>>> >>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from /lib64/libc.so.6 >>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>>>> args=0x7fff5cd8df20, file=0x0, >>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial finite >>>>> elements.\nWe solve the Poisson problem in a rectangular\ndomain, using a >>>>> parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>> >>>>> The rest of the gdb output is attached. I am a bit ignorant with gdb, >>>>> I apologize for that. >>>>> >>>>> >>>>> >>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley wrote: >>>>> >>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >>>>>> salazardetroya at gmail.com> wrote: >>>>>> >>>>>>> Thanks for your response. Sorry I did not have the "next" version, >>>>>>> but the "master" version. I still have an error though. 
I followed the >>>>>>> steps given here (https://bitbucket.org/petsc/petsc/wiki/Home) to >>>>>>> obtain the next version, I configured petsc as above and ran ex12 as above >>>>>>> as well, getting this error: >>>>>>> >>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test -refinement_limit >>>>>>> 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>> -dm_plex_print_fem 1 >>>>>>> Local function: >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> 0.25 >>>>>>> 1 >>>>>>> 0.25 >>>>>>> 0.5 >>>>>>> 1.25 >>>>>>> 1 >>>>>>> 1.25 >>>>>>> 2 >>>>>>> Initial guess >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0.5 >>>>>>> L_2 Error: 0.111111 >>>>>>> Residual: >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> Initial Residual >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> L_2 Residual: 0 >>>>>>> >>>>>> >>>>>> Okay, now run with -start_in_debugger, and give me a stack trace >>>>>> using 'where'. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>> Violation, probably memory access out of range >>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>> -on_error_attach_debugger >>>>>>> [0]PETSC ERROR: or see >>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>>>> corruption errors >>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>> ------------------------------------ >>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>> available, >>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>> function >>>>>>> [0]PETSC ERROR: is given. >>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>> -------------------------------------------------------------- >>>>>>> [0]PETSC ERROR: Signal received >>>>>>> [0]PETSC ERROR: See http:// >>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>>> shooting. 
>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>>>> Mon Mar 3 11:49:15 2014 >>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>> --download-chaco --with-c2html=0 >>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>> [unset]: aborting job: >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>> >>>>>>>>> Hi everybody >>>>>>>>> >>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>> specifically run it with the command options: >>>>>>>>> >>>>>>>> >>>>>>>> We need to start narrowing down differences, because it runs for me >>>>>>>> and our nightly tests. So, first can >>>>>>>> you confirm that you are using the latest 'next' branch? >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet >>>>>>>>> -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>> >>>>>>>>> And I get this output >>>>>>>>> >>>>>>>>> Local function: >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> 1 >>>>>>>>> 1 >>>>>>>>> 2 >>>>>>>>> 1 >>>>>>>>> 2 >>>>>>>>> 2 >>>>>>>>> 3 >>>>>>>>> Initial guess >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> L_2 Error: 0.625 >>>>>>>>> Residual: >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> Initial Residual >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> L_2 Residual: 0 >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>> Violation, probably memory access out of range >>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>> -on_error_attach_debugger >>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>> memory corruption errors >>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>> ------------------------------------ >>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>>> available, >>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>>> function >>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>> ------------------------------------ >>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>> salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>> [unset]: aborting job: >>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>> >>>>>>>>> >>>>>>>>> Probably my problems could be on my configuration. I attach the >>>>>>>>> configure.log. I ran ./configure like this >>>>>>>>> >>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>> >>>>>>>>> Thanks a lot in advance. >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> If >>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>> >>>>>>>>>>> is for serial, any chance we can get the options to run in >>>>>>>>>>> parallel? 
>>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Just use mpiexec -n >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Regards >>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>> >>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> These examples all seem to run excepting the following >>>>>>>>>>>>>> command, >>>>>>>>>>>>>> >>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>> >>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>> >>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> This is a build problem, but it should affect all the runs. >>>>>>>>>>>>> Is this reproducible? Can you send configure.log? MKL is the worst. If this >>>>>>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>>>> >>>>>>>>>>>> --with-precision=single # I would not use this unless you are >>>>>>>>>>>> doing something special, like CUDA >>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching to C, the >>>>>>>>>>>> build is much faster >>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>> >>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>> --download-metis >>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>> >>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. I would >>>>>>>>>>>> recommend switching to --download-f2cblaslapack, >>>>>>>>>>>> or you can try and find that library. 
>>>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Thanks, >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Hi, This is the next error message after configuring and >>>>>>>>>>>>>>> building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>>>>>>>>>> >>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>> >>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>> >>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>> >>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>> >>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>> >>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>> >>>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>>>>>>>>> Exception,probably divide by zero >>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>>>>> memory corruption errors >>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>> below >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start >>>>>>>>>>>>>>> of the function >>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/mjonesa/local/lib >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown >>>>>>>>>>>>>>> file >>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. >>>>>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>>>>> error messages below: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Great! Its running. 
You can reconfigure like this: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> make >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much easier >>>>>>>>>>>>>>> to have triangle create them. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package >>>>>>>>>>>>>>>> support. >>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0 >>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin Alexander >>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. 
>>>>>>>>>>>>>>>>> when I type make ex12 I get this: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or >>>>>>>>>>>>>>>>> directory >>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to work for >>>>>>>>>>>>>>>> you with the release. Please see the link I sent. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and just >>>>>>>>>>>>>>>>>>> did a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>> Stop. 
>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3.4.3) >>>>>>>>>>>>>>>>>> or PETSC_ARCH (linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want to use >>>>>>>>>>>>>>>>>> the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim >>>>>>>>>>>>>>>>>>> order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I have removed the requirement of generating the header >>>>>>>>>>>>>>>>>>> file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I changed the documentation everywhere (including the >>>>>>>>>>>>>>>>>>> latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old >>>>>>>>>>>>>>>>>>> docs? >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>> which their experiments lead. 
>>>>>>>>>>>>>>>>>>> -- Norbert Wiener
> --
> *Miguel Angel Salazar de Troya*
> Graduate Research Assistant
> Department of Mechanical Science and Engineering
> University of Illinois at Urbana-Champaign
> (217) 550-2360
> salaza11 at illinois.edu

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

From jed at jedbrown.org Mon Mar 3 20:58:34 2014
From: jed at jedbrown.org (Jed Brown)
Date: Mon, 03 Mar 2014 20:58:34 -0600
Subject: [petsc-users] DMDA questions
In-Reply-To:
References: <87zjlfw4al.fsf@jedbrown.org> <87r46rw0ag.fsf@jedbrown.org>
 <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org>
 <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov>
 <877g8bnq20.fsf@jedbrown.org>
Message-ID: <87d2i2mzz9.fsf@jedbrown.org>

Xiangdong writes:
> Could you please expand it a little more on using DMCoarsenHookAdd to
> restrict a fine vector on a coarse grid? The only example I can find is
> ex48 in snes. It is not clear how the coarse vectors are generated from
> that example.

Look at the use in src/ts/impls/implicit/theta/theta.c

From epscodes at gmail.com Tue Mar 4 11:44:51 2014
From: epscodes at gmail.com (Xiangdong)
Date: Tue, 4 Mar 2014 12:44:51 -0500
Subject: [petsc-users] vtk output ASCII or binary
Message-ID:

Hello everyone,

When I use PetscViewerVTKOpen to output a Vec in VTK format, is the output
written in ASCII or in binary? Are there any options to choose between them?

Thank you.

Xiangdong
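(A minimal sketch of the call the question above refers to, for context only;
the wrapper name WriteVecVTK and the file name solution.vtk are placeholders,
not part of the original message:)

  #include <petscvec.h>
  #include <petscviewer.h>

  /* Open a VTK viewer on a file and dump a Vec into it. */
  PetscErrorCode WriteVecVTK(Vec v)
  {
    PetscErrorCode ierr;
    PetscViewer    viewer;

    PetscFunctionBegin;
    ierr = PetscViewerVTKOpen(PETSC_COMM_WORLD,"solution.vtk",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
    ierr = VecView(v,viewer);CHKERRQ(ierr);           /* the viewer decides how the data is written */
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }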
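(Following Jed's pointer above to src/ts/impls/implicit/theta/theta.c, here is
a minimal hedged sketch of a coarsen/restrict hook pair registered with
DMCoarsenHookAdd(). The names MyCoarsenHook, MyRestrictHook and the named
vector "MyField" are placeholders; the hook signatures and the
MatRestrict/VecPointwiseMult pattern follow the theta.c usage:)

  #include <petscdm.h>

  /* Called whenever PETSc restricts from the fine DM to the coarse DM created
     from it; restrct and rscale are supplied by the DM. */
  static PetscErrorCode MyRestrictHook(DM fine,Mat restrct,Vec rscale,Mat inject,DM coarse,void *ctx)
  {
    PetscErrorCode ierr;
    Vec            Xfine,Xcoarse;

    PetscFunctionBegin;
    /* Each DM level carries its own copy of the field as a named vector */
    ierr = DMGetNamedGlobalVector(fine,"MyField",&Xfine);CHKERRQ(ierr);
    ierr = DMGetNamedGlobalVector(coarse,"MyField",&Xcoarse);CHKERRQ(ierr);
    ierr = MatRestrict(restrct,Xfine,Xcoarse);CHKERRQ(ierr);
    ierr = VecPointwiseMult(Xcoarse,rscale,Xcoarse);CHKERRQ(ierr); /* rescale so constants are preserved */
    ierr = DMRestoreNamedGlobalVector(fine,"MyField",&Xfine);CHKERRQ(ierr);
    ierr = DMRestoreNamedGlobalVector(coarse,"MyField",&Xcoarse);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

  /* Called when the fine DM is coarsened; re-register on the new coarse DM so
     that further coarsening keeps applying the same hooks. */
  static PetscErrorCode MyCoarsenHook(DM fine,DM coarse,void *ctx)
  {
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = DMCoarsenHookAdd(coarse,MyCoarsenHook,MyRestrictHook,ctx);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

  /* Register once on the finest DM, e.g. right after creating the DMDA:
       ierr = DMCoarsenHookAdd(da,MyCoarsenHook,MyRestrictHook,NULL);CHKERRQ(ierr);
  */

Registering on the finest DM is enough; the coarsen hook propagates itself to
each newly created coarse DM, so the restriction is applied on every level.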
URL: From salazardetroya at gmail.com Tue Mar 4 11:51:48 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Tue, 4 Mar 2014 11:51:48 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: I can run it now, thanks. Although if I run it with valgrind 3.5.0 (should I update to the last version?) I get some memory leaks related with the function DMPlexCreateBoxMesh. [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 ==9625== Memcheck, a memory error detector ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 ==9625== Local function: Vec Object: 1 MPI processes type: seq 0 0.25 1 0.25 0.5 1.25 1 1.25 2 Initial guess Vec Object: 1 MPI processes type: seq 0.5 L_2 Error: 0.111111 Residual: Vec Object: 1 MPI processes type: seq 0 0 0 0 0 0 0 0 0 Initial Residual Vec Object: 1 MPI processes type: seq 0 L_2 Residual: 0 Jacobian: Mat Object: 1 MPI processes type: seqaij row 0: (0, 4) Residual: Vec Object: 1 MPI processes type: seq 0 0 0 0 -2 0 0 0 0 Au - b = Au + F(0) Vec Object: 1 MPI processes type: seq 0 Linear L_2 Residual: 0 ==9625== ==9625== HEAP SUMMARY: ==9625== in use at exit: 288 bytes in 3 blocks ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 bytes allocated ==9625== ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 of 3 ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) ==9625== by 0x4051FA: CreateMesh (ex12.c:341) ==9625== by 0x408D3D: main (ex12.c:651) ==9625== ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 of 3 ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) ==9625== by 0x5D8D485: writepoly (triangle.c:12004) ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) ==9625== by 0x4051FA: CreateMesh (ex12.c:341) ==9625== by 0x408D3D: main (ex12.c:651) ==9625== ==9625== 144 bytes in 1 blocks are definitely lost in loss record 3 of 3 ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) ==9625== by 0x56B5EE4: DMPlexGenerate 
(plex.c:4503) ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) ==9625== by 0x4051FA: CreateMesh (ex12.c:341) ==9625== by 0x408D3D: main (ex12.c:651) ==9625== ==9625== LEAK SUMMARY: ==9625== definitely lost: 288 bytes in 3 blocks ==9625== indirectly lost: 0 bytes in 0 blocks ==9625== possibly lost: 0 bytes in 0 blocks ==9625== still reachable: 0 bytes in 0 blocks ==9625== suppressed: 0 bytes in 0 blocks ==9625== ==9625== For counts of detected and suppressed errors, rerun with: -v ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 from 6) On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley wrote: > On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < > salazardetroya at gmail.com> wrote: > >> You are welcome, thanks for your help. >> > > Okay, I have rebuilt completely clean, and ex12 runs for me. Can you try > again after pulling? > > Thanks, > > Matt > > >> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley wrote: >> >>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>> salazardetroya at gmail.com> wrote: >>> >>>> Thanks. This is what I get. >>>> >>> >>> Okay, this was broken by a new push to master/next in the last few days. >>> I have pushed a fix, >>> however next is currently broken due to a failure to check in a file. >>> This should be fixed shortly, >>> and then ex12 will work. I will mail you when its ready. >>> >>> Thanks for finding this, >>> >>> Matt >>> >>> >>>> (gdb) cont >>>> Continuing. >>>> >>>> Program received signal SIGSEGV, Segmentation fault. >>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>> X=0x168b5b0, >>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>> user=0x7fd6811be509) >>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >>>> (gdb) where >>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>> X=0x168b5b0, >>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>> user=0x7fd6811be509) >>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal (snes=0x14e9450, >>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) >>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>> X=0x1622ad0, >>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>> >>>> >>>> >>>> >>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley wrote: >>>> >>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>>>> salazardetroya at gmail.com> wrote: >>>>> >>>>>> This is what I get at gdb when I type 'where'. >>>>>> >>>>> >>>>> You have to type 'cont', and then when it fails you type 'where'. 
>>>>> >>>>> Matt >>>>> >>>>> >>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>> /lib64/libc.so.6 >>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial >>>>>> finite elements.\nWe solve the Poisson problem in a rectangular\ndomain, >>>>>> using a parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>> >>>>>> The rest of the gdb output is attached. I am a bit ignorant with gdb, >>>>>> I apologize for that. >>>>>> >>>>>> >>>>>> >>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >>>>>>> salazardetroya at gmail.com> wrote: >>>>>>> >>>>>>>> Thanks for your response. Sorry I did not have the "next" version, >>>>>>>> but the "master" version. I still have an error though. I followed the >>>>>>>> steps given here (https://bitbucket.org/petsc/petsc/wiki/Home) to >>>>>>>> obtain the next version, I configured petsc as above and ran ex12 as above >>>>>>>> as well, getting this error: >>>>>>>> >>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test -refinement_limit >>>>>>>> 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>> -dm_plex_print_fem 1 >>>>>>>> Local function: >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> 0.25 >>>>>>>> 1 >>>>>>>> 0.25 >>>>>>>> 0.5 >>>>>>>> 1.25 >>>>>>>> 1 >>>>>>>> 1.25 >>>>>>>> 2 >>>>>>>> Initial guess >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0.5 >>>>>>>> L_2 Error: 0.111111 >>>>>>>> Residual: >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> Initial Residual >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> L_2 Residual: 0 >>>>>>>> >>>>>>> >>>>>>> Okay, now run with -start_in_debugger, and give me a stack trace >>>>>>> using 'where'. 
>>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>> Violation, probably memory access out of range >>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>> -on_error_attach_debugger >>>>>>>> [0]PETSC ERROR: or see >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>>>>>>> corruption errors >>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>> ------------------------------------ >>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>> available, >>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>> function >>>>>>>> [0]PETSC ERROR: is given. >>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>> -------------------------------------------------------------- >>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>>>> shooting. >>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>> salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>> --download-chaco --with-c2html=0 >>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>> [unset]: aborting job: >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley wrote: >>>>>>>> >>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> Hi everybody >>>>>>>>>> >>>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>>> specifically run it with the command options: >>>>>>>>>> >>>>>>>>> >>>>>>>>> We need to start narrowing down differences, because it runs for >>>>>>>>> me and our nightly tests. So, first can >>>>>>>>> you confirm that you are using the latest 'next' branch? 
>>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>> >>>>>>>>>> And I get this output >>>>>>>>>> >>>>>>>>>> Local function: >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> 1 >>>>>>>>>> 1 >>>>>>>>>> 2 >>>>>>>>>> 1 >>>>>>>>>> 2 >>>>>>>>>> 2 >>>>>>>>>> 3 >>>>>>>>>> Initial guess >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>> Residual: >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> Initial Residual >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> L_2 Residual: 0 >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>> -on_error_attach_debugger >>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>> memory corruption errors >>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>> ------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>>>> available, >>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>>>> function >>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>> ------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>> shooting. >>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>> salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>> [unset]: aborting job: >>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Probably my problems could be on my configuration. I attach the >>>>>>>>>> configure.log. I ran ./configure like this >>>>>>>>>> >>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>> >>>>>>>>>> Thanks a lot in advance. >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> If >>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>> >>>>>>>>>>>> is for serial, any chance we can get the options to run in >>>>>>>>>>>> parallel? 
>>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Regards >>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander < >>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> These examples all seem to run excepting the following >>>>>>>>>>>>>>> command, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> This is a build problem, but it should affect all the runs. >>>>>>>>>>>>>> Is this reproducible? Can you send configure.log? MKL is the worst. If this >>>>>>>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>>>>> >>>>>>>>>>>>> --with-precision=single # I would not use this unless you >>>>>>>>>>>>> are doing something special, like CUDA >>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching to C, >>>>>>>>>>>>> the build is much faster >>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>> >>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>> --download-metis >>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>> >>>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. I would >>>>>>>>>>>>> recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>> or you can try and find that library. 
>>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander < >>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Hi, This is the next error message after configuring and >>>>>>>>>>>>>>>> building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>>>>>>>>>> Exception,probably divide by zero >>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>>> below >>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start >>>>>>>>>>>>>>>> of the function >>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown >>>>>>>>>>>>>>>> file >>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin Alexander >>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. >>>>>>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>>>>>> error messages below: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Great! 
Its running. You can reconfigure like this: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much easier >>>>>>>>>>>>>>>> to have triangle create them. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package >>>>>>>>>>>>>>>>> support. >>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - process >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. >>>>>>>>>>>>>>>>>> when I type make ex12 I get this: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or >>>>>>>>>>>>>>>>>> directory >>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to work >>>>>>>>>>>>>>>>> for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and >>>>>>>>>>>>>>>>>>>> just did a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>> Stop. 
>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3. >>>>>>>>>>>>>>>>>>> 4.3) or PETSC_ARCH (linux-gnu-cxx-debug) environment >>>>>>>>>>>>>>>>>>> variables >>>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want to >>>>>>>>>>>>>>>>>>> use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim >>>>>>>>>>>>>>>>>>>> order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating the header >>>>>>>>>>>>>>>>>>>> file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere (including the >>>>>>>>>>>>>>>>>>>> latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old >>>>>>>>>>>>>>>>>>>> docs? >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>> which their experiments lead. 
>>>>>>>>> -- Norbert Wiener -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: From fd.kong at siat.ac.cn Tue Mar 4 11:59:10 2014 From: fd.kong at siat.ac.cn (Fande Kong) Date: Tue, 4 Mar 2014 10:59:10 -0700 Subject: [petsc-users] questions about PetscSF In-Reply-To: <87txben5q9.fsf@jedbrown.org> References: <87txben5q9.fsf@jedbrown.org> Message-ID: Jed, Thanks, I read the note you gave. I still have some questions. (1) One root can be related to zero, one or many leaves, right? One leaf can only be related to zero or one root. (2) In the Algorithms section, could you please give me a very simple example to demonstrate how 'Extracting a submatrix from a sparse matrix', 'Ownership discovery and transfer' and 'Graph distribution' work? The description is hard for me to understand. Fande, On Mon, Mar 3, 2014 at 5:54 PM, Jed Brown wrote: > Fande Kong writes: > > Hi all, > > I was wondering mechanisms of the object PetscSF. What are definitions of > roots and leaves? Do roots/leaves associate with the data we want to > receive/send? For the function Bcast, it seems that we transfer data from > roots to leaves. But in another function Reduce, it seems that we move data > in the opposite direction (from leaves to roots). These kind of > mechanisms > possibly make users confused.
> > The SF graph is asymmetric so that it can have cleaner semantics. Read > my note and reply here if something is still unclear. > > http://59a2.org/files/StarForest.pdf > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 4 12:01:28 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 4 Mar 2014 12:01:28 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > I can run it now, thanks. Although if I run it with valgrind 3.5.0 (should > I update to the last version?) I get some memory leaks related with the > function DMPlexCreateBoxMesh. > I will check it out. Thanks, Matt > [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 -run_type > test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 > -petscspace_order 1 -show_initial -dm_plex_print_fem 1 > ==9625== Memcheck, a memory error detector > ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. > ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info > ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 -bc_type > dirichlet -interpolate 0 -petscspace_order 1 -show_initial > -dm_plex_print_fem 1 > ==9625== > Local function: > Vec Object: 1 MPI processes > type: seq > 0 > 0.25 > 1 > 0.25 > 0.5 > 1.25 > 1 > 1.25 > 2 > Initial guess > Vec Object: 1 MPI processes > type: seq > 0.5 > L_2 Error: 0.111111 > Residual: > Vec Object: 1 MPI processes > type: seq > 0 > 0 > 0 > 0 > 0 > 0 > 0 > 0 > 0 > Initial Residual > Vec Object: 1 MPI processes > type: seq > 0 > L_2 Residual: 0 > Jacobian: > Mat Object: 1 MPI processes > type: seqaij > row 0: (0, 4) > Residual: > Vec Object: 1 MPI processes > type: seq > 0 > 0 > 0 > 0 > -2 > 0 > 0 > 0 > 0 > Au - b = Au + F(0) > Vec Object: 1 MPI processes > type: seq > 0 > Linear L_2 Residual: 0 > ==9625== > ==9625== HEAP SUMMARY: > ==9625== in use at exit: 288 bytes in 3 blocks > ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 bytes > allocated > ==9625== > ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 of 3 > ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) > ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) > ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) > ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) > ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) > ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) > ==9625== by 0x4051FA: CreateMesh (ex12.c:341) > ==9625== by 0x408D3D: main (ex12.c:651) > ==9625== > ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 of 3 > ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) > ==9625== by 0x5D8D485: writepoly (triangle.c:12004) > ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) > ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) > ==9625== by 
0x56B5EE4: DMPlexGenerate (plex.c:4503) > ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) > ==9625== by 0x4051FA: CreateMesh (ex12.c:341) > ==9625== by 0x408D3D: main (ex12.c:651) > ==9625== > ==9625== 144 bytes in 1 blocks are definitely lost in loss record 3 of 3 > ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) > ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) > ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) > ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) > ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) > ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) > ==9625== by 0x4051FA: CreateMesh (ex12.c:341) > ==9625== by 0x408D3D: main (ex12.c:651) > ==9625== > ==9625== LEAK SUMMARY: > ==9625== definitely lost: 288 bytes in 3 blocks > ==9625== indirectly lost: 0 bytes in 0 blocks > ==9625== possibly lost: 0 bytes in 0 blocks > ==9625== still reachable: 0 bytes in 0 blocks > ==9625== suppressed: 0 bytes in 0 blocks > ==9625== > ==9625== For counts of detected and suppressed errors, rerun with: -v > ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 from 6) > > > > > > On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley wrote: > >> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> You are welcome, thanks for your help. >>> >> >> Okay, I have rebuilt completely clean, and ex12 runs for me. Can you try >> again after pulling? >> >> Thanks, >> >> Matt >> >> >>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley wrote: >>> >>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>>> salazardetroya at gmail.com> wrote: >>>> >>>>> Thanks. This is what I get. >>>>> >>>> >>>> Okay, this was broken by a new push to master/next in the last few >>>> days. I have pushed a fix, >>>> however next is currently broken due to a failure to check in a file. >>>> This should be fixed shortly, >>>> and then ex12 will work. I will mail you when its ready. >>>> >>>> Thanks for finding this, >>>> >>>> Matt >>>> >>>> >>>>> (gdb) cont >>>>> Continuing. >>>>> >>>>> Program received signal SIGSEGV, Segmentation fault. >>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>> X=0x168b5b0, >>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>> user=0x7fd6811be509) >>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >>>>> (gdb) where >>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>> X=0x168b5b0, >>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>> user=0x7fd6811be509) >>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal (snes=0x14e9450, >>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) >>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>>> X=0x1622ad0, >>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>> >>>>> >>>>> >>>>> >>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley wrote: >>>>> >>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>>>>> salazardetroya at gmail.com> wrote: >>>>>> >>>>>>> This is what I get at gdb when I type 'where'. 
>>>>>>> >>>>>> >>>>>> You have to type 'cont', and then when it fails you type 'where'. >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>> /lib64/libc.so.6 >>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial >>>>>>> finite elements.\nWe solve the Poisson problem in a rectangular\ndomain, >>>>>>> using a parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>> >>>>>>> The rest of the gdb output is attached. I am a bit ignorant with >>>>>>> gdb, I apologize for that. >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>> >>>>>>>>> Thanks for your response. Sorry I did not have the "next" version, >>>>>>>>> but the "master" version. I still have an error though. I followed the >>>>>>>>> steps given here (https://bitbucket.org/petsc/petsc/wiki/Home) to >>>>>>>>> obtain the next version, I configured petsc as above and ran ex12 as above >>>>>>>>> as well, getting this error: >>>>>>>>> >>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>> Local function: >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> 0.25 >>>>>>>>> 1 >>>>>>>>> 0.25 >>>>>>>>> 0.5 >>>>>>>>> 1.25 >>>>>>>>> 1 >>>>>>>>> 1.25 >>>>>>>>> 2 >>>>>>>>> Initial guess >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0.5 >>>>>>>>> L_2 Error: 0.111111 >>>>>>>>> Residual: >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> Initial Residual >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> L_2 Residual: 0 >>>>>>>>> >>>>>>>> >>>>>>>> Okay, now run with -start_in_debugger, and give me a stack trace >>>>>>>> using 'where'. 
>>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> [0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>> Violation, probably memory access out of range >>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>> -on_error_attach_debugger >>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>> memory corruption errors >>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>> ------------------------------------ >>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>>> available, >>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>>> function >>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>> -------------------------------------------------------------- >>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>>>>> shooting. >>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>> salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file >>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>> [unset]: aborting job: >>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley >>>>>>>> > wrote: >>>>>>>>> >>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> Hi everybody >>>>>>>>>>> >>>>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>>>> specifically run it with the command options: >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> We need to start narrowing down differences, because it runs for >>>>>>>>>> me and our nightly tests. So, first can >>>>>>>>>> you confirm that you are using the latest 'next' branch? 
>>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>> >>>>>>>>>>> And I get this output >>>>>>>>>>> >>>>>>>>>>> Local function: >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> 1 >>>>>>>>>>> 1 >>>>>>>>>>> 2 >>>>>>>>>>> 1 >>>>>>>>>>> 2 >>>>>>>>>>> 2 >>>>>>>>>>> 3 >>>>>>>>>>> Initial guess >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>> Residual: >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> Initial Residual >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>> memory corruption errors >>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>> ------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>> not available, >>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>> the function >>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>> ------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>> shooting. >>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>> salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Probably my problems could be on my configuration. I attach the >>>>>>>>>>> configure.log. I ran ./configure like this >>>>>>>>>>> >>>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>>> >>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> If >>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>> >>>>>>>>>>>>> is for serial, any chance we can get the options to run in >>>>>>>>>>>>> parallel? 
>>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Regards >>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin Alexander >>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> These examples all seem to run excepting the following >>>>>>>>>>>>>>>> command, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> This is a build problem, but it should affect all the >>>>>>>>>>>>>>> runs. Is this reproducible? Can you send configure.log? MKL is the worst. >>>>>>>>>>>>>>> If this >>>>>>>>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>>>>>> >>>>>>>>>>>>>> --with-precision=single # I would not use this unless you >>>>>>>>>>>>>> are doing something special, like CUDA >>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching to C, >>>>>>>>>>>>>> the build is much faster >>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>> >>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>> >>>>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. I >>>>>>>>>>>>>> would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>> or you can try and find that library. 
>>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin Alexander >>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hi, This is the next error message after configuring and >>>>>>>>>>>>>>>>> building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try running >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >>>>>>>>>>>>>>>>> Exception,probably divide by zero >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>>>> below >>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start >>>>>>>>>>>>>>>>> of the function >>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. 
>>>>>>>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>>>>>>> error messages below: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py >>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much easier >>>>>>>>>>>>>>>>> to have triangle create them. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package >>>>>>>>>>>>>>>>>> support. >>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - >>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander 
>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. >>>>>>>>>>>>>>>>>>> when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or >>>>>>>>>>>>>>>>>>> directory >>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to work >>>>>>>>>>>>>>>>>> for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and >>>>>>>>>>>>>>>>>>>>> just did a 'make ex12.c' with the following error if this helps? 
: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3. >>>>>>>>>>>>>>>>>>>> 4.3) or PETSC_ARCH (linux-gnu-cxx-debug) environment >>>>>>>>>>>>>>>>>>>> variables >>>>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want to >>>>>>>>>>>>>>>>>>>> use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim >>>>>>>>>>>>>>>>>>>>> order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating the >>>>>>>>>>>>>>>>>>>>> header file (its now all handled in C). 
I thought >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere (including the >>>>>>>>>>>>>>>>>>>>> latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the old >>>>>>>>>>>>>>>>>>>>> docs? >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -- >>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. 
>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> *Miguel Angel Salazar de Troya* >>>>>>>>>>> Graduate Research Assistant >>>>>>>>>>> Department of Mechanical Science and Engineering >>>>>>>>>>> University of Illinois at Urbana-Champaign >>>>>>>>>>> (217) 550-2360 >>>>>>>>>>> salaza11 at illinois.edu >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> *Miguel Angel Salazar de Troya* >>>>>>>>> Graduate Research Assistant >>>>>>>>> Department of Mechanical Science and Engineering >>>>>>>>> University of Illinois at Urbana-Champaign >>>>>>>>> (217) 550-2360 >>>>>>>>> salaza11 at illinois.edu >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> *Miguel Angel Salazar de Troya* >>>>>>> Graduate Research Assistant >>>>>>> Department of Mechanical Science and Engineering >>>>>>> University of Illinois at Urbana-Champaign >>>>>>> (217) 550-2360 >>>>>>> salaza11 at illinois.edu >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> *Miguel Angel Salazar de Troya* >>>>> Graduate Research Assistant >>>>> Department of Mechanical Science and Engineering >>>>> University of Illinois at Urbana-Champaign >>>>> (217) 550-2360 >>>>> salaza11 at illinois.edu >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> *Miguel Angel Salazar de Troya* >>> Graduate Research Assistant >>> Department of Mechanical Science and Engineering >>> University of Illinois at Urbana-Champaign >>> (217) 550-2360 >>> salaza11 at illinois.edu >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > *Miguel Angel Salazar de Troya* > Graduate Research Assistant > Department of Mechanical Science and Engineering > University of Illinois at Urbana-Champaign > (217) 550-2360 > salaza11 at illinois.edu > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 4 12:02:35 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 4 Mar 2014 12:02:35 -0600 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: On Tue, Mar 4, 2014 at 11:44 AM, Xiangdong wrote: > Hello everyone, > > When I use PetsViewerVTKOpen to output vec in vtk format, is it in ASCII > format or binary format? 
Are there any options to choose between them? > It is determined by the format, e.g. PETSC_VIEWER_VTK_VTU. Matt > Thank you. > > Xiangdong > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 4 12:04:24 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 4 Mar 2014 12:04:24 -0600 Subject: [petsc-users] questions about PetscSF In-Reply-To: References: <87txben5q9.fsf@jedbrown.org> Message-ID: On Tue, Mar 4, 2014 at 11:59 AM, Fande Kong wrote: > Jed, > > Thanks, > > I readed the node you gave. I still have some questions. > > (1) One root can be related with zero, one or many leaves, right? One leaf > only can be related with zero or one root. > Yes, that is what makes it a tree, rather than a DAG. > (2) In Algorithms section. Could you please give me a very simple example > to demonstrate how 'Extracting a submatrix from a sparse matrix', ' > Ownership discovery and transfer' and 'Graph distribution'. The description > is hard for me to understand. > These are not simple things. What exactly are you looking for? You can see graph distribution done with PetscSF in DMPlexDistribute(). Matt > Fande, > > > On Mon, Mar 3, 2014 at 5:54 PM, Jed Brown wrote: > >> Fande Kong writes: >> >> > Hi all, >> > >> > I was wondering mechanisms of the object PetscSF. What are definitions >> of >> > roots and leaves? Do roots/leaves associate with the data we want to >> > receive/send? For the function Bcast, it seems that we transfer data >> from >> > roots to leaves. But in another function Reduce, it seems that we move >> data >> > in the opposite direction (from leaves to roots). These kind of >> mechanisms >> > possibly make users confused. >> >> The SF graph is asymmetric so that it can have cleaner semantics. Read >> my note and reply here if something is still unclear. >> >> http://59a2.org/files/StarForest.pdf >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From epscodes at gmail.com Tue Mar 4 12:31:44 2014 From: epscodes at gmail.com (Xiangdong) Date: Tue, 4 Mar 2014 13:31:44 -0500 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: What is the difference between PetscViewerSetType and PetscViewerSetFormat? It seems that the first one take argument like PETSCVIEWERVTK, while the second one takes PETSC_VIEWER_VTK_VTS. By the way, how can I find a full list of formats? Clearly, PETSC_VIEWER_VTK_VTS is not listed in online documentation for PetscViewerSetFormat http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerSetFormat.html Thank you. Xiangdong On Tue, Mar 4, 2014 at 1:02 PM, Matthew Knepley wrote: > On Tue, Mar 4, 2014 at 11:44 AM, Xiangdong wrote: > >> Hello everyone, >> >> When I use PetsViewerVTKOpen to output vec in vtk format, is it in ASCII >> format or binary format? Are there any options to choose between them? >> > > It is determined by the format, e.g. PETSC_VIEWER_VTK_VTU. > > Matt > > >> Thank you. 
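For concreteness, here is a minimal sketch of the VTK output path discussed above (untested and only illustrative: "solution.vtu" is a made-up file name, and x is assumed to be a global Vec obtained from a DMDA or DMPlex, so the viewer knows the mesh). With the PETSCVIEWERVTK viewer the format follows from the file extension given to PetscViewerVTKOpen(): a .vts or .vtu name selects the XML formats (PETSC_VIEWER_VTK_VTS / PETSC_VIEWER_VTK_VTU), which as far as I can tell are written with binary appended data, while a legacy .vtk name falls back to the old ASCII VTK format.

  PetscViewer    viewer;
  PetscErrorCode ierr;

  /* ".vtu" implies PETSC_VIEWER_VTK_VTU (XML with binary appended data);
     a name ending in ".vtk" would instead give the legacy ASCII format. */
  ierr = PetscViewerVTKOpen(PETSC_COMM_WORLD,"solution.vtu",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = VecView(x,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);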
>> >> Xiangdong >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 4 12:52:49 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 4 Mar 2014 12:52:49 -0600 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: On Tue, Mar 4, 2014 at 12:31 PM, Xiangdong wrote: > What is the difference between PetscViewerSetType and > PetscViewerSetFormat? It seems that the first one take argument like > PETSCVIEWERVTK, while the second one takes PETSC_VIEWER_VTK_VTS. > A viewer type is the object type, just like other PETSc object, e.g. KSP. The format is a particular version of that output. For example, ASCII viewer is a type, whereas Matlab is a format. > By the way, how can I find a full list of formats? Clearly, > PETSC_VIEWER_VTK_VTS is not listed in online documentation for > PetscViewerSetFormat > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerSetFormat.html > It is listed in the complete list: http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Viewer/PetscViewerFormat.html Matt > Thank you. > > Xiangdong > > > On Tue, Mar 4, 2014 at 1:02 PM, Matthew Knepley wrote: > >> On Tue, Mar 4, 2014 at 11:44 AM, Xiangdong wrote: >> >>> Hello everyone, >>> >>> When I use PetsViewerVTKOpen to output vec in vtk format, is it in ASCII >>> format or binary format? Are there any options to choose between them? >>> >> >> It is determined by the format, e.g. PETSC_VIEWER_VTK_VTU. >> >> Matt >> >> >>> Thank you. >>> >>> Xiangdong >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From epscodes at gmail.com Tue Mar 4 12:58:09 2014 From: epscodes at gmail.com (Xiangdong) Date: Tue, 4 Mar 2014 13:58:09 -0500 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: Thanks, Matt. Given that the FILE_MODE_APPEND is not supported for VTK format, is it possible to write two vectors into the same VTK data file? Thank you. Xiangdong On Tue, Mar 4, 2014 at 1:52 PM, Matthew Knepley wrote: > On Tue, Mar 4, 2014 at 12:31 PM, Xiangdong wrote: > >> What is the difference between PetscViewerSetType and >> PetscViewerSetFormat? It seems that the first one take argument like >> PETSCVIEWERVTK, while the second one takes PETSC_VIEWER_VTK_VTS. >> > > A viewer type is the object type, just like other PETSc object, e.g. KSP. > The format is a particular version > of that output. For example, ASCII viewer is a type, whereas Matlab is a > format. > > >> By the way, how can I find a full list of formats? Clearly, >> PETSC_VIEWER_VTK_VTS is not listed in online documentation for >> PetscViewerSetFormat >> >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerSetFormat.html >> > > It is listed in the complete list: > > > http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Viewer/PetscViewerFormat.html > > Matt > > >> Thank you. 
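To make the type/format split concrete, a small sketch along the lines of the ASCII/Matlab example above (A is whatever assembled Mat you want to inspect; nothing here is specific to this thread):

PetscViewer v = PETSC_VIEWER_STDOUT_WORLD;           /* viewer type: PETSCVIEWERASCII */
PetscViewerSetFormat(v, PETSC_VIEWER_ASCII_MATLAB);  /* format: Matlab-flavored text */
MatView(A, v);

For VTK output the same split is PETSCVIEWERVTK as the type and PETSC_VIEWER_VTK_VTS (or PETSC_VIEWER_VTK_VTU) as the format, which PetscViewerVTKOpen normally selects for you from the file extension.
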
>> >> Xiangdong >> >> >> On Tue, Mar 4, 2014 at 1:02 PM, Matthew Knepley wrote: >> >>> On Tue, Mar 4, 2014 at 11:44 AM, Xiangdong wrote: >>> >>>> Hello everyone, >>>> >>>> When I use PetsViewerVTKOpen to output vec in vtk format, is it in >>>> ASCII format or binary format? Are there any options to choose between them? >>>> >>> >>> It is determined by the format, e.g. PETSC_VIEWER_VTK_VTU. >>> >>> Matt >>> >>> >>>> Thank you. >>>> >>>> Xiangdong >>>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 4 13:06:15 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 4 Mar 2014 13:06:15 -0600 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: On Tue, Mar 4, 2014 at 12:58 PM, Xiangdong wrote: > Thanks, Matt. > > Given that the FILE_MODE_APPEND is not supported for VTK format, is it > possible to write two vectors into the same VTK data file? > Yes, just call VecView() twice. Matt > Thank you. > > Xiangdong > > > On Tue, Mar 4, 2014 at 1:52 PM, Matthew Knepley wrote: > >> On Tue, Mar 4, 2014 at 12:31 PM, Xiangdong wrote: >> >>> What is the difference between PetscViewerSetType and >>> PetscViewerSetFormat? It seems that the first one take argument like >>> PETSCVIEWERVTK, while the second one takes PETSC_VIEWER_VTK_VTS. >>> >> >> A viewer type is the object type, just like other PETSc object, e.g. KSP. >> The format is a particular version >> of that output. For example, ASCII viewer is a type, whereas Matlab is a >> format. >> >> >>> By the way, how can I find a full list of formats? Clearly, >>> PETSC_VIEWER_VTK_VTS is not listed in online documentation for >>> PetscViewerSetFormat >>> >>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerSetFormat.html >>> >> >> It is listed in the complete list: >> >> >> http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Viewer/PetscViewerFormat.html >> >> Matt >> >> >>> Thank you. >>> >>> Xiangdong >>> >>> >>> On Tue, Mar 4, 2014 at 1:02 PM, Matthew Knepley wrote: >>> >>>> On Tue, Mar 4, 2014 at 11:44 AM, Xiangdong wrote: >>>> >>>>> Hello everyone, >>>>> >>>>> When I use PetsViewerVTKOpen to output vec in vtk format, is it in >>>>> ASCII format or binary format? Are there any options to choose between them? >>>>> >>>> >>>> It is determined by the format, e.g. PETSC_VIEWER_VTK_VTU. >>>> >>>> Matt >>>> >>>> >>>>> Thank you. >>>>> >>>>> Xiangdong >>>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
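A sketch of the "call VecView() twice" route, assuming both Vecs live on the same DMDA da and using made-up field names. As far as I can tell, the names set with PetscObjectSetName() become the array names inside the one .vts file, and everything is flushed to disk when the viewer is destroyed:

PetscViewer viewer;
Vec         p, s;

DMCreateGlobalVector(da, &p);
DMCreateGlobalVector(da, &s);
PetscObjectSetName((PetscObject)p, "pressure");
PetscObjectSetName((PetscObject)s, "saturation");
/* ... fill p and s ... */
PetscViewerVTKOpen(PETSC_COMM_WORLD, "fields.vts", FILE_MODE_WRITE, &viewer);
VecView(p, viewer);
VecView(s, viewer);
PetscViewerDestroy(&viewer);  /* both fields end up in fields.vts */
VecDestroy(&p);
VecDestroy(&s);
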
-- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Mar 4 14:32:32 2014 From: jed at jedbrown.org (Jed Brown) Date: Tue, 04 Mar 2014 13:32:32 -0700 Subject: [petsc-users] questions about PetscSF In-Reply-To: References: <87txben5q9.fsf@jedbrown.org> Message-ID: <87k3c9ln6n.fsf@jedbrown.org> Matthew Knepley writes: >> (2) In Algorithms section. Could you please give me a very simple example >> to demonstrate how 'Extracting a submatrix from a sparse matrix', ' >> Ownership discovery and transfer' and 'Graph distribution'. The description >> is hard for me to understand. >> > > These are not simple things. What exactly are you looking for? You can see > graph distribution > done with PetscSF in DMPlexDistribute(). You might look at MatPermute_MPIAIJ or MatTranspose_MPIAIJ for less cluttered uses. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From epscodes at gmail.com Tue Mar 4 15:43:11 2014 From: epscodes at gmail.com (Xiangdong) Date: Tue, 4 Mar 2014 16:43:11 -0500 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: If I use VecView and write the vector into vtk file, is this done in parallel or serial? Does each processor send the data to processor 0 and let it write to the disk? Is it possible to let each processor write its own portion and merge them together later? Thank you. Xiangdong On Tue, Mar 4, 2014 at 2:06 PM, Matthew Knepley wrote: > On Tue, Mar 4, 2014 at 12:58 PM, Xiangdong wrote: > >> Thanks, Matt. >> >> Given that the FILE_MODE_APPEND is not supported for VTK format, is it >> possible to write two vectors into the same VTK data file? >> > > Yes, just call VecView() twice. > > Matt > > >> Thank you. >> >> Xiangdong >> >> >> On Tue, Mar 4, 2014 at 1:52 PM, Matthew Knepley wrote: >> >>> On Tue, Mar 4, 2014 at 12:31 PM, Xiangdong wrote: >>> >>>> What is the difference between PetscViewerSetType and >>>> PetscViewerSetFormat? It seems that the first one take argument like >>>> PETSCVIEWERVTK, while the second one takes PETSC_VIEWER_VTK_VTS. >>>> >>> >>> A viewer type is the object type, just like other PETSc object, e.g. >>> KSP. The format is a particular version >>> of that output. For example, ASCII viewer is a type, whereas Matlab is a >>> format. >>> >>> >>>> By the way, how can I find a full list of formats? Clearly, >>>> PETSC_VIEWER_VTK_VTS is not listed in online documentation for >>>> PetscViewerSetFormat >>>> >>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerSetFormat.html >>>> >>> >>> It is listed in the complete list: >>> >>> >>> http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Viewer/PetscViewerFormat.html >>> >>> Matt >>> >>> >>>> Thank you. >>>> >>>> Xiangdong >>>> >>>> >>>> On Tue, Mar 4, 2014 at 1:02 PM, Matthew Knepley wrote: >>>> >>>>> On Tue, Mar 4, 2014 at 11:44 AM, Xiangdong wrote: >>>>> >>>>>> Hello everyone, >>>>>> >>>>>> When I use PetsViewerVTKOpen to output vec in vtk format, is it in >>>>>> ASCII format or binary format? Are there any options to choose between them? >>>>>> >>>>> >>>>> It is determined by the format, e.g. PETSC_VIEWER_VTK_VTU. >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thank you. 
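Going back to the PetscSF question earlier in this digest, a tiny self-contained sketch of the root/leaf picture (error checking stripped; the Bcast/Reduce signatures are the ones from this era of petsc-dev, so check the man pages for your version). Every rank owns one root and one leaf, and every leaf points at rank 0's root, so that single root has many leaves while each leaf has exactly one root; Bcast moves root data out to the leaves and Reduce combines leaf data back onto the root:

#include <petscsf.h>

int main(int argc, char **argv)
{
  PetscSF     sf;
  PetscSFNode remote;
  PetscInt    rootdata[1], leafdata[1];
  PetscMPIInt rank;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  remote.rank  = 0;   /* every leaf references root 0 on rank 0 */
  remote.index = 0;
  PetscSFCreate(PETSC_COMM_WORLD, &sf);
  PetscSFSetGraph(sf, 1, 1, NULL, PETSC_OWN_POINTER, &remote, PETSC_COPY_VALUES);

  rootdata[0] = (rank == 0) ? 42 : 0;
  leafdata[0] = 0;
  PetscSFBcastBegin(sf, MPIU_INT, rootdata, leafdata);   /* roots -> leaves: every rank sees 42 */
  PetscSFBcastEnd(sf, MPIU_INT, rootdata, leafdata);

  leafdata[0] = 1;
  rootdata[0] = 0;
  PetscSFReduceBegin(sf, MPIU_INT, leafdata, rootdata, MPI_SUM); /* leaves -> root: rank 0 gets the comm size */
  PetscSFReduceEnd(sf, MPIU_INT, leafdata, rootdata, MPI_SUM);

  PetscSFDestroy(&sf);
  PetscFinalize();
  return 0;
}
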
>>>>>> >>>>>> Xiangdong >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 4 15:49:24 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 4 Mar 2014 15:49:24 -0600 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: On Tue, Mar 4, 2014 at 3:43 PM, Xiangdong wrote: > If I use VecView and write the vector into vtk file, is this done in > parallel or serial? Does each processor send the data to processor 0 and > let it write to the disk? > > Is it possible to let each processor write its own portion and merge them > together later? > I urge you strongly to avoid premature optimization and worrying. If you want to discuss details that is fine, but I do not think the fears expressed above are grounded in reality. Network bandwidth is at least as good as memory bandwidth, so sending to proc 0 is not a problem below 10,000 procs or so. You might think serializing the disk writes would be, but its very likely your machine is doing that anyway by having a small number (usually 1) of I/O nodes. For high core counts we can use MPI I/O which you can turn on with a command line argument. Matt > Thank you. > > Xiangdong > > > On Tue, Mar 4, 2014 at 2:06 PM, Matthew Knepley wrote: > >> On Tue, Mar 4, 2014 at 12:58 PM, Xiangdong wrote: >> >>> Thanks, Matt. >>> >>> Given that the FILE_MODE_APPEND is not supported for VTK format, is it >>> possible to write two vectors into the same VTK data file? >>> >> >> Yes, just call VecView() twice. >> >> Matt >> >> >>> Thank you. >>> >>> Xiangdong >>> >>> >>> On Tue, Mar 4, 2014 at 1:52 PM, Matthew Knepley wrote: >>> >>>> On Tue, Mar 4, 2014 at 12:31 PM, Xiangdong wrote: >>>> >>>>> What is the difference between PetscViewerSetType and >>>>> PetscViewerSetFormat? It seems that the first one take argument like >>>>> PETSCVIEWERVTK, while the second one takes PETSC_VIEWER_VTK_VTS. >>>>> >>>> >>>> A viewer type is the object type, just like other PETSc object, e.g. >>>> KSP. The format is a particular version >>>> of that output. For example, ASCII viewer is a type, whereas Matlab is >>>> a format. >>>> >>>> >>>>> By the way, how can I find a full list of formats? Clearly, >>>>> PETSC_VIEWER_VTK_VTS is not listed in online documentation for >>>>> PetscViewerSetFormat >>>>> >>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerSetFormat.html >>>>> >>>> >>>> It is listed in the complete list: >>>> >>>> >>>> http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Viewer/PetscViewerFormat.html >>>> >>>> Matt >>>> >>>> >>>>> Thank you. 
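Since MPI-IO comes up here: when the output only has to be read back by PETSc itself (checkpointing a huge Vec) rather than by ParaView, the native binary viewer is another option. A hedged sketch, with u and the file name as placeholders; the command-line switch for MPI-IO is, if I remember the option name correctly, -viewer_binary_mpiio, and it needs a PETSc build with MPI-IO support -- see the PetscViewerBinaryOpen man page for your version:

PetscViewer bin;
PetscViewerBinaryOpen(PETSC_COMM_WORLD, "u.dat", FILE_MODE_WRITE, &bin);
VecView(u, bin);              /* PETSc's own binary format, read back later with VecLoad() */
PetscViewerDestroy(&bin);
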
>>>>> >>>>> Xiangdong >>>>> >>>>> >>>>> On Tue, Mar 4, 2014 at 1:02 PM, Matthew Knepley wrote: >>>>> >>>>>> On Tue, Mar 4, 2014 at 11:44 AM, Xiangdong wrote: >>>>>> >>>>>>> Hello everyone, >>>>>>> >>>>>>> When I use PetsViewerVTKOpen to output vec in vtk format, is it in >>>>>>> ASCII format or binary format? Are there any options to choose between them? >>>>>>> >>>>>> >>>>>> It is determined by the format, e.g. PETSC_VIEWER_VTK_VTU. >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> Thank you. >>>>>>> >>>>>>> Xiangdong >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Mar 4 16:10:04 2014 From: jed at jedbrown.org (Jed Brown) Date: Tue, 04 Mar 2014 15:10:04 -0700 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: <87a9d5lio3.fsf@jedbrown.org> Xiangdong writes: > Is it possible to let each processor write its own portion and merge them > together later? This is *much* slower. You can use MPI-IO if you have IO performance problems at scale. Make sure you a binary-appended (.vtr, .vts, or .vtu) rather than the legacy ASCII format. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From epscodes at gmail.com Tue Mar 4 19:28:30 2014 From: epscodes at gmail.com (Xiangdong) Date: Tue, 4 Mar 2014 20:28:30 -0500 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: At this moment, when I write the solutions (two vectors with a few billion unknowns) to binary vts format, the writing rate is about 100MB/s from proc 0. Do you think writing the data into pvts format will enhance the performance? Can I use some PETSc functions to save the local vector as pvts format and view them as a global vector through paraview/visit? Thank you. Xiangdong On Tue, Mar 4, 2014 at 4:49 PM, Matthew Knepley wrote: > On Tue, Mar 4, 2014 at 3:43 PM, Xiangdong wrote: > >> If I use VecView and write the vector into vtk file, is this done in >> parallel or serial? Does each processor send the data to processor 0 and >> let it write to the disk? >> >> Is it possible to let each processor write its own portion and merge them >> together later? >> > > I urge you strongly to avoid premature optimization and worrying. > > If you want to discuss details that is fine, but I do not think the fears > expressed > above are grounded in reality. Network bandwidth is at least as good as > memory > bandwidth, so sending to proc 0 is not a problem below 10,000 procs or so. 
> You > might think serializing the disk writes would be, but its very likely your > machine > is doing that anyway by having a small number (usually 1) of I/O nodes. > For high > core counts we can use MPI I/O which you can turn on with a command line > argument. > > Matt > > >> Thank you. >> >> Xiangdong >> >> >> On Tue, Mar 4, 2014 at 2:06 PM, Matthew Knepley wrote: >> >>> On Tue, Mar 4, 2014 at 12:58 PM, Xiangdong wrote: >>> >>>> Thanks, Matt. >>>> >>>> Given that the FILE_MODE_APPEND is not supported for VTK format, is it >>>> possible to write two vectors into the same VTK data file? >>>> >>> >>> Yes, just call VecView() twice. >>> >>> Matt >>> >>> >>>> Thank you. >>>> >>>> Xiangdong >>>> >>>> >>>> On Tue, Mar 4, 2014 at 1:52 PM, Matthew Knepley wrote: >>>> >>>>> On Tue, Mar 4, 2014 at 12:31 PM, Xiangdong wrote: >>>>> >>>>>> What is the difference between PetscViewerSetType and >>>>>> PetscViewerSetFormat? It seems that the first one take argument like >>>>>> PETSCVIEWERVTK, while the second one takes PETSC_VIEWER_VTK_VTS. >>>>>> >>>>> >>>>> A viewer type is the object type, just like other PETSc object, e.g. >>>>> KSP. The format is a particular version >>>>> of that output. For example, ASCII viewer is a type, whereas Matlab is >>>>> a format. >>>>> >>>>> >>>>>> By the way, how can I find a full list of formats? Clearly, >>>>>> PETSC_VIEWER_VTK_VTS is not listed in online documentation for >>>>>> PetscViewerSetFormat >>>>>> >>>>>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerSetFormat.html >>>>>> >>>>> >>>>> It is listed in the complete list: >>>>> >>>>> >>>>> http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Viewer/PetscViewerFormat.html >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thank you. >>>>>> >>>>>> Xiangdong >>>>>> >>>>>> >>>>>> On Tue, Mar 4, 2014 at 1:02 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Tue, Mar 4, 2014 at 11:44 AM, Xiangdong wrote: >>>>>>> >>>>>>>> Hello everyone, >>>>>>>> >>>>>>>> When I use PetsViewerVTKOpen to output vec in vtk format, is it in >>>>>>>> ASCII format or binary format? Are there any options to choose between them? >>>>>>>> >>>>>>> >>>>>>> It is determined by the format, e.g. PETSC_VIEWER_VTK_VTU. >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> Thank you. >>>>>>>> >>>>>>>> Xiangdong >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Tue Mar 4 23:28:50 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 4 Mar 2014 23:28:50 -0600 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley wrote: > On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < > salazardetroya at gmail.com> wrote: > >> I can run it now, thanks. Although if I run it with valgrind 3.5.0 >> (should I update to the last version?) I get some memory leaks related with >> the function DMPlexCreateBoxMesh. >> > > I will check it out. > This is now fixed. Thanks for finding it Matt > Thanks, > > Matt > > >> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 -run_type >> test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >> ==9625== Memcheck, a memory error detector >> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. >> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info >> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 -bc_type >> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >> -dm_plex_print_fem 1 >> ==9625== >> Local function: >> Vec Object: 1 MPI processes >> type: seq >> 0 >> 0.25 >> 1 >> 0.25 >> 0.5 >> 1.25 >> 1 >> 1.25 >> 2 >> Initial guess >> Vec Object: 1 MPI processes >> type: seq >> 0.5 >> L_2 Error: 0.111111 >> Residual: >> Vec Object: 1 MPI processes >> type: seq >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> 0 >> Initial Residual >> Vec Object: 1 MPI processes >> type: seq >> 0 >> L_2 Residual: 0 >> Jacobian: >> Mat Object: 1 MPI processes >> type: seqaij >> row 0: (0, 4) >> Residual: >> Vec Object: 1 MPI processes >> type: seq >> 0 >> 0 >> 0 >> 0 >> -2 >> 0 >> 0 >> 0 >> 0 >> Au - b = Au + F(0) >> Vec Object: 1 MPI processes >> type: seq >> 0 >> Linear L_2 Residual: 0 >> ==9625== >> ==9625== HEAP SUMMARY: >> ==9625== in use at exit: 288 bytes in 3 blocks >> ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 bytes >> allocated >> ==9625== >> ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 of 3 >> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >> ==9625== by 0x408D3D: main (ex12.c:651) >> ==9625== >> ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 of 3 >> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >> ==9625== by 0x56B5EE4: DMPlexGenerate 
(plex.c:4503) >> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >> ==9625== by 0x408D3D: main (ex12.c:651) >> ==9625== >> ==9625== 144 bytes in 1 blocks are definitely lost in loss record 3 of 3 >> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >> ==9625== by 0x408D3D: main (ex12.c:651) >> ==9625== >> ==9625== LEAK SUMMARY: >> ==9625== definitely lost: 288 bytes in 3 blocks >> ==9625== indirectly lost: 0 bytes in 0 blocks >> ==9625== possibly lost: 0 bytes in 0 blocks >> ==9625== still reachable: 0 bytes in 0 blocks >> ==9625== suppressed: 0 bytes in 0 blocks >> ==9625== >> ==9625== For counts of detected and suppressed errors, rerun with: -v >> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 from 6) >> >> >> >> >> >> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley wrote: >> >>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>> salazardetroya at gmail.com> wrote: >>> >>>> You are welcome, thanks for your help. >>>> >>> >>> Okay, I have rebuilt completely clean, and ex12 runs for me. Can you try >>> again after pulling? >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley wrote: >>>> >>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>>>> salazardetroya at gmail.com> wrote: >>>>> >>>>>> Thanks. This is what I get. >>>>>> >>>>> >>>>> Okay, this was broken by a new push to master/next in the last few >>>>> days. I have pushed a fix, >>>>> however next is currently broken due to a failure to check in a file. >>>>> This should be fixed shortly, >>>>> and then ex12 will work. I will mail you when its ready. >>>>> >>>>> Thanks for finding this, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> (gdb) cont >>>>>> Continuing. >>>>>> >>>>>> Program received signal SIGSEGV, Segmentation fault. 
>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>> X=0x168b5b0, >>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>> user=0x7fd6811be509) >>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >>>>>> (gdb) where >>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>> X=0x168b5b0, >>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>> user=0x7fd6811be509) >>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>> (snes=0x14e9450, >>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) >>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>>>> X=0x1622ad0, >>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>>>>>> salazardetroya at gmail.com> wrote: >>>>>>> >>>>>>>> This is what I get at gdb when I type 'where'. >>>>>>>> >>>>>>> >>>>>>> You have to type 'cont', and then when it fails you type 'where'. >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>> /lib64/libc.so.6 >>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial >>>>>>>> finite elements.\nWe solve the Poisson problem in a rectangular\ndomain, >>>>>>>> using a parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>> >>>>>>>> The rest of the gdb output is attached. I am a bit ignorant with >>>>>>>> gdb, I apologize for that. >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley >>>>>>> > wrote: >>>>>>>> >>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> Thanks for your response. Sorry I did not have the "next" >>>>>>>>>> version, but the "master" version. I still have an error though. 
I followed >>>>>>>>>> the steps given here (https://bitbucket.org/petsc/petsc/wiki/Home) >>>>>>>>>> to obtain the next version, I configured petsc as above and ran ex12 as >>>>>>>>>> above as well, getting this error: >>>>>>>>>> >>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>> Local function: >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> 0.25 >>>>>>>>>> 1 >>>>>>>>>> 0.25 >>>>>>>>>> 0.5 >>>>>>>>>> 1.25 >>>>>>>>>> 1 >>>>>>>>>> 1.25 >>>>>>>>>> 2 >>>>>>>>>> Initial guess >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0.5 >>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>> Residual: >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> Initial Residual >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> L_2 Residual: 0 >>>>>>>>>> >>>>>>>>> >>>>>>>>> Okay, now run with -start_in_debugger, and give me a stack trace >>>>>>>>> using 'where'. >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>> -on_error_attach_debugger >>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>> memory corruption errors >>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>> ------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>>>> available, >>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>>>> function >>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>>>>>> shooting. 
>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>> salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown >>>>>>>>>> file >>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>> [unset]: aborting job: >>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> Hi everybody >>>>>>>>>>>> >>>>>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>>>>> specifically run it with the command options: >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> We need to start narrowing down differences, because it runs for >>>>>>>>>>> me and our nightly tests. So, first can >>>>>>>>>>> you confirm that you are using the latest 'next' branch? >>>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>> >>>>>>>>>>>> And I get this output >>>>>>>>>>>> >>>>>>>>>>>> Local function: >>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>> type: seq >>>>>>>>>>>> 0 >>>>>>>>>>>> 1 >>>>>>>>>>>> 1 >>>>>>>>>>>> 2 >>>>>>>>>>>> 1 >>>>>>>>>>>> 2 >>>>>>>>>>>> 2 >>>>>>>>>>>> 3 >>>>>>>>>>>> Initial guess >>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>> type: seq >>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>> Residual: >>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>> type: seq >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> Initial Residual >>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>> type: seq >>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>> memory corruption errors >>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>>> not available, >>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>>> the function >>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>> shooting. >>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>> salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown file >>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Probably my problems could be on my configuration. I attach the >>>>>>>>>>>> configure.log. I ran ./configure like this >>>>>>>>>>>> >>>>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>>>> >>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> If >>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>> >>>>>>>>>>>>>> is for serial, any chance we can get the options to run in >>>>>>>>>>>>>> parallel? 
>>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Regards >>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> These examples all seem to run excepting the following >>>>>>>>>>>>>>>>> command, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> This is a build problem, but it should affect all the >>>>>>>>>>>>>>>> runs. Is this reproducible? Can you send configure.log? MKL is the worst. >>>>>>>>>>>>>>>> If this >>>>>>>>>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> --with-precision=single # I would not use this unless you >>>>>>>>>>>>>>> are doing something special, like CUDA >>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching to C, >>>>>>>>>>>>>>> the build is much faster >>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. I >>>>>>>>>>>>>>> would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>> or you can try and find that library. 
>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Hi, This is the next error message after configuring >>>>>>>>>>>>>>>>>> and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try >>>>>>>>>>>>>>>>> running >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating >>>>>>>>>>>>>>>>>> Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>>>>> below >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. 
>>>>>>>>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>>>>>>>> error messages below: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much >>>>>>>>>>>>>>>>>> easier to have triangle create them. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package >>>>>>>>>>>>>>>>>>> support. >>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 2014 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - >>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM 
>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. >>>>>>>>>>>>>>>>>>>> when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or >>>>>>>>>>>>>>>>>>>> directory >>>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to work >>>>>>>>>>>>>>>>>>> for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and >>>>>>>>>>>>>>>>>>>>>> just did a 'make ex12.c' with the following error if this helps? 
: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR (/home/mjonesa/PETSc/petsc-3. >>>>>>>>>>>>>>>>>>>>> 4.3) or PETSC_ARCH (linux-gnu-cxx-debug) environment >>>>>>>>>>>>>>>>>>>>> variables >>>>>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want to >>>>>>>>>>>>>>>>>>>>> use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py dim >>>>>>>>>>>>>>>>>>>>>> order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import default_simplex >>>>>>>>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating the >>>>>>>>>>>>>>>>>>>>>> header file (its now all handled in C). 
I thought >>>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere (including the >>>>>>>>>>>>>>>>>>>>>> latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the >>>>>>>>>>>>>>>>>>>>>> old docs? >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Matt
> What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead.
> -- Norbert Wiener > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Tue Mar 4 22:59:04 2014 From: jed at jedbrown.org (Jed Brown) Date: Tue, 04 Mar 2014 21:59:04 -0700 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: Message-ID: <87zjl5i6lj.fsf@jedbrown.org> Xiangdong writes: > At this moment, when I write the solutions (two vectors with a few billion > unknowns) to binary vts format, the writing rate is about 100MB/s from proc > 0. How many procs are you writing from and what filesystem do you have? How much faster is it if you write the PETSc binary format using MPI-IO (-viewer_binary_mpiio)? If that is lots faster for you, I can fairly simply add support for writing VTS that way. You'll still pay when you visualize, however. > Do you think writing the data into pvts format will enhance the > performance? No. PVTS is a crappy format designed by people that evidently did not understand parallel IO performance. But all the VTK formats are crappy if you really care about performance. With the VTK formats, if you write it efficiently, it will still be a bottleneck to read. We support them because they are easy. Use HDF5 if you want something sensible. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From epscodes at gmail.com Wed Mar 5 12:41:04 2014 From: epscodes at gmail.com (Xiangdong) Date: Wed, 5 Mar 2014 13:41:04 -0500 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: <87zjl5i6lj.fsf@jedbrown.org> References: <87zjl5i6lj.fsf@jedbrown.org> Message-ID: When using the option -viewer_binary_mpiio, I see PETSC Error: DMDAArrayMPIIO() line 532 in /src/dm/impls/da/gr2.c. However, when I run it without mpiio, it outputs the binary data fine. What options am I missing here? Thank you. xiangdong On Tue, Mar 4, 2014 at 11:59 PM, Jed Brown wrote: > Xiangdong writes: > > > At this moment, when I write the solutions (two vectors with a few > billion > > unknowns) to binary vts format, the writing rate is about 100MB/s from > proc > > 0. > > How many procs are you writing from and what filesystem do you have? > How much faster is it if you write the PETSc binary format using MPI-IO > (-viewer_binary_mpiio)? If that is lots faster for you, I can fairly > simply add support for writing VTS that way. You'll still pay when you > visualize, however. > > > Do you think writing the data into pvts format will enhance the > > performance? > > No. PVTS is a crappy format designed by people that evidently did not > understand parallel IO performance. But all the VTK formats are crappy > if you really care about performance. With the VTK formats, if you > write it efficiently, it will still be a bottleneck to read. We support > them because they are easy. Use HDF5 if you want something sensible. > -------------- next part -------------- An HTML attachment was scrubbed... 
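A minimal sketch of driving PETSc's native binary writer from code, for comparison with the .vts path (the Vec X and the file name solution.bin are placeholders, and the surrounding error handling is abbreviated). With the viewer configured from the options database, -viewer_binary_mpiio should then select the MPI-IO code path at run time:

  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscViewerCreate(PETSC_COMM_WORLD,&viewer);CHKERRQ(ierr);
  ierr = PetscViewerSetType(viewer,PETSCVIEWERBINARY);CHKERRQ(ierr);
  ierr = PetscViewerFileSetMode(viewer,FILE_MODE_WRITE);CHKERRQ(ierr);
  ierr = PetscViewerSetFromOptions(viewer);CHKERRQ(ierr);  /* picks up options such as -viewer_binary_mpiio */
  ierr = PetscViewerFileSetName(viewer,"solution.bin");CHKERRQ(ierr);
  ierr = VecView(X,viewer);CHKERRQ(ierr);                  /* X: the DMDA global vector being written */
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

Timing this against the VTS writer (for example with -log_summary) is the comparison being asked for above.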
URL: From jed at jedbrown.org Wed Mar 5 12:47:43 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 05 Mar 2014 11:47:43 -0700 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: <87zjl5i6lj.fsf@jedbrown.org> Message-ID: <877g88iisw.fsf@jedbrown.org> Xiangdong writes: > When using the option -viewer_binary_mpiio, I see PETSC Error: > DMDAArrayMPIIO() line 532 in /src/dm/impls/da/gr2.c. NEVER truncate an error message. (We have to remind people of this every day. Always send the whole thing.) MPI-IO works for me and in our tests, so we should figure out what happened. > However, when I run it without mpiio, it outputs the binary data fine. What > options am I missing here? Time it so we can compare output performance using the different methods. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From epscodes at gmail.com Wed Mar 5 12:54:20 2014 From: epscodes at gmail.com (Xiangdong) Date: Wed, 5 Mar 2014 13:54:20 -0500 Subject: [petsc-users] DMDA questions In-Reply-To: <87d2i2mzz9.fsf@jedbrown.org> References: <87zjlfw4al.fsf@jedbrown.org> <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> <877g8bnq20.fsf@jedbrown.org> <87d2i2mzz9.fsf@jedbrown.org> Message-ID: If I first define a DM, and obtain the solution vector on this DM (through non multigrid method). However, given that this DM is so fine, I may only need to save/view the solution on a coarsen grid. Is there some functions available in petsc for this instead of writing my own vec scatter? Thank you. Xiangdong On Mon, Mar 3, 2014 at 9:58 PM, Jed Brown wrote: > Xiangdong writes: > > Could you please expand it a little more on using DMCoarsenHookAdd to > > restrict a fine vector on a coarse grid? The only example I can find is > > ex48 in snes. It is not clear how the coarsen vector are generated from > > that example. > > Look at the use in src/ts/impls/implicit/theta/theta.c > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Mar 5 12:56:00 2014 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 5 Mar 2014 12:56:00 -0600 Subject: [petsc-users] DMDA questions In-Reply-To: References: <87zjlfw4al.fsf@jedbrown.org> <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> <877g8bnq20.fsf@jedbrown.org> <87d2i2mzz9.fsf@jedbrown.org> Message-ID: On Wed, Mar 5, 2014 at 12:54 PM, Xiangdong wrote: > If I first define a DM, and obtain the solution vector on this DM (through > non multigrid method). However, given that this DM is so fine, I may only > need to save/view the solution on a coarsen grid. Is there some functions > available in petsc for this instead of writing my own vec scatter? > Interpolation between general DMs does not make sense because we do not prescribe anything about the function representation. DMDA has interpolation routines, and so does DMPlex. Matt > Thank you. > > Xiangdong > > > On Mon, Mar 3, 2014 at 9:58 PM, Jed Brown wrote: > >> Xiangdong writes: >> > Could you please expand it a little more on using DMCoarsenHookAdd to >> > restrict a fine vector on a coarse grid? The only example I can find is >> > ex48 in snes. 
It is not clear how the coarsen vector are generated from >> > that example. >> >> Look at the use in src/ts/impls/implicit/theta/theta.c >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Mar 5 12:57:43 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 05 Mar 2014 11:57:43 -0700 Subject: [petsc-users] DMDA questions In-Reply-To: References: <87zjlfw4al.fsf@jedbrown.org> <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> <877g8bnq20.fsf@je dbrown.o rg> <87d2i2mzz9.fsf@jedbrown.org> Message-ID: <874n3ciic8.fsf@jedbrown.org> Xiangdong writes: > If I first define a DM, and obtain the solution vector on this DM (through > non multigrid method). However, given that this DM is so fine, I may only > need to save/view the solution on a coarsen grid. Is there some functions > available in petsc for this instead of writing my own vec scatter? Where did you try searching and what terms did you try before asking? http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCoarsen.html http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCreateInterpolation.html http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCreateInjection.html -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From epscodes at gmail.com Wed Mar 5 13:12:52 2014 From: epscodes at gmail.com (Xiangdong) Date: Wed, 5 Mar 2014 14:12:52 -0500 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: <877g88iisw.fsf@jedbrown.org> References: <87zjl5i6lj.fsf@jedbrown.org> <877g88iisw.fsf@jedbrown.org> Message-ID: I got the following error messages on each processor when use mpiio: [5]PETSC ERROR: DMDAArrayMPIIO() line 532 in MyLocal/petsc/petsc-3.4.3/src/dm/impls/da/gr2.c [5]PETSC ERROR: VecView_MPI_DA() line 605 in MyLocal/petsc/petsc-3.4.3/src/dm/impls/da/gr2.c [5]PETSC ERROR: VecView() line 717 in MyLocal/petsc/petsc-3.4.3/src/vec/vec/interface/vector.c The program does not crash, but skip writing the binary output. Thank you. Xiangdong On Wed, Mar 5, 2014 at 1:47 PM, Jed Brown wrote: > Xiangdong writes: > > > When using the option -viewer_binary_mpiio, I see PETSC Error: > > DMDAArrayMPIIO() line 532 in /src/dm/impls/da/gr2.c. > > NEVER truncate an error message. (We have to remind people of this > every day. Always send the whole thing.) > > MPI-IO works for me and in our tests, so we should figure out what > happened. > > > However, when I run it without mpiio, it outputs the binary data fine. > What > > options am I missing here? > > Time it so we can compare output performance using the different methods. -------------- next part -------------- An HTML attachment was scrubbed... 
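A side note on the "does not crash, but skips the output" symptom: if the return codes are not checked, a failing viewer call is silently ignored. A minimal sketch of the usual pattern, assuming the write lives in a helper that returns PetscErrorCode (the function name WriteSolution, the Vec X and the file name are placeholders):

  #include <petscvec.h>

  PetscErrorCode WriteSolution(Vec X)
  {
    PetscViewer    viewer;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"solution.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
    ierr = VecView(X,viewer);CHKERRQ(ierr);   /* CHKERRQ propagates the error up the stack instead of dropping it */
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }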
URL: From epscodes at gmail.com Wed Mar 5 13:16:33 2014 From: epscodes at gmail.com (Xiangdong) Date: Wed, 5 Mar 2014 14:16:33 -0500 Subject: [petsc-users] DMDA questions In-Reply-To: <874n3ciic8.fsf@jedbrown.org> References: <87zjlfw4al.fsf@jedbrown.org> <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> <87d2i2mzz9.fsf@jedbrown.org> <874n3ciic8.fsf@jedbrown.org> Message-ID: Yes, I am using DMDA. I looked at the DM index page with keywords *coarsen*, but did not realize that *interpolation* is the key word to search. Thanks pointing out. http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/index.html Xiangdong On Wed, Mar 5, 2014 at 1:57 PM, Jed Brown wrote: > Xiangdong writes: > > > If I first define a DM, and obtain the solution vector on this DM > (through > > non multigrid method). However, given that this DM is so fine, I may > only > > need to save/view the solution on a coarsen grid. Is there some functions > > available in petsc for this instead of writing my own vec scatter? > > Where did you try searching and what terms did you try before asking? > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCoarsen.html > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCreateInterpolation.html > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCreateInjection.html > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Mar 5 13:21:54 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 05 Mar 2014 12:21:54 -0700 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: <87zjl5i6lj.fsf@jedbrown.org> <877g88iisw.fsf@jedbrown.org> Message-ID: <871tygih7x.fsf@jedbrown.org> Xiangdong writes: > I got the following error messages on each processor when use mpiio: > > [5]PETSC ERROR: DMDAArrayMPIIO() line 532 in > MyLocal/petsc/petsc-3.4.3/src/dm/impls/da/gr2.c Looks like this line is failing. Please either use a debugger or create a representative test case so we can debug. ierr = MPI_Type_create_subarray(dd->dim+1,gsizes,lsizes,lstarts,MPI_ORDER_FORTRAN,MPIU_SCALAR,&view);CHKERRQ(ierr); Does this work? petsc/src/ksp/ksp/examples/tutorials$ mpiexec -n 4 ./ex45 -da_refine 3 -ksp_monitor -pc_type mg -ksp_view_solution binary:foo -viewer_binary_mpiio > [5]PETSC ERROR: VecView_MPI_DA() line 605 in > MyLocal/petsc/petsc-3.4.3/src/dm/impls/da/gr2.c > [5]PETSC ERROR: VecView() line 717 in > MyLocal/petsc/petsc-3.4.3/src/vec/vec/interface/vector.c > > The program does not crash, but skip writing the binary output. That's because you weren't checking error codes. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From jed at jedbrown.org Wed Mar 5 13:24:59 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 05 Mar 2014 12:24:59 -0700 Subject: [petsc-users] DMDA questions In-Reply-To: References: <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> <87d2i2mzz9.fsf@jedbrown.org> <874n3ciic8.fsf@jedbrown.org> Message-ID: <87y50oh2ic.fsf@jedbrown.org> Xiangdong writes: > Yes, I am using DMDA. I looked at the DM index page with keywords > *coarsen*, You searched the DM page for "coarsen" and did not find DMCoarsen? 
> but did not realize that *interpolation* is the key word to search. Thanks > pointing out. > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/index.html -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From epscodes at gmail.com Wed Mar 5 13:33:52 2014 From: epscodes at gmail.com (Xiangdong) Date: Wed, 5 Mar 2014 14:33:52 -0500 Subject: [petsc-users] DMDA questions In-Reply-To: <87y50oh2ic.fsf@jedbrown.org> References: <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> <87d2i2mzz9.fsf@jedbrown.org> <874n3ciic8.fsf@jedbrown.org> <87y50oh2ic.fsf@jedbrown.org> Message-ID: I did find the coarsen, but skip the interpolation. I passed the DMCoarsen at first glance because it is the function between two DM objects and I did not see any mat or vec as input or output in that function. Anyway, DMCreateInterpolation is the right one to use. Thanks. Xiangdong On Wed, Mar 5, 2014 at 2:24 PM, Jed Brown wrote: > Xiangdong writes: > > > Yes, I am using DMDA. I looked at the DM index page with keywords > > *coarsen*, > > You searched the DM page for "coarsen" and did not find DMCoarsen? > > > but did not realize that *interpolation* is the key word to search. > Thanks > > pointing out. > > > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/index.html > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Mar 5 13:40:26 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 05 Mar 2014 12:40:26 -0700 Subject: [petsc-users] DMDA questions In-Reply-To: References: <87r46rw0ag.fsf@jedbrown.org> <87eh2rvvd0.fsf@jedbrown.org> <87ha7mvqvy.fsf@jedbrown.org> <87bnxuvqb5.fsf@jedbrown.org> <2097886F-7026-4DD4-A1BD-027EDE0FD3DA@mcs.anl.gov> <87d2i2mzz9.fsf@jedbrown.org> <874n3ciic8.fsf@je dbrown.o rg> <87y50oh2ic.fsf@jedbrown.org> Message-ID: <87vbvsh1sl.fsf@jedbrown.org> Xiangdong writes: > I did find the coarsen, but skip the interpolation. It is cross-linked on the man page. http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCoarsen.html > I passed the DMCoarsen at first glance because it is the function > between two DM objects and I did not see any mat or vec as input or > output in that function. I'm asking because we have to figure out what is wrong with the documentation. Some people will give up without asking and we wouldn't have any time for our real jobs if everyone asked questions that should be answered by the docs. So I want to understand how best to improve the docs. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From epscodes at gmail.com Wed Mar 5 14:25:24 2014 From: epscodes at gmail.com (Xiangdong) Date: Wed, 5 Mar 2014 15:25:24 -0500 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: <871tygih7x.fsf@jedbrown.org> References: <87zjl5i6lj.fsf@jedbrown.org> <877g88iisw.fsf@jedbrown.org> <871tygih7x.fsf@jedbrown.org> Message-ID: On Wed, Mar 5, 2014 at 2:21 PM, Jed Brown wrote: > Xiangdong writes: > > > I got the following error messages on each processor when use mpiio: > > > > [5]PETSC ERROR: DMDAArrayMPIIO() line 532 in > > MyLocal/petsc/petsc-3.4.3/src/dm/impls/da/gr2.c > > Looks like this line is failing. 
Please either use a debugger or create > a representative test case so we can debug. > > ierr = > MPI_Type_create_subarray(dd->dim+1,gsizes,lsizes,lstarts,MPI_ORDER_FORTRAN,MPIU_SCALAR,&view);CHKERRQ(ierr); > > > > Does this work? > > petsc/src/ksp/ksp/examples/tutorials$ mpiexec -n 4 ./ex45 -da_refine 3 > -ksp_monitor -pc_type mg -ksp_view_solution binary:foo -viewer_binary_mpiio > This works fine. However, when I add the option -da_grid_x 8, the same error messages pop up. It is strange that -da_grid_x 5 works fine. Moreover, if we set M=N=P=8, it also works fine. Thank you. Xiangdong > > > [5]PETSC ERROR: VecView_MPI_DA() line 605 in > > MyLocal/petsc/petsc-3.4.3/src/dm/impls/da/gr2.c > > [5]PETSC ERROR: VecView() line 717 in > > MyLocal/petsc/petsc-3.4.3/src/vec/vec/interface/vector.c > > > > The program does not crash, but skip writing the binary output. > > That's because you weren't checking error codes. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mlohry at gmail.com Thu Mar 6 08:05:44 2014 From: mlohry at gmail.com (Mark Lohry) Date: Thu, 06 Mar 2014 09:05:44 -0500 Subject: [petsc-users] DMDA Global vs Local indexing Message-ID: <531880B8.6060900@gmail.com> I'm using DMDAs for managing my DOF data on structured grids, so processes have access to local array chunks from i=gxs; i References: <531880B8.6060900@gmail.com> Message-ID: <027788CB-DDA2-41C0-AD45-42B6A2999584@mcs.anl.gov> On Mar 6, 2014, at 8:05 AM, Mark Lohry wrote: > I'm using DMDAs for managing my DOF data on structured grids, so processes have access to local array chunks from i=gxs; i References: <531880B8.6060900@gmail.com> <027788CB-DDA2-41C0-AD45-42B6A2999584@mcs.anl.gov> Message-ID: <531899C3.7000807@gmail.com> Look at the source for DMDAVecGetArray() it calls either VecGetArray1d, VecGetArray2d, VecGetArray3d. You can call the VecGetArrayNd directly setting the local start and stop you want. Thanks, I'll look into this. As a PETSc developer, of course, I would recommend keeping your local/temporary data also in Vecs and using the DMDAVecGetArray() for access to those also and having all code written in the ?local patch style? with loops i=gxs; i On Mar 6, 2014, at 8:05 AM, Mark Lohry wrote: > >> I'm using DMDAs for managing my DOF data on structured grids, so processes have access to local array chunks from i=gxs; i Mark, > > Look at the source for DMDAVecGetArray() it calls either VecGetArray1d, VecGetArray2d, VecGetArray3d. You can call the VecGetArrayNd directly setting the local start and stop you want. > > Barry > > As a PETSc developer, of course, I would recommend keeping your local/temporary data also in Vecs and using the DMDAVecGetArray() for access to those also and having all code written in the ?local patch style? with loops i=gxs; i -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Thu Mar 6 10:01:29 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 06 Mar 2014 09:01:29 -0700 Subject: [petsc-users] DMDA Global vs Local indexing In-Reply-To: <531899C3.7000807@gmail.com> References: <531880B8.6060900@gmail.com> <027788CB-DDA2-41C0-AD45-42B6A2999584@mcs.anl.gov> <531899C3.7000807@gmail.com> Message-ID: <87siqvfh9i.fsf@jedbrown.org> Mark Lohry writes: > As a PETSc developer, of course, I would recommend keeping your > local/temporary data also in Vecs and using the > DMDAVecGetArray() for access to those also and having all code > written in the ?local patch style? 
with loops i=gxs; i I think the code is clearer and easier to reason about than > having each process from 0 to vxm etc. > > Yeah, I definitely see the attraction and I may ultimately go that > route. As a non-PETSc developer however, it seems preferable to > absolutely minimize the reliance on PETSc data management for code > re-use in a non-PETSc application going forward. I'd be eager to hear > from other devs on how they approach this. Note that using a DMDA for this auxiliary data means you can restrict and interpolate it through a hierarchy, as well as visualize it. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From lu_qin_2000 at yahoo.com Thu Mar 6 10:23:45 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Thu, 6 Mar 2014 08:23:45 -0800 (PST) Subject: [petsc-users] Ouput matrix and rhs to files Message-ID: <1394123025.42811.YahooMailNeo@web160206.mail.bf1.yahoo.com> Hello, ? Are there?any?PETSc subroutines to output the matrix and right-hand-side to files in CRS for MATLAB format? There seems to be PetscViewerASCIIOpen subroutine, does it also output rhs? Is there any example code for this? ? Many thanks, Qin -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Mar 6 10:29:53 2014 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 6 Mar 2014 10:29:53 -0600 Subject: [petsc-users] Ouput matrix and rhs to files In-Reply-To: <1394123025.42811.YahooMailNeo@web160206.mail.bf1.yahoo.com> References: <1394123025.42811.YahooMailNeo@web160206.mail.bf1.yahoo.com> Message-ID: On Thu, Mar 6, 2014 at 10:23 AM, Qin Lu wrote: > Hello, > > Are there any PETSc subroutines to output the matrix and right-hand-side > to files in CRS for MATLAB format? There seems to be PetscViewerASCIIOpen > subroutine, does it also output rhs? Is there any example code for this? > You can use the ASCII viewer with the MATLAB format, however we recommend you just use the binary viewer and bin/matlab/PetscBInaryRead.m Thanks, Matt > > Many thanks, > Qin > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From baagaard at usgs.gov Thu Mar 6 12:00:48 2014 From: baagaard at usgs.gov (Brad Aagaard) Date: Thu, 6 Mar 2014 10:00:48 -0800 Subject: [petsc-users] Viewing DMPlex mesh as VTK file using DMViewFromOptions() Message-ID: <5318B7D0.8070601@usgs.gov> I am trying to view just a DMPlex mesh as a VTK file using DMViewFromOptions(). Using ":mesh.view:ascii_info_detail" as the format dumps an ASCII representation to an ASCII file as desired. However, using "vtk:mesh.vtk:ascii_vtk" does not write the mesh. My guess is that the VTK viewer isn't writing because I don't view a field. Is there a way to force the VTK viewer to write the mesh without viewing a field? 
Thanks, Brad From bsmith at mcs.anl.gov Thu Mar 6 14:36:17 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 6 Mar 2014 14:36:17 -0600 Subject: [petsc-users] DMDA Global vs Local indexing In-Reply-To: <87siqvfh9i.fsf@jedbrown.org> References: <531880B8.6060900@gmail.com> <027788CB-DDA2-41C0-AD45-42B6A2999584@mcs.anl.gov> <531899C3.7000807@gmail.com> <87siqvfh9i.fsf@jedbrown.org> Message-ID: On Mar 6, 2014, at 10:01 AM, Jed Brown wrote: > Mark Lohry writes: >> As a PETSc developer, of course, I would recommend keeping your >> local/temporary data also in Vecs and using the >> DMDAVecGetArray() for access to those also and having all code >> written in the ?local patch style? with loops i=gxs; i> I think the code is clearer and easier to reason about than >> having each process from 0 to vxm etc. >> >> Yeah, I definitely see the attraction and I may ultimately go that >> route. As a non-PETSc developer however, it seems preferable to >> absolutely minimize the reliance on PETSc data management for code >> re-use in a non-PETSc application going forward. Sure, but you can still use the global patch model i=xs to xe etc with or without tight coupling to PETSc. PETSc makes the global patch model easy, it might be harder to do without PETSc. Barry >> I'd be eager to hear >> from other devs on how they approach this. > > Note that using a DMDA for this auxiliary data means you can restrict > and interpolate it through a hierarchy, as well as visualize it. From mirzadeh at gmail.com Thu Mar 6 17:24:33 2014 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Thu, 6 Mar 2014 15:24:33 -0800 Subject: [petsc-users] having issues with nullspace Message-ID: Hi guys, I have a discretization of Poisson equation with Neumann bc for embedded boundary grids in such a way that that nullspace is not the usual constant vector. Instead the nullspace is constant in the domain of interest and zero elsewhere. I compute this nullspace myself and have checked it against MATLAB by dumping the matrix and computing the nullspace explicitly using null function -- they match and there is only this single vector. Then I take this calculated vector and subtract it off the matrix and rhs. However, I am having convergence issues. 
For instance this is the output of ksp_monitor_true_residual for one particular run: 0 KSP preconditioned resid norm 3.033840960250e+02 true resid norm 2.332886580745e-01 ||r(i)||/||b|| 1.000000000000e+00 1 KSP preconditioned resid norm 1.018974811826e+01 true resid norm 1.941629896918e-02 ||r(i)||/||b|| 8.322864527335e-02 2 KSP preconditioned resid norm 5.450493684941e-02 true resid norm 1.029339589324e-02 ||r(i)||/||b|| 4.412300185615e-02 3 KSP preconditioned resid norm 3.944064039516e-02 true resid norm 1.030277925024e-02 ||r(i)||/||b|| 4.416322394443e-02 4 KSP preconditioned resid norm 6.286181172600e-05 true resid norm 1.030243055045e-02 ||r(i)||/||b|| 4.416172923059e-02 5 KSP preconditioned resid norm 4.349133658643e-06 true resid norm 1.030239080406e-02 ||r(i)||/||b|| 4.416155885630e-02 6 KSP preconditioned resid norm 9.279429568232e-08 true resid norm 1.030239169298e-02 ||r(i)||/||b|| 4.416156266666e-02 7 KSP preconditioned resid norm 3.032522248740e-09 true resid norm 1.030239175066e-02 ||r(i)||/||b|| 4.416156291393e-02 8 KSP preconditioned resid norm 6.533747246875e-09 true resid norm 1.030239175718e-02 ||r(i)||/||b|| 4.416156294184e-02 9 KSP preconditioned resid norm 6.083185162500e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292220e-02 10 KSP preconditioned resid norm 5.510319622225e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 11 KSP preconditioned resid norm 5.456758524534e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 12 KSP preconditioned resid norm 5.456756081783e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 13 KSP preconditioned resid norm 5.456755930952e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 14 KSP preconditioned resid norm 5.456755930949e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 15 KSP preconditioned resid norm 5.456755930949e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 As you can see, the true residual is quite large and moreover it does not reduce beyond a certain point. This is using hypre as preconditioner, but the situation is equally bad with several other preconditioner (ilu, sor, jacobi, or even none). As for the solution itself, the error has poor to none convergence under grid refinement. All this suggests that the linear system is not converging in my case. Do you have any idea/suggestions why this is happening and how I can avoid it? Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Mar 6 17:33:02 2014 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 6 Mar 2014 17:33:02 -0600 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: Message-ID: On Thu, Mar 6, 2014 at 5:24 PM, Mohammad Mirzadeh wrote: > Hi guys, > > I have a discretization of Poisson equation with Neumann bc for embedded > boundary grids in such a way that that nullspace is not the usual constant > vector. Instead the nullspace is constant in the domain of interest and > zero elsewhere. > > I compute this nullspace myself and have checked it against MATLAB by > dumping the matrix and computing the nullspace explicitly using null > function -- they match and there is only this single vector. Then I take > this calculated vector and subtract it off the matrix and rhs. > "subtract it off the matrix" does not make sense to me. Are you calling KSPSetNullSpace()? Matt > However, I am having convergence issues. 
For instance this is the output > of ksp_monitor_true_residual for one particular run: > > 0 KSP preconditioned resid norm 3.033840960250e+02 true resid norm 2.332886580745e-01 ||r(i)||/||b|| 1.000000000000e+00 > 1 KSP preconditioned resid norm 1.018974811826e+01 true resid norm 1.941629896918e-02 ||r(i)||/||b|| 8.322864527335e-02 > 2 KSP preconditioned resid norm 5.450493684941e-02 true resid norm 1.029339589324e-02 ||r(i)||/||b|| 4.412300185615e-02 > 3 KSP preconditioned resid norm 3.944064039516e-02 true resid norm 1.030277925024e-02 ||r(i)||/||b|| 4.416322394443e-02 > 4 KSP preconditioned resid norm 6.286181172600e-05 true resid norm 1.030243055045e-02 ||r(i)||/||b|| 4.416172923059e-02 > 5 KSP preconditioned resid norm 4.349133658643e-06 true resid norm 1.030239080406e-02 ||r(i)||/||b|| 4.416155885630e-02 > 6 KSP preconditioned resid norm 9.279429568232e-08 true resid norm 1.030239169298e-02 ||r(i)||/||b|| 4.416156266666e-02 > 7 KSP preconditioned resid norm 3.032522248740e-09 true resid norm 1.030239175066e-02 ||r(i)||/||b|| 4.416156291393e-02 > 8 KSP preconditioned resid norm 6.533747246875e-09 true resid norm 1.030239175718e-02 ||r(i)||/||b|| 4.416156294184e-02 > 9 KSP preconditioned resid norm 6.083185162500e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292220e-02 > 10 KSP preconditioned resid norm 5.510319622225e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 > 11 KSP preconditioned resid norm 5.456758524534e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 > 12 KSP preconditioned resid norm 5.456756081783e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 > 13 KSP preconditioned resid norm 5.456755930952e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 > 14 KSP preconditioned resid norm 5.456755930949e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 > 15 KSP preconditioned resid norm 5.456755930949e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 > > > As you can see, the true residual is quite large and moreover it does not reduce beyond a certain point. This is using hypre as preconditioner, but the situation is equally bad with several other preconditioner (ilu, sor, jacobi, or even none). As for the solution itself, the error has poor to none convergence under grid refinement. All this suggests that the linear system is not converging in my case. > > > Do you have any idea/suggestions why this is happening and how I can avoid it? > > > Thanks > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mirzadeh at gmail.com Thu Mar 6 17:38:39 2014 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Thu, 6 Mar 2014 15:38:39 -0800 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: Message-ID: Yes. 
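Independent of the preconditioner, it is worth confirming that the vector being attached really is a null vector of the assembled operator; MatNullSpaceTest() does exactly that check. A minimal sketch, assuming the candidate vector is called nullvec and the operator is A (both placeholders, error handling abbreviated):

  MatNullSpace   nullsp;
  PetscBool      isNull;
  PetscErrorCode ierr;

  ierr = VecNormalize(nullvec,NULL);CHKERRQ(ierr);   /* null space basis vectors must have unit norm */
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_FALSE,1,&nullvec,&nullsp);CHKERRQ(ierr);
  ierr = MatNullSpaceTest(nullsp,A,&isNull);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"null space test: %s\n",isNull ? "passed" : "FAILED");CHKERRQ(ierr);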
To be precise this is the set of functions I call: ierr = MatNullSpaceCreate(mpicomm, PETSC_FALSE, 1, &null_space, &A_null_space); CHKERRXX(ierr); ierr = MatSetNullSpace(A, A_null_space); CHKERRXX(ierr); ierr = KSPSetNullSpace(ksp, A_null_space); CHKERRXX(ierr); ierr = MatNullSpaceRemove(A_null_space, rhs_, NULL); CHKERRXX(ierr); ierr = KSPSolve(ksp, rhs_, solution); CHKERRXX(ierr); On Thu, Mar 6, 2014 at 3:33 PM, Matthew Knepley wrote: > On Thu, Mar 6, 2014 at 5:24 PM, Mohammad Mirzadeh wrote: > >> Hi guys, >> >> I have a discretization of Poisson equation with Neumann bc for embedded >> boundary grids in such a way that that nullspace is not the usual constant >> vector. Instead the nullspace is constant in the domain of interest and >> zero elsewhere. >> >> I compute this nullspace myself and have checked it against MATLAB by >> dumping the matrix and computing the nullspace explicitly using null >> function -- they match and there is only this single vector. Then I take >> this calculated vector and subtract it off the matrix and rhs. >> > > "subtract it off the matrix" does not make sense to me. Are you calling > KSPSetNullSpace()? > > Matt > > >> However, I am having convergence issues. For instance this is the output >> of ksp_monitor_true_residual for one particular run: >> >> 0 KSP preconditioned resid norm 3.033840960250e+02 true resid norm 2.332886580745e-01 ||r(i)||/||b|| 1.000000000000e+00 >> 1 KSP preconditioned resid norm 1.018974811826e+01 true resid norm 1.941629896918e-02 ||r(i)||/||b|| 8.322864527335e-02 >> 2 KSP preconditioned resid norm 5.450493684941e-02 true resid norm 1.029339589324e-02 ||r(i)||/||b|| 4.412300185615e-02 >> 3 KSP preconditioned resid norm 3.944064039516e-02 true resid norm 1.030277925024e-02 ||r(i)||/||b|| 4.416322394443e-02 >> 4 KSP preconditioned resid norm 6.286181172600e-05 true resid norm 1.030243055045e-02 ||r(i)||/||b|| 4.416172923059e-02 >> 5 KSP preconditioned resid norm 4.349133658643e-06 true resid norm 1.030239080406e-02 ||r(i)||/||b|| 4.416155885630e-02 >> 6 KSP preconditioned resid norm 9.279429568232e-08 true resid norm 1.030239169298e-02 ||r(i)||/||b|| 4.416156266666e-02 >> 7 KSP preconditioned resid norm 3.032522248740e-09 true resid norm 1.030239175066e-02 ||r(i)||/||b|| 4.416156291393e-02 >> 8 KSP preconditioned resid norm 6.533747246875e-09 true resid norm 1.030239175718e-02 ||r(i)||/||b|| 4.416156294184e-02 >> 9 KSP preconditioned resid norm 6.083185162500e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292220e-02 >> 10 KSP preconditioned resid norm 5.510319622225e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >> 11 KSP preconditioned resid norm 5.456758524534e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >> 12 KSP preconditioned resid norm 5.456756081783e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >> 13 KSP preconditioned resid norm 5.456755930952e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >> 14 KSP preconditioned resid norm 5.456755930949e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >> 15 KSP preconditioned resid norm 5.456755930949e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >> >> >> As you can see, the true residual is quite large and moreover it does not reduce beyond a certain point. This is using hypre as preconditioner, but the situation is equally bad with several other preconditioner (ilu, sor, jacobi, or even none). 
As for the solution itself, the error has poor to none convergence under grid refinement. All this suggests that the linear system is not converging in my case. >> >> >> Do you have any idea/suggestions why this is happening and how I can avoid it? >> >> >> Thanks >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Mar 6 17:42:00 2014 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 6 Mar 2014 17:42:00 -0600 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: Message-ID: On Thu, Mar 6, 2014 at 5:38 PM, Mohammad Mirzadeh wrote: > Yes. To be precise this is the set of functions I call: > > ierr = MatNullSpaceCreate(mpicomm, PETSC_FALSE, 1, &null_space, &A_null_space); CHKERRXX(ierr); > > ierr = MatSetNullSpace(A, A_null_space); CHKERRXX(ierr); > > Verify using http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatNullSpaceTest.html > ierr = KSPSetNullSpace(ksp, A_null_space); CHKERRXX(ierr); > > ierr = MatNullSpaceRemove(A_null_space, rhs_, NULL); CHKERRXX(ierr); > > ierr = KSPSolve(ksp, rhs_, solution); CHKERRXX(ierr); > > Send the output of -ksp_monitor -ksp_view Matt > > On Thu, Mar 6, 2014 at 3:33 PM, Matthew Knepley wrote: > >> On Thu, Mar 6, 2014 at 5:24 PM, Mohammad Mirzadeh wrote: >> >>> Hi guys, >>> >>> I have a discretization of Poisson equation with Neumann bc for embedded >>> boundary grids in such a way that that nullspace is not the usual constant >>> vector. Instead the nullspace is constant in the domain of interest and >>> zero elsewhere. >>> >>> I compute this nullspace myself and have checked it against MATLAB by >>> dumping the matrix and computing the nullspace explicitly using null >>> function -- they match and there is only this single vector. Then I take >>> this calculated vector and subtract it off the matrix and rhs. >>> >> >> "subtract it off the matrix" does not make sense to me. Are you calling >> KSPSetNullSpace()? >> >> Matt >> >> >>> However, I am having convergence issues. 
For instance this is the output >>> of ksp_monitor_true_residual for one particular run: >>> >>> 0 KSP preconditioned resid norm 3.033840960250e+02 true resid norm 2.332886580745e-01 ||r(i)||/||b|| 1.000000000000e+00 >>> 1 KSP preconditioned resid norm 1.018974811826e+01 true resid norm 1.941629896918e-02 ||r(i)||/||b|| 8.322864527335e-02 >>> 2 KSP preconditioned resid norm 5.450493684941e-02 true resid norm 1.029339589324e-02 ||r(i)||/||b|| 4.412300185615e-02 >>> 3 KSP preconditioned resid norm 3.944064039516e-02 true resid norm 1.030277925024e-02 ||r(i)||/||b|| 4.416322394443e-02 >>> 4 KSP preconditioned resid norm 6.286181172600e-05 true resid norm 1.030243055045e-02 ||r(i)||/||b|| 4.416172923059e-02 >>> 5 KSP preconditioned resid norm 4.349133658643e-06 true resid norm 1.030239080406e-02 ||r(i)||/||b|| 4.416155885630e-02 >>> 6 KSP preconditioned resid norm 9.279429568232e-08 true resid norm 1.030239169298e-02 ||r(i)||/||b|| 4.416156266666e-02 >>> 7 KSP preconditioned resid norm 3.032522248740e-09 true resid norm 1.030239175066e-02 ||r(i)||/||b|| 4.416156291393e-02 >>> 8 KSP preconditioned resid norm 6.533747246875e-09 true resid norm 1.030239175718e-02 ||r(i)||/||b|| 4.416156294184e-02 >>> 9 KSP preconditioned resid norm 6.083185162500e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292220e-02 >>> 10 KSP preconditioned resid norm 5.510319622225e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >>> 11 KSP preconditioned resid norm 5.456758524534e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >>> 12 KSP preconditioned resid norm 5.456756081783e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >>> 13 KSP preconditioned resid norm 5.456755930952e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >>> 14 KSP preconditioned resid norm 5.456755930949e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >>> 15 KSP preconditioned resid norm 5.456755930949e-12 true resid norm 1.030239175259e-02 ||r(i)||/||b|| 4.416156292221e-02 >>> >>> >>> As you can see, the true residual is quite large and moreover it does not reduce beyond a certain point. This is using hypre as preconditioner, but the situation is equally bad with several other preconditioner (ilu, sor, jacobi, or even none). As for the solution itself, the error has poor to none convergence under grid refinement. All this suggests that the linear system is not converging in my case. >>> >>> >>> Do you have any idea/suggestions why this is happening and how I can avoid it? >>> >>> >>> Thanks >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Thu Mar 6 17:43:33 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 06 Mar 2014 16:43:33 -0700 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: Message-ID: <87r46eevve.fsf@jedbrown.org> Mohammad Mirzadeh writes: > Yes. 
To be precise this is the set of functions I call: > > ierr = MatNullSpaceCreate(mpicomm, PETSC_FALSE, 1, &null_space, > &A_null_space); CHKERRXX(ierr); > > ierr = MatSetNullSpace(A, A_null_space); CHKERRXX(ierr); > > ierr = KSPSetNullSpace(ksp, A_null_space); CHKERRXX(ierr); > > ierr = MatNullSpaceRemove(A_null_space, rhs_, NULL); CHKERRXX(ierr); Is the matrix symmetric? If not, the right and left null spaces could be different, in which case this system might be inconsistent. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From mirzadeh at gmail.com Thu Mar 6 17:49:50 2014 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Thu, 6 Mar 2014 15:49:50 -0800 Subject: [petsc-users] having issues with nullspace In-Reply-To: <87r46eevve.fsf@jedbrown.org> References: <87r46eevve.fsf@jedbrown.org> Message-ID: Matt, here's the output: 0 KSP Residual norm 3.033840960250e+02 1 KSP Residual norm 1.018974811826e+01 2 KSP Residual norm 5.450493684941e-02 3 KSP Residual norm 3.944064039516e-02 4 KSP Residual norm 6.286181172600e-05 5 KSP Residual norm 4.349133658643e-06 6 KSP Residual norm 9.279429568232e-08 7 KSP Residual norm 3.032522248740e-09 8 KSP Residual norm 6.533747246875e-09 9 KSP Residual norm 6.083185162500e-12 Linear solve converged due to CONVERGED_RTOL iterations 9 KSP Object: 1 MPI processes type: bcgs maximum iterations=10000, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000 left preconditioning has attached null space using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: hypre HYPRE BoomerAMG preconditioning HYPRE BoomerAMG: Cycle type V HYPRE BoomerAMG: Maximum number of levels 25 HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 HYPRE BoomerAMG: Convergence tolerance PER hypre call 0 HYPRE BoomerAMG: Threshold for strong coupling 0.5 HYPRE BoomerAMG: Interpolation truncation factor 0 HYPRE BoomerAMG: Interpolation: max elements per row 0 HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 HYPRE BoomerAMG: Maximum row sums 0.9 HYPRE BoomerAMG: Sweeps down 1 HYPRE BoomerAMG: Sweeps up 1 HYPRE BoomerAMG: Sweeps on coarse 1 HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax on coarse symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax weight (all) 1 HYPRE BoomerAMG: Outer relax weight (all) 1 HYPRE BoomerAMG: Using CF-relaxation HYPRE BoomerAMG: Measure type local HYPRE BoomerAMG: Coarsen type Falgout HYPRE BoomerAMG: Interpolation type classical linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=263169, cols=263169 total: nonzeros=1236141, allocated nonzeros=1244417 total number of mallocs used during MatSetValues calls =0 has attached null space not using I-node routines On Thu, Mar 6, 2014 at 3:43 PM, Jed Brown wrote: > Mohammad Mirzadeh writes: > > > Yes. To be precise this is the set of functions I call: > > > > ierr = MatNullSpaceCreate(mpicomm, PETSC_FALSE, 1, &null_space, > > &A_null_space); CHKERRXX(ierr); > > > > ierr = MatSetNullSpace(A, A_null_space); CHKERRXX(ierr); > > > > ierr = KSPSetNullSpace(ksp, A_null_space); CHKERRXX(ierr); > > > > ierr = MatNullSpaceRemove(A_null_space, rhs_, NULL); CHKERRXX(ierr); > > Is the matrix symmetric? 
If not, the right and left null spaces could > be different, in which case this system might be inconsistent. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mirzadeh at gmail.com Thu Mar 6 17:50:55 2014 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Thu, 6 Mar 2014 15:50:55 -0800 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: <87r46eevve.fsf@jedbrown.org> Message-ID: Jed, No the matrix is actually non-symmetric due to grid adaptivity (hanging nodes of QuadTree). Anyway, what do you exactly mean by the system being inconsistent? On Thu, Mar 6, 2014 at 3:49 PM, Mohammad Mirzadeh wrote: > Matt, > > here's the output: > > 0 KSP Residual norm 3.033840960250e+02 > > 1 KSP Residual norm 1.018974811826e+01 > > 2 KSP Residual norm 5.450493684941e-02 > > 3 KSP Residual norm 3.944064039516e-02 > > 4 KSP Residual norm 6.286181172600e-05 > > 5 KSP Residual norm 4.349133658643e-06 > > 6 KSP Residual norm 9.279429568232e-08 > > 7 KSP Residual norm 3.032522248740e-09 > > 8 KSP Residual norm 6.533747246875e-09 > > 9 KSP Residual norm 6.083185162500e-12 > > Linear solve converged due to CONVERGED_RTOL iterations 9 > > KSP Object: 1 MPI processes > > type: bcgs > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-12, absolute=1e-50, divergence=10000 > > left preconditioning > > has attached null space > > using PRECONDITIONED norm type for convergence test > > PC Object: 1 MPI processes > > type: hypre > > HYPRE BoomerAMG preconditioning > > HYPRE BoomerAMG: Cycle type V > > HYPRE BoomerAMG: Maximum number of levels 25 > > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0 > > HYPRE BoomerAMG: Threshold for strong coupling 0.5 > > HYPRE BoomerAMG: Interpolation truncation factor 0 > > HYPRE BoomerAMG: Interpolation: max elements per row 0 > > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > > HYPRE BoomerAMG: Maximum row sums 0.9 > > HYPRE BoomerAMG: Sweeps down 1 > > HYPRE BoomerAMG: Sweeps up 1 > > HYPRE BoomerAMG: Sweeps on coarse 1 > > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax on coarse symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax weight (all) 1 > > HYPRE BoomerAMG: Outer relax weight (all) 1 > > HYPRE BoomerAMG: Using CF-relaxation > > HYPRE BoomerAMG: Measure type local > > HYPRE BoomerAMG: Coarsen type Falgout > > HYPRE BoomerAMG: Interpolation type classical > > linear system matrix = precond matrix: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=263169, cols=263169 > > total: nonzeros=1236141, allocated nonzeros=1244417 > > total number of mallocs used during MatSetValues calls =0 > > has attached null space > > not using I-node routines > > > > > On Thu, Mar 6, 2014 at 3:43 PM, Jed Brown wrote: > >> Mohammad Mirzadeh writes: >> >> > Yes. To be precise this is the set of functions I call: >> > >> > ierr = MatNullSpaceCreate(mpicomm, PETSC_FALSE, 1, &null_space, >> > &A_null_space); CHKERRXX(ierr); >> > >> > ierr = MatSetNullSpace(A, A_null_space); CHKERRXX(ierr); >> > >> > ierr = KSPSetNullSpace(ksp, A_null_space); CHKERRXX(ierr); >> > >> > ierr = MatNullSpaceRemove(A_null_space, rhs_, NULL); CHKERRXX(ierr); >> >> Is the matrix symmetric? If not, the right and left null spaces could >> be different, in which case this system might be inconsistent. 
>> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Mar 6 17:57:30 2014 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 6 Mar 2014 17:57:30 -0600 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: <87r46eevve.fsf@jedbrown.org> Message-ID: On Thu, Mar 6, 2014 at 5:49 PM, Mohammad Mirzadeh wrote: > Matt, > 1) Use -ksp_monitor_true_residual 2) Quit using Hypre. It can easily get rank deficient stuff on its coarse grid, which throws off this analysis. Use SOR if it converges, otherwise use -pc_type svd Matt > > here's the output: > > 0 KSP Residual norm 3.033840960250e+02 > > 1 KSP Residual norm 1.018974811826e+01 > > 2 KSP Residual norm 5.450493684941e-02 > > 3 KSP Residual norm 3.944064039516e-02 > > 4 KSP Residual norm 6.286181172600e-05 > > 5 KSP Residual norm 4.349133658643e-06 > > 6 KSP Residual norm 9.279429568232e-08 > > 7 KSP Residual norm 3.032522248740e-09 > > 8 KSP Residual norm 6.533747246875e-09 > > 9 KSP Residual norm 6.083185162500e-12 > > Linear solve converged due to CONVERGED_RTOL iterations 9 > > KSP Object: 1 MPI processes > > type: bcgs > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-12, absolute=1e-50, divergence=10000 > > left preconditioning > > has attached null space > > using PRECONDITIONED norm type for convergence test > > PC Object: 1 MPI processes > > type: hypre > > HYPRE BoomerAMG preconditioning > > HYPRE BoomerAMG: Cycle type V > > HYPRE BoomerAMG: Maximum number of levels 25 > > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0 > > HYPRE BoomerAMG: Threshold for strong coupling 0.5 > > HYPRE BoomerAMG: Interpolation truncation factor 0 > > HYPRE BoomerAMG: Interpolation: max elements per row 0 > > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > > HYPRE BoomerAMG: Maximum row sums 0.9 > > HYPRE BoomerAMG: Sweeps down 1 > > HYPRE BoomerAMG: Sweeps up 1 > > HYPRE BoomerAMG: Sweeps on coarse 1 > > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax on coarse symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax weight (all) 1 > > HYPRE BoomerAMG: Outer relax weight (all) 1 > > HYPRE BoomerAMG: Using CF-relaxation > > HYPRE BoomerAMG: Measure type local > > HYPRE BoomerAMG: Coarsen type Falgout > > HYPRE BoomerAMG: Interpolation type classical > > linear system matrix = precond matrix: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=263169, cols=263169 > > total: nonzeros=1236141, allocated nonzeros=1244417 > > total number of mallocs used during MatSetValues calls =0 > > has attached null space > > not using I-node routines > > > > > On Thu, Mar 6, 2014 at 3:43 PM, Jed Brown wrote: > >> Mohammad Mirzadeh writes: >> >> > Yes. To be precise this is the set of functions I call: >> > >> > ierr = MatNullSpaceCreate(mpicomm, PETSC_FALSE, 1, &null_space, >> > &A_null_space); CHKERRXX(ierr); >> > >> > ierr = MatSetNullSpace(A, A_null_space); CHKERRXX(ierr); >> > >> > ierr = KSPSetNullSpace(ksp, A_null_space); CHKERRXX(ierr); >> > >> > ierr = MatNullSpaceRemove(A_null_space, rhs_, NULL); CHKERRXX(ierr); >> >> Is the matrix symmetric? If not, the right and left null spaces could >> be different, in which case this system might be inconsistent. 
>> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Thu Mar 6 18:02:48 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 06 Mar 2014 17:02:48 -0700 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: <87r46eevve.fsf@jedbrown.org> Message-ID: <87ob1ieuzb.fsf@jedbrown.org> Mohammad Mirzadeh writes: > Jed, > > No the matrix is actually non-symmetric due to grid adaptivity (hanging > nodes of QuadTree). Anyway, what do you exactly mean by the system being > inconsistent? Sounds like the right and left null spaces are different. You can test by checking the null space using the transpose of your system. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From mirzadeh at gmail.com Thu Mar 6 18:06:34 2014 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Thu, 6 Mar 2014 16:06:34 -0800 Subject: [petsc-users] having issues with nullspace In-Reply-To: <87ob1ieuzb.fsf@jedbrown.org> References: <87r46eevve.fsf@jedbrown.org> <87ob1ieuzb.fsf@jedbrown.org> Message-ID: hummm just tried it in Matlab and you are correct -- they are different. What does this mean for my system? Also what is the correct approach here? On Thu, Mar 6, 2014 at 4:02 PM, Jed Brown wrote: > Mohammad Mirzadeh writes: > > > Jed, > > > > No the matrix is actually non-symmetric due to grid adaptivity (hanging > > nodes of QuadTree). Anyway, what do you exactly mean by the system being > > inconsistent? > > Sounds like the right and left null spaces are different. You can test > by checking the null space using the transpose of your system. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mirzadeh at gmail.com Thu Mar 6 18:08:06 2014 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Thu, 6 Mar 2014 16:08:06 -0800 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: <87r46eevve.fsf@jedbrown.org> <87ob1ieuzb.fsf@jedbrown.org> Message-ID: Mat, here's sor. svd still running! 
0 KSP preconditioned resid norm 2.481182137607e+00 true resid norm 1.027599467617e+00 ||r(i)||/||b|| 1.000000000000e+00 1 KSP preconditioned resid norm 1.428369058648e+00 true resid norm 4.775799486483e-01 ||r(i)||/||b|| 4.647530129184e-01 2 KSP preconditioned resid norm 4.540176572297e-01 true resid norm 1.868438043300e-01 ||r(i)||/||b|| 1.818255168654e-01 3 KSP preconditioned resid norm 1.999321637574e-01 true resid norm 1.185660060102e-01 ||r(i)||/||b|| 1.153815370158e-01 4 KSP preconditioned resid norm 1.134340150403e-01 true resid norm 8.909919316428e-02 ||r(i)||/||b|| 8.670614959631e-02 5 KSP preconditioned resid norm 7.566523445491e-02 true resid norm 7.766809539565e-02 ||r(i)||/||b|| 7.558207048878e-02 6 KSP preconditioned resid norm 5.537203663154e-02 true resid norm 7.485665618568e-02 ||r(i)||/||b|| 7.284614146334e-02 7 KSP preconditioned resid norm 4.390492285958e-02 true resid norm 7.302891311152e-02 ||r(i)||/||b|| 7.106748827039e-02 8 KSP preconditioned resid norm 3.621562502013e-02 true resid norm 7.279539496786e-02 ||r(i)||/||b|| 7.084024200273e-02 9 KSP preconditioned resid norm 3.309171702542e-02 true resid norm 7.139006972953e-02 ||r(i)||/||b|| 6.947266126469e-02 10 KSP preconditioned resid norm 3.651904851324e-02 true resid norm 7.153789926206e-02 ||r(i)||/||b|| 6.961652036271e-02 11 KSP preconditioned resid norm 5.973622761183e-02 true resid norm 7.347037429585e-02 ||r(i)||/||b|| 7.149709260383e-02 12 KSP preconditioned resid norm 9.586867231154e-02 true resid norm 7.608732859604e-02 ||r(i)||/||b|| 7.404376023325e-02 13 KSP preconditioned resid norm 1.635382207023e-02 true resid norm 7.118213033182e-02 ||r(i)||/||b|| 6.927030674402e-02 14 KSP preconditioned resid norm 9.385509745433e-03 true resid norm 7.101728347294e-02 ||r(i)||/||b|| 6.910988737434e-02 15 KSP preconditioned resid norm 7.423535884473e-03 true resid norm 7.104945342327e-02 ||r(i)||/||b|| 6.914119329785e-02 16 KSP preconditioned resid norm 6.323417551714e-03 true resid norm 7.104173274422e-02 ||r(i)||/||b|| 6.913367998231e-02 17 KSP preconditioned resid norm 5.970492489076e-03 true resid norm 7.099818043932e-02 ||r(i)||/||b|| 6.909129741373e-02 18 KSP preconditioned resid norm 6.777117918864e-03 true resid norm 7.098851590105e-02 ||r(i)||/||b|| 6.908189244753e-02 19 KSP preconditioned resid norm 1.556240130595e-02 true resid norm 7.103123679068e-02 ||r(i)||/||b|| 6.912346593114e-02 20 KSP preconditioned resid norm 4.049582768289e-03 true resid norm 7.120382212555e-02 ||r(i)||/||b|| 6.929141593532e-02 21 KSP preconditioned resid norm 6.862248582464e-04 true resid norm 7.111495778110e-02 ||r(i)||/||b|| 6.920493832682e-02 22 KSP preconditioned resid norm 3.280255317757e-04 true resid norm 7.109751049924e-02 ||r(i)||/||b|| 6.918795964747e-02 23 KSP preconditioned resid norm 1.943287835633e-04 true resid norm 7.109218692978e-02 ||r(i)||/||b|| 6.918277905948e-02 24 KSP preconditioned resid norm 1.265015512889e-04 true resid norm 7.108929821126e-02 ||r(i)||/||b|| 6.917996792673e-02 25 KSP preconditioned resid norm 1.264202759318e-04 true resid norm 7.108882409509e-02 ||r(i)||/||b|| 6.917950654446e-02 26 KSP preconditioned resid norm 1.941274487871e-04 true resid norm 7.108881499846e-02 ||r(i)||/||b|| 6.917949769215e-02 27 KSP preconditioned resid norm 1.237782236805e-04 true resid norm 7.108539247506e-02 ||r(i)||/||b|| 6.917616709155e-02 28 KSP preconditioned resid norm 2.567842373425e-05 true resid norm 7.108693575066e-02 ||r(i)||/||b|| 6.917766891756e-02 29 KSP preconditioned resid norm 1.624250348737e-05 
true resid norm 7.108694548270e-02 ||r(i)||/||b|| 6.917767838821e-02 30 KSP preconditioned resid norm 1.400709805305e-05 true resid norm 7.108704978977e-02 ||r(i)||/||b|| 6.917777989378e-02 31 KSP preconditioned resid norm 1.258524583639e-05 true resid norm 7.108709377947e-02 ||r(i)||/||b|| 6.917782270200e-02 32 KSP preconditioned resid norm 1.214563778665e-05 true resid norm 7.108699539645e-02 ||r(i)||/||b|| 6.917772696137e-02 33 KSP preconditioned resid norm 1.175376138755e-05 true resid norm 7.108687362476e-02 ||r(i)||/||b|| 6.917760846025e-02 34 KSP preconditioned resid norm 1.218910479225e-05 true resid norm 7.108702667143e-02 ||r(i)||/||b|| 6.917775739636e-02 35 KSP preconditioned resid norm 1.786145773473e-05 true resid norm 7.108711007365e-02 ||r(i)||/||b|| 6.917783855855e-02 36 KSP preconditioned resid norm 2.293730220946e-05 true resid norm 7.108631479962e-02 ||r(i)||/||b|| 6.917706464414e-02 37 KSP preconditioned resid norm 5.223812358404e-06 true resid norm 7.108670686965e-02 ||r(i)||/||b|| 6.917744618388e-02 38 KSP preconditioned resid norm 3.754603897641e-06 true resid norm 7.108673794435e-02 ||r(i)||/||b|| 6.917747642397e-02 39 KSP preconditioned resid norm 3.359131707993e-06 true resid norm 7.108673993370e-02 ||r(i)||/||b|| 6.917747835988e-02 40 KSP preconditioned resid norm 3.150846888622e-06 true resid norm 7.108674979713e-02 ||r(i)||/||b|| 6.917748795841e-02 41 KSP preconditioned resid norm 2.872188456387e-06 true resid norm 7.108671888049e-02 ||r(i)||/||b|| 6.917745787212e-02 42 KSP preconditioned resid norm 2.644889812492e-06 true resid norm 7.108676255906e-02 ||r(i)||/||b|| 6.917750037757e-02 43 KSP preconditioned resid norm 2.702539684936e-06 true resid norm 7.108676352647e-02 ||r(i)||/||b|| 6.917750131900e-02 44 KSP preconditioned resid norm 4.022429227783e-06 true resid norm 7.108676431469e-02 ||r(i)||/||b|| 6.917750208604e-02 45 KSP preconditioned resid norm 1.579236965984e-06 true resid norm 7.108675412426e-02 ||r(i)||/||b|| 6.917749216931e-02 46 KSP preconditioned resid norm 1.831765135201e-06 true resid norm 7.108674882093e-02 ||r(i)||/||b|| 6.917748700842e-02 47 KSP preconditioned resid norm 1.846944675698e-06 true resid norm 7.108675619194e-02 ||r(i)||/||b|| 6.917749418146e-02 48 KSP preconditioned resid norm 1.820006170791e-06 true resid norm 7.108675215517e-02 ||r(i)||/||b|| 6.917749025311e-02 49 KSP preconditioned resid norm 1.770579229536e-06 true resid norm 7.108675662247e-02 ||r(i)||/||b|| 6.917749460042e-02 50 KSP preconditioned resid norm 1.715571966759e-06 true resid norm 7.108675399614e-02 ||r(i)||/||b|| 6.917749204463e-02 51 KSP preconditioned resid norm 1.646182320252e-06 true resid norm 7.108675922712e-02 ||r(i)||/||b|| 6.917749713512e-02 52 KSP preconditioned resid norm 1.483705350928e-06 true resid norm 7.108675882398e-02 ||r(i)||/||b|| 6.917749674280e-02 53 KSP preconditioned resid norm 1.116320649371e-06 true resid norm 7.108676123954e-02 ||r(i)||/||b|| 6.917749909349e-02 54 KSP preconditioned resid norm 1.657360936891e-07 true resid norm 7.108676285301e-02 ||r(i)||/||b|| 6.917750066362e-02 55 KSP preconditioned resid norm 9.060420145627e-07 true resid norm 7.108677215316e-02 ||r(i)||/||b|| 6.917750971399e-02 56 KSP preconditioned resid norm 8.701510106194e-07 true resid norm 7.108675863210e-02 ||r(i)||/||b|| 6.917749655608e-02 57 KSP preconditioned resid norm 6.250673067310e-07 true resid norm 7.108675858904e-02 ||r(i)||/||b|| 6.917749651417e-02 58 KSP preconditioned resid norm 8.606839230868e-07 true resid norm 7.108676660388e-02 
||r(i)||/||b|| 6.917750431376e-02 59 KSP preconditioned resid norm 2.619366455426e-07 true resid norm 7.108676234484e-02 ||r(i)||/||b|| 6.917750016910e-02 60 KSP preconditioned resid norm 4.200943714852e-08 true resid norm 7.108676245753e-02 ||r(i)||/||b|| 6.917750027877e-02 61 KSP preconditioned resid norm 7.666291884631e-09 true resid norm 7.108676261926e-02 ||r(i)||/||b|| 6.917750043615e-02 62 KSP preconditioned resid norm 6.267989202765e-09 true resid norm 7.108676267362e-02 ||r(i)||/||b|| 6.917750048905e-02 63 KSP preconditioned resid norm 6.322065776573e-09 true resid norm 7.108676266915e-02 ||r(i)||/||b|| 6.917750048470e-02 64 KSP preconditioned resid norm 5.419388728381e-09 true resid norm 7.108676275176e-02 ||r(i)||/||b|| 6.917750056510e-02 65 KSP preconditioned resid norm 4.864432437470e-09 true resid norm 7.108676262299e-02 ||r(i)||/||b|| 6.917750043978e-02 66 KSP preconditioned resid norm 5.889407294927e-09 true resid norm 7.108676258555e-02 ||r(i)||/||b|| 6.917750040335e-02 67 KSP preconditioned resid norm 2.329377620253e-09 true resid norm 7.108676262702e-02 ||r(i)||/||b|| 6.917750044371e-02 68 KSP preconditioned resid norm 6.789865008325e-10 true resid norm 7.108676263968e-02 ||r(i)||/||b|| 6.917750045602e-02 69 KSP preconditioned resid norm 9.951873144471e-10 true resid norm 7.108676262735e-02 ||r(i)||/||b|| 6.917750044402e-02 70 KSP preconditioned resid norm 3.345568224138e-10 true resid norm 7.108676262789e-02 ||r(i)||/||b|| 6.917750044455e-02 71 KSP preconditioned resid norm 2.723475696116e-10 true resid norm 7.108676263029e-02 ||r(i)||/||b|| 6.917750044688e-02 72 KSP preconditioned resid norm 2.270307705471e-10 true resid norm 7.108676263411e-02 ||r(i)||/||b|| 6.917750045060e-02 73 KSP preconditioned resid norm 3.952312974708e-10 true resid norm 7.108676262930e-02 ||r(i)||/||b|| 6.917750044592e-02 74 KSP preconditioned resid norm 2.215148179871e-09 true resid norm 7.108676260334e-02 ||r(i)||/||b|| 6.917750042066e-02 75 KSP preconditioned resid norm 6.854898442912e-11 true resid norm 7.108676262831e-02 ||r(i)||/||b|| 6.917750044496e-02 76 KSP preconditioned resid norm 8.140088634442e-12 true resid norm 7.108676262834e-02 ||r(i)||/||b|| 6.917750044499e-02 77 KSP preconditioned resid norm 2.214472895697e-12 true resid norm 7.108676262855e-02 ||r(i)||/||b|| 6.917750044519e-02 Linear solve converged due to CONVERGED_RTOL iterations 77 KSP Object: 1 MPI processes type: bcgs maximum iterations=10000, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000 left preconditioning has attached null space using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=4225, cols=4225 total: nonzeros=19301, allocated nonzeros=20273 total number of mallocs used during MatSetValues calls =0 has attached null space not using I-node routines On Thu, Mar 6, 2014 at 4:06 PM, Mohammad Mirzadeh wrote: > hummm just tried it in Matlab and you are correct -- they are different. > What does this mean for my system? Also what is the correct approach here? > > > On Thu, Mar 6, 2014 at 4:02 PM, Jed Brown wrote: > >> Mohammad Mirzadeh writes: >> >> > Jed, >> > >> > No the matrix is actually non-symmetric due to grid adaptivity (hanging >> > nodes of QuadTree). Anyway, what do you exactly mean by the system being >> > inconsistent? 
>> >> Sounds like the right and left null spaces are different. You can test >> by checking the null space using the transpose of your system. >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Thu Mar 6 18:10:55 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 06 Mar 2014 17:10:55 -0700 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: <87r46eevve.fsf@jedbrown.org> <87ob1ieuzb.fsf@jedbrown.org> Message-ID: <87lhwmeuls.fsf@jedbrown.org> Mohammad Mirzadeh writes: > hummm just tried it in Matlab and you are correct -- they are different. > What does this mean for my system? Also what is the correct approach here? It means you projected the wrong null space out of your right hand side. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From jed at jedbrown.org Thu Mar 6 18:38:57 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 06 Mar 2014 17:38:57 -0700 Subject: [petsc-users] vtk output ASCII or binary In-Reply-To: References: <87zjl5i6lj.fsf@jedbrown.org> <877g88iisw.fsf@jedbrown.org> <871tygih7x.fsf@jedbrown.org> Message-ID: <87iorqetb2.fsf@jedbrown.org> Xiangdong writes: > This works fine. However, when I add the option -da_grid_x 8, the same > error messages pop up. There was a memory error introduced when Barry changed PetscMPIIntCast to be better behaved. For equal-aspect grids, the uninitialized slot contained the right value by accident. Thanks. If you are using PETSc from Git, run "git pull && make", otherwise you can apply the patch below and rebuild. commit 334634e2e3b3ad30f394a243f8d8200a86444220 Author: Jed Brown Date: Thu Mar 6 17:28:37 2014 -0700 VecView_MPI_DA: fix gsizes bug (bad conversion in parent commit) Reported-by: Xiangdong diff --git a/src/dm/impls/da/gr2.c b/src/dm/impls/da/gr2.c index f726037..abb8dea 100644 --- a/src/dm/impls/da/gr2.c +++ b/src/dm/impls/da/gr2.c @@ -472,7 +472,7 @@ static PetscErrorCode DMDAArrayMPIIO(DM da,PetscViewer viewer,Vec xin,PetscBool gsizes[0] = dof; ierr = PetscMPIIntCast(dd->M,gsizes+1);CHKERRQ(ierr); ierr = PetscMPIIntCast(dd->N,gsizes+2);CHKERRQ(ierr); - ierr = PetscMPIIntCast(dd->P,gsizes+1);CHKERRQ(ierr); + ierr = PetscMPIIntCast(dd->P,gsizes+3);CHKERRQ(ierr); lsizes[0] = dof; ierr = PetscMPIIntCast((dd->xe-dd->xs)/dof,lsizes+1);CHKERRQ(ierr); ierr = PetscMPIIntCast(dd->ye-dd->ys,lsizes+2);CHKERRQ(ierr); -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From mirzadeh at gmail.com Thu Mar 6 20:53:23 2014 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Thu, 6 Mar 2014 18:53:23 -0800 Subject: [petsc-users] having issues with nullspace In-Reply-To: <87lhwmeuls.fsf@jedbrown.org> References: <87r46eevve.fsf@jedbrown.org> <87ob1ieuzb.fsf@jedbrown.org> <87lhwmeuls.fsf@jedbrown.org> Message-ID: I am confused -- are you saying that I need to project the left null space out of the rhs? On Thu, Mar 6, 2014 at 4:10 PM, Jed Brown wrote: > Mohammad Mirzadeh writes: > > > hummm just tried it in Matlab and you are correct -- they are different. > > What does this mean for my system? Also what is the correct approach > here? > > It means you projected the wrong null space out of your right hand side. > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jed at jedbrown.org Thu Mar 6 21:00:44 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 06 Mar 2014 20:00:44 -0700 Subject: [petsc-users] having issues with nullspace In-Reply-To: References: <87r46eevve.fsf@jedbrown.org> <87ob1ieuzb.fsf@jedbrown.org> <87lhwmeuls.fsf@jedbrown.org> Message-ID: <87bnxiemqr.fsf@jedbrown.org> Mohammad Mirzadeh writes: > I am confused -- are you saying that I need to project the left null space > out of the rhs? Yes, think about A x = b and what it means to be consistent. Alternatively, you could enforce boundary conditions symmetrically. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From C.Klaij at marin.nl Fri Mar 7 01:42:43 2014 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Fri, 7 Mar 2014 07:42:43 +0000 Subject: [petsc-users] webpage layout Message-ID: <6b57ee7a97ab4f609ba59c3d0b1e545d@MAR190n2.marin.local> Something's wrong with http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetOption.html In Firefox 24.3.0, the option description is continued at the bottom of the page, after the examples. dr. ir. Christiaan Klaij CFD Researcher Research & Development E mailto:C.Klaij at marin.nl T +31 317 49 33 44 MARIN 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl From epscodes at gmail.com Fri Mar 7 07:22:17 2014 From: epscodes at gmail.com (Xiangdong) Date: Fri, 7 Mar 2014 08:22:17 -0500 Subject: [petsc-users] webpage layout In-Reply-To: <6b57ee7a97ab4f609ba59c3d0b1e545d@MAR190n2.marin.local> References: <6b57ee7a97ab4f609ba59c3d0b1e545d@MAR190n2.marin.local> Message-ID: Another tiny webpage issues: http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCML.html *1) -pc_mg_type :* should start from a new line and be bold. 2) * -pc_mg_cycles <1>: 1 for V cycle, 2 for W.* I found that this option does not work for ml preconditioner. However, -pc_mg_cycle_type can set the option to be w cycle. Xiangdong On Fri, Mar 7, 2014 at 2:42 AM, Klaij, Christiaan wrote: > Something's wrong with > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetOption.html > In Firefox 24.3.0, the option description is continued at the bottom of > the page, after the examples. > > > dr. ir. Christiaan Klaij > CFD Researcher > Research & Development > E mailto:C.Klaij at marin.nl > T +31 317 49 33 44 > > > MARIN > 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands > T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Mar 7 08:07:01 2014 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 7 Mar 2014 08:07:01 -0600 Subject: [petsc-users] webpage layout In-Reply-To: <6b57ee7a97ab4f609ba59c3d0b1e545d@MAR190n2.marin.local> References: <6b57ee7a97ab4f609ba59c3d0b1e545d@MAR190n2.marin.local> Message-ID: On Fri, Mar 7, 2014 at 1:42 AM, Klaij, Christiaan wrote: > Something's wrong with > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetOption.html > In Firefox 24.3.0, the option description is continued at the bottom of > the page, after the examples. > I have fixed this: https://bitbucket.org/petsc/petsc/commits/a4018660e6b437ff8a4ed869b529a9d31b9a35b2 Matt > dr. ir. 
Christiaan Klaij > CFD Researcher > Research & Development > E mailto:C.Klaij at marin.nl > T +31 317 49 33 44 > > > MARIN > 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands > T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Mar 7 08:08:06 2014 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 7 Mar 2014 08:08:06 -0600 Subject: [petsc-users] webpage layout In-Reply-To: References: <6b57ee7a97ab4f609ba59c3d0b1e545d@MAR190n2.marin.local> Message-ID: On Fri, Mar 7, 2014 at 7:22 AM, Xiangdong wrote: > Another tiny webpage issues: > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCML.html > > *1) -pc_mg_type :* should start from a new line and be > bold. > > 2) * -pc_mg_cycles <1>: 1 for V cycle, 2 for W.* I found that this option > does not work for ml preconditioner. However, -pc_mg_cycle_type can set the > option to be w cycle. > I have fixed the formatting issue. Will look at the semantics. https://bitbucket.org/petsc/petsc/commits/2612397fc73317b415d12799bb7d85b320714208 Matt > Xiangdong > > > On Fri, Mar 7, 2014 at 2:42 AM, Klaij, Christiaan wrote: > >> Something's wrong with >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetOption.html >> In Firefox 24.3.0, the option description is continued at the bottom of >> the page, after the examples. >> >> >> dr. ir. Christiaan Klaij >> CFD Researcher >> Research & Development >> E mailto:C.Klaij at marin.nl >> T +31 317 49 33 44 >> >> >> MARIN >> 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands >> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From song.gao2 at mail.mcgill.ca Fri Mar 7 13:21:19 2014 From: song.gao2 at mail.mcgill.ca (Song Gao) Date: Fri, 7 Mar 2014 14:21:19 -0500 Subject: [petsc-users] Is it possible to use -snes_ksp_ew in the framework of KSP? Message-ID: Hello, We are working on a legacy codes which solves the NS equations. The codes take control of newton iterations itself and use KSP to solve the linear system. We modified the codes, used SNES to control the newton iteration and changed the code to matrix free fashion. But the legacy codes did a lot of other things between two newton iterations (such as output solution, update variables....). I know we could use linesearchpostcheck but it is difficulty to do that correctly. Therefore, we decide to go back to the KSP framework but still use matrix free. When using SNES, we always use the runtime option -snes_ksp_ew, we observe that for some test cases, the residual stalls without -snes_ksp_ew, but converges with-snes_ksp_ew. So I'm thinking if it is possible to use -snes_ksp_ew in KSP? Thanks in advance. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Fri Mar 7 13:24:19 2014 From: jed at jedbrown.org (Jed Brown) Date: Fri, 07 Mar 2014 12:24:19 -0700 Subject: [petsc-users] Is it possible to use -snes_ksp_ew in the framework of KSP? 
In-Reply-To: References: Message-ID: <87ob1hbyn0.fsf@jedbrown.org> Song Gao writes: > We modified the codes, used SNES to control the newton iteration and > changed the code to matrix free fashion. But the legacy codes did a lot of > other things between two newton iterations (such as output solution, update > variables....). I know we could use linesearchpostcheck but it is > difficulty to do that correctly. It shouldn't be difficult and I still recommend this (SNES with monitors and pre/post checks if necessary). > Therefore, we decide to go back to the KSP framework but still use > matrix free. > > When using SNES, we always use the runtime option -snes_ksp_ew, we observe > that for some test cases, the residual stalls without -snes_ksp_ew, but > converges with-snes_ksp_ew. So I'm thinking if it is possible to > use -snes_ksp_ew in KSP? Thanks in advance. KSP doesn't know about the nonlinear solve so it has no way of doing this on its own. You can implement the equivalent algorithm yourself if you want. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From bsmith at mcs.anl.gov Fri Mar 7 13:35:08 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 7 Mar 2014 13:35:08 -0600 Subject: [petsc-users] Is it possible to use -snes_ksp_ew in the framework of KSP? In-Reply-To: References: Message-ID: <153C3484-532D-490B-BFD7-ACB0ABDCC108@mcs.anl.gov> On Mar 7, 2014, at 1:21 PM, Song Gao wrote: > Hello, > > We are working on a legacy codes which solves the NS equations. The codes take control of newton iterations itself and use KSP to solve the linear system. > > We modified the codes, used SNES to control the newton iteration and changed the code to matrix free fashion. But the legacy codes did a lot of other things between two newton iterations (such as output solution, update variables....). I know we could use linesearchpostcheck but it is difficulty to do that correctly. Please let us know what difficulties arise with the various pre and post checks, we are willing to add functionality or improve it to satisfy your needs. Barry > Therefore, we decide to go back to the KSP framework but still use matrix free. > > When using SNES, we always use the runtime option -snes_ksp_ew, we observe that for some test cases, the residual stalls without -snes_ksp_ew, but converges with-snes_ksp_ew. So I'm thinking if it is possible to use -snes_ksp_ew in KSP? Thanks in advance. > > From knepley at gmail.com Fri Mar 7 13:39:52 2014 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 7 Mar 2014 13:39:52 -0600 Subject: [petsc-users] Is it possible to use -snes_ksp_ew in the framework of KSP? In-Reply-To: References: Message-ID: On Fri, Mar 7, 2014 at 1:21 PM, Song Gao wrote: > Hello, > > We are working on a legacy codes which solves the NS equations. The codes > take control of newton iterations itself and use KSP to solve the linear > system. > > We modified the codes, used SNES to control the newton iteration and > changed the code to matrix free fashion. But the legacy codes did a lot of > other things between two newton iterations (such as output solution, update > variables....). I know we could use linesearchpostcheck but it is > difficulty to do that correctly. Therefore, we decide to go back to the KSP > framework but still use matrix free. > What makes it difficult? 
Thanks, Matt > When using SNES, we always use the runtime option -snes_ksp_ew, we observe > that for some test cases, the residual stalls without -snes_ksp_ew, but > converges with-snes_ksp_ew. So I'm thinking if it is possible to > use -snes_ksp_ew in KSP? Thanks in advance. > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From xsli at lbl.gov Fri Mar 7 15:00:33 2014 From: xsli at lbl.gov (Xiaoye S. Li) Date: Fri, 7 Mar 2014 13:00:33 -0800 Subject: [petsc-users] Error using MUMPS to solve large linear system In-Reply-To: References: <0a26f0bf3e454a7cb284fd136453e7e5@NAGURSKI.anl.gov> <3f57971afae8492fb66dd34c8e2f1b03@NAGURSKI.anl.gov> <3FB64BEE-FFF6-4A1C-943D-52900613AF70@ldeo.columbia.edu> <51132AD1-BCA6-4865-8147-BE5F77A90DC7@mcs.anl.gov> <9D66FB28-3132-4E5E-B207-22A16517E28B@ldeo.columbia.edu> Message-ID: For superlu_dist, you can try: options.ReplaceTinyPivot = NO; (I think default is YES) and/or options.IterRefine = YES; Sherry Li On Sun, Mar 2, 2014 at 2:23 PM, Matt Landreman wrote: > Hi, > > I'm having some problems with my PETSc application similar to the ones > discussed in this thread, so perhaps one of you can help. In my application > I factorize a preconditioner matrix with mumps or superlu_dist, using this > factorized preconditioner to accelerate gmres on a matrix that is denser > than the preconditioner. I've been running on edison at nersc. My program > works reliably for problem sizes below about 1 million x 1 million, but > above this size, the factorization step fails in one of many possible ways, > depending on the compiler, # of nodes, # of procs/node, etc: > > When I use superlu_dist, I get 1 of 2 failure modes: > (1) the first step of KSP returns "0 KSP residual norm -nan" and ksp then > returns KSPConvergedReason = -9, or > (2) the factorization completes, but GMRES then converges excruciatingly > slowly or not at all, even if I choose the "real" matrix to be identical to > the preconditioner matrix so KSP ought to converge in 1 step (which it does > for smaller matrices). > > For mumps, the factorization can fail in many different ways: > (3) With the intel compiler I usually get "Caught signal number 11 SEGV: > Segmentation Violation" > (4) Sometimes with the intel compiler I get "Caught signal number 7 BUS: > Bus Error" > (5) With the gnu compiler I often get a bunch of lines like "problem with > NIV2_FLOPS message -5.9604644775390625E-008 0 > -227464733.99999997" > (6) Other times with gnu I get a mumps error with INFO(1)=-9 or > INFO(1)=-17. The mumps documentation suggests I should increase icntl(14), > but what is an appropriate value? 50? 10000? 
> (7) With the Cray compiler I consistently get this cryptic error: > Fatal error in PMPI_Test: Invalid MPI_Request, error stack: > PMPI_Test(166): MPI_Test(request=0xb228dbf3c, flag=0x7ffffffe097c, > status=0x7ffffffe0a00) failed > PMPI_Test(121): Invalid MPI_Request > _pmiu_daemon(SIGCHLD): [NID 02784] [c6-1c1s8n0] [Sun Mar 2 10:35:20 2014] > PE RANK 0 exit signal Aborted > [NID 02784] 2014-03-02 10:35:20 Apid 3374579: initiated application > termination > Application 3374579 exit codes: 134 > > For linear systems smaller than around 1 million^2, my application is very > robust, working consistently with both mumps & superlu_dist, working for a > wide range of # of nodes and # of procs/node, and working with all 3 > available compilers on edison (intel, gnu, cray). > > By the way, mumps failed for much smaller problems until I tried > -mat_mumps_icntl_7 2 (inspired by your conversation last week). I tried all > the other options for icntl(7), icntl(28), and icntl(29), finding > icntl(7)=2 works best by far. I tried the flags that worked for Samar > (-mat_superlu_dist_colperm PARMETIS -mat_superlu_dist_parsymbfact 1) with > superlu_dist, but they did not appear to change anything in my case. > > Can you recommend any other parameters of petsc, superlu_dist, or mumps > that I should try changing? I don't care in the end whether I use > superlu_dist or mumps. > > Thanks! > > Matt Landreman > > > On Tue, Feb 25, 2014 at 3:50 PM, Xiaoye S. Li wrote: > >> Very good! Thanks for the update. >> I guess you are using all 16 cores per node? Since superlu_dist >> currently is MPI-only, if you generate 16 MPI tasks, serial symbolic >> factorization only has less than 2 GB memory to work with. >> >> Sherry >> >> >> On Tue, Feb 25, 2014 at 12:22 PM, Samar Khatiwala wrote: >> >>> Hi Sherry, >>> >>> Thanks! I tried your suggestions and it worked! >>> >>> For the record I added these flags: -mat_superlu_dist_colperm PARMETIS >>> -mat_superlu_dist_parsymbfact 1 >>> >>> Also, for completeness and since you asked: >>> >>> size: 2346346 x 2346346 >>> nnz: 60856894 >>> unsymmetric >>> >>> The hardware (http://www2.cisl.ucar.edu/resources/yellowstone/hardware) >>> specs are: 2 GB/core, 32 GB/node (27 GB usable), (16 cores per node) >>> I've been running on 8 nodes (so 8 x 27 ~ 216 GB). >>> >>> Thanks again for your help! >>> >>> Samar >>> >>> On Feb 25, 2014, at 1:00 PM, "Xiaoye S. Li" wrote: >>> >>> I didn't follow the discussion thread closely ... How large is your >>> matrix dimension, and number of nonzeros? >>> How large is the memory per core (or per node)? >>> >>> The default setting in superlu_dist is to use serial symbolic >>> factorization. You can turn on parallel symbolic factorization by: >>> >>> options.ParSymbFact = YES; >>> options.ColPerm = PARMETIS; >>> >>> Is your matrix symmetric? if so, you need to give both upper and lower >>> half of matrix A to superlu, which doesn't exploit symmetry. >>> >>> Do you know whether you need numerical pivoting? If not, you can turn >>> off pivoting by: >>> >>> options.RowPerm = NATURAL; >>> >>> This avoids some other serial bottleneck. >>> >>> All these options can be turned on in the petsc interface. Please check >>> out the syntax there. >>> >>> >>> Sherry >>> >>> >>> >>> On Tue, Feb 25, 2014 at 8:07 AM, Samar Khatiwala wrote: >>> >>>> Hi Barry, >>>> >>>> You're probably right. I note that the error occurs almost instantly >>>> and I've tried increasing the number of CPUs >>>> (as many as ~1000 on Yellowstone) to no avail. 
I know this is a big >>>> problem but I didn't think it was that big! >>>> >>>> Sherry: Is there any way to write out more diagnostic info? E.g.,how >>>> much memory superlu thinks it needs/is attempting >>>> to allocate. >>>> >>>> Thanks, >>>> >>>> Samar >>>> >>>> On Feb 25, 2014, at 10:57 AM, Barry Smith wrote: >>>> > >>>> >> >>>> >> I tried superlu_dist again and it crashes even more quickly than >>>> MUMPS with just the following error: >>>> >> >>>> >> ERROR: 0031-250 task 128: Killed >>>> > >>>> > This is usually a symptom of running out of memory. >>>> > >>>> >> >>>> >> Absolutely nothing else is written out to either stderr or stdout. >>>> This is with -mat_superlu_dist_statprint. >>>> >> The program works fine on a smaller matrix. >>>> >> >>>> >> This is the sequence of calls: >>>> >> >>>> >> KSPSetType(ksp,KSPPREONLY); >>>> >> PCSetType(pc,PCLU); >>>> >> PCFactorSetMatSolverPackage(pc,MATSOLVERSUPERLU_DIST); >>>> >> KSPSetFromOptions(ksp); >>>> >> PCSetFromOptions(pc); >>>> >> KSPSolve(ksp,b,x); >>>> >> >>>> >> All of these successfully return *except* the very last one to >>>> KSPSolve. >>>> >> >>>> >> Any help would be appreciated. Thanks! >>>> >> >>>> >> Samar >>>> >> >>>> >> On Feb 24, 2014, at 3:58 PM, Xiaoye S. Li wrote: >>>> >> >>>> >>> Samar: >>>> >>> If you include the error message while crashing using superlu_dist, >>>> I probably know the reason. (better yet, include the printout before the >>>> crash. ) >>>> >>> >>>> >>> Sherry >>>> >>> >>>> >>> >>>> >>> On Mon, Feb 24, 2014 at 9:56 AM, Hong Zhang >>>> wrote: >>>> >>> Samar : >>>> >>> There are limitations for direct solvers. >>>> >>> Do not expect any solver can be used on arbitrarily large problems. >>>> >>> Since superlu_dist also crashes, direct solvers may not be able to >>>> work on your application. >>>> >>> This is why I suggest to increase size incrementally. >>>> >>> You may have to experiment other type of solvers. >>>> >>> >>>> >>> Hong >>>> >>> >>>> >>> Hi Hong and Jed, >>>> >>> >>>> >>> Many thanks for replying. It would indeed be nice if the error >>>> messages from MUMPS were less cryptic! >>>> >>> >>>> >>> 1) I have tried smaller matrices although given how my problem is >>>> set up a jump is difficult to avoid. But a good idea >>>> >>> that I will try. >>>> >>> >>>> >>> 2) I did try various ordering but not the one you suggested. >>>> >>> >>>> >>> 3) Tracing the error through the MUMPS code suggest a rather abrupt >>>> termination of the program (there should be more >>>> >>> error messages if, for example, memory was a problem). I therefore >>>> thought it might be an interface problem rather than >>>> >>> one with mumps and turned to the petsc-users group first. >>>> >>> >>>> >>> 4) I've tried superlu_dist but it also crashes (also unclear as to >>>> why) at which point I decided to try mumps. The fact that both >>>> >>> crash would again indicate a common (memory?) problem. >>>> >>> >>>> >>> I'll try a few more things before asking the MUMPS developers. >>>> >>> >>>> >>> Thanks again for your help! >>>> >>> >>>> >>> Samar >>>> >>> >>>> >>> On Feb 24, 2014, at 11:47 AM, Hong Zhang >>>> wrote: >>>> >>> >>>> >>>> Samar: >>>> >>>> The crash occurs in >>>> >>>> ... >>>> >>>> [161]PETSC ERROR: Error in external library! >>>> >>>> [161]PETSC ERROR: Error reported by MUMPS in numerical >>>> factorization phase: INFO(1)=-1, INFO(2)=48 >>>> >>>> >>>> >>>> for very large matrix, likely memory problem as you suspected. >>>> >>>> I would suggest >>>> >>>> 1. 
run problems with increased sizes (not jump from a small one to >>>> a very large one) and observe memory usage using >>>> >>>> '-ksp_view'. >>>> >>>> I see you use '-mat_mumps_icntl_14 1000', i.e., percentage of >>>> estimated workspace increase. Is it too large? >>>> >>>> Anyway, this input should not cause the crash, I guess. >>>> >>>> 2. experimenting with different matrix ordering -mat_mumps_icntl_7 >>>> <> (I usually use sequential ordering 2) >>>> >>>> I see you use parallel ordering -mat_mumps_icntl_29 2. >>>> >>>> 3. send bug report to mumps developers for their suggestion. >>>> >>>> >>>> >>>> 4. try other direct solvers, e.g., superlu_dist. >>>> >>>> >>>> >>>> ... >>>> >>>> >>>> >>>> etc etc. The above error I can tell has something to do with >>>> processor 48 (INFO(2)) and so forth but not the previous one. >>>> >>>> >>>> >>>> The full output enabled with -mat_mumps_icntl_4 3 looks as in the >>>> attached file. Any hints as to what could be giving this >>>> >>>> error would be very much appreciated. >>>> >>>> >>>> >>>> I do not know how to interpret this output file. mumps developer >>>> would give you better suggestion on it. >>>> >>>> I would appreciate to learn as well :-) >>>> >>>> >>>> >>>> Hong >>>> >>> >>>> >>> >>>> >>> >>>> >> >>>> > >>>> >>>> >>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ling.zou at inl.gov Fri Mar 7 17:45:25 2014 From: ling.zou at inl.gov (Zou (Non-US), Ling) Date: Fri, 7 Mar 2014 16:45:25 -0700 Subject: [petsc-users] snes + matrix free + coloring In-Reply-To: References: <37756.94.66.14.234.1349901124.squirrel@mail.mech.upatras.gr> <56511.94.66.14.234.1349901842.squirrel@mail.mech.upatras.gr> Message-ID: Hi Jed, I was searching a topic on preconditioning in the mail list and found this email. On Wed, Oct 10, 2012 at 2:55 PM, Jed Brown wrote: > Then you aren't using a matrix-free method at all. In case of using SNESDefaultComputeJacobianColor to compute Jacobian matrix and mf_operator command, is this exactly what you are describing ('aren't using a matrix-free method at all')? The SNESDefaultComputeJacobianColor I found in lines 149-155 from http://www.mcs.anl.gov/petsc/petsc-2.3.3/src/snes/examples/tutorials/ex5.c.html Best, Ling > Just don't call SNESSetJacobian and it will be done for you. Are you using > a DM? If not, you will also need to create a MatFDColoring. The easiest way > to do that, if you don't have a structured mesh, is to assemble the nonzero > pattern of your operator. > > > On Wed, Oct 10, 2012 at 3:44 PM, wrote: > >> Is exactly the same. >> >> Kostas >> >> >> >> >> >> >> > On Wed, Oct 10, 2012 at 3:32 PM, wrote: >> > >> >> Dear all >> >> >> >> I have a non-linear system which i want to solve it with matrix free >> >> method. However i must form the whole proconditioning matrix using >> >> finite >> >> differences and coloring method. >> >> >> > >> > How is the preconditioning matrix different from the actual operator? >> > >> > >> >> >> >> I can set the function F() of non-linear system (i take the notation >> F() >> >> from documentation). >> >> >> >> Could you send me exactly the commands and the order of it which i must >> >> use or an apt example for that? >> >> >> >> I have looked for that in internet but i could not find any suitable >> >> example. >> >> >> >> Kostas >> >> >> >> Thanks >> >> >> >> >> > >> >> > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jed at jedbrown.org Fri Mar 7 17:56:18 2014 From: jed at jedbrown.org (Jed Brown) Date: Fri, 07 Mar 2014 16:56:18 -0700 Subject: [petsc-users] snes + matrix free + coloring In-Reply-To: References: <37756.94.66.14.234.1349901124.squirrel@mail.mech.upatras.gr> <56511.94.66.14.234.1349901842.squirrel@mail.mech.upatras.gr> Message-ID: <8738itbm1p.fsf@jedbrown.org> "Zou (Non-US), Ling" writes: > Hi Jed, > > I was searching a topic on preconditioning in the mail list and found this > email. > > > On Wed, Oct 10, 2012 at 2:55 PM, Jed Brown wrote: > >> Then you aren't using a matrix-free method at all. > > > In case of using SNESDefaultComputeJacobianColor to compute Jacobian matrix > and mf_operator command, is this exactly what you are describing ('aren't > using a matrix-free method at all')? > > The SNESDefaultComputeJacobianColor I found in lines 149-155 from > http://www.mcs.anl.gov/petsc/petsc-2.3.3/src/snes/examples/tutorials/ex5.c.html 1. Please upgrade PETSc. I don't think we have any links to these antique man pages, so I assume you specifically changed the version number back rather than landing there accidentally. 2. It is NOT a matrix-free method because coloring is used to assemble the matrix. It usually doesn't make sense to assemble a matrix using coloring, but then not use it to apply the matrix (-snes_mf_operator only uses the assembled matrix for preconditioning). -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From ling.zou at inl.gov Fri Mar 7 18:22:48 2014 From: ling.zou at inl.gov (Zou (Non-US), Ling) Date: Fri, 7 Mar 2014 17:22:48 -0700 Subject: [petsc-users] snes + matrix free + coloring In-Reply-To: <8738itbm1p.fsf@jedbrown.org> References: <37756.94.66.14.234.1349901124.squirrel@mail.mech.upatras.gr> <56511.94.66.14.234.1349901842.squirrel@mail.mech.upatras.gr> <8738itbm1p.fsf@jedbrown.org> Message-ID: On Fri, Mar 7, 2014 at 4:56 PM, Jed Brown wrote: > "Zou (Non-US), Ling" writes: > > > Hi Jed, > > > > I was searching a topic on preconditioning in the mail list and found > this > > email. > > > > > > On Wed, Oct 10, 2012 at 2:55 PM, Jed Brown wrote: > > > >> Then you aren't using a matrix-free method at all. > > > > > > In case of using SNESDefaultComputeJacobianColor to compute Jacobian > matrix > > and mf_operator command, is this exactly what you are describing ('aren't > > using a matrix-free method at all')? > > > > The SNESDefaultComputeJacobianColor I found in lines 149-155 from > > > http://www.mcs.anl.gov/petsc/petsc-2.3.3/src/snes/examples/tutorials/ex5.c.html > > 1. Please upgrade PETSc. I don't think we have any links to these > antique man pages, so I assume you specifically changed the version > number back rather than landing there accidentally. > Hmm... I don't know how come I got this old version document. We are using version 3.4 I believe. I simply did google 'SNESDefaultComputeJacobianColor' and the first search result linked me here. I searched 'SNESDefaultComputeJacobianColor' in the petsc 3.4 manual, and it is not existing. > 2. It is NOT a matrix-free method because coloring is used to assemble > the matrix. It usually doesn't make sense to assemble a matrix using > coloring, but then not use it to apply the matrix (-snes_mf_operator > only uses the assembled matrix for preconditioning). > I agree... Ling -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jed at jedbrown.org Fri Mar 7 18:42:51 2014 From: jed at jedbrown.org (Jed Brown) Date: Fri, 07 Mar 2014 17:42:51 -0700 Subject: [petsc-users] snes + matrix free + coloring In-Reply-To: References: <37756.94.66.14.234.1349901124.squirrel@mail.mech.upatras.gr> <56511.94.66.14.234.1349901842.squirrel@mail.mech.upatras.gr> <8738itbm1p.fsf@jedbrown.org> Message-ID: <87zjl1a5bo.fsf@jedbrown.org> "Zou (Non-US), Ling" writes: >> http://www.mcs.anl.gov/petsc/petsc-2.3.3/src/snes/examples/tutorials/ex5.c.html >> >> 1. Please upgrade PETSc. I don't think we have any links to these >> antique man pages, so I assume you specifically changed the version >> number back rather than landing there accidentally. >> > Hmm... I don't know how come I got this old version document. We are using > version 3.4 I believe. I simply did google > 'SNESDefaultComputeJacobianColor' and the first search result linked me > here. I searched 'SNESDefaultComputeJacobianColor' in the petsc 3.4 manual, > and it is not existing. The function name changed for consistency. http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESComputeJacobianDefaultColor.html http://www.mcs.anl.gov/petsc/documentation/changes/34.html -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From luca.argenti at uam.es Sat Mar 8 13:42:04 2014 From: luca.argenti at uam.es (Luca Argenti) Date: Sat, 8 Mar 2014 20:42:04 +0100 Subject: [petsc-users] Best way to compute the null space of a sparse maximum-rank rectangular matrix Message-ID: <214B9C59-C22F-48CF-A3A5-8AB32327A27D@uam.es> Dear all, I need to evaluate the null space Span{v_i} of a sparse rectangular matrix A \in C^{ N x (N+k)}, A v_i = 0, (1) where i) N is typically very big ( N ~ 500?000 ) and k, in comparison, is very small ( k ~ 500 ). ii) The sub-matrix A_ij i,j<= N is hermitian and non-singular. Equation (1), therefore, has exactly k solutions. iii)The matrix is sparse, with a fill typically <= 3%, and its columns/rows can be reordered in such a way that a very large block, A_ij with i,j > n, & i,j<=N, n << N, is band-diagonal. iv) A has a dominant diagonal. v) For large values of i,j, the number of non-zero diagonals in the central band drop by about an order of magnitude. vi) Finally, this problem must be solved for several (thousands) closely-spaced values of an external parameter Q on which A depends continuously, A = A(Q). Most of the time, therefore, the null space at Q_{i+1} is arguably very close to the null-space at Q_i . My feeling is that this problem is very well defined, and that a parallel sparse iterative method should be able to solve it with no issues or unnecessary operations. Yet, probably because I am not an expert of either PETSc or SLEPc, the two libraries I have considered so far, all the possible solutions that I found seem to provide much more information than needed (thus, consuming much more resources than warranted). For example: is it really necessary to make a sparse LU factorization for the *whole* matrix? In practice, one is looking for the null eigenspace of A^h A. However, SLEPc suggests that this operation is much more expensive than for a sparse A matrix alone (is it so? Shouldn?t Lanczos be implementable at just twice the cost?), or maybe I misinterpreted the user guide. Your suggestions will be greatly appreciated. Thank you so much for your help! 
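One way to exploit property (ii) above (and essentially the block construction suggested later in this thread): with the columns ordered so that A = [B C], where B is the N-by-N Hermitian non-singular block and C the N-by-k remainder, every null vector has the form [-B^{-1} c_j ; e_j] with c_j the j-th column of C, so the whole null space follows from k sparse solves with the same B. A rough sketch, assuming B and C are available as separate Mat objects (the helper name is hypothetical; the four-argument KSPSetOperators and MatGetVecs follow the PETSc 3.4-era interface):

#include <petscksp.h>

/* Upper blocks y_j = -B^{-1} c_j of the null vectors of A = [B C].
   The full j-th null vector is y_j stacked on top of the j-th unit vector e_j. */
PetscErrorCode NullSpaceOfBC(Mat B, Mat C, PetscInt k, Vec y[])
{
  PetscErrorCode ierr;
  KSP            ksp;
  Vec            cj;
  PetscInt       j;

  PetscFunctionBeginUser;
  ierr = MatGetVecs(B, &cj, NULL);CHKERRQ(ierr);        /* MatCreateVecs() in later releases */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, B, B, SAME_PRECONDITIONER);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  for (j = 0; j < k; j++) {
    ierr = MatGetColumnVector(C, cj, j);CHKERRQ(ierr);  /* c_j */
    ierr = MatGetVecs(B, &y[j], NULL);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, cj, y[j]);CHKERRQ(ierr);       /* y_j = B^{-1} c_j; the PC for B is reused */
    ierr = VecScale(y[j], -1.0);CHKERRQ(ierr);
  }
  ierr = VecDestroy(&cj);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Because B is the same for all k right-hand sides, a direct factorization (e.g. -pc_type lu) is built once at the first solve and reused; and since A depends continuously on Q, the y_j computed at Q_i are natural nonzero initial guesses for an iterative solve at Q_{i+1} (KSPSetInitialGuessNonzero).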
Cheers, Luca -- Luca Argenti Departamento de Qu?mica Universidad Aut?noma de Madrid 28049 Madrid, Spain Module 13 Office 308 e-mail: luca.argenti at uam.es tel : +34 914973360 fax: +34 914975238 group homepage: http://www.xchem.uam.es -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Sat Mar 8 16:11:23 2014 From: jroman at dsic.upv.es (Jose E. Roman) Date: Sat, 8 Mar 2014 23:11:23 +0100 Subject: [petsc-users] Best way to compute the null space of a sparse maximum-rank rectangular matrix In-Reply-To: <214B9C59-C22F-48CF-A3A5-8AB32327A27D@uam.es> References: <214B9C59-C22F-48CF-A3A5-8AB32327A27D@uam.es> Message-ID: <7DACA6F4-7895-4AC2-AC61-7D073A6FDF7C@dsic.upv.es> El 08/03/2014, a las 20:42, Luca Argenti escribi?: > Dear all, > > I need to evaluate the null space Span{v_i} of a sparse rectangular matrix A \in C^{ N x (N+k)}, > > A v_i = 0, (1) > > where > > i) N is typically very big ( N ~ 500?000 ) and k, in comparison, is very small ( k ~ 500 ). > ii) The sub-matrix A_ij i,j<= N is hermitian and non-singular. Equation (1), therefore, has exactly k solutions. > iii)The matrix is sparse, with a fill typically <= 3%, and its columns/rows can be reordered in such a way that > a very large block, A_ij with i,j > n, & i,j<=N, n << N, is band-diagonal. > iv) A has a dominant diagonal. > v) For large values of i,j, the number of non-zero diagonals in the central band drop by about an order of magnitude. > vi) Finally, this problem must be solved for several (thousands) closely-spaced values of an external parameter > Q on which A depends continuously, A = A(Q). Most of the time, therefore, the null space at Q_{i+1} is arguably > very close to the null-space at Q_i . > > My feeling is that this problem is very well defined, and that a parallel sparse iterative method should be > able to solve it with no issues or unnecessary operations. Yet, probably because I am not an expert > of either PETSc or SLEPc, the two libraries I have considered so far, all the possible solutions that I found > seem to provide much more information than needed (thus, consuming much more resources than > warranted). For example: is it really necessary to make a sparse LU factorization for the *whole* matrix? > In practice, one is looking for the null eigenspace of A^h A. However, SLEPc suggests that this operation is > much more expensive than for a sparse A matrix alone (is it so? Shouldn?t Lanczos be implementable at just > twice the cost?), or maybe I misinterpreted the user guide. > > Your suggestions will be greatly appreciated. Thank you so much for your help! > > Cheers, > > Luca > The nullspace of A^h A is equal to the right singular space of A corresponding to the zero singular value. It should be possible to compute this with SLEPc's SVD. Computing a large number of zeros may be problematic, so I cannot say in advance if the method will succeed. If you can generate a small matrix with these properties, send it to my personal address (not the list) and I will give it a try. Jose From song.gao2 at mail.mcgill.ca Sat Mar 8 19:46:28 2014 From: song.gao2 at mail.mcgill.ca (Song Gao) Date: Sat, 8 Mar 2014 20:46:28 -0500 Subject: [petsc-users] Is it possible to use -snes_ksp_ew in the framework of KSP? In-Reply-To: References: Message-ID: Thank you all. Most of the difficulties come from the current structure of the program. Its size and "age" make any radical modification a challenge. 
The large uses of global variables and deprecated "goto" statements call for a revision, which however is unlikely to occur, against our better judgement... That being said, the current tools provided byPETSc are sufficient, however not necessarily convenient for our purposes. As a wishful thinking we can say that the implementation of PETSc SNES features into existing codes would be easier if the outer Newton-like loop could be managed by the original code, out of the SNES context. On the other hand, we realize that this might be in contrast with the requirements of PETSc itself. The application of the Eistenstat-Walker method together with a Matrix-Free approach are of key importance. Thus, as kindly suggested by Jed Brown, implementing our own version of EW scheme might turn out to be the only way. We would be happy to provide more details if you think that they might be helpful for the future developments of PETSc. On Fri, Mar 7, 2014 at 2:39 PM, Matthew Knepley wrote: > On Fri, Mar 7, 2014 at 1:21 PM, Song Gao wrote: > >> Hello, >> >> We are working on a legacy codes which solves the NS equations. The codes >> take control of newton iterations itself and use KSP to solve the linear >> system. >> >> We modified the codes, used SNES to control the newton iteration and >> changed the code to matrix free fashion. But the legacy codes did a lot of >> other things between two newton iterations (such as output solution, update >> variables....). I know we could use linesearchpostcheck but it is >> difficulty to do that correctly. Therefore, we decide to go back to the KSP >> framework but still use matrix free. >> > > What makes it difficult? > > Thanks, > > Matt > > >> When using SNES, we always use the runtime option -snes_ksp_ew, we >> observe that for some test cases, the residual stalls without -snes_ksp_ew, >> but converges with-snes_ksp_ew. So I'm thinking if it is possible to >> use -snes_ksp_ew in KSP? Thanks in advance. >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From qiyuelu1 at gmail.com Sun Mar 9 12:16:13 2014 From: qiyuelu1 at gmail.com (Qiyue Lu) Date: Sun, 9 Mar 2014 12:16:13 -0500 Subject: [petsc-users] Functions to return system time Message-ID: Dear all: I have a segment of code in PETSc and am trying to measure its execution time. Is there built-in functions in PETSc returning the current system time? So I can call this function before and after the segment of code then do a 'minus' to get the time consumed? like: A=get_system_time(); ************************* the segment of code ************************* B=get_system_time(); Then B-A is the time I am looking for. Thanks Qiyue Lu -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun Mar 9 12:36:25 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 9 Mar 2014 12:36:25 -0500 Subject: [petsc-users] Functions to return system time In-Reply-To: References: Message-ID: <2CDE3EA3-ECD1-4077-AB25-E93731E88304@mcs.anl.gov> We much recommend using PETSc?s logging for this purpose. If you run with -log_summary it outputs information about the amount of time and percentage of time and floating point operations and messages sent etc for different parts of the code. It also handles summing over all processes. 
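A minimal sketch of this approach (registering custom events and the begin/end calls are spelled out in the next paragraph; the "MyApp"/"MySegment" names are hypothetical, and the registration should happen once, e.g. shortly after PetscInitialize()):

#include <petscsys.h>

/* Time one code segment with PETSc's logging: register an event once,
   bracket the segment with begin/end, and run with -log_summary to get
   time/flops/messages for that event summed over all processes. */
static PetscLogEvent MY_SEGMENT_EVENT;

PetscErrorCode RegisterMyEvent(void)
{
  PetscErrorCode ierr;
  PetscClassId   classid;

  PetscFunctionBeginUser;
  ierr = PetscClassIdRegister("MyApp", &classid);CHKERRQ(ierr);
  ierr = PetscLogEventRegister("MySegment", classid, &MY_SEGMENT_EVENT);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

PetscErrorCode TimedSegment(void)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PetscLogEventBegin(MY_SEGMENT_EVENT, 0, 0, 0, 0);CHKERRQ(ierr);
  /* ... the segment of code to be timed goes here ... */
  ierr = PetscLogEventEnd(MY_SEGMENT_EVENT, 0, 0, 0, 0);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}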
To add your own events use PetscLogEventRegister(), then at the beginning of the piece you want timed use PetscLogEventBegin() and PetscLogEventEnd() at the end. Barry MPI_Wtime() can be used https://www.google.com/search?client=safari&rls=en&q=MPI_Wtime&ie=UTF-8&oe=UTF-8 On Mar 9, 2014, at 12:16 PM, Qiyue Lu wrote: > Dear all: > > I have a segment of code in PETSc and am trying to measure its execution time. Is there built-in functions in PETSc returning the current system time? So I can call this function before and after the segment of code then do a 'minus' to get the time consumed? > > like: > > A=get_system_time(); > > ************************* > the segment of code > ************************* > B=get_system_time(); > > Then B-A is the time I am looking for. > > Thanks > > Qiyue Lu From luca.argenti at uam.es Sun Mar 9 12:36:43 2014 From: luca.argenti at uam.es (Luca Argenti) Date: Sun, 9 Mar 2014 18:36:43 +0100 Subject: [petsc-users] Best way to compute the null space of a sparse maximum-rank rectangular matrix In-Reply-To: <7DACA6F4-7895-4AC2-AC61-7D073A6FDF7C@dsic.upv.es> References: <214B9C59-C22F-48CF-A3A5-8AB32327A27D@uam.es> <7DACA6F4-7895-4AC2-AC61-7D073A6FDF7C@dsic.upv.es> Message-ID: <12D3F600-D849-4A87-AFD3-7FEB1514549A@uam.es> On 08 Mar 2014, at 23:11, Jose E. Roman wrote: > The nullspace of A^h A is equal to the right singular space of A corresponding to the zero singular value. It should be possible to compute this with SLEPc's SVD. Computing a large number of zeros may be problematic, so I cannot say in advance if the method will succeed. If you can generate a small matrix with these properties, send it to my personal address (not the list) and I will give it a try. Dear Jose, thank you for the answer. I thought of using the SVD (my solution would correspond to Eq. 4.4). SLEPc guide, however, focuses on the case of a null space for A* which is much larger than its range, and says that the null space is often not computed at all. Furthermore, of the AA^h and A^hA cases, it says it takes the smallest one. In my case, that would be AA^h, and the right singular space of A would be gone at the outset. I am glad to hear that the calculation is possible. Most probably. however, I?ll need some hints on how to achieve that. Finally, I really appreciate your offer of testing the algorithm; I will try to prepare a test case. Cheers, Luca -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Sun Mar 9 16:41:42 2014 From: jroman at dsic.upv.es (Jose E. Roman) Date: Sun, 9 Mar 2014 22:41:42 +0100 Subject: [petsc-users] Best way to compute the null space of a sparse maximum-rank rectangular matrix In-Reply-To: <12D3F600-D849-4A87-AFD3-7FEB1514549A@uam.es> References: <214B9C59-C22F-48CF-A3A5-8AB32327A27D@uam.es> <7DACA6F4-7895-4AC2-AC61-7D073A6FDF7C@dsic.upv.es> <12D3F600-D849-4A87-AFD3-7FEB1514549A@uam.es> Message-ID: <29656879-D1B9-4ED4-99EB-2B84B5652946@dsic.upv.es> El 09/03/2014, a las 18:36, Luca Argenti escribi?: > On 08 Mar 2014, at 23:11, Jose E. Roman wrote: > >> The nullspace of A^h A is equal to the right singular space of A corresponding to the zero singular value. It should be possible to compute this with SLEPc's SVD. Computing a large number of zeros may be problematic, so I cannot say in advance if the method will succeed. If you can generate a small matrix with these properties, send it to my personal address (not the list) and I will give it a try. > > Dear Jose, thank you for the answer. 
I thought of using the SVD (my solution would correspond to Eq. 4.4). > SLEPc guide, however, focuses on the case of a null space for A* which is much larger than its range, and > says that the null space is often not computed at all. Furthermore, of the AA^h and A^hA cases, it says it > takes the smallest one. In my case, that would be AA^h, and the right singular space of A would be gone at > the outset. I am glad to hear that the calculation is possible. Most probably. however, I?ll need some hints on > how to achieve that. Finally, I really appreciate your offer of testing the algorithm; I will try to prepare a test case. > > Cheers, > > Luca > Yes, that is true. Maybe it is possible to add an option to workaround this. I will check. The 'cyclic' solver does not have this problem, but may be difficult to compute the null space in that setting. An alternative is, if you are able to permute the columns in a way that A = [ B C ] with B square and non-singular, then the nullspace will be [ B^{-1}*C ] [ I_k ] The difficult part here is to do the permutation, but you said your A matrix is already in this form. Jose From luca.argenti at uam.es Sun Mar 9 16:52:30 2014 From: luca.argenti at uam.es (Luca Argenti) Date: Sun, 9 Mar 2014 22:52:30 +0100 Subject: [petsc-users] Best way to compute the null space of a sparse maximum-rank rectangular matrix In-Reply-To: <29656879-D1B9-4ED4-99EB-2B84B5652946@dsic.upv.es> References: <214B9C59-C22F-48CF-A3A5-8AB32327A27D@uam.es> <7DACA6F4-7895-4AC2-AC61-7D073A6FDF7C@dsic.upv.es> <12D3F600-D849-4A87-AFD3-7FEB1514549A@uam.es> <29656879-D1B9-4ED4-99EB-2B84B5652946@dsic.upv.es> Message-ID: <86786CB4-714E-49BF-B906-0150570656B3@uam.es> > Yes, that is true. Maybe it is possible to add an option to workaround this. I will check. The 'cyclic' solver does not have this problem, but may be difficult to compute the null space in that setting. > > An alternative is, if you are able to permute the columns in a way that A = [ B C ] with B square and non-singular, then the nullspace will be > [ B^{-1}*C ] > [ I_k ] > The difficult part here is to do the permutation, but you said your A matrix is already in this form. This is another possibility. In fact, the matrix B you indicate is diagonalizable with the same unitary transformation for all the values of the external parameter Q. This is the method of choice when we can diagonalize the matrix. If one is not able to make the diagonalization, then I imagine this solution is still viable, but I don?t know how to get an inexpensive way of approximating the action of B^{-1} on C. Also, for some parameters, there may be a couple of eigenvalues of B that get very very close to zero. From m.bahaa.eldein at gmail.com Mon Mar 10 09:36:47 2014 From: m.bahaa.eldein at gmail.com (Mohammad Bahaa) Date: Mon, 10 Mar 2014 16:36:47 +0200 Subject: [petsc-users] Using PETSC for a CFD solver Message-ID: Hi, I'm pretty new to PETSC, so pardon me if the question is primitive somehow, I used *METIS *to partition my grid (represented by a system of linear equations Ax=b) to a number of sub-grids, say 4 sub-grids, with 4 different systems of linear Equations (A1x1=b1, A2x2=b2, ...), can anyone post an example showing how to solve these "n" sub-systems simultaneously, I've tried the following program, but it's not working correctly, as when I use *MatGetOwnershipRange *in each process I find that A1 ownership range is 1/4 the matrix size for the first process, while it should be all of it. subroutine test_drive_2 implicit none ! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Include files ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! ! This program uses CPP for preprocessing, as indicated by the use of ! PETSc include files in the directory petsc/include/finclude. This ! convention enables use of the CPP preprocessor, which allows the use ! of the #include statements that define PETSc objects and variables. ! ! Use of the conventional Fortran include statements is also supported ! In this case, the PETsc include files are located in the directory ! petsc/include/foldinclude. ! ! Since one must be very careful to include each file no more than once ! in a Fortran routine, application programmers must exlicitly list ! each file needed for the various PETSc components within their ! program (unlike the C/C++ interface). ! ! See the Fortran section of the PETSc users manual for details. ! ! The following include statements are required for KSP Fortran programs: ! petscsys.h - base PETSc routines ! petscvec.h - vectors ! petscmat.h - matrices ! petscksp.h - Krylov subspace methods ! petscpc.h - preconditioners ! Other include statements may be needed if using additional PETSc ! routines in a Fortran program, e.g., ! petscviewer.h - viewers ! petscis.h - index sets ! ! main includes #include #include #include #include #include ! includes for F90 specific functions #include #include #include #include ! ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Variable declarations ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! ! Variables: ! ksp - linear solver context ! ksp - Krylov subspace method context ! pc - preconditioner context ! x, b, u - approx solution, right-hand-side, exact solution vectors ! A - matrix that defines linear system ! its - iterations for convergence ! norm - norm of error in solution ! Vec x,b,u Mat A KSP ksp PC pc PetscReal norm,tol PetscErrorCode ierr PetscInt i,n,col(3),its,i1,i2,i3 PetscBool flg PetscMPIInt size,rank PetscScalar none,one,value(3), testa PetscScalar, pointer :: xx_v(:) PetscScalar, allocatable, dimension(:) :: myx PetscOffset i_x !real(4) :: myx(10), myu(10), myb(10) !real(8), allocatable, dimension(:) :: myx integer :: ic, nc, ncmax, nz, ncols, j integer :: fileunit, ione integer, allocatable, dimension(:,:) :: neighb, cols integer, allocatable, dimension(:) :: nnz, vcols real(8), allocatable, dimension(:,:) :: acoef, vals real(8), allocatable, dimension(:) :: ap, su, vvals character (len=100) :: rankstring, filename, folder real(8) :: atol, rtol, dtol integer :: mxit, istart, iend real(8) :: rvar, minvalx call PetscInitialize(PETSC_NULL_CHARACTER,ierr) call MPI_Comm_size(PETSC_COMM_WORLD,size,ierr) call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr) ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Load the linear system ! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - !mbs read data file for experimentation !if(rank.EQ.0) read(*,'(A)'), folder folder = '60' write(rankstring,'(I)'), rank filename = trim(adjustl(folder)) // '/linearsys_' // trim(adjustl(rankstring)) // '.txt' fileunit = 9000 + rank open(unit=fileunit, file=trim(filename)) read(fileunit,*), ncmax read(fileunit,*), nc read(fileunit,*), allocate( neighb(6,ncmax), acoef(6,ncmax), ap(ncmax), su(ncmax) ) allocate( nnz(0:ncmax-1), cols(6,0:ncmax-1), vals(6,0:ncmax-1) ) allocate( vcols(0:ncmax-1), vvals(0:ncmax-1) ) !allocate( xx_v(ncmax), myx(ncmax) ) !allocate( xx_v(0:ncmax), myx(0:ncmax) ) allocate( myx(ncmax) ) do i=1,nc read(fileunit,'(I)'), ic read(fileunit,'(6I)'), ( neighb(j,ic), j=1,6 ) read(fileunit,'(6F)'), ( acoef(j,ic), j=1,6 ) read(fileunit,'(2F)'), ap(ic), su(ic) read(fileunit,*), enddo close(fileunit) nz = 7 nnz = 0 do ic=0,nc-1 ! values for coefficient matrix (diagonal) nnz(ic) = nnz(ic) + 1 cols(nnz(ic),ic) = ic vals(nnz(ic),ic) = ap(ic+1) ! values for coefficient matrix (off diagonal) do j=1,6 if(neighb(j,ic+1).GT.0)then nnz(ic) = nnz(ic) + 1 cols(nnz(ic),ic) = neighb(j,ic+1) - 1 vals(nnz(ic),ic) = acoef(j,ic+1) endif enddo ! values for RHS vcols(ic) = ic vvals(ic) = su(ic+1) enddo ! add dummy values for remaining rows (if any) if(ncmax.GT.nc)then do ic=nc,ncmax-1 ! coeff matrix nnz(ic) = 1 cols(nnz(ic),ic) = ic vals(nnz(ic),ic) = 1.0 ! RHS vcols(ic) = ic vvals(ic) = 0.0 enddo endif print*, 'rank', rank, 'says nc is', nc ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Beginning of program ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - !if (size .ne. 1) then ! call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr) ! if (rank .eq. 0) then ! write(6,*) 'This is a uniprocessor example only!' ! endif ! SETERRQ(PETSC_COMM_WORLD,1,' ',ierr) !endif ione = 1 none = -1.0 one = 1.0 n = ncmax i1 = 1 i2 = 2 i3 = 3 call PetscOptionsGetInt(PETSC_NULL_CHARACTER,'-n',n,flg,ierr) ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Compute the matrix and right-hand-side vector that define ! the linear system, Ax = b. ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Create matrix. When using MatCreate(), the matrix format can ! be specified at runtime. call MatCreate(PETSC_COMM_WORLD,A,ierr) !call MatCreateSeqAij(PETSC_COMM_SELF,A,ierr) call MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n,ierr) call MatSetFromOptions(A,ierr) call MatSetUp(A,ierr) call MatGetOwnershipRange(A,istart,iend,ierr) print*, rank, istart, iend ! Assemble matrix. ! - Note that MatSetValues() uses 0-based row and column numbers ! in Fortran as well as in C (as set here in the array "col"). ! value(1) = -1.0 ! value(2) = 2.0 ! value(3) = -1.0 ! do 50 i=1,n-2 ! col(1) = i-1 ! col(2) = i ! col(3) = i+1 ! call MatSetValues(A,i1,i,i3,col,value,INSERT_VALUES,ierr) !50 continue ! i = n - 1 ! col(1) = n - 2 ! col(2) = n - 1 ! call MatSetValues(A,i1,i,i2,col,value,INSERT_VALUES,ierr) ! i = 0 ! col(1) = 0 ! col(2) = 1 ! value(1) = 2.0 ! value(2) = -1.0 ! call MatSetValues(A,i1,i,i2,col,value,INSERT_VALUES,ierr) ! call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) ! call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) do ic=0,ncmax-1 call MatSetValues(A,ione,ic,nnz(ic),cols(1:nnz(ic),ic),vals(1:nnz(ic),ic),INSERT_VALUES,ierr) enddo call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) ! Create vectors. 
Note that we form 1 vector from scratch and ! then duplicate as needed. call VecCreate(PETSC_COMM_WORLD,x,ierr) !call VecCreateSeq(PETSC_COMM_SELF,x,ierr) call VecSetSizes(x,PETSC_DECIDE,n,ierr) call VecSetFromOptions(x,ierr) call VecDuplicate(x,b,ierr) call VecDuplicate(x,u,ierr) ! Set exact solution; then compute right-hand-side vector. call VecSet(u,one,ierr) call VecSet(x,one*10,ierr) !call MatMult(A,u,b,ierr) ! set source terms vector call VecSetValues(b,ncmax,vcols,vvals,INSERT_VALUES,ierr) call VecAssemblyBegin(b,ierr) call VecAssemblyEnd(b,ierr) ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Create the linear solver and set various options ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Create linear solver context call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) ! Set operators. Here the matrix that defines the linear system ! also serves as the preconditioning matrix. call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) ! Set linear solver defaults for this problem (optional). ! - By extracting the KSP and PC contexts from the KSP context, ! we can then directly directly call any KSP and PC routines ! to set various options. ! - The following four statements are optional; all of these ! parameters could alternatively be specified at runtime via ! KSPSetFromOptions(); call KSPGetPC(ksp,pc,ierr) call PCSetType(pc,PCJACOBI,ierr) atol = 1.d-12 rtol = 1.d-12 dtol = 1.d10 mxit = 100 ! call KSPSetTolerances(ksp,tol,PETSC_DEFAULT_DOUBLE_PRECISION, & ! & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) call KSPSetTolerances(ksp,atol,rtol,dtol,mxit,ierr) ! Set runtime options, e.g., ! -ksp_type -pc_type -ksp_monitor -ksp_rtol ! These options will override those specified above as long as ! KSPSetFromOptions() is called _after_ any other customization ! routines. call KSPSetFromOptions(ksp,ierr) ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Solve the linear system ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - call KSPSetType(ksp,KSPBCGS,ierr) call KSPSolve(ksp,b,x,ierr) ! View solver info; we could instead use the option -ksp_view !call KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD,ierr) ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Check solution and clean up ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - call VecGetArrayF90(x,xx_v,ierr) !xx_v = 5.1d0 !do ic=1,ncmax ! myx(ic) = xx_v(ic) !enddo myx = xx_v !call VecGetArray(x,myx,i_x,ierr) !value = x_array(i_x + 1) !call VecRestoreArray(x,myx,i_x,ierr) !rvar = xx_v(3) call VecRestoreArrayF90(x,xx_v,ierr) ! !call VecGetArrayF90(b,xx_v,ierr) !myb = xx_v !call VecRestoreArrayF90(x,xx_v,ierr) ! !call VecView(x,PETSC_VIEWER_STDOUT_SELF) !call MatView(a,PETSC_VIEWER_STDOUT_SELF) !print*, 'rank', rank, 'says max x is', maxval(myx) ! print*, xx_v ! Check the error call MatMult(A,x,u,ierr) call VecAXPY(u,none,b,ierr) call VecNorm(u,NORM_2,norm,ierr) call KSPGetIterationNumber(ksp,its,ierr) if (norm .gt. 1.e-12) then write(6,100) norm,its else write(6,200) its endif 100 format('Norm of error = ',e11.4,', Iterations = ',i5) 200 format('Norm of error < 1.e-12,Iterations = ',i5) !call KSPGetSolution(ksp,myx) minvalx = 1.0e15 do ic=1,ncmax if(myx(ic).LT.minvalx) minvalx = myx(ic) enddo !write(*,300), rank, maxval(myx(1:nc)), minvalx if(rank.EQ.0) print*, myx 300 format('Rank ', I1, ' says max/min x are: ', F12.4, ' / ', F12.4) ! Free work space. 
All PETSc objects should be destroyed when they ! are no longer needed. call VecDestroy(x,ierr) call VecDestroy(u,ierr) call VecDestroy(b,ierr) call MatDestroy(A,ierr) call KSPDestroy(ksp,ierr) call PetscFinalize(ierr) continue end -- Mohamamd Bahaa ElDin -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Mar 10 09:50:19 2014 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 10 Mar 2014 09:50:19 -0500 Subject: [petsc-users] Using PETSC for a CFD solver In-Reply-To: References: Message-ID: On Mon, Mar 10, 2014 at 9:36 AM, Mohammad Bahaa wrote: > Hi, > > I'm pretty new to PETSC, so pardon me if the question is primitive > somehow, I used *METIS *to partition my grid (represented by a system of > linear equations Ax=b) to a number of sub-grids, say 4 sub-grids, with 4 > different systems of linear Equations (A1x1=b1, A2x2=b2, ...), can anyone > post an example showing how to solve these "n" sub-systems simultaneously, > I've tried the following program, but it's not working correctly, as when I > use *MatGetOwnershipRange *in each process I find that A1 ownership range > is 1/4 the matrix size for the first process, while it should be all of it. > I will answer this two ways. First, here is the "PETSc strategy" for doing the same thing. 1) Write a code that assembles and solves the entire thing 2) Use -pc_type bjacobi -ksp_type preonly -sub_ksp_type -mat 3) You can use ParMetis inside PETSc with the MatPartitioning This will solve the individual systems with no coupling (this is what it sounds like you want above). If you want to manage everything yourself, and you want to form individual systems on every process, just create the solvers using a smaller communicator. PETSC_COMM_SELF means that every system is serial. You can make smaller comms with MPI_Comm_split() if you want smaller comms, but some parallelism for each system. Thanks, Matt > subroutine test_drive_2 > > implicit none > > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! Include files > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! > ! This program uses CPP for preprocessing, as indicated by the use of > ! PETSc include files in the directory petsc/include/finclude. This > ! convention enables use of the CPP preprocessor, which allows the use > ! of the #include statements that define PETSc objects and variables. > ! > ! Use of the conventional Fortran include statements is also supported > ! In this case, the PETsc include files are located in the directory > ! petsc/include/foldinclude. > ! > ! Since one must be very careful to include each file no more than once > ! in a Fortran routine, application programmers must exlicitly list > ! each file needed for the various PETSc components within their > ! program (unlike the C/C++ interface). > ! > ! See the Fortran section of the PETSc users manual for details. > ! > ! The following include statements are required for KSP Fortran programs: > ! petscsys.h - base PETSc routines > ! petscvec.h - vectors > ! petscmat.h - matrices > ! petscksp.h - Krylov subspace methods > ! petscpc.h - preconditioners > ! Other include statements may be needed if using additional PETSc > ! routines in a Fortran program, e.g., > ! petscviewer.h - viewers > ! petscis.h - index sets > ! > > ! main includes > #include > #include > #include > #include > #include > > ! includes for F90 specific functions > #include > #include > #include > #include > ! > ! 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! Variable declarations > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! > ! Variables: > ! ksp - linear solver context > ! ksp - Krylov subspace method context > ! pc - preconditioner context > ! x, b, u - approx solution, right-hand-side, exact solution vectors > ! A - matrix that defines linear system > ! its - iterations for convergence > ! norm - norm of error in solution > ! > Vec x,b,u > Mat A > KSP ksp > PC pc > PetscReal norm,tol > PetscErrorCode ierr > PetscInt i,n,col(3),its,i1,i2,i3 > PetscBool flg > PetscMPIInt size,rank > PetscScalar none,one,value(3), testa > > PetscScalar, pointer :: xx_v(:) > PetscScalar, allocatable, dimension(:) :: myx > PetscOffset i_x > > !real(4) :: myx(10), myu(10), myb(10) > !real(8), allocatable, dimension(:) :: myx > > integer :: ic, nc, ncmax, nz, ncols, j > integer :: fileunit, ione > integer, allocatable, dimension(:,:) :: neighb, cols > integer, allocatable, dimension(:) :: nnz, vcols > real(8), allocatable, dimension(:,:) :: acoef, vals > real(8), allocatable, dimension(:) :: ap, su, vvals > character (len=100) :: rankstring, filename, folder > > real(8) :: atol, rtol, dtol > integer :: mxit, istart, iend > real(8) :: rvar, minvalx > > > call PetscInitialize(PETSC_NULL_CHARACTER,ierr) > call MPI_Comm_size(PETSC_COMM_WORLD,size,ierr) > call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr) > > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! Load the linear system > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > > !mbs read data file for experimentation > !if(rank.EQ.0) read(*,'(A)'), folder > folder = '60' > write(rankstring,'(I)'), rank > filename = trim(adjustl(folder)) // '/linearsys_' // > trim(adjustl(rankstring)) // '.txt' > fileunit = 9000 + rank > open(unit=fileunit, file=trim(filename)) > > read(fileunit,*), ncmax > read(fileunit,*), nc > read(fileunit,*), > > allocate( neighb(6,ncmax), acoef(6,ncmax), ap(ncmax), su(ncmax) ) > allocate( nnz(0:ncmax-1), cols(6,0:ncmax-1), vals(6,0:ncmax-1) ) > allocate( vcols(0:ncmax-1), vvals(0:ncmax-1) ) > !allocate( xx_v(ncmax), myx(ncmax) ) > !allocate( xx_v(0:ncmax), myx(0:ncmax) ) > allocate( myx(ncmax) ) > > do i=1,nc > read(fileunit,'(I)'), ic > read(fileunit,'(6I)'), ( neighb(j,ic), j=1,6 ) > read(fileunit,'(6F)'), ( acoef(j,ic), j=1,6 ) > read(fileunit,'(2F)'), ap(ic), su(ic) > read(fileunit,*), > enddo > > close(fileunit) > > nz = 7 > nnz = 0 > do ic=0,nc-1 > > ! values for coefficient matrix (diagonal) > nnz(ic) = nnz(ic) + 1 > cols(nnz(ic),ic) = ic > vals(nnz(ic),ic) = ap(ic+1) > > ! values for coefficient matrix (off diagonal) > do j=1,6 > if(neighb(j,ic+1).GT.0)then > nnz(ic) = nnz(ic) + 1 > cols(nnz(ic),ic) = neighb(j,ic+1) - 1 > vals(nnz(ic),ic) = acoef(j,ic+1) > endif > enddo > > ! values for RHS > vcols(ic) = ic > vvals(ic) = su(ic+1) > > enddo > > ! add dummy values for remaining rows (if any) > if(ncmax.GT.nc)then > do ic=nc,ncmax-1 > ! coeff matrix > nnz(ic) = 1 > cols(nnz(ic),ic) = ic > vals(nnz(ic),ic) = 1.0 > ! RHS > vcols(ic) = ic > vvals(ic) = 0.0 > enddo > endif > > print*, 'rank', rank, 'says nc is', nc > > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! Beginning of program > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > > !if (size .ne. 1) then > ! call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr) > ! if (rank .eq. 0) then > ! 
write(6,*) 'This is a uniprocessor example only!' > ! endif > ! SETERRQ(PETSC_COMM_WORLD,1,' ',ierr) > !endif > > ione = 1 > none = -1.0 > one = 1.0 > n = ncmax > i1 = 1 > i2 = 2 > i3 = 3 > call PetscOptionsGetInt(PETSC_NULL_CHARACTER,'-n',n,flg,ierr) > > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! Compute the matrix and right-hand-side vector that define > ! the linear system, Ax = b. > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > > ! Create matrix. When using MatCreate(), the matrix format can > ! be specified at runtime. > > call MatCreate(PETSC_COMM_WORLD,A,ierr) > !call MatCreateSeqAij(PETSC_COMM_SELF,A,ierr) > call MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n,ierr) > call MatSetFromOptions(A,ierr) > call MatSetUp(A,ierr) > > call MatGetOwnershipRange(A,istart,iend,ierr) > print*, rank, istart, iend > > ! Assemble matrix. > ! - Note that MatSetValues() uses 0-based row and column numbers > ! in Fortran as well as in C (as set here in the array "col"). > > ! value(1) = -1.0 > ! value(2) = 2.0 > ! value(3) = -1.0 > ! do 50 i=1,n-2 > ! col(1) = i-1 > ! col(2) = i > ! col(3) = i+1 > ! call MatSetValues(A,i1,i,i3,col,value,INSERT_VALUES,ierr) > !50 continue > ! i = n - 1 > ! col(1) = n - 2 > ! col(2) = n - 1 > ! call MatSetValues(A,i1,i,i2,col,value,INSERT_VALUES,ierr) > ! i = 0 > ! col(1) = 0 > ! col(2) = 1 > ! value(1) = 2.0 > ! value(2) = -1.0 > ! call MatSetValues(A,i1,i,i2,col,value,INSERT_VALUES,ierr) > ! call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) > ! call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) > > do ic=0,ncmax-1 > call > MatSetValues(A,ione,ic,nnz(ic),cols(1:nnz(ic),ic),vals(1:nnz(ic),ic),INSERT_VALUES,ierr) > enddo > > call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) > call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) > > ! Create vectors. Note that we form 1 vector from scratch and > ! then duplicate as needed. > > call VecCreate(PETSC_COMM_WORLD,x,ierr) > !call VecCreateSeq(PETSC_COMM_SELF,x,ierr) > call VecSetSizes(x,PETSC_DECIDE,n,ierr) > call VecSetFromOptions(x,ierr) > call VecDuplicate(x,b,ierr) > call VecDuplicate(x,u,ierr) > > ! Set exact solution; then compute right-hand-side vector. > > call VecSet(u,one,ierr) > call VecSet(x,one*10,ierr) > !call MatMult(A,u,b,ierr) > > ! set source terms vector > call VecSetValues(b,ncmax,vcols,vvals,INSERT_VALUES,ierr) > call VecAssemblyBegin(b,ierr) > call VecAssemblyEnd(b,ierr) > > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! Create the linear solver and set various options > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > > ! Create linear solver context > > call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) > > ! Set operators. Here the matrix that defines the linear system > ! also serves as the preconditioning matrix. > > call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) > > ! Set linear solver defaults for this problem (optional). > ! - By extracting the KSP and PC contexts from the KSP context, > ! we can then directly directly call any KSP and PC routines > ! to set various options. > ! - The following four statements are optional; all of these > ! parameters could alternatively be specified at runtime via > ! KSPSetFromOptions(); > > call KSPGetPC(ksp,pc,ierr) > call PCSetType(pc,PCJACOBI,ierr) > atol = 1.d-12 > rtol = 1.d-12 > dtol = 1.d10 > mxit = 100 > ! call KSPSetTolerances(ksp,tol,PETSC_DEFAULT_DOUBLE_PRECISION, & > ! 
& PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) > > call KSPSetTolerances(ksp,atol,rtol,dtol,mxit,ierr) > > ! Set runtime options, e.g., > ! -ksp_type -pc_type -ksp_monitor -ksp_rtol > ! These options will override those specified above as long as > ! KSPSetFromOptions() is called _after_ any other customization > ! routines. > > call KSPSetFromOptions(ksp,ierr) > > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! Solve the linear system > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > > call KSPSetType(ksp,KSPBCGS,ierr) > > call KSPSolve(ksp,b,x,ierr) > > ! View solver info; we could instead use the option -ksp_view > > !call KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD,ierr) > > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! Check solution and clean up > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > > call VecGetArrayF90(x,xx_v,ierr) > !xx_v = 5.1d0 > !do ic=1,ncmax > ! myx(ic) = xx_v(ic) > !enddo > myx = xx_v > !call VecGetArray(x,myx,i_x,ierr) > !value = x_array(i_x + 1) > !call VecRestoreArray(x,myx,i_x,ierr) > !rvar = xx_v(3) > call VecRestoreArrayF90(x,xx_v,ierr) > ! > !call VecGetArrayF90(b,xx_v,ierr) > !myb = xx_v > !call VecRestoreArrayF90(x,xx_v,ierr) > ! > !call VecView(x,PETSC_VIEWER_STDOUT_SELF) > !call MatView(a,PETSC_VIEWER_STDOUT_SELF) > > !print*, 'rank', rank, 'says max x is', maxval(myx) > ! print*, xx_v > > ! Check the error > > call MatMult(A,x,u,ierr) > call VecAXPY(u,none,b,ierr) > call VecNorm(u,NORM_2,norm,ierr) > call KSPGetIterationNumber(ksp,its,ierr) > > if (norm .gt. 1.e-12) then > write(6,100) norm,its > else > write(6,200) its > endif > 100 format('Norm of error = ',e11.4,', Iterations = ',i5) > 200 format('Norm of error < 1.e-12,Iterations = ',i5) > > !call KSPGetSolution(ksp,myx) > > minvalx = 1.0e15 > do ic=1,ncmax > if(myx(ic).LT.minvalx) minvalx = myx(ic) > enddo > > !write(*,300), rank, maxval(myx(1:nc)), minvalx > > if(rank.EQ.0) print*, myx > > 300 format('Rank ', I1, ' says max/min x are: ', F12.4, ' / ', F12.4) > > ! Free work space. All PETSc objects should be destroyed when they > ! are no longer needed. > > call VecDestroy(x,ierr) > call VecDestroy(u,ierr) > call VecDestroy(b,ierr) > call MatDestroy(A,ierr) > call KSPDestroy(ksp,ierr) > call PetscFinalize(ierr) > > continue > > end > > -- > Mohamamd Bahaa ElDin > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From fd.kong at siat.ac.cn Mon Mar 10 23:29:20 2014 From: fd.kong at siat.ac.cn (Fande Kong) Date: Mon, 10 Mar 2014 22:29:20 -0600 Subject: [petsc-users] How to reuse a snes object? Message-ID: Hi all, I am trying to solve two different nonlinear equations. I first solve the first nonlinear equation using a snes within which a ksp is preconditioned by a pcasm. And then I want to use the solution of the first nonlinear equation as an initial guess to solve the second nonlinear equation, but this time I want to switch the preconditioner to pcmg. How could I finish this by creating snes only once, then using snessolve() twice, and switching preconditioner for linear solvers? Fande -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Mon Mar 10 23:35:51 2014 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 10 Mar 2014 23:35:51 -0500 Subject: [petsc-users] How to reuse a snes object? In-Reply-To: References: Message-ID: On Mon, Mar 10, 2014 at 11:29 PM, Fande Kong wrote: > Hi all, > > I am trying to solve two different nonlinear equations. I first solve the > first nonlinear equation using a snes within which a ksp is preconditioned > by a pcasm. And then I want to use the solution of the first nonlinear > equation as an initial guess to solve the second nonlinear equation, but > this time I want to switch the preconditioner to pcmg. How could I finish > this by creating snes only once, then using snessolve() twice, and > switching preconditioner for linear solvers? > It seems like you gain nothing at all by reusing the SNES, and complicate your code. I would not do it. If you insist, you can a) Change the prefix, and call SNESSetFromOptions() before the second solve b) Pull out the PC (SNESGetKSP, SNESGetPC) and set it there Note that in any event, you will have to reset the function/Jacobian methods for the second solve. Thanks, Matt > Fande > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 11 07:41:42 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 11 Mar 2014 07:41:42 -0500 Subject: [petsc-users] How to reuse a snes object? In-Reply-To: References: Message-ID: On Tue, Mar 11, 2014 at 12:46 AM, Fande Kong wrote: > Thanks, > Never mail me personally. That is what the list is for. > If we follow a), we can not change any options from command line for the > second solve. > No, you miss the whole point of changing the prefix: SNESSetOptionsPrefix(snes, "second_"); so that afterwards second_pc_type lu will be picked up by the second call to SNESSetFromOptions(). However, as I said before, I think there is no reason to do this. Matt > On Mon, Mar 10, 2014 at 10:35 PM, Matthew Knepley wrote: > >> On Mon, Mar 10, 2014 at 11:29 PM, Fande Kong wrote: >> >>> Hi all, >>> >>> I am trying to solve two different nonlinear equations. I first solve >>> the first nonlinear equation using a snes within which a ksp is >>> preconditioned by a pcasm. And then I want to use the solution of the first >>> nonlinear equation as an initial guess to solve the second nonlinear >>> equation, but this time I want to switch the preconditioner to pcmg. How >>> could I finish this by creating snes only once, then using snessolve() >>> twice, and switching preconditioner for linear solvers? >>> >> >> It seems like you gain nothing at all by reusing the SNES, and complicate >> your code. I would not do it. >> >> If you insist, you can >> >> a) Change the prefix, and call SNESSetFromOptions() before the second >> solve >> >> b) Pull out the PC (SNESGetKSP, SNESGetPC) and set it there >> >> Note that in any event, you will have to reset the function/Jacobian >> methods for the second solve. >> >> Thanks, >> >> Matt >> >> >>> Fande >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. 
>> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From luchao at mail.iggcas.ac.cn Tue Mar 11 08:37:52 2014 From: luchao at mail.iggcas.ac.cn (=?GBK?B?wsCzrA==?=) Date: Tue, 11 Mar 2014 21:37:52 +0800 (GMT+08:00) Subject: [petsc-users] petsc malloc multidimensional array Message-ID: <9731bb.23bae.144b15d1aad.Coremail.luchao@mail.iggcas.ac.cn> Hi, recently,when I use PETSc to bulid 2d arrays such as PetscScalar A[512][512],B[512][512],C,D,E,F,..., program always has error of Segmentation Violation. So I want to use PetscMalloc to bulid 2d array, and I hope that I can also use these 2d array A[i][j] by subscripts as before. Could do please tell me how can I do? Thank you. LV CHAO 2014/3/11 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 11 08:41:21 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 11 Mar 2014 08:41:21 -0500 Subject: [petsc-users] petsc malloc multidimensional array In-Reply-To: <9731bb.23bae.144b15d1aad.Coremail.luchao@mail.iggcas.ac.cn> References: <9731bb.23bae.144b15d1aad.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: On Tue, Mar 11, 2014 at 8:37 AM, ?? wrote: > Hi, recently,when I use PETSc to bulid 2d arrays such as PetscScalar > A[512][512],B[512][512],C,D,E,F,..., program always has error of > Segmentation Violation. So I want to use PetscMalloc to bulid 2d array, and > I hope that I can also use these 2d array A[i][j] by subscripts as before. > Could do please tell me how can I do? Thank you. > 1) This is a C question, not a PETSc question 2) If you are using 2D arrays for fields, you should be using the DMDA, which has a section in the manual Thanks, Matt > LV CHAO > > 2014/3/11 > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From luchao at mail.iggcas.ac.cn Tue Mar 11 09:05:42 2014 From: luchao at mail.iggcas.ac.cn (=?UTF-8?B?5ZCV6LaF?=) Date: Tue, 11 Mar 2014 22:05:42 +0800 (GMT+08:00) Subject: [petsc-users] petsc malloc multidimensional array In-Reply-To: References: <9731bb.23bae.144b15d1aad.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: <3c3192.23cc0.144b17694a0.Coremail.luchao@mail.iggcas.ac.cn> Hi, Matthew: Thank you for your reply so fast! but I also have some questions: 2d arrays I used is just intermediate variable, not for fields, and the fields is used Vector. In Finite element method, when I use element stiffness matrix to assemble global stiffness matrix, I always first compute the 2d element stiffness matrixs whose size is 512*512(inner points in element),so big for static arrays. So i want to use PetscMalloc to bulid 2d arrays to store element stiffness matrixs' values. And I don't know how to do, could do tell me? then use another 1d array to abstract the nonzero values from 2d arrays above-mentioned. could do you please tell me some other methods much more convenient and faster? -----????----- ???: "Matthew Knepley" ????: 2014?3?11? ??? ???: "??" ??: petsc-users ??: Re: [petsc-users] petsc malloc multidimensional array On Tue, Mar 11, 2014 at 8:37 AM, ?? 
wrote: Hi, recently,when I use PETSc to bulid 2d arrays such as PetscScalar A[512][512],B[512][512],C,D,E,F,..., program always has error of Segmentation Violation. So I want to use PetscMalloc to bulid 2d array, and I hope that I can also use these 2d array A[i][j] by subscripts as before. Could do please tell me how can I do? Thank you. 1) This is a C question, not a PETSc question 2) If you are using 2D arrays for fields, you should be using the DMDA, which has a section in the manual Thanks, Matt LV CHAO 2014/3/11 -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 11 09:22:34 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 11 Mar 2014 09:22:34 -0500 Subject: [petsc-users] petsc malloc multidimensional array In-Reply-To: <3c3192.23cc0.144b17694a0.Coremail.luchao@mail.iggcas.ac.cn> References: <9731bb.23bae.144b15d1aad.Coremail.luchao@mail.iggcas.ac.cn> <3c3192.23cc0.144b17694a0.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: On Tue, Mar 11, 2014 at 9:05 AM, ?? wrote: > Hi, Matthew: > > Thank you for your reply so fast! but I also have some questions: > > 2d arrays I used is just intermediate variable, not for fields, and > the fields is used Vector. In Finite element method, when I use element > stiffness matrix to assemble global stiffness matrix, I always first > compute the 2d element stiffness matrixs whose size is 512*512(inner points > in element),so big for static arrays. So i want to use PetscMalloc to bulid > 2d arrays to store element stiffness matrixs' values. And I don't know how > to do, could do tell me? > AGAIN, these are C questions. You could a) malloc() each row b) malloc() the whole thing, and make pointers to each row Notice that this is enormous for an element matrix. This is likely not optimal for performance. Matt > then use another 1d array to abstract the nonzero values from 2d > arrays above-mentioned. could do you please tell me some other methods > much more convenient and faster? > > > -----????----- > *???:* "Matthew Knepley" > *????:* 2014?3?11? ??? > *???:* "??" > *??:* petsc-users > *??:* Re: [petsc-users] petsc malloc multidimensional array > > On Tue, Mar 11, 2014 at 8:37 AM, ?? wrote: > >> Hi, recently,when I use PETSc to bulid 2d arrays such as PetscScalar >> A[512][512],B[512][512],C,D,E,F,..., program always has error of >> Segmentation Violation. So I want to use PetscMalloc to bulid 2d array, and >> I hope that I can also use these 2d array A[i][j] by subscripts as before. >> Could do please tell me how can I do? Thank you. >> > 1) This is a C question, not a PETSc question > > 2) If you are using 2D arrays for fields, you should be using the DMDA, > which has a section in the manual > > Thanks, > > Matt > > >> LV CHAO >> >> 2014/3/11 >> >> >> >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From luc.berger.vergiat at gmail.com Tue Mar 11 09:56:01 2014 From: luc.berger.vergiat at gmail.com (Luc Berger-Vergiat) Date: Tue, 11 Mar 2014 10:56:01 -0400 Subject: [petsc-users] Fieldsplit schur complement with preonly solves Message-ID: Hi all, I am testing some preconditioners for a FEM problem involving different types of fields (displacements, temperature, stresses and plastic strain). To make sure that things are working correctly I am first solving this problem with: -ksp_type preonly -pc_type lu, which works fine obviously. Then I move on to do: -ksp_type gmres -pc_type lu, and I get very good convergence (one gmres linear iteration per time step) which I expected. So solving the problem exactly in a preconditioner to gmres leads to optimal results. This can be done using a Schur complement, but when I pass the following options: -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu My results are terrible, gmres does not converge and my FEM code reduces the size of the time step in order to converge. This does not make much sense to me... Curiously if I use the following options: -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type lu then the global gmres converges in two iterations. So using a pair of ksp gmres/pc lu on the A00 block and the Schur complements works, but using lu directly doesn't. Because I think that all this is quite strange, I decided to dump some matrices out. Namely, I dumped the complete FEM jacobian, I also do a MatView on jac->B, jac->C and the result of KSPGetOperators on kspA. These returns three out of the four blocks needed to do the Schur complement. They are correct and I assume that the last block is also correct. When I import jac->B, jac->C and the matrix corresponding to kspA in MATLAB to compute the inverse of the Schur complement and pass it to gmres as preconditioner the problem is solved in 1 iteration. So MATLAB says: -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu should yield only one iteration (maybe two depending on implementation). Any ideas why the Petsc doesn't solve this correctly? Best, Luc -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 11 10:06:53 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 11 Mar 2014 10:06:53 -0500 Subject: [petsc-users] Fieldsplit schur complement with preonly solves In-Reply-To: References: Message-ID: On Tue, Mar 11, 2014 at 9:56 AM, Luc Berger-Vergiat < luc.berger.vergiat at gmail.com> wrote: > Hi all, > I am testing some preconditioners for a FEM problem involving different > types of fields (displacements, temperature, stresses and plastic strain). > To make sure that things are working correctly I am first solving this > problem with: > > -ksp_type preonly -pc_type lu, which works fine obviously. 
> > > Then I move on to do: > > -ksp_type gmres -pc_type lu, and I get very good convergence (one gmres > linear iteration per time step) which I expected. > > > So solving the problem exactly in a preconditioner to gmres leads to > optimal results. > This can be done using a Schur complement, but when I pass the following > options: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 > -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type > preonly -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type > preonly -fieldsplit_1_pc_type lu > > My results are terrible, gmres does not converge and my FEM code reduces > the size of the time step in order to converge. > This does not make much sense to me... > For any convergence question, we need the output of -ksp_view -ksp_monitor, and for this we would also like -fieldsplit_0_ksp_monitor -fieldsplit_1_ksp_monitor Yes, something is wrong. Very often this is caused by submatrices which have a null space. Of course that null space should show up in LU, unless you are perturbing the factorization. Matt > Curiously if I use the following options: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 > -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type gmres > -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type > lu > > then the global gmres converges in two iterations. > > So using a pair of ksp gmres/pc lu on the A00 block and the Schur > complements works, but using lu directly doesn't. > > Because I think that all this is quite strange, I decided to dump some > matrices out. Namely, I dumped the complete FEM jacobian, I also do a > MatView on jac->B, jac->C and the result of KSPGetOperators on kspA. These > returns three out of the four blocks needed to do the Schur complement. > They are correct and I assume that the last block is also correct. > When I import jac->B, jac->C and the matrix corresponding to kspA in > MATLAB to compute the inverse of the Schur complement and pass it to gmres > as preconditioner the problem is solved in 1 iteration. > > So MATLAB says: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 > -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type > preonly -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type > preonly -fieldsplit_1_pc_type lu > > should yield only one iteration (maybe two depending on implementation). > > Any ideas why the Petsc doesn't solve this correctly? > > Best, > Luc > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From luc.berger.vergiat at gmail.com Tue Mar 11 10:31:17 2014 From: luc.berger.vergiat at gmail.com (Luc) Date: Tue, 11 Mar 2014 11:31:17 -0400 Subject: [petsc-users] Fwd: Re: Fieldsplit schur complement with preonly solves In-Reply-To: <531F2A38.1070308@gmail.com> References: <531F2A38.1070308@gmail.com> Message-ID: <531F2C45.1030603@gmail.com> Oops, the message below was not sent to the mailing list. Sorry. 
-------- Original Message -------- Subject: Re: Fieldsplit schur complement with preonly solves Date: Tue, 11 Mar 2014 11:22:32 -0400 From: Luc To: Matthew Knepley Here are the -ksp_monitor and -ksp_view. For some reason I don't get anything out of -fieldsplit_0_ksp_monitor or -fieldsplit_1_ksp_monitor Is the output of these options stored in some file? Best, Luc On 03/11/2014 11:06 AM, Matthew Knepley wrote: > On Tue, Mar 11, 2014 at 9:56 AM, Luc Berger-Vergiat > > > wrote: > > Hi all, > I am testing some preconditioners for a FEM problem involving > different types of fields (displacements, temperature, stresses > and plastic strain). > To make sure that things are working correctly I am first solving > this problem with: > > -ksp_type preonly -pc_type lu, which works fine obviously. > > > Then I move on to do: > > -ksp_type gmres -pc_type lu, and I get very good convergence > (one gmres linear iteration per time step) which I expected. > > > So solving the problem exactly in a preconditioner to gmres leads > to optimal results. > This can be done using a Schur complement, but when I pass the > following options: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 > -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type > lu -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu > > My results are terrible, gmres does not converge and my FEM code > reduces the size of the time step in order to converge. > This does not make much sense to me... > > > For any convergence question, we need the output of -ksp_view > -ksp_monitor, and for this we would also like > > -fieldsplit_0_ksp_monitor -fieldsplit_1_ksp_monitor > > Yes, something is wrong. Very often this is caused by submatrices > which have a null space. Of course that null space > should show up in LU, unless you are perturbing the factorization. > > Matt > > Curiously if I use the following options: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 > -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type > lu -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type lu > > then the global gmres converges in two iterations. > > So using a pair of ksp gmres/pc lu on the A00 block and the Schur > complements works, but using lu directly doesn't. > > Because I think that all this is quite strange, I decided to dump > some matrices out. Namely, I dumped the complete FEM jacobian, I > also do a MatView on jac->B, jac->C and the result of > KSPGetOperators on kspA. These returns three out of the four > blocks needed to do the Schur complement. They are correct and I > assume that the last block is also correct. > When I import jac->B, jac->C and the matrix corresponding to kspA > in MATLAB to compute the inverse of the Schur complement and pass > it to gmres as preconditioner the problem is solved in 1 iteration. > > So MATLAB says: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 > -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type > lu -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu > > should yield only one iteration (maybe two depending on > implementation). > > Any ideas why the Petsc doesn't solve this correctly? 
> > Best, > Luc > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- 0 KSP Residual norm 5.897183365409e+03 1 KSP Residual norm 2.357360750913e+03 2 KSP Residual norm 1.245115753185e+03 3 KSP Residual norm 7.494498291470e+02 4 KSP Residual norm 4.899331176210e+02 5 KSP Residual norm 3.296614155229e+02 6 KSP Residual norm 2.535141099025e+02 7 KSP Residual norm 1.848120199314e+02 8 KSP Residual norm 1.434395424307e+02 9 KSP Residual norm 1.080331131990e+02 10 KSP Residual norm 9.338277500035e+01 11 KSP Residual norm 6.544635890670e+01 12 KSP Residual norm 5.424636369024e+01 13 KSP Residual norm 5.030472769304e+01 14 KSP Residual norm 4.672584713181e+01 15 KSP Residual norm 4.207789949290e+01 16 KSP Residual norm 3.300863895958e+01 17 KSP Residual norm 2.550023161038e+01 18 KSP Residual norm 1.894202108393e+01 19 KSP Residual norm 1.481671652453e+01 20 KSP Residual norm 1.154264257191e+01 21 KSP Residual norm 9.426943797892e+00 22 KSP Residual norm 8.052410603809e+00 23 KSP Residual norm 6.907745793925e+00 24 KSP Residual norm 6.044753173613e+00 25 KSP Residual norm 5.537563785495e+00 26 KSP Residual norm 5.103038012673e+00 27 KSP Residual norm 4.657330910529e+00 28 KSP Residual norm 4.292761843691e+00 29 KSP Residual norm 4.029372196066e+00 30 KSP Residual norm 3.799130941110e+00 31 KSP Residual norm 3.719459377334e+00 32 KSP Residual norm 3.673829927252e+00 33 KSP Residual norm 3.593224950113e+00 34 KSP Residual norm 3.489258916404e+00 35 KSP Residual norm 3.363266816929e+00 36 KSP Residual norm 3.209234608324e+00 37 KSP Residual norm 3.042439215741e+00 38 KSP Residual norm 2.877622671449e+00 39 KSP Residual norm 2.728546188987e+00 40 KSP Residual norm 2.547527479156e+00 41 KSP Residual norm 2.395477146243e+00 42 KSP Residual norm 2.212193386074e+00 43 KSP Residual norm 1.972111351765e+00 44 KSP Residual norm 1.786505669346e+00 45 KSP Residual norm 1.719636784341e+00 46 KSP Residual norm 1.676148900999e+00 47 KSP Residual norm 1.591488773403e+00 48 KSP Residual norm 1.499202550326e+00 49 KSP Residual norm 1.422194063089e+00 50 KSP Residual norm 1.359790559159e+00 51 KSP Residual norm 1.285046681774e+00 52 KSP Residual norm 1.203288780427e+00 53 KSP Residual norm 1.136928971789e+00 54 KSP Residual norm 1.107297202580e+00 55 KSP Residual norm 1.101074698660e+00 56 KSP Residual norm 1.093331606685e+00 57 KSP Residual norm 1.083704496039e+00 58 KSP Residual norm 1.040484535384e+00 59 KSP Residual norm 9.771849314513e-01 60 KSP Residual norm 9.344826248494e-01 61 KSP Residual norm 9.325693177458e-01 62 KSP Residual norm 9.303050370074e-01 63 KSP Residual norm 9.291459224090e-01 64 KSP Residual norm 9.290645462896e-01 65 KSP Residual norm 9.287381711083e-01 66 KSP Residual norm 9.282629835097e-01 67 KSP Residual norm 9.277741376833e-01 68 KSP Residual norm 9.271803369073e-01 69 KSP Residual norm 9.250217569118e-01 70 KSP Residual norm 9.212099392648e-01 71 KSP Residual norm 9.212073588376e-01 72 KSP Residual norm 9.211767536689e-01 73 KSP Residual norm 9.084092709913e-01 74 KSP Residual norm 8.947842875988e-01 75 KSP Residual norm 8.914265297804e-01 76 KSP Residual norm 8.820940650258e-01 77 KSP Residual norm 8.733506393051e-01 78 KSP Residual norm 8.725287382103e-01 79 KSP Residual norm 8.725284775915e-01 80 KSP Residual norm 
8.713202203880e-01

[KSP residual history abridged for readability: iterations 81-1876 comprise roughly 1,800 monitor lines. The residual norm falls slowly from about 8.71e-01 at iteration 81 to about 4.11e-01 by iteration 420 and about 3.95e-01 by iteration 750; from roughly iteration 900 onward it stagnates near 3.944e-01, creeping down to only 3.9433e-01 over the last thousand iterations shown. Representative entries:]

   81 KSP Residual norm 8.711493244369e-01
  120 KSP Residual norm 6.491541601024e-01
  180 KSP Residual norm 5.084608751613e-01
  240 KSP Residual norm 4.228819748046e-01
  420 KSP Residual norm 4.111180639749e-01
  600 KSP Residual norm 4.044991824407e-01
  750 KSP Residual norm 3.953191573018e-01
  900 KSP Residual norm 3.943760304866e-01
 1200 KSP Residual norm 3.943365153677e-01
 1500 KSP Residual norm 3.943333308189e-01
 1800 KSP Residual norm 3.943330909476e-01
 1876 KSP Residual norm 3.943330830200e-01
 1877
KSP Residual norm 3.943330829732e-01 1878 KSP Residual norm 3.943330828534e-01 1879 KSP Residual norm 3.943330828515e-01 1880 KSP Residual norm 3.943330827162e-01 1881 KSP Residual norm 3.943330826061e-01 1882 KSP Residual norm 3.943330826049e-01 1883 KSP Residual norm 3.943330823533e-01 1884 KSP Residual norm 3.943330814483e-01 1885 KSP Residual norm 3.943330811094e-01 1886 KSP Residual norm 3.943330811072e-01 1887 KSP Residual norm 3.943330808298e-01 1888 KSP Residual norm 3.943330807281e-01 1889 KSP Residual norm 3.943330807279e-01 1890 KSP Residual norm 3.943330804628e-01 1891 KSP Residual norm 3.943330804628e-01 1892 KSP Residual norm 3.943330804627e-01 1893 KSP Residual norm 3.943330804619e-01 1894 KSP Residual norm 3.943330804607e-01 1895 KSP Residual norm 3.943330804606e-01 1896 KSP Residual norm 3.943330804600e-01 1897 KSP Residual norm 3.943330804582e-01 1898 KSP Residual norm 3.943330804582e-01 1899 KSP Residual norm 3.943330804539e-01 1900 KSP Residual norm 3.943330804527e-01 1901 KSP Residual norm 3.943330804481e-01 1902 KSP Residual norm 3.943330804417e-01 1903 KSP Residual norm 3.943330804402e-01 1904 KSP Residual norm 3.943330804365e-01 1905 KSP Residual norm 3.943330804099e-01 1906 KSP Residual norm 3.943330804022e-01 1907 KSP Residual norm 3.943330803661e-01 1908 KSP Residual norm 3.943330802735e-01 1909 KSP Residual norm 3.943330802720e-01 1910 KSP Residual norm 3.943330801677e-01 1911 KSP Residual norm 3.943330800826e-01 1912 KSP Residual norm 3.943330800818e-01 1913 KSP Residual norm 3.943330798874e-01 1914 KSP Residual norm 3.943330791877e-01 1915 KSP Residual norm 3.943330789251e-01 1916 KSP Residual norm 3.943330789234e-01 1917 KSP Residual norm 3.943330787086e-01 1918 KSP Residual norm 3.943330786293e-01 1919 KSP Residual norm 3.943330786292e-01 1920 KSP Residual norm 3.943330784262e-01 1921 KSP Residual norm 3.943330784262e-01 1922 KSP Residual norm 3.943330784261e-01 1923 KSP Residual norm 3.943330784255e-01 1924 KSP Residual norm 3.943330784246e-01 1925 KSP Residual norm 3.943330784246e-01 1926 KSP Residual norm 3.943330784241e-01 1927 KSP Residual norm 3.943330784227e-01 1928 KSP Residual norm 3.943330784227e-01 1929 KSP Residual norm 3.943330784193e-01 1930 KSP Residual norm 3.943330784184e-01 1931 KSP Residual norm 3.943330784148e-01 1932 KSP Residual norm 3.943330784099e-01 1933 KSP Residual norm 3.943330784087e-01 1934 KSP Residual norm 3.943330784059e-01 1935 KSP Residual norm 3.943330783853e-01 1936 KSP Residual norm 3.943330783794e-01 1937 KSP Residual norm 3.943330783515e-01 1938 KSP Residual norm 3.943330782802e-01 1939 KSP Residual norm 3.943330782790e-01 1940 KSP Residual norm 3.943330781985e-01 1941 KSP Residual norm 3.943330781329e-01 1942 KSP Residual norm 3.943330781323e-01 1943 KSP Residual norm 3.943330779824e-01 1944 KSP Residual norm 3.943330774436e-01 1945 KSP Residual norm 3.943330772417e-01 1946 KSP Residual norm 3.943330772404e-01 1947 KSP Residual norm 3.943330770752e-01 1948 KSP Residual norm 3.943330770146e-01 1949 KSP Residual norm 3.943330770145e-01 1950 KSP Residual norm 3.943330768567e-01 1951 KSP Residual norm 3.943330768567e-01 1952 KSP Residual norm 3.943330768567e-01 1953 KSP Residual norm 3.943330768562e-01 1954 KSP Residual norm 3.943330768555e-01 1955 KSP Residual norm 3.943330768554e-01 1956 KSP Residual norm 3.943330768551e-01 1957 KSP Residual norm 3.943330768540e-01 1958 KSP Residual norm 3.943330768540e-01 1959 KSP Residual norm 3.943330768514e-01 1960 KSP Residual norm 3.943330768507e-01 1961 KSP Residual norm 
3.943330768480e-01 1962 KSP Residual norm 3.943330768441e-01 1963 KSP Residual norm 3.943330768433e-01 1964 KSP Residual norm 3.943330768411e-01 1965 KSP Residual norm 3.943330768252e-01 1966 KSP Residual norm 3.943330768206e-01 1967 KSP Residual norm 3.943330767991e-01 1968 KSP Residual norm 3.943330767440e-01 1969 KSP Residual norm 3.943330767431e-01 1970 KSP Residual norm 3.943330766810e-01 1971 KSP Residual norm 3.943330766304e-01 1972 KSP Residual norm 3.943330766299e-01 1973 KSP Residual norm 3.943330765142e-01 1974 KSP Residual norm 3.943330760978e-01 1975 KSP Residual norm 3.943330759415e-01 1976 KSP Residual norm 3.943330759405e-01 1977 KSP Residual norm 3.943330758127e-01 1978 KSP Residual norm 3.943330757655e-01 1979 KSP Residual norm 3.943330757655e-01 1980 KSP Residual norm 3.943330756446e-01 1981 KSP Residual norm 3.943330756446e-01 1982 KSP Residual norm 3.943330756446e-01 1983 KSP Residual norm 3.943330756442e-01 1984 KSP Residual norm 3.943330756436e-01 1985 KSP Residual norm 3.943330756436e-01 1986 KSP Residual norm 3.943330756433e-01 1987 KSP Residual norm 3.943330756425e-01 1988 KSP Residual norm 3.943330756425e-01 1989 KSP Residual norm 3.943330756405e-01 1990 KSP Residual norm 3.943330756400e-01 1991 KSP Residual norm 3.943330756378e-01 1992 KSP Residual norm 3.943330756349e-01 1993 KSP Residual norm 3.943330756342e-01 1994 KSP Residual norm 3.943330756325e-01 1995 KSP Residual norm 3.943330756202e-01 1996 KSP Residual norm 3.943330756167e-01 1997 KSP Residual norm 3.943330756002e-01 1998 KSP Residual norm 3.943330755577e-01 1999 KSP Residual norm 3.943330755570e-01 2000 KSP Residual norm 3.943330755091e-01 2001 KSP Residual norm 3.943330754700e-01 2002 KSP Residual norm 3.943330754696e-01 2003 KSP Residual norm 3.943330753804e-01 2004 KSP Residual norm 3.943330750596e-01 2005 KSP Residual norm 3.943330749394e-01 2006 KSP Residual norm 3.943330749386e-01 2007 KSP Residual norm 3.943330748402e-01 2008 KSP Residual norm 3.943330748041e-01 2009 KSP Residual norm 3.943330748040e-01 2010 KSP Residual norm 3.943330747100e-01 2011 KSP Residual norm 3.943330747100e-01 2012 KSP Residual norm 3.943330747099e-01 2013 KSP Residual norm 3.943330747097e-01 2014 KSP Residual norm 3.943330747092e-01 2015 KSP Residual norm 3.943330747092e-01 2016 KSP Residual norm 3.943330747090e-01 2017 KSP Residual norm 3.943330747083e-01 2018 KSP Residual norm 3.943330747083e-01 2019 KSP Residual norm 3.943330747068e-01 2020 KSP Residual norm 3.943330747064e-01 2021 KSP Residual norm 3.943330747048e-01 2022 KSP Residual norm 3.943330747025e-01 2023 KSP Residual norm 3.943330747020e-01 2024 KSP Residual norm 3.943330747007e-01 2025 KSP Residual norm 3.943330746912e-01 2026 KSP Residual norm 3.943330746885e-01 2027 KSP Residual norm 3.943330746757e-01 2028 KSP Residual norm 3.943330746429e-01 2029 KSP Residual norm 3.943330746424e-01 2030 KSP Residual norm 3.943330746054e-01 2031 KSP Residual norm 3.943330745753e-01 2032 KSP Residual norm 3.943330745750e-01 2033 KSP Residual norm 3.943330745061e-01 2034 KSP Residual norm 3.943330742582e-01 2035 KSP Residual norm 3.943330741653e-01 2036 KSP Residual norm 3.943330741647e-01 2037 KSP Residual norm 3.943330740886e-01 2038 KSP Residual norm 3.943330740606e-01 2039 KSP Residual norm 3.943330740605e-01 2040 KSP Residual norm 3.943330739887e-01 2041 KSP Residual norm 3.943330739887e-01 2042 KSP Residual norm 3.943330739887e-01 2043 KSP Residual norm 3.943330739885e-01 2044 KSP Residual norm 3.943330739881e-01 2045 KSP Residual norm 3.943330739881e-01 2046 
KSP Residual norm 3.943330739880e-01 2047 KSP Residual norm 3.943330739874e-01 2048 KSP Residual norm 3.943330739874e-01 2049 KSP Residual norm 3.943330739863e-01 2050 KSP Residual norm 3.943330739859e-01 2051 KSP Residual norm 3.943330739847e-01 2052 KSP Residual norm 3.943330739829e-01 2053 KSP Residual norm 3.943330739825e-01 2054 KSP Residual norm 3.943330739815e-01 2055 KSP Residual norm 3.943330739742e-01 2056 KSP Residual norm 3.943330739721e-01 2057 KSP Residual norm 3.943330739622e-01 2058 KSP Residual norm 3.943330739369e-01 2059 KSP Residual norm 3.943330739366e-01 2060 KSP Residual norm 3.943330739080e-01 2061 KSP Residual norm 3.943330738848e-01 2062 KSP Residual norm 3.943330738845e-01 2063 KSP Residual norm 3.943330738314e-01 2064 KSP Residual norm 3.943330736404e-01 2065 KSP Residual norm 3.943330735688e-01 2066 KSP Residual norm 3.943330735684e-01 2067 KSP Residual norm 3.943330735098e-01 2068 KSP Residual norm 3.943330734883e-01 2069 KSP Residual norm 3.943330734882e-01 2070 KSP Residual norm 3.943330734324e-01 2071 KSP Residual norm 3.943330734324e-01 2072 KSP Residual norm 3.943330734324e-01 2073 KSP Residual norm 3.943330734322e-01 2074 KSP Residual norm 3.943330734320e-01 2075 KSP Residual norm 3.943330734320e-01 2076 KSP Residual norm 3.943330734319e-01 2077 KSP Residual norm 3.943330734315e-01 2078 KSP Residual norm 3.943330734315e-01 2079 KSP Residual norm 3.943330734306e-01 2080 KSP Residual norm 3.943330734303e-01 2081 KSP Residual norm 3.943330734293e-01 2082 KSP Residual norm 3.943330734280e-01 2083 KSP Residual norm 3.943330734277e-01 2084 KSP Residual norm 3.943330734269e-01 2085 KSP Residual norm 3.943330734213e-01 2086 KSP Residual norm 3.943330734196e-01 2087 KSP Residual norm 3.943330734120e-01 2088 KSP Residual norm 3.943330733925e-01 2089 KSP Residual norm 3.943330733922e-01 2090 KSP Residual norm 3.943330733702e-01 2091 KSP Residual norm 3.943330733522e-01 2092 KSP Residual norm 3.943330733521e-01 2093 KSP Residual norm 3.943330733111e-01 2094 KSP Residual norm 3.943330731635e-01 2095 KSP Residual norm 3.943330731082e-01 2096 KSP Residual norm 3.943330731079e-01 2097 KSP Residual norm 3.943330730626e-01 2098 KSP Residual norm 3.943330730459e-01 2099 KSP Residual norm 3.943330730459e-01 2100 KSP Residual norm 3.943330730030e-01 2101 KSP Residual norm 3.943330730030e-01 2102 KSP Residual norm 3.943330730030e-01 2103 KSP Residual norm 3.943330730029e-01 2104 KSP Residual norm 3.943330730027e-01 2105 KSP Residual norm 3.943330730027e-01 2106 KSP Residual norm 3.943330730026e-01 2107 KSP Residual norm 3.943330730023e-01 2108 KSP Residual norm 3.943330730023e-01 2109 KSP Residual norm 3.943330730016e-01 2110 KSP Residual norm 3.943330730014e-01 2111 KSP Residual norm 3.943330730006e-01 2112 KSP Residual norm 3.943330729996e-01 2113 KSP Residual norm 3.943330729993e-01 2114 KSP Residual norm 3.943330729987e-01 2115 KSP Residual norm 3.943330729944e-01 2116 KSP Residual norm 3.943330729932e-01 2117 KSP Residual norm 3.943330729873e-01 2118 KSP Residual norm 3.943330729722e-01 2119 KSP Residual norm 3.943330729720e-01 2120 KSP Residual norm 3.943330729550e-01 2121 KSP Residual norm 3.943330729412e-01 2122 KSP Residual norm 3.943330729410e-01 2123 KSP Residual norm 3.943330729094e-01 2124 KSP Residual norm 3.943330727957e-01 2125 KSP Residual norm 3.943330727531e-01 2126 KSP Residual norm 3.943330727528e-01 2127 KSP Residual norm 3.943330727179e-01 2128 KSP Residual norm 3.943330727051e-01 2129 KSP Residual norm 3.943330727051e-01 2130 KSP Residual norm 
3.943330726717e-01 2131 KSP Residual norm 3.943330726717e-01 2132 KSP Residual norm 3.943330726717e-01 2133 KSP Residual norm 3.943330726716e-01 2134 KSP Residual norm 3.943330726714e-01 2135 KSP Residual norm 3.943330726714e-01 2136 KSP Residual norm 3.943330726713e-01 2137 KSP Residual norm 3.943330726711e-01 2138 KSP Residual norm 3.943330726711e-01 2139 KSP Residual norm 3.943330726705e-01 2140 KSP Residual norm 3.943330726704e-01 2141 KSP Residual norm 3.943330726698e-01 2142 KSP Residual norm 3.943330726690e-01 2143 KSP Residual norm 3.943330726688e-01 2144 KSP Residual norm 3.943330726684e-01 2145 KSP Residual norm 3.943330726650e-01 2146 KSP Residual norm 3.943330726641e-01 2147 KSP Residual norm 3.943330726595e-01 2148 KSP Residual norm 3.943330726479e-01 2149 KSP Residual norm 3.943330726477e-01 2150 KSP Residual norm 3.943330726346e-01 2151 KSP Residual norm 3.943330726239e-01 2152 KSP Residual norm 3.943330726238e-01 2153 KSP Residual norm 3.943330725994e-01 2154 KSP Residual norm 3.943330725116e-01 2155 KSP Residual norm 3.943330724787e-01 2156 KSP Residual norm 3.943330724785e-01 2157 KSP Residual norm 3.943330724515e-01 2158 KSP Residual norm 3.943330724416e-01 2159 KSP Residual norm 3.943330724416e-01 2160 KSP Residual norm 3.943330724162e-01 2161 KSP Residual norm 3.943330724161e-01 2162 KSP Residual norm 3.943330724161e-01 2163 KSP Residual norm 3.943330724161e-01 2164 KSP Residual norm 3.943330724159e-01 2165 KSP Residual norm 3.943330724159e-01 2166 KSP Residual norm 3.943330724159e-01 2167 KSP Residual norm 3.943330724157e-01 2168 KSP Residual norm 3.943330724157e-01 2169 KSP Residual norm 3.943330724153e-01 2170 KSP Residual norm 3.943330724152e-01 2171 KSP Residual norm 3.943330724147e-01 2172 KSP Residual norm 3.943330724141e-01 2173 KSP Residual norm 3.943330724140e-01 2174 KSP Residual norm 3.943330724136e-01 2175 KSP Residual norm 3.943330724110e-01 2176 KSP Residual norm 3.943330724103e-01 2177 KSP Residual norm 3.943330724068e-01 2178 KSP Residual norm 3.943330723978e-01 2179 KSP Residual norm 3.943330723977e-01 2180 KSP Residual norm 3.943330723876e-01 2181 KSP Residual norm 3.943330723793e-01 2182 KSP Residual norm 3.943330723792e-01 2183 KSP Residual norm 3.943330723604e-01 2184 KSP Residual norm 3.943330722927e-01 2185 KSP Residual norm 3.943330722673e-01 2186 KSP Residual norm 3.943330722672e-01 2187 KSP Residual norm 3.943330722464e-01 2188 KSP Residual norm 3.943330722388e-01 2189 KSP Residual norm 3.943330722388e-01 2190 KSP Residual norm 3.943330722189e-01 2191 KSP Residual norm 3.943330722189e-01 2192 KSP Residual norm 3.943330722189e-01 2193 KSP Residual norm 3.943330722188e-01 2194 KSP Residual norm 3.943330722187e-01 2195 KSP Residual norm 3.943330722187e-01 2196 KSP Residual norm 3.943330722187e-01 2197 KSP Residual norm 3.943330722185e-01 2198 KSP Residual norm 3.943330722185e-01 2199 KSP Residual norm 3.943330722182e-01 2200 KSP Residual norm 3.943330722181e-01 2201 KSP Residual norm 3.943330722178e-01 2202 KSP Residual norm 3.943330722173e-01 2203 KSP Residual norm 3.943330722172e-01 2204 KSP Residual norm 3.943330722169e-01 2205 KSP Residual norm 3.943330722149e-01 2206 KSP Residual norm 3.943330722143e-01 2207 KSP Residual norm 3.943330722116e-01 2208 KSP Residual norm 3.943330722047e-01 2209 KSP Residual norm 3.943330722046e-01 2210 KSP Residual norm 3.943330721968e-01 2211 KSP Residual norm 3.943330721904e-01 2212 KSP Residual norm 3.943330721904e-01 2213 KSP Residual norm 3.943330721759e-01 2214 KSP Residual norm 3.943330721236e-01 2215 
KSP Residual norm 3.943330721040e-01 2216 KSP Residual norm 3.943330721039e-01 2217 KSP Residual norm 3.943330720878e-01 2218 KSP Residual norm 3.943330720819e-01 2219 KSP Residual norm 3.943330720819e-01 2220 KSP Residual norm 3.943330720668e-01 2221 KSP Residual norm 3.943330720668e-01 2222 KSP Residual norm 3.943330720668e-01 2223 KSP Residual norm 3.943330720668e-01 2224 KSP Residual norm 3.943330720667e-01 2225 KSP Residual norm 3.943330720667e-01 2226 KSP Residual norm 3.943330720667e-01 2227 KSP Residual norm 3.943330720666e-01 2228 KSP Residual norm 3.943330720666e-01 2229 KSP Residual norm 3.943330720663e-01 2230 KSP Residual norm 3.943330720663e-01 2231 KSP Residual norm 3.943330720660e-01 2232 KSP Residual norm 3.943330720656e-01 2233 KSP Residual norm 3.943330720655e-01 2234 KSP Residual norm 3.943330720653e-01 2235 KSP Residual norm 3.943330720638e-01 2236 KSP Residual norm 3.943330720633e-01 2237 KSP Residual norm 3.943330720613e-01 2238 KSP Residual norm 3.943330720559e-01 2239 KSP Residual norm 3.943330720558e-01 2240 KSP Residual norm 3.943330720498e-01 2241 KSP Residual norm 3.943330720449e-01 2242 KSP Residual norm 3.943330720449e-01 2243 KSP Residual norm 3.943330720337e-01 2244 KSP Residual norm 3.943330719934e-01 2245 KSP Residual norm 3.943330719782e-01 2246 KSP Residual norm 3.943330719782e-01 2247 KSP Residual norm 3.943330719658e-01 2248 KSP Residual norm 3.943330719612e-01 2249 KSP Residual norm 3.943330719612e-01 2250 KSP Residual norm 3.943330719495e-01 2251 KSP Residual norm 3.943330719495e-01 2252 KSP Residual norm 3.943330719495e-01 2253 KSP Residual norm 3.943330719494e-01 2254 KSP Residual norm 3.943330719494e-01 2255 KSP Residual norm 3.943330719494e-01 2256 KSP Residual norm 3.943330719493e-01 2257 KSP Residual norm 3.943330719493e-01 2258 KSP Residual norm 3.943330719493e-01 2259 KSP Residual norm 3.943330719491e-01 2260 KSP Residual norm 3.943330719490e-01 2261 KSP Residual norm 3.943330719488e-01 2262 KSP Residual norm 3.943330719485e-01 2263 KSP Residual norm 3.943330719484e-01 2264 KSP Residual norm 3.943330719483e-01 2265 KSP Residual norm 3.943330719471e-01 2266 KSP Residual norm 3.943330719468e-01 2267 KSP Residual norm 3.943330719452e-01 2268 KSP Residual norm 3.943330719410e-01 2269 KSP Residual norm 3.943330719410e-01 2270 KSP Residual norm 3.943330719363e-01 2271 KSP Residual norm 3.943330719325e-01 2272 KSP Residual norm 3.943330719325e-01 2273 KSP Residual norm 3.943330719239e-01 2274 KSP Residual norm 3.943330718927e-01 2275 KSP Residual norm 3.943330718811e-01 2276 KSP Residual norm 3.943330718810e-01 2277 KSP Residual norm 3.943330718715e-01 2278 KSP Residual norm 3.943330718679e-01 2279 KSP Residual norm 3.943330718679e-01 2280 KSP Residual norm 3.943330718588e-01 2281 KSP Residual norm 3.943330718588e-01 2282 KSP Residual norm 3.943330718588e-01 2283 KSP Residual norm 3.943330718588e-01 2284 KSP Residual norm 3.943330718588e-01 2285 KSP Residual norm 3.943330718588e-01 2286 KSP Residual norm 3.943330718588e-01 2287 KSP Residual norm 3.943330718587e-01 2288 KSP Residual norm 3.943330718587e-01 2289 KSP Residual norm 3.943330718585e-01 2290 KSP Residual norm 3.943330718585e-01 2291 KSP Residual norm 3.943330718583e-01 2292 KSP Residual norm 3.943330718581e-01 2293 KSP Residual norm 3.943330718581e-01 2294 KSP Residual norm 3.943330718579e-01 2295 KSP Residual norm 3.943330718570e-01 2296 KSP Residual norm 3.943330718568e-01 2297 KSP Residual norm 3.943330718555e-01 2298 KSP Residual norm 3.943330718523e-01 2299 KSP Residual norm 
3.943330718523e-01 2300 KSP Residual norm 3.943330718487e-01 2301 KSP Residual norm 3.943330718458e-01 2302 KSP Residual norm 3.943330718458e-01 2303 KSP Residual norm 3.943330718391e-01 2304 KSP Residual norm 3.943330718151e-01 2305 KSP Residual norm 3.943330718061e-01 2306 KSP Residual norm 3.943330718060e-01 2307 KSP Residual norm 3.943330717987e-01 2308 KSP Residual norm 3.943330717960e-01 2309 KSP Residual norm 3.943330717960e-01 2310 KSP Residual norm 3.943330717887e-01 2311 KSP Residual norm 3.943330717887e-01 2312 KSP Residual norm 3.943330717887e-01 2313 KSP Residual norm 3.943330717887e-01 2314 KSP Residual norm 3.943330717887e-01 2315 KSP Residual norm 3.943330717887e-01 2316 KSP Residual norm 3.943330717887e-01 2317 KSP Residual norm 3.943330717886e-01 2318 KSP Residual norm 3.943330717886e-01 2319 KSP Residual norm 3.943330717885e-01 2320 KSP Residual norm 3.943330717885e-01 2321 KSP Residual norm 3.943330717883e-01 2322 KSP Residual norm 3.943330717882e-01 2323 KSP Residual norm 3.943330717881e-01 2324 KSP Residual norm 3.943330717880e-01 2325 KSP Residual norm 3.943330717873e-01 2326 KSP Residual norm 3.943330717871e-01 2327 KSP Residual norm 3.943330717862e-01 2328 KSP Residual norm 3.943330717837e-01 2329 KSP Residual norm 3.943330717837e-01 2330 KSP Residual norm 3.943330717809e-01 2331 KSP Residual norm 3.943330717787e-01 2332 KSP Residual norm 3.943330717786e-01 2333 KSP Residual norm 3.943330717735e-01 2334 KSP Residual norm 3.943330717550e-01 2335 KSP Residual norm 3.943330717480e-01 2336 KSP Residual norm 3.943330717480e-01 2337 KSP Residual norm 3.943330717423e-01 2338 KSP Residual norm 3.943330717402e-01 2339 KSP Residual norm 3.943330717402e-01 2340 KSP Residual norm 3.943330717350e-01 2341 KSP Residual norm 3.943330717350e-01 2342 KSP Residual norm 3.943330717350e-01 2343 KSP Residual norm 3.943330717350e-01 2344 KSP Residual norm 3.943330717349e-01 2345 KSP Residual norm 3.943330717349e-01 2346 KSP Residual norm 3.943330717349e-01 2347 KSP Residual norm 3.943330717349e-01 2348 KSP Residual norm 3.943330717349e-01 2349 KSP Residual norm 3.943330717348e-01 2350 KSP Residual norm 3.943330717348e-01 2351 KSP Residual norm 3.943330717347e-01 2352 KSP Residual norm 3.943330717346e-01 2353 KSP Residual norm 3.943330717345e-01 2354 KSP Residual norm 3.943330717344e-01 2355 KSP Residual norm 3.943330717339e-01 2356 KSP Residual norm 3.943330717337e-01 2357 KSP Residual norm 3.943330717330e-01 2358 KSP Residual norm 3.943330717311e-01 2359 KSP Residual norm 3.943330717311e-01 2360 KSP Residual norm 3.943330717290e-01 2361 KSP Residual norm 3.943330717272e-01 2362 KSP Residual norm 3.943330717272e-01 2363 KSP Residual norm 3.943330717232e-01 2364 KSP Residual norm 3.943330717089e-01 2365 KSP Residual norm 3.943330717036e-01 2366 KSP Residual norm 3.943330717036e-01 2367 KSP Residual norm 3.943330716992e-01 2368 KSP Residual norm 3.943330716976e-01 2369 KSP Residual norm 3.943330716976e-01 2370 KSP Residual norm 3.943330716933e-01 2371 KSP Residual norm 3.943330716933e-01 2372 KSP Residual norm 3.943330716933e-01 2373 KSP Residual norm 3.943330716933e-01 2374 KSP Residual norm 3.943330716933e-01 2375 KSP Residual norm 3.943330716933e-01 2376 KSP Residual norm 3.943330716932e-01 2377 KSP Residual norm 3.943330716932e-01 2378 KSP Residual norm 3.943330716932e-01 2379 KSP Residual norm 3.943330716932e-01 2380 KSP Residual norm 3.943330716931e-01 2381 KSP Residual norm 3.943330716931e-01 2382 KSP Residual norm 3.943330716930e-01 2383 KSP Residual norm 3.943330716929e-01 2384 
KSP Residual norm 3.943330716929e-01 2385 KSP Residual norm 3.943330716925e-01 2386 KSP Residual norm 3.943330716923e-01 2387 KSP Residual norm 3.943330716918e-01 2388 KSP Residual norm 3.943330716903e-01 2389 KSP Residual norm 3.943330716903e-01 2390 KSP Residual norm 3.943330716886e-01 2391 KSP Residual norm 3.943330716873e-01 2392 KSP Residual norm 3.943330716873e-01 2393 KSP Residual norm 3.943330716842e-01 2394 KSP Residual norm 3.943330716732e-01 2395 KSP Residual norm 3.943330716691e-01 2396 KSP Residual norm 3.943330716690e-01 2397 KSP Residual norm 3.943330716657e-01 2398 KSP Residual norm 3.943330716644e-01 2399 KSP Residual norm 3.943330716644e-01 2400 KSP Residual norm 3.943330716613e-01 2401 KSP Residual norm 3.943330716613e-01 2402 KSP Residual norm 3.943330716613e-01 2403 KSP Residual norm 3.943330716613e-01 2404 KSP Residual norm 3.943330716612e-01 2405 KSP Residual norm 3.943330716612e-01 2406 KSP Residual norm 3.943330716612e-01 2407 KSP Residual norm 3.943330716612e-01 2408 KSP Residual norm 3.943330716612e-01 2409 KSP Residual norm 3.943330716612e-01 2410 KSP Residual norm 3.943330716612e-01 2411 KSP Residual norm 3.943330716611e-01 2412 KSP Residual norm 3.943330716610e-01 2413 KSP Residual norm 3.943330716610e-01 2414 KSP Residual norm 3.943330716610e-01 2415 KSP Residual norm 3.943330716606e-01 2416 KSP Residual norm 3.943330716605e-01 2417 KSP Residual norm 3.943330716601e-01 2418 KSP Residual norm 3.943330716590e-01 2419 KSP Residual norm 3.943330716590e-01 2420 KSP Residual norm 3.943330716577e-01 2421 KSP Residual norm 3.943330716567e-01 2422 KSP Residual norm 3.943330716566e-01 2423 KSP Residual norm 3.943330716543e-01 2424 KSP Residual norm 3.943330716458e-01 2425 KSP Residual norm 3.943330716426e-01 2426 KSP Residual norm 3.943330716426e-01 2427 KSP Residual norm 3.943330716400e-01 2428 KSP Residual norm 3.943330716390e-01 2429 KSP Residual norm 3.943330716390e-01 2430 KSP Residual norm 3.943330716365e-01 2431 KSP Residual norm 3.943330716365e-01 2432 KSP Residual norm 3.943330716365e-01 2433 KSP Residual norm 3.943330716365e-01 2434 KSP Residual norm 3.943330716365e-01 2435 KSP Residual norm 3.943330716365e-01 2436 KSP Residual norm 3.943330716365e-01 2437 KSP Residual norm 3.943330716365e-01 2438 KSP Residual norm 3.943330716365e-01 2439 KSP Residual norm 3.943330716364e-01 2440 KSP Residual norm 3.943330716364e-01 2441 KSP Residual norm 3.943330716364e-01 2442 KSP Residual norm 3.943330716363e-01 2443 KSP Residual norm 3.943330716363e-01 2444 KSP Residual norm 3.943330716363e-01 2445 KSP Residual norm 3.943330716360e-01 2446 KSP Residual norm 3.943330716360e-01 2447 KSP Residual norm 3.943330716356e-01 2448 KSP Residual norm 3.943330716347e-01 2449 KSP Residual norm 3.943330716347e-01 2450 KSP Residual norm 3.943330716338e-01 2451 KSP Residual norm 3.943330716330e-01 2452 KSP Residual norm 3.943330716330e-01 2453 KSP Residual norm 3.943330716311e-01 2454 KSP Residual norm 3.943330716246e-01 2455 KSP Residual norm 3.943330716221e-01 2456 KSP Residual norm 3.943330716221e-01 2457 KSP Residual norm 3.943330716201e-01 2458 KSP Residual norm 3.943330716193e-01 2459 KSP Residual norm 3.943330716193e-01 2460 KSP Residual norm 3.943330716176e-01 2461 KSP Residual norm 3.943330716176e-01 2462 KSP Residual norm 3.943330716176e-01 2463 KSP Residual norm 3.943330716176e-01 2464 KSP Residual norm 3.943330716176e-01 2465 KSP Residual norm 3.943330716176e-01 2466 KSP Residual norm 3.943330716176e-01 2467 KSP Residual norm 3.943330716176e-01 2468 KSP Residual norm 
3.943330716176e-01 2469 KSP Residual norm 3.943330716175e-01 2470 KSP Residual norm 3.943330716175e-01 2471 KSP Residual norm 3.943330716175e-01 2472 KSP Residual norm 3.943330716174e-01 2473 KSP Residual norm 3.943330716174e-01 2474 KSP Residual norm 3.943330716174e-01 2475 KSP Residual norm 3.943330716172e-01 2476 KSP Residual norm 3.943330716171e-01 2477 KSP Residual norm 3.943330716169e-01 2478 KSP Residual norm 3.943330716162e-01 2479 KSP Residual norm 3.943330716162e-01 2480 KSP Residual norm 3.943330716154e-01 2481 KSP Residual norm 3.943330716148e-01 2482 KSP Residual norm 3.943330716148e-01 2483 KSP Residual norm 3.943330716134e-01 2484 KSP Residual norm 3.943330716084e-01 2485 KSP Residual norm 3.943330716065e-01 2486 KSP Residual norm 3.943330716064e-01 2487 KSP Residual norm 3.943330716049e-01 2488 KSP Residual norm 3.943330716043e-01 2489 KSP Residual norm 3.943330716043e-01 2490 KSP Residual norm 3.943330716028e-01 2491 KSP Residual norm 3.943330716028e-01 2492 KSP Residual norm 3.943330716028e-01 2493 KSP Residual norm 3.943330716028e-01 2494 KSP Residual norm 3.943330716028e-01 2495 KSP Residual norm 3.943330716028e-01 2496 KSP Residual norm 3.943330716028e-01 2497 KSP Residual norm 3.943330716028e-01 2498 KSP Residual norm 3.943330716028e-01 2499 KSP Residual norm 3.943330716027e-01 2500 KSP Residual norm 3.943330716027e-01 2501 KSP Residual norm 3.943330716027e-01 2502 KSP Residual norm 3.943330716027e-01 2503 KSP Residual norm 3.943330716027e-01 2504 KSP Residual norm 3.943330716026e-01 2505 KSP Residual norm 3.943330716025e-01 2506 KSP Residual norm 3.943330716024e-01 2507 KSP Residual norm 3.943330716022e-01 2508 KSP Residual norm 3.943330716017e-01 2509 KSP Residual norm 3.943330716017e-01 2510 KSP Residual norm 3.943330716011e-01 2511 KSP Residual norm 3.943330716007e-01 2512 KSP Residual norm 3.943330716007e-01 2513 KSP Residual norm 3.943330715996e-01 2514 KSP Residual norm 3.943330715957e-01 2515 KSP Residual norm 3.943330715942e-01 2516 KSP Residual norm 3.943330715942e-01 2517 KSP Residual norm 3.943330715930e-01 2518 KSP Residual norm 3.943330715926e-01 2519 KSP Residual norm 3.943330715926e-01 2520 KSP Residual norm 3.943330715912e-01 2521 KSP Residual norm 3.943330715912e-01 2522 KSP Residual norm 3.943330715912e-01 2523 KSP Residual norm 3.943330715912e-01 2524 KSP Residual norm 3.943330715912e-01 2525 KSP Residual norm 3.943330715912e-01 2526 KSP Residual norm 3.943330715912e-01 2527 KSP Residual norm 3.943330715912e-01 2528 KSP Residual norm 3.943330715912e-01 2529 KSP Residual norm 3.943330715912e-01 2530 KSP Residual norm 3.943330715912e-01 2531 KSP Residual norm 3.943330715912e-01 2532 KSP Residual norm 3.943330715911e-01 2533 KSP Residual norm 3.943330715911e-01 2534 KSP Residual norm 3.943330715911e-01 2535 KSP Residual norm 3.943330715910e-01 2536 KSP Residual norm 3.943330715910e-01 2537 KSP Residual norm 3.943330715908e-01 2538 KSP Residual norm 3.943330715904e-01 2539 KSP Residual norm 3.943330715904e-01 2540 KSP Residual norm 3.943330715900e-01 2541 KSP Residual norm 3.943330715896e-01 2542 KSP Residual norm 3.943330715896e-01 2543 KSP Residual norm 3.943330715888e-01 2544 KSP Residual norm 3.943330715857e-01 2545 KSP Residual norm 3.943330715846e-01 2546 KSP Residual norm 3.943330715846e-01 2547 KSP Residual norm 3.943330715837e-01 2548 KSP Residual norm 3.943330715833e-01 2549 KSP Residual norm 3.943330715833e-01 2550 KSP Residual norm 3.943330715824e-01 2551 KSP Residual norm 3.943330715824e-01 2552 KSP Residual norm 3.943330715824e-01 2553 
KSP Residual norm 3.943330715824e-01 2554 KSP Residual norm 3.943330715824e-01 2555 KSP Residual norm 3.943330715824e-01 2556 KSP Residual norm 3.943330715824e-01 2557 KSP Residual norm 3.943330715824e-01 2558 KSP Residual norm 3.943330715824e-01 2559 KSP Residual norm 3.943330715824e-01 2560 KSP Residual norm 3.943330715824e-01 2561 KSP Residual norm 3.943330715824e-01 2562 KSP Residual norm 3.943330715823e-01 2563 KSP Residual norm 3.943330715823e-01 2564 KSP Residual norm 3.943330715823e-01 2565 KSP Residual norm 3.943330715822e-01 2566 KSP Residual norm 3.943330715822e-01 2567 KSP Residual norm 3.943330715821e-01 2568 KSP Residual norm 3.943330715818e-01 2569 KSP Residual norm 3.943330715818e-01 2570 KSP Residual norm 3.943330715814e-01 2571 KSP Residual norm 3.943330715811e-01 2572 KSP Residual norm 3.943330715811e-01 2573 KSP Residual norm 3.943330715805e-01 2574 KSP Residual norm 3.943330715782e-01 2575 KSP Residual norm 3.943330715773e-01 2576 KSP Residual norm 3.943330715773e-01 2577 KSP Residual norm 3.943330715766e-01 2578 KSP Residual norm 3.943330715763e-01 2579 KSP Residual norm 3.943330715763e-01 2580 KSP Residual norm 3.943330715757e-01 2581 KSP Residual norm 3.943330715757e-01 2582 KSP Residual norm 3.943330715757e-01 2583 KSP Residual norm 3.943330715757e-01 2584 KSP Residual norm 3.943330715757e-01 2585 KSP Residual norm 3.943330715757e-01 2586 KSP Residual norm 3.943330715757e-01 2587 KSP Residual norm 3.943330715757e-01 2588 KSP Residual norm 3.943330715757e-01 2589 KSP Residual norm 3.943330715756e-01 2590 KSP Residual norm 3.943330715756e-01 2591 KSP Residual norm 3.943330715756e-01 2592 KSP Residual norm 3.943330715756e-01 2593 KSP Residual norm 3.943330715756e-01 2594 KSP Residual norm 3.943330715756e-01 2595 KSP Residual norm 3.943330715755e-01 2596 KSP Residual norm 3.943330715755e-01 2597 KSP Residual norm 3.943330715754e-01 2598 KSP Residual norm 3.943330715752e-01 2599 KSP Residual norm 3.943330715752e-01 2600 KSP Residual norm 3.943330715749e-01 2601 KSP Residual norm 3.943330715747e-01 2602 KSP Residual norm 3.943330715747e-01 2603 KSP Residual norm 3.943330715742e-01 2604 KSP Residual norm 3.943330715724e-01 2605 KSP Residual norm 3.943330715717e-01 2606 KSP Residual norm 3.943330715717e-01 2607 KSP Residual norm 3.943330715712e-01 2608 KSP Residual norm 3.943330715710e-01 2609 KSP Residual norm 3.943330715710e-01 2610 KSP Residual norm 3.943330715705e-01 2611 KSP Residual norm 3.943330715705e-01 2612 KSP Residual norm 3.943330715705e-01 2613 KSP Residual norm 3.943330715705e-01 2614 KSP Residual norm 3.943330715705e-01 2615 KSP Residual norm 3.943330715705e-01 2616 KSP Residual norm 3.943330715705e-01 2617 KSP Residual norm 3.943330715705e-01 2618 KSP Residual norm 3.943330715705e-01 2619 KSP Residual norm 3.943330715704e-01 2620 KSP Residual norm 3.943330715704e-01 2621 KSP Residual norm 3.943330715704e-01 2622 KSP Residual norm 3.943330715704e-01 2623 KSP Residual norm 3.943330715704e-01 2624 KSP Residual norm 3.943330715704e-01 2625 KSP Residual norm 3.943330715704e-01 2626 KSP Residual norm 3.943330715703e-01 2627 KSP Residual norm 3.943330715703e-01 2628 KSP Residual norm 3.943330715701e-01 2629 KSP Residual norm 3.943330715701e-01 2630 KSP Residual norm 3.943330715699e-01 2631 KSP Residual norm 3.943330715697e-01 2632 KSP Residual norm 3.943330715697e-01 2633 KSP Residual norm 3.943330715693e-01 2634 KSP Residual norm 3.943330715679e-01 2635 KSP Residual norm 3.943330715674e-01 2636 KSP Residual norm 3.943330715674e-01 2637 KSP Residual norm 
3.943330715670e-01 2638 KSP Residual norm 3.943330715668e-01 2639 KSP Residual norm 3.943330715668e-01 2640 KSP Residual norm 3.943330715663e-01 2641 KSP Residual norm 3.943330715663e-01 2642 KSP Residual norm 3.943330715663e-01 2643 KSP Residual norm 3.943330715663e-01 2644 KSP Residual norm 3.943330715663e-01 2645 KSP Residual norm 3.943330715663e-01 2646 KSP Residual norm 3.943330715663e-01 2647 KSP Residual norm 3.943330715663e-01 2648 KSP Residual norm 3.943330715663e-01 2649 KSP Residual norm 3.943330715663e-01 2650 KSP Residual norm 3.943330715663e-01 2651 KSP Residual norm 3.943330715663e-01 2652 KSP Residual norm 3.943330715663e-01 2653 KSP Residual norm 3.943330715663e-01 2654 KSP Residual norm 3.943330715663e-01 2655 KSP Residual norm 3.943330715663e-01 2656 KSP Residual norm 3.943330715663e-01 2657 KSP Residual norm 3.943330715662e-01 2658 KSP Residual norm 3.943330715661e-01 2659 KSP Residual norm 3.943330715661e-01 2660 KSP Residual norm 3.943330715659e-01 2661 KSP Residual norm 3.943330715658e-01 2662 KSP Residual norm 3.943330715658e-01 2663 KSP Residual norm 3.943330715655e-01 2664 KSP Residual norm 3.943330715644e-01 2665 KSP Residual norm 3.943330715640e-01 2666 KSP Residual norm 3.943330715640e-01 2667 KSP Residual norm 3.943330715637e-01 2668 KSP Residual norm 3.943330715635e-01 2669 KSP Residual norm 3.943330715635e-01 2670 KSP Residual norm 3.943330715634e-01 2671 KSP Residual norm 3.943330715634e-01 2672 KSP Residual norm 3.943330715634e-01 2673 KSP Residual norm 3.943330715634e-01 2674 KSP Residual norm 3.943330715634e-01 2675 KSP Residual norm 3.943330715634e-01 2676 KSP Residual norm 3.943330715634e-01 2677 KSP Residual norm 3.943330715633e-01 2678 KSP Residual norm 3.943330715633e-01 2679 KSP Residual norm 3.943330715633e-01 2680 KSP Residual norm 3.943330715633e-01 2681 KSP Residual norm 3.943330715633e-01 2682 KSP Residual norm 3.943330715633e-01 2683 KSP Residual norm 3.943330715633e-01 2684 KSP Residual norm 3.943330715633e-01 2685 KSP Residual norm 3.943330715633e-01 2686 KSP Residual norm 3.943330715633e-01 2687 KSP Residual norm 3.943330715632e-01 2688 KSP Residual norm 3.943330715631e-01 2689 KSP Residual norm 3.943330715631e-01 2690 KSP Residual norm 3.943330715630e-01 2691 KSP Residual norm 3.943330715629e-01 2692 KSP Residual norm 3.943330715629e-01 2693 KSP Residual norm 3.943330715627e-01 2694 KSP Residual norm 3.943330715619e-01 2695 KSP Residual norm 3.943330715615e-01 2696 KSP Residual norm 3.943330715615e-01 2697 KSP Residual norm 3.943330715613e-01 2698 KSP Residual norm 3.943330715612e-01 2699 KSP Residual norm 3.943330715612e-01 2700 KSP Residual norm 3.943330715609e-01 2701 KSP Residual norm 3.943330715609e-01 2702 KSP Residual norm 3.943330715609e-01 2703 KSP Residual norm 3.943330715609e-01 2704 KSP Residual norm 3.943330715609e-01 2705 KSP Residual norm 3.943330715609e-01 2706 KSP Residual norm 3.943330715609e-01 2707 KSP Residual norm 3.943330715609e-01 2708 KSP Residual norm 3.943330715609e-01 2709 KSP Residual norm 3.943330715609e-01 2710 KSP Residual norm 3.943330715609e-01 2711 KSP Residual norm 3.943330715609e-01 2712 KSP Residual norm 3.943330715609e-01 2713 KSP Residual norm 3.943330715609e-01 2714 KSP Residual norm 3.943330715608e-01 2715 KSP Residual norm 3.943330715608e-01 2716 KSP Residual norm 3.943330715608e-01 2717 KSP Residual norm 3.943330715608e-01 2718 KSP Residual norm 3.943330715607e-01 2719 KSP Residual norm 3.943330715607e-01 2720 KSP Residual norm 3.943330715606e-01 2721 KSP Residual norm 3.943330715605e-01 2722 
KSP Residual norm 3.943330715605e-01 2723 KSP Residual norm 3.943330715603e-01 2724 KSP Residual norm 3.943330715597e-01 2725 KSP Residual norm 3.943330715595e-01 2726 KSP Residual norm 3.943330715595e-01 2727 KSP Residual norm 3.943330715593e-01 2728 KSP Residual norm 3.943330715592e-01 2729 KSP Residual norm 3.943330715592e-01 2730 KSP Residual norm 3.943330715591e-01 2731 KSP Residual norm 3.943330715591e-01 2732 KSP Residual norm 3.943330715591e-01 2733 KSP Residual norm 3.943330715591e-01 2734 KSP Residual norm 3.943330715591e-01 2735 KSP Residual norm 3.943330715591e-01 2736 KSP Residual norm 3.943330715591e-01 2737 KSP Residual norm 3.943330715591e-01 2738 KSP Residual norm 3.943330715591e-01 2739 KSP Residual norm 3.943330715591e-01 2740 KSP Residual norm 3.943330715591e-01 2741 KSP Residual norm 3.943330715591e-01 2742 KSP Residual norm 3.943330715591e-01 2743 KSP Residual norm 3.943330715591e-01 2744 KSP Residual norm 3.943330715591e-01 2745 KSP Residual norm 3.943330715591e-01 2746 KSP Residual norm 3.943330715591e-01 2747 KSP Residual norm 3.943330715590e-01 2748 KSP Residual norm 3.943330715590e-01 2749 KSP Residual norm 3.943330715590e-01 2750 KSP Residual norm 3.943330715589e-01 2751 KSP Residual norm 3.943330715588e-01 2752 KSP Residual norm 3.943330715588e-01 2753 KSP Residual norm 3.943330715587e-01 2754 KSP Residual norm 3.943330715582e-01 2755 KSP Residual norm 3.943330715580e-01 2756 KSP Residual norm 3.943330715580e-01 2757 KSP Residual norm 3.943330715579e-01 2758 KSP Residual norm 3.943330715578e-01 2759 KSP Residual norm 3.943330715578e-01 2760 KSP Residual norm 3.943330715576e-01 2761 KSP Residual norm 3.943330715576e-01 2762 KSP Residual norm 3.943330715576e-01 2763 KSP Residual norm 3.943330715576e-01 2764 KSP Residual norm 3.943330715576e-01 2765 KSP Residual norm 3.943330715576e-01 2766 KSP Residual norm 3.943330715576e-01 2767 KSP Residual norm 3.943330715576e-01 2768 KSP Residual norm 3.943330715576e-01 2769 KSP Residual norm 3.943330715576e-01 2770 KSP Residual norm 3.943330715576e-01 2771 KSP Residual norm 3.943330715576e-01 2772 KSP Residual norm 3.943330715576e-01 2773 KSP Residual norm 3.943330715576e-01 2774 KSP Residual norm 3.943330715576e-01 2775 KSP Residual norm 3.943330715576e-01 2776 KSP Residual norm 3.943330715576e-01 2777 KSP Residual norm 3.943330715576e-01 2778 KSP Residual norm 3.943330715575e-01 2779 KSP Residual norm 3.943330715575e-01 2780 KSP Residual norm 3.943330715575e-01 2781 KSP Residual norm 3.943330715574e-01 2782 KSP Residual norm 3.943330715574e-01 2783 KSP Residual norm 3.943330715573e-01 2784 KSP Residual norm 3.943330715569e-01 2785 KSP Residual norm 3.943330715568e-01 2786 KSP Residual norm 3.943330715568e-01 2787 KSP Residual norm 3.943330715567e-01 2788 KSP Residual norm 3.943330715566e-01 2789 KSP Residual norm 3.943330715566e-01 2790 KSP Residual norm 3.943330715565e-01 2791 KSP Residual norm 3.943330715565e-01 2792 KSP Residual norm 3.943330715565e-01 2793 KSP Residual norm 3.943330715565e-01 2794 KSP Residual norm 3.943330715565e-01 2795 KSP Residual norm 3.943330715565e-01 2796 KSP Residual norm 3.943330715565e-01 2797 KSP Residual norm 3.943330715565e-01 2798 KSP Residual norm 3.943330715565e-01 2799 KSP Residual norm 3.943330715565e-01 2800 KSP Residual norm 3.943330715565e-01 2801 KSP Residual norm 3.943330715565e-01 2802 KSP Residual norm 3.943330715565e-01 2803 KSP Residual norm 3.943330715565e-01 2804 KSP Residual norm 3.943330715565e-01 2805 KSP Residual norm 3.943330715565e-01 2806 KSP Residual norm 
3.943330715565e-01 2807 KSP Residual norm 3.943330715565e-01 2808 KSP Residual norm 3.943330715564e-01 2809 KSP Residual norm 3.943330715564e-01 2810 KSP Residual norm 3.943330715564e-01 2811 KSP Residual norm 3.943330715563e-01 2812 KSP Residual norm 3.943330715563e-01 2813 KSP Residual norm 3.943330715563e-01 2814 KSP Residual norm 3.943330715560e-01 2815 KSP Residual norm 3.943330715559e-01 2816 KSP Residual norm 3.943330715559e-01 2817 KSP Residual norm 3.943330715558e-01 2818 KSP Residual norm 3.943330715557e-01 2819 KSP Residual norm 3.943330715557e-01 2820 KSP Residual norm 3.943330715556e-01 2821 KSP Residual norm 3.943330715556e-01 2822 KSP Residual norm 3.943330715556e-01 2823 KSP Residual norm 3.943330715556e-01 2824 KSP Residual norm 3.943330715556e-01 2825 KSP Residual norm 3.943330715556e-01 2826 KSP Residual norm 3.943330715556e-01 2827 KSP Residual norm 3.943330715556e-01 2828 KSP Residual norm 3.943330715556e-01 2829 KSP Residual norm 3.943330715556e-01 2830 KSP Residual norm 3.943330715556e-01 2831 KSP Residual norm 3.943330715556e-01 2832 KSP Residual norm 3.943330715556e-01 2833 KSP Residual norm 3.943330715556e-01 2834 KSP Residual norm 3.943330715556e-01 2835 KSP Residual norm 3.943330715555e-01 2836 KSP Residual norm 3.943330715555e-01 2837 KSP Residual norm 3.943330715555e-01 2838 KSP Residual norm 3.943330715555e-01 2839 KSP Residual norm 3.943330715555e-01 2840 KSP Residual norm 3.943330715555e-01 2841 KSP Residual norm 3.943330715554e-01 2842 KSP Residual norm 3.943330715554e-01 2843 KSP Residual norm 3.943330715554e-01 2844 KSP Residual norm 3.943330715551e-01 2845 KSP Residual norm 3.943330715551e-01 2846 KSP Residual norm 3.943330715551e-01 2847 KSP Residual norm 3.943330715550e-01 2848 KSP Residual norm 3.943330715550e-01 2849 KSP Residual norm 3.943330715550e-01 2850 KSP Residual norm 3.943330715550e-01 2851 KSP Residual norm 3.943330715550e-01 2852 KSP Residual norm 3.943330715550e-01 2853 KSP Residual norm 3.943330715550e-01 2854 KSP Residual norm 3.943330715550e-01 2855 KSP Residual norm 3.943330715550e-01 2856 KSP Residual norm 3.943330715550e-01 2857 KSP Residual norm 3.943330715550e-01 2858 KSP Residual norm 3.943330715550e-01 2859 KSP Residual norm 3.943330715550e-01 2860 KSP Residual norm 3.943330715550e-01 2861 KSP Residual norm 3.943330715550e-01 2862 KSP Residual norm 3.943330715550e-01 2863 KSP Residual norm 3.943330715550e-01 2864 KSP Residual norm 3.943330715550e-01 2865 KSP Residual norm 3.943330715550e-01 2866 KSP Residual norm 3.943330715550e-01 2867 KSP Residual norm 3.943330715550e-01 2868 KSP Residual norm 3.943330715549e-01 2869 KSP Residual norm 3.943330715549e-01 2870 KSP Residual norm 3.943330715549e-01 2871 KSP Residual norm 3.943330715549e-01 2872 KSP Residual norm 3.943330715549e-01 2873 KSP Residual norm 3.943330715548e-01 2874 KSP Residual norm 3.943330715547e-01 2875 KSP Residual norm 3.943330715546e-01 2876 KSP Residual norm 3.943330715546e-01 2877 KSP Residual norm 3.943330715545e-01 2878 KSP Residual norm 3.943330715545e-01 2879 KSP Residual norm 3.943330715545e-01 2880 KSP Residual norm 3.943330715545e-01 2881 KSP Residual norm 3.943330715545e-01 2882 KSP Residual norm 3.943330715545e-01 2883 KSP Residual norm 3.943330715545e-01 2884 KSP Residual norm 3.943330715545e-01 2885 KSP Residual norm 3.943330715545e-01 2886 KSP Residual norm 3.943330715545e-01 2887 KSP Residual norm 3.943330715545e-01 2888 KSP Residual norm 3.943330715545e-01 2889 KSP Residual norm 3.943330715545e-01 2890 KSP Residual norm 3.943330715545e-01 2891 
KSP Residual norm 3.943330715545e-01 2892 KSP Residual norm 3.943330715545e-01 2893 KSP Residual norm 3.943330715545e-01 2894 KSP Residual norm 3.943330715545e-01 2895 KSP Residual norm 3.943330715545e-01 2896 KSP Residual norm 3.943330715545e-01 2897 KSP Residual norm 3.943330715545e-01 2898 KSP Residual norm 3.943330715545e-01 2899 KSP Residual norm 3.943330715545e-01 2900 KSP Residual norm 3.943330715544e-01 2901 KSP Residual norm 3.943330715544e-01 2902 KSP Residual norm 3.943330715544e-01 2903 KSP Residual norm 3.943330715544e-01 2904 KSP Residual norm 3.943330715543e-01 2905 KSP Residual norm 3.943330715542e-01 2906 KSP Residual norm 3.943330715542e-01 2907 KSP Residual norm 3.943330715542e-01 2908 KSP Residual norm 3.943330715542e-01 2909 KSP Residual norm 3.943330715542e-01 2910 KSP Residual norm 3.943330715541e-01 2911 KSP Residual norm 3.943330715541e-01 2912 KSP Residual norm 3.943330715541e-01 2913 KSP Residual norm 3.943330715541e-01 2914 KSP Residual norm 3.943330715541e-01 2915 KSP Residual norm 3.943330715541e-01 2916 KSP Residual norm 3.943330715541e-01 2917 KSP Residual norm 3.943330715541e-01 2918 KSP Residual norm 3.943330715541e-01 2919 KSP Residual norm 3.943330715541e-01 2920 KSP Residual norm 3.943330715541e-01 2921 KSP Residual norm 3.943330715541e-01 2922 KSP Residual norm 3.943330715541e-01 2923 KSP Residual norm 3.943330715541e-01 2924 KSP Residual norm 3.943330715541e-01 2925 KSP Residual norm 3.943330715541e-01 2926 KSP Residual norm 3.943330715541e-01 2927 KSP Residual norm 3.943330715541e-01 2928 KSP Residual norm 3.943330715540e-01 2929 KSP Residual norm 3.943330715540e-01 2930 KSP Residual norm 3.943330715540e-01 2931 KSP Residual norm 3.943330715540e-01 2932 KSP Residual norm 3.943330715540e-01 2933 KSP Residual norm 3.943330715540e-01 2934 KSP Residual norm 3.943330715539e-01 2935 KSP Residual norm 3.943330715538e-01 2936 KSP Residual norm 3.943330715538e-01 2937 KSP Residual norm 3.943330715538e-01 2938 KSP Residual norm 3.943330715538e-01 2939 KSP Residual norm 3.943330715538e-01 2940 KSP Residual norm 3.943330715536e-01 2941 KSP Residual norm 3.943330715536e-01 2942 KSP Residual norm 3.943330715536e-01 2943 KSP Residual norm 3.943330715536e-01 2944 KSP Residual norm 3.943330715536e-01 2945 KSP Residual norm 3.943330715536e-01 2946 KSP Residual norm 3.943330715536e-01 2947 KSP Residual norm 3.943330715536e-01 2948 KSP Residual norm 3.943330715536e-01 2949 KSP Residual norm 3.943330715536e-01 2950 KSP Residual norm 3.943330715536e-01 2951 KSP Residual norm 3.943330715536e-01 2952 KSP Residual norm 3.943330715536e-01 2953 KSP Residual norm 3.943330715536e-01 2954 KSP Residual norm 3.943330715536e-01 2955 KSP Residual norm 3.943330715536e-01 2956 KSP Residual norm 3.943330715536e-01 2957 KSP Residual norm 3.943330715536e-01 2958 KSP Residual norm 3.943330715536e-01 2959 KSP Residual norm 3.943330715536e-01 2960 KSP Residual norm 3.943330715536e-01 2961 KSP Residual norm 3.943330715536e-01 2962 KSP Residual norm 3.943330715536e-01 2963 KSP Residual norm 3.943330715535e-01 2964 KSP Residual norm 3.943330715535e-01 2965 KSP Residual norm 3.943330715534e-01 2966 KSP Residual norm 3.943330715534e-01 2967 KSP Residual norm 3.943330715534e-01 2968 KSP Residual norm 3.943330715534e-01 2969 KSP Residual norm 3.943330715534e-01 2970 KSP Residual norm 3.943330715532e-01 2971 KSP Residual norm 3.943330715532e-01 2972 KSP Residual norm 3.943330715532e-01 2973 KSP Residual norm 3.943330715532e-01 2974 KSP Residual norm 3.943330715532e-01 2975 KSP Residual norm 
3.943330715532e-01 2976 KSP Residual norm 3.943330715532e-01
[KSP monitor output for iterations 2977 through 4749 omitted; the residual norm stagnates near 3.9433307155e-01 over the entire range, varying only in the last digit between 3.943330715525e-01 and 3.943330715535e-01]
4750
KSP Residual norm 3.943330715530e-01 4751 KSP Residual norm 3.943330715530e-01 4752 KSP Residual norm 3.943330715530e-01 4753 KSP Residual norm 3.943330715530e-01 4754 KSP Residual norm 3.943330715530e-01 4755 KSP Residual norm 3.943330715530e-01 4756 KSP Residual norm 3.943330715530e-01 4757 KSP Residual norm 3.943330715530e-01 4758 KSP Residual norm 3.943330715530e-01 4759 KSP Residual norm 3.943330715530e-01 4760 KSP Residual norm 3.943330715530e-01 4761 KSP Residual norm 3.943330715530e-01 4762 KSP Residual norm 3.943330715530e-01 4763 KSP Residual norm 3.943330715530e-01 4764 KSP Residual norm 3.943330715530e-01 4765 KSP Residual norm 3.943330715530e-01 4766 KSP Residual norm 3.943330715530e-01 4767 KSP Residual norm 3.943330715530e-01 4768 KSP Residual norm 3.943330715530e-01 4769 KSP Residual norm 3.943330715530e-01 4770 KSP Residual norm 3.943330715530e-01 4771 KSP Residual norm 3.943330715530e-01 4772 KSP Residual norm 3.943330715530e-01 4773 KSP Residual norm 3.943330715530e-01 4774 KSP Residual norm 3.943330715530e-01 4775 KSP Residual norm 3.943330715530e-01 4776 KSP Residual norm 3.943330715530e-01 4777 KSP Residual norm 3.943330715530e-01 4778 KSP Residual norm 3.943330715530e-01 4779 KSP Residual norm 3.943330715530e-01 4780 KSP Residual norm 3.943330715530e-01 4781 KSP Residual norm 3.943330715530e-01 4782 KSP Residual norm 3.943330715530e-01 4783 KSP Residual norm 3.943330715530e-01 4784 KSP Residual norm 3.943330715530e-01 4785 KSP Residual norm 3.943330715530e-01 4786 KSP Residual norm 3.943330715530e-01 4787 KSP Residual norm 3.943330715530e-01 4788 KSP Residual norm 3.943330715530e-01 4789 KSP Residual norm 3.943330715530e-01 4790 KSP Residual norm 3.943330715530e-01 4791 KSP Residual norm 3.943330715530e-01 4792 KSP Residual norm 3.943330715530e-01 4793 KSP Residual norm 3.943330715530e-01 4794 KSP Residual norm 3.943330715530e-01 4795 KSP Residual norm 3.943330715530e-01 4796 KSP Residual norm 3.943330715530e-01 4797 KSP Residual norm 3.943330715530e-01 4798 KSP Residual norm 3.943330715530e-01 4799 KSP Residual norm 3.943330715530e-01 4800 KSP Residual norm 3.943330715530e-01 4801 KSP Residual norm 3.943330715530e-01 4802 KSP Residual norm 3.943330715530e-01 4803 KSP Residual norm 3.943330715530e-01 4804 KSP Residual norm 3.943330715530e-01 4805 KSP Residual norm 3.943330715530e-01 4806 KSP Residual norm 3.943330715530e-01 4807 KSP Residual norm 3.943330715530e-01 4808 KSP Residual norm 3.943330715530e-01 4809 KSP Residual norm 3.943330715530e-01 4810 KSP Residual norm 3.943330715530e-01 4811 KSP Residual norm 3.943330715530e-01 4812 KSP Residual norm 3.943330715530e-01 4813 KSP Residual norm 3.943330715530e-01 4814 KSP Residual norm 3.943330715530e-01 4815 KSP Residual norm 3.943330715530e-01 4816 KSP Residual norm 3.943330715530e-01 4817 KSP Residual norm 3.943330715530e-01 4818 KSP Residual norm 3.943330715530e-01 4819 KSP Residual norm 3.943330715530e-01 4820 KSP Residual norm 3.943330715530e-01 4821 KSP Residual norm 3.943330715530e-01 4822 KSP Residual norm 3.943330715530e-01 4823 KSP Residual norm 3.943330715530e-01 4824 KSP Residual norm 3.943330715530e-01 4825 KSP Residual norm 3.943330715530e-01 4826 KSP Residual norm 3.943330715530e-01 4827 KSP Residual norm 3.943330715530e-01 4828 KSP Residual norm 3.943330715530e-01 4829 KSP Residual norm 3.943330715530e-01 4830 KSP Residual norm 3.943330715531e-01 4831 KSP Residual norm 3.943330715531e-01 4832 KSP Residual norm 3.943330715531e-01 4833 KSP Residual norm 3.943330715531e-01 4834 KSP Residual norm 
3.943330715531e-01 4835 KSP Residual norm 3.943330715531e-01 4836 KSP Residual norm 3.943330715531e-01 4837 KSP Residual norm 3.943330715531e-01 4838 KSP Residual norm 3.943330715531e-01 4839 KSP Residual norm 3.943330715531e-01 4840 KSP Residual norm 3.943330715531e-01 4841 KSP Residual norm 3.943330715531e-01 4842 KSP Residual norm 3.943330715531e-01 4843 KSP Residual norm 3.943330715531e-01 4844 KSP Residual norm 3.943330715531e-01 4845 KSP Residual norm 3.943330715531e-01 4846 KSP Residual norm 3.943330715531e-01 4847 KSP Residual norm 3.943330715531e-01 4848 KSP Residual norm 3.943330715531e-01 4849 KSP Residual norm 3.943330715531e-01 4850 KSP Residual norm 3.943330715531e-01 4851 KSP Residual norm 3.943330715531e-01 4852 KSP Residual norm 3.943330715531e-01 4853 KSP Residual norm 3.943330715531e-01 4854 KSP Residual norm 3.943330715531e-01 4855 KSP Residual norm 3.943330715531e-01 4856 KSP Residual norm 3.943330715531e-01 4857 KSP Residual norm 3.943330715531e-01 4858 KSP Residual norm 3.943330715531e-01 4859 KSP Residual norm 3.943330715531e-01 4860 KSP Residual norm 3.943330715530e-01 4861 KSP Residual norm 3.943330715530e-01 4862 KSP Residual norm 3.943330715530e-01 4863 KSP Residual norm 3.943330715530e-01 4864 KSP Residual norm 3.943330715530e-01 4865 KSP Residual norm 3.943330715530e-01 4866 KSP Residual norm 3.943330715530e-01 4867 KSP Residual norm 3.943330715530e-01 4868 KSP Residual norm 3.943330715530e-01 4869 KSP Residual norm 3.943330715530e-01 4870 KSP Residual norm 3.943330715530e-01 4871 KSP Residual norm 3.943330715530e-01 4872 KSP Residual norm 3.943330715530e-01 4873 KSP Residual norm 3.943330715530e-01 4874 KSP Residual norm 3.943330715530e-01 4875 KSP Residual norm 3.943330715530e-01 4876 KSP Residual norm 3.943330715530e-01 4877 KSP Residual norm 3.943330715530e-01 4878 KSP Residual norm 3.943330715530e-01 4879 KSP Residual norm 3.943330715530e-01 4880 KSP Residual norm 3.943330715530e-01 4881 KSP Residual norm 3.943330715530e-01 4882 KSP Residual norm 3.943330715530e-01 4883 KSP Residual norm 3.943330715530e-01 4884 KSP Residual norm 3.943330715530e-01 4885 KSP Residual norm 3.943330715530e-01 4886 KSP Residual norm 3.943330715530e-01 4887 KSP Residual norm 3.943330715530e-01 4888 KSP Residual norm 3.943330715530e-01 4889 KSP Residual norm 3.943330715530e-01 4890 KSP Residual norm 3.943330715530e-01 4891 KSP Residual norm 3.943330715530e-01 4892 KSP Residual norm 3.943330715530e-01 4893 KSP Residual norm 3.943330715530e-01 4894 KSP Residual norm 3.943330715530e-01 4895 KSP Residual norm 3.943330715530e-01 4896 KSP Residual norm 3.943330715530e-01 4897 KSP Residual norm 3.943330715530e-01 4898 KSP Residual norm 3.943330715530e-01 4899 KSP Residual norm 3.943330715530e-01 4900 KSP Residual norm 3.943330715530e-01 4901 KSP Residual norm 3.943330715530e-01 4902 KSP Residual norm 3.943330715530e-01 4903 KSP Residual norm 3.943330715530e-01 4904 KSP Residual norm 3.943330715530e-01 4905 KSP Residual norm 3.943330715530e-01 4906 KSP Residual norm 3.943330715530e-01 4907 KSP Residual norm 3.943330715530e-01 4908 KSP Residual norm 3.943330715530e-01 4909 KSP Residual norm 3.943330715530e-01 4910 KSP Residual norm 3.943330715530e-01 4911 KSP Residual norm 3.943330715530e-01 4912 KSP Residual norm 3.943330715530e-01 4913 KSP Residual norm 3.943330715530e-01 4914 KSP Residual norm 3.943330715530e-01 4915 KSP Residual norm 3.943330715530e-01 4916 KSP Residual norm 3.943330715530e-01 4917 KSP Residual norm 3.943330715530e-01 4918 KSP Residual norm 3.943330715530e-01 4919 
KSP Residual norm 3.943330715530e-01 4920 KSP Residual norm 3.943330715532e-01 4921 KSP Residual norm 3.943330715532e-01 4922 KSP Residual norm 3.943330715532e-01 4923 KSP Residual norm 3.943330715532e-01 4924 KSP Residual norm 3.943330715532e-01 4925 KSP Residual norm 3.943330715532e-01 4926 KSP Residual norm 3.943330715532e-01 4927 KSP Residual norm 3.943330715532e-01 4928 KSP Residual norm 3.943330715532e-01 4929 KSP Residual norm 3.943330715532e-01 4930 KSP Residual norm 3.943330715532e-01 4931 KSP Residual norm 3.943330715532e-01 4932 KSP Residual norm 3.943330715532e-01 4933 KSP Residual norm 3.943330715532e-01 4934 KSP Residual norm 3.943330715532e-01 4935 KSP Residual norm 3.943330715532e-01 4936 KSP Residual norm 3.943330715532e-01 4937 KSP Residual norm 3.943330715532e-01 4938 KSP Residual norm 3.943330715532e-01 4939 KSP Residual norm 3.943330715532e-01 4940 KSP Residual norm 3.943330715532e-01 4941 KSP Residual norm 3.943330715532e-01 4942 KSP Residual norm 3.943330715532e-01 4943 KSP Residual norm 3.943330715532e-01 4944 KSP Residual norm 3.943330715532e-01 4945 KSP Residual norm 3.943330715532e-01 4946 KSP Residual norm 3.943330715532e-01 4947 KSP Residual norm 3.943330715532e-01 4948 KSP Residual norm 3.943330715532e-01 4949 KSP Residual norm 3.943330715532e-01 4950 KSP Residual norm 3.943330715531e-01 4951 KSP Residual norm 3.943330715531e-01 4952 KSP Residual norm 3.943330715531e-01 4953 KSP Residual norm 3.943330715531e-01 4954 KSP Residual norm 3.943330715531e-01 4955 KSP Residual norm 3.943330715531e-01 4956 KSP Residual norm 3.943330715531e-01 4957 KSP Residual norm 3.943330715531e-01 4958 KSP Residual norm 3.943330715531e-01 4959 KSP Residual norm 3.943330715531e-01 4960 KSP Residual norm 3.943330715531e-01 4961 KSP Residual norm 3.943330715531e-01 4962 KSP Residual norm 3.943330715531e-01 4963 KSP Residual norm 3.943330715531e-01 4964 KSP Residual norm 3.943330715531e-01 4965 KSP Residual norm 3.943330715531e-01 4966 KSP Residual norm 3.943330715531e-01 4967 KSP Residual norm 3.943330715531e-01 4968 KSP Residual norm 3.943330715531e-01 4969 KSP Residual norm 3.943330715531e-01 4970 KSP Residual norm 3.943330715531e-01 4971 KSP Residual norm 3.943330715531e-01 4972 KSP Residual norm 3.943330715531e-01 4973 KSP Residual norm 3.943330715531e-01 4974 KSP Residual norm 3.943330715531e-01 4975 KSP Residual norm 3.943330715531e-01 4976 KSP Residual norm 3.943330715531e-01 4977 KSP Residual norm 3.943330715531e-01 4978 KSP Residual norm 3.943330715531e-01 4979 KSP Residual norm 3.943330715531e-01 4980 KSP Residual norm 3.943330715531e-01 4981 KSP Residual norm 3.943330715531e-01 4982 KSP Residual norm 3.943330715531e-01 4983 KSP Residual norm 3.943330715531e-01 4984 KSP Residual norm 3.943330715531e-01 4985 KSP Residual norm 3.943330715531e-01 4986 KSP Residual norm 3.943330715531e-01 4987 KSP Residual norm 3.943330715531e-01 4988 KSP Residual norm 3.943330715531e-01 4989 KSP Residual norm 3.943330715531e-01 4990 KSP Residual norm 3.943330715531e-01 4991 KSP Residual norm 3.943330715531e-01 4992 KSP Residual norm 3.943330715531e-01 4993 KSP Residual norm 3.943330715531e-01 4994 KSP Residual norm 3.943330715531e-01 4995 KSP Residual norm 3.943330715531e-01 4996 KSP Residual norm 3.943330715531e-01 4997 KSP Residual norm 3.943330715531e-01 4998 KSP Residual norm 3.943330715531e-01 4999 KSP Residual norm 3.943330715531e-01 5000 KSP Residual norm 3.943330715531e-01 5001 KSP Residual norm 3.943330715531e-01 5002 KSP Residual norm 3.943330715531e-01 5003 KSP Residual norm 
3.943330715531e-01 5004 KSP Residual norm 3.943330715531e-01 5005 KSP Residual norm 3.943330715531e-01 5006 KSP Residual norm 3.943330715531e-01 5007 KSP Residual norm 3.943330715531e-01 5008 KSP Residual norm 3.943330715531e-01 5009 KSP Residual norm 3.943330715531e-01 5010 KSP Residual norm 3.943330715532e-01 5011 KSP Residual norm 3.943330715532e-01 5012 KSP Residual norm 3.943330715532e-01 5013 KSP Residual norm 3.943330715532e-01 5014 KSP Residual norm 3.943330715532e-01 5015 KSP Residual norm 3.943330715532e-01 5016 KSP Residual norm 3.943330715532e-01 5017 KSP Residual norm 3.943330715532e-01 5018 KSP Residual norm 3.943330715532e-01 5019 KSP Residual norm 3.943330715532e-01 5020 KSP Residual norm 3.943330715532e-01 5021 KSP Residual norm 3.943330715532e-01 5022 KSP Residual norm 3.943330715532e-01 5023 KSP Residual norm 3.943330715532e-01 5024 KSP Residual norm 3.943330715532e-01 5025 KSP Residual norm 3.943330715532e-01 5026 KSP Residual norm 3.943330715532e-01 5027 KSP Residual norm 3.943330715532e-01 5028 KSP Residual norm 3.943330715532e-01 5029 KSP Residual norm 3.943330715532e-01 5030 KSP Residual norm 3.943330715532e-01 5031 KSP Residual norm 3.943330715532e-01 5032 KSP Residual norm 3.943330715532e-01 5033 KSP Residual norm 3.943330715532e-01 5034 KSP Residual norm 3.943330715532e-01 5035 KSP Residual norm 3.943330715532e-01 5036 KSP Residual norm 3.943330715532e-01 5037 KSP Residual norm 3.943330715532e-01 5038 KSP Residual norm 3.943330715532e-01 5039 KSP Residual norm 3.943330715532e-01 5040 KSP Residual norm 3.943330715533e-01 5041 KSP Residual norm 3.943330715533e-01 5042 KSP Residual norm 3.943330715533e-01 5043 KSP Residual norm 3.943330715533e-01 5044 KSP Residual norm 3.943330715533e-01 5045 KSP Residual norm 3.943330715533e-01 5046 KSP Residual norm 3.943330715533e-01 5047 KSP Residual norm 3.943330715533e-01 5048 KSP Residual norm 3.943330715533e-01 5049 KSP Residual norm 3.943330715533e-01 5050 KSP Residual norm 3.943330715533e-01 5051 KSP Residual norm 3.943330715533e-01 5052 KSP Residual norm 3.943330715533e-01 5053 KSP Residual norm 3.943330715533e-01 5054 KSP Residual norm 3.943330715533e-01 5055 KSP Residual norm 3.943330715533e-01 5056 KSP Residual norm 3.943330715533e-01 5057 KSP Residual norm 3.943330715533e-01 5058 KSP Residual norm 3.943330715533e-01 5059 KSP Residual norm 3.943330715533e-01 5060 KSP Residual norm 3.943330715533e-01 5061 KSP Residual norm 3.943330715533e-01 5062 KSP Residual norm 3.943330715533e-01 5063 KSP Residual norm 3.943330715533e-01 5064 KSP Residual norm 3.943330715533e-01 5065 KSP Residual norm 3.943330715533e-01 5066 KSP Residual norm 3.943330715533e-01 5067 KSP Residual norm 3.943330715533e-01 5068 KSP Residual norm 3.943330715533e-01 5069 KSP Residual norm 3.943330715533e-01 5070 KSP Residual norm 3.943330715535e-01 5071 KSP Residual norm 3.943330715535e-01 5072 KSP Residual norm 3.943330715535e-01 5073 KSP Residual norm 3.943330715535e-01 5074 KSP Residual norm 3.943330715535e-01 5075 KSP Residual norm 3.943330715535e-01 5076 KSP Residual norm 3.943330715535e-01 5077 KSP Residual norm 3.943330715535e-01 5078 KSP Residual norm 3.943330715535e-01 5079 KSP Residual norm 3.943330715535e-01 5080 KSP Residual norm 3.943330715535e-01 5081 KSP Residual norm 3.943330715535e-01 5082 KSP Residual norm 3.943330715535e-01 5083 KSP Residual norm 3.943330715535e-01 5084 KSP Residual norm 3.943330715535e-01 5085 KSP Residual norm 3.943330715535e-01 5086 KSP Residual norm 3.943330715535e-01 5087 KSP Residual norm 3.943330715535e-01 5088 
KSP Residual norm 3.943330715535e-01 5089 KSP Residual norm 3.943330715535e-01 5090 KSP Residual norm 3.943330715535e-01 5091 KSP Residual norm 3.943330715535e-01 5092 KSP Residual norm 3.943330715535e-01 5093 KSP Residual norm 3.943330715535e-01 5094 KSP Residual norm 3.943330715535e-01 5095 KSP Residual norm 3.943330715535e-01 5096 KSP Residual norm 3.943330715535e-01 5097 KSP Residual norm 3.943330715535e-01 5098 KSP Residual norm 3.943330715535e-01 5099 KSP Residual norm 3.943330715535e-01 5100 KSP Residual norm 3.943330715535e-01 5101 KSP Residual norm 3.943330715535e-01 5102 KSP Residual norm 3.943330715535e-01 5103 KSP Residual norm 3.943330715535e-01 5104 KSP Residual norm 3.943330715535e-01 5105 KSP Residual norm 3.943330715535e-01 5106 KSP Residual norm 3.943330715535e-01 5107 KSP Residual norm 3.943330715535e-01 5108 KSP Residual norm 3.943330715535e-01 5109 KSP Residual norm 3.943330715535e-01 5110 KSP Residual norm 3.943330715535e-01 5111 KSP Residual norm 3.943330715535e-01 5112 KSP Residual norm 3.943330715535e-01 5113 KSP Residual norm 3.943330715535e-01 5114 KSP Residual norm 3.943330715535e-01 5115 KSP Residual norm 3.943330715535e-01 5116 KSP Residual norm 3.943330715535e-01 5117 KSP Residual norm 3.943330715535e-01 5118 KSP Residual norm 3.943330715535e-01 5119 KSP Residual norm 3.943330715535e-01 5120 KSP Residual norm 3.943330715535e-01 5121 KSP Residual norm 3.943330715535e-01 5122 KSP Residual norm 3.943330715535e-01 5123 KSP Residual norm 3.943330715535e-01 5124 KSP Residual norm 3.943330715535e-01 5125 KSP Residual norm 3.943330715535e-01 5126 KSP Residual norm 3.943330715535e-01 5127 KSP Residual norm 3.943330715535e-01 5128 KSP Residual norm 3.943330715535e-01 5129 KSP Residual norm 3.943330715535e-01 5130 KSP Residual norm 3.943330715534e-01 5131 KSP Residual norm 3.943330715534e-01 5132 KSP Residual norm 3.943330715534e-01 5133 KSP Residual norm 3.943330715534e-01 5134 KSP Residual norm 3.943330715534e-01 5135 KSP Residual norm 3.943330715534e-01 5136 KSP Residual norm 3.943330715534e-01 5137 KSP Residual norm 3.943330715534e-01 5138 KSP Residual norm 3.943330715534e-01 5139 KSP Residual norm 3.943330715534e-01 5140 KSP Residual norm 3.943330715534e-01 5141 KSP Residual norm 3.943330715534e-01 5142 KSP Residual norm 3.943330715534e-01 5143 KSP Residual norm 3.943330715534e-01 5144 KSP Residual norm 3.943330715534e-01 5145 KSP Residual norm 3.943330715534e-01 5146 KSP Residual norm 3.943330715534e-01 5147 KSP Residual norm 3.943330715534e-01 5148 KSP Residual norm 3.943330715534e-01 5149 KSP Residual norm 3.943330715534e-01 5150 KSP Residual norm 3.943330715534e-01 5151 KSP Residual norm 3.943330715534e-01 5152 KSP Residual norm 3.943330715534e-01 5153 KSP Residual norm 3.943330715534e-01 5154 KSP Residual norm 3.943330715534e-01 5155 KSP Residual norm 3.943330715534e-01 5156 KSP Residual norm 3.943330715534e-01 5157 KSP Residual norm 3.943330715534e-01 5158 KSP Residual norm 3.943330715534e-01 5159 KSP Residual norm 3.943330715534e-01 5160 KSP Residual norm 3.943330715533e-01 5161 KSP Residual norm 3.943330715533e-01 5162 KSP Residual norm 3.943330715533e-01 5163 KSP Residual norm 3.943330715533e-01 5164 KSP Residual norm 3.943330715533e-01 5165 KSP Residual norm 3.943330715533e-01 5166 KSP Residual norm 3.943330715533e-01 5167 KSP Residual norm 3.943330715533e-01 5168 KSP Residual norm 3.943330715533e-01 5169 KSP Residual norm 3.943330715533e-01 5170 KSP Residual norm 3.943330715533e-01 5171 KSP Residual norm 3.943330715533e-01 5172 KSP Residual norm 
3.943330715533e-01 5173 KSP Residual norm 3.943330715533e-01 5174 KSP Residual norm 3.943330715533e-01 5175 KSP Residual norm 3.943330715533e-01 5176 KSP Residual norm 3.943330715533e-01 5177 KSP Residual norm 3.943330715533e-01 5178 KSP Residual norm 3.943330715533e-01 5179 KSP Residual norm 3.943330715533e-01 5180 KSP Residual norm 3.943330715533e-01 5181 KSP Residual norm 3.943330715533e-01 5182 KSP Residual norm 3.943330715533e-01 5183 KSP Residual norm 3.943330715533e-01 5184 KSP Residual norm 3.943330715533e-01 5185 KSP Residual norm 3.943330715533e-01 5186 KSP Residual norm 3.943330715533e-01 5187 KSP Residual norm 3.943330715533e-01 5188 KSP Residual norm 3.943330715533e-01 5189 KSP Residual norm 3.943330715533e-01 5190 KSP Residual norm 3.943330715533e-01 5191 KSP Residual norm 3.943330715533e-01 5192 KSP Residual norm 3.943330715533e-01 5193 KSP Residual norm 3.943330715533e-01 5194 KSP Residual norm 3.943330715533e-01 5195 KSP Residual norm 3.943330715533e-01 5196 KSP Residual norm 3.943330715533e-01 5197 KSP Residual norm 3.943330715533e-01 5198 KSP Residual norm 3.943330715533e-01 5199 KSP Residual norm 3.943330715533e-01 5200 KSP Residual norm 3.943330715533e-01 5201 KSP Residual norm 3.943330715533e-01 5202 KSP Residual norm 3.943330715533e-01 5203 KSP Residual norm 3.943330715533e-01 5204 KSP Residual norm 3.943330715533e-01 5205 KSP Residual norm 3.943330715533e-01 5206 KSP Residual norm 3.943330715533e-01 5207 KSP Residual norm 3.943330715533e-01 5208 KSP Residual norm 3.943330715533e-01 5209 KSP Residual norm 3.943330715533e-01 5210 KSP Residual norm 3.943330715533e-01 5211 KSP Residual norm 3.943330715533e-01 5212 KSP Residual norm 3.943330715533e-01 5213 KSP Residual norm 3.943330715533e-01 5214 KSP Residual norm 3.943330715533e-01 5215 KSP Residual norm 3.943330715533e-01 5216 KSP Residual norm 3.943330715533e-01 5217 KSP Residual norm 3.943330715533e-01 5218 KSP Residual norm 3.943330715533e-01 5219 KSP Residual norm 3.943330715533e-01 5220 KSP Residual norm 3.943330715536e-01 5221 KSP Residual norm 3.943330715536e-01 5222 KSP Residual norm 3.943330715536e-01 5223 KSP Residual norm 3.943330715536e-01 5224 KSP Residual norm 3.943330715536e-01 5225 KSP Residual norm 3.943330715536e-01 5226 KSP Residual norm 3.943330715536e-01 5227 KSP Residual norm 3.943330715536e-01 5228 KSP Residual norm 3.943330715536e-01 5229 KSP Residual norm 3.943330715536e-01 5230 KSP Residual norm 3.943330715536e-01 5231 KSP Residual norm 3.943330715536e-01 5232 KSP Residual norm 3.943330715536e-01 5233 KSP Residual norm 3.943330715536e-01 5234 KSP Residual norm 3.943330715536e-01 5235 KSP Residual norm 3.943330715536e-01 5236 KSP Residual norm 3.943330715536e-01 5237 KSP Residual norm 3.943330715536e-01 5238 KSP Residual norm 3.943330715536e-01 5239 KSP Residual norm 3.943330715536e-01 5240 KSP Residual norm 3.943330715536e-01 5241 KSP Residual norm 3.943330715536e-01 5242 KSP Residual norm 3.943330715536e-01 5243 KSP Residual norm 3.943330715536e-01 5244 KSP Residual norm 3.943330715536e-01 5245 KSP Residual norm 3.943330715536e-01 5246 KSP Residual norm 3.943330715536e-01 5247 KSP Residual norm 3.943330715536e-01 5248 KSP Residual norm 3.943330715536e-01 5249 KSP Residual norm 3.943330715536e-01 5250 KSP Residual norm 3.943330715538e-01 5251 KSP Residual norm 3.943330715538e-01 5252 KSP Residual norm 3.943330715538e-01 5253 KSP Residual norm 3.943330715538e-01 5254 KSP Residual norm 3.943330715538e-01 5255 KSP Residual norm 3.943330715538e-01 5256 KSP Residual norm 3.943330715538e-01 5257 
KSP Residual norm 3.943330715538e-01 5258 KSP Residual norm 3.943330715538e-01 5259 KSP Residual norm 3.943330715538e-01 5260 KSP Residual norm 3.943330715538e-01 5261 KSP Residual norm 3.943330715538e-01 5262 KSP Residual norm 3.943330715538e-01 5263 KSP Residual norm 3.943330715538e-01 5264 KSP Residual norm 3.943330715538e-01 5265 KSP Residual norm 3.943330715538e-01 5266 KSP Residual norm 3.943330715538e-01 5267 KSP Residual norm 3.943330715538e-01 5268 KSP Residual norm 3.943330715538e-01 5269 KSP Residual norm 3.943330715538e-01 5270 KSP Residual norm 3.943330715538e-01 5271 KSP Residual norm 3.943330715538e-01 5272 KSP Residual norm 3.943330715538e-01 5273 KSP Residual norm 3.943330715538e-01 5274 KSP Residual norm 3.943330715538e-01 5275 KSP Residual norm 3.943330715538e-01 5276 KSP Residual norm 3.943330715538e-01 5277 KSP Residual norm 3.943330715538e-01 5278 KSP Residual norm 3.943330715538e-01 5279 KSP Residual norm 3.943330715538e-01 5280 KSP Residual norm 3.943330715538e-01 5281 KSP Residual norm 3.943330715538e-01 5282 KSP Residual norm 3.943330715538e-01 5283 KSP Residual norm 3.943330715538e-01 5284 KSP Residual norm 3.943330715538e-01 5285 KSP Residual norm 3.943330715538e-01 5286 KSP Residual norm 3.943330715538e-01 5287 KSP Residual norm 3.943330715538e-01 5288 KSP Residual norm 3.943330715538e-01 5289 KSP Residual norm 3.943330715538e-01 5290 KSP Residual norm 3.943330715538e-01 5291 KSP Residual norm 3.943330715538e-01 5292 KSP Residual norm 3.943330715538e-01 5293 KSP Residual norm 3.943330715538e-01 5294 KSP Residual norm 3.943330715538e-01 5295 KSP Residual norm 3.943330715538e-01 5296 KSP Residual norm 3.943330715538e-01 5297 KSP Residual norm 3.943330715538e-01 5298 KSP Residual norm 3.943330715538e-01 5299 KSP Residual norm 3.943330715538e-01 5300 KSP Residual norm 3.943330715538e-01 5301 KSP Residual norm 3.943330715538e-01 5302 KSP Residual norm 3.943330715538e-01 5303 KSP Residual norm 3.943330715538e-01 5304 KSP Residual norm 3.943330715538e-01 5305 KSP Residual norm 3.943330715538e-01 5306 KSP Residual norm 3.943330715538e-01 5307 KSP Residual norm 3.943330715538e-01 5308 KSP Residual norm 3.943330715538e-01 5309 KSP Residual norm 3.943330715538e-01 5310 KSP Residual norm 3.943330715538e-01 5311 KSP Residual norm 3.943330715538e-01 5312 KSP Residual norm 3.943330715538e-01 5313 KSP Residual norm 3.943330715538e-01 5314 KSP Residual norm 3.943330715538e-01 5315 KSP Residual norm 3.943330715538e-01 5316 KSP Residual norm 3.943330715538e-01 5317 KSP Residual norm 3.943330715538e-01 5318 KSP Residual norm 3.943330715538e-01 5319 KSP Residual norm 3.943330715538e-01 5320 KSP Residual norm 3.943330715538e-01 5321 KSP Residual norm 3.943330715538e-01 5322 KSP Residual norm 3.943330715538e-01 5323 KSP Residual norm 3.943330715538e-01 5324 KSP Residual norm 3.943330715538e-01 5325 KSP Residual norm 3.943330715538e-01 5326 KSP Residual norm 3.943330715538e-01 5327 KSP Residual norm 3.943330715538e-01 5328 KSP Residual norm 3.943330715538e-01 5329 KSP Residual norm 3.943330715538e-01 5330 KSP Residual norm 3.943330715538e-01 5331 KSP Residual norm 3.943330715538e-01 5332 KSP Residual norm 3.943330715538e-01 5333 KSP Residual norm 3.943330715538e-01 5334 KSP Residual norm 3.943330715538e-01 5335 KSP Residual norm 3.943330715538e-01 5336 KSP Residual norm 3.943330715538e-01 5337 KSP Residual norm 3.943330715538e-01 5338 KSP Residual norm 3.943330715538e-01 5339 KSP Residual norm 3.943330715538e-01 5340 KSP Residual norm 3.943330715539e-01 5341 KSP Residual norm 
3.943330715539e-01 5342 KSP Residual norm 3.943330715539e-01 5343 KSP Residual norm 3.943330715539e-01 5344 KSP Residual norm 3.943330715539e-01 5345 KSP Residual norm 3.943330715539e-01 5346 KSP Residual norm 3.943330715539e-01 5347 KSP Residual norm 3.943330715539e-01 5348 KSP Residual norm 3.943330715539e-01 5349 KSP Residual norm 3.943330715539e-01 5350 KSP Residual norm 3.943330715539e-01 5351 KSP Residual norm 3.943330715539e-01 5352 KSP Residual norm 3.943330715539e-01 5353 KSP Residual norm 3.943330715539e-01 5354 KSP Residual norm 3.943330715539e-01 5355 KSP Residual norm 3.943330715539e-01 5356 KSP Residual norm 3.943330715539e-01 5357 KSP Residual norm 3.943330715539e-01 5358 KSP Residual norm 3.943330715539e-01 5359 KSP Residual norm 3.943330715539e-01 5360 KSP Residual norm 3.943330715539e-01 5361 KSP Residual norm 3.943330715539e-01 5362 KSP Residual norm 3.943330715539e-01 5363 KSP Residual norm 3.943330715539e-01 5364 KSP Residual norm 3.943330715539e-01 5365 KSP Residual norm 3.943330715539e-01 5366 KSP Residual norm 3.943330715539e-01 5367 KSP Residual norm 3.943330715539e-01 5368 KSP Residual norm 3.943330715539e-01 5369 KSP Residual norm 3.943330715539e-01 5370 KSP Residual norm 3.943330715539e-01 5371 KSP Residual norm 3.943330715539e-01 5372 KSP Residual norm 3.943330715539e-01 5373 KSP Residual norm 3.943330715539e-01 5374 KSP Residual norm 3.943330715539e-01 5375 KSP Residual norm 3.943330715539e-01 5376 KSP Residual norm 3.943330715539e-01 5377 KSP Residual norm 3.943330715539e-01 5378 KSP Residual norm 3.943330715539e-01 5379 KSP Residual norm 3.943330715539e-01 5380 KSP Residual norm 3.943330715539e-01 5381 KSP Residual norm 3.943330715539e-01 5382 KSP Residual norm 3.943330715539e-01 5383 KSP Residual norm 3.943330715539e-01 5384 KSP Residual norm 3.943330715539e-01 5385 KSP Residual norm 3.943330715539e-01 5386 KSP Residual norm 3.943330715539e-01 5387 KSP Residual norm 3.943330715539e-01 5388 KSP Residual norm 3.943330715539e-01 5389 KSP Residual norm 3.943330715539e-01 5390 KSP Residual norm 3.943330715539e-01 5391 KSP Residual norm 3.943330715539e-01 5392 KSP Residual norm 3.943330715539e-01 5393 KSP Residual norm 3.943330715539e-01 5394 KSP Residual norm 3.943330715539e-01 5395 KSP Residual norm 3.943330715539e-01 5396 KSP Residual norm 3.943330715539e-01 5397 KSP Residual norm 3.943330715539e-01 5398 KSP Residual norm 3.943330715539e-01 5399 KSP Residual norm 3.943330715539e-01 5400 KSP Residual norm 3.943330715539e-01 5401 KSP Residual norm 3.943330715539e-01 5402 KSP Residual norm 3.943330715539e-01 5403 KSP Residual norm 3.943330715539e-01 5404 KSP Residual norm 3.943330715539e-01 5405 KSP Residual norm 3.943330715539e-01 5406 KSP Residual norm 3.943330715539e-01 5407 KSP Residual norm 3.943330715539e-01 5408 KSP Residual norm 3.943330715539e-01 5409 KSP Residual norm 3.943330715539e-01 5410 KSP Residual norm 3.943330715539e-01 5411 KSP Residual norm 3.943330715539e-01 5412 KSP Residual norm 3.943330715539e-01 5413 KSP Residual norm 3.943330715539e-01 5414 KSP Residual norm 3.943330715539e-01 5415 KSP Residual norm 3.943330715539e-01 5416 KSP Residual norm 3.943330715539e-01 5417 KSP Residual norm 3.943330715539e-01 5418 KSP Residual norm 3.943330715539e-01 5419 KSP Residual norm 3.943330715539e-01 5420 KSP Residual norm 3.943330715539e-01 5421 KSP Residual norm 3.943330715539e-01 5422 KSP Residual norm 3.943330715539e-01 5423 KSP Residual norm 3.943330715539e-01 5424 KSP Residual norm 3.943330715539e-01 5425 KSP Residual norm 3.943330715539e-01 5426 
KSP Residual norm 3.943330715539e-01 5427 KSP Residual norm 3.943330715539e-01 5428 KSP Residual norm 3.943330715539e-01 5429 KSP Residual norm 3.943330715539e-01 5430 KSP Residual norm 3.943330715538e-01 5431 KSP Residual norm 3.943330715538e-01 5432 KSP Residual norm 3.943330715538e-01 5433 KSP Residual norm 3.943330715538e-01 5434 KSP Residual norm 3.943330715538e-01 5435 KSP Residual norm 3.943330715538e-01 5436 KSP Residual norm 3.943330715538e-01 5437 KSP Residual norm 3.943330715538e-01 5438 KSP Residual norm 3.943330715538e-01 5439 KSP Residual norm 3.943330715538e-01 5440 KSP Residual norm 3.943330715538e-01 5441 KSP Residual norm 3.943330715538e-01 5442 KSP Residual norm 3.943330715538e-01 5443 KSP Residual norm 3.943330715538e-01 5444 KSP Residual norm 3.943330715538e-01 5445 KSP Residual norm 3.943330715538e-01 5446 KSP Residual norm 3.943330715538e-01 5447 KSP Residual norm 3.943330715538e-01 5448 KSP Residual norm 3.943330715538e-01 5449 KSP Residual norm 3.943330715538e-01 5450 KSP Residual norm 3.943330715538e-01 5451 KSP Residual norm 3.943330715538e-01 5452 KSP Residual norm 3.943330715538e-01 5453 KSP Residual norm 3.943330715538e-01 5454 KSP Residual norm 3.943330715538e-01 5455 KSP Residual norm 3.943330715538e-01 5456 KSP Residual norm 3.943330715538e-01 5457 KSP Residual norm 3.943330715538e-01 5458 KSP Residual norm 3.943330715538e-01 5459 KSP Residual norm 3.943330715538e-01 5460 KSP Residual norm 3.943330715539e-01 5461 KSP Residual norm 3.943330715539e-01 5462 KSP Residual norm 3.943330715539e-01 5463 KSP Residual norm 3.943330715539e-01 5464 KSP Residual norm 3.943330715539e-01 5465 KSP Residual norm 3.943330715539e-01 5466 KSP Residual norm 3.943330715539e-01 5467 KSP Residual norm 3.943330715539e-01 5468 KSP Residual norm 3.943330715539e-01 5469 KSP Residual norm 3.943330715539e-01 5470 KSP Residual norm 3.943330715539e-01 5471 KSP Residual norm 3.943330715539e-01 5472 KSP Residual norm 3.943330715539e-01 5473 KSP Residual norm 3.943330715539e-01 5474 KSP Residual norm 3.943330715539e-01 5475 KSP Residual norm 3.943330715539e-01 5476 KSP Residual norm 3.943330715539e-01 5477 KSP Residual norm 3.943330715539e-01 5478 KSP Residual norm 3.943330715539e-01 5479 KSP Residual norm 3.943330715539e-01 5480 KSP Residual norm 3.943330715539e-01 5481 KSP Residual norm 3.943330715539e-01 5482 KSP Residual norm 3.943330715539e-01 5483 KSP Residual norm 3.943330715539e-01 5484 KSP Residual norm 3.943330715539e-01 5485 KSP Residual norm 3.943330715539e-01 5486 KSP Residual norm 3.943330715539e-01 5487 KSP Residual norm 3.943330715539e-01 5488 KSP Residual norm 3.943330715539e-01 5489 KSP Residual norm 3.943330715539e-01 5490 KSP Residual norm 3.943330715534e-01 5491 KSP Residual norm 3.943330715534e-01 5492 KSP Residual norm 3.943330715534e-01 5493 KSP Residual norm 3.943330715534e-01 5494 KSP Residual norm 3.943330715534e-01 5495 KSP Residual norm 3.943330715534e-01 5496 KSP Residual norm 3.943330715534e-01 5497 KSP Residual norm 3.943330715534e-01 5498 KSP Residual norm 3.943330715534e-01 5499 KSP Residual norm 3.943330715534e-01 5500 KSP Residual norm 3.943330715534e-01 5501 KSP Residual norm 3.943330715534e-01 5502 KSP Residual norm 3.943330715534e-01 5503 KSP Residual norm 3.943330715534e-01 5504 KSP Residual norm 3.943330715534e-01 5505 KSP Residual norm 3.943330715534e-01 5506 KSP Residual norm 3.943330715534e-01 5507 KSP Residual norm 3.943330715534e-01 5508 KSP Residual norm 3.943330715534e-01 5509 KSP Residual norm 3.943330715534e-01 5510 KSP Residual norm 
3.943330715534e-01 5511 KSP Residual norm 3.943330715534e-01 5512 KSP Residual norm 3.943330715534e-01 5513 KSP Residual norm 3.943330715534e-01 5514 KSP Residual norm 3.943330715534e-01 5515 KSP Residual norm 3.943330715534e-01 5516 KSP Residual norm 3.943330715534e-01 5517 KSP Residual norm 3.943330715534e-01 5518 KSP Residual norm 3.943330715534e-01 5519 KSP Residual norm 3.943330715534e-01 5520 KSP Residual norm 3.943330715536e-01 5521 KSP Residual norm 3.943330715536e-01 5522 KSP Residual norm 3.943330715536e-01 5523 KSP Residual norm 3.943330715536e-01 5524 KSP Residual norm 3.943330715536e-01 5525 KSP Residual norm 3.943330715536e-01 5526 KSP Residual norm 3.943330715536e-01 5527 KSP Residual norm 3.943330715536e-01 5528 KSP Residual norm 3.943330715536e-01 5529 KSP Residual norm 3.943330715536e-01 5530 KSP Residual norm 3.943330715536e-01 5531 KSP Residual norm 3.943330715536e-01 5532 KSP Residual norm 3.943330715536e-01 5533 KSP Residual norm 3.943330715536e-01 5534 KSP Residual norm 3.943330715536e-01 5535 KSP Residual norm 3.943330715536e-01 5536 KSP Residual norm 3.943330715536e-01 5537 KSP Residual norm 3.943330715536e-01 5538 KSP Residual norm 3.943330715536e-01 5539 KSP Residual norm 3.943330715536e-01 5540 KSP Residual norm 3.943330715536e-01 5541 KSP Residual norm 3.943330715536e-01 5542 KSP Residual norm 3.943330715536e-01 5543 KSP Residual norm 3.943330715536e-01 5544 KSP Residual norm 3.943330715536e-01 5545 KSP Residual norm 3.943330715536e-01 5546 KSP Residual norm 3.943330715536e-01 5547 KSP Residual norm 3.943330715536e-01 5548 KSP Residual norm 3.943330715536e-01 5549 KSP Residual norm 3.943330715536e-01 5550 KSP Residual norm 3.943330715534e-01 5551 KSP Residual norm 3.943330715534e-01 5552 KSP Residual norm 3.943330715534e-01 5553 KSP Residual norm 3.943330715534e-01 5554 KSP Residual norm 3.943330715534e-01 5555 KSP Residual norm 3.943330715534e-01 5556 KSP Residual norm 3.943330715534e-01 5557 KSP Residual norm 3.943330715534e-01 5558 KSP Residual norm 3.943330715534e-01 5559 KSP Residual norm 3.943330715534e-01 5560 KSP Residual norm 3.943330715534e-01 5561 KSP Residual norm 3.943330715534e-01 5562 KSP Residual norm 3.943330715534e-01 5563 KSP Residual norm 3.943330715534e-01 5564 KSP Residual norm 3.943330715534e-01 5565 KSP Residual norm 3.943330715534e-01 5566 KSP Residual norm 3.943330715534e-01 5567 KSP Residual norm 3.943330715534e-01 5568 KSP Residual norm 3.943330715534e-01 5569 KSP Residual norm 3.943330715534e-01 5570 KSP Residual norm 3.943330715534e-01 5571 KSP Residual norm 3.943330715534e-01 5572 KSP Residual norm 3.943330715534e-01 5573 KSP Residual norm 3.943330715534e-01 5574 KSP Residual norm 3.943330715534e-01 5575 KSP Residual norm 3.943330715534e-01 5576 KSP Residual norm 3.943330715534e-01 5577 KSP Residual norm 3.943330715534e-01 5578 KSP Residual norm 3.943330715534e-01 5579 KSP Residual norm 3.943330715534e-01 5580 KSP Residual norm 3.943330715534e-01 5581 KSP Residual norm 3.943330715534e-01 5582 KSP Residual norm 3.943330715534e-01 5583 KSP Residual norm 3.943330715534e-01 5584 KSP Residual norm 3.943330715534e-01 5585 KSP Residual norm 3.943330715534e-01 5586 KSP Residual norm 3.943330715534e-01 5587 KSP Residual norm 3.943330715534e-01 5588 KSP Residual norm 3.943330715534e-01 5589 KSP Residual norm 3.943330715534e-01 5590 KSP Residual norm 3.943330715534e-01 5591 KSP Residual norm 3.943330715534e-01 5592 KSP Residual norm 3.943330715534e-01 5593 KSP Residual norm 3.943330715534e-01 5594 KSP Residual norm 3.943330715534e-01 5595 
KSP Residual norm 3.943330715534e-01 5596 KSP Residual norm 3.943330715534e-01 5597 KSP Residual norm 3.943330715534e-01 5598 KSP Residual norm 3.943330715534e-01 5599 KSP Residual norm 3.943330715534e-01 5600 KSP Residual norm 3.943330715534e-01 5601 KSP Residual norm 3.943330715534e-01 5602 KSP Residual norm 3.943330715534e-01 5603 KSP Residual norm 3.943330715534e-01 5604 KSP Residual norm 3.943330715534e-01 5605 KSP Residual norm 3.943330715534e-01 5606 KSP Residual norm 3.943330715534e-01 5607 KSP Residual norm 3.943330715534e-01 5608 KSP Residual norm 3.943330715534e-01 5609 KSP Residual norm 3.943330715534e-01 5610 KSP Residual norm 3.943330715536e-01 5611 KSP Residual norm 3.943330715536e-01 5612 KSP Residual norm 3.943330715536e-01 5613 KSP Residual norm 3.943330715536e-01 5614 KSP Residual norm 3.943330715536e-01 5615 KSP Residual norm 3.943330715536e-01 5616 KSP Residual norm 3.943330715536e-01 5617 KSP Residual norm 3.943330715536e-01 5618 KSP Residual norm 3.943330715536e-01 5619 KSP Residual norm 3.943330715536e-01 5620 KSP Residual norm 3.943330715536e-01 5621 KSP Residual norm 3.943330715536e-01 5622 KSP Residual norm 3.943330715536e-01 5623 KSP Residual norm 3.943330715536e-01 5624 KSP Residual norm 3.943330715536e-01 5625 KSP Residual norm 3.943330715536e-01 5626 KSP Residual norm 3.943330715536e-01 5627 KSP Residual norm 3.943330715536e-01 5628 KSP Residual norm 3.943330715536e-01 5629 KSP Residual norm 3.943330715536e-01 5630 KSP Residual norm 3.943330715536e-01 5631 KSP Residual norm 3.943330715536e-01 5632 KSP Residual norm 3.943330715536e-01 5633 KSP Residual norm 3.943330715536e-01 5634 KSP Residual norm 3.943330715536e-01 5635 KSP Residual norm 3.943330715536e-01 5636 KSP Residual norm 3.943330715536e-01 5637 KSP Residual norm 3.943330715536e-01 5638 KSP Residual norm 3.943330715536e-01 5639 KSP Residual norm 3.943330715536e-01 5640 KSP Residual norm 3.943330715535e-01 5641 KSP Residual norm 3.943330715535e-01 5642 KSP Residual norm 3.943330715535e-01 5643 KSP Residual norm 3.943330715535e-01 5644 KSP Residual norm 3.943330715535e-01 5645 KSP Residual norm 3.943330715535e-01 5646 KSP Residual norm 3.943330715535e-01 5647 KSP Residual norm 3.943330715535e-01 5648 KSP Residual norm 3.943330715535e-01 5649 KSP Residual norm 3.943330715535e-01 5650 KSP Residual norm 3.943330715535e-01 5651 KSP Residual norm 3.943330715535e-01 5652 KSP Residual norm 3.943330715535e-01 5653 KSP Residual norm 3.943330715535e-01 5654 KSP Residual norm 3.943330715535e-01 5655 KSP Residual norm 3.943330715535e-01 5656 KSP Residual norm 3.943330715535e-01 5657 KSP Residual norm 3.943330715535e-01 5658 KSP Residual norm 3.943330715535e-01 5659 KSP Residual norm 3.943330715535e-01 5660 KSP Residual norm 3.943330715535e-01 5661 KSP Residual norm 3.943330715535e-01 5662 KSP Residual norm 3.943330715535e-01 5663 KSP Residual norm 3.943330715535e-01 5664 KSP Residual norm 3.943330715535e-01 5665 KSP Residual norm 3.943330715535e-01 5666 KSP Residual norm 3.943330715535e-01 5667 KSP Residual norm 3.943330715535e-01 5668 KSP Residual norm 3.943330715535e-01 5669 KSP Residual norm 3.943330715535e-01 5670 KSP Residual norm 3.943330715536e-01 5671 KSP Residual norm 3.943330715536e-01 5672 KSP Residual norm 3.943330715536e-01 5673 KSP Residual norm 3.943330715536e-01 5674 KSP Residual norm 3.943330715536e-01 5675 KSP Residual norm 3.943330715536e-01 5676 KSP Residual norm 3.943330715536e-01 5677 KSP Residual norm 3.943330715536e-01 5678 KSP Residual norm 3.943330715536e-01 5679 KSP Residual norm 
3.943330715536e-01 5680 KSP Residual norm 3.943330715536e-01 5681 KSP Residual norm 3.943330715536e-01 5682 KSP Residual norm 3.943330715536e-01 5683 KSP Residual norm 3.943330715536e-01 5684 KSP Residual norm 3.943330715536e-01 5685 KSP Residual norm 3.943330715536e-01 5686 KSP Residual norm 3.943330715536e-01 5687 KSP Residual norm 3.943330715536e-01 5688 KSP Residual norm 3.943330715536e-01 5689 KSP Residual norm 3.943330715536e-01 5690 KSP Residual norm 3.943330715536e-01 5691 KSP Residual norm 3.943330715536e-01 5692 KSP Residual norm 3.943330715536e-01 5693 KSP Residual norm 3.943330715536e-01 5694 KSP Residual norm 3.943330715536e-01 5695 KSP Residual norm 3.943330715536e-01 5696 KSP Residual norm 3.943330715536e-01 5697 KSP Residual norm 3.943330715536e-01 5698 KSP Residual norm 3.943330715536e-01 5699 KSP Residual norm 3.943330715536e-01 5700 KSP Residual norm 3.943330715535e-01 5701 KSP Residual norm 3.943330715535e-01 5702 KSP Residual norm 3.943330715535e-01 5703 KSP Residual norm 3.943330715535e-01 5704 KSP Residual norm 3.943330715535e-01 5705 KSP Residual norm 3.943330715535e-01 5706 KSP Residual norm 3.943330715535e-01 5707 KSP Residual norm 3.943330715535e-01 5708 KSP Residual norm 3.943330715535e-01 5709 KSP Residual norm 3.943330715535e-01 5710 KSP Residual norm 3.943330715535e-01 5711 KSP Residual norm 3.943330715535e-01 5712 KSP Residual norm 3.943330715535e-01 5713 KSP Residual norm 3.943330715535e-01 5714 KSP Residual norm 3.943330715535e-01 5715 KSP Residual norm 3.943330715535e-01 5716 KSP Residual norm 3.943330715535e-01 5717 KSP Residual norm 3.943330715535e-01 5718 KSP Residual norm 3.943330715535e-01 5719 KSP Residual norm 3.943330715535e-01 5720 KSP Residual norm 3.943330715535e-01 5721 KSP Residual norm 3.943330715535e-01 5722 KSP Residual norm 3.943330715535e-01 5723 KSP Residual norm 3.943330715535e-01 5724 KSP Residual norm 3.943330715535e-01 5725 KSP Residual norm 3.943330715535e-01 5726 KSP Residual norm 3.943330715535e-01 5727 KSP Residual norm 3.943330715535e-01 5728 KSP Residual norm 3.943330715535e-01 5729 KSP Residual norm 3.943330715535e-01 5730 KSP Residual norm 3.943330715534e-01 5731 KSP Residual norm 3.943330715534e-01 5732 KSP Residual norm 3.943330715534e-01 5733 KSP Residual norm 3.943330715534e-01 5734 KSP Residual norm 3.943330715534e-01 5735 KSP Residual norm 3.943330715534e-01 5736 KSP Residual norm 3.943330715534e-01 5737 KSP Residual norm 3.943330715534e-01 5738 KSP Residual norm 3.943330715534e-01 5739 KSP Residual norm 3.943330715534e-01 5740 KSP Residual norm 3.943330715534e-01 5741 KSP Residual norm 3.943330715534e-01 5742 KSP Residual norm 3.943330715534e-01 5743 KSP Residual norm 3.943330715534e-01 5744 KSP Residual norm 3.943330715534e-01 5745 KSP Residual norm 3.943330715534e-01 5746 KSP Residual norm 3.943330715534e-01 5747 KSP Residual norm 3.943330715534e-01 5748 KSP Residual norm 3.943330715534e-01 5749 KSP Residual norm 3.943330715534e-01 5750 KSP Residual norm 3.943330715534e-01 5751 KSP Residual norm 3.943330715534e-01 5752 KSP Residual norm 3.943330715534e-01 5753 KSP Residual norm 3.943330715534e-01 5754 KSP Residual norm 3.943330715534e-01 5755 KSP Residual norm 3.943330715534e-01 5756 KSP Residual norm 3.943330715534e-01 5757 KSP Residual norm 3.943330715534e-01 5758 KSP Residual norm 3.943330715534e-01 5759 KSP Residual norm 3.943330715534e-01 5760 KSP Residual norm 3.943330715535e-01 5761 KSP Residual norm 3.943330715535e-01 5762 KSP Residual norm 3.943330715535e-01 5763 KSP Residual norm 3.943330715535e-01 5764 
KSP Residual norm 3.943330715535e-01 5765 KSP Residual norm 3.943330715535e-01 5766 KSP Residual norm 3.943330715535e-01 5767 KSP Residual norm 3.943330715535e-01 5768 KSP Residual norm 3.943330715535e-01 5769 KSP Residual norm 3.943330715535e-01 5770 KSP Residual norm 3.943330715535e-01 5771 KSP Residual norm 3.943330715535e-01 5772 KSP Residual norm 3.943330715535e-01 5773 KSP Residual norm 3.943330715535e-01 5774 KSP Residual norm 3.943330715535e-01 5775 KSP Residual norm 3.943330715535e-01 5776 KSP Residual norm 3.943330715535e-01 5777 KSP Residual norm 3.943330715535e-01 5778 KSP Residual norm 3.943330715535e-01 5779 KSP Residual norm 3.943330715535e-01 5780 KSP Residual norm 3.943330715535e-01 5781 KSP Residual norm 3.943330715535e-01 5782 KSP Residual norm 3.943330715535e-01 5783 KSP Residual norm 3.943330715535e-01 5784 KSP Residual norm 3.943330715535e-01 5785 KSP Residual norm 3.943330715535e-01 5786 KSP Residual norm 3.943330715535e-01 5787 KSP Residual norm 3.943330715535e-01 5788 KSP Residual norm 3.943330715535e-01 5789 KSP Residual norm 3.943330715535e-01 5790 KSP Residual norm 3.943330715531e-01 5791 KSP Residual norm 3.943330715531e-01 5792 KSP Residual norm 3.943330715531e-01 5793 KSP Residual norm 3.943330715531e-01 5794 KSP Residual norm 3.943330715531e-01 5795 KSP Residual norm 3.943330715531e-01 5796 KSP Residual norm 3.943330715531e-01 5797 KSP Residual norm 3.943330715531e-01 5798 KSP Residual norm 3.943330715531e-01 5799 KSP Residual norm 3.943330715531e-01 5800 KSP Residual norm 3.943330715531e-01 5801 KSP Residual norm 3.943330715531e-01 5802 KSP Residual norm 3.943330715531e-01 5803 KSP Residual norm 3.943330715531e-01 5804 KSP Residual norm 3.943330715531e-01 5805 KSP Residual norm 3.943330715531e-01 5806 KSP Residual norm 3.943330715531e-01 5807 KSP Residual norm 3.943330715531e-01 5808 KSP Residual norm 3.943330715531e-01 5809 KSP Residual norm 3.943330715531e-01 5810 KSP Residual norm 3.943330715531e-01 5811 KSP Residual norm 3.943330715531e-01 5812 KSP Residual norm 3.943330715531e-01 5813 KSP Residual norm 3.943330715531e-01 5814 KSP Residual norm 3.943330715531e-01 5815 KSP Residual norm 3.943330715531e-01 5816 KSP Residual norm 3.943330715531e-01 5817 KSP Residual norm 3.943330715531e-01 5818 KSP Residual norm 3.943330715531e-01 5819 KSP Residual norm 3.943330715531e-01 5820 KSP Residual norm 3.943330715531e-01 5821 KSP Residual norm 3.943330715531e-01 5822 KSP Residual norm 3.943330715531e-01 5823 KSP Residual norm 3.943330715531e-01 5824 KSP Residual norm 3.943330715531e-01 5825 KSP Residual norm 3.943330715531e-01 5826 KSP Residual norm 3.943330715531e-01 5827 KSP Residual norm 3.943330715531e-01 5828 KSP Residual norm 3.943330715531e-01 5829 KSP Residual norm 3.943330715531e-01 5830 KSP Residual norm 3.943330715531e-01 5831 KSP Residual norm 3.943330715531e-01 5832 KSP Residual norm 3.943330715531e-01 5833 KSP Residual norm 3.943330715531e-01 5834 KSP Residual norm 3.943330715531e-01 5835 KSP Residual norm 3.943330715531e-01 5836 KSP Residual norm 3.943330715531e-01 5837 KSP Residual norm 3.943330715531e-01 5838 KSP Residual norm 3.943330715531e-01 5839 KSP Residual norm 3.943330715531e-01 5840 KSP Residual norm 3.943330715531e-01 5841 KSP Residual norm 3.943330715531e-01 5842 KSP Residual norm 3.943330715531e-01 5843 KSP Residual norm 3.943330715531e-01 5844 KSP Residual norm 3.943330715531e-01 5845 KSP Residual norm 3.943330715531e-01 5846 KSP Residual norm 3.943330715531e-01 5847 KSP Residual norm 3.943330715531e-01 5848 KSP Residual norm 
3.943330715531e-01 5849 KSP Residual norm 3.943330715533e-01
[... KSP monitor output for iterations 5850 through 7622 condensed: the residual norm holds at 3.9433307155e-01 throughout, varying only in the last two digits, i.e. the linear solve makes no progress over this entire span ...]
7623
KSP Residual norm 3.943330715531e-01 7624 KSP Residual norm 3.943330715531e-01 7625 KSP Residual norm 3.943330715531e-01 7626 KSP Residual norm 3.943330715531e-01 7627 KSP Residual norm 3.943330715531e-01 7628 KSP Residual norm 3.943330715531e-01 7629 KSP Residual norm 3.943330715531e-01 7630 KSP Residual norm 3.943330715531e-01 7631 KSP Residual norm 3.943330715531e-01 7632 KSP Residual norm 3.943330715531e-01 7633 KSP Residual norm 3.943330715531e-01 7634 KSP Residual norm 3.943330715531e-01 7635 KSP Residual norm 3.943330715531e-01 7636 KSP Residual norm 3.943330715531e-01 7637 KSP Residual norm 3.943330715531e-01 7638 KSP Residual norm 3.943330715531e-01 7639 KSP Residual norm 3.943330715531e-01 7640 KSP Residual norm 3.943330715531e-01 7641 KSP Residual norm 3.943330715531e-01 7642 KSP Residual norm 3.943330715531e-01 7643 KSP Residual norm 3.943330715531e-01 7644 KSP Residual norm 3.943330715531e-01 7645 KSP Residual norm 3.943330715531e-01 7646 KSP Residual norm 3.943330715531e-01 7647 KSP Residual norm 3.943330715531e-01 7648 KSP Residual norm 3.943330715531e-01 7649 KSP Residual norm 3.943330715531e-01 7650 KSP Residual norm 3.943330715531e-01 7651 KSP Residual norm 3.943330715531e-01 7652 KSP Residual norm 3.943330715531e-01 7653 KSP Residual norm 3.943330715531e-01 7654 KSP Residual norm 3.943330715531e-01 7655 KSP Residual norm 3.943330715531e-01 7656 KSP Residual norm 3.943330715531e-01 7657 KSP Residual norm 3.943330715531e-01 7658 KSP Residual norm 3.943330715531e-01 7659 KSP Residual norm 3.943330715531e-01 7660 KSP Residual norm 3.943330715531e-01 7661 KSP Residual norm 3.943330715531e-01 7662 KSP Residual norm 3.943330715531e-01 7663 KSP Residual norm 3.943330715531e-01 7664 KSP Residual norm 3.943330715531e-01 7665 KSP Residual norm 3.943330715531e-01 7666 KSP Residual norm 3.943330715531e-01 7667 KSP Residual norm 3.943330715531e-01 7668 KSP Residual norm 3.943330715531e-01 7669 KSP Residual norm 3.943330715531e-01 7670 KSP Residual norm 3.943330715531e-01 7671 KSP Residual norm 3.943330715531e-01 7672 KSP Residual norm 3.943330715531e-01 7673 KSP Residual norm 3.943330715531e-01 7674 KSP Residual norm 3.943330715531e-01 7675 KSP Residual norm 3.943330715531e-01 7676 KSP Residual norm 3.943330715531e-01 7677 KSP Residual norm 3.943330715531e-01 7678 KSP Residual norm 3.943330715531e-01 7679 KSP Residual norm 3.943330715531e-01 7680 KSP Residual norm 3.943330715531e-01 7681 KSP Residual norm 3.943330715531e-01 7682 KSP Residual norm 3.943330715531e-01 7683 KSP Residual norm 3.943330715531e-01 7684 KSP Residual norm 3.943330715531e-01 7685 KSP Residual norm 3.943330715531e-01 7686 KSP Residual norm 3.943330715531e-01 7687 KSP Residual norm 3.943330715531e-01 7688 KSP Residual norm 3.943330715531e-01 7689 KSP Residual norm 3.943330715531e-01 7690 KSP Residual norm 3.943330715531e-01 7691 KSP Residual norm 3.943330715531e-01 7692 KSP Residual norm 3.943330715531e-01 7693 KSP Residual norm 3.943330715531e-01 7694 KSP Residual norm 3.943330715531e-01 7695 KSP Residual norm 3.943330715531e-01 7696 KSP Residual norm 3.943330715531e-01 7697 KSP Residual norm 3.943330715531e-01 7698 KSP Residual norm 3.943330715531e-01 7699 KSP Residual norm 3.943330715531e-01 7700 KSP Residual norm 3.943330715531e-01 7701 KSP Residual norm 3.943330715531e-01 7702 KSP Residual norm 3.943330715531e-01 7703 KSP Residual norm 3.943330715531e-01 7704 KSP Residual norm 3.943330715531e-01 7705 KSP Residual norm 3.943330715531e-01 7706 KSP Residual norm 3.943330715531e-01 7707 KSP Residual norm 
3.943330715531e-01 7708 KSP Residual norm 3.943330715531e-01 7709 KSP Residual norm 3.943330715531e-01 7710 KSP Residual norm 3.943330715529e-01 7711 KSP Residual norm 3.943330715529e-01 7712 KSP Residual norm 3.943330715529e-01 7713 KSP Residual norm 3.943330715529e-01 7714 KSP Residual norm 3.943330715529e-01 7715 KSP Residual norm 3.943330715529e-01 7716 KSP Residual norm 3.943330715529e-01 7717 KSP Residual norm 3.943330715529e-01 7718 KSP Residual norm 3.943330715529e-01 7719 KSP Residual norm 3.943330715529e-01 7720 KSP Residual norm 3.943330715529e-01 7721 KSP Residual norm 3.943330715529e-01 7722 KSP Residual norm 3.943330715529e-01 7723 KSP Residual norm 3.943330715529e-01 7724 KSP Residual norm 3.943330715529e-01 7725 KSP Residual norm 3.943330715529e-01 7726 KSP Residual norm 3.943330715529e-01 7727 KSP Residual norm 3.943330715529e-01 7728 KSP Residual norm 3.943330715529e-01 7729 KSP Residual norm 3.943330715529e-01 7730 KSP Residual norm 3.943330715529e-01 7731 KSP Residual norm 3.943330715529e-01 7732 KSP Residual norm 3.943330715529e-01 7733 KSP Residual norm 3.943330715529e-01 7734 KSP Residual norm 3.943330715529e-01 7735 KSP Residual norm 3.943330715529e-01 7736 KSP Residual norm 3.943330715529e-01 7737 KSP Residual norm 3.943330715529e-01 7738 KSP Residual norm 3.943330715529e-01 7739 KSP Residual norm 3.943330715529e-01 7740 KSP Residual norm 3.943330715529e-01 7741 KSP Residual norm 3.943330715529e-01 7742 KSP Residual norm 3.943330715529e-01 7743 KSP Residual norm 3.943330715529e-01 7744 KSP Residual norm 3.943330715529e-01 7745 KSP Residual norm 3.943330715529e-01 7746 KSP Residual norm 3.943330715529e-01 7747 KSP Residual norm 3.943330715529e-01 7748 KSP Residual norm 3.943330715529e-01 7749 KSP Residual norm 3.943330715529e-01 7750 KSP Residual norm 3.943330715529e-01 7751 KSP Residual norm 3.943330715529e-01 7752 KSP Residual norm 3.943330715529e-01 7753 KSP Residual norm 3.943330715529e-01 7754 KSP Residual norm 3.943330715529e-01 7755 KSP Residual norm 3.943330715529e-01 7756 KSP Residual norm 3.943330715529e-01 7757 KSP Residual norm 3.943330715529e-01 7758 KSP Residual norm 3.943330715529e-01 7759 KSP Residual norm 3.943330715529e-01 7760 KSP Residual norm 3.943330715529e-01 7761 KSP Residual norm 3.943330715529e-01 7762 KSP Residual norm 3.943330715529e-01 7763 KSP Residual norm 3.943330715529e-01 7764 KSP Residual norm 3.943330715529e-01 7765 KSP Residual norm 3.943330715529e-01 7766 KSP Residual norm 3.943330715529e-01 7767 KSP Residual norm 3.943330715529e-01 7768 KSP Residual norm 3.943330715529e-01 7769 KSP Residual norm 3.943330715529e-01 7770 KSP Residual norm 3.943330715528e-01 7771 KSP Residual norm 3.943330715528e-01 7772 KSP Residual norm 3.943330715528e-01 7773 KSP Residual norm 3.943330715528e-01 7774 KSP Residual norm 3.943330715528e-01 7775 KSP Residual norm 3.943330715528e-01 7776 KSP Residual norm 3.943330715528e-01 7777 KSP Residual norm 3.943330715528e-01 7778 KSP Residual norm 3.943330715528e-01 7779 KSP Residual norm 3.943330715528e-01 7780 KSP Residual norm 3.943330715528e-01 7781 KSP Residual norm 3.943330715528e-01 7782 KSP Residual norm 3.943330715528e-01 7783 KSP Residual norm 3.943330715528e-01 7784 KSP Residual norm 3.943330715528e-01 7785 KSP Residual norm 3.943330715528e-01 7786 KSP Residual norm 3.943330715528e-01 7787 KSP Residual norm 3.943330715528e-01 7788 KSP Residual norm 3.943330715528e-01 7789 KSP Residual norm 3.943330715528e-01 7790 KSP Residual norm 3.943330715528e-01 7791 KSP Residual norm 3.943330715528e-01 7792 
KSP Residual norm 3.943330715528e-01 7793 KSP Residual norm 3.943330715528e-01 7794 KSP Residual norm 3.943330715528e-01 7795 KSP Residual norm 3.943330715528e-01 7796 KSP Residual norm 3.943330715528e-01 7797 KSP Residual norm 3.943330715528e-01 7798 KSP Residual norm 3.943330715528e-01 7799 KSP Residual norm 3.943330715528e-01 7800 KSP Residual norm 3.943330715530e-01 7801 KSP Residual norm 3.943330715530e-01 7802 KSP Residual norm 3.943330715530e-01 7803 KSP Residual norm 3.943330715530e-01 7804 KSP Residual norm 3.943330715530e-01 7805 KSP Residual norm 3.943330715530e-01 7806 KSP Residual norm 3.943330715530e-01 7807 KSP Residual norm 3.943330715530e-01 7808 KSP Residual norm 3.943330715530e-01 7809 KSP Residual norm 3.943330715530e-01 7810 KSP Residual norm 3.943330715530e-01 7811 KSP Residual norm 3.943330715530e-01 7812 KSP Residual norm 3.943330715530e-01 7813 KSP Residual norm 3.943330715530e-01 7814 KSP Residual norm 3.943330715530e-01 7815 KSP Residual norm 3.943330715530e-01 7816 KSP Residual norm 3.943330715530e-01 7817 KSP Residual norm 3.943330715530e-01 7818 KSP Residual norm 3.943330715530e-01 7819 KSP Residual norm 3.943330715530e-01 7820 KSP Residual norm 3.943330715530e-01 7821 KSP Residual norm 3.943330715530e-01 7822 KSP Residual norm 3.943330715530e-01 7823 KSP Residual norm 3.943330715530e-01 7824 KSP Residual norm 3.943330715530e-01 7825 KSP Residual norm 3.943330715530e-01 7826 KSP Residual norm 3.943330715530e-01 7827 KSP Residual norm 3.943330715530e-01 7828 KSP Residual norm 3.943330715530e-01 7829 KSP Residual norm 3.943330715530e-01 7830 KSP Residual norm 3.943330715530e-01 7831 KSP Residual norm 3.943330715530e-01 7832 KSP Residual norm 3.943330715530e-01 7833 KSP Residual norm 3.943330715530e-01 7834 KSP Residual norm 3.943330715530e-01 7835 KSP Residual norm 3.943330715530e-01 7836 KSP Residual norm 3.943330715530e-01 7837 KSP Residual norm 3.943330715530e-01 7838 KSP Residual norm 3.943330715530e-01 7839 KSP Residual norm 3.943330715530e-01 7840 KSP Residual norm 3.943330715530e-01 7841 KSP Residual norm 3.943330715530e-01 7842 KSP Residual norm 3.943330715530e-01 7843 KSP Residual norm 3.943330715530e-01 7844 KSP Residual norm 3.943330715530e-01 7845 KSP Residual norm 3.943330715530e-01 7846 KSP Residual norm 3.943330715530e-01 7847 KSP Residual norm 3.943330715530e-01 7848 KSP Residual norm 3.943330715530e-01 7849 KSP Residual norm 3.943330715530e-01 7850 KSP Residual norm 3.943330715530e-01 7851 KSP Residual norm 3.943330715530e-01 7852 KSP Residual norm 3.943330715530e-01 7853 KSP Residual norm 3.943330715530e-01 7854 KSP Residual norm 3.943330715530e-01 7855 KSP Residual norm 3.943330715530e-01 7856 KSP Residual norm 3.943330715530e-01 7857 KSP Residual norm 3.943330715530e-01 7858 KSP Residual norm 3.943330715530e-01 7859 KSP Residual norm 3.943330715530e-01 7860 KSP Residual norm 3.943330715529e-01 7861 KSP Residual norm 3.943330715529e-01 7862 KSP Residual norm 3.943330715529e-01 7863 KSP Residual norm 3.943330715529e-01 7864 KSP Residual norm 3.943330715529e-01 7865 KSP Residual norm 3.943330715529e-01 7866 KSP Residual norm 3.943330715529e-01 7867 KSP Residual norm 3.943330715529e-01 7868 KSP Residual norm 3.943330715529e-01 7869 KSP Residual norm 3.943330715529e-01 7870 KSP Residual norm 3.943330715529e-01 7871 KSP Residual norm 3.943330715529e-01 7872 KSP Residual norm 3.943330715529e-01 7873 KSP Residual norm 3.943330715529e-01 7874 KSP Residual norm 3.943330715529e-01 7875 KSP Residual norm 3.943330715529e-01 7876 KSP Residual norm 
3.943330715529e-01 7877 KSP Residual norm 3.943330715529e-01 7878 KSP Residual norm 3.943330715529e-01 7879 KSP Residual norm 3.943330715529e-01 7880 KSP Residual norm 3.943330715529e-01 7881 KSP Residual norm 3.943330715529e-01 7882 KSP Residual norm 3.943330715529e-01 7883 KSP Residual norm 3.943330715529e-01 7884 KSP Residual norm 3.943330715529e-01 7885 KSP Residual norm 3.943330715529e-01 7886 KSP Residual norm 3.943330715529e-01 7887 KSP Residual norm 3.943330715529e-01 7888 KSP Residual norm 3.943330715529e-01 7889 KSP Residual norm 3.943330715529e-01 7890 KSP Residual norm 3.943330715527e-01 7891 KSP Residual norm 3.943330715527e-01 7892 KSP Residual norm 3.943330715527e-01 7893 KSP Residual norm 3.943330715527e-01 7894 KSP Residual norm 3.943330715527e-01 7895 KSP Residual norm 3.943330715527e-01 7896 KSP Residual norm 3.943330715527e-01 7897 KSP Residual norm 3.943330715527e-01 7898 KSP Residual norm 3.943330715527e-01 7899 KSP Residual norm 3.943330715527e-01 7900 KSP Residual norm 3.943330715527e-01 7901 KSP Residual norm 3.943330715527e-01 7902 KSP Residual norm 3.943330715527e-01 7903 KSP Residual norm 3.943330715527e-01 7904 KSP Residual norm 3.943330715527e-01 7905 KSP Residual norm 3.943330715527e-01 7906 KSP Residual norm 3.943330715527e-01 7907 KSP Residual norm 3.943330715527e-01 7908 KSP Residual norm 3.943330715527e-01 7909 KSP Residual norm 3.943330715527e-01 7910 KSP Residual norm 3.943330715527e-01 7911 KSP Residual norm 3.943330715527e-01 7912 KSP Residual norm 3.943330715527e-01 7913 KSP Residual norm 3.943330715527e-01 7914 KSP Residual norm 3.943330715527e-01 7915 KSP Residual norm 3.943330715527e-01 7916 KSP Residual norm 3.943330715527e-01 7917 KSP Residual norm 3.943330715527e-01 7918 KSP Residual norm 3.943330715527e-01 7919 KSP Residual norm 3.943330715527e-01 7920 KSP Residual norm 3.943330715528e-01 7921 KSP Residual norm 3.943330715528e-01 7922 KSP Residual norm 3.943330715528e-01 7923 KSP Residual norm 3.943330715528e-01 7924 KSP Residual norm 3.943330715528e-01 7925 KSP Residual norm 3.943330715528e-01 7926 KSP Residual norm 3.943330715528e-01 7927 KSP Residual norm 3.943330715528e-01 7928 KSP Residual norm 3.943330715528e-01 7929 KSP Residual norm 3.943330715528e-01 7930 KSP Residual norm 3.943330715528e-01 7931 KSP Residual norm 3.943330715528e-01 7932 KSP Residual norm 3.943330715528e-01 7933 KSP Residual norm 3.943330715528e-01 7934 KSP Residual norm 3.943330715528e-01 7935 KSP Residual norm 3.943330715528e-01 7936 KSP Residual norm 3.943330715528e-01 7937 KSP Residual norm 3.943330715528e-01 7938 KSP Residual norm 3.943330715528e-01 7939 KSP Residual norm 3.943330715528e-01 7940 KSP Residual norm 3.943330715528e-01 7941 KSP Residual norm 3.943330715528e-01 7942 KSP Residual norm 3.943330715528e-01 7943 KSP Residual norm 3.943330715528e-01 7944 KSP Residual norm 3.943330715528e-01 7945 KSP Residual norm 3.943330715528e-01 7946 KSP Residual norm 3.943330715528e-01 7947 KSP Residual norm 3.943330715528e-01 7948 KSP Residual norm 3.943330715528e-01 7949 KSP Residual norm 3.943330715528e-01 7950 KSP Residual norm 3.943330715527e-01 7951 KSP Residual norm 3.943330715527e-01 7952 KSP Residual norm 3.943330715527e-01 7953 KSP Residual norm 3.943330715527e-01 7954 KSP Residual norm 3.943330715527e-01 7955 KSP Residual norm 3.943330715527e-01 7956 KSP Residual norm 3.943330715527e-01 7957 KSP Residual norm 3.943330715527e-01 7958 KSP Residual norm 3.943330715527e-01 7959 KSP Residual norm 3.943330715527e-01 7960 KSP Residual norm 3.943330715527e-01 7961 
KSP Residual norm 3.943330715527e-01 7962 KSP Residual norm 3.943330715527e-01 7963 KSP Residual norm 3.943330715527e-01 7964 KSP Residual norm 3.943330715527e-01 7965 KSP Residual norm 3.943330715527e-01 7966 KSP Residual norm 3.943330715527e-01 7967 KSP Residual norm 3.943330715527e-01 7968 KSP Residual norm 3.943330715527e-01 7969 KSP Residual norm 3.943330715527e-01 7970 KSP Residual norm 3.943330715527e-01 7971 KSP Residual norm 3.943330715527e-01 7972 KSP Residual norm 3.943330715527e-01 7973 KSP Residual norm 3.943330715527e-01 7974 KSP Residual norm 3.943330715527e-01 7975 KSP Residual norm 3.943330715527e-01 7976 KSP Residual norm 3.943330715527e-01 7977 KSP Residual norm 3.943330715527e-01 7978 KSP Residual norm 3.943330715527e-01 7979 KSP Residual norm 3.943330715527e-01 7980 KSP Residual norm 3.943330715528e-01 7981 KSP Residual norm 3.943330715528e-01 7982 KSP Residual norm 3.943330715528e-01 7983 KSP Residual norm 3.943330715528e-01 7984 KSP Residual norm 3.943330715528e-01 7985 KSP Residual norm 3.943330715528e-01 7986 KSP Residual norm 3.943330715528e-01 7987 KSP Residual norm 3.943330715528e-01 7988 KSP Residual norm 3.943330715528e-01 7989 KSP Residual norm 3.943330715528e-01 7990 KSP Residual norm 3.943330715528e-01 7991 KSP Residual norm 3.943330715528e-01 7992 KSP Residual norm 3.943330715528e-01 7993 KSP Residual norm 3.943330715528e-01 7994 KSP Residual norm 3.943330715528e-01 7995 KSP Residual norm 3.943330715528e-01 7996 KSP Residual norm 3.943330715528e-01 7997 KSP Residual norm 3.943330715528e-01 7998 KSP Residual norm 3.943330715528e-01 7999 KSP Residual norm 3.943330715528e-01 8000 KSP Residual norm 3.943330715528e-01 8001 KSP Residual norm 3.943330715528e-01 8002 KSP Residual norm 3.943330715528e-01 8003 KSP Residual norm 3.943330715528e-01 8004 KSP Residual norm 3.943330715528e-01 8005 KSP Residual norm 3.943330715528e-01 8006 KSP Residual norm 3.943330715528e-01 8007 KSP Residual norm 3.943330715528e-01 8008 KSP Residual norm 3.943330715528e-01 8009 KSP Residual norm 3.943330715528e-01 8010 KSP Residual norm 3.943330715528e-01 8011 KSP Residual norm 3.943330715528e-01 8012 KSP Residual norm 3.943330715528e-01 8013 KSP Residual norm 3.943330715528e-01 8014 KSP Residual norm 3.943330715528e-01 8015 KSP Residual norm 3.943330715528e-01 8016 KSP Residual norm 3.943330715528e-01 8017 KSP Residual norm 3.943330715528e-01 8018 KSP Residual norm 3.943330715528e-01 8019 KSP Residual norm 3.943330715528e-01 8020 KSP Residual norm 3.943330715528e-01 8021 KSP Residual norm 3.943330715528e-01 8022 KSP Residual norm 3.943330715528e-01 8023 KSP Residual norm 3.943330715528e-01 8024 KSP Residual norm 3.943330715528e-01 8025 KSP Residual norm 3.943330715528e-01 8026 KSP Residual norm 3.943330715528e-01 8027 KSP Residual norm 3.943330715528e-01 8028 KSP Residual norm 3.943330715528e-01 8029 KSP Residual norm 3.943330715528e-01 8030 KSP Residual norm 3.943330715528e-01 8031 KSP Residual norm 3.943330715528e-01 8032 KSP Residual norm 3.943330715528e-01 8033 KSP Residual norm 3.943330715528e-01 8034 KSP Residual norm 3.943330715528e-01 8035 KSP Residual norm 3.943330715528e-01 8036 KSP Residual norm 3.943330715528e-01 8037 KSP Residual norm 3.943330715528e-01 8038 KSP Residual norm 3.943330715528e-01 8039 KSP Residual norm 3.943330715528e-01 8040 KSP Residual norm 3.943330715527e-01 8041 KSP Residual norm 3.943330715527e-01 8042 KSP Residual norm 3.943330715527e-01 8043 KSP Residual norm 3.943330715527e-01 8044 KSP Residual norm 3.943330715527e-01 8045 KSP Residual norm 
3.943330715527e-01 8046 KSP Residual norm 3.943330715527e-01 8047 KSP Residual norm 3.943330715527e-01 8048 KSP Residual norm 3.943330715527e-01 8049 KSP Residual norm 3.943330715527e-01 8050 KSP Residual norm 3.943330715527e-01 8051 KSP Residual norm 3.943330715527e-01 8052 KSP Residual norm 3.943330715527e-01 8053 KSP Residual norm 3.943330715527e-01 8054 KSP Residual norm 3.943330715527e-01 8055 KSP Residual norm 3.943330715527e-01 8056 KSP Residual norm 3.943330715527e-01 8057 KSP Residual norm 3.943330715527e-01 8058 KSP Residual norm 3.943330715527e-01 8059 KSP Residual norm 3.943330715527e-01 8060 KSP Residual norm 3.943330715527e-01 8061 KSP Residual norm 3.943330715527e-01 8062 KSP Residual norm 3.943330715527e-01 8063 KSP Residual norm 3.943330715527e-01 8064 KSP Residual norm 3.943330715527e-01 8065 KSP Residual norm 3.943330715527e-01 8066 KSP Residual norm 3.943330715527e-01 8067 KSP Residual norm 3.943330715527e-01 8068 KSP Residual norm 3.943330715527e-01 8069 KSP Residual norm 3.943330715527e-01 8070 KSP Residual norm 3.943330715526e-01 8071 KSP Residual norm 3.943330715526e-01 8072 KSP Residual norm 3.943330715526e-01 8073 KSP Residual norm 3.943330715526e-01 8074 KSP Residual norm 3.943330715526e-01 8075 KSP Residual norm 3.943330715526e-01 8076 KSP Residual norm 3.943330715526e-01 8077 KSP Residual norm 3.943330715526e-01 8078 KSP Residual norm 3.943330715526e-01 8079 KSP Residual norm 3.943330715526e-01 8080 KSP Residual norm 3.943330715526e-01 8081 KSP Residual norm 3.943330715526e-01 8082 KSP Residual norm 3.943330715526e-01 8083 KSP Residual norm 3.943330715526e-01 8084 KSP Residual norm 3.943330715526e-01 8085 KSP Residual norm 3.943330715526e-01 8086 KSP Residual norm 3.943330715526e-01 8087 KSP Residual norm 3.943330715526e-01 8088 KSP Residual norm 3.943330715526e-01 8089 KSP Residual norm 3.943330715526e-01 8090 KSP Residual norm 3.943330715526e-01 8091 KSP Residual norm 3.943330715526e-01 8092 KSP Residual norm 3.943330715526e-01 8093 KSP Residual norm 3.943330715526e-01 8094 KSP Residual norm 3.943330715526e-01 8095 KSP Residual norm 3.943330715526e-01 8096 KSP Residual norm 3.943330715526e-01 8097 KSP Residual norm 3.943330715526e-01 8098 KSP Residual norm 3.943330715526e-01 8099 KSP Residual norm 3.943330715526e-01 8100 KSP Residual norm 3.943330715527e-01 8101 KSP Residual norm 3.943330715527e-01 8102 KSP Residual norm 3.943330715527e-01 8103 KSP Residual norm 3.943330715527e-01 8104 KSP Residual norm 3.943330715527e-01 8105 KSP Residual norm 3.943330715527e-01 8106 KSP Residual norm 3.943330715527e-01 8107 KSP Residual norm 3.943330715527e-01 8108 KSP Residual norm 3.943330715527e-01 8109 KSP Residual norm 3.943330715527e-01 8110 KSP Residual norm 3.943330715527e-01 8111 KSP Residual norm 3.943330715527e-01 8112 KSP Residual norm 3.943330715527e-01 8113 KSP Residual norm 3.943330715527e-01 8114 KSP Residual norm 3.943330715527e-01 8115 KSP Residual norm 3.943330715527e-01 8116 KSP Residual norm 3.943330715527e-01 8117 KSP Residual norm 3.943330715527e-01 8118 KSP Residual norm 3.943330715527e-01 8119 KSP Residual norm 3.943330715527e-01 8120 KSP Residual norm 3.943330715527e-01 8121 KSP Residual norm 3.943330715527e-01 8122 KSP Residual norm 3.943330715527e-01 8123 KSP Residual norm 3.943330715527e-01 8124 KSP Residual norm 3.943330715527e-01 8125 KSP Residual norm 3.943330715527e-01 8126 KSP Residual norm 3.943330715527e-01 8127 KSP Residual norm 3.943330715527e-01 8128 KSP Residual norm 3.943330715527e-01 8129 KSP Residual norm 3.943330715527e-01 8130 
KSP Residual norm 3.943330715528e-01 8131 KSP Residual norm 3.943330715528e-01 8132 KSP Residual norm 3.943330715528e-01 8133 KSP Residual norm 3.943330715528e-01 8134 KSP Residual norm 3.943330715528e-01 8135 KSP Residual norm 3.943330715528e-01 8136 KSP Residual norm 3.943330715528e-01 8137 KSP Residual norm 3.943330715528e-01 8138 KSP Residual norm 3.943330715528e-01 8139 KSP Residual norm 3.943330715528e-01 8140 KSP Residual norm 3.943330715528e-01 8141 KSP Residual norm 3.943330715528e-01 8142 KSP Residual norm 3.943330715528e-01 8143 KSP Residual norm 3.943330715528e-01 8144 KSP Residual norm 3.943330715528e-01 8145 KSP Residual norm 3.943330715528e-01 8146 KSP Residual norm 3.943330715528e-01 8147 KSP Residual norm 3.943330715528e-01 8148 KSP Residual norm 3.943330715528e-01 8149 KSP Residual norm 3.943330715528e-01 8150 KSP Residual norm 3.943330715528e-01 8151 KSP Residual norm 3.943330715528e-01 8152 KSP Residual norm 3.943330715528e-01 8153 KSP Residual norm 3.943330715528e-01 8154 KSP Residual norm 3.943330715528e-01 8155 KSP Residual norm 3.943330715528e-01 8156 KSP Residual norm 3.943330715528e-01 8157 KSP Residual norm 3.943330715528e-01 8158 KSP Residual norm 3.943330715528e-01 8159 KSP Residual norm 3.943330715528e-01 8160 KSP Residual norm 3.943330715528e-01 8161 KSP Residual norm 3.943330715528e-01 8162 KSP Residual norm 3.943330715528e-01 8163 KSP Residual norm 3.943330715528e-01 8164 KSP Residual norm 3.943330715528e-01 8165 KSP Residual norm 3.943330715528e-01 8166 KSP Residual norm 3.943330715528e-01 8167 KSP Residual norm 3.943330715528e-01 8168 KSP Residual norm 3.943330715528e-01 8169 KSP Residual norm 3.943330715528e-01 8170 KSP Residual norm 3.943330715528e-01 8171 KSP Residual norm 3.943330715528e-01 8172 KSP Residual norm 3.943330715528e-01 8173 KSP Residual norm 3.943330715528e-01 8174 KSP Residual norm 3.943330715528e-01 8175 KSP Residual norm 3.943330715528e-01 8176 KSP Residual norm 3.943330715528e-01 8177 KSP Residual norm 3.943330715528e-01 8178 KSP Residual norm 3.943330715528e-01 8179 KSP Residual norm 3.943330715528e-01 8180 KSP Residual norm 3.943330715528e-01 8181 KSP Residual norm 3.943330715528e-01 8182 KSP Residual norm 3.943330715528e-01 8183 KSP Residual norm 3.943330715528e-01 8184 KSP Residual norm 3.943330715528e-01 8185 KSP Residual norm 3.943330715528e-01 8186 KSP Residual norm 3.943330715528e-01 8187 KSP Residual norm 3.943330715528e-01 8188 KSP Residual norm 3.943330715528e-01 8189 KSP Residual norm 3.943330715528e-01 8190 KSP Residual norm 3.943330715530e-01 8191 KSP Residual norm 3.943330715530e-01 8192 KSP Residual norm 3.943330715530e-01 8193 KSP Residual norm 3.943330715530e-01 8194 KSP Residual norm 3.943330715530e-01 8195 KSP Residual norm 3.943330715530e-01 8196 KSP Residual norm 3.943330715530e-01 8197 KSP Residual norm 3.943330715530e-01 8198 KSP Residual norm 3.943330715530e-01 8199 KSP Residual norm 3.943330715530e-01 8200 KSP Residual norm 3.943330715530e-01 8201 KSP Residual norm 3.943330715530e-01 8202 KSP Residual norm 3.943330715530e-01 8203 KSP Residual norm 3.943330715530e-01 8204 KSP Residual norm 3.943330715530e-01 8205 KSP Residual norm 3.943330715530e-01 8206 KSP Residual norm 3.943330715530e-01 8207 KSP Residual norm 3.943330715530e-01 8208 KSP Residual norm 3.943330715530e-01 8209 KSP Residual norm 3.943330715530e-01 8210 KSP Residual norm 3.943330715530e-01 8211 KSP Residual norm 3.943330715530e-01 8212 KSP Residual norm 3.943330715530e-01 8213 KSP Residual norm 3.943330715530e-01 8214 KSP Residual norm 
3.943330715530e-01 8215 KSP Residual norm 3.943330715530e-01 8216 KSP Residual norm 3.943330715530e-01 8217 KSP Residual norm 3.943330715530e-01 8218 KSP Residual norm 3.943330715530e-01 8219 KSP Residual norm 3.943330715530e-01 8220 KSP Residual norm 3.943330715529e-01 8221 KSP Residual norm 3.943330715529e-01 8222 KSP Residual norm 3.943330715529e-01 8223 KSP Residual norm 3.943330715529e-01 8224 KSP Residual norm 3.943330715529e-01 8225 KSP Residual norm 3.943330715529e-01 8226 KSP Residual norm 3.943330715529e-01 8227 KSP Residual norm 3.943330715529e-01 8228 KSP Residual norm 3.943330715529e-01 8229 KSP Residual norm 3.943330715529e-01 8230 KSP Residual norm 3.943330715529e-01 8231 KSP Residual norm 3.943330715529e-01 8232 KSP Residual norm 3.943330715529e-01 8233 KSP Residual norm 3.943330715529e-01 8234 KSP Residual norm 3.943330715529e-01 8235 KSP Residual norm 3.943330715529e-01 8236 KSP Residual norm 3.943330715529e-01 8237 KSP Residual norm 3.943330715529e-01 8238 KSP Residual norm 3.943330715529e-01 8239 KSP Residual norm 3.943330715529e-01 8240 KSP Residual norm 3.943330715529e-01 8241 KSP Residual norm 3.943330715529e-01 8242 KSP Residual norm 3.943330715529e-01 8243 KSP Residual norm 3.943330715529e-01 8244 KSP Residual norm 3.943330715529e-01 8245 KSP Residual norm 3.943330715529e-01 8246 KSP Residual norm 3.943330715529e-01 8247 KSP Residual norm 3.943330715529e-01 8248 KSP Residual norm 3.943330715529e-01 8249 KSP Residual norm 3.943330715529e-01 8250 KSP Residual norm 3.943330715527e-01 8251 KSP Residual norm 3.943330715527e-01 8252 KSP Residual norm 3.943330715527e-01 8253 KSP Residual norm 3.943330715527e-01 8254 KSP Residual norm 3.943330715527e-01 8255 KSP Residual norm 3.943330715527e-01 8256 KSP Residual norm 3.943330715527e-01 8257 KSP Residual norm 3.943330715527e-01 8258 KSP Residual norm 3.943330715527e-01 8259 KSP Residual norm 3.943330715527e-01 8260 KSP Residual norm 3.943330715527e-01 8261 KSP Residual norm 3.943330715527e-01 8262 KSP Residual norm 3.943330715527e-01 8263 KSP Residual norm 3.943330715527e-01 8264 KSP Residual norm 3.943330715527e-01 8265 KSP Residual norm 3.943330715527e-01 8266 KSP Residual norm 3.943330715527e-01 8267 KSP Residual norm 3.943330715527e-01 8268 KSP Residual norm 3.943330715527e-01 8269 KSP Residual norm 3.943330715527e-01 8270 KSP Residual norm 3.943330715527e-01 8271 KSP Residual norm 3.943330715527e-01 8272 KSP Residual norm 3.943330715527e-01 8273 KSP Residual norm 3.943330715527e-01 8274 KSP Residual norm 3.943330715527e-01 8275 KSP Residual norm 3.943330715527e-01 8276 KSP Residual norm 3.943330715527e-01 8277 KSP Residual norm 3.943330715527e-01 8278 KSP Residual norm 3.943330715527e-01 8279 KSP Residual norm 3.943330715527e-01 8280 KSP Residual norm 3.943330715528e-01 8281 KSP Residual norm 3.943330715528e-01 8282 KSP Residual norm 3.943330715528e-01 8283 KSP Residual norm 3.943330715528e-01 8284 KSP Residual norm 3.943330715528e-01 8285 KSP Residual norm 3.943330715528e-01 8286 KSP Residual norm 3.943330715528e-01 8287 KSP Residual norm 3.943330715528e-01 8288 KSP Residual norm 3.943330715528e-01 8289 KSP Residual norm 3.943330715528e-01 8290 KSP Residual norm 3.943330715528e-01 8291 KSP Residual norm 3.943330715528e-01 8292 KSP Residual norm 3.943330715528e-01 8293 KSP Residual norm 3.943330715528e-01 8294 KSP Residual norm 3.943330715528e-01 8295 KSP Residual norm 3.943330715528e-01 8296 KSP Residual norm 3.943330715528e-01 8297 KSP Residual norm 3.943330715528e-01 8298 KSP Residual norm 3.943330715528e-01 8299 
KSP Residual norm 3.943330715528e-01 8300 KSP Residual norm 3.943330715528e-01 8301 KSP Residual norm 3.943330715528e-01 8302 KSP Residual norm 3.943330715528e-01 8303 KSP Residual norm 3.943330715528e-01 8304 KSP Residual norm 3.943330715528e-01 8305 KSP Residual norm 3.943330715528e-01 8306 KSP Residual norm 3.943330715528e-01 8307 KSP Residual norm 3.943330715528e-01 8308 KSP Residual norm 3.943330715528e-01 8309 KSP Residual norm 3.943330715528e-01 8310 KSP Residual norm 3.943330715526e-01 8311 KSP Residual norm 3.943330715526e-01 8312 KSP Residual norm 3.943330715526e-01 8313 KSP Residual norm 3.943330715526e-01 8314 KSP Residual norm 3.943330715526e-01 8315 KSP Residual norm 3.943330715526e-01 8316 KSP Residual norm 3.943330715526e-01 8317 KSP Residual norm 3.943330715526e-01 8318 KSP Residual norm 3.943330715526e-01 8319 KSP Residual norm 3.943330715526e-01 8320 KSP Residual norm 3.943330715526e-01 8321 KSP Residual norm 3.943330715526e-01 8322 KSP Residual norm 3.943330715526e-01 8323 KSP Residual norm 3.943330715526e-01 8324 KSP Residual norm 3.943330715526e-01 8325 KSP Residual norm 3.943330715526e-01 8326 KSP Residual norm 3.943330715526e-01 8327 KSP Residual norm 3.943330715526e-01 8328 KSP Residual norm 3.943330715526e-01 8329 KSP Residual norm 3.943330715526e-01 8330 KSP Residual norm 3.943330715526e-01 8331 KSP Residual norm 3.943330715526e-01 8332 KSP Residual norm 3.943330715526e-01 8333 KSP Residual norm 3.943330715526e-01 8334 KSP Residual norm 3.943330715526e-01 8335 KSP Residual norm 3.943330715526e-01 8336 KSP Residual norm 3.943330715526e-01 8337 KSP Residual norm 3.943330715526e-01 8338 KSP Residual norm 3.943330715526e-01 8339 KSP Residual norm 3.943330715526e-01 8340 KSP Residual norm 3.943330715529e-01 8341 KSP Residual norm 3.943330715529e-01 8342 KSP Residual norm 3.943330715529e-01 8343 KSP Residual norm 3.943330715529e-01 8344 KSP Residual norm 3.943330715529e-01 8345 KSP Residual norm 3.943330715529e-01 8346 KSP Residual norm 3.943330715529e-01 8347 KSP Residual norm 3.943330715529e-01 8348 KSP Residual norm 3.943330715529e-01 8349 KSP Residual norm 3.943330715529e-01 8350 KSP Residual norm 3.943330715529e-01 8351 KSP Residual norm 3.943330715529e-01 8352 KSP Residual norm 3.943330715529e-01 8353 KSP Residual norm 3.943330715529e-01 8354 KSP Residual norm 3.943330715529e-01 8355 KSP Residual norm 3.943330715529e-01 8356 KSP Residual norm 3.943330715529e-01 8357 KSP Residual norm 3.943330715529e-01 8358 KSP Residual norm 3.943330715529e-01 8359 KSP Residual norm 3.943330715529e-01 8360 KSP Residual norm 3.943330715529e-01 8361 KSP Residual norm 3.943330715529e-01 8362 KSP Residual norm 3.943330715529e-01 8363 KSP Residual norm 3.943330715529e-01 8364 KSP Residual norm 3.943330715529e-01 8365 KSP Residual norm 3.943330715529e-01 8366 KSP Residual norm 3.943330715529e-01 8367 KSP Residual norm 3.943330715529e-01 8368 KSP Residual norm 3.943330715529e-01 8369 KSP Residual norm 3.943330715529e-01 8370 KSP Residual norm 3.943330715530e-01 8371 KSP Residual norm 3.943330715530e-01 8372 KSP Residual norm 3.943330715530e-01 8373 KSP Residual norm 3.943330715530e-01 8374 KSP Residual norm 3.943330715530e-01 8375 KSP Residual norm 3.943330715530e-01 8376 KSP Residual norm 3.943330715530e-01 8377 KSP Residual norm 3.943330715530e-01 8378 KSP Residual norm 3.943330715530e-01 8379 KSP Residual norm 3.943330715530e-01 8380 KSP Residual norm 3.943330715530e-01 8381 KSP Residual norm 3.943330715530e-01 8382 KSP Residual norm 3.943330715530e-01 8383 KSP Residual norm 
3.943330715530e-01 8384 KSP Residual norm 3.943330715530e-01 8385 KSP Residual norm 3.943330715530e-01 8386 KSP Residual norm 3.943330715530e-01 8387 KSP Residual norm 3.943330715530e-01 8388 KSP Residual norm 3.943330715530e-01 8389 KSP Residual norm 3.943330715530e-01 8390 KSP Residual norm 3.943330715530e-01 8391 KSP Residual norm 3.943330715530e-01 8392 KSP Residual norm 3.943330715530e-01 8393 KSP Residual norm 3.943330715530e-01 8394 KSP Residual norm 3.943330715530e-01 8395 KSP Residual norm 3.943330715530e-01 8396 KSP Residual norm 3.943330715530e-01 8397 KSP Residual norm 3.943330715530e-01 8398 KSP Residual norm 3.943330715530e-01 8399 KSP Residual norm 3.943330715530e-01 8400 KSP Residual norm 3.943330715530e-01 8401 KSP Residual norm 3.943330715530e-01 8402 KSP Residual norm 3.943330715530e-01 8403 KSP Residual norm 3.943330715530e-01 8404 KSP Residual norm 3.943330715530e-01 8405 KSP Residual norm 3.943330715530e-01 8406 KSP Residual norm 3.943330715530e-01 8407 KSP Residual norm 3.943330715530e-01 8408 KSP Residual norm 3.943330715530e-01 8409 KSP Residual norm 3.943330715530e-01 8410 KSP Residual norm 3.943330715530e-01 8411 KSP Residual norm 3.943330715530e-01 8412 KSP Residual norm 3.943330715530e-01 8413 KSP Residual norm 3.943330715530e-01 8414 KSP Residual norm 3.943330715530e-01 8415 KSP Residual norm 3.943330715530e-01 8416 KSP Residual norm 3.943330715530e-01 8417 KSP Residual norm 3.943330715530e-01 8418 KSP Residual norm 3.943330715530e-01 8419 KSP Residual norm 3.943330715530e-01 8420 KSP Residual norm 3.943330715530e-01 8421 KSP Residual norm 3.943330715530e-01 8422 KSP Residual norm 3.943330715530e-01 8423 KSP Residual norm 3.943330715530e-01 8424 KSP Residual norm 3.943330715530e-01 8425 KSP Residual norm 3.943330715530e-01 8426 KSP Residual norm 3.943330715530e-01 8427 KSP Residual norm 3.943330715530e-01 8428 KSP Residual norm 3.943330715530e-01 8429 KSP Residual norm 3.943330715530e-01 8430 KSP Residual norm 3.943330715530e-01 8431 KSP Residual norm 3.943330715530e-01 8432 KSP Residual norm 3.943330715530e-01 8433 KSP Residual norm 3.943330715530e-01 8434 KSP Residual norm 3.943330715530e-01 8435 KSP Residual norm 3.943330715530e-01 8436 KSP Residual norm 3.943330715530e-01 8437 KSP Residual norm 3.943330715530e-01 8438 KSP Residual norm 3.943330715530e-01 8439 KSP Residual norm 3.943330715530e-01 8440 KSP Residual norm 3.943330715530e-01 8441 KSP Residual norm 3.943330715530e-01 8442 KSP Residual norm 3.943330715530e-01 8443 KSP Residual norm 3.943330715530e-01 8444 KSP Residual norm 3.943330715530e-01 8445 KSP Residual norm 3.943330715530e-01 8446 KSP Residual norm 3.943330715530e-01 8447 KSP Residual norm 3.943330715530e-01 8448 KSP Residual norm 3.943330715530e-01 8449 KSP Residual norm 3.943330715530e-01 8450 KSP Residual norm 3.943330715530e-01 8451 KSP Residual norm 3.943330715530e-01 8452 KSP Residual norm 3.943330715530e-01 8453 KSP Residual norm 3.943330715530e-01 8454 KSP Residual norm 3.943330715530e-01 8455 KSP Residual norm 3.943330715530e-01 8456 KSP Residual norm 3.943330715530e-01 8457 KSP Residual norm 3.943330715530e-01 8458 KSP Residual norm 3.943330715530e-01 8459 KSP Residual norm 3.943330715530e-01 8460 KSP Residual norm 3.943330715531e-01 8461 KSP Residual norm 3.943330715531e-01 8462 KSP Residual norm 3.943330715531e-01 8463 KSP Residual norm 3.943330715531e-01 8464 KSP Residual norm 3.943330715531e-01 8465 KSP Residual norm 3.943330715531e-01 8466 KSP Residual norm 3.943330715531e-01 8467 KSP Residual norm 3.943330715531e-01 8468 
KSP Residual norm 3.943330715531e-01 8469 KSP Residual norm 3.943330715531e-01 8470 KSP Residual norm 3.943330715531e-01 8471 KSP Residual norm 3.943330715531e-01 8472 KSP Residual norm 3.943330715531e-01 8473 KSP Residual norm 3.943330715531e-01 8474 KSP Residual norm 3.943330715531e-01 8475 KSP Residual norm 3.943330715531e-01 8476 KSP Residual norm 3.943330715531e-01 8477 KSP Residual norm 3.943330715531e-01 8478 KSP Residual norm 3.943330715531e-01 8479 KSP Residual norm 3.943330715531e-01 8480 KSP Residual norm 3.943330715531e-01 8481 KSP Residual norm 3.943330715531e-01 8482 KSP Residual norm 3.943330715531e-01 8483 KSP Residual norm 3.943330715531e-01 8484 KSP Residual norm 3.943330715531e-01 8485 KSP Residual norm 3.943330715531e-01 8486 KSP Residual norm 3.943330715531e-01 8487 KSP Residual norm 3.943330715531e-01 8488 KSP Residual norm 3.943330715531e-01 8489 KSP Residual norm 3.943330715531e-01 8490 KSP Residual norm 3.943330715531e-01 8491 KSP Residual norm 3.943330715531e-01 8492 KSP Residual norm 3.943330715531e-01 8493 KSP Residual norm 3.943330715531e-01 8494 KSP Residual norm 3.943330715531e-01 8495 KSP Residual norm 3.943330715531e-01 8496 KSP Residual norm 3.943330715531e-01 8497 KSP Residual norm 3.943330715531e-01 8498 KSP Residual norm 3.943330715531e-01 8499 KSP Residual norm 3.943330715531e-01 8500 KSP Residual norm 3.943330715531e-01 8501 KSP Residual norm 3.943330715531e-01 8502 KSP Residual norm 3.943330715531e-01 8503 KSP Residual norm 3.943330715531e-01 8504 KSP Residual norm 3.943330715531e-01 8505 KSP Residual norm 3.943330715531e-01 8506 KSP Residual norm 3.943330715531e-01 8507 KSP Residual norm 3.943330715531e-01 8508 KSP Residual norm 3.943330715531e-01 8509 KSP Residual norm 3.943330715531e-01 8510 KSP Residual norm 3.943330715531e-01 8511 KSP Residual norm 3.943330715531e-01 8512 KSP Residual norm 3.943330715531e-01 8513 KSP Residual norm 3.943330715531e-01 8514 KSP Residual norm 3.943330715531e-01 8515 KSP Residual norm 3.943330715531e-01 8516 KSP Residual norm 3.943330715531e-01 8517 KSP Residual norm 3.943330715531e-01 8518 KSP Residual norm 3.943330715531e-01 8519 KSP Residual norm 3.943330715531e-01 8520 KSP Residual norm 3.943330715529e-01 8521 KSP Residual norm 3.943330715529e-01 8522 KSP Residual norm 3.943330715529e-01 8523 KSP Residual norm 3.943330715529e-01 8524 KSP Residual norm 3.943330715529e-01 8525 KSP Residual norm 3.943330715529e-01 8526 KSP Residual norm 3.943330715529e-01 8527 KSP Residual norm 3.943330715529e-01 8528 KSP Residual norm 3.943330715529e-01 8529 KSP Residual norm 3.943330715529e-01 8530 KSP Residual norm 3.943330715529e-01 8531 KSP Residual norm 3.943330715529e-01 8532 KSP Residual norm 3.943330715529e-01 8533 KSP Residual norm 3.943330715529e-01 8534 KSP Residual norm 3.943330715529e-01 8535 KSP Residual norm 3.943330715529e-01 8536 KSP Residual norm 3.943330715529e-01 8537 KSP Residual norm 3.943330715529e-01 8538 KSP Residual norm 3.943330715529e-01 8539 KSP Residual norm 3.943330715529e-01 8540 KSP Residual norm 3.943330715529e-01 8541 KSP Residual norm 3.943330715529e-01 8542 KSP Residual norm 3.943330715529e-01 8543 KSP Residual norm 3.943330715529e-01 8544 KSP Residual norm 3.943330715529e-01 8545 KSP Residual norm 3.943330715529e-01 8546 KSP Residual norm 3.943330715529e-01 8547 KSP Residual norm 3.943330715529e-01 8548 KSP Residual norm 3.943330715529e-01 8549 KSP Residual norm 3.943330715529e-01 8550 KSP Residual norm 3.943330715528e-01 8551 KSP Residual norm 3.943330715528e-01 8552 KSP Residual norm 
3.943330715528e-01 8553 KSP Residual norm 3.943330715528e-01 8554 KSP Residual norm 3.943330715528e-01 8555 KSP Residual norm 3.943330715528e-01 8556 KSP Residual norm 3.943330715528e-01 8557 KSP Residual norm 3.943330715528e-01 8558 KSP Residual norm 3.943330715528e-01 8559 KSP Residual norm 3.943330715528e-01 8560 KSP Residual norm 3.943330715528e-01 8561 KSP Residual norm 3.943330715528e-01 8562 KSP Residual norm 3.943330715528e-01 8563 KSP Residual norm 3.943330715528e-01 8564 KSP Residual norm 3.943330715528e-01 8565 KSP Residual norm 3.943330715528e-01 8566 KSP Residual norm 3.943330715528e-01 8567 KSP Residual norm 3.943330715528e-01 8568 KSP Residual norm 3.943330715528e-01 8569 KSP Residual norm 3.943330715528e-01 8570 KSP Residual norm 3.943330715528e-01 8571 KSP Residual norm 3.943330715528e-01 8572 KSP Residual norm 3.943330715528e-01 8573 KSP Residual norm 3.943330715528e-01 8574 KSP Residual norm 3.943330715528e-01 8575 KSP Residual norm 3.943330715528e-01 8576 KSP Residual norm 3.943330715528e-01 8577 KSP Residual norm 3.943330715528e-01 8578 KSP Residual norm 3.943330715528e-01 8579 KSP Residual norm 3.943330715528e-01 8580 KSP Residual norm 3.943330715528e-01 8581 KSP Residual norm 3.943330715528e-01 8582 KSP Residual norm 3.943330715528e-01 8583 KSP Residual norm 3.943330715528e-01 8584 KSP Residual norm 3.943330715528e-01 8585 KSP Residual norm 3.943330715528e-01 8586 KSP Residual norm 3.943330715528e-01 8587 KSP Residual norm 3.943330715528e-01 8588 KSP Residual norm 3.943330715528e-01 8589 KSP Residual norm 3.943330715528e-01 8590 KSP Residual norm 3.943330715528e-01 8591 KSP Residual norm 3.943330715528e-01 8592 KSP Residual norm 3.943330715528e-01 8593 KSP Residual norm 3.943330715528e-01 8594 KSP Residual norm 3.943330715528e-01 8595 KSP Residual norm 3.943330715528e-01 8596 KSP Residual norm 3.943330715528e-01 8597 KSP Residual norm 3.943330715528e-01 8598 KSP Residual norm 3.943330715528e-01 8599 KSP Residual norm 3.943330715528e-01 8600 KSP Residual norm 3.943330715528e-01 8601 KSP Residual norm 3.943330715528e-01 8602 KSP Residual norm 3.943330715528e-01 8603 KSP Residual norm 3.943330715528e-01 8604 KSP Residual norm 3.943330715528e-01 8605 KSP Residual norm 3.943330715528e-01 8606 KSP Residual norm 3.943330715528e-01 8607 KSP Residual norm 3.943330715528e-01 8608 KSP Residual norm 3.943330715528e-01 8609 KSP Residual norm 3.943330715528e-01 8610 KSP Residual norm 3.943330715528e-01 8611 KSP Residual norm 3.943330715528e-01 8612 KSP Residual norm 3.943330715528e-01 8613 KSP Residual norm 3.943330715528e-01 8614 KSP Residual norm 3.943330715528e-01 8615 KSP Residual norm 3.943330715528e-01 8616 KSP Residual norm 3.943330715528e-01 8617 KSP Residual norm 3.943330715528e-01 8618 KSP Residual norm 3.943330715528e-01 8619 KSP Residual norm 3.943330715528e-01 8620 KSP Residual norm 3.943330715528e-01 8621 KSP Residual norm 3.943330715528e-01 8622 KSP Residual norm 3.943330715528e-01 8623 KSP Residual norm 3.943330715528e-01 8624 KSP Residual norm 3.943330715528e-01 8625 KSP Residual norm 3.943330715528e-01 8626 KSP Residual norm 3.943330715528e-01 8627 KSP Residual norm 3.943330715528e-01 8628 KSP Residual norm 3.943330715528e-01 8629 KSP Residual norm 3.943330715528e-01 8630 KSP Residual norm 3.943330715528e-01 8631 KSP Residual norm 3.943330715528e-01 8632 KSP Residual norm 3.943330715528e-01 8633 KSP Residual norm 3.943330715528e-01 8634 KSP Residual norm 3.943330715528e-01 8635 KSP Residual norm 3.943330715528e-01 8636 KSP Residual norm 3.943330715528e-01 8637 
KSP Residual norm 3.943330715528e-01 8638 KSP Residual norm 3.943330715528e-01 8639 KSP Residual norm 3.943330715528e-01 8640 KSP Residual norm 3.943330715530e-01 8641 KSP Residual norm 3.943330715530e-01 8642 KSP Residual norm 3.943330715530e-01 8643 KSP Residual norm 3.943330715530e-01 8644 KSP Residual norm 3.943330715530e-01 8645 KSP Residual norm 3.943330715530e-01 8646 KSP Residual norm 3.943330715530e-01 8647 KSP Residual norm 3.943330715530e-01 8648 KSP Residual norm 3.943330715530e-01 8649 KSP Residual norm 3.943330715530e-01 8650 KSP Residual norm 3.943330715530e-01 8651 KSP Residual norm 3.943330715530e-01 8652 KSP Residual norm 3.943330715530e-01 8653 KSP Residual norm 3.943330715530e-01 8654 KSP Residual norm 3.943330715530e-01 8655 KSP Residual norm 3.943330715530e-01 8656 KSP Residual norm 3.943330715530e-01 8657 KSP Residual norm 3.943330715530e-01 8658 KSP Residual norm 3.943330715530e-01 8659 KSP Residual norm 3.943330715530e-01 8660 KSP Residual norm 3.943330715530e-01 8661 KSP Residual norm 3.943330715530e-01 8662 KSP Residual norm 3.943330715530e-01 8663 KSP Residual norm 3.943330715530e-01 8664 KSP Residual norm 3.943330715530e-01 8665 KSP Residual norm 3.943330715530e-01 8666 KSP Residual norm 3.943330715530e-01 8667 KSP Residual norm 3.943330715530e-01 8668 KSP Residual norm 3.943330715530e-01 8669 KSP Residual norm 3.943330715530e-01 8670 KSP Residual norm 3.943330715531e-01 8671 KSP Residual norm 3.943330715531e-01 8672 KSP Residual norm 3.943330715531e-01 8673 KSP Residual norm 3.943330715531e-01 8674 KSP Residual norm 3.943330715531e-01 8675 KSP Residual norm 3.943330715531e-01 8676 KSP Residual norm 3.943330715531e-01 8677 KSP Residual norm 3.943330715531e-01 8678 KSP Residual norm 3.943330715531e-01 8679 KSP Residual norm 3.943330715531e-01 8680 KSP Residual norm 3.943330715531e-01 8681 KSP Residual norm 3.943330715531e-01 8682 KSP Residual norm 3.943330715531e-01 8683 KSP Residual norm 3.943330715531e-01 8684 KSP Residual norm 3.943330715531e-01 8685 KSP Residual norm 3.943330715531e-01 8686 KSP Residual norm 3.943330715531e-01 8687 KSP Residual norm 3.943330715531e-01 8688 KSP Residual norm 3.943330715531e-01 8689 KSP Residual norm 3.943330715531e-01 8690 KSP Residual norm 3.943330715531e-01 8691 KSP Residual norm 3.943330715531e-01 8692 KSP Residual norm 3.943330715531e-01 8693 KSP Residual norm 3.943330715531e-01 8694 KSP Residual norm 3.943330715531e-01 8695 KSP Residual norm 3.943330715531e-01 8696 KSP Residual norm 3.943330715531e-01 8697 KSP Residual norm 3.943330715531e-01 8698 KSP Residual norm 3.943330715531e-01 8699 KSP Residual norm 3.943330715531e-01 8700 KSP Residual norm 3.943330715531e-01 8701 KSP Residual norm 3.943330715531e-01 8702 KSP Residual norm 3.943330715531e-01 8703 KSP Residual norm 3.943330715531e-01 8704 KSP Residual norm 3.943330715531e-01 8705 KSP Residual norm 3.943330715531e-01 8706 KSP Residual norm 3.943330715531e-01 8707 KSP Residual norm 3.943330715531e-01 8708 KSP Residual norm 3.943330715531e-01 8709 KSP Residual norm 3.943330715531e-01 8710 KSP Residual norm 3.943330715531e-01 8711 KSP Residual norm 3.943330715531e-01 8712 KSP Residual norm 3.943330715531e-01 8713 KSP Residual norm 3.943330715531e-01 8714 KSP Residual norm 3.943330715531e-01 8715 KSP Residual norm 3.943330715531e-01 8716 KSP Residual norm 3.943330715531e-01 8717 KSP Residual norm 3.943330715531e-01 8718 KSP Residual norm 3.943330715531e-01 8719 KSP Residual norm 3.943330715531e-01 8720 KSP Residual norm 3.943330715531e-01 8721 KSP Residual norm 
3.943330715531e-01 [roughly 1,270 further entries of the KSP monitor output are elided here: every iteration from 8722 through 9989 reports a residual norm of approximately 3.9433307155e-01, i.e. the Krylov iteration has stagnated] 9989
KSP Residual norm 3.943330715532e-01 9990 KSP Residual norm 3.943330715532e-01 9991 KSP Residual norm 3.943330715532e-01 9992 KSP Residual norm 3.943330715532e-01 9993 KSP Residual norm 3.943330715532e-01 9994 KSP Residual norm 3.943330715532e-01 9995 KSP Residual norm 3.943330715532e-01 9996 KSP Residual norm 3.943330715532e-01 9997 KSP Residual norm 3.943330715532e-01 9998 KSP Residual norm 3.943330715532e-01 9999 KSP Residual norm 3.943330715532e-01 10000 KSP Residual norm 3.943330715532e-01 -------------- next part -------------- KSP Object: 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=5, initial guess is zero tolerances: relative=1e-08, absolute=1e-16, divergence=1e+16 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from A11 Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5, needed 1 Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=2000, cols=2000 package used to perform factorization: petsc total: nonzeros=40000, allocated nonzeros=40000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=2000, cols=2000 total: nonzeros=40000, allocated nonzeros=40000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_1_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_1_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5, needed 2.62994 Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=330, cols=330 package used to perform factorization: petsc total: nonzeros=20098, allocated nonzeros=20098 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 106 nodes, limit used is 5 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_1_) 1 MPI processes type: schurcomplement rows=330, cols=330 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_1_) 1 MPI processes type: seqaij rows=330, cols=330 total: nonzeros=7642, allocated nonzeros=7642 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 121 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=330, cols=2000 total: nonzeros=22800, allocated nonzeros=22800 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 121 nodes, limit used is 5 KSP of A00 KSP Object: 
(fieldsplit_0_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5, needed 1 Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=2000, cols=2000 package used to perform factorization: petsc total: nonzeros=40000, allocated nonzeros=40000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=2000, cols=2000 total: nonzeros=40000, allocated nonzeros=40000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=2000, cols=330 total: nonzeros=22800, allocated nonzeros=22800 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 Mat Object: (fieldsplit_1_) 1 MPI processes type: seqaij rows=330, cols=330 total: nonzeros=7642, allocated nonzeros=7642 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 121 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=2330, cols=2330 total: nonzeros=93242, allocated nonzeros=93242 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 521 nodes, limit used is 5 From knepley at gmail.com Tue Mar 11 10:36:28 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 11 Mar 2014 10:36:28 -0500 Subject: [petsc-users] Fieldsplit schur complement with preonly solves In-Reply-To: References: Message-ID: On Tue, Mar 11, 2014 at 9:56 AM, Luc Berger-Vergiat < luc.berger.vergiat at gmail.com> wrote: > Hi all, > I am testing some preconditioners for a FEM problem involving different > types of fields (displacements, temperature, stresses and plastic strain). > To make sure that things are working correctly I am first solving this > problem with: > > -ksp_type preonly -pc_type lu, which works fine obviously. > > > Then I move on to do: > > -ksp_type gmres -pc_type lu, and I get very good convergence (one gmres > linear iteration per time step) which I expected. > > > So solving the problem exactly in a preconditioner to gmres leads to > optimal results. > This can be done using a Schur complement, but when I pass the following > options: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 > -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type > preonly -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type > preonly -fieldsplit_1_pc_type lu > > My results are terrible, gmres does not converge and my FEM code reduces > the size of the time step in order to converge. > This does not make much sense to me... > The problem is the Schur complement block. We have S = C A^{-1} B PETSc does not form S explicitly, since it would require forming the dense inverse of A explicitly. Thus we only calculate the action of A. If you look in -ksp_view, you will see that the preconditioner for S is formed from A_11, which it sounds like is 0 in your case, so the LU of that is a crud preconditioner. 
Once you wrap the solve in GMRES, it will eventually converge. You can try using the LSC stuff if you do not have a preconditioner matrix for the Schur complement. Thanks, Matt > Curiously if I use the following options: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 > -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type gmres > -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type > lu > > then the global gmres converges in two iterations. > > So using a pair of ksp gmres/pc lu on the A00 block and the Schur > complements works, but using lu directly doesn't. > > Because I think that all this is quite strange, I decided to dump some > matrices out. Namely, I dumped the complete FEM jacobian, I also do a > MatView on jac->B, jac->C and the result of KSPGetOperators on kspA. These > returns three out of the four blocks needed to do the Schur complement. > They are correct and I assume that the last block is also correct. > When I import jac->B, jac->C and the matrix corresponding to kspA in > MATLAB to compute the inverse of the Schur complement and pass it to gmres > as preconditioner the problem is solved in 1 iteration. > > So MATLAB says: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 > -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type > preonly -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type > preonly -fieldsplit_1_pc_type lu > > should yield only one iteration (maybe two depending on implementation). > > Any ideas why the Petsc doesn't solve this correctly? > > Best, > Luc > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From luc.berger.vergiat at gmail.com Tue Mar 11 11:05:38 2014 From: luc.berger.vergiat at gmail.com (Luc) Date: Tue, 11 Mar 2014 12:05:38 -0400 Subject: [petsc-users] Fieldsplit schur complement with preonly solves In-Reply-To: References: Message-ID: <531F3452.8050903@gmail.com> Hum, would a *-pc_fieldsplit_schur_precondition self *use the full Schur as preconditioner for itself? I made some special choices in order to keep A diagonal which makes it cheap to inverse. Actually I am assuming that Schur will be blazing fast with my type of discretization... Best, Luc On 03/11/2014 11:36 AM, Matthew Knepley wrote: > On Tue, Mar 11, 2014 at 9:56 AM, Luc Berger-Vergiat > > > wrote: > > Hi all, > I am testing some preconditioners for a FEM problem involving > different types of fields (displacements, temperature, stresses > and plastic strain). > To make sure that things are working correctly I am first solving > this problem with: > > -ksp_type preonly -pc_type lu, which works fine obviously. > > > Then I move on to do: > > -ksp_type gmres -pc_type lu, and I get very good convergence > (one gmres linear iteration per time step) which I expected. > > > So solving the problem exactly in a preconditioner to gmres leads > to optimal results. 
> This can be done using a Schur complement, but when I pass the > following options: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 > -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type > lu -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu > > My results are terrible, gmres does not converge and my FEM code > reduces the size of the time step in order to converge. > This does not make much sense to me... > > > The problem is the Schur complement block. We have > > S = C A^{-1} B > > PETSc does not form S explicitly, since it would require forming the dense > inverse of A explicitly. Thus we only calculate the action of A. If > you look in > -ksp_view, you will see that the preconditioner for S is formed from A_11, > which it sounds like is 0 in your case, so the LU of that is a crud > preconditioner. > Once you wrap the solve in GMRES, it will eventually converge. > > You can try using the LSC stuff if you do not have a preconditioner matrix > for the Schur complement. > > Thanks, > > Matt > > Curiously if I use the following options: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 > -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type > lu -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type lu > > then the global gmres converges in two iterations. > > So using a pair of ksp gmres/pc lu on the A00 block and the Schur > complements works, but using lu directly doesn't. > > Because I think that all this is quite strange, I decided to dump > some matrices out. Namely, I dumped the complete FEM jacobian, I > also do a MatView on jac->B, jac->C and the result of > KSPGetOperators on kspA. These returns three out of the four > blocks needed to do the Schur complement. They are correct and I > assume that the last block is also correct. > When I import jac->B, jac->C and the matrix corresponding to kspA > in MATLAB to compute the inverse of the Schur complement and pass > it to gmres as preconditioner the problem is solved in 1 iteration. > > So MATLAB says: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 > -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type > lu -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu > > should yield only one iteration (maybe two depending on > implementation). > > Any ideas why the Petsc doesn't solve this correctly? > > Best, > Luc > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 11 11:19:55 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 11 Mar 2014 11:19:55 -0500 Subject: [petsc-users] Fieldsplit schur complement with preonly solves In-Reply-To: <531F3452.8050903@gmail.com> References: <531F3452.8050903@gmail.com> Message-ID: On Tue, Mar 11, 2014 at 11:05 AM, Luc wrote: > Hum, > would a *-pc_fieldsplit_schur_precondition self *use the full Schur as > preconditioner for itself? > I made some special choices in order to keep A diagonal which makes it > cheap to inverse. 
> Actually I am assuming that Schur will be blazing fast with my type of > discretization... > We reorganized, so that now what you want is "selfp": http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/PC/PCFieldSplitSchurPrecondition.html Thanks, Matt > Best, > > Luc > > On 03/11/2014 11:36 AM, Matthew Knepley wrote: > > On Tue, Mar 11, 2014 at 9:56 AM, Luc Berger-Vergiat < > luc.berger.vergiat at gmail.com> wrote: > >> Hi all, >> I am testing some preconditioners for a FEM problem involving different >> types of fields (displacements, temperature, stresses and plastic strain). >> To make sure that things are working correctly I am first solving this >> problem with: >> >> -ksp_type preonly -pc_type lu, which works fine obviously. >> >> >> Then I move on to do: >> >> -ksp_type gmres -pc_type lu, and I get very good convergence (one gmres >> linear iteration per time step) which I expected. >> >> >> So solving the problem exactly in a preconditioner to gmres leads to >> optimal results. >> This can be done using a Schur complement, but when I pass the following >> options: >> >> -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur >> -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 >> -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type >> preonly -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type >> preonly -fieldsplit_1_pc_type lu >> >> My results are terrible, gmres does not converge and my FEM code reduces >> the size of the time step in order to converge. >> This does not make much sense to me... >> > > The problem is the Schur complement block. We have > > S = C A^{-1} B > > PETSc does not form S explicitly, since it would require forming the > dense > inverse of A explicitly. Thus we only calculate the action of A. If you > look in > -ksp_view, you will see that the preconditioner for S is formed from A_11, > which it sounds like is 0 in your case, so the LU of that is a crud > preconditioner. > Once you wrap the solve in GMRES, it will eventually converge. > > You can try using the LSC stuff if you do not have a preconditioner > matrix > for the Schur complement. > > Thanks, > > Matt > > >> Curiously if I use the following options: >> >> -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur >> -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 >> -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type gmres >> -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type >> lu >> >> then the global gmres converges in two iterations. >> >> So using a pair of ksp gmres/pc lu on the A00 block and the Schur >> complements works, but using lu directly doesn't. >> >> Because I think that all this is quite strange, I decided to dump some >> matrices out. Namely, I dumped the complete FEM jacobian, I also do a >> MatView on jac->B, jac->C and the result of KSPGetOperators on kspA. These >> returns three out of the four blocks needed to do the Schur complement. >> They are correct and I assume that the last block is also correct. >> When I import jac->B, jac->C and the matrix corresponding to kspA in >> MATLAB to compute the inverse of the Schur complement and pass it to gmres >> as preconditioner the problem is solved in 1 iteration. 
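A minimal sketch of how the "selfp" choice above could be selected from code rather than from the command line, assuming the two-argument PetscOptionsSetValue() of the petsc-3.4/dev era; ksp is the outer solver, the field grouping is copied from the runs quoted in this thread, and everything else in the sketch is made up:

  #include <petscksp.h>

  /* Sketch (not the thread author's actual code): push the fieldsplit/Schur
     options discussed above into the options database before
     KSPSetFromOptions().  "selfp" asks PCFIELDSPLIT to build an explicit
     approximation of the Schur complement from the diagonal of A00, so the
     inner LU has an actual matrix to factor. */
  PetscErrorCode SetSchurOptions(KSP ksp)
  {
    PetscErrorCode ierr;

    ierr = PetscOptionsSetValue("-pc_type","fieldsplit");CHKERRQ(ierr);
    ierr = PetscOptionsSetValue("-pc_fieldsplit_type","schur");CHKERRQ(ierr);
    ierr = PetscOptionsSetValue("-pc_fieldsplit_schur_factorization_type","full");CHKERRQ(ierr);
    ierr = PetscOptionsSetValue("-pc_fieldsplit_0_fields","2,3");CHKERRQ(ierr);  /* grouping from the runs above */
    ierr = PetscOptionsSetValue("-pc_fieldsplit_1_fields","0,1");CHKERRQ(ierr);
    ierr = PetscOptionsSetValue("-pc_fieldsplit_schur_precondition","selfp");CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    return 0;
  }

The same effect comes from simply appending -pc_fieldsplit_schur_precondition selfp to the option lists already shown in this thread.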
>> >> So MATLAB says: >> >> -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur >> -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_0_fields 2,3 >> -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type >> preonly -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_type >> preonly -fieldsplit_1_pc_type lu >> >> should yield only one iteration (maybe two depending on implementation). >> >> Any ideas why the Petsc doesn't solve this correctly? >> >> Best, >> Luc >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Mar 11 12:51:11 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 11 Mar 2014 12:51:11 -0500 Subject: [petsc-users] petsc malloc multidimensional array In-Reply-To: References: <9731bb.23bae.144b15d1aad.Coremail.luchao@mail.iggcas.ac.cn> <3c3192.23cc0.144b17694a0.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: <29A791DF-5930-4CA8-88DE-50402A3E575C@mcs.anl.gov> On Mar 11, 2014, at 9:22 AM, Matthew Knepley wrote: > On Tue, Mar 11, 2014 at 9:05 AM, ?? wrote: > Hi, Matthew: > > Thank you for your reply so fast! but I also have some questions: > > 2d arrays I used is just intermediate variable, not for fields, and the fields is used Vector. In Finite element method, when I use element stiffness matrix to assemble global stiffness matrix, I always first compute the 2d element stiffness matrixs whose size is 512*512(inner points in element),so big for static arrays. If the 512 is a fixed number, not different for different elements you can in C 89 standard use mysubroutine(?.) double element[512][512]; ??. Use element and even pass element to MatSetValues(). In C99 you can even have the element size be a runtime value and do mysubroutine(int N,?..) double element[N][N]; ?? Use element and even pass element to MatSetValues(). C will automatically handle the allocation of the needed space and it will be efficient. Note that C will allocate the space as needed inside the routine, it is not kept around outside the routine. No need to use PETSc?s routines to allocate the space. Barry > So i want to use PetscMalloc to bulid 2d arrays to store element stiffness matrixs' values. And I don't know how to do, could do tell me? > > AGAIN, these are C questions. You could > > a) malloc() each row > > b) malloc() the whole thing, and make pointers to each row > > Notice that this is enormous for an element matrix. This is likely not optimal for performance. > > Matt > > then use another 1d array to abstract the nonzero values from 2d arrays above-mentioned. could do you please tell me some other methods much more convenient and faster? > > > > -----????----- > ???: "Matthew Knepley" > ????: 2014?3?11? ??? > ???: "??" > ??: petsc-users > ??: Re: [petsc-users] petsc malloc multidimensional array > > On Tue, Mar 11, 2014 at 8:37 AM, ?? wrote: > Hi, recently,when I use PETSc to bulid 2d arrays such as PetscScalar A[512][512],B[512][512],C,D,E,F,..., program always has error of Segmentation Violation. So I want to use PetscMalloc to bulid 2d array, and I hope that I can also use these 2d array A[i][j] by subscripts as before. 
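For concreteness, Barry's C99 suggestion above might look like the sketch below; the zero-fill loop stands in for the real element integrals, rows[] (the element's global dof indices) is a made-up argument, and the 2 MiB figure assumes 8-byte scalars at N = 512:

  #include <petscmat.h>

  /* Sketch: N is a runtime value, element[N][N] is a C99 variable-length array
     on the stack, and the whole block is passed straight to MatSetValues(). */
  PetscErrorCode AssembleElement(Mat K, PetscInt N, const PetscInt rows[])
  {
    PetscErrorCode ierr;
    PetscScalar    element[N][N];   /* roughly 2 MiB for N = 512; see the stack-size caveat further down in the thread */
    PetscInt       i, j;

    for (i = 0; i < N; i++)
      for (j = 0; j < N; j++)
        element[i][j] = 0.0;        /* ...fill with the actual element contributions... */

    ierr = MatSetValues(K, N, rows, N, rows, &element[0][0], ADD_VALUES);CHKERRQ(ierr);
    return 0;
  }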
Could do please tell me how can I do? Thank you. > > 1) This is a C question, not a PETSc question > > 2) If you are using 2D arrays for fields, you should be using the DMDA, which has a section in the manual > > Thanks, > > Matt > > LV CHAO > > 2014/3/11 > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener From jed at jedbrown.org Tue Mar 11 13:28:57 2014 From: jed at jedbrown.org (Jed Brown) Date: Tue, 11 Mar 2014 12:28:57 -0600 Subject: [petsc-users] petsc malloc multidimensional array In-Reply-To: <29A791DF-5930-4CA8-88DE-50402A3E575C@mcs.anl.gov> References: <9731bb.23bae.144b15d1aad.Coremail.luchao@mail.iggcas.ac.cn> <3c3192.23cc0.144b17694a0.Coremail.luchao@mail.iggcas.ac.cn> <29A791DF-5930-4CA8-88DE-50402A3E575C@mcs.anl.gov> Message-ID: <87vbvk4mja.fsf@jedbrown.org> Barry Smith writes: > On Mar 11, 2014, at 9:22 AM, Matthew Knepley wrote: > >> On Tue, Mar 11, 2014 at 9:05 AM, ?? wrote: >> Hi, Matthew: >> >> Thank you for your reply so fast! but I also have some questions: >> >> 2d arrays I used is just intermediate variable, not for fields, and the fields is used Vector. In Finite element method, when I use element stiffness matrix to assemble global stiffness matrix, I always first compute the 2d element stiffness matrixs whose size is 512*512(inner points in element),so big for static arrays. > > If the 512 is a fixed number, not different for different elements you can in C 89 standard use > > mysubroutine(?.) > double element[512][512]; This is fairly "big" -- 2 MiB of what is typically an 8 MiB stack size, so if you allocate several of these, you may get a stack overflow. You can increase the stack size on most systems, but that makes it more complicated to run your program. I would allocate dynamically if you are worried about this. If you want to avoid PETSc array functions, you can allocate dynamically and access via a pointer: double *mem = malloc(M*N*sizeof(double)); double (*p)[N] = (double (*)[N])mem; // access p[i][j] free(mem); -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From gbisht at lbl.gov Tue Mar 11 16:59:13 2014 From: gbisht at lbl.gov (Gautam Bisht) Date: Tue, 11 Mar 2014 14:59:13 -0700 Subject: [petsc-users] Solution history in TS Message-ID: Hi, I'm trying to solve groundwater flow equations using TS with SUNDIALS. The domain for my problem is 3D, but the system will be solved as a collection of independent 1D soil columns within the subsurface domain. I'm trying to understand if I need TS corresponding to each soil column or can simply use a single TS and reuse the single TS for all soil columns. Does TS keeps history of solution between two calls to TSSolve()? If yes, then I would need TS for each column, otherwise I might get away with just a single TS. I appreciate your input on this topic. Thanks, -Gautam. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jed at jedbrown.org Tue Mar 11 17:15:50 2014 From: jed at jedbrown.org (Jed Brown) Date: Tue, 11 Mar 2014 16:15:50 -0600 Subject: [petsc-users] Solution history in TS In-Reply-To: References: Message-ID: <8738io4c15.fsf@jedbrown.org> Gautam Bisht writes: > Hi, > > I'm trying to solve groundwater flow equations using TS with SUNDIALS. The > domain for my problem is 3D, but the system will be solved as a collection > of independent 1D soil columns within the subsurface domain. I'm trying to > understand if I need TS corresponding to each soil column or can simply use > a single TS and reuse the single TS for all soil columns. Does TS keeps > history of solution between two calls to TSSolve()? If yes, then I would > need TS for each column, otherwise I might get away with just a single TS. > I appreciate your input on this topic. You can take a new Vec and call TSSolve. So you could reuse by sequentially running through columns. Note that you might want to do multiple columns at a time in order to vectorize. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From luchao at mail.iggcas.ac.cn Wed Mar 12 06:38:43 2014 From: luchao at mail.iggcas.ac.cn (=?utf-8?B?5ZCV6LaF?=) Date: Wed, 12 Mar 2014 19:38:43 +0800 (GMT+08:00) Subject: [petsc-users] petsc malloc multidimensional array In-Reply-To: <87vbvk4mja.fsf@jedbrown.org> References: <9731bb.23bae.144b15d1aad.Coremail.luchao@mail.iggcas.ac.cn> <3c3192.23cc0.144b17694a0.Coremail.luchao@mail.iggcas.ac.cn> <29A791DF-5930-4CA8-88DE-50402A3E575C@mcs.anl.gov> <87vbvk4mja.fsf@jedbrown.org> Message-ID: <15b7346.27171.144b6166011.Coremail.luchao@mail.iggcas.ac.cn> Many thanks for all your explanation, its really help me! Now my program run smoothly, and the result is satisfactory! For 3d results, I now make drawings by many slices in matlab. And I find it not visualized, so could do you please tell me how can I draw these 3d space fileds finely? > -----????----- > ???: "Jed Brown" > ????: 2014?3?12? ??? > ???: "Barry Smith" , "Matthew Knepley" > ??: "??" , petsc-users > ??: Re: [petsc-users] petsc malloc multidimensional array > > Barry Smith writes: > > > On Mar 11, 2014, at 9:22 AM, Matthew Knepley wrote: > > > >> On Tue, Mar 11, 2014 at 9:05 AM, ?? wrote: > >> Hi, Matthew: > >> > >> Thank you for your reply so fast! but I also have some questions: > >> > >> 2d arrays I used is just intermediate variable, not for fields, and the fields is used Vector. In Finite element method, when I use element stiffness matrix to assemble global stiffness matrix, I always first compute the 2d element stiffness matrixs whose size is 512*512(inner points in element),so big for static arrays. > > > > If the 512 is a fixed number, not different for different elements you can in C 89 standard use > > > > mysubroutine(?.) > > double element[512][512]; > > This is fairly "big" -- 2 MiB of what is typically an 8 MiB stack size, > so if you allocate several of these, you may get a stack overflow. You > can increase the stack size on most systems, but that makes it more > complicated to run your program. I would allocate dynamically if you > are worried about this. 
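On the original question in this thread (a PetscMalloc'd block that can still be indexed as A[i][j]), a sketch of option (b) from the earlier reply, written against the petsc-3.4-style PetscMalloc(size,&ptr); the function names are made up:

  #include <petscsys.h>

  /* Sketch: one contiguous block of m*n scalars plus an array of m row
     pointers into it, so a[i][j] indexing works as usual.  Both pieces come
     from PetscMalloc() and therefore must be released with PetscFree(),
     not free(). */
  PetscErrorCode Array2dCreate(PetscInt m, PetscInt n, PetscScalar ***a)
  {
    PetscErrorCode ierr;
    PetscScalar    *data, **rows;
    PetscInt       i;

    ierr = PetscMalloc(m*n*sizeof(PetscScalar), &data);CHKERRQ(ierr);
    ierr = PetscMalloc(m*sizeof(PetscScalar*), &rows);CHKERRQ(ierr);
    for (i = 0; i < m; i++) rows[i] = data + i*n;
    *a = rows;
    return 0;
  }

  PetscErrorCode Array2dDestroy(PetscScalar **a)
  {
    PetscErrorCode ierr;
    PetscScalar    *data = a[0];    /* the contiguous block starts at row 0 */

    ierr = PetscFree(a);CHKERRQ(ierr);
    ierr = PetscFree(data);CHKERRQ(ierr);
    return 0;
  }

Typical use: PetscScalar **A; Array2dCreate(512, 512, &A); ... A[i][j] ... ; Array2dDestroy(A);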
> > If you want to avoid PETSc array functions, you can allocate dynamically > and access via a pointer: > > double *mem = malloc(M*N*sizeof(double)); > double (*p)[N] = (double (*)[N])mem; > // access p[i][j] > free(mem); -------------- next part -------------- An HTML attachment was scrubbed... URL: From lawrence.mitchell at imperial.ac.uk Wed Mar 12 06:44:06 2014 From: lawrence.mitchell at imperial.ac.uk (Lawrence Mitchell) Date: Wed, 12 Mar 2014 11:44:06 +0000 Subject: [petsc-users] Null spaces with fieldsplit preconditioning Message-ID: <53204886.3080209@imperial.ac.uk> Hello, I have a mixed FEM system, for which the null space is constant functions in one of the sub spaces. That is, I have W = V * Q and the null space is a constant function in Q (and zero in V). If I assemble the operator, I get a block structure: X = [A B, C D] When I solve this monolithically, I project the null space out by attaching it to X. However, if I use Schur complement preconditioning, then S = D - C Ainv B has a null space of the constant functions as well. Do I therefore need to attach null spaces to both X and S in this case, or will attaching it to X be enough? If I do need to hang something on S, is there an easy way to do it? Cheers, Lawrence From knepley at gmail.com Wed Mar 12 07:03:53 2014 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 12 Mar 2014 07:03:53 -0500 Subject: [petsc-users] petsc malloc multidimensional array In-Reply-To: <15b7346.27171.144b6166011.Coremail.luchao@mail.iggcas.ac.cn> References: <9731bb.23bae.144b15d1aad.Coremail.luchao@mail.iggcas.ac.cn> <3c3192.23cc0.144b17694a0.Coremail.luchao@mail.iggcas.ac.cn> <29A791DF-5930-4CA8-88DE-50402A3E575C@mcs.anl.gov> <87vbvk4mja.fsf@jedbrown.org> <15b7346.27171.144b6166011.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: On Wed, Mar 12, 2014 at 6:38 AM, ?? wrote: > Many thanks for all your explanation, its really help me! > > Now my program run smoothly, and the result is satisfactory! > > For 3d results, I now make drawings by many slices in matlab. > > And I find it not visualized, so could do you please tell me how can I > draw these 3d space fileds finely? > Consider using http://www.paraview.org/ Matt > > > -----????----- > > ???: "Jed Brown" > > ????: 2014?3?12? ??? > > ???: "Barry Smith" , "Matthew Knepley" < > knepley at gmail.com> > > ??: "??" , petsc-users < > petsc-users at mcs.anl.gov> > > ??: Re: [petsc-users] petsc malloc multidimensional array > > > > Barry Smith writes: > > > > > On Mar 11, 2014, at 9:22 AM, Matthew Knepley > wrote: > > > > > >> On Tue, Mar 11, 2014 at 9:05 AM, ?? wrote: > > >> Hi, Matthew: > > >> > > >> Thank you for your reply so fast! but I also have some questions: > > >> > > > >> 2d arrays I used is just intermediate variable, not for fields, and the fields is used Vector. In Finite element method, when I use element stiffness matrix to assemble global stiffness matrix, I always first compute the 2d element stiffness matrixs whose size is 512*512(inner points in element),so big for static arrays. > > > > > > > If the 512 is a fixed number, not different for different elements you can in C 89 standard use > > > > > > mysubroutine(?.) > > > double element[512][512]; > > > > This is fairly "big" -- 2 MiB of what is typically an 8 MiB stack size, > > so if you allocate several of these, you may get a stack overflow. You > > can increase the stack size on most systems, but that makes it more > > complicated to run your program. I would allocate dynamically if you > > are worried about this. 
> > > > If you want to avoid PETSc array functions, you can allocate dynamically > > and access via a pointer: > > > > double *mem = malloc(M*N*sizeof(double)); > > double (*p)[N] = (double (*)[N])mem; > > // access p[i][j] > > free(mem); > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Mar 12 07:10:08 2014 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 12 Mar 2014 07:10:08 -0500 Subject: [petsc-users] Null spaces with fieldsplit preconditioning In-Reply-To: <53204886.3080209@imperial.ac.uk> References: <53204886.3080209@imperial.ac.uk> Message-ID: On Wed, Mar 12, 2014 at 6:44 AM, Lawrence Mitchell < lawrence.mitchell at imperial.ac.uk> wrote: > Hello, > > I have a mixed FEM system, for which the null space is constant > functions in one of the sub spaces. That is, I have W = V * Q and the > null space is a constant function in Q (and zero in V). If I assemble > the operator, I get a block structure: > > X = [A B, > C D] > > When I solve this monolithically, I project the null space out by > attaching it to X. However, if I use Schur complement preconditioning, > then S = D - C Ainv B has a null space of the constant functions as > well. Do I therefore need to attach null spaces to both X and S in this > case, or will attaching it to X be enough? If I do need to hang > something on S, is there an easy way to do it? > Here are two ways to do this, depending on how you manage your fields. At the lowest level, an IS defines the field in PCFIELDSPLIT. So you can attach your MatNullSpace that way PetscObjectCompose(isD, "nullspace", nullspace) of if you use a DM, you can say DMGetField(dm, 1, &obj); PetscObjectCompose(obj, "nullspace", nullspace) This is also how we pass in "nearsullspace" for AMG, and "pmat" for a separate preconditioning operator. Thanks, Matt Cheers, > > Lawrence > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mc0710 at gmail.com Wed Mar 12 14:48:37 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Wed, 12 Mar 2014 14:48:37 -0500 Subject: [petsc-users] Restart capability in TS Message-ID: Hi, Is there any inbuilt routine/switch to restart a run from a .vts/binary file using the TS module? I can write one in my code but I just wanted to check if such a capability already exists. Thanks, Mani -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Mar 12 14:52:00 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 12 Mar 2014 13:52:00 -0600 Subject: [petsc-users] Restart capability in TS In-Reply-To: References: Message-ID: <87y50f19gf.fsf@jedbrown.org> Mani Chandra writes: > Hi, > > Is there any inbuilt routine/switch to restart a run from a .vts/binary > file using the TS module? I can write one in my code but I just wanted to > check if such a capability already exists. There is a TSLoad in 'master', but most simulations have to load up lots of other data structures, so I think most people will just VecLoad their solution and TSSolve. -------------- next part -------------- A non-text attachment was scrubbed... 
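In code, the VecLoad-then-TSSolve restart suggested just above might look like this rough sketch; the checkpoint file name and the restart time are placeholders, the state is assumed to have been written earlier with VecView() on a PETSc binary viewer, and (as noted above) any other data structures the simulation needs must be rebuilt separately:

  #include <petscts.h>

  PetscErrorCode RestartFromCheckpoint(TS ts, Vec u)
  {
    PetscErrorCode ierr;
    PetscViewer    viewer;

    ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "checkpoint.bin", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
    ierr = VecLoad(u, viewer);CHKERRQ(ierr);          /* overwrite u with the saved solution */
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
    ierr = TSSetTime(ts, 1.0);CHKERRQ(ierr);          /* placeholder: the time at which the checkpoint was written */
    ierr = TSSolve(ts, u);CHKERRQ(ierr);              /* continue stepping from that state */
    return 0;
  }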
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From bsmith at mcs.anl.gov Wed Mar 12 18:57:13 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 12 Mar 2014 18:57:13 -0500 Subject: [petsc-users] Restart capability in TS In-Reply-To: <87y50f19gf.fsf@jedbrown.org> References: <87y50f19gf.fsf@jedbrown.org> Message-ID: On Mar 12, 2014, at 2:52 PM, Jed Brown wrote: > Mani Chandra writes: > >> Hi, >> >> Is there any inbuilt routine/switch to restart a run from a .vts/binary >> file using the TS module? I can write one in my code but I just wanted to >> check if such a capability already exists. > > There is a TSLoad in 'master', but most simulations have to load up lots > of other data structures, so I think most people will just VecLoad their > solution and TSSolve. My hope is eventually to provide a way for users to register all the ?data times? (auxiliary vectors, meshes, ?) that need to be stored at the end and then have TSLoad be able to mange loading everything back up and transparently continuing the TS time stepping. If you have want to try the TSView() and TSLoad() for binary viewers in master I can provide guidance. It is in src/ts/examples/tutorials/ex28.c in the master branch. Barry From mc0710 at gmail.com Thu Mar 13 01:00:39 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Thu, 13 Mar 2014 01:00:39 -0500 Subject: [petsc-users] Problems with nonlinear convergence when density or pressure is too low Message-ID: Hi, I'm trying to solve for accretion flows around blackholes using the theta method in TS. The accretion disk is surrounded by a very low density and pressure "atmosphere". The lower the values of the density and pressure in the atmosphere, the harder it is becoming for the nonlinear convergence. Eventually, it leads to diverging linesearches and the time step becomes very small. Is there a way to deal with very low regions of density and pressure and avoid killing the nonlinear convergence? Thanks, Mani -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Thu Mar 13 01:09:12 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 13 Mar 2014 00:09:12 -0600 Subject: [petsc-users] Problems with nonlinear convergence when density or pressure is too low In-Reply-To: References: Message-ID: <874n321vg7.fsf@jedbrown.org> Mani Chandra writes: > Hi, > > I'm trying to solve for accretion flows around blackholes using the theta > method in TS. The accretion disk is surrounded by a very low density and > pressure "atmosphere". The lower the values of the density and pressure in > the atmosphere, the harder it is becoming for the nonlinear convergence. > Eventually, it leads to diverging linesearches and the time step becomes > very small. > > Is there a way to deal with very low regions of density and pressure and > avoid killing the nonlinear convergence? Large-amplitude dynamics with near-vacuum states/rarefied gasses are hard for lots of reasons, including nonlinear convergence if you choose an implicit method. You might be able to pull most of the nonlinearity out of the solve using an IMEX method, but there's no free lunch. -------------- next part -------------- A non-text attachment was scrubbed... 
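Returning to the null-space thread above, a minimal sketch of the IS route described there; isQ is assumed to be the index set defining the field that carries the constant null space, and the function name is made up. The null space attached to the full operator X stays exactly as in the monolithic solve.

  #include <petscksp.h>

  PetscErrorCode AttachConstantNullspace(IS isQ)
  {
    PetscErrorCode ierr;
    MatNullSpace   nsp;

    /* A null space containing only the constant function on this field. */
    ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)isQ), PETSC_TRUE, 0, NULL, &nsp);CHKERRQ(ierr);
    /* Hang it on the IS that defines the split, as suggested in the reply above. */
    ierr = PetscObjectCompose((PetscObject)isQ, "nullspace", (PetscObject)nsp);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);   /* PetscObjectCompose() keeps its own reference */
    return 0;
  }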
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From luchao at mail.iggcas.ac.cn Thu Mar 13 09:09:13 2014 From: luchao at mail.iggcas.ac.cn (=?GBK?B?wsCzrA==?=) Date: Thu, 13 Mar 2014 22:09:13 +0800 (GMT+08:00) Subject: [petsc-users] malloc Message-ID: <3c8fc.2a8c6.144bbc68365.Coremail.luchao@mail.iggcas.ac.cn> Your faithfully: I use malloc to allocate memories to 1d or 2d array, if I use function free(array), error "Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range" always appear,. However, if I use PetscFree(array), there are no errors? could do you please tell me why? your sincerely LV CHAO 2014/3/13 -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Thu Mar 13 09:22:53 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 13 Mar 2014 08:22:53 -0600 Subject: [petsc-users] malloc In-Reply-To: <3c8fc.2a8c6.144bbc68365.Coremail.luchao@mail.iggcas.ac.cn> References: <3c8fc.2a8c6.144bbc68365.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: <877g7yyy82.fsf@jedbrown.org> ?? writes: > Your faithfully: > > I use malloc I presume you mean PetscMalloc. > to allocate memories to 1d or 2d array, if I use function > free(array), error "Caught signal number 11 SEGV: Segmentation > Violation, probably memory access out of range" always appear,. > > However, if I use PetscFree(array), there are no errors? could do > you please tell me why? The interface requires you to use PetscFree with PetscMallic. PetscMalloc has some debugging/alignment features by default, so you can't just pass the pointer to free(). -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From lu_qin_2000 at yahoo.com Thu Mar 13 14:43:34 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Thu, 13 Mar 2014 12:43:34 -0700 (PDT) Subject: [petsc-users] KSPSolve crash Message-ID: <1394739814.87919.YahooMailNeo@web160201.mail.bf1.yahoo.com> PETSc team, I have a program using PETSc linear solver (using KSPBCG with PCILU level 0).?But with one case it crashed?inside KSPSolve without any error message. I have tested this program with many other cases successfully. Could you debug PETSc with the linear system? I can send you the matrix and rhs if you let me know where to upload them. Many thanks, Qin??? From lu_qin_2000 at yahoo.com Thu Mar 13 14:45:32 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Thu, 13 Mar 2014 12:45:32 -0700 (PDT) Subject: [petsc-users] KSPSolve crash In-Reply-To: <1394739814.87919.YahooMailNeo@web160201.mail.bf1.yahoo.com> References: <1394739814.87919.YahooMailNeo@web160201.mail.bf1.yahoo.com> Message-ID: <1394739932.48891.YahooMailNeo@web160206.mail.bf1.yahoo.com> I forget to mention: it crashed in Win-7 only, while it runs fine in Linux. ? Qin ----- Original Message ----- From: Qin Lu To: petsc-users Cc: Sent: Thursday, March 13, 2014 2:43 PM Subject: [petsc-users] KSPSolve crash PETSc team, I have a program using PETSc linear solver (using KSPBCG with PCILU level 0).?But with one case it crashed?inside KSPSolve without any error message. I have tested this program with many other cases successfully. Could you debug PETSc with the linear system? I can send you the matrix and rhs if you let me know where to upload them. Many thanks, Qin??? 
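On the malloc question above, the rule is simply that allocation and free must come from the same family; a tiny sketch, for illustration only:

  #include <stdlib.h>
  #include <petscsys.h>

  PetscErrorCode PairingDemo(PetscInt n)
  {
    PetscErrorCode ierr;
    PetscScalar    *a;
    double         *b;

    ierr = PetscMalloc(n*sizeof(PetscScalar), &a);CHKERRQ(ierr);
    ierr = PetscFree(a);CHKERRQ(ierr);                /* PetscMalloc() pairs with PetscFree() */

    b = (double*)malloc(n*sizeof(double));
    if (!b) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_MEM, "malloc failed");
    free(b);                                          /* plain malloc() pairs with plain free() */
    return 0;
  }

Handing a PetscMalloc'd pointer to free(), or the other way around, is what produces the segmentation violation described above, because PetscMalloc() has debugging/alignment features layered on top of the raw allocation.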
From dharmareddy84 at gmail.com Thu Mar 13 15:29:48 2014 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Thu, 13 Mar 2014 15:29:48 -0500 Subject: [petsc-users] petsc configuer time Message-ID: Hello, How long does it take to configure petsc ? I understand that it depends on the options, but i am find the particular version i have is taking very long time (nearly 2 hours) before it begins configuring packages. I am using intel MPI and intel compilers. I am using the following config opts: PETSC_VERSION = petsc-3.4.3 MPICC=mpiicc MPIF90=mpiifort MPICXX=mpiicpc COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" FOPTFLAGS="$(O_LEVEL)" # COMPILERS = --with-mpi-dir=$(MPI_HOME) BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) PETSCExtPackagePath = /home/reddy/libs/petsc METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz SCALPACKINC=$(MKLHOME)/include SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a $(MKLROOT)/lib/intel64/libmkl_core.a $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" #BLACSINC=$(MKLHOME)/include #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) --with-scalapack-include=$(SCALPACKINC) --with-scalapack-lib=$(SCALPACKLIB) #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) ### configure command ./configure --with-scalar-type=real $(confOptsCommon) From bsmith at mcs.anl.gov Thu Mar 13 15:55:21 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 13 Mar 2014 15:55:21 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: References: Message-ID: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> The long time is pretty much always due to a slow file system (it takes about 3 minutes with my laptop using the local disk) but on a desktop machine using a network file system it can take up to 20 minutes. We generally always build on a local disk; since disk space is so cheap now pretty much any machine has gigabytes free of disk space that you can use to build on. I think two hours is totally unacceptably long. What type of system are you building on and where is the file system? My guess is /home/reddy is off on some slow filesystem away from the machine you are compiling on. Barry On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy wrote: > Hello, > How long does it take to configure petsc ? I understand that > it depends on the options, but i am find the particular version i have > is taking very long time (nearly 2 hours) before it begins configuring > packages. > > I am using intel MPI and intel compilers. 
> > I am using the following config opts: > PETSC_VERSION = petsc-3.4.3 > MPICC=mpiicc > MPIF90=mpiifort > MPICXX=mpiicpc > COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" > --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" > FOPTFLAGS="$(O_LEVEL)" > # COMPILERS = --with-mpi-dir=$(MPI_HOME) > > BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) > PETSCExtPackagePath = /home/reddy/libs/petsc > METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz > MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz > PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz > SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz > SCALPACKINC=$(MKLHOME)/include > SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a > -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a > $(MKLROOT)/lib/intel64/libmkl_core.a > $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group > $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" > #BLACSINC=$(MKLHOME)/include > #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a > confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 > --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 > --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) > --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) > --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) > --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) > --with-scalapack-include=$(SCALPACKINC) > --with-scalapack-lib=$(SCALPACKLIB) > #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) > > ### configure command > ./configure --with-scalar-type=real $(confOptsCommon) From bsmith at mcs.anl.gov Thu Mar 13 15:57:05 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 13 Mar 2014 15:57:05 -0500 Subject: [petsc-users] KSPSolve crash In-Reply-To: <1394739932.48891.YahooMailNeo@web160206.mail.bf1.yahoo.com> References: <1394739814.87919.YahooMailNeo@web160201.mail.bf1.yahoo.com> <1394739932.48891.YahooMailNeo@web160206.mail.bf1.yahoo.com> Message-ID: <41A0D070-A1EA-41CC-9E8E-D49D96E8236C@mcs.anl.gov> How did it crash? Absolutely nothing printed to the screen, the program just ended? Please send any output. Since it worked on linux it is likely something specific to the windows machine like lack of memory, a compiler bug, ?. Does it always crash at the same place, can you run it in the debugger on the windows machine and where does it end up? Barry On Mar 13, 2014, at 2:45 PM, Qin Lu wrote: > I forget to mention: it crashed in Win-7 only, while it runs fine in Linux. > > Qin > > > ----- Original Message ----- > From: Qin Lu > To: petsc-users > Cc: > Sent: Thursday, March 13, 2014 2:43 PM > Subject: [petsc-users] KSPSolve crash > > PETSc team, > > I have a program using PETSc linear solver (using KSPBCG with PCILU level 0). But with one case it crashed inside KSPSolve without any error message. I have tested this program with many other cases successfully. > > Could you debug PETSc with the linear system? I can send you the matrix and rhs if you let me know where to upload them. 
> > Many thanks, > Qin From lu_qin_2000 at yahoo.com Thu Mar 13 16:02:54 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Thu, 13 Mar 2014 14:02:54 -0700 (PDT) Subject: [petsc-users] KSPSolve crash In-Reply-To: <41A0D070-A1EA-41CC-9E8E-D49D96E8236C@mcs.anl.gov> References: <1394739814.87919.YahooMailNeo@web160201.mail.bf1.yahoo.com> <1394739932.48891.YahooMailNeo@web160206.mail.bf1.yahoo.com> <41A0D070-A1EA-41CC-9E8E-D49D96E8236C@mcs.anl.gov> Message-ID: <1394744574.24008.YahooMailNeo@web160203.mail.bf1.yahoo.com> Yes, it crashed without any printout. It always crashes at the first call to KSPSolve. Currently I only has a release version of petsc lib although my program has debug version. ? Thanks, Qin ----- Original Message ----- From: Barry Smith To: Qin Lu Cc: petsc-users Sent: Thursday, March 13, 2014 3:57 PM Subject: Re: [petsc-users] KSPSolve crash ? How did it crash? Absolutely nothing printed to the screen, the program just ended? Please send any output. Since it worked on linux it is likely something specific to the windows machine like lack of memory, a compiler bug, ?.? Does it always crash at the same place, can you run it in the debugger on the windows machine and where does it end up? ? Barry On Mar 13, 2014, at 2:45 PM, Qin Lu wrote: > I forget to mention: it crashed in Win-7 only, while it runs fine in Linux. >? > Qin > > > ----- Original Message ----- > From: Qin Lu > To: petsc-users > Cc: > Sent: Thursday, March 13, 2014 2:43 PM > Subject: [petsc-users] KSPSolve crash > > PETSc team, > > I have a program using PETSc linear solver (using KSPBCG with PCILU level 0). But with one case it crashed inside KSPSolve without any error message. I have tested this program with many other cases successfully. > > Could you debug PETSc with the linear system? I can send you the matrix and rhs if you let me know where to upload them. > > Many thanks, > Qin? ? ? From dharmareddy84 at gmail.com Thu Mar 13 16:03:43 2014 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Thu, 13 Mar 2014 16:03:43 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> Message-ID: Yes, my home directory is mounted on nfs. And i have configured and installed petsc many times on my laptop and TACC stampede (which also has my home directory mounted on network file system). But the particular computer that i am working on now has been extremely slow when it comes to petsc configure. Any suggestions on how i can fix this ? I do not have a choice of not having my home on nfs. Otherwise, i do not see big disk i/o impact even when i visualize large ( > 100 MB ) files for visualization. On Thu, Mar 13, 2014 at 3:55 PM, Barry Smith wrote: > > The long time is pretty much always due to a slow file system (it takes about 3 minutes with my laptop using the local disk) but on a desktop machine using a network file system it can take up to 20 minutes. We generally always build on a local disk; since disk space is so cheap now pretty much any machine has gigabytes free of disk space that you can use to build on. > > I think two hours is totally unacceptably long. What type of system are you building on and where is the file system? My guess is /home/reddy is off on some slow filesystem away from the machine you are compiling on. > > Barry > > On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy wrote: > >> Hello, >> How long does it take to configure petsc ? 
I understand that >> it depends on the options, but i am find the particular version i have >> is taking very long time (nearly 2 hours) before it begins configuring >> packages. >> >> I am using intel MPI and intel compilers. >> >> I am using the following config opts: >> PETSC_VERSION = petsc-3.4.3 >> MPICC=mpiicc >> MPIF90=mpiifort >> MPICXX=mpiicpc >> COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" >> --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" >> FOPTFLAGS="$(O_LEVEL)" >> # COMPILERS = --with-mpi-dir=$(MPI_HOME) >> >> BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) >> PETSCExtPackagePath = /home/reddy/libs/petsc >> METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz >> MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz >> PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz >> SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz >> SCALPACKINC=$(MKLHOME)/include >> SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a >> -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a >> $(MKLROOT)/lib/intel64/libmkl_core.a >> $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group >> $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" >> #BLACSINC=$(MKLHOME)/include >> #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a >> confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 >> --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 >> --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) >> --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) >> --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) >> --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) >> --with-scalapack-include=$(SCALPACKINC) >> --with-scalapack-lib=$(SCALPACKLIB) >> #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) >> >> ### configure command >> ./configure --with-scalar-type=real $(confOptsCommon) > From dharmareddy84 at gmail.com Thu Mar 13 16:17:08 2014 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Thu, 13 Mar 2014 16:17:08 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> Message-ID: Also, I find it very slow even when i try to configure in local /tmp folder. How can i diagnose this ? On Thu, Mar 13, 2014 at 4:03 PM, Dharmendar Reddy wrote: > Yes, my home directory is mounted on nfs. And i have configured and > installed petsc many times on my laptop and TACC stampede (which also > has my home directory mounted on network file system). But the > particular computer that i am working on now has been extremely slow > when it comes to petsc configure. Any suggestions on how i can fix > this ? I do not have a choice of not having my home on nfs. > > > Otherwise, i do not see big disk i/o impact even when i visualize > large ( > 100 MB ) files for visualization. > > > > On Thu, Mar 13, 2014 at 3:55 PM, Barry Smith wrote: >> >> The long time is pretty much always due to a slow file system (it takes about 3 minutes with my laptop using the local disk) but on a desktop machine using a network file system it can take up to 20 minutes. We generally always build on a local disk; since disk space is so cheap now pretty much any machine has gigabytes free of disk space that you can use to build on. >> >> I think two hours is totally unacceptably long. What type of system are you building on and where is the file system? 
My guess is /home/reddy is off on some slow filesystem away from the machine you are compiling on. >> >> Barry >> >> On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy wrote: >> >>> Hello, >>> How long does it take to configure petsc ? I understand that >>> it depends on the options, but i am find the particular version i have >>> is taking very long time (nearly 2 hours) before it begins configuring >>> packages. >>> >>> I am using intel MPI and intel compilers. >>> >>> I am using the following config opts: >>> PETSC_VERSION = petsc-3.4.3 >>> MPICC=mpiicc >>> MPIF90=mpiifort >>> MPICXX=mpiicpc >>> COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" >>> --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" >>> FOPTFLAGS="$(O_LEVEL)" >>> # COMPILERS = --with-mpi-dir=$(MPI_HOME) >>> >>> BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) >>> PETSCExtPackagePath = /home/reddy/libs/petsc >>> METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz >>> MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz >>> PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz >>> SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz >>> SCALPACKINC=$(MKLHOME)/include >>> SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a >>> -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a >>> $(MKLROOT)/lib/intel64/libmkl_core.a >>> $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group >>> $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" >>> #BLACSINC=$(MKLHOME)/include >>> #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a >>> confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 >>> --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 >>> --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) >>> --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) >>> --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) >>> --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) >>> --with-scalapack-include=$(SCALPACKINC) >>> --with-scalapack-lib=$(SCALPACKLIB) >>> #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) >>> >>> ### configure command >>> ./configure --with-scalar-type=real $(confOptsCommon) >> From bsmith at mcs.anl.gov Thu Mar 13 16:22:34 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 13 Mar 2014 16:22:34 -0500 Subject: [petsc-users] KSPSolve crash In-Reply-To: <1394744574.24008.YahooMailNeo@web160203.mail.bf1.yahoo.com> References: <1394739814.87919.YahooMailNeo@web160201.mail.bf1.yahoo.com> <1394739932.48891.YahooMailNeo@web160206.mail.bf1.yahoo.com> <41A0D070-A1EA-41CC-9E8E-D49D96E8236C@mcs.anl.gov> <1394744574.24008.YahooMailNeo@web160203.mail.bf1.yahoo.com> Message-ID: Compile in debug mode and run in the debugger (visual studio has it built in). On Mar 13, 2014, at 4:02 PM, Qin Lu wrote: > Yes, it crashed without any printout. It always crashes at the first call to KSPSolve. Currently I only has a release version of petsc lib although my program has debug version. > > Thanks, > Qin > > > ----- Original Message ----- > From: Barry Smith > To: Qin Lu > Cc: petsc-users > Sent: Thursday, March 13, 2014 3:57 PM > Subject: Re: [petsc-users] KSPSolve crash > > > How did it crash? Absolutely nothing printed to the screen, the program just ended? Please send any output. Since it worked on linux it is likely something specific to the windows machine like lack of memory, a compiler bug, ?. 
Does it always crash at the same place, can you run it in the debugger on the windows machine and where does it end up? > > Barry > > > On Mar 13, 2014, at 2:45 PM, Qin Lu wrote: > >> I forget to mention: it crashed in Win-7 only, while it runs fine in Linux. >> >> Qin >> >> >> ----- Original Message ----- >> From: Qin Lu >> To: petsc-users >> Cc: >> Sent: Thursday, March 13, 2014 2:43 PM >> Subject: [petsc-users] KSPSolve crash >> >> PETSc team, >> >> I have a program using PETSc linear solver (using KSPBCG with PCILU level 0). But with one case it crashed inside KSPSolve without any error message. I have tested this program with many other cases successfully. >> >> Could you debug PETSc with the linear system? I can send you the matrix and rhs if you let me know where to upload them. >> >> Many thanks, >> Qin From balay at mcs.anl.gov Thu Mar 13 16:23:57 2014 From: balay at mcs.anl.gov (Balay, Satish) Date: Thu, 13 Mar 2014 21:23:57 +0000 Subject: [petsc-users] petsc configuer time In-Reply-To: References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov>, Message-ID: <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25B1@BUTKUS.anl.gov> You can try " tail -f configure.log" to see where its hanging during the run Intel compilers can hang waiting for response from license server. Satish ________________________________ From: Dharmendar Reddy Sent: ?3/?13/?2014 2:03 PM To: Smith, Barry F. Cc: PETSc users list Subject: Re: [petsc-users] petsc configuer time Yes, my home directory is mounted on nfs. And i have configured and installed petsc many times on my laptop and TACC stampede (which also has my home directory mounted on network file system). But the particular computer that i am working on now has been extremely slow when it comes to petsc configure. Any suggestions on how i can fix this ? I do not have a choice of not having my home on nfs. Otherwise, i do not see big disk i/o impact even when i visualize large ( > 100 MB ) files for visualization. On Thu, Mar 13, 2014 at 3:55 PM, Barry Smith wrote: > > The long time is pretty much always due to a slow file system (it takes about 3 minutes with my laptop using the local disk) but on a desktop machine using a network file system it can take up to 20 minutes. We generally always build on a local disk; since disk space is so cheap now pretty much any machine has gigabytes free of disk space that you can use to build on. > > I think two hours is totally unacceptably long. What type of system are you building on and where is the file system? My guess is /home/reddy is off on some slow filesystem away from the machine you are compiling on. > > Barry > > On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy wrote: > >> Hello, >> How long does it take to configure petsc ? I understand that >> it depends on the options, but i am find the particular version i have >> is taking very long time (nearly 2 hours) before it begins configuring >> packages. >> >> I am using intel MPI and intel compilers. 
>> >> I am using the following config opts: >> PETSC_VERSION = petsc-3.4.3 >> MPICC=mpiicc >> MPIF90=mpiifort >> MPICXX=mpiicpc >> COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" >> --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" >> FOPTFLAGS="$(O_LEVEL)" >> # COMPILERS = --with-mpi-dir=$(MPI_HOME) >> >> BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) >> PETSCExtPackagePath = /home/reddy/libs/petsc >> METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz >> MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz >> PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz >> SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz >> SCALPACKINC=$(MKLHOME)/include >> SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a >> -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a >> $(MKLROOT)/lib/intel64/libmkl_core.a >> $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group >> $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" >> #BLACSINC=$(MKLHOME)/include >> #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a >> confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 >> --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 >> --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) >> --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) >> --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) >> --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) >> --with-scalapack-include=$(SCALPACKINC) >> --with-scalapack-lib=$(SCALPACKLIB) >> #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) >> >> ### configure command >> ./configure --with-scalar-type=real $(confOptsCommon) > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Thu Mar 13 16:25:48 2014 From: balay at mcs.anl.gov (Balay, Satish) Date: Thu, 13 Mar 2014 21:25:48 +0000 Subject: [petsc-users] KSPSolve crash In-Reply-To: References: <1394739814.87919.YahooMailNeo@web160201.mail.bf1.yahoo.com> <1394739932.48891.YahooMailNeo@web160206.mail.bf1.yahoo.com> <41A0D070-A1EA-41CC-9E8E-D49D96E8236C@mcs.anl.gov> <1394744574.24008.YahooMailNeo@web160203.mail.bf1.yahoo.com>, Message-ID: <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25DF@BUTKUS.anl.gov> Also run Linux version in valgrind ________________________________ From: Barry Smith Sent: ?3/?13/?2014 2:22 PM To: Qin Lu Cc: petsc-users Subject: Re: [petsc-users] KSPSolve crash Compile in debug mode and run in the debugger (visual studio has it built in). On Mar 13, 2014, at 4:02 PM, Qin Lu wrote: > Yes, it crashed without any printout. It always crashes at the first call to KSPSolve. Currently I only has a release version of petsc lib although my program has debug version. > > Thanks, > Qin > > > ----- Original Message ----- > From: Barry Smith > To: Qin Lu > Cc: petsc-users > Sent: Thursday, March 13, 2014 3:57 PM > Subject: Re: [petsc-users] KSPSolve crash > > > How did it crash? Absolutely nothing printed to the screen, the program just ended? Please send any output. Since it worked on linux it is likely something specific to the windows machine like lack of memory, a compiler bug, ?. Does it always crash at the same place, can you run it in the debugger on the windows machine and where does it end up? > > Barry > > > On Mar 13, 2014, at 2:45 PM, Qin Lu wrote: > >> I forget to mention: it crashed in Win-7 only, while it runs fine in Linux. 
>> >> Qin >> >> >> ----- Original Message ----- >> From: Qin Lu >> To: petsc-users >> Cc: >> Sent: Thursday, March 13, 2014 2:43 PM >> Subject: [petsc-users] KSPSolve crash >> >> PETSc team, >> >> I have a program using PETSc linear solver (using KSPBCG with PCILU level 0). But with one case it crashed inside KSPSolve without any error message. I have tested this program with many other cases successfully. >> >> Could you debug PETSc with the linear system? I can send you the matrix and rhs if you let me know where to upload them. >> >> Many thanks, >> Qin -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Mar 13 16:28:24 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 13 Mar 2014 16:28:24 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> Message-ID: <5584D2CF-886B-4FC3-BA9B-F73A570A5CBD@mcs.anl.gov> On Mar 13, 2014, at 4:03 PM, Dharmendar Reddy wrote: > Yes, my home directory is mounted on nfs. And i have configured and > installed petsc many times on my laptop and TACC stampede (which also > has my home directory mounted on network file system). But the > particular computer that i am working on now has been extremely slow > when it comes to petsc configure. Any suggestions on how i can fix > this ? I do not have a choice of not having my home on nfs. > > > Otherwise, i do not see big disk i/o impact even when i visualize > large ( > 100 MB ) files for visualization. It is many many smallish file loads back and forth during configure that can make it slow. This ?machine? I take it is not yours but is some community workstation you wish to use? Do df / on it to see what filesystems are available and how much space they have. For example I did ~/Src/petsc barry/add-snes-get-final-function-value $ df / Filesystem 512-blocks Used Available Capacity iused ifree %iused Mounted on /dev/disk0s2 975425848 398228040 576685808 41% 49842503 72085726 41% / and see I have tons of space available on /dev/disk0s2 so I do cd / and can look at what I might be able to use. You may need to ask the system administrator of the machine to give you a directory on the machine that has lots of free space to use to work with PETSc. Or are there are other workstations connected to the same fileserver that you could try and might be faster? I personally do all development on my laptop because I hate waiting around for slow filesystems. Barry > > > > On Thu, Mar 13, 2014 at 3:55 PM, Barry Smith wrote: >> >> The long time is pretty much always due to a slow file system (it takes about 3 minutes with my laptop using the local disk) but on a desktop machine using a network file system it can take up to 20 minutes. We generally always build on a local disk; since disk space is so cheap now pretty much any machine has gigabytes free of disk space that you can use to build on. >> >> I think two hours is totally unacceptably long. What type of system are you building on and where is the file system? My guess is /home/reddy is off on some slow filesystem away from the machine you are compiling on. >> >> Barry >> >> On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy wrote: >> >>> Hello, >>> How long does it take to configure petsc ? I understand that >>> it depends on the options, but i am find the particular version i have >>> is taking very long time (nearly 2 hours) before it begins configuring >>> packages. >>> >>> I am using intel MPI and intel compilers. 
>>> >>> I am using the following config opts: >>> PETSC_VERSION = petsc-3.4.3 >>> MPICC=mpiicc >>> MPIF90=mpiifort >>> MPICXX=mpiicpc >>> COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" >>> --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" >>> FOPTFLAGS="$(O_LEVEL)" >>> # COMPILERS = --with-mpi-dir=$(MPI_HOME) >>> >>> BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) >>> PETSCExtPackagePath = /home/reddy/libs/petsc >>> METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz >>> MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz >>> PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz >>> SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz >>> SCALPACKINC=$(MKLHOME)/include >>> SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a >>> -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a >>> $(MKLROOT)/lib/intel64/libmkl_core.a >>> $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group >>> $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" >>> #BLACSINC=$(MKLHOME)/include >>> #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a >>> confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 >>> --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 >>> --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) >>> --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) >>> --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) >>> --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) >>> --with-scalapack-include=$(SCALPACKINC) >>> --with-scalapack-lib=$(SCALPACKLIB) >>> #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) >>> >>> ### configure command >>> ./configure --with-scalar-type=real $(confOptsCommon) >> From dharmareddy84 at gmail.com Thu Mar 13 16:51:15 2014 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Thu, 13 Mar 2014 16:51:15 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25B1@BUTKUS.anl.gov> References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25B1@BUTKUS.anl.gov> Message-ID: I see that at every pushing language, the screen statys there for a while and the compile execute commands quickly appear and go.... On Thu, Mar 13, 2014 at 4:23 PM, Balay, Satish wrote: > You can try " tail -f configure.log" to see where its hanging during the > run > > Intel compilers can hang waiting for response from license server. > > Satish > ________________________________ > From: Dharmendar Reddy > Sent: 3/13/2014 2:03 PM > To: Smith, Barry F. > Cc: PETSc users list > Subject: Re: [petsc-users] petsc configuer time > > Yes, my home directory is mounted on nfs. And i have configured and > installed petsc many times on my laptop and TACC stampede (which also > has my home directory mounted on network file system). But the > particular computer that i am working on now has been extremely slow > when it comes to petsc configure. Any suggestions on how i can fix > this ? I do not have a choice of not having my home on nfs. > > > Otherwise, i do not see big disk i/o impact even when i visualize > large ( > 100 MB ) files for visualization. > > > > On Thu, Mar 13, 2014 at 3:55 PM, Barry Smith wrote: >> >> The long time is pretty much always due to a slow file system (it takes >> about 3 minutes with my laptop using the local disk) but on a desktop >> machine using a network file system it can take up to 20 minutes. 
We >> generally always build on a local disk; since disk space is so cheap now >> pretty much any machine has gigabytes free of disk space that you can use to >> build on. >> >> I think two hours is totally unacceptably long. What type of system are >> you building on and where is the file system? My guess is /home/reddy is off >> on some slow filesystem away from the machine you are compiling on. >> >> Barry >> >> On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy >> wrote: >> >>> Hello, >>> How long does it take to configure petsc ? I understand that >>> it depends on the options, but i am find the particular version i have >>> is taking very long time (nearly 2 hours) before it begins configuring >>> packages. >>> >>> I am using intel MPI and intel compilers. >>> >>> I am using the following config opts: >>> PETSC_VERSION = petsc-3.4.3 >>> MPICC=mpiicc >>> MPIF90=mpiifort >>> MPICXX=mpiicpc >>> COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" >>> --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" >>> FOPTFLAGS="$(O_LEVEL)" >>> # COMPILERS = --with-mpi-dir=$(MPI_HOME) >>> >>> BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) >>> PETSCExtPackagePath = /home/reddy/libs/petsc >>> METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz >>> MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz >>> PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz >>> SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz >>> SCALPACKINC=$(MKLHOME)/include >>> SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a >>> -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a >>> $(MKLROOT)/lib/intel64/libmkl_core.a >>> $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group >>> $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" >>> #BLACSINC=$(MKLHOME)/include >>> #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a >>> confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 >>> --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 >>> --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) >>> --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) >>> --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) >>> --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) >>> --with-scalapack-include=$(SCALPACKINC) >>> --with-scalapack-lib=$(SCALPACKLIB) >>> #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) >>> >>> ### configure command >>> ./configure --with-scalar-type=real $(confOptsCommon) >> From balay at mcs.anl.gov Thu Mar 13 18:04:37 2014 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 13 Mar 2014 18:04:37 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25B1@BUTKUS.anl.gov> Message-ID: Hm - configure should first print the 'executing' message - and then run the command. If its hanging at 'pusing language' message - I'm not sure what the cause is. Perhaps the python stdout-buffer-flush is off-sync. [and its icc/ifort thats hanging]. Or there is a problem with python on this machine? 2 things you can try to confirm. 1. run configure with gcc/gfortran and see if thats quicker. If so - then intel compilers are the cause for slowdown. 2. Try configure with the option --useThreads=0 and see if this makes a difference. 
[or tray a different python] Satish On Thu, 13 Mar 2014, Dharmendar Reddy wrote: > I see that at every pushing language, the screen statys there for a > while and the compile execute commands quickly appear and go.... > > > On Thu, Mar 13, 2014 at 4:23 PM, Balay, Satish wrote: > > You can try " tail -f configure.log" to see where its hanging during the > > run > > > > Intel compilers can hang waiting for response from license server. > > > > Satish > > ________________________________ > > From: Dharmendar Reddy > > Sent: 3/13/2014 2:03 PM > > To: Smith, Barry F. > > Cc: PETSc users list > > Subject: Re: [petsc-users] petsc configuer time > > > > Yes, my home directory is mounted on nfs. And i have configured and > > installed petsc many times on my laptop and TACC stampede (which also > > has my home directory mounted on network file system). But the > > particular computer that i am working on now has been extremely slow > > when it comes to petsc configure. Any suggestions on how i can fix > > this ? I do not have a choice of not having my home on nfs. > > > > > > Otherwise, i do not see big disk i/o impact even when i visualize > > large ( > 100 MB ) files for visualization. > > > > > > > > On Thu, Mar 13, 2014 at 3:55 PM, Barry Smith wrote: > >> > >> The long time is pretty much always due to a slow file system (it takes > >> about 3 minutes with my laptop using the local disk) but on a desktop > >> machine using a network file system it can take up to 20 minutes. We > >> generally always build on a local disk; since disk space is so cheap now > >> pretty much any machine has gigabytes free of disk space that you can use to > >> build on. > >> > >> I think two hours is totally unacceptably long. What type of system are > >> you building on and where is the file system? My guess is /home/reddy is off > >> on some slow filesystem away from the machine you are compiling on. > >> > >> Barry > >> > >> On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy > >> wrote: > >> > >>> Hello, > >>> How long does it take to configure petsc ? I understand that > >>> it depends on the options, but i am find the particular version i have > >>> is taking very long time (nearly 2 hours) before it begins configuring > >>> packages. > >>> > >>> I am using intel MPI and intel compilers. 
> >>> > >>> I am using the following config opts: > >>> PETSC_VERSION = petsc-3.4.3 > >>> MPICC=mpiicc > >>> MPIF90=mpiifort > >>> MPICXX=mpiicpc > >>> COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" > >>> --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" > >>> FOPTFLAGS="$(O_LEVEL)" > >>> # COMPILERS = --with-mpi-dir=$(MPI_HOME) > >>> > >>> BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) > >>> PETSCExtPackagePath = /home/reddy/libs/petsc > >>> METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz > >>> MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz > >>> PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz > >>> SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz > >>> SCALPACKINC=$(MKLHOME)/include > >>> SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a > >>> -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a > >>> $(MKLROOT)/lib/intel64/libmkl_core.a > >>> $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group > >>> $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" > >>> #BLACSINC=$(MKLHOME)/include > >>> #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a > >>> confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 > >>> --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 > >>> --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) > >>> --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) > >>> --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) > >>> --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) > >>> --with-scalapack-include=$(SCALPACKINC) > >>> --with-scalapack-lib=$(SCALPACKLIB) > >>> #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) > >>> > >>> ### configure command > >>> ./configure --with-scalar-type=real $(confOptsCommon) > >> > From mc0710 at gmail.com Thu Mar 13 18:12:15 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Thu, 13 Mar 2014 18:12:15 -0500 Subject: [petsc-users] Rescaling quantites to remove positivity constraints Message-ID: Hi, I'm trying to solve for fluid flows with large density and pressure variations using the theta method in the TS module. This inevitably leads to negative densities and pressures during the course of the evolution. In order to deal with this, I am trying to now solve for log(rho) and log(P) instead of rho and P. In order to do this, I changed my initial conditions for rho and P to log(rho) and log(P) and inside the residual evaluation function I keep the original code unchanged expect that I add the following lines above the original code: rho = exp(logrho) P = exp(logP) since I expect the solution vector x[j][i] to now have logrho and logP instead of rho and P. I now run the simulation with a modified newton krylov method and lag the jacobian for every 100 linear solves. The jacobian assembly is using colored finite differences. This goes smoothly from the point I start the simulation to the first 100 linear solves after which it needs to assemble the jacobian again. At this point the nonlinear solver crashes and I think something is wrong with the jacobian assembly. I checked that the simulation runs till it has to assemble the jacobian the second time by varying the lagging by say 10, 20, 30, ... It fails everytime it has to assemble the second time. 
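For concreteness, a minimal sketch of the residual evaluation just described (not the actual code) could look like the following, assuming a two-field DMDA layout where the solution vector holds log(rho) and log(P); the Field struct, the field names, and the physics placeholder are illustrative only:

#include <petscts.h>
#include <petscdmda.h>

/* Sketch only: the TS solution vector is assumed to carry logrho and logP,
   which are exponentiated at the top of the residual before the unchanged
   physics is evaluated. */
typedef struct { PetscScalar logrho, logP; } Field;

PetscErrorCode FormIFunction(TS ts, PetscReal t, Vec X, Vec Xdot, Vec F, void *ctx)
{
  DM             da;
  Field        **x, **f;
  PetscInt       i, j, xs, ys, xm, ym;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = TSGetDM(ts, &da);CHKERRQ(ierr);
  ierr = DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da, X, &x);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da, F, &f);CHKERRQ(ierr);
  for (j = ys; j < ys+ym; j++) {
    for (i = xs; i < xs+xm; i++) {
      /* Recover the physical variables from the log variables ... */
      PetscScalar rho = PetscExpScalar(x[j][i].logrho);
      PetscScalar P   = PetscExpScalar(x[j][i].logP);
      /* ... then evaluate the original residual in terms of rho and P.
         (Placeholder: the real fluxes and Xdot terms go here.) */
      f[j][i].logrho = rho;
      f[j][i].logP   = P;
    }
  }
  ierr = DMDAVecRestoreArray(da, X, &x);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(da, F, &f);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

With this layout the colored finite differences perturb the log variables themselves, so the assembled Jacobian is with respect to log(rho) and log(P).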
When I view the solution using VecView inside TSMonitor during the time elasped when the simulation ran, the initial conditions are log(rho) and log(P) but the output of the vector actually has rho and P and not log(rho) and log(P) which I expect the solver to be solving for. However when I look at the .vts files generated by -ts_monitor_solution_vtk, the output is log(rho) and log(P). Should I be doing something else in order to rescale the quantities that TS/SNES should solve for instead of what I did above? The method suggesting that one rescales is described in this paper: "A newton-krylov solver for implicit solution of hydrodynamics in core-collapse supernova" http://iopscience.iop.org/1742-6596/125/1/012085/pdf/1742-6596_125_1_012085.pdf P.S I also tried the variational inequality SNES solver a while back but it didn't really work in enforcing the constraints. Thanks, Mani -------------- next part -------------- An HTML attachment was scrubbed... URL: From dharmareddy84 at gmail.com Thu Mar 13 18:12:17 2014 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Thu, 13 Mar 2014 18:12:17 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25B1@BUTKUS.anl.gov> Message-ID: I was suspecting python may be the cause, I am using: Enthought Canopy Python 2.7.3 | 64-bit | (default, Aug 8 2013, 05:43:23) I will try other approaches. Thanks Reddy On Thu, Mar 13, 2014 at 6:04 PM, Satish Balay wrote: > Hm - configure should first print the 'executing' message - and then > run the command. > > If its hanging at 'pusing language' message - I'm not sure what the > cause is. > > Perhaps the python stdout-buffer-flush is off-sync. [and its icc/ifort > thats hanging]. Or there is a problem with python on this machine? > > 2 things you can try to confirm. > > 1. run configure with gcc/gfortran and see if thats quicker. > If so - then intel compilers are the cause for slowdown. > > 2. Try configure with the option --useThreads=0 and see if this makes a difference. > [or tray a different python] > > Satish > > On Thu, 13 Mar 2014, Dharmendar Reddy wrote: > >> I see that at every pushing language, the screen statys there for a >> while and the compile execute commands quickly appear and go.... >> >> >> On Thu, Mar 13, 2014 at 4:23 PM, Balay, Satish wrote: >> > You can try " tail -f configure.log" to see where its hanging during the >> > run >> > >> > Intel compilers can hang waiting for response from license server. >> > >> > Satish >> > ________________________________ >> > From: Dharmendar Reddy >> > Sent: 3/13/2014 2:03 PM >> > To: Smith, Barry F. >> > Cc: PETSc users list >> > Subject: Re: [petsc-users] petsc configuer time >> > >> > Yes, my home directory is mounted on nfs. And i have configured and >> > installed petsc many times on my laptop and TACC stampede (which also >> > has my home directory mounted on network file system). But the >> > particular computer that i am working on now has been extremely slow >> > when it comes to petsc configure. Any suggestions on how i can fix >> > this ? I do not have a choice of not having my home on nfs. >> > >> > >> > Otherwise, i do not see big disk i/o impact even when i visualize >> > large ( > 100 MB ) files for visualization. 
>> > >> > >> > >> > On Thu, Mar 13, 2014 at 3:55 PM, Barry Smith wrote: >> >> >> >> The long time is pretty much always due to a slow file system (it takes >> >> about 3 minutes with my laptop using the local disk) but on a desktop >> >> machine using a network file system it can take up to 20 minutes. We >> >> generally always build on a local disk; since disk space is so cheap now >> >> pretty much any machine has gigabytes free of disk space that you can use to >> >> build on. >> >> >> >> I think two hours is totally unacceptably long. What type of system are >> >> you building on and where is the file system? My guess is /home/reddy is off >> >> on some slow filesystem away from the machine you are compiling on. >> >> >> >> Barry >> >> >> >> On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy >> >> wrote: >> >> >> >>> Hello, >> >>> How long does it take to configure petsc ? I understand that >> >>> it depends on the options, but i am find the particular version i have >> >>> is taking very long time (nearly 2 hours) before it begins configuring >> >>> packages. >> >>> >> >>> I am using intel MPI and intel compilers. >> >>> >> >>> I am using the following config opts: >> >>> PETSC_VERSION = petsc-3.4.3 >> >>> MPICC=mpiicc >> >>> MPIF90=mpiifort >> >>> MPICXX=mpiicpc >> >>> COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" >> >>> --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" >> >>> FOPTFLAGS="$(O_LEVEL)" >> >>> # COMPILERS = --with-mpi-dir=$(MPI_HOME) >> >>> >> >>> BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) >> >>> PETSCExtPackagePath = /home/reddy/libs/petsc >> >>> METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz >> >>> MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz >> >>> PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz >> >>> SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz >> >>> SCALPACKINC=$(MKLHOME)/include >> >>> SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a >> >>> -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a >> >>> $(MKLROOT)/lib/intel64/libmkl_core.a >> >>> $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group >> >>> $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" >> >>> #BLACSINC=$(MKLHOME)/include >> >>> #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a >> >>> confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 >> >>> --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 >> >>> --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) >> >>> --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) >> >>> --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) >> >>> --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) >> >>> --with-scalapack-include=$(SCALPACKINC) >> >>> --with-scalapack-lib=$(SCALPACKLIB) >> >>> #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) >> >>> >> >>> ### configure command >> >>> ./configure --with-scalar-type=real $(confOptsCommon) >> >> >> > From dharmareddy84 at gmail.com Thu Mar 13 18:49:26 2014 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Thu, 13 Mar 2014 18:49:26 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25B1@BUTKUS.anl.gov> Message-ID: The issue was due to python. It is not the python from Enthought, but the python which was installed in another custom software which was in the path. 
Now the configure takes about 20 minutes before it begins installing the packages. Thanks Reddy On Thu, Mar 13, 2014 at 6:12 PM, Dharmendar Reddy wrote: > I was suspecting python may be the cause, > I am using: Enthought Canopy Python 2.7.3 | 64-bit | (default, Aug 8 > 2013, 05:43:23) > > I will try other approaches. > > Thanks > Reddy > > > On Thu, Mar 13, 2014 at 6:04 PM, Satish Balay wrote: >> Hm - configure should first print the 'executing' message - and then >> run the command. >> >> If its hanging at 'pusing language' message - I'm not sure what the >> cause is. >> >> Perhaps the python stdout-buffer-flush is off-sync. [and its icc/ifort >> thats hanging]. Or there is a problem with python on this machine? >> >> 2 things you can try to confirm. >> >> 1. run configure with gcc/gfortran and see if thats quicker. >> If so - then intel compilers are the cause for slowdown. >> >> 2. Try configure with the option --useThreads=0 and see if this makes a difference. >> [or tray a different python] >> >> Satish >> >> On Thu, 13 Mar 2014, Dharmendar Reddy wrote: >> >>> I see that at every pushing language, the screen statys there for a >>> while and the compile execute commands quickly appear and go.... >>> >>> >>> On Thu, Mar 13, 2014 at 4:23 PM, Balay, Satish wrote: >>> > You can try " tail -f configure.log" to see where its hanging during the >>> > run >>> > >>> > Intel compilers can hang waiting for response from license server. >>> > >>> > Satish >>> > ________________________________ >>> > From: Dharmendar Reddy >>> > Sent: 3/13/2014 2:03 PM >>> > To: Smith, Barry F. >>> > Cc: PETSc users list >>> > Subject: Re: [petsc-users] petsc configuer time >>> > >>> > Yes, my home directory is mounted on nfs. And i have configured and >>> > installed petsc many times on my laptop and TACC stampede (which also >>> > has my home directory mounted on network file system). But the >>> > particular computer that i am working on now has been extremely slow >>> > when it comes to petsc configure. Any suggestions on how i can fix >>> > this ? I do not have a choice of not having my home on nfs. >>> > >>> > >>> > Otherwise, i do not see big disk i/o impact even when i visualize >>> > large ( > 100 MB ) files for visualization. >>> > >>> > >>> > >>> > On Thu, Mar 13, 2014 at 3:55 PM, Barry Smith wrote: >>> >> >>> >> The long time is pretty much always due to a slow file system (it takes >>> >> about 3 minutes with my laptop using the local disk) but on a desktop >>> >> machine using a network file system it can take up to 20 minutes. We >>> >> generally always build on a local disk; since disk space is so cheap now >>> >> pretty much any machine has gigabytes free of disk space that you can use to >>> >> build on. >>> >> >>> >> I think two hours is totally unacceptably long. What type of system are >>> >> you building on and where is the file system? My guess is /home/reddy is off >>> >> on some slow filesystem away from the machine you are compiling on. >>> >> >>> >> Barry >>> >> >>> >> On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy >>> >> wrote: >>> >> >>> >>> Hello, >>> >>> How long does it take to configure petsc ? I understand that >>> >>> it depends on the options, but i am find the particular version i have >>> >>> is taking very long time (nearly 2 hours) before it begins configuring >>> >>> packages. >>> >>> >>> >>> I am using intel MPI and intel compilers. 
>>> >>> >>> >>> I am using the following config opts: >>> >>> PETSC_VERSION = petsc-3.4.3 >>> >>> MPICC=mpiicc >>> >>> MPIF90=mpiifort >>> >>> MPICXX=mpiicpc >>> >>> COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" >>> >>> --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" >>> >>> FOPTFLAGS="$(O_LEVEL)" >>> >>> # COMPILERS = --with-mpi-dir=$(MPI_HOME) >>> >>> >>> >>> BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) >>> >>> PETSCExtPackagePath = /home/reddy/libs/petsc >>> >>> METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz >>> >>> MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz >>> >>> PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz >>> >>> SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz >>> >>> SCALPACKINC=$(MKLHOME)/include >>> >>> SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a >>> >>> -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a >>> >>> $(MKLROOT)/lib/intel64/libmkl_core.a >>> >>> $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group >>> >>> $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" >>> >>> #BLACSINC=$(MKLHOME)/include >>> >>> #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a >>> >>> confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 >>> >>> --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 >>> >>> --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) >>> >>> --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) >>> >>> --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) >>> >>> --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) >>> >>> --with-scalapack-include=$(SCALPACKINC) >>> >>> --with-scalapack-lib=$(SCALPACKLIB) >>> >>> #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) >>> >>> >>> >>> ### configure command >>> >>> ./configure --with-scalar-type=real $(confOptsCommon) >>> >> >>> >> From jed at jedbrown.org Thu Mar 13 20:19:02 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 13 Mar 2014 19:19:02 -0600 Subject: [petsc-users] petsc configuer time In-Reply-To: <5584D2CF-886B-4FC3-BA9B-F73A570A5CBD@mcs.anl.gov> References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <5584D2CF-886B-4FC3-BA9B-F73A570A5CBD@mcs.anl.gov> Message-ID: <87pplpy3uh.fsf@jedbrown.org> Barry Smith writes: > You may need to ask the system administrator of the machine to give > you a directory on the machine that has lots of free space to use to > work with PETSc. The system search paths are often over NFS as well, and account for the vast majority of file access during configure/builds. This prevents fixing the performance problem simply by building in scratch. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From bsmith at mcs.anl.gov Thu Mar 13 20:38:41 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 13 Mar 2014 20:38:41 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: <87pplpy3uh.fsf@jedbrown.org> References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <5584D2CF-886B-4FC3-BA9B-F73A570A5CBD@mcs.anl.gov> <87pplpy3uh.fsf@jedbrown.org> Message-ID: On Mar 13, 2014, at 8:19 PM, Jed Brown wrote: > Barry Smith writes: >> You may need to ask the system administrator of the machine to give >> you a directory on the machine that has lots of free space to use to >> work with PETSc. 
> > The system search paths are often over NFS as well, and account for the > vast majority of file access during configure/builds. This prevents > fixing the performance problem simply by building in scratch. Yes but scratch can still help a great deal. BTW: how much information is/could be cached either in the fileserver memory or local memory and can get reused? Barry From knepley at gmail.com Thu Mar 13 20:51:29 2014 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 13 Mar 2014 20:51:29 -0500 Subject: [petsc-users] Rescaling quantites to remove positivity constraints In-Reply-To: References: Message-ID: On Thu, Mar 13, 2014 at 6:12 PM, Mani Chandra wrote: > Hi, > > I'm trying to solve for fluid flows with large density and pressure > variations using the theta method in the TS module. This inevitably leads > to negative densities and pressures during the course of the evolution. In > order to deal with this, I am trying to now solve for log(rho) and log(P) > instead of rho and P. In order to do this, I changed my initial conditions > for rho and P to log(rho) and log(P) and inside the residual evaluation > function I keep the original code unchanged expect that I add the following > lines above the original code: > > rho = exp(logrho) > P = exp(logP) > You must d your scaling such that the output of your residual is log() of your variables. If that is the case, it will work. > since I expect the solution vector x[j][i] to now have logrho and logP > instead of rho and P. I now run the simulation with a modified newton > krylov method and lag the jacobian for every 100 linear solves. The > jacobian assembly is using colored finite differences. This goes smoothly > from the point I start the simulation to the first 100 linear solves after > which it needs to assemble the jacobian again. At this point the nonlinear > solver crashes and I think something is wrong with the jacobian assembly. I > checked that the simulation runs till it has to assemble the jacobian the > second time by varying the lagging by say 10, 20, 30, ... It fails > everytime it has to assemble the second time. > > When I view the solution using VecView inside TSMonitor during the time > elasped when the simulation ran, the initial conditions are log(rho) and > log(P) but the output of the vector actually has rho and P and not log(rho) > and log(P) which I expect the solver to be solving for. However when I look > at the .vts files generated by -ts_monitor_solution_vtk, the output is > log(rho) and log(P). > > Should I be doing something else in order to rescale the quantities that > TS/SNES should solve for instead of what I did above? > > The method suggesting that one rescales is described in this paper: > "A newton-krylov solver for implicit solution of hydrodynamics in > core-collapse supernova" > > http://iopscience.iop.org/1742-6596/125/1/012085/pdf/1742-6596_125_1_012085.pdf > I personally think this is a bad idea. They must have guesses that are very close to the solutions, and it messes with the accuracy of the discretization. Thanks, Matt P.S I also tried the variational inequality SNES solver a while back but it > didn't really work in enforcing the constraints. > > Thanks, > Mani > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jed at jedbrown.org Thu Mar 13 20:57:39 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 13 Mar 2014 20:57:39 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <5584D2CF-886B-4FC3-BA9B-F73A570A5CBD@mcs.anl.gov> <87pplpy3uh.fsf@jedbrown.org> Message-ID: <87eh25y224.fsf@jedbrown.org> Barry Smith writes: > BTW: how much information is/could be cached either in the > fileserver memory or local memory and can get reused? Configure would be *blazing* fast if compilers were shared libraries. Then they could cache everything in a reliable way. Otherwise, they really have to go search the file system every time because it could have changed. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From knepley at gmail.com Thu Mar 13 21:00:57 2014 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 13 Mar 2014 21:00:57 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: <87eh25y224.fsf@jedbrown.org> References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <5584D2CF-886B-4FC3-BA9B-F73A570A5CBD@mcs.anl.gov> <87pplpy3uh.fsf@jedbrown.org> <87eh25y224.fsf@jedbrown.org> Message-ID: On Thu, Mar 13, 2014 at 8:57 PM, Jed Brown wrote: > Barry Smith writes: > > BTW: how much information is/could be cached either in the > > fileserver memory or local memory and can get reused? > > Configure would be *blazing* fast if compilers were shared libraries. > Then they could cache everything in a reliable way. Otherwise, they > really have to go search the file system every time because it could > have changed. > I knew Jed was a secret OpenCL fanatic! Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Thu Mar 13 21:08:05 2014 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 13 Mar 2014 21:08:05 -0500 Subject: [petsc-users] petsc configuer time In-Reply-To: References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25B1@BUTKUS.anl.gov> Message-ID: 20min to get to installing packages is still a lot. The other issues discussed in this thread could be the cause [intel compilers, search path on nfs, build on nfs] Satish On Thu, 13 Mar 2014, Dharmendar Reddy wrote: > The issue was due to python. It is not the python from Enthought, but > the python which was installed in another custom software which was in > the path. > Now the configure takes about 20 minutes before it begins installing > the packages. > > Thanks > Reddy > > On Thu, Mar 13, 2014 at 6:12 PM, Dharmendar Reddy > wrote: > > I was suspecting python may be the cause, > > I am using: Enthought Canopy Python 2.7.3 | 64-bit | (default, Aug 8 > > 2013, 05:43:23) > > > > I will try other approaches. > > > > Thanks > > Reddy > > > > > > On Thu, Mar 13, 2014 at 6:04 PM, Satish Balay wrote: > >> Hm - configure should first print the 'executing' message - and then > >> run the command. > >> > >> If its hanging at 'pusing language' message - I'm not sure what the > >> cause is. > >> > >> Perhaps the python stdout-buffer-flush is off-sync. [and its icc/ifort > >> thats hanging]. Or there is a problem with python on this machine? > >> > >> 2 things you can try to confirm. 
> >> > >> 1. run configure with gcc/gfortran and see if thats quicker. > >> If so - then intel compilers are the cause for slowdown. > >> > >> 2. Try configure with the option --useThreads=0 and see if this makes a difference. > >> [or tray a different python] > >> > >> Satish > >> > >> On Thu, 13 Mar 2014, Dharmendar Reddy wrote: > >> > >>> I see that at every pushing language, the screen statys there for a > >>> while and the compile execute commands quickly appear and go.... > >>> > >>> > >>> On Thu, Mar 13, 2014 at 4:23 PM, Balay, Satish wrote: > >>> > You can try " tail -f configure.log" to see where its hanging during the > >>> > run > >>> > > >>> > Intel compilers can hang waiting for response from license server. > >>> > > >>> > Satish > >>> > ________________________________ > >>> > From: Dharmendar Reddy > >>> > Sent: 3/13/2014 2:03 PM > >>> > To: Smith, Barry F. > >>> > Cc: PETSc users list > >>> > Subject: Re: [petsc-users] petsc configuer time > >>> > > >>> > Yes, my home directory is mounted on nfs. And i have configured and > >>> > installed petsc many times on my laptop and TACC stampede (which also > >>> > has my home directory mounted on network file system). But the > >>> > particular computer that i am working on now has been extremely slow > >>> > when it comes to petsc configure. Any suggestions on how i can fix > >>> > this ? I do not have a choice of not having my home on nfs. > >>> > > >>> > > >>> > Otherwise, i do not see big disk i/o impact even when i visualize > >>> > large ( > 100 MB ) files for visualization. > >>> > > >>> > > >>> > > >>> > On Thu, Mar 13, 2014 at 3:55 PM, Barry Smith wrote: > >>> >> > >>> >> The long time is pretty much always due to a slow file system (it takes > >>> >> about 3 minutes with my laptop using the local disk) but on a desktop > >>> >> machine using a network file system it can take up to 20 minutes. We > >>> >> generally always build on a local disk; since disk space is so cheap now > >>> >> pretty much any machine has gigabytes free of disk space that you can use to > >>> >> build on. > >>> >> > >>> >> I think two hours is totally unacceptably long. What type of system are > >>> >> you building on and where is the file system? My guess is /home/reddy is off > >>> >> on some slow filesystem away from the machine you are compiling on. > >>> >> > >>> >> Barry > >>> >> > >>> >> On Mar 13, 2014, at 3:29 PM, Dharmendar Reddy > >>> >> wrote: > >>> >> > >>> >>> Hello, > >>> >>> How long does it take to configure petsc ? I understand that > >>> >>> it depends on the options, but i am find the particular version i have > >>> >>> is taking very long time (nearly 2 hours) before it begins configuring > >>> >>> packages. > >>> >>> > >>> >>> I am using intel MPI and intel compilers. 
> >>> >>> > >>> >>> I am using the following config opts: > >>> >>> PETSC_VERSION = petsc-3.4.3 > >>> >>> MPICC=mpiicc > >>> >>> MPIF90=mpiifort > >>> >>> MPICXX=mpiicpc > >>> >>> COMPILERS = --with-cc="$(MPICC)" --with-fc="$(MPIF90)" > >>> >>> --with-cxx="$(MPICXX)" COPTFLAGS="$(O_LEVEL)" CXXOPTFLAGS="$(O_LEVEL)" > >>> >>> FOPTFLAGS="$(O_LEVEL)" > >>> >>> # COMPILERS = --with-mpi-dir=$(MPI_HOME) > >>> >>> > >>> >>> BLAS_LAPACK = $(PETSC_BLAS_LAPACK_DIR) > >>> >>> PETSCExtPackagePath = /home/reddy/libs/petsc > >>> >>> METISPATH=$(PETSCExtPackagePath)/metis-5.0.2-p3.tar.gz > >>> >>> MUMPSPATH=$(PETSCExtPackagePath)/MUMPS_4.10.0-p3.tar.gz > >>> >>> PARMETISPATH=$(PETSCExtPackagePath)/parmetis-4.0.2-p5.tar.gz > >>> >>> SUPERLUPATH=$(PETSCExtPackagePath)/superlu_dist_3.3.tar.gz > >>> >>> SCALPACKINC=$(MKLHOME)/include > >>> >>> SCALPACKLIB="$(MKLROOT)/lib/intel64/libmkl_scalapack_lp64.a > >>> >>> -Wl,--start-group $(MKLROOT)/lib/intel64/libmkl_intel_lp64.a > >>> >>> $(MKLROOT)/lib/intel64/libmkl_core.a > >>> >>> $(MKLROOT)/lib/intel64/libmkl_sequential.a -Wl,--end-group > >>> >>> $(MKLROOT)/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm" > >>> >>> #BLACSINC=$(MKLHOME)/include > >>> >>> #BLACSLIB=$(MKLHOME)/lib/intel64/libmkl_blacs_intelmpi_lp64.a > >>> >>> confOptsCommon = --with-x=0 --with-make-np=12 --with-hdf5 > >>> >>> --with-hdf5-dir=$(HDF5_DIR) --with-single-library=0 --with-pic=1 > >>> >>> --with-shared-libraries=0 --with-blas-lapack-dir=$(BLAS_LAPACK) > >>> >>> --with-clanguage=C++ --with-fortran --with-debugging=1 $(COMPILERS) > >>> >>> --download-metis=$(METISPATH) --download-parmetis=$(PARMETISPATH) > >>> >>> --download-superlu_dist=$(SUPERLUPATH) --download-mumps=$(MUMPSPATH) > >>> >>> --with-scalapack-include=$(SCALPACKINC) > >>> >>> --with-scalapack-lib=$(SCALPACKLIB) > >>> >>> #--with-blacs-include=$(BLACSINC) --with-blacs-lib=$(BLACSLIB) > >>> >>> > >>> >>> ### configure command > >>> >>> ./configure --with-scalar-type=real $(confOptsCommon) > >>> >> > >>> > >> > From mc0710 at gmail.com Thu Mar 13 21:08:32 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Thu, 13 Mar 2014 21:08:32 -0500 Subject: [petsc-users] Rescaling quantites to remove positivity constraints In-Reply-To: References: Message-ID: On Thu, Mar 13, 2014 at 8:51 PM, Matthew Knepley wrote: > On Thu, Mar 13, 2014 at 6:12 PM, Mani Chandra wrote: > >> Hi, >> >> I'm trying to solve for fluid flows with large density and pressure >> variations using the theta method in the TS module. This inevitably leads >> to negative densities and pressures during the course of the evolution. In >> order to deal with this, I am trying to now solve for log(rho) and log(P) >> instead of rho and P. In order to do this, I changed my initial conditions >> for rho and P to log(rho) and log(P) and inside the residual evaluation >> function I keep the original code unchanged expect that I add the following >> lines above the original code: >> >> rho = exp(logrho) >> P = exp(logP) >> > > You must d your scaling such that the output of your residual is log() of > your variables. If that is the case, it > will work. > > Could you elaborate a bit? I didn't understand that. I thought that as long as the residuals are the same, one can solve for any forms of the variables. > since I expect the solution vector x[j][i] to now have logrho and logP >> instead of rho and P. I now run the simulation with a modified newton >> krylov method and lag the jacobian for every 100 linear solves. 
The >> jacobian assembly is using colored finite differences. This goes smoothly >> from the point I start the simulation to the first 100 linear solves after >> which it needs to assemble the jacobian again. At this point the nonlinear >> solver crashes and I think something is wrong with the jacobian assembly. I >> checked that the simulation runs till it has to assemble the jacobian the >> second time by varying the lagging by say 10, 20, 30, ... It fails >> everytime it has to assemble the second time. >> >> When I view the solution using VecView inside TSMonitor during the time >> elasped when the simulation ran, the initial conditions are log(rho) and >> log(P) but the output of the vector actually has rho and P and not log(rho) >> and log(P) which I expect the solver to be solving for. However when I look >> at the .vts files generated by -ts_monitor_solution_vtk, the output is >> log(rho) and log(P). >> >> Should I be doing something else in order to rescale the quantities that >> TS/SNES should solve for instead of what I did above? >> >> The method suggesting that one rescales is described in this paper: >> "A newton-krylov solver for implicit solution of hydrodynamics in >> core-collapse supernova" >> >> http://iopscience.iop.org/1742-6596/125/1/012085/pdf/1742-6596_125_1_012085.pdf >> > > I personally think this is a bad idea. They must have guesses that are > very close to the solutions, and > it messes with the accuracy of the discretization. > > I tried other things like setting floors, but they fail. Any suggestions? > Thanks, > > Matt > > P.S I also tried the variational inequality SNES solver a while back but >> it didn't really work in enforcing the constraints. >> >> Thanks, >> Mani >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Mar 13 21:14:46 2014 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 13 Mar 2014 21:14:46 -0500 Subject: [petsc-users] Rescaling quantites to remove positivity constraints In-Reply-To: References: Message-ID: On Thu, Mar 13, 2014 at 9:08 PM, Mani Chandra wrote: > > On Thu, Mar 13, 2014 at 8:51 PM, Matthew Knepley wrote: > >> On Thu, Mar 13, 2014 at 6:12 PM, Mani Chandra wrote: >> >>> Hi, >>> >>> I'm trying to solve for fluid flows with large density and pressure >>> variations using the theta method in the TS module. This inevitably leads >>> to negative densities and pressures during the course of the evolution. In >>> order to deal with this, I am trying to now solve for log(rho) and log(P) >>> instead of rho and P. In order to do this, I changed my initial conditions >>> for rho and P to log(rho) and log(P) and inside the residual evaluation >>> function I keep the original code unchanged expect that I add the following >>> lines above the original code: >>> >>> rho = exp(logrho) >>> P = exp(logP) >>> >> >> You must d your scaling such that the output of your residual is log() of >> your variables. If that is the case, it >> will work. >> >> > > Could you elaborate a bit? I didn't understand that. I thought that as > long as the residuals are the same, one can solve for any forms of the > variables. > You are using the residual function to form the Jacobian using finite differences, so it has to represent the variables you use. 
Matt > since I expect the solution vector x[j][i] to now have logrho and logP >>> instead of rho and P. I now run the simulation with a modified newton >>> krylov method and lag the jacobian for every 100 linear solves. The >>> jacobian assembly is using colored finite differences. This goes smoothly >>> from the point I start the simulation to the first 100 linear solves after >>> which it needs to assemble the jacobian again. At this point the nonlinear >>> solver crashes and I think something is wrong with the jacobian assembly. I >>> checked that the simulation runs till it has to assemble the jacobian the >>> second time by varying the lagging by say 10, 20, 30, ... It fails >>> everytime it has to assemble the second time. >>> >>> When I view the solution using VecView inside TSMonitor during the time >>> elasped when the simulation ran, the initial conditions are log(rho) and >>> log(P) but the output of the vector actually has rho and P and not log(rho) >>> and log(P) which I expect the solver to be solving for. However when I look >>> at the .vts files generated by -ts_monitor_solution_vtk, the output is >>> log(rho) and log(P). >>> >>> Should I be doing something else in order to rescale the quantities that >>> TS/SNES should solve for instead of what I did above? >>> >>> The method suggesting that one rescales is described in this paper: >>> "A newton-krylov solver for implicit solution of hydrodynamics in >>> core-collapse supernova" >>> >>> http://iopscience.iop.org/1742-6596/125/1/012085/pdf/1742-6596_125_1_012085.pdf >>> >> >> I personally think this is a bad idea. They must have guesses that are >> very close to the solutions, and >> it messes with the accuracy of the discretization. >> >> I tried other things like setting floors, but they fail. Any suggestions? > > >> Thanks, >> >> Matt >> >> P.S I also tried the variational inequality SNES solver a while back but >>> it didn't really work in enforcing the constraints. >>> >>> Thanks, >>> Mani >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From florian.meier at koalo.de Fri Mar 14 04:13:05 2014 From: florian.meier at koalo.de (Florian Meier) Date: Fri, 14 Mar 2014 10:13:05 +0100 Subject: [petsc-users] Extra Variable in DMCircuit Message-ID: <5322C821.7040400@koalo.de> Hi, I got quite far with my project, although I still have not managed (or better "have not tried...") to get the parallelization running (Shri: Any news about that?). Now I would like to add a single global variable (and a single equation) to the equation system. Is there an elegant way to do this with DMCircuit? A hackish solution might be to add an additional imaginary vertex that is excluded from all other calculations, but that does not seem to be the right way to do it. 
Greetings, Florian From rupp at iue.tuwien.ac.at Fri Mar 14 05:43:31 2014 From: rupp at iue.tuwien.ac.at (Karl Rupp) Date: Fri, 14 Mar 2014 11:43:31 +0100 Subject: [petsc-users] petsc configuer time In-Reply-To: References: <43666943-4347-470E-9964-9C11F5C9732B@mcs.anl.gov> <5584D2CF-886B-4FC3-BA9B-F73A570A5CBD@mcs.anl.gov> <87pplpy3uh.fsf@jedbrown.org> <87eh25y224.fsf@jedbrown.org> Message-ID: <5322DD53.90502@iue.tuwien.ac.at> Hey, > Configure would be *blazing* fast if compilers were shared libraries. > Then they could cache everything in a reliable way. Otherwise, they > really have to go search the file system every time because it could > have changed. > > > I knew Jed was a secret OpenCL fanatic! Haha, nice try ;-) But then there are funny 'features' such as kernel caching in the home directories, where the filesystem may still be an issue... Best regards, Karli From knepley at gmail.com Fri Mar 14 05:44:45 2014 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 14 Mar 2014 05:44:45 -0500 Subject: [petsc-users] Extra Variable in DMCircuit In-Reply-To: <5322C821.7040400@koalo.de> References: <5322C821.7040400@koalo.de> Message-ID: On Fri, Mar 14, 2014 at 4:13 AM, Florian Meier wrote: > Hi, > I got quite far with my project, although I still have not managed (or > better "have not tried...") to get the parallelization running (Shri: > Any news about that?). > > Now I would like to add a single global variable (and a single equation) > to the equation system. Is there an elegant way to do this with DMCircuit? > > A hackish solution might be to add an additional imaginary vertex that > is excluded from all other calculations, but that does not seem to be > the right way to do it. > The graph expresses the influence of variables on each other, so if you have a globally coupled variable, then a node linked to all other nodes is the appropriate structure. The problem is that this is not scalable since it will result in a dense row. I would do this for now. And when your problem is large enough for this to be a drag on scalability (thousands of processes), switch to specialized code. Matt > Greetings, > Florian > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From florian.meier at koalo.de Fri Mar 14 07:00:27 2014 From: florian.meier at koalo.de (Florian Meier) Date: Fri, 14 Mar 2014 13:00:27 +0100 Subject: [petsc-users] Extra Variable in DMCircuit In-Reply-To: References: <5322C821.7040400@koalo.de> Message-ID: <5322EF5B.7050306@koalo.de> Seen from that perspective it is reasonable to add a further vertex, thank you! I have implemented that and it works very good. Regarding the scalability: My previous approach was to do binary search and running the calculation multiple times with a different constant parameter, so this is much faster anyway :-) Greetings, Florian On 03/14/2014 11:44 AM, Matthew Knepley wrote: > On Fri, Mar 14, 2014 at 4:13 AM, Florian Meier > wrote: > > Hi, > I got quite far with my project, although I still have not managed (or > better "have not tried...") to get the parallelization running (Shri: > Any news about that?). > > Now I would like to add a single global variable (and a single equation) > to the equation system. Is there an elegant way to do this with > DMCircuit? 
> > A hackish solution might be to add an additional imaginary vertex that > is excluded from all other calculations, but that does not seem to be > the right way to do it. > > > The graph expresses the influence of variables on each other, so if you have > a globally coupled variable, then a node linked to all other nodes is the > appropriate structure. > > The problem is that this is not scalable since it will result in a > dense row. I > would do this for now. And when your problem is large enough for this to > be a drag on scalability (thousands of processes), switch to specialized > code. > > Matt > > > Greetings, > Florian > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener From abhyshr at mcs.anl.gov Fri Mar 14 12:24:05 2014 From: abhyshr at mcs.anl.gov (Abhyankar, Shrirang G.) Date: Fri, 14 Mar 2014 17:24:05 +0000 Subject: [petsc-users] Extra Variable in DMCircuit In-Reply-To: <5322C821.7040400@koalo.de> References: <5322C821.7040400@koalo.de> Message-ID: > Hi, > I got quite far with my project, although I still have not managed (or > better "have not tried...") to get the parallelization running (Shri: > Any news about that?). We've figured out what needs to be done but haven't done it yet :-). Your application needs either a vertex distribution with an overlap or a custom MPI reduction scheme. After speaking with Barry last week, it seems to me that the latter option would be best way to proceed. The custom MPI reduction scheme is because you have 2 equations for every vertex with the first equation needing an ADD operation and the second one needs a PROD. Thus, we would need to have an ADD_PROD insertmode for DMLocalToGlobalXXX that we currently don't have. > Now I would like to add a single global variable (and a single equation) > to the equation system. Is there an elegant way to do this with DMCircuit? Is this akin to a "Ground" node for circuits? Is the variable value constant? After working on your example I realized that specifying a bidirectional edge as two unidirectional edges in the data may cause problems for the partitioner. I observed that the two undirectional edges may be assigned to different processors although they are connected to the same vertices. This may be a problem when communicating ghost values. Hence, I've modified the data format in the attached links1.txt file to only specify edges via their nodal connectivity and then to specify the type information. I've reworked your source code also accordingly and it gives the same answer as your original code. It gives a wrong answer for parallel runs because of the incorrect ghost value exchanges. Once we have the ADD_PROD insertmode, this code should work fine in parallel too. I think that going forward you should use a similar data format. > > A hackish solution might be to add an additional imaginary vertex that > is excluded from all other calculations, but that does not seem to be > the right way to do it. > > Greetings, > Florian -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: aloha1.cpp Type: application/octet-stream Size: 18203 bytes Desc: aloha1.cpp URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... 
Name: links1.txt URL: From florian.meier at koalo.de Fri Mar 14 13:34:23 2014 From: florian.meier at koalo.de (Florian Meier) Date: Fri, 14 Mar 2014 19:34:23 +0100 Subject: [petsc-users] Extra Variable in DMCircuit In-Reply-To: References: <5322C821.7040400@koalo.de> Message-ID: <53234BAF.1040001@koalo.de> On 03/14/2014 06:24 PM, Abhyankar, Shrirang G. wrote: > >> Hi, >> I got quite far with my project, although I still have not managed (or >> better "have not tried...") to get the parallelization running (Shri: >> Any news about that?). > We've figured out what needs to be done but haven't done it yet :-). > Your application needs either a vertex distribution with an overlap or a > custom MPI reduction scheme. After speaking with Barry last week, it > seems to me that the latter option would be best way to proceed. The > custom MPI reduction scheme is because you have 2 equations for every > vertex with the first equation needing an ADD operation and the second > one needs a PROD. Thus, we would need > to have an ADD_PROD insertmode for DMLocalToGlobalXXX that we currently > don't have. That sounds great! Although, this solution seems to be very specific to the equations. Does this approach still work when the equations get more complex (e.g. handling multiple variables like PRODUCT(1+t*(a-1)) or SUM(R*q*(1-f)) )? >> Now I would like to add a single global variable (and a single equation) >> to the equation system. Is there an elegant way to do this with DMCircuit? > > Is this akin to a "Ground" node for circuits? Is the variable value > constant? Maybe... The additional equation is the multiplication of the reliability over a specific path in the network (a rather arbitrary, but small subset of the links (e.g. 55 links for a problem with 10000 links)) minus a constant predefined value. This gives me the possibility to convert the formerly constant packet generation (g_i) into a variable. When adding an additional vertex it works quite good. We will see how it works out when running in parallel. > After working on your example I realized that specifying a bidirectional > edge as two unidirectional edges in the data may cause problems for the > partitioner. I observed that > the two undirectional edges may be assigned to different processors > although they are connected to the same vertices. This may be a problem > when communicating ghost > values. Hence, I've modified the data format in the attached links1.txt > file to only specify edges via their nodal connectivity and then to > specify the type information. > I've reworked your source code also accordingly and it gives the same > answer as your original code. It gives a wrong answer for parallel runs > because of the incorrect > ghost value exchanges. Once we have the ADD_PROD insertmode, this code > should work fine in parallel too. I think that going forward you should > use a similar data format. Good idea, but unfortunately it is not always guaranteed that the edge is bidirectional for the extended formulation of the problem. What exactly is the problem when the two unidirectional edges are assigned to different processes? > >> >> A hackish solution might be to add an additional imaginary vertex that >> is excluded from all other calculations, but that does not seem to be >> the right way to do it. >> >> Greetings, >> Florian > From abhyshr at mcs.anl.gov Fri Mar 14 14:22:19 2014 From: abhyshr at mcs.anl.gov (Abhyankar, Shrirang G.) 
Date: Fri, 14 Mar 2014 19:22:19 +0000 Subject: [petsc-users] Extra Variable in DMCircuit In-Reply-To: <53234BAF.1040001@koalo.de> Message-ID: -----Original Message----- From: Florian Meier > Date: Fri, 14 Mar 2014 19:34:23 +0100 To: Shri > Cc: petsc-users list > Subject: Re: Extra Variable in DMCircuit On 03/14/2014 06:24 PM, Abhyankar, Shrirang G. wrote: That sounds great! Although, this solution seems to be very specific to the equations. Does this approach still work when the equations get more complex (e.g. handling multiple variables like PRODUCT(1+t*(a-1)) or SUM(R*q*(1-f)) )? Managing custom needs is difficult to maintain hence what I'm planning to do is to provide a way for the user to define a own custom MPI_Op that can be called by DMLocalToGlobalXXX/DMGlobalToLocalXXX. So eventually you'll have to write the MPI_Op and we'll provide hooks to attach it to the DM. Now I would like to add a single global variable (and a single equation) to the equation system. Is there an elegant way to do this with DMCircuit? Is this akin to a "Ground" node for circuits? Is the variable value constant? Maybe... The additional equation is the multiplication of the reliability over a specific path in the network (a rather arbitrary, but small subset of the links (e.g. 55 links for a problem with 10000 links)) minus a constant predefined value. This gives me the possibility to convert the formerly constant packet generation (g_i) into a variable. I see..so it is like an equality constraint on a subset of links, not all the links. Presumably these links form a subnetwork that may get assigned to one processor/set of neighboring processors. When adding an additional vertex it works quite good. We will see how it works out when running in parallel. After working on your example I realized that specifying a bidirectional edge as two unidirectional edges in the data may cause problems for the partitioner. I observed that the two undirectional edges may be assigned to different processors although they are connected to the same vertices. This may be a problem when communicating ghost values. Hence, I've modified the data format in the attached links1.txt file to only specify edges via their nodal connectivity and then to specify the type information. I've reworked your source code also accordingly and it gives the same answer as your original code. It gives a wrong answer for parallel runs because of the incorrect ghost value exchanges. Once we have the ADD_PROD insertmode, this code should work fine in parallel too. I think that going forward you should use a similar data format. Good idea, but unfortunately it is not always guaranteed that the edge is bidirectional for the extended formulation of the problem. Are you saying that the directionality could change during the calculation? In your example, the INTERFERING edges are bidirectional while the INFLOWING links are unidirectional. By setting up the appropriate relations in the data attached with the edges , you can manage the equations for the edges/vertices. If there is some specific case that cannot be handled then we can take a look at it. What exactly is the problem when the two unidirectional edges are assigned to different processes? I don't quite remember it right now but I recall seeing weird partitions and incorrect ghost exchanges. I'll have to run it once again to produce specific details. 
Shri A hackish solution might be to add an additional imaginary vertex that is excluded from all other calculations, but that does not seem to be the right way to do it. Greetings, Florian -------------- next part -------------- An HTML attachment was scrubbed... URL: From abhyshr at mcs.anl.gov Fri Mar 14 14:59:23 2014 From: abhyshr at mcs.anl.gov (Abhyankar, Shrirang G.) Date: Fri, 14 Mar 2014 19:59:23 +0000 Subject: [petsc-users] Extra Variable in DMCircuit In-Reply-To: <87txb0vay9.fsf@jedbrown.org> References: <87txb0vay9.fsf@jedbrown.org> Message-ID: <72AB7464-1E10-433B-A41D-1E05C5B00294@anl.gov> Arrgggh..Outlook!! I have no freakin idea how to resolve this. I'll have to sit with the IT help desk guys to figure it out. I made sure that the quoting levels were on and even changed to different font size with a different color before sending out that email!! Funnily, this email looks fine in Outlook Exchange. This is really retarded! Anyways, this was my reply (I am putting it in quotes now, hopefully Microsoft doesn't remove this) >> Although, this solution seems to be very specific to >> the equations. Does this approach still work when the equations get more >> complex (e.g. handling multiple variables like PRODUCT(1+t*(a-1)) or >> SUM(R*q*(1-f)) )? Shri: "Managing custom needs is difficult to maintain hence what I'm planning to do is to provide a way for the user to define a own custom MPI_Op that can be called by DMLocalToGlobalXXX/DMGlobalToLocalXXX. So eventually you'll have to write the MPI_Op and we'll provide hooks to attach it to the DM" >> The additional equation is the multiplication of the reliability over a >> specific path in the network (a rather arbitrary, but small subset of >> the links (e.g. 55 links for a problem with 10000 links)) minus a >> constant predefined value. >> This gives me the possibility to convert the formerly constant packet >> generation (g_i) into a variable. >> Shri: "I see..so it is like an equality constraint on a subset of links, not all the links. Presumably these links form a subnetwork that may get assigned to one processor/set of neighboring processors" >> Good idea, but unfortunately it is not always guaranteed that the edge >> is bidirectional for the extended formulation of the problem. >> Shri: "Are you saying that the directionality could change during the calculation? In your example, the INTERFERING edges are bidirectional while the INFLOWING links are unidirectional. By setting up the appropriate relations in the data attached with the edges , you can manage the equations for the edges/vertices. If there is some specific case that cannot be handled then we can take a look at it." >> What >> exactly is the problem when the two unidirectional edges are assigned to >> different processes? >> Shri: "I don't quite remember it right now but I recall seeing weird partitions and incorrect ghost exchanges. I'll have to run it once again to produce specific details." On Mar 14, 2014, at 2:26 PM, Jed Brown wrote: > Your email quoting style is really hard to read. What part is you and > what part is supposed to be cited? (Microsoft is at-fault for this > breakage, but we need a work-around.) > > http://lists.mcs.anl.gov/pipermail/petsc-users/2014-March/020931.html > > "Abhyankar, Shrirang G." writes: > >> -----Original Message----- >> From: Florian Meier > >> Date: Fri, 14 Mar 2014 19:34:23 +0100 >> To: Shri > >> Cc: petsc-users list > >> Subject: Re: Extra Variable in DMCircuit >> >> On 03/14/2014 06:24 PM, Abhyankar, Shrirang G. 
>> Greetings, >> Florian From florian.meier at koalo.de Fri Mar 14 15:27:53 2014 From: florian.meier at koalo.de (Florian Meier) Date: Fri, 14 Mar 2014 21:27:53 +0100 Subject: [petsc-users] Extra Variable in DMCircuit In-Reply-To: <72AB7464-1E10-433B-A41D-1E05C5B00294@anl.gov> References: <87txb0vay9.fsf@jedbrown.org> <72AB7464-1E10-433B-A41D-1E05C5B00294@anl.gov> Message-ID: <53236649.7060103@koalo.de> On 14.03.2014 20:59, Abhyankar, Shrirang G. wrote: > Arrgggh..Outlook!! I have no freakin idea how to resolve this. I'll have to sit with the IT help desk guys to figure it out. > I made sure that the quoting levels were on and even changed to different font size with a different color before sending out that email!! Funnily, > this email looks fine in Outlook Exchange. This is really retarded! > Anyways, this was my reply (I am putting it in quotes now, hopefully Microsoft doesn't remove this) Oh yes, Outlook does funny things (based on the variety of weird formatted emails) :-) Anyway, the style is visible in Thunderbird, but not in the plain text. Maybe you could turn off HTML composition temporarily. >>> Although, this solution seems to be very specific to >>> the equations. Does this approach still work when the equations get more >>> complex (e.g. handling multiple variables like PRODUCT(1+t*(a-1)) or >>> SUM(R*q*(1-f)) )? > > > Shri: "Managing custom needs is difficult to maintain hence what I'm planning to do is to provide a way for the user to define a own custom MPI_Op that can be called > by DMLocalToGlobalXXX/DMGlobalToLocalXXX. So eventually you'll have to write the MPI_Op and we'll provide hooks to attach it to the DM" I see, that is good! >>> The additional equation is the multiplication of the reliability over a >>> specific path in the network (a rather arbitrary, but small subset of >>> the links (e.g. 55 links for a problem with 10000 links)) minus a >>> constant predefined value. >>> This gives me the possibility to convert the formerly constant packet >>> generation (g_i) into a variable. >>> > > Shri: "I see..so it is like an equality constraint on a subset of links, not all the links. Presumably these links form a subnetwork that > may get assigned to one processor/set of neighboring processors" Yes >>> Good idea, but unfortunately it is not always guaranteed that the edge >>> is bidirectional for the extended formulation of the problem. >>> > > Shri: "Are you saying that the directionality could change during the calculation? > In your example, the INTERFERING edges are bidirectional > while the INFLOWING links are unidirectional. By setting up the appropriate relations in the > data attached with the edges , you can manage the equations for the > edges/vertices. If there is some specific case that cannot be handled then we can take a look at it." > >>> What exactly is the problem when the two unidirectional >>> edges are assigned to >>> different processes? > > Shri: "I don't quite remember it right now but I recall seeing > weird partitions and incorrect ghost exchanges. I'll have to run it > once again to produce specific details." No, not during the calculation, but there will be another kind of edges where some vertices are connected in both directions and others only in one direction. I assume it does not make much sense to speculate about that now. There is still a lot of work to do until everything will run reasonable on a single core (e.g. I still use finite differences...). After that we will see if and how it will be possible to parallelize it. 
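As far as the parallel side goes, the user-defined MPI_Op Shri describes would look roughly like the plain-MPI sketch below. This is only the reduction itself -- an elementwise product, i.e. the combine rule an ADD_PROD-style insert mode needs -- not the DMLocalToGlobal hook, which did not exist yet at the time of this thread:

   #include <mpi.h>
   #include <stdio.h>

   /* Elementwise product, written in the form MPI expects for a user-defined MPI_Op. */
   static void ProdOp(void *in, void *inout, int *len, MPI_Datatype *dtype)
   {
     double *a = (double*)in, *b = (double*)inout;
     int     i;
     (void)dtype;                          /* a single datatype (double) is assumed here */
     for (i = 0; i < *len; i++) b[i] *= a[i];
   }

   int main(int argc, char **argv)
   {
     MPI_Op op;
     double local, prod;
     int    rank;

     MPI_Init(&argc, &argv);
     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
     local = 1.0 - 0.01*rank;              /* stand-in for a per-rank reliability factor */
     MPI_Op_create(ProdOp, 1 /* commutative */, &op);
     MPI_Allreduce(&local, &prod, 1, MPI_DOUBLE, op, MPI_COMM_WORLD);
     if (!rank) printf("global product = %g\n", prod);
     MPI_Op_free(&op);
     MPI_Finalize();
     return 0;
   }

Once hooks exist on the DM, a function of this kind is what DMGlobalToLocal/DMLocalToGlobal would call in place of the built-in ADD_VALUES behaviour.
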
By the way: My mail address is wrong in the source code you sent me. It is koalo not koala. Thanks, Florian > On Mar 14, 2014, at 2:26 PM, Jed Brown wrote: > >> Your email quoting style is really hard to read. What part is you and >> what part is supposed to be cited? (Microsoft is at-fault for this >> breakage, but we need a work-around.) >> >> http://lists.mcs.anl.gov/pipermail/petsc-users/2014-March/020931.html >> >> "Abhyankar, Shrirang G." writes: >> >>> -----Original Message----- >>> From: Florian Meier > >>> Date: Fri, 14 Mar 2014 19:34:23 +0100 >>> To: Shri > >>> Cc: petsc-users list > >>> Subject: Re: Extra Variable in DMCircuit >>> >>> On 03/14/2014 06:24 PM, Abhyankar, Shrirang G. wrote: >>> >>> >>> That sounds great! Although, this solution seems to be very specific to >>> the equations. Does this approach still work when the equations get more >>> complex (e.g. handling multiple variables like PRODUCT(1+t*(a-1)) or >>> SUM(R*q*(1-f)) )? >>> >>> Managing custom needs is difficult to maintain hence what I'm planning to do is to provide a way for the user to define a own custom MPI_Op that can be called >>> by DMLocalToGlobalXXX/DMGlobalToLocalXXX. So eventually you'll have to write the MPI_Op and we'll provide hooks to attach it to the DM. >>> >>> >>> Now I would like to add a single global variable (and a single equation) >>> to the equation system. Is there an elegant way to do this with DMCircuit? >>> Is this akin to a "Ground" node for circuits? Is the variable value >>> constant? >>> >>> Maybe... >>> The additional equation is the multiplication of the reliability over a >>> specific path in the network (a rather arbitrary, but small subset of >>> the links (e.g. 55 links for a problem with 10000 links)) minus a >>> constant predefined value. >>> This gives me the possibility to convert the formerly constant packet >>> generation (g_i) into a variable. >>> >>> I see..so it is like an equality constraint on a subset of links, not all the links. Presumably these links form a subnetwork that >>> may get assigned to one processor/set of neighboring processors. >>> >>> >>> When adding an additional vertex it works quite good. We will see how it >>> works out when running in parallel. >>> >>> After working on your example I realized that specifying a bidirectional >>> edge as two unidirectional edges in the data may cause problems for the >>> partitioner. I observed that >>> the two undirectional edges may be assigned to different processors >>> although they are connected to the same vertices. This may be a problem >>> when communicating ghost >>> values. Hence, I've modified the data format in the attached links1.txt >>> file to only specify edges via their nodal connectivity and then to >>> specify the type information. >>> I've reworked your source code also accordingly and it gives the same >>> answer as your original code. It gives a wrong answer for parallel runs >>> because of the incorrect >>> ghost value exchanges. Once we have the ADD_PROD insertmode, this code >>> should work fine in parallel too. I think that going forward you should >>> use a similar data format. >>> >>> Good idea, but unfortunately it is not always guaranteed that the edge >>> is bidirectional for the extended formulation of the problem. >>> >>> Are you saying that the directionality could change during the calculation? >>> In your example, the INTERFERING edges are bidirectional >>> while the INFLOWING links are unidirectional. 
By setting up the appropriate relations in the >>> data attached with the edges , you can manage the equations for the >>> edges/vertices. If there is some specific case that cannot be handled then we can take a look at it. >>> >>> What >>> exactly is the problem when the two unidirectional edges are assigned to >>> different processes? >>> >>> I don't quite remember it right now but I recall seeing weird partitions and incorrect ghost exchanges. I'll have to run it once again >>> to produce specific details. >>> >>> Shri >>> >>> A hackish solution might be to add an additional imaginary vertex that >>> is excluded from all other calculations, but that does not seem to be >>> the right way to do it. >>> Greetings, >>> Florian > From William.Coirier at kratosdefense.com Fri Mar 14 16:45:35 2014 From: William.Coirier at kratosdefense.com (William Coirier) Date: Fri, 14 Mar 2014 21:45:35 +0000 Subject: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) Message-ID: I've written a parallel, finite-volume, transient thermal conduction solver using PETSc primitives, and so far things have been going great. Comparisons to theory for a simple problem (transient conduction in a semi-infinite slab) looks good, but I'm not getting very good parallel scaling behavior with the KSP solver. Whether I use the default KSP/PC or other sensible combinations, the time spent in KSPSolve seems to not scale well at all. I seem to have loaded up the problem well enough. The PETSc logging/profiling has been really useful for reworking various code segments, and right now, the bottleneck is KSPSolve, and I can't seem to figure out how to get it to scale properly. I'm attaching output produced with -log_summary, -info, -ksp_view and -pc_view all specified on the command line for 1, 2, 4 and 8 processes. If you guys have any suggestions, I'd definitely like to hear them! And I apologize in advance if I've done something stupid. All the documentation has been really helpful. Thanks in advance... Bill Coirier -------------------------------------------------------------------------------------------------------------------- ***NOTICE*** This e-mail and/or the attached documents may contain technical data within the definition of the International Traffic in Arms Regulations and/or Export Administration Regulations, and are subject to the export control laws of the U.S. Government. Transfer of this data by any means to a foreign person, whether in the United States or abroad, without an export license or other approval from the U.S. Department of State or Commerce, as applicable, is prohibited. No portion of this e-mail or its attachment(s) may be reproduced without written consent of Kratos Defense & Security Solutions, Inc. Any views expressed in this message are those of the individual sender, except where the message states otherwise and the sender is authorized to state them to be the views of any such entity. The information contained in this message and or attachments is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. If you are not the intended recipient or believe that you may have received this document in error, please notify the sender and delete this e-mail and any attachments immediately. -------------- next part -------------- A non-text attachment was scrubbed... 
Name: out.1 Type: application/x-troff-man Size: 197673 bytes Desc: out.1 URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: out.2 Type: application/x-troff-man Size: 399424 bytes Desc: out.2 URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: out.4 Type: application/x-troff-man Size: 536306 bytes Desc: out.4 URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: out.8 Type: application/x-troff-man Size: 810360 bytes Desc: out.8 URL: From balay at mcs.anl.gov Fri Mar 14 19:09:07 2014 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 14 Mar 2014 19:09:07 -0500 Subject: [petsc-users] petsc-3.4.4.tar.gz now available Message-ID: Dear PETSc users, The patch release petsc-3.4.4 is now available for download. http://www.mcs.anl.gov/petsc/download/index.html Some of the changes include: * MatDense: fix overflow with int32*int32 during malloc() * VecView_MPI_DA: fix gsizes bug (bad conversion in parent commit) * MATSNESMF: fix so that MatMFFDSetFunction() works correctly with it * configure: enable hdf5 even when its not built with a compression library * configure: check for ddot_() in chaco and reject it - if found * configure: print download package URL before downloading the package * MatNullSpaceCreateRigidBody: fix array length for 3D * MatMPIAIJGetLocalMat: Correctly check input type of Mat argument * Matlab: Fix Matlab viewer for parallel complex vectors * Mat: Turned on Fortran binding for MatSetValuesBlockedStencil() * Mat: bugfix in MatCreateSeqAIJFromTriple() * SNESLINESEARCHBT: Set the norms when exiting early due to negligible step * configure: detect if winzip is used to extract petsc.tar.gz and error out * configure: check for windows-python and give error message * Mat: Check column block size in MatSetBlockSizes * Mat: add non-square block support for MatSetValuesBlocked * PetscSynchronizedFGets: fix deadlock at EOF * Mat: fix bad LogFlops in MatSOR_SeqSBAIJ. Satish From dafang.wang at jhu.edu Fri Mar 14 20:57:03 2014 From: dafang.wang at jhu.edu (Dafang Wang) Date: Fri, 14 Mar 2014 21:57:03 -0400 Subject: [petsc-users] When does DIVERGED_LINE_SEARCH Happen? Message-ID: <5323B36F.2070305@jhu.edu> Hi, Does anyone know what the error code DIVERGED_LINE_SEARCH means in the SNES nonlinear solve? Or what scenario would lead to this error code? Running a solid mechanics simulation, I found that the occurrence of DIVERGED_LINE_SEARCH was very unpredictable and sensitive to the input values to my nonlinear system, although my system should not be that unstable. As shown by the two examples below, my system diverged in one case and converged in the other, although the input values in these two cases differed by only 1e-4, Moreover, the Newton steps in the two cases were very similar up to NL step 1. Since then, however, Case 1 encountered a line-search divergence whereas Case 2 converged successfully. This is my main confusion. (Note that each residual vector contains 3e04 DOF, so when their L2 norms differ within 1e-4, the two systems should be very close.) My simulation input consists of two scalar values (p1 and p2), each of which acts as a constant pressure boundary condition. 
Case 1, diverge: p1= -10.190869 p2= -2.367555 NL step 0, |residual|_2 = 1.621402e-02 Line search: Using full step: fnorm 1.621401550027e-02 gnorm 7.022558235262e-05 NL step 1, |residual|_2 = 7.022558e-05 Line search: Using full step: fnorm 7.022558235262e-05 gnorm 1.636418730611e-06 NL step 2, |residual|_2 = 1.636419e-06 Nonlinear solve did not converge due to DIVERGED_LINE_SEARCH iterations 2 ------------------------------------------------------------------------ Case 2: converge: p1= -10.190747 p2= -2.367558 NL step 0, |residual|_2 = 1.621380e-02 Line search: Using full step: fnorm 1.621379778276e-02 gnorm 6.976373804153e-05 NL step 1, |residual|_2 = 6.976374e-05 Line search: Using full step: fnorm 6.976373804153e-05 gnorm 4.000992847275e-07 NL step 2, |residual|_2 = 4.000993e-07 Line search: Using full step: fnorm 4.000992847275e-07 gnorm 1.621646014441e-08 NL step 3, |residual|_2 = 1.621646e-08 Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE iterations 3 ------------------------------------------------------------------------ Aside from the input values, the initial solution in both cases may differ very slightly. (Each case is one time step in a time-sequence simulation. The two cases behaved nearly identically up to the last time step before the step shown above, so their initial solutions may differ by a cumulative error but such error should be very small.) Is it possible that little difference in initial guess leads to different local minimum regions where the line search in Case 1 failed? Any comments will be greatly appreciated. Thanks, Dafang -- Dafang Wang, Ph.D Postdoctoral Fellow Institute of Computational Medicine Department of Biomedical Engineering Johns Hopkins University Hackerman Hall Room 218 Baltimore, MD, 21218 -------------- next part -------------- An HTML attachment was scrubbed... URL: From rupp at iue.tuwien.ac.at Sat Mar 15 04:01:10 2014 From: rupp at iue.tuwien.ac.at (Karl Rupp) Date: Sat, 15 Mar 2014 10:01:10 +0100 Subject: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) In-Reply-To: References: Message-ID: <532416D6.2050401@iue.tuwien.ac.at> Hi William, I couldn't find something really suspicious in the logs, so the lack of scalability may be due to hardware limitations. Did you run all MPI processes on the same machine? How many CPU sockets? If it is a single-socket machine, chances are good that you saturate the memory channels pretty well with one process already. With higher process counts the cache per process is reduced, thus reducing cache reuse. This is the only reasonable explanation why the execution time for VecMDot goes up from e.g. 7 seconds for one and two processes to about 24 for four and eight processes. I suggest you try to run the same code across multiple machines if possible, you should see better scalability there. Also, for benchmarking purposes try to replace the ILU preconditioner with e.g. Jacobi, this should give you better scalability (provided that the solver still converges, of course...) Best regards, Karli On 03/14/2014 10:45 PM, William Coirier wrote: > I've written a parallel, finite-volume, transient thermal conduction solver using PETSc primitives, and so far things have been going great. Comparisons to theory for a simple problem (transient conduction in a semi-infinite slab) looks good, but I'm not getting very good parallel scaling behavior with the KSP solver. 
Whether I use the default KSP/PC or other sensible combinations, the time spent in KSPSolve seems to not scale well at all. > > I seem to have loaded up the problem well enough. The PETSc logging/profiling has been really useful for reworking various code segments, and right now, the bottleneck is KSPSolve, and I can't seem to figure out how to get it to scale properly. > > I'm attaching output produced with -log_summary, -info, -ksp_view and -pc_view all specified on the command line for 1, 2, 4 and 8 processes. > > If you guys have any suggestions, I'd definitely like to hear them! And I apologize in advance if I've done something stupid. All the documentation has been really helpful. > > Thanks in advance... > > Bill Coirier > > -------------------------------------------------------------------------------------------------------------------- > > ***NOTICE*** This e-mail and/or the attached documents may contain technical data within the definition of the International Traffic in Arms Regulations and/or Export Administration Regulations, and are subject to the export control laws of the U.S. Government. Transfer of this data by any means to a foreign person, whether in the United States or abroad, without an export license or other approval from the U.S. Department of State or Commerce, as applicable, is prohibited. No portion of this e-mail or its attachment(s) may be reproduced without written consent of Kratos Defense & Security Solutions, Inc. Any views expressed in this message are those of the individual sender, except where the message states otherwise and the sender is authorized to state them to be the views of any such entity. The information contained in this message and or attachments is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. If you are not the intended recipient or believe that you may have received this document in error, please notify the sender and delete this e-mail and any attachments immediately. > From knepley at gmail.com Sat Mar 15 06:43:11 2014 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 15 Mar 2014 06:43:11 -0500 Subject: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) In-Reply-To: <532416D6.2050401@iue.tuwien.ac.at> References: <532416D6.2050401@iue.tuwien.ac.at> Message-ID: On Sat, Mar 15, 2014 at 4:01 AM, Karl Rupp wrote: > Hi William, > > I couldn't find something really suspicious in the logs, so the lack of > scalability may be due to hardware limitations. Did you run all MPI > processes on the same machine? How many CPU sockets? If it is a > single-socket machine, chances are good that you saturate the memory > channels pretty well with one process already. With higher process counts > the cache per process is reduced, thus reducing cache reuse. This is the > only reasonable explanation why the execution time for VecMDot goes up from > e.g. 7 seconds for one and two processes to about 24 for four and eight > processes. > http://www.mcs.anl.gov/petsc/documentation/faq.html#computers > I suggest you try to run the same code across multiple machines if > possible, you should see better scalability there. Also, for benchmarking > purposes try to replace the ILU preconditioner with e.g. Jacobi, this > should give you better scalability (provided that the solver still > converges, of course...) > BJacobi/ASM would be the next thing to try, since it would scale in terms of communication, but not in terms of iterates. 
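Concretely, that comparison can be driven entirely from the command line, for example (the executable name and process count are placeholders; -log_summary is the profiling option already in use in this thread):

   mpiexec -n 8 ./solver -ksp_type gmres -pc_type jacobi                     -log_summary
   mpiexec -n 8 ./solver -ksp_type gmres -pc_type bjacobi -sub_pc_type ilu   -log_summary
   mpiexec -n 8 ./solver -ksp_type gmres -pc_type asm     -sub_pc_type ilu   -log_summary
   mpiexec -n 8 ./solver -ksp_type cg    -pc_type gamg                       -log_summary

Jacobi gives the cleanest per-iteration scaling, block Jacobi/ASM keep the subdomain solves communication-free at the price of an iteration count that grows with the process count, and GAMG is the multilevel option (CG is a natural Krylov choice here since the conduction operator is symmetric positive definite).
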
Eventually you will want a nice multilevel solver for your problem. Matt > Best regards, > Karli > > > > On 03/14/2014 10:45 PM, William Coirier wrote: > >> I've written a parallel, finite-volume, transient thermal conduction >> solver using PETSc primitives, and so far things have been going great. >> Comparisons to theory for a simple problem (transient conduction in a >> semi-infinite slab) looks good, but I'm not getting very good parallel >> scaling behavior with the KSP solver. Whether I use the default KSP/PC or >> other sensible combinations, the time spent in KSPSolve seems to not scale >> well at all. >> >> I seem to have loaded up the problem well enough. The PETSc >> logging/profiling has been really useful for reworking various code >> segments, and right now, the bottleneck is KSPSolve, and I can't seem to >> figure out how to get it to scale properly. >> >> I'm attaching output produced with -log_summary, -info, -ksp_view and >> -pc_view all specified on the command line for 1, 2, 4 and 8 processes. >> >> If you guys have any suggestions, I'd definitely like to hear them! And I >> apologize in advance if I've done something stupid. All the documentation >> has been really helpful. >> >> Thanks in advance... >> >> Bill Coirier >> >> ------------------------------------------------------------ >> -------------------------------------------------------- >> >> ***NOTICE*** This e-mail and/or the attached documents may contain >> technical data within the definition of the International Traffic in Arms >> Regulations and/or Export Administration Regulations, and are subject to >> the export control laws of the U.S. Government. Transfer of this data by >> any means to a foreign person, whether in the United States or abroad, >> without an export license or other approval from the U.S. Department of >> State or Commerce, as applicable, is prohibited. No portion of this e-mail >> or its attachment(s) may be reproduced without written consent of Kratos >> Defense & Security Solutions, Inc. Any views expressed in this message are >> those of the individual sender, except where the message states otherwise >> and the sender is authorized to state them to be the views of any such >> entity. The information contained in this message and or attachments is >> intended only for the person or entity to which it is addressed and may >> contain confidential and/or privileged material. If you >> > are not the intended recipient or believe that you may have received this > document in error, please notify the sender and delete this e-mail and any > attachments immediately. > >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From William.Coirier at kratosdefense.com Sat Mar 15 09:08:12 2014 From: William.Coirier at kratosdefense.com (William Coirier) Date: Sat, 15 Mar 2014 14:08:12 +0000 Subject: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) In-Reply-To: <532416D6.2050401@iue.tuwien.ac.at> References: , <532416D6.2050401@iue.tuwien.ac.at> Message-ID: Thanks Karl. We'll check this out on different architectures. I appreciate your help! Related to all of this, is there an example code in the distribution that might be recommended to use for testing machines for scalabiilty? Perhaps an example from the KSP? Thanks again... Bill C. 
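One ready-made candidate is the SNES tutorial Matt points to in his reply below; a minimal scaling study of it could look like this (directory layout and option names as in petsc-3.4, grid size purely illustrative):

   cd $PETSC_DIR/src/snes/examples/tutorials
   make ex5
   mpiexec -n 1 ./ex5 -da_grid_x 512 -da_grid_y 512 -log_summary
   mpiexec -n 2 ./ex5 -da_grid_x 512 -da_grid_y 512 -log_summary
   mpiexec -n 4 ./ex5 -da_grid_x 512 -da_grid_y 512 -log_summary
   mpiexec -n 8 ./ex5 -da_grid_x 512 -da_grid_y 512 -log_summary

Comparing the MatMult and vector-reduction rows of the -log_summary tables across these runs separates memory-bandwidth saturation on a single box from genuine algorithmic growth.
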
________________________________________ From: Karl Rupp [rupp at iue.tuwien.ac.at] Sent: Saturday, March 15, 2014 4:01 AM To: William Coirier Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) Hi William, I couldn't find something really suspicious in the logs, so the lack of scalability may be due to hardware limitations. Did you run all MPI processes on the same machine? How many CPU sockets? If it is a single-socket machine, chances are good that you saturate the memory channels pretty well with one process already. With higher process counts the cache per process is reduced, thus reducing cache reuse. This is the only reasonable explanation why the execution time for VecMDot goes up from e.g. 7 seconds for one and two processes to about 24 for four and eight processes. I suggest you try to run the same code across multiple machines if possible, you should see better scalability there. Also, for benchmarking purposes try to replace the ILU preconditioner with e.g. Jacobi, this should give you better scalability (provided that the solver still converges, of course...) Best regards, Karli On 03/14/2014 10:45 PM, William Coirier wrote: > I've written a parallel, finite-volume, transient thermal conduction solver using PETSc primitives, and so far things have been going great. Comparisons to theory for a simple problem (transient conduction in a semi-infinite slab) looks good, but I'm not getting very good parallel scaling behavior with the KSP solver. Whether I use the default KSP/PC or other sensible combinations, the time spent in KSPSolve seems to not scale well at all. > > I seem to have loaded up the problem well enough. The PETSc logging/profiling has been really useful for reworking various code segments, and right now, the bottleneck is KSPSolve, and I can't seem to figure out how to get it to scale properly. > > I'm attaching output produced with -log_summary, -info, -ksp_view and -pc_view all specified on the command line for 1, 2, 4 and 8 processes. > > If you guys have any suggestions, I'd definitely like to hear them! And I apologize in advance if I've done something stupid. All the documentation has been really helpful. > > Thanks in advance... > > Bill Coirier > > -------------------------------------------------------------------------------------------------------------------- > > ***NOTICE*** This e-mail and/or the attached documents may contain technical data within the definition of the International Traffic in Arms Regulations and/or Export Administration Regulations, and are subject to the export control laws of the U.S. Government. Transfer of this data by any means to a foreign person, whether in the United States or abroad, without an export license or other approval from the U.S. Department of State or Commerce, as applicable, is prohibited. No portion of this e-mail or its attachment(s) may be reproduced without written consent of Kratos Defense & Security Solutions, Inc. Any views expressed in this message are those of the individual sender, except where the message states otherwise and the sender is authorized to state them to be the views of any such entity. The information contained in this message and or attachments is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. 
If you are not the intended recipient or believe that you may have received this document in error, please notify the sender and delete this e-mail and any attachments immediately. > -------------------------------------------------------------------------------------------------------------------- ***NOTICE*** This e-mail and/or the attached documents may contain technical data within the definition of the International Traffic in Arms Regulations and/or Export Administration Regulations, and are subject to the export control laws of the U.S. Government. Transfer of this data by any means to a foreign person, whether in the United States or abroad, without an export license or other approval from the U.S. Department of State or Commerce, as applicable, is prohibited. No portion of this e-mail or its attachment(s) may be reproduced without written consent of Kratos Defense & Security Solutions, Inc. Any views expressed in this message are those of the individual sender, except where the message states otherwise and the sender is authorized to state them to be the views of any such entity. The information contained in this message and or attachments is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. If you are not the intended recipient or believe that you may have received this document in error, please notify the sender and delete this e-mail and any attachments immediately. From knepley at gmail.com Sat Mar 15 09:15:01 2014 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 15 Mar 2014 09:15:01 -0500 Subject: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) In-Reply-To: References: <532416D6.2050401@iue.tuwien.ac.at> Message-ID: On Sat, Mar 15, 2014 at 9:08 AM, William Coirier < William.Coirier at kratosdefense.com> wrote: > Thanks Karl. We'll check this out on different architectures. I appreciate > your help! > > Related to all of this, is there an example code in the distribution that > might be recommended to use for testing machines for scalabiilty? Perhaps > an example from the KSP? > SNES ex5 is a very simple code and easy to see what is happening on the architecture. I use it to start. You can increase the grid size using -da_grid_x M -da_grid_y N. Matt > Thanks again... > > Bill C. > ________________________________________ > From: Karl Rupp [rupp at iue.tuwien.ac.at] > Sent: Saturday, March 15, 2014 4:01 AM > To: William Coirier > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing > something wrong...) > > Hi William, > > I couldn't find something really suspicious in the logs, so the lack of > scalability may be due to hardware limitations. Did you run all MPI > processes on the same machine? How many CPU sockets? If it is a > single-socket machine, chances are good that you saturate the memory > channels pretty well with one process already. With higher process > counts the cache per process is reduced, thus reducing cache reuse. This > is the only reasonable explanation why the execution time for VecMDot > goes up from e.g. 7 seconds for one and two processes to about 24 for > four and eight processes. > > I suggest you try to run the same code across multiple machines if > possible, you should see better scalability there. Also, for > benchmarking purposes try to replace the ILU preconditioner with e.g. > Jacobi, this should give you better scalability (provided that the > solver still converges, of course...) 
> > Best regards, > Karli > > > On 03/14/2014 10:45 PM, William Coirier wrote: > > I've written a parallel, finite-volume, transient thermal conduction > solver using PETSc primitives, and so far things have been going great. > Comparisons to theory for a simple problem (transient conduction in a > semi-infinite slab) looks good, but I'm not getting very good parallel > scaling behavior with the KSP solver. Whether I use the default KSP/PC or > other sensible combinations, the time spent in KSPSolve seems to not scale > well at all. > > > > I seem to have loaded up the problem well enough. The PETSc > logging/profiling has been really useful for reworking various code > segments, and right now, the bottleneck is KSPSolve, and I can't seem to > figure out how to get it to scale properly. > > > > I'm attaching output produced with -log_summary, -info, -ksp_view and > -pc_view all specified on the command line for 1, 2, 4 and 8 processes. > > > > If you guys have any suggestions, I'd definitely like to hear them! And > I apologize in advance if I've done something stupid. All the documentation > has been really helpful. > > > > Thanks in advance... > > > > Bill Coirier > > > > > -------------------------------------------------------------------------------------------------------------------- > > > > ***NOTICE*** This e-mail and/or the attached documents may contain > technical data within the definition of the International Traffic in Arms > Regulations and/or Export Administration Regulations, and are subject to > the export control laws of the U.S. Government. Transfer of this data by > any means to a foreign person, whether in the United States or abroad, > without an export license or other approval from the U.S. Department of > State or Commerce, as applicable, is prohibited. No portion of this e-mail > or its attachment(s) may be reproduced without written consent of Kratos > Defense & Security Solutions, Inc. Any views expressed in this message are > those of the individual sender, except where the message states otherwise > and the sender is authorized to state them to be the views of any such > entity. The information contained in this message and or attachments is > intended only for the person or entity to which it is addressed and may > contain confidential and/or privileged material. If you > are not the intended recipient or believe that you may have received this > document in error, please notify the sender and delete this e-mail and any > attachments immediately. > > > > > > -------------------------------------------------------------------------------------------------------------------- > > ***NOTICE*** This e-mail and/or the attached documents may contain > technical data within the definition of the International Traffic in Arms > Regulations and/or Export Administration Regulations, and are subject to > the export control laws of the U.S. Government. Transfer of this data by > any means to a foreign person, whether in the United States or abroad, > without an export license or other approval from the U.S. Department of > State or Commerce, as applicable, is prohibited. No portion of this e-mail > or its attachment(s) may be reproduced without written consent of Kratos > Defense & Security Solutions, Inc. Any views expressed in this message are > those of the individual sender, except where the message states otherwise > and the sender is authorized to state them to be the views of any such > entity. 
The information contained in this message and or attachments is > intended only for the person or entity to which it is addressed and may > contain confidential and/or privileged material. If you are not the > intended recipient or believe that you may have received this document in > error, please notify the sender and delete this e-mail and any attachments > immediately. > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From William.Coirier at kratosdefense.com Sat Mar 15 09:50:28 2014 From: William.Coirier at kratosdefense.com (William Coirier) Date: Sat, 15 Mar 2014 14:50:28 +0000 Subject: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) In-Reply-To: References: <532416D6.2050401@iue.tuwien.ac.at> , Message-ID: Matt: So if we use the ex5 from SNES, use default ksp/pc and size the problem big enough via the M,N, it should scale? Just checking... ________________________________ From: Matthew Knepley [knepley at gmail.com] Sent: Saturday, March 15, 2014 9:15 AM To: William Coirier Cc: Karl Rupp; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) On Sat, Mar 15, 2014 at 9:08 AM, William Coirier > wrote: Thanks Karl. We'll check this out on different architectures. I appreciate your help! Related to all of this, is there an example code in the distribution that might be recommended to use for testing machines for scalabiilty? Perhaps an example from the KSP? SNES ex5 is a very simple code and easy to see what is happening on the architecture. I use it to start. You can increase the grid size using -da_grid_x M -da_grid_y N. Matt Thanks again... Bill C. ________________________________________ From: Karl Rupp [rupp at iue.tuwien.ac.at] Sent: Saturday, March 15, 2014 4:01 AM To: William Coirier Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) Hi William, I couldn't find something really suspicious in the logs, so the lack of scalability may be due to hardware limitations. Did you run all MPI processes on the same machine? How many CPU sockets? If it is a single-socket machine, chances are good that you saturate the memory channels pretty well with one process already. With higher process counts the cache per process is reduced, thus reducing cache reuse. This is the only reasonable explanation why the execution time for VecMDot goes up from e.g. 7 seconds for one and two processes to about 24 for four and eight processes. I suggest you try to run the same code across multiple machines if possible, you should see better scalability there. Also, for benchmarking purposes try to replace the ILU preconditioner with e.g. Jacobi, this should give you better scalability (provided that the solver still converges, of course...) Best regards, Karli On 03/14/2014 10:45 PM, William Coirier wrote: > I've written a parallel, finite-volume, transient thermal conduction solver using PETSc primitives, and so far things have been going great. Comparisons to theory for a simple problem (transient conduction in a semi-infinite slab) looks good, but I'm not getting very good parallel scaling behavior with the KSP solver. 
Whether I use the default KSP/PC or other sensible combinations, the time spent in KSPSolve seems to not scale well at all. > > I seem to have loaded up the problem well enough. The PETSc logging/profiling has been really useful for reworking various code segments, and right now, the bottleneck is KSPSolve, and I can't seem to figure out how to get it to scale properly. > > I'm attaching output produced with -log_summary, -info, -ksp_view and -pc_view all specified on the command line for 1, 2, 4 and 8 processes. > > If you guys have any suggestions, I'd definitely like to hear them! And I apologize in advance if I've done something stupid. All the documentation has been really helpful. > > Thanks in advance... > > Bill Coirier > > -------------------------------------------------------------------------------------------------------------------- > > ***NOTICE*** This e-mail and/or the attached documents may contain technical data within the definition of the International Traffic in Arms Regulations and/or Export Administration Regulations, and are subject to the export control laws of the U.S. Government. Transfer of this data by any means to a foreign person, whether in the United States or abroad, without an export license or other approval from the U.S. Department of State or Commerce, as applicable, is prohibited. No portion of this e-mail or its attachment(s) may be reproduced without written consent of Kratos Defense & Security Solutions, Inc. Any views expressed in this message are those of the individual sender, except where the message states otherwise and the sender is authorized to state them to be the views of any such entity. The information contained in this message and or attachments is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. If you are not the intended recipient or believe that you may have received this document in error, please notify the sender and delete this e-mail and any attachments immediately. > -------------------------------------------------------------------------------------------------------------------- ***NOTICE*** This e-mail and/or the attached documents may contain technical data within the definition of the International Traffic in Arms Regulations and/or Export Administration Regulations, and are subject to the export control laws of the U.S. Government. Transfer of this data by any means to a foreign person, whether in the United States or abroad, without an export license or other approval from the U.S. Department of State or Commerce, as applicable, is prohibited. No portion of this e-mail or its attachment(s) may be reproduced without written consent of Kratos Defense & Security Solutions, Inc. Any views expressed in this message are those of the individual sender, except where the message states otherwise and the sender is authorized to state them to be the views of any such entity. The information contained in this message and or attachments is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. If you are not the intended recipient or believe that you may have received this document in error, please notify the sender and delete this e-mail and any attachments immediately. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener -------------------------------------------------------------------------------------------------------------------- ***NOTICE*** This e-mail and/or the attached documents may contain technical data within the definition of the International Traffic in Arms Regulations and/or Export Administration Regulations, and are subject to the export control laws of the U.S. Government. Transfer of this data by any means to a foreign person, whether in the United States or abroad, without an export license or other approval from the U.S. Department of State or Commerce, as applicable, is prohibited. No portion of this e-mail or its attachment(s) may be reproduced without written consent of Kratos Defense & Security Solutions, Inc. Any views expressed in this message are those of the individual sender, except where the message states otherwise and the sender is authorized to state them to be the views of any such entity. The information contained in this message and or attachments is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. If you are not the intended recipient or believe that you may have received this document in error, please notify the sender and delete this e-mail and any attachments immediately. From knepley at gmail.com Sat Mar 15 10:04:36 2014 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 15 Mar 2014 10:04:36 -0500 Subject: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) In-Reply-To: References: <532416D6.2050401@iue.tuwien.ac.at> Message-ID: On Sat, Mar 15, 2014 at 9:50 AM, William Coirier < William.Coirier at kratosdefense.com> wrote: > Matt: > > So if we use the ex5 from SNES, use default ksp/pc and size the problem > big enough via the M,N, it should scale? > There are several kinds of scaling. If you use a series of problems, and the default solvers (GMRES/BJacobi/ILU), the operations will scale (VecAXPY, VecDot, MatMult, etc.), but the number of iterates will grow. If you use MG and GAMG, the number of iterates will be constant, but the setup time for GAMG will not be as scalable as the rest. Here is a paper where we talk about scalability of a real code, instead of the toy problems that most CS people use, http://onlinelibrary.wiley.com/doi/10.1002/jgrb.50217/abstract http://arxiv.org/abs/1308.5846 Thanks, Matt > Just checking... > ________________________________ > From: Matthew Knepley [knepley at gmail.com] > Sent: Saturday, March 15, 2014 9:15 AM > To: William Coirier > Cc: Karl Rupp; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing > something wrong...) > > On Sat, Mar 15, 2014 at 9:08 AM, William Coirier < > William.Coirier at kratosdefense.com> > wrote: > Thanks Karl. We'll check this out on different architectures. I appreciate > your help! > > Related to all of this, is there an example code in the distribution that > might be recommended to use for testing machines for scalabiilty? Perhaps > an example from the KSP? > > SNES ex5 is a very simple code and easy to see what is happening on the > architecture. I use it to start. You > can increase the grid size using -da_grid_x M -da_grid_y N. > > Matt > > Thanks again... > > Bill C. > ________________________________________ > From: Karl Rupp [rupp at iue.tuwien.ac.at] > Sent: Saturday, March 15, 2014 4:01 AM > To: William Coirier > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] KSPSolve doesn't seem to scale. 
(Must be doing > something wrong...) > > Hi William, > > I couldn't find something really suspicious in the logs, so the lack of > scalability may be due to hardware limitations. Did you run all MPI > processes on the same machine? How many CPU sockets? If it is a > single-socket machine, chances are good that you saturate the memory > channels pretty well with one process already. With higher process > counts the cache per process is reduced, thus reducing cache reuse. This > is the only reasonable explanation why the execution time for VecMDot > goes up from e.g. 7 seconds for one and two processes to about 24 for > four and eight processes. > > I suggest you try to run the same code across multiple machines if > possible, you should see better scalability there. Also, for > benchmarking purposes try to replace the ILU preconditioner with e.g. > Jacobi, this should give you better scalability (provided that the > solver still converges, of course...) > > Best regards, > Karli > > > On 03/14/2014 10:45 PM, William Coirier wrote: > > I've written a parallel, finite-volume, transient thermal conduction > solver using PETSc primitives, and so far things have been going great. > Comparisons to theory for a simple problem (transient conduction in a > semi-infinite slab) looks good, but I'm not getting very good parallel > scaling behavior with the KSP solver. Whether I use the default KSP/PC or > other sensible combinations, the time spent in KSPSolve seems to not scale > well at all. > > > > I seem to have loaded up the problem well enough. The PETSc > logging/profiling has been really useful for reworking various code > segments, and right now, the bottleneck is KSPSolve, and I can't seem to > figure out how to get it to scale properly. > > > > I'm attaching output produced with -log_summary, -info, -ksp_view and > -pc_view all specified on the command line for 1, 2, 4 and 8 processes. > > > > If you guys have any suggestions, I'd definitely like to hear them! And > I apologize in advance if I've done something stupid. All the documentation > has been really helpful. > > > > Thanks in advance... > > > > Bill Coirier > > > > > -------------------------------------------------------------------------------------------------------------------- > > > > ***NOTICE*** This e-mail and/or the attached documents may contain > technical data within the definition of the International Traffic in Arms > Regulations and/or Export Administration Regulations, and are subject to > the export control laws of the U.S. Government. Transfer of this data by > any means to a foreign person, whether in the United States or abroad, > without an export license or other approval from the U.S. Department of > State or Commerce, as applicable, is prohibited. No portion of this e-mail > or its attachment(s) may be reproduced without written consent of Kratos > Defense & Security Solutions, Inc. Any views expressed in this message are > those of the individual sender, except where the message states otherwise > and the sender is authorized to state them to be the views of any such > entity. The information contained in this message and or attachments is > intended only for the person or entity to which it is addressed and may > contain confidential and/or privileged material. If you > are not the intended recipient or believe that you may have received this > document in error, please notify the sender and delete this e-mail and any > attachments immediately. 
> > > > > > -------------------------------------------------------------------------------------------------------------------- > > ***NOTICE*** This e-mail and/or the attached documents may contain > technical data within the definition of the International Traffic in Arms > Regulations and/or Export Administration Regulations, and are subject to > the export control laws of the U.S. Government. Transfer of this data by > any means to a foreign person, whether in the United States or abroad, > without an export license or other approval from the U.S. Department of > State or Commerce, as applicable, is prohibited. No portion of this e-mail > or its attachment(s) may be reproduced without written consent of Kratos > Defense & Security Solutions, Inc. Any views expressed in this message are > those of the individual sender, except where the message states otherwise > and the sender is authorized to state them to be the views of any such > entity. The information contained in this message and or attachments is > intended only for the person or entity to which it is addressed and may > contain confidential and/or privileged material. If you are not the > intended recipient or believe that you may have received this document in > error, please notify the sender and delete this e-mail and any attachments > immediately. > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > -------------------------------------------------------------------------------------------------------------------- > > ***NOTICE*** This e-mail and/or the attached documents may contain > technical data within the definition of the International Traffic in Arms > Regulations and/or Export Administration Regulations, and are subject to > the export control laws of the U.S. Government. Transfer of this data by > any means to a foreign person, whether in the United States or abroad, > without an export license or other approval from the U.S. Department of > State or Commerce, as applicable, is prohibited. No portion of this e-mail > or its attachment(s) may be reproduced without written consent of Kratos > Defense & Security Solutions, Inc. Any views expressed in this message are > those of the individual sender, except where the message states otherwise > and the sender is authorized to state them to be the views of any such > entity. The information contained in this message and or attachments is > intended only for the person or entity to which it is addressed and may > contain confidential and/or privileged material. If you are not the > intended recipient or believe that you may have received this document in > error, please notify the sender and delete this e-mail and any attachments > immediately. > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Mar 15 10:15:12 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 15 Mar 2014 10:15:12 -0500 Subject: [petsc-users] When does DIVERGED_LINE_SEARCH Happen? In-Reply-To: <5323B36F.2070305@jhu.edu> References: <5323B36F.2070305@jhu.edu> Message-ID: <3C53DDCF-455A-4A28-A998-28CC7D96CC7E@mcs.anl.gov> Failed line search are almost always due to an incorrect Jacobian. 
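For reference, the FAQ suggestions mentioned just below boil down to runs of this kind (option names as of petsc-3.4 -- later releases replace the test SNES type with -snes_test_jacobian -- and the executable name is a placeholder):

   ./mysolver -snes_type test -snes_test_display        # compare the hand-coded Jacobian against finite differences
   ./mysolver -snes_mf_operator -pc_type lu             # Newton with finite-difference J-action, hand-coded J only as preconditioner
   ./mysolver -snes_monitor -snes_linesearch_monitor -ksp_monitor_true_residual -snes_converged_reason

If the first run reports large differences, or the second one converges where the original run stalls, the hand-coded Jacobian is the likely culprit.
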
Please let us know if the suggestions at http://www.mcs.anl.gov/petsc/documentation/faq.html#newton don?t help. Barry On Mar 14, 2014, at 8:57 PM, Dafang Wang wrote: > Hi, > > Does anyone know what the error code DIVERGED_LINE_SEARCH means in the SNES nonlinear solve? Or what scenario would lead to this error code? > > Running a solid mechanics simulation, I found that the occurrence of DIVERGED_LINE_SEARCH was very unpredictable and sensitive to the input values to my nonlinear system, although my system should not be that unstable. As shown by the two examples below, my system diverged in one case and converged in the other, although the input values in these two cases differed by only 1e-4, > > Moreover, the Newton steps in the two cases were very similar up to NL step 1. Since then, however, Case 1 encountered a line-search divergence whereas Case 2 converged successfully. This is my main confusion. (Note that each residual vector contains 3e04 DOF, so when their L2 norms differ within 1e-4, the two systems should be very close.) > > My simulation input consists of two scalar values (p1 and p2), each of which acts as a constant pressure boundary condition. > > Case 1, diverge: > p1= -10.190869 p2= -2.367555 > NL step 0, |residual|_2 = 1.621402e-02 > Line search: Using full step: fnorm 1.621401550027e-02 gnorm 7.022558235262e-05 > NL step 1, |residual|_2 = 7.022558e-05 > Line search: Using full step: fnorm 7.022558235262e-05 gnorm 1.636418730611e-06 > NL step 2, |residual|_2 = 1.636419e-06 > Nonlinear solve did not converge due to DIVERGED_LINE_SEARCH iterations 2 > Case 2: converge: > p1= -10.190747 p2= -2.367558 > NL step 0, |residual|_2 = 1.621380e-02 > Line search: Using full step: fnorm 1.621379778276e-02 gnorm 6.976373804153e-05 > NL step 1, |residual|_2 = 6.976374e-05 > Line search: Using full step: fnorm 6.976373804153e-05 gnorm 4.000992847275e-07 > NL step 2, |residual|_2 = 4.000993e-07 > Line search: Using full step: fnorm 4.000992847275e-07 gnorm 1.621646014441e-08 > NL step 3, |residual|_2 = 1.621646e-08 > Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE iterations 3 > > Aside from the input values, the initial solution in both cases may differ very slightly. (Each case is one time step in a time-sequence simulation. The two cases behaved nearly identically up to the last time step before the step shown above, so their initial solutions may differ by a cumulative error but such error should be very small.) Is it possible that little difference in initial guess leads to different local minimum regions where the line search in Case 1 failed? > > Any comments will be greatly appreciated. 
> > Thanks, > Dafang > -- > Dafang Wang, Ph.D > Postdoctoral Fellow > Institute of Computational Medicine > Department of Biomedical Engineering > Johns Hopkins University > Hackerman Hall Room 218 > Baltimore, MD, 21218 From salazardetroya at gmail.com Sat Mar 15 15:31:57 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Sat, 15 Mar 2014 15:31:57 -0500 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: Hello everybody I keep trying to understand this example. I don't have any problems with this example when I run it like this: [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full -show_solution Number of SNES iterations = 5 L_2 Error: 0.107289 Solution Vec Object: 1 MPI processes type: seq 0.484618 However, when I change the boundary conditions to Neumann, I get this error. [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full -show_solution [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Petsc has generated inconsistent data [0]PETSC ERROR: Number of dual basis vectors 0 not equal to dimension 1 [0]PETSC ERROR: See http:// http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b GIT Date: 2014-03-04 10:53:30 -0600 [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sat Mar 15 14:28:05 2014 [0]PETSC ERROR: Configure options --download-mpich --download-scientificpython --download-triangle --download-ctetgen --download-chaco --with-c2html=0 [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c [0]PETSC ERROR: #3 SetupElementCommon() line 474 in /home/salaza11/workspace/PETSC/ex12.c [0]PETSC ERROR: #4 SetupBdElement() line 559 in /home/salaza11/workspace/PETSC/ex12.c [0]PETSC ERROR: #5 main() line 755 in /home/salaza11/workspace/PETSC/ex12.c [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 [unset]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 I honestly do not know much about using dual spaces in a finite element context. I have been trying to find some material that could help me without much success. I tried to modify the dual space order with the option -petscdualspace_order but I kept getting errors. In particular, I got this when I set it to 1. 
[salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full -show_solution -petscdualspace_order 1 [0]PETSC ERROR: PetscTrFreeDefault() called from PetscFESetUp_Basic() line 2492 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted (probably write past end of array) [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Memory corruption: http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind [0]PETSC ERROR: Corrupted memory [0]PETSC ERROR: See http:// http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b GIT Date: 2014-03-04 10:53:30 -0600 [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sat Mar 15 14:37:34 2014 [0]PETSC ERROR: Configure options --download-mpich --download-scientificpython --download-triangle --download-ctetgen --download-chaco --with-c2html=0 [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in /home/salaza11/petsc/src/sys/memory/mtr.c [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c [0]PETSC ERROR: #4 SetupElementCommon() line 482 in /home/salaza11/workspace/PETSC/ex12.c [0]PETSC ERROR: #5 SetupElement() line 506 in /home/salaza11/workspace/PETSC/ex12.c [0]PETSC ERROR: #6 main() line 754 in /home/salaza11/workspace/PETSC/ex12.c [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 [unset]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 [salaza11 at maya PETSC]$ Then again, I do not know much what I am doing given my ignorance with respect to the dual spaces in FE. I apologize for that. My questions are: - Where could I find more resources in order to understand the PETSc implementation of dual spaces for FE? - Why does it run with Dirichlet but not with Neumann? Thanks in advance. Miguel. On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley wrote: > On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley wrote: > >> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> I can run it now, thanks. Although if I run it with valgrind 3.5.0 >>> (should I update to the last version?) I get some memory leaks related with >>> the function DMPlexCreateBoxMesh. >>> >> >> I will check it out. >> > > This is now fixed. > > Thanks for finding it > > Matt > > >> Thanks, >> >> Matt >> >> >>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 -run_type >>> test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>> ==9625== Memcheck, a memory error detector >>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. 
>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright >>> info >>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>> -dm_plex_print_fem 1 >>> ==9625== >>> Local function: >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> 0.25 >>> 1 >>> 0.25 >>> 0.5 >>> 1.25 >>> 1 >>> 1.25 >>> 2 >>> Initial guess >>> Vec Object: 1 MPI processes >>> type: seq >>> 0.5 >>> L_2 Error: 0.111111 >>> Residual: >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> 0 >>> Initial Residual >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> L_2 Residual: 0 >>> Jacobian: >>> Mat Object: 1 MPI processes >>> type: seqaij >>> row 0: (0, 4) >>> Residual: >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> 0 >>> 0 >>> 0 >>> -2 >>> 0 >>> 0 >>> 0 >>> 0 >>> Au - b = Au + F(0) >>> Vec Object: 1 MPI processes >>> type: seq >>> 0 >>> Linear L_2 Residual: 0 >>> ==9625== >>> ==9625== HEAP SUMMARY: >>> ==9625== in use at exit: 288 bytes in 3 blocks >>> ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 bytes >>> allocated >>> ==9625== >>> ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 of 3 >>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>> ==9625== by 0x408D3D: main (ex12.c:651) >>> ==9625== >>> ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 of 3 >>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>> ==9625== by 0x408D3D: main (ex12.c:651) >>> ==9625== >>> ==9625== 144 bytes in 1 blocks are definitely lost in loss record 3 of 3 >>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >>> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>> ==9625== by 0x408D3D: main (ex12.c:651) >>> ==9625== >>> ==9625== LEAK SUMMARY: >>> ==9625== definitely lost: 288 bytes in 3 blocks >>> ==9625== indirectly lost: 0 bytes in 0 blocks >>> ==9625== possibly lost: 0 bytes in 0 blocks >>> ==9625== still reachable: 0 bytes in 0 blocks >>> ==9625== suppressed: 0 bytes in 0 blocks >>> ==9625== >>> ==9625== For counts of detected and suppressed errors, rerun with: -v >>> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 from 6) >>> >>> >>> >>> >>> >>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley wrote: >>> >>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>>> salazardetroya at gmail.com> wrote: >>>> >>>>> You are welcome, thanks for your help. 
>>>>> >>>> >>>> Okay, I have rebuilt completely clean, and ex12 runs for me. Can you >>>> try again after pulling? >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley wrote: >>>>> >>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>>>>> salazardetroya at gmail.com> wrote: >>>>>> >>>>>>> Thanks. This is what I get. >>>>>>> >>>>>> >>>>>> Okay, this was broken by a new push to master/next in the last few >>>>>> days. I have pushed a fix, >>>>>> however next is currently broken due to a failure to check in a file. >>>>>> This should be fixed shortly, >>>>>> and then ex12 will work. I will mail you when its ready. >>>>>> >>>>>> Thanks for finding this, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> (gdb) cont >>>>>>> Continuing. >>>>>>> >>>>>>> Program received signal SIGSEGV, Segmentation fault. >>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>> X=0x168b5b0, >>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>> user=0x7fd6811be509) >>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >>>>>>> (gdb) where >>>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>> X=0x168b5b0, >>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>> user=0x7fd6811be509) >>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>>> (snes=0x14e9450, >>>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) >>>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>>>>> X=0x1622ad0, >>>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>> >>>>>>>>> This is what I get at gdb when I type 'where'. >>>>>>>>> >>>>>>>> >>>>>>>> You have to type 'cont', and then when it fails you type 'where'. 
>>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>>> /lib64/libc.so.6 >>>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial >>>>>>>>> finite elements.\nWe solve the Poisson problem in a rectangular\ndomain, >>>>>>>>> using a parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>>> >>>>>>>>> The rest of the gdb output is attached. I am a bit ignorant with >>>>>>>>> gdb, I apologize for that. >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley < >>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> Thanks for your response. Sorry I did not have the "next" >>>>>>>>>>> version, but the "master" version. I still have an error though. I followed >>>>>>>>>>> the steps given here ( >>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the next >>>>>>>>>>> version, I configured petsc as above and ran ex12 as above as well, getting >>>>>>>>>>> this error: >>>>>>>>>>> >>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>> Local function: >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> 0.25 >>>>>>>>>>> 1 >>>>>>>>>>> 0.25 >>>>>>>>>>> 0.5 >>>>>>>>>>> 1.25 >>>>>>>>>>> 1 >>>>>>>>>>> 1.25 >>>>>>>>>>> 2 >>>>>>>>>>> Initial guess >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0.5 >>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>> Residual: >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> Initial Residual >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Okay, now run with -start_in_debugger, and give me a stack trace >>>>>>>>>> using 'where'. 
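Spelled out, that debugging workflow is roughly the following (a sketch; the ex12
options are the ones already used in this thread, and any PETSc executable behaves
the same way):

   ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 \
          -petscspace_order 1 -start_in_debugger noxterm
   (gdb) cont      # continue until the segmentation fault is raised
   (gdb) where     # print the stack trace at the point of failure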
>>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>> memory corruption errors >>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>> ------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>> not available, >>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>> the function >>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>>>>>>> shooting. >>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>> salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown >>>>>>>>>>> file >>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya < >>>>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hi everybody >>>>>>>>>>>>> >>>>>>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>>>>>> specifically run it with the command options: >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> We need to start narrowing down differences, because it runs >>>>>>>>>>>> for me and our nightly tests. So, first can >>>>>>>>>>>> you confirm that you are using the latest 'next' branch? 
>>>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>> >>>>>>>>>>>>> And I get this output >>>>>>>>>>>>> >>>>>>>>>>>>> Local function: >>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>> type: seq >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 1 >>>>>>>>>>>>> 1 >>>>>>>>>>>>> 2 >>>>>>>>>>>>> 1 >>>>>>>>>>>>> 2 >>>>>>>>>>>>> 2 >>>>>>>>>>>>> 3 >>>>>>>>>>>>> Initial guess >>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>> type: seq >>>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>>> Residual: >>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>> type: seq >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>> type: seq >>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>>> memory corruption errors >>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>>>> not available, >>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>>>> the function >>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>> shooting. >>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>>> salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown >>>>>>>>>>>>> file >>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Probably my problems could be on my configuration. I attach >>>>>>>>>>>>> the configure.log. I ran ./configure like this >>>>>>>>>>>>> >>>>>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> If >>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> is for serial, any chance we can get the options to run in >>>>>>>>>>>>>>> parallel? 
>>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Regards >>>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> These examples all seem to run excepting the following >>>>>>>>>>>>>>>>>> command, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> This is a build problem, but it should affect all the >>>>>>>>>>>>>>>>> runs. Is this reproducible? Can you send configure.log? MKL is the worst. >>>>>>>>>>>>>>>>> If this >>>>>>>>>>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> --with-precision=single # I would not use this unless you >>>>>>>>>>>>>>>> are doing something special, like CUDA >>>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching to C, >>>>>>>>>>>>>>>> the build is much faster >>>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. I >>>>>>>>>>>>>>>> would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>>> or you can try and find that library. 
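Put together, a reconfigure along those lines might look like this (a sketch; the
prefix and the package list should of course be adjusted to the local setup):

   ./configure --prefix=/home/mjonesa/local --with-clanguage=c \
               --download-f2cblaslapack --download-triangle --with-c2html=0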
>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Hi, This is the next error message after configuring >>>>>>>>>>>>>>>>>>> and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try >>>>>>>>>>>>>>>>>> running >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating >>>>>>>>>>>>>>>>>>> Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 2014 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. 
>>>>>>>>>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>>>>>>>>> error messages below: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much >>>>>>>>>>>>>>>>>>> easier to have triangle create them. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package >>>>>>>>>>>>>>>>>>>> support. >>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 >>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - >>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley 
[knepley at gmail.com] >>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct entry. >>>>>>>>>>>>>>>>>>>>> when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or >>>>>>>>>>>>>>>>>>>>> directory >>>>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to work >>>>>>>>>>>>>>>>>>>> for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and >>>>>>>>>>>>>>>>>>>>>>> just did a 'make ex12.c' with the following error if this helps? 
: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR ( >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or PETSC_ARCH ( >>>>>>>>>>>>>>>>>>>>>> linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want to >>>>>>>>>>>>>>>>>>>>>> use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import >>>>>>>>>>>>>>>>>>>>>>> default_simplex >>>>>>>>>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating the 
>>>>>>>>>>>>>>>>>>>>>>> header file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere (including >>>>>>>>>>>>>>>>>>>>>>> the latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the >>>>>>>>>>>>>>>>>>>>>>> old docs? >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>> their experiments lead. 
>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -- >>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -- >>>>>>>>>>>>> *Miguel Angel Salazar de Troya* >>>>>>>>>>>>> Graduate Research Assistant >>>>>>>>>>>>> Department of Mechanical Science and Engineering >>>>>>>>>>>>> University of Illinois at Urbana-Champaign >>>>>>>>>>>>> (217) 550-2360 >>>>>>>>>>>>> salaza11 at illinois.edu >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. >>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> *Miguel Angel Salazar de Troya* >>>>>>>>>>> Graduate Research Assistant >>>>>>>>>>> Department of Mechanical Science and Engineering >>>>>>>>>>> University of Illinois at Urbana-Champaign >>>>>>>>>>> (217) 550-2360 >>>>>>>>>>> salaza11 at illinois.edu >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> *Miguel Angel Salazar de Troya* >>>>>>>>> Graduate Research Assistant >>>>>>>>> Department of Mechanical Science and Engineering >>>>>>>>> University of Illinois at Urbana-Champaign >>>>>>>>> (217) 550-2360 >>>>>>>>> salaza11 at illinois.edu >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> *Miguel Angel Salazar de Troya* >>>>>>> Graduate Research Assistant >>>>>>> Department of Mechanical Science and Engineering >>>>>>> University of Illinois at Urbana-Champaign >>>>>>> (217) 550-2360 >>>>>>> salaza11 at illinois.edu >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> *Miguel Angel Salazar de Troya* >>>>> Graduate Research Assistant >>>>> Department of Mechanical Science and Engineering >>>>> University of Illinois at Urbana-Champaign >>>>> (217) 550-2360 >>>>> salaza11 at illinois.edu >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. 
>>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> *Miguel Angel Salazar de Troya* >>> Graduate Research Assistant >>> Department of Mechanical Science and Engineering >>> University of Illinois at Urbana-Champaign >>> (217) 550-2360 >>> salaza11 at illinois.edu >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sat Mar 15 15:36:27 2014 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 15 Mar 2014 15:36:27 -0500 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Sat, Mar 15, 2014 at 3:31 PM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > Hello everybody > > I keep trying to understand this example. I don't have any problems with > this example when I run it like this: > > [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate > -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full > -show_solution > Number of SNES iterations = 5 > L_2 Error: 0.107289 > Solution > Vec Object: 1 MPI processes > type: seq > 0.484618 > > However, when I change the boundary conditions to Neumann, I get this > error. > > [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 > -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full > -show_solution > Here you set the order of the element used in bulk, but not on the boundary where you condition is, so it defaults to 0. In order to become more familiar, take a look at the tests that I run here: https://bitbucket.org/petsc/petsc/src/64715f0f033346c10c77b73cf58216d111db8789/config/builder.py?at=master#cl-216 Matt [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Petsc has generated inconsistent data > [0]PETSC ERROR: Number of dual basis vectors 0 not equal to dimension 1 > [0]PETSC ERROR: See http:// > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
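Concretely, that means also giving the boundary element a nonzero order when running
with -bc_type neumann. A hedged sketch of such a run (the -bd_petscspace_order
spelling is an assumption based on the separate options prefix used for the boundary
element in this version of ex12; the builder.py tests linked above are the
authoritative list of working option sets):

   ./ex12 -bc_type neumann -interpolate 1 -dim 2 -run_type full \
          -variable_coefficient nonlinear -petscspace_order 2 \
          -bd_petscspace_order 2 -show_solution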
> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b GIT > Date: 2014-03-04 10:53:30 -0600 > [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sat > Mar 15 14:28:05 2014 > [0]PETSC ERROR: Configure options --download-mpich > --download-scientificpython --download-triangle --download-ctetgen > --download-chaco --with-c2html=0 > [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in > /home/salaza11/petsc/src/dm/dt/interface/dtfe.c > [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in > /home/salaza11/petsc/src/dm/dt/interface/dtfe.c > [0]PETSC ERROR: #3 SetupElementCommon() line 474 in > /home/salaza11/workspace/PETSC/ex12.c > [0]PETSC ERROR: #4 SetupBdElement() line 559 in > /home/salaza11/workspace/PETSC/ex12.c > [0]PETSC ERROR: #5 main() line 755 in /home/salaza11/workspace/PETSC/ex12.c > [0]PETSC ERROR: ----------------End of Error Message -------send entire > error message to petsc-maint at mcs.anl.gov---------- > application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 > [unset]: aborting job: > application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 > > I honestly do not know much about using dual spaces in a finite element > context. I have been trying to find some material that could help me > without much success. I tried to modify the dual space order with the > option -petscdualspace_order but I kept getting errors. In particular, I > got this when I set it to 1. > > [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 > -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full > -show_solution -petscdualspace_order 1 > [0]PETSC ERROR: PetscTrFreeDefault() called from PetscFESetUp_Basic() line > 2492 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c > [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted > (probably write past end of array) > [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483 in > /home/salaza11/petsc/src/dm/dt/interface/dtfe.c > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Memory corruption: > http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind > [0]PETSC ERROR: Corrupted memory > [0]PETSC ERROR: See http:// > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b GIT > Date: 2014-03-04 10:53:30 -0600 > [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sat > Mar 15 14:37:34 2014 > [0]PETSC ERROR: Configure options --download-mpich > --download-scientificpython --download-triangle --download-ctetgen > --download-chaco --with-c2html=0 > [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in > /home/salaza11/petsc/src/sys/memory/mtr.c > [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in > /home/salaza11/petsc/src/dm/dt/interface/dtfe.c > [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in > /home/salaza11/petsc/src/dm/dt/interface/dtfe.c > [0]PETSC ERROR: #4 SetupElementCommon() line 482 in > /home/salaza11/workspace/PETSC/ex12.c > [0]PETSC ERROR: #5 SetupElement() line 506 in > /home/salaza11/workspace/PETSC/ex12.c > [0]PETSC ERROR: #6 main() line 754 in /home/salaza11/workspace/PETSC/ex12.c > [0]PETSC ERROR: ----------------End of Error Message -------send entire > error message to petsc-maint at mcs.anl.gov---------- > application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 > [unset]: aborting job: > application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 > [salaza11 at maya PETSC]$ > > > Then again, I do not know much what I am doing given my ignorance with > respect to the dual spaces in FE. I apologize for that. My questions are: > > - Where could I find more resources in order to understand the PETSc > implementation of dual spaces for FE? > - Why does it run with Dirichlet but not with Neumann? > > Thanks in advance. > Miguel. > > > On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley wrote: > >> On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley wrote: >> >>> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < >>> salazardetroya at gmail.com> wrote: >>> >>>> I can run it now, thanks. Although if I run it with valgrind 3.5.0 >>>> (should I update to the last version?) I get some memory leaks related with >>>> the function DMPlexCreateBoxMesh. >>>> >>> >>> I will check it out. >>> >> >> This is now fixed. >> >> Thanks for finding it >> >> Matt >> >> >>> Thanks, >>> >>> Matt >>> >>> >>>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 -run_type >>>> test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>> ==9625== Memcheck, a memory error detector >>>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. 
>>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright >>>> info >>>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>> -dm_plex_print_fem 1 >>>> ==9625== >>>> Local function: >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> 0.25 >>>> 1 >>>> 0.25 >>>> 0.5 >>>> 1.25 >>>> 1 >>>> 1.25 >>>> 2 >>>> Initial guess >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0.5 >>>> L_2 Error: 0.111111 >>>> Residual: >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> Initial Residual >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> L_2 Residual: 0 >>>> Jacobian: >>>> Mat Object: 1 MPI processes >>>> type: seqaij >>>> row 0: (0, 4) >>>> Residual: >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> -2 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> Au - b = Au + F(0) >>>> Vec Object: 1 MPI processes >>>> type: seq >>>> 0 >>>> Linear L_2 Residual: 0 >>>> ==9625== >>>> ==9625== HEAP SUMMARY: >>>> ==9625== in use at exit: 288 bytes in 3 blocks >>>> ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 bytes >>>> allocated >>>> ==9625== >>>> ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 of 3 >>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>> ==9625== >>>> ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 of 3 >>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>> ==9625== >>>> ==9625== 144 bytes in 1 blocks are definitely lost in loss record 3 of 3 >>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >>>> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>> ==9625== >>>> ==9625== LEAK SUMMARY: >>>> ==9625== definitely lost: 288 bytes in 3 blocks >>>> ==9625== indirectly lost: 0 bytes in 0 blocks >>>> ==9625== possibly lost: 0 bytes in 0 blocks >>>> ==9625== still reachable: 0 bytes in 0 blocks >>>> ==9625== suppressed: 0 bytes in 0 blocks >>>> ==9625== >>>> ==9625== For counts of detected and suppressed errors, rerun with: -v >>>> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 from 6) >>>> >>>> >>>> >>>> >>>> >>>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley wrote: >>>> >>>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>>>> salazardetroya at 
gmail.com> wrote: >>>>> >>>>>> You are welcome, thanks for your help. >>>>>> >>>>> >>>>> Okay, I have rebuilt completely clean, and ex12 runs for me. Can you >>>>> try again after pulling? >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>>>>>> salazardetroya at gmail.com> wrote: >>>>>>> >>>>>>>> Thanks. This is what I get. >>>>>>>> >>>>>>> >>>>>>> Okay, this was broken by a new push to master/next in the last few >>>>>>> days. I have pushed a fix, >>>>>>> however next is currently broken due to a failure to check in a >>>>>>> file. This should be fixed shortly, >>>>>>> and then ex12 will work. I will mail you when its ready. >>>>>>> >>>>>>> Thanks for finding this, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> (gdb) cont >>>>>>>> Continuing. >>>>>>>> >>>>>>>> Program received signal SIGSEGV, Segmentation fault. >>>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>> X=0x168b5b0, >>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>>> user=0x7fd6811be509) >>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >>>>>>>> (gdb) where >>>>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>> X=0x168b5b0, >>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>>> user=0x7fd6811be509) >>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>>>> (snes=0x14e9450, >>>>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) >>>>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>>>>>> X=0x1622ad0, >>>>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley wrote: >>>>>>>> >>>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> This is what I get at gdb when I type 'where'. >>>>>>>>>> >>>>>>>>> >>>>>>>>> You have to type 'cont', and then when it fails you type 'where'. 
>>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>>>> /lib64/libc.so.6 >>>>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>>>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial >>>>>>>>>> finite elements.\nWe solve the Poisson problem in a rectangular\ndomain, >>>>>>>>>> using a parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>>>> >>>>>>>>>> The rest of the gdb output is attached. I am a bit ignorant with >>>>>>>>>> gdb, I apologize for that. >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya < >>>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> Thanks for your response. Sorry I did not have the "next" >>>>>>>>>>>> version, but the "master" version. I still have an error though. I followed >>>>>>>>>>>> the steps given here ( >>>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the >>>>>>>>>>>> next version, I configured petsc as above and ran ex12 as above as well, >>>>>>>>>>>> getting this error: >>>>>>>>>>>> >>>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>>> Local function: >>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>> type: seq >>>>>>>>>>>> 0 >>>>>>>>>>>> 0.25 >>>>>>>>>>>> 1 >>>>>>>>>>>> 0.25 >>>>>>>>>>>> 0.5 >>>>>>>>>>>> 1.25 >>>>>>>>>>>> 1 >>>>>>>>>>>> 1.25 >>>>>>>>>>>> 2 >>>>>>>>>>>> Initial guess >>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>> type: seq >>>>>>>>>>>> 0.5 >>>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>>> Residual: >>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>> type: seq >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> 0 >>>>>>>>>>>> Initial Residual >>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>> type: seq >>>>>>>>>>>> 0 >>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Okay, now run with -start_in_debugger, and give me a stack trace >>>>>>>>>>> using 'where'. 
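For reference, the debugger workflow being asked for here looks roughly like this (a sketch assuming gdb and a single process; the noxterm variant keeps the debugger in the current terminal instead of opening a new xterm):

  ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 \
      -petscspace_order 1 -start_in_debugger noxterm
  (gdb) cont      <- let the run continue until the SEGV is caught
  (gdb) where     <- print the stack trace at the point of failure
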
>>>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>> memory corruption errors >>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>>> not available, >>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>>> the function >>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for >>>>>>>>>>>> trouble shooting. >>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>> salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown >>>>>>>>>>>> file >>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya >>>>>>>>>>>>> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Hi everybody >>>>>>>>>>>>>> >>>>>>>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>>>>>>> specifically run it with the command options: >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> We need to start narrowing down differences, because it runs >>>>>>>>>>>>> for me and our nightly tests. So, first can >>>>>>>>>>>>> you confirm that you are using the latest 'next' branch? 
>>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>> >>>>>>>>>>>>>> And I get this output >>>>>>>>>>>>>> >>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 1 >>>>>>>>>>>>>> 1 >>>>>>>>>>>>>> 2 >>>>>>>>>>>>>> 1 >>>>>>>>>>>>>> 2 >>>>>>>>>>>>>> 2 >>>>>>>>>>>>>> 3 >>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>>>> memory corruption errors >>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>> below >>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>>>>> not available, >>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>>>>> the function >>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>> updates. >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>>>> salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown >>>>>>>>>>>>>> file >>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Probably my problems could be on my configuration. I attach >>>>>>>>>>>>>> the configure.log. I ran ./configure like this >>>>>>>>>>>>>> >>>>>>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> If >>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> is for serial, any chance we can get the options to run in >>>>>>>>>>>>>>>> parallel? 
>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Regards >>>>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander < >>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> These examples all seem to run excepting the following >>>>>>>>>>>>>>>>>>> command, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> This is a build problem, but it should affect all the >>>>>>>>>>>>>>>>>> runs. Is this reproducible? Can you send configure.log? MKL is the worst. >>>>>>>>>>>>>>>>>> If this >>>>>>>>>>>>>>>>>> persists, I would just switch to --download-f-blas-lapack. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> --with-precision=single # I would not use this unless >>>>>>>>>>>>>>>>> you are doing something special, like CUDA >>>>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching to >>>>>>>>>>>>>>>>> C, the build is much faster >>>>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. I >>>>>>>>>>>>>>>>> would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>>>> or you can try and find that library. 
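Concretely, a configure line along these lines, with the MKL library list replaced by the bundled BLAS/LAPACK (and plain C, per the advice above), might look like the following; the prefix and the remaining options are simply carried over from the original configure, so treat this as a sketch rather than a tested command:

  ./configure --prefix=/home/mjonesa/local --with-c2html=0 --download-triangle \
      --download-f2cblaslapack
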
>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Hi, This is the next error message after configuring >>>>>>>>>>>>>>>>>>>> and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try >>>>>>>>>>>>>>>>>>> running >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating >>>>>>>>>>>>>>>>>>>> Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line 63 >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug named >>>>>>>>>>>>>>>>>>>> maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 >>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version you >>>>>>>>>>>>>>>>>>>>> suggested. 
I think I need the triangle package to run this particular case. >>>>>>>>>>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>>>>>>>>>> error messages below: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much >>>>>>>>>>>>>>>>>>>> easier to have triangle create them. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external package >>>>>>>>>>>>>>>>>>>>> support. >>>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 >>>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - >>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct >>>>>>>>>>>>>>>>>>>>>> entry. when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or >>>>>>>>>>>>>>>>>>>>>> directory >>>>>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to >>>>>>>>>>>>>>>>>>>>> work for you with the release. Please see the link I sent. 
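Getting the development version referred to here amounts to cloning the repository and rebuilding; a minimal sketch, with the repository location as referenced elsewhere in this thread and configure options as before:

  git clone https://bitbucket.org/petsc/petsc petsc
  cd petsc
  ./configure --prefix=/home/mjonesa/local --with-c2html=0 --download-triangle
  (then build with the exact make command that configure prints when it finishes)
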
>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits and >>>>>>>>>>>>>>>>>>>>>>>> just did a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR ( >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or PETSC_ARCH ( >>>>>>>>>>>>>>>>>>>>>>> linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>>>>>>>>> do not match what you built. 
Please send >>>>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want >>>>>>>>>>>>>>>>>>>>>>> to use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import >>>>>>>>>>>>>>>>>>>>>>>> default_simplex >>>>>>>>>>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating the >>>>>>>>>>>>>>>>>>>>>>>> header file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere (including >>>>>>>>>>>>>>>>>>>>>>>> the latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the >>>>>>>>>>>>>>>>>>>>>>>> old docs? >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. 
>>> -- Norbert Wiener >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > *Miguel Angel Salazar de Troya* > Graduate Research Assistant > Department of Mechanical Science and Engineering > University of Illinois at Urbana-Champaign > (217) 550-2360 > salaza11 at illinois.edu > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From salazardetroya at gmail.com Sat Mar 15 16:16:54 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Sat, 15 Mar 2014 16:16:54 -0500 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: Thanks a lot. On Sat, Mar 15, 2014 at 3:36 PM, Matthew Knepley wrote: > On Sat, Mar 15, 2014 at 3:31 PM, Miguel Angel Salazar de Troya < > salazardetroya at gmail.com> wrote: > >> Hello everybody >> >> I keep trying to understand this example. I don't have any problems with >> this example when I run it like this: >> >> [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate >> -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full >> -show_solution >> Number of SNES iterations = 5 >> L_2 Error: 0.107289 >> Solution >> Vec Object: 1 MPI processes >> type: seq >> 0.484618 >> >> However, when I change the boundary conditions to Neumann, I get this >> error. >> >> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >> -show_solution >> > > Here you set the order of the element used in bulk, but not on the > boundary where you condition is, so it defaults to 0. In > order to become more familiar, take a look at the tests that I run here: > > > https://bitbucket.org/petsc/petsc/src/64715f0f033346c10c77b73cf58216d111db8789/config/builder.py?at=master#cl-216 > > Matt > > [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> [0]PETSC ERROR: Petsc has generated inconsistent data >> [0]PETSC ERROR: Number of dual basis vectors 0 not equal to dimension 1 >> [0]PETSC ERROR: See http:// >> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b GIT >> Date: 2014-03-04 10:53:30 -0600 >> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sat >> Mar 15 14:28:05 2014 >> [0]PETSC ERROR: Configure options --download-mpich >> --download-scientificpython --download-triangle --download-ctetgen >> --download-chaco --with-c2html=0 >> [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in >> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >> [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in >> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >> [0]PETSC ERROR: #3 SetupElementCommon() line 474 in >> /home/salaza11/workspace/PETSC/ex12.c >> [0]PETSC ERROR: #4 SetupBdElement() line 559 in >> /home/salaza11/workspace/PETSC/ex12.c >> [0]PETSC ERROR: #5 main() line 755 in >> /home/salaza11/workspace/PETSC/ex12.c >> [0]PETSC ERROR: ----------------End of Error Message -------send entire >> error message to petsc-maint at mcs.anl.gov---------- >> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >> [unset]: aborting job: >> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >> >> I honestly do not know much about using dual spaces in a finite element >> context. I have been trying to find some material that could help me >> without much success. I tried to modify the dual space order with the >> option -petscdualspace_order but I kept getting errors. In particular, I >> got this when I set it to 1. >> >> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >> -show_solution -petscdualspace_order 1 >> [0]PETSC ERROR: PetscTrFreeDefault() called from PetscFESetUp_Basic() >> line 2492 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >> [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted >> (probably write past end of array) >> [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483 in >> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >> [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> [0]PETSC ERROR: Memory corruption: >> http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind >> [0]PETSC ERROR: Corrupted memory >> [0]PETSC ERROR: See http:// >> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b GIT >> Date: 2014-03-04 10:53:30 -0600 >> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sat >> Mar 15 14:37:34 2014 >> [0]PETSC ERROR: Configure options --download-mpich >> --download-scientificpython --download-triangle --download-ctetgen >> --download-chaco --with-c2html=0 >> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in >> /home/salaza11/petsc/src/sys/memory/mtr.c >> [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in >> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >> [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in >> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >> [0]PETSC ERROR: #4 SetupElementCommon() line 482 in >> /home/salaza11/workspace/PETSC/ex12.c >> [0]PETSC ERROR: #5 SetupElement() line 506 in >> /home/salaza11/workspace/PETSC/ex12.c >> [0]PETSC ERROR: #6 main() line 754 in >> /home/salaza11/workspace/PETSC/ex12.c >> [0]PETSC ERROR: ----------------End of Error Message -------send entire >> error message to petsc-maint at mcs.anl.gov---------- >> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >> [unset]: aborting job: >> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >> [salaza11 at maya PETSC]$ >> >> >> Then again, I do not know much what I am doing given my ignorance with >> respect to the dual spaces in FE. I apologize for that. My questions are: >> >> - Where could I find more resources in order to understand the PETSc >> implementation of dual spaces for FE? >> - Why does it run with Dirichlet but not with Neumann? >> >> Thanks in advance. >> Miguel. >> >> >> On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley wrote: >> >>> On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley wrote: >>> >>>> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < >>>> salazardetroya at gmail.com> wrote: >>>> >>>>> I can run it now, thanks. Although if I run it with valgrind 3.5.0 >>>>> (should I update to the last version?) I get some memory leaks related with >>>>> the function DMPlexCreateBoxMesh. >>>>> >>>> >>>> I will check it out. >>>> >>> >>> This is now fixed. >>> >>> Thanks for finding it >>> >>> Matt >>> >>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 >>>>> -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>> ==9625== Memcheck, a memory error detector >>>>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et >>>>> al. 
>>>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright >>>>> info >>>>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>> -dm_plex_print_fem 1 >>>>> ==9625== >>>>> Local function: >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> 0.25 >>>>> 1 >>>>> 0.25 >>>>> 0.5 >>>>> 1.25 >>>>> 1 >>>>> 1.25 >>>>> 2 >>>>> Initial guess >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0.5 >>>>> L_2 Error: 0.111111 >>>>> Residual: >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> Initial Residual >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> L_2 Residual: 0 >>>>> Jacobian: >>>>> Mat Object: 1 MPI processes >>>>> type: seqaij >>>>> row 0: (0, 4) >>>>> Residual: >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> -2 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> 0 >>>>> Au - b = Au + F(0) >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0 >>>>> Linear L_2 Residual: 0 >>>>> ==9625== >>>>> ==9625== HEAP SUMMARY: >>>>> ==9625== in use at exit: 288 bytes in 3 blocks >>>>> ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 >>>>> bytes allocated >>>>> ==9625== >>>>> ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 of 3 >>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>> ==9625== >>>>> ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 of 3 >>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>> ==9625== >>>>> ==9625== 144 bytes in 1 blocks are definitely lost in loss record 3 of >>>>> 3 >>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >>>>> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>> ==9625== >>>>> ==9625== LEAK SUMMARY: >>>>> ==9625== definitely lost: 288 bytes in 3 blocks >>>>> ==9625== indirectly lost: 0 bytes in 0 blocks >>>>> ==9625== possibly lost: 0 bytes in 0 blocks >>>>> ==9625== still reachable: 0 bytes in 0 blocks >>>>> ==9625== suppressed: 0 bytes in 0 blocks >>>>> ==9625== >>>>> ==9625== For counts of detected and suppressed errors, rerun with: -v >>>>> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 from 6) >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew 
Knepley wrote: >>>>> >>>>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>>>>> salazardetroya at gmail.com> wrote: >>>>>> >>>>>>> You are welcome, thanks for your help. >>>>>>> >>>>>> >>>>>> Okay, I have rebuilt completely clean, and ex12 runs for me. Can you >>>>>> try again after pulling? >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>> >>>>>>>>> Thanks. This is what I get. >>>>>>>>> >>>>>>>> >>>>>>>> Okay, this was broken by a new push to master/next in the last few >>>>>>>> days. I have pushed a fix, >>>>>>>> however next is currently broken due to a failure to check in a >>>>>>>> file. This should be fixed shortly, >>>>>>>> and then ex12 will work. I will mail you when its ready. >>>>>>>> >>>>>>>> Thanks for finding this, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> (gdb) cont >>>>>>>>> Continuing. >>>>>>>>> >>>>>>>>> Program received signal SIGSEGV, Segmentation fault. >>>>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>>> X=0x168b5b0, >>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>>>> user=0x7fd6811be509) >>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >>>>>>>>> (gdb) where >>>>>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>>> X=0x168b5b0, >>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>>>> user=0x7fd6811be509) >>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>>>>> (snes=0x14e9450, >>>>>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, ctx=0x1652300) >>>>>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>>>>>>> X=0x1622ad0, >>>>>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley >>>>>>>> > wrote: >>>>>>>>> >>>>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> This is what I get at gdb when I type 'where'. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> You have to type 'cont', and then when it fails you type 'where'. 
>>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>>>>> /lib64/libc.so.6 >>>>>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>>>>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial >>>>>>>>>>> finite elements.\nWe solve the Poisson problem in a rectangular\ndomain, >>>>>>>>>>> using a parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>>>>>>> at >>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>>>>> >>>>>>>>>>> The rest of the gdb output is attached. I am a bit ignorant with >>>>>>>>>>> gdb, I apologize for that. >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya >>>>>>>>>>>> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Thanks for your response. Sorry I did not have the "next" >>>>>>>>>>>>> version, but the "master" version. I still have an error though. I followed >>>>>>>>>>>>> the steps given here ( >>>>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the >>>>>>>>>>>>> next version, I configured petsc as above and ran ex12 as above as well, >>>>>>>>>>>>> getting this error: >>>>>>>>>>>>> >>>>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>>>> Local function: >>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>> type: seq >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0.25 >>>>>>>>>>>>> 1 >>>>>>>>>>>>> 0.25 >>>>>>>>>>>>> 0.5 >>>>>>>>>>>>> 1.25 >>>>>>>>>>>>> 1 >>>>>>>>>>>>> 1.25 >>>>>>>>>>>>> 2 >>>>>>>>>>>>> Initial guess >>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>> type: seq >>>>>>>>>>>>> 0.5 >>>>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>>>> Residual: >>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>> type: seq >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> 0 >>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>> type: seq >>>>>>>>>>>>> 0 >>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Okay, now run with -start_in_debugger, and give me a stack >>>>>>>>>>>> trace using 'where'. 
>>>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>>> memory corruption errors >>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack below >>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>>>> not available, >>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>>>> the function >>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for >>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>>> salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in unknown >>>>>>>>>>>>> file >>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de Troya >>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Hi everybody >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>>>>>>>> specifically run it with the command options: >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> We need to start narrowing down differences, because it runs >>>>>>>>>>>>>> for me and our nightly tests. So, first can >>>>>>>>>>>>>> you confirm that you are using the latest 'next' branch? 
>>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> And I get this output >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>> 3 >>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>>>>> memory corruption errors >>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>> below >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start >>>>>>>>>>>>>>> of the function >>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>>>>> salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown >>>>>>>>>>>>>>> file >>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Probably my problems could be on my configuration. I attach >>>>>>>>>>>>>>> the configure.log. I ran ./configure like this >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> If >>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> is for serial, any chance we can get the options to run in >>>>>>>>>>>>>>>>> parallel? 
>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Regards >>>>>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander >>>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> These examples all seem to run excepting the >>>>>>>>>>>>>>>>>>>> following command, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> This is a build problem, but it should affect all the >>>>>>>>>>>>>>>>>>> runs. Is this reproducible? Can you send configure.log? MKL is the worst. >>>>>>>>>>>>>>>>>>> If this >>>>>>>>>>>>>>>>>>> persists, I would just switch to >>>>>>>>>>>>>>>>>>> --download-f-blas-lapack. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> --with-precision=single # I would not use this unless >>>>>>>>>>>>>>>>>> you are doing something special, like CUDA >>>>>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching to >>>>>>>>>>>>>>>>>> C, the build is much faster >>>>>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. 
I >>>>>>>>>>>>>>>>>> would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>>>>> or you can try and find that library. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Hi, This is the next error message after configuring >>>>>>>>>>>>>>>>>>>>> and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try >>>>>>>>>>>>>>>>>>>> running >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating >>>>>>>>>>>>>>>>>>>>> Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X >>>>>>>>>>>>>>>>>>>>> to find memory corruption errors >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line >>>>>>>>>>>>>>>>>>>>> 63 /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 >>>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built 
the dev version you >>>>>>>>>>>>>>>>>>>>>> suggested. I think I need the triangle package to run this particular case. >>>>>>>>>>>>>>>>>>>>>> Is there any thing else that appears wrong in what I have done from the >>>>>>>>>>>>>>>>>>>>>> error messages below: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much >>>>>>>>>>>>>>>>>>>>> easier to have triangle create them. >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for >>>>>>>>>>>>>>>>>>>>>> this object type! >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external >>>>>>>>>>>>>>>>>>>>>> package support. >>>>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 >>>>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - >>>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct >>>>>>>>>>>>>>>>>>>>>>> entry. when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or >>>>>>>>>>>>>>>>>>>>>>> directory >>>>>>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to >>>>>>>>>>>>>>>>>>>>>> work for you with the release. Please see the link I sent. 
>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits >>>>>>>>>>>>>>>>>>>>>>>>> and just did a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR ( >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or PETSC_ARCH ( >>>>>>>>>>>>>>>>>>>>>>>> linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>>>>>>>>>> do not match what you built. 
Please send >>>>>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want >>>>>>>>>>>>>>>>>>>>>>>> to use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import >>>>>>>>>>>>>>>>>>>>>>>>> default_simplex >>>>>>>>>>>>>>>>>>>>>>>>> ImportError: No module named FIAT.reference_element >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating the >>>>>>>>>>>>>>>>>>>>>>>>> header file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere (including >>>>>>>>>>>>>>>>>>>>>>>>> the latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward the >>>>>>>>>>>>>>>>>>>>>>>>> old docs? >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. 
>>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> >> >> -- >> *Miguel Angel Salazar de Troya* >> Graduate Research Assistant >> Department of Mechanical Science and Engineering >> University of Illinois at Urbana-Champaign >> (217) 550-2360 >> salaza11 at illinois.edu >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Mar 15 16:31:49 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 15 Mar 2014 16:31:49 -0500 Subject: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) In-Reply-To: References: Message-ID: <1E995842-1C73-4513-A47C-AF7834085452@mcs.anl.gov> Bill, It is great that you ran with -info to see that there are not excessive mallocs in vector and matrix assemblies and -ksp_view to show the solver being used but I would recommend doing that in a separate run from the -log_summary because we make no attempt to have -info and -xx_view options optimized for performance. To begin analysis I find it is always best not to compare 1 to 2 processors, nor to compare at the highest level of number of processors but instead to compare somewhere in the middle. Hence I look at 2 and 4 processes 1) Looking at embarrassingly parallel operations 4procs VecMAXPY 8677 1.0 6.9120e+00 1.0 8.15e+09 1.0 0.0e+00 0.0e+00 0.0e+00 5 35 0 0 0 6 35 0 0 0 4717 MatSolve 8677 1.0 6.9232e+00 1.1 3.41e+09 1.0 0.0e+00 0.0e+00 0.0e+00 5 15 0 0 0 6 15 0 0 0 1971 MatLUFactorNum 1 1.0 2.5489e-03 1.2 6.53e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1024 VecScale 8677 1.0 2.1447e+01 1.1 2.71e+08 1.0 0.0e+00 0.0e+00 0.0e+00 16 1 0 0 0 19 1 0 0 0 51 VecAXPY 508 1.0 8.9473e-01 1.4 3.18e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 142 2procs VecMAXPY 8341 1.0 9.4324e+00 1.0 1.54e+10 1.0 0.0e+00 0.0e+00 0.0e+00 15 34 0 0 0 23 35 0 0 0 3261 MatSolve 8341 1.0 1.0210e+01 1.0 6.61e+09 1.0 0.0e+00 0.0e+00 0.0e+00 16 15 0 0 0 25 15 0 0 0 1294 MatLUFactorNum 1 1.0 4.0622e-03 1.1 1.32e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 650 VecScale 8341 1.0 1.0367e+00 1.3 5.21e+08 1.0 0.0e+00 0.0e+00 0.0e+00 2 1 0 0 0 2 1 0 0 0 1006 VecAXPY 502 1.0 3.5317e-02 1.7 6.28e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3553 These are routines where there is no communication between the MPI processes and no synchronization. Thus in an ideal situation one could hope for the routines to run TWICE as fast. For the last three operations I calculated the ratio of flop rates as 1.57, 1.52, and 1.44. Thus I conclude that the 4 MPI processes are sharing memory bandwidth thus you cannot expect to get 2 times speed up. But what is going on with VecScale and VecAXPY, why is the performance falling through the floor? I noticed that you are using OpenBLAS so did some poking around in google and found at https://github.com/xianyi/OpenBLAS/wiki/faq#what If your application is already multi-threaded, it will conflict with OpenBLAS multi-threading. 
Thus, you must set OpenBLAS to use a single thread as follows:
- export OPENBLAS_NUM_THREADS=1 in the environment variables, or
- call openblas_set_num_threads(1) in the application at runtime, or
- build the single-threaded OpenBLAS, e.g. make USE_THREAD=0

Of course your application is not multi-threaded, it is MPI parallel, but you have the exact same problem: the number of cores is oversubscribed with too many threads, killing the performance of some routines. So please FORCE OpenBLAS to use only a single thread and rerun the 1, 2, 4, and 8 process cases with -log_summary and without the -info and -xxx_view options.

2) I now compare the 4 and 8 process cases for MAXPY and Solve

8procs
VecMAXPY 9336 1.0 3.0977e+00 1.0 4.59e+09 1.0 0.0e+00 0.0e+00 0.0e+00 3 35 0 0 0 5 35 0 0 0 11835
MatSolve 9336 1.0 3.0873e+00 1.1 1.82e+09 1.0 0.0e+00 0.0e+00 0.0e+00 3 14 0 0 0 4 14 0 0 0 4716

4procs
VecMAXPY 8677 1.0 6.9120e+00 1.0 8.15e+09 1.0 0.0e+00 0.0e+00 0.0e+00 5 35 0 0 0 6 35 0 0 0 4717
MatSolve 8677 1.0 6.9232e+00 1.1 3.41e+09 1.0 0.0e+00 0.0e+00 0.0e+00 5 15 0 0 0 6 15 0 0 0 1971

What the hey is going on here? The performance more than doubles! From this I conclude that going from 4 to 8 processes is moving the computation to twice as many physical CPUs that DO NOT share memory bandwidth.

A general observation: since the p cores on the same physical CPU generally share memory bandwidth, when you go from p/2 to p MPI processes on that CPU you will never see a doubling in performance (perfect speedup); you are actually lucky if you see the 1.5 speedup that you are seeing. Thus as you increase the number of MPI processes to extend to more and more physical CPUs you will see "funny jumps" in your speedup depending on when it switches to more physical CPUs (and hence more memory bandwidth). Thus it is important to understand "where" the program is actually running.

So make the changes I recommend and send us the new set of -log_summary output and we may be able to make more observations based on less "cluttered" data.

Barry

On Mar 14, 2014, at 4:45 PM, William Coirier wrote:

> I've written a parallel, finite-volume, transient thermal conduction solver using PETSc primitives, and so far things have been going great. Comparisons to theory for a simple problem (transient conduction in a semi-infinite slab) look good, but I'm not getting very good parallel scaling behavior with the KSP solver. Whether I use the default KSP/PC or other sensible combinations, the time spent in KSPSolve seems to not scale well at all.
>
> I seem to have loaded up the problem well enough. The PETSc logging/profiling has been really useful for reworking various code segments, and right now, the bottleneck is KSPSolve, and I can't seem to figure out how to get it to scale properly.
>
> I'm attaching output produced with -log_summary, -info, -ksp_view and -pc_view all specified on the command line for 1, 2, 4 and 8 processes.
>
> If you guys have any suggestions, I'd definitely like to hear them! And I apologize in advance if I've done something stupid. All the documentation has been really helpful.
>
> Thanks in advance...
>
> Bill Coirier
>
> --------------------------------------------------------------------------------------------------------------------
>
> ***NOTICE*** This e-mail and/or the attached documents may contain technical data within the definition of the International Traffic in Arms Regulations and/or Export Administration Regulations, and are subject to the export control laws of the U.S. Government.
Transfer of this data by any means to a foreign person, whether in the United States or abroad, without an export license or other approval from the U.S. Department of State or Commerce, as applicable, is prohibited. No portion of this e-mail or its attachment(s) may be reproduced without written consent of Kratos Defense & Security Solutions, Inc. Any views expressed in this message are those of the individual sender, except where the message states otherwise and the sender is authorized to state them to be the views of any such entity. The information contained in this message and or attachments is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. If you are not the intended recipient or believe that you may have received this document in error, please notify the sender and delete this e-mail and any attachments immediately. From mc0710 at gmail.com Sat Mar 15 17:47:41 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Sat, 15 Mar 2014 17:47:41 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep Message-ID: Hi, Is there anyway I can VecView the residual after TS has completed an implicit time step? I'd like to see where in my domain most of the errors are coming from. I looked at TSMonitor but that doesn't seem to give me access to the residual at the end of the current time step. Thanks, Mani -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Mar 15 18:24:15 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 15 Mar 2014 18:24:15 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: References: Message-ID: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> You can write a custom monitor and set it with TSMonitorSet() This routine would call TSGetSNES() then SNESGetSolution() then call SNESComputeFunction() then call VecView() on the result. But note that just because the residual is big somewhere doesn?t mean the error need be. You could also run with -snes_monitor_residual to see how the residual is being reduced inside the nonlinear solve (that is, what parts of the residual are most stubborn). Barry On Mar 15, 2014, at 5:47 PM, Mani Chandra wrote: > Hi, > > Is there anyway I can VecView the residual after TS has completed an implicit time step? I'd like to see where in my domain most of the errors are coming from. I looked at TSMonitor but that doesn't seem to give me access to the residual at the end of the current time step. > > Thanks, > Mani From mc0710 at gmail.com Sat Mar 15 18:29:12 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Sat, 15 Mar 2014 18:29:12 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> Message-ID: -snes_montior_residual is really cool. Anyway I could get it to spit out png images instead of on screen visualization? On Sat, Mar 15, 2014 at 6:24 PM, Barry Smith wrote: > > > You can write a custom monitor and set it with TSMonitorSet() > > This routine would call TSGetSNES() then SNESGetSolution() then call > SNESComputeFunction() then call VecView() on the result. > > But note that just because the residual is big somewhere doesn?t mean > the error need be. 
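A minimal sketch of the custom monitor described above (the routine name MyResidualMonitor and the choice of PETSC_VIEWER_DRAW_WORLD are illustrative, not from this thread; error handling is abbreviated):

    #include <petscts.h>

    /* View the nonlinear residual at the end of each accepted time step. */
    PetscErrorCode MyResidualMonitor(TS ts, PetscInt step, PetscReal time, Vec u, void *ctx)
    {
      SNES           snes;
      Vec            r;
      PetscErrorCode ierr;

      ierr = TSGetSNES(ts, &snes);CHKERRQ(ierr);
      ierr = VecDuplicate(u, &r);CHKERRQ(ierr);
      /* u is the solution at the end of the step; F(u) is the residual there */
      ierr = SNESComputeFunction(snes, u, r);CHKERRQ(ierr);
      ierr = VecView(r, PETSC_VIEWER_DRAW_WORLD);CHKERRQ(ierr);
      ierr = VecDestroy(&r);CHKERRQ(ierr);
      return 0;
    }

    /* registered once before TSSolve(): */
    /* ierr = TSMonitorSet(ts, MyResidualMonitor, NULL, NULL);CHKERRQ(ierr); */

Alternatively, as suggested later in this thread, SNESGetFunction() can be used inside the monitor to look at the residual vector the SNES already holds instead of recomputing it.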
> > You could also run with -snes_monitor_residual to see how the residual > is being reduced inside the nonlinear solve (that is, what parts of the > residual are most stubborn). > > > Barry > > > On Mar 15, 2014, at 5:47 PM, Mani Chandra wrote: > > > Hi, > > > > Is there anyway I can VecView the residual after TS has completed an > implicit time step? I'd like to see where in my domain most of the errors > are coming from. I looked at TSMonitor but that doesn't seem to give me > access to the residual at the end of the current time step. > > > > Thanks, > > Mani > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sat Mar 15 18:29:24 2014 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 15 Mar 2014 18:29:24 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> Message-ID: On Sat, Mar 15, 2014 at 6:24 PM, Barry Smith wrote: > > > You can write a custom monitor and set it with TSMonitorSet() > > This routine would call TSGetSNES() then SNESGetSolution() then call > SNESComputeFunction() then call VecView() on the result. > > But note that just because the residual is big somewhere doesn't mean > the error need be. > > You could also run with -snes_monitor_residual to see how the residual > is being reduced inside the nonlinear solve (that is, what parts of the > residual are most stubborn). If you really want to play with the residual, inside your monitor you can use: TSGetSNES() http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESGetFunction.html#SNESGetFunction Thanks, Matt > > Barry > > > On Mar 15, 2014, at 5:47 PM, Mani Chandra wrote: > > > Hi, > > > > Is there anyway I can VecView the residual after TS has completed an > implicit time step? I'd like to see where in my domain most of the errors > are coming from. I looked at TSMonitor but that doesn't seem to give me > access to the residual at the end of the current time step. > > > > Thanks, > > Mani > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mc0710 at gmail.com Sat Mar 15 18:35:58 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Sat, 15 Mar 2014 18:35:58 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> Message-ID: Thanks, those functions are what I was looking for. On Sat, Mar 15, 2014 at 6:29 PM, Matthew Knepley wrote: > On Sat, Mar 15, 2014 at 6:24 PM, Barry Smith wrote: > >> >> >> You can write a custom monitor and set it with TSMonitorSet() >> >> This routine would call TSGetSNES() then SNESGetSolution() then call >> SNESComputeFunction() then call VecView() on the result. >> >> But note that just because the residual is big somewhere doesn?t mean >> the error need be. >> >> You could also run with -snes_monitor_residual to see how the >> residual is being reduced inside the nonlinear solve (that is, what parts >> of the residual are most stubborn). 
> > > If you really want to play with the residual, inside your monitor you can > use: > > TSGetSNES() > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESGetFunction.html#SNESGetFunction > > Thanks, > > Matt > > >> >> Barry >> >> >> On Mar 15, 2014, at 5:47 PM, Mani Chandra wrote: >> >> > Hi, >> > >> > Is there anyway I can VecView the residual after TS has completed an >> implicit time step? I'd like to see where in my domain most of the errors >> are coming from. I looked at TSMonitor but that doesn't seem to give me >> access to the residual at the end of the current time step. >> > >> > Thanks, >> > Mani >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Mar 15 21:03:08 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 15 Mar 2014 21:03:08 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> Message-ID: Yes but it is slightly complicated. You need to configure PETSc with ?with-afterimage and run the program with -draw_save then it will save each image. It will open an X window to display each image. You can run with -x_virtual to skip the opening of X windows but then you need Xvfb to be installed Barry On Mar 15, 2014, at 6:29 PM, Mani Chandra wrote: > -snes_montior_residual is really cool. Anyway I could get it to spit out png images instead of on screen visualization? > > > On Sat, Mar 15, 2014 at 6:24 PM, Barry Smith wrote: > > > You can write a custom monitor and set it with TSMonitorSet() > > This routine would call TSGetSNES() then SNESGetSolution() then call SNESComputeFunction() then call VecView() on the result. > > But note that just because the residual is big somewhere doesn?t mean the error need be. > > You could also run with -snes_monitor_residual to see how the residual is being reduced inside the nonlinear solve (that is, what parts of the residual are most stubborn). > > > Barry > > > On Mar 15, 2014, at 5:47 PM, Mani Chandra wrote: > > > Hi, > > > > Is there anyway I can VecView the residual after TS has completed an implicit time step? I'd like to see where in my domain most of the errors are coming from. I looked at TSMonitor but that doesn't seem to give me access to the residual at the end of the current time step. > > > > Thanks, > > Mani > > From chao.yang at Colorado.EDU Mon Mar 17 04:12:40 2014 From: chao.yang at Colorado.EDU (Chao Yang) Date: Mon, 17 Mar 2014 03:12:40 -0600 Subject: [petsc-users] Pipelined CG (or Gropp's CG) and communication overlap Message-ID: Hi, The pipelined CG (or Gropp's CG) recently implemented in PETSc is very attractive since it has the ability of hiding the collective communication in vector dot product by overlapping it with the application of preconditioner and/or SpMV. However, there is an issue that may seriously degrade the performance. In the pipelined CG, the asynchronous MPI_Iallreduce is called before the application of preconditioner and/or SpMV, and then ended by MPI_Wait. In the application of preconditioner and/or SpMV, communication may also be required (such as halo updating), which I find is often slowed down by the unfinished MPI_Iallreduce in the background. 
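Stripped down to plain MPI, the pattern in question looks roughly like the sketch below (self-contained but artificial: the dummy local loop stands in for the halo exchange and the local SpMV/preconditioner work, and nothing here is PETSc code):

#include <mpi.h>
#include <stdio.h>

int main(int argc,char **argv)
{
  double      local[2] = {1.0,2.0},global[2];  /* stand-ins for locally computed dot products */
  double      x[1000];
  MPI_Request req;
  int         i;

  MPI_Init(&argc,&argv);
  for (i=0; i<1000; i++) x[i] = 1.0;

  /* start the global reduction; it completes in the background */
  MPI_Iallreduce(local,global,2,MPI_DOUBLE,MPI_SUM,MPI_COMM_WORLD,&req);

  /* the halo exchange and the local part of SpMV/PCApply would run here;
     their point-to-point messages share the network with the pending
     Iallreduce, which is the contention described above */
  for (i=0; i<1000; i++) x[i] *= 2.0;          /* dummy local work */

  MPI_Wait(&req,MPI_STATUS_IGNORE);            /* the reduction result is needed only here */
  printf("sum = %g %g\n",global[0],global[1]);
  MPI_Finalize();
  return 0;
}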
As far as I know, the current MPI doesn't provide prioritized communication. Therefore, it's highly possible that the performance of the pipelined CG may be even worse than a classic one due to the slowdown of preconditioner and SpMV. Is there a way to avoid this? Any suggestion would be high appreciated. Thanks in advance! Best wishes, Chao From jed at jedbrown.org Mon Mar 17 04:21:39 2014 From: jed at jedbrown.org (Jed Brown) Date: Mon, 17 Mar 2014 10:21:39 +0100 Subject: [petsc-users] Pipelined CG (or Gropp's CG) and communication overlap In-Reply-To: References: Message-ID: <87iordqixo.fsf@jedbrown.org> Chao Yang writes: > The pipelined CG (or Gropp's CG) recently implemented in PETSc is very > attractive since it has the ability of hiding the collective > communication in vector dot product by overlapping it with the > application of preconditioner and/or SpMV. > > However, there is an issue that may seriously degrade the > performance. In the pipelined CG, the asynchronous MPI_Iallreduce is > called before the application of preconditioner and/or SpMV, and then > ended by MPI_Wait. In the application of preconditioner and/or SpMV, > communication may also be required (such as halo updating), which I > find is often slowed down by the unfinished MPI_Iallreduce in the > background. > > As far as I know, the current MPI doesn't provide prioritized > communication. No, and there is not much interest in adding it because it adds complication and tends to create starvation situations in which raising the priority actually makes it slower. > Therefore, it's highly possible that the performance of the pipelined > CG may be even worse than a classic one due to the slowdown of > preconditioner and SpMV. Is there a way to avoid this? This is an MPI quality-of-implementation issue and there isn't much we can do about it. There may be MPI tuning parameters that can help, but the nature of these methods is that in exchange for creating latency-tolerance in the reduction, it now overlaps the neighbor communication in MatMult/PCApply. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From chao.yang at Colorado.EDU Mon Mar 17 04:36:59 2014 From: chao.yang at Colorado.EDU (Chao Yang) Date: Mon, 17 Mar 2014 03:36:59 -0600 Subject: [petsc-users] Pipelined CG (or Gropp's CG) and communication overlap In-Reply-To: <87iordqixo.fsf@jedbrown.org> References: <87iordqixo.fsf@jedbrown.org> Message-ID: Jed, Thanks ... aren't you sleep in the night? ;-) Chao > Chao Yang writes: > >> The pipelined CG (or Gropp's CG) recently implemented in PETSc is very >> attractive since it has the ability of hiding the collective >> communication in vector dot product by overlapping it with the >> application of preconditioner and/or SpMV. >> >> However, there is an issue that may seriously degrade the >> performance. In the pipelined CG, the asynchronous MPI_Iallreduce is >> called before the application of preconditioner and/or SpMV, and then >> ended by MPI_Wait. In the application of preconditioner and/or SpMV, >> communication may also be required (such as halo updating), which I >> find is often slowed down by the unfinished MPI_Iallreduce in the >> background. >> >> As far as I know, the current MPI doesn't provide prioritized >> communication. 
> > No, and there is not much interest in adding it because it adds > complication and tends to create starvation situations in which raising > the priority actually makes it slower. > >> Therefore, it's highly possible that the performance of the pipelined >> CG may be even worse than a classic one due to the slowdown of >> preconditioner and SpMV. Is there a way to avoid this? > > This is an MPI quality-of-implementation issue and there isn't much we > can do about it. There may be MPI tuning parameters that can help, but > the nature of these methods is that in exchange for creating > latency-tolerance in the reduction, it now overlaps the neighbor > communication in MatMult/PCApply. From William.Coirier at kratosdefense.com Mon Mar 17 09:23:23 2014 From: William.Coirier at kratosdefense.com (William Coirier) Date: Mon, 17 Mar 2014 14:23:23 +0000 Subject: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) In-Reply-To: <1E995842-1C73-4513-A47C-AF7834085452@mcs.anl.gov> References: <1E995842-1C73-4513-A47C-AF7834085452@mcs.anl.gov> Message-ID: Thanks Barry. This analysis is very very helpful. We will reconfigure/re-run and get back with you when we have usable information. Thanks again! ----------------------------------------------------------------------- William J. Coirier, Ph.D. Director, Aerosciences and Engineering Analysis Advanced Technology Division Kratos/Digital Fusion, Inc. 4904 Research Drive Huntsville, AL 35805 256-327-8170 256-327-8120 (fax) -----Original Message----- From: Barry Smith [mailto:bsmith at mcs.anl.gov] Sent: Saturday, March 15, 2014 4:32 PM To: William Coirier Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] KSPSolve doesn't seem to scale. (Must be doing something wrong...) Bill, It is great that you ran with -info to see that there are not excessive mallocs in vector and matrix assemblies and -ksp_view to show the solver being used but I would recommend doing that in a separate run from the -log_summary because we make no attempt to have -info and -xx_view options optimized for performance. To begin analysis I find it is always best not to compare 1 to 2 processors, nor to compare at the highest level of number of processors but instead to compare somewhere in the middle. Hence I look at 2 and 4 processes 1) Looking at embarrassingly parallel operations 4procs VecMAXPY 8677 1.0 6.9120e+00 1.0 8.15e+09 1.0 0.0e+00 0.0e+00 0.0e+00 5 35 0 0 0 6 35 0 0 0 4717 MatSolve 8677 1.0 6.9232e+00 1.1 3.41e+09 1.0 0.0e+00 0.0e+00 0.0e+00 5 15 0 0 0 6 15 0 0 0 1971 MatLUFactorNum 1 1.0 2.5489e-03 1.2 6.53e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1024 VecScale 8677 1.0 2.1447e+01 1.1 2.71e+08 1.0 0.0e+00 0.0e+00 0.0e+00 16 1 0 0 0 19 1 0 0 0 51 VecAXPY 508 1.0 8.9473e-01 1.4 3.18e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 142 2procs VecMAXPY 8341 1.0 9.4324e+00 1.0 1.54e+10 1.0 0.0e+00 0.0e+00 0.0e+00 15 34 0 0 0 23 35 0 0 0 3261 MatSolve 8341 1.0 1.0210e+01 1.0 6.61e+09 1.0 0.0e+00 0.0e+00 0.0e+00 16 15 0 0 0 25 15 0 0 0 1294 MatLUFactorNum 1 1.0 4.0622e-03 1.1 1.32e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 650 VecScale 8341 1.0 1.0367e+00 1.3 5.21e+08 1.0 0.0e+00 0.0e+00 0.0e+00 2 1 0 0 0 2 1 0 0 0 1006 VecAXPY 502 1.0 3.5317e-02 1.7 6.28e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3553 These are routines where there is no communication between the MPI processes and no synchronization. Thus in an ideal situation one could hope for the routines to run TWICE as fast. 
For the last three operations I calculated the ratio of flop rates as 1.57, 1.52, and 1.44. Thus I conclude that the 4 MPI processes are sharing memory bandwidth thus you cannot expect to get 2 times speed up. But what is going on with VecScale and VecAXPY, why is the performance falling through the floor? I noticed that you are using OpenBLAS so did some poking around in google and found at https://github.com/xianyi/OpenBLAS/wiki/faq#what If your application is already multi-threaded, it will conflict with OpenBLAS multi-threading. Thus, you must set OpenBLAS to use single thread as following. * export OPENBLAS_NUM_THREADS=1 in the environment variables. Or * Call openblas_set_num_threads(1) in the application on runtime. Or * Build OpenBLAS single thread version, e.g. make USE_THREAD=0 Of course you application is not multi-threaded it is MPI parallel but you have the exact same problem, the number of cores is over subscribed with too many threads killing performance of some routines. So please FORCE OpenBlas to only use a single thread and rerun the 1,2,4, and 8 with -log_summary and without the -info and -xxx_view 2) I now compare the 4 and 8 process case with MAXPY and Solve 8procs VecMAXPY 9336 1.0 3.0977e+00 1.0 4.59e+09 1.0 0.0e+00 0.0e+00 0.0e+00 3 35 0 0 0 5 35 0 0 0 11835 MatSolve 9336 1.0 3.0873e+00 1.1 1.82e+09 1.0 0.0e+00 0.0e+00 0.0e+00 3 14 0 0 0 4 14 0 0 0 4716 4procs VecMAXPY 8677 1.0 6.9120e+00 1.0 8.15e+09 1.0 0.0e+00 0.0e+00 0.0e+00 5 35 0 0 0 6 35 0 0 0 4717 MatSolve 8677 1.0 6.9232e+00 1.1 3.41e+09 1.0 0.0e+00 0.0e+00 0.0e+00 5 15 0 0 0 6 15 0 0 0 1971 What the hey is going on here? The performance more than doubles! From this I conclude that going from 4 to 8 processes is moving the computation to twice as many physical CPUs that DO NOT share memory bandwidth. A general observation, since p multiple cores on the same physical CPU generally share memory bandwidth when you go from p/2 to p MPI processes on that CPU you will never see a double in performance (perfect speedup) you are actually lucky if you see the 1.5 speed up that you are seeing. Thus as you increase the number of MPI processes to extend to more and more physical CPUs you will see "funny jumps" in your speedup depending on when it is switching to more physical CPUs (and hence more memory bandwidth). Thus is is important to understand "where" the program is actually running. So make the changes I recommend and send us the new set of -log_summary and we may be able to make more observations based on less "cluttered" data. Barry On Mar 14, 2014, at 4:45 PM, William Coirier wrote: > I've written a parallel, finite-volume, transient thermal conduction solver using PETSc primitives, and so far things have been going great. Comparisons to theory for a simple problem (transient conduction in a semi-infinite slab) looks good, but I'm not getting very good parallel scaling behavior with the KSP solver. Whether I use the default KSP/PC or other sensible combinations, the time spent in KSPSolve seems to not scale well at all. > > I seem to have loaded up the problem well enough. The PETSc logging/profiling has been really useful for reworking various code segments, and right now, the bottleneck is KSPSolve, and I can't seem to figure out how to get it to scale properly. > > I'm attaching output produced with -log_summary, -info, -ksp_view and -pc_view all specified on the command line for 1, 2, 4 and 8 processes. > > If you guys have any suggestions, I'd definitely like to hear them! 
And I apologize in advance if I've done something stupid. All the documentation has been really helpful.
>
> Thanks in advance...
>
> Bill Coirier

From zonexo at gmail.com Mon Mar 17 09:39:30 2014
From: zonexo at gmail.com (TAY wee-beng)
Date: Mon, 17 Mar 2014 22:39:30 +0800
Subject: [petsc-users] How to use command line option for separate matrix
Message-ID: <53270922.1090104@gmail.com>

Hi,

I use

call KSPSetOptionsPrefix(ksp_semi_xyz,"momentum_",ierr)

and

call KSPSetOptionsPrefix(ksp,"poisson_",ierr)

so that I can choose separate ksp/pc options for my momentum and poisson equations through command line e.g.

-poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg

In general, I need to use boomeramg as the preconditioner and gmres as the solver for my poisson eqn, separate from my momentum eqn, which has its own default pc and ksp. Is the above the correct way?

Thanks!
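For reference, the same two-prefix setup written out in the C API (a sketch only; SetMomentumAndPoissonSolvers and the handle names are made up, and the Fortran calls above are the direct equivalent):

#include <petscksp.h>

PetscErrorCode SetMomentumAndPoissonSolvers(MPI_Comm comm,KSP *ksp_mom,KSP *ksp_poisson)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPCreate(comm,ksp_mom);CHKERRQ(ierr);
  ierr = KSPSetOptionsPrefix(*ksp_mom,"momentum_");CHKERRQ(ierr);
  ierr = KSPSetFromOptions(*ksp_mom);CHKERRQ(ierr);      /* picks up -momentum_* options */

  ierr = KSPCreate(comm,ksp_poisson);CHKERRQ(ierr);
  ierr = KSPSetOptionsPrefix(*ksp_poisson,"poisson_");CHKERRQ(ierr);
  ierr = KSPSetFromOptions(*ksp_poisson);CHKERRQ(ierr);  /* picks up -poisson_ksp_type gmres,
                                                            -poisson_pc_type hypre, ...    */
  PetscFunctionReturn(0);
}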
-- Yours sincerely, TAY wee-beng From jianjun.xiao at kit.edu Mon Mar 17 10:06:51 2014 From: jianjun.xiao at kit.edu (Xiao, Jianjun (IKET)) Date: Mon, 17 Mar 2014 16:06:51 +0100 Subject: [petsc-users] undefined reference to `dmsetmattype_' Message-ID: <56D054AF2E93E044AC1D2685709D2868D8BBE2EE24@KIT-MSX-07.kit.edu> Hello, After the version was upgraded to 3.4.4, I got the error "undefined reference to `dmsetmattype_' ". I was using PETSc-dev, and it worked fine. The matrix was created as below CALL DMSetMatType(da,MATMPISBAIJ,ierr) CALL DMCreateMatrix(da,mat,ierr) Thank you. Best regards JJ From jed at jedbrown.org Mon Mar 17 13:11:05 2014 From: jed at jedbrown.org (Jed Brown) Date: Mon, 17 Mar 2014 19:11:05 +0100 Subject: [petsc-users] undefined reference to `dmsetmattype_' In-Reply-To: <56D054AF2E93E044AC1D2685709D2868D8BBE2EE24@KIT-MSX-07.kit.edu> References: <56D054AF2E93E044AC1D2685709D2868D8BBE2EE24@KIT-MSX-07.kit.edu> Message-ID: <87d2hkr8zq.fsf@jedbrown.org> "Xiao, Jianjun (IKET)" writes: > Hello, > > After the version was upgraded to 3.4.4, I got the error "undefined reference to `dmsetmattype_' ". I was using PETSc-dev, and it worked fine. > > The matrix was created as below > > CALL DMSetMatType(da,MATMPISBAIJ,ierr) > CALL DMCreateMatrix(da,mat,ierr) Use 'master' for this interface. The MatType argument to DMCreateMatrix was removed between v3.4 and 'master'. Barry added the missing Fortran stub (dmsetmattype_) in the commit that make this interface change. You're already using the new interface, so stick with 'master' until we make a release (soon). -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From bsmith at mcs.anl.gov Mon Mar 17 13:23:51 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 17 Mar 2014 13:23:51 -0500 Subject: [petsc-users] How to use command line option for separate matrix In-Reply-To: <53270922.1090104@gmail.com> References: <53270922.1090104@gmail.com> Message-ID: Yes. You can run with -poisson_ksp_view and -options_left to make sure the options you provide are actually used. Barry On Mar 17, 2014, at 9:39 AM, TAY wee-beng wrote: > Hi, > > I use > > call KSPSetOptionsPrefix(ksp_semi_xyz,"momentum_",ierr) > > and > > call KSPSetOptionsPrefix(ksp,"poisson_",ierr) > > so that I can choose separate ksp/pc options for my momentum and poisson equations through command line e.g. > > -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg > > In general, I need to use boomeramg as the preconditioner and gmres as the solver for my poisson eqn, separate from my momentum eqn, which has its own default pc and ksp. Is the above the correct way? > > Thanks! > > -- > Yours sincerely, > > TAY wee-beng > From jianjun.xiao at kit.edu Mon Mar 17 15:51:11 2014 From: jianjun.xiao at kit.edu (Xiao, Jianjun (IKET)) Date: Mon, 17 Mar 2014 21:51:11 +0100 Subject: [petsc-users] undefined reference to `dmsetmattype_' In-Reply-To: <87d2hkr8zq.fsf@jedbrown.org> References: <56D054AF2E93E044AC1D2685709D2868D8BBE2EE24@KIT-MSX-07.kit.edu>, <87d2hkr8zq.fsf@jedbrown.org> Message-ID: <56D054AF2E93E044AC1D2685709D2868D8BBE2EE25@KIT-MSX-07.kit.edu> Dear Jed, Thank you. I am using 'master' now. 
But I have another error DMDABoundaryType :: bx, by, bz 1 Error: Unclassifiable statement at (1) JJ ________________________________________ From: Jed Brown [jed at jedbrown.org] Sent: Monday, March 17, 2014 7:11 PM To: Xiao, Jianjun (IKET); petsc-users at mcs.anl.gov Subject: Re: [petsc-users] undefined reference to `dmsetmattype_' "Xiao, Jianjun (IKET)" writes: > Hello, > > After the version was upgraded to 3.4.4, I got the error "undefined reference to `dmsetmattype_' ". I was using PETSc-dev, and it worked fine. > > The matrix was created as below > > CALL DMSetMatType(da,MATMPISBAIJ,ierr) > CALL DMCreateMatrix(da,mat,ierr) Use 'master' for this interface. The MatType argument to DMCreateMatrix was removed between v3.4 and 'master'. Barry added the missing Fortran stub (dmsetmattype_) in the commit that make this interface change. You're already using the new interface, so stick with 'master' until we make a release (soon). From jed at jedbrown.org Mon Mar 17 15:54:41 2014 From: jed at jedbrown.org (Jed Brown) Date: Mon, 17 Mar 2014 21:54:41 +0100 Subject: [petsc-users] undefined reference to `dmsetmattype_' In-Reply-To: <56D054AF2E93E044AC1D2685709D2868D8BBE2EE25@KIT-MSX-07.kit.edu> References: <56D054AF2E93E044AC1D2685709D2868D8BBE2EE24@KIT-MSX-07.kit.edu> <87d2hkr8zq.fsf@jedbrown.org> <56D054AF2E93E044AC1D2685709D2868D8BBE2EE25@KIT-MSX-07.kit.edu> Message-ID: <87y508pmum.fsf@jedbrown.org> "Xiao, Jianjun (IKET)" writes: > Dear Jed, > > Thank you. > > I am using 'master' now. But I have another error > > DMDABoundaryType :: bx, by, bz > 1 > Error: Unclassifiable statement at (1) "DMDABoundaryType has become DMBoundaryType, and all the enumeration values have also been renamed." http://www.mcs.anl.gov/petsc/documentation/changes/dev.html -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From dafang.wang at jhu.edu Mon Mar 17 17:19:54 2014 From: dafang.wang at jhu.edu (Dafang Wang) Date: Mon, 17 Mar 2014 18:19:54 -0400 Subject: [petsc-users] When does DIVERGED_LINE_SEARCH Happen? In-Reply-To: <3C53DDCF-455A-4A28-A998-28CC7D96CC7E@mcs.anl.gov> References: <5323B36F.2070305@jhu.edu> <3C53DDCF-455A-4A28-A998-28CC7D96CC7E@mcs.anl.gov> Message-ID: <5327750A.5010004@jhu.edu> Hi Barry, Thanks for your tips. I have read the webpage you mentioned many times before, but still I have been stuck on the line-search problem for weeks. I cannot guarantee my Jacobian is correct but I believe an incorrect Jacobian is very unlikely. My Jacobian-calculation code has been under test for a year with both analytical and realistic models, and the results have been good until recently when I ran a very realistic physical model. Also, I looked up the implementation of SNESSolve_NEWTONLS() in "ls.c". According to the algorithm, when the function "SNESLineSearchApply()" does not succeed, one may encounter two possible outcomes: CONVERGED_SNORM_RELATIVE (if the search step is too small) or otherwise, DIVERGED_LINE_SEARCH. Does this mean that both these two outcomes indicate that the line search fails? I ask this question because my simulation encountered many CONVERGED_SNORM_RELATIVE. I treated them as if my nonlinear system converged, accepted the nonlinear solution, and then proceeded to the next time step of my simulation. Apparently, such practice has worked well in most cases, (even when I encountered suspicious DIVERGED_LINE_SEARCH behaviors). 
However, I wonder if there are any potential pitfalls in my practice such as missing a nonlinear solve divergence and taking a partial solution as the correct solution. Thank you very much for your time and help. Best, Dafang On 03/15/2014 11:15 AM, Barry Smith wrote: > Failed line search are almost always due to an incorrect Jacobian. Please let us know if the suggestions at http://www.mcs.anl.gov/petsc/documentation/faq.html#newton don?t help. > > Barry > > On Mar 14, 2014, at 8:57 PM, Dafang Wang wrote: > >> Hi, >> >> Does anyone know what the error code DIVERGED_LINE_SEARCH means in the SNES nonlinear solve? Or what scenario would lead to this error code? >> >> Running a solid mechanics simulation, I found that the occurrence of DIVERGED_LINE_SEARCH was very unpredictable and sensitive to the input values to my nonlinear system, although my system should not be that unstable. As shown by the two examples below, my system diverged in one case and converged in the other, although the input values in these two cases differed by only 1e-4, >> >> Moreover, the Newton steps in the two cases were very similar up to NL step 1. Since then, however, Case 1 encountered a line-search divergence whereas Case 2 converged successfully. This is my main confusion. (Note that each residual vector contains 3e04 DOF, so when their L2 norms differ within 1e-4, the two systems should be very close.) >> >> My simulation input consists of two scalar values (p1 and p2), each of which acts as a constant pressure boundary condition. >> >> Case 1, diverge: >> p1= -10.190869 p2= -2.367555 >> NL step 0, |residual|_2 = 1.621402e-02 >> Line search: Using full step: fnorm 1.621401550027e-02 gnorm 7.022558235262e-05 >> NL step 1, |residual|_2 = 7.022558e-05 >> Line search: Using full step: fnorm 7.022558235262e-05 gnorm 1.636418730611e-06 >> NL step 2, |residual|_2 = 1.636419e-06 >> Nonlinear solve did not converge due to DIVERGED_LINE_SEARCH iterations 2 >> Case 2: converge: >> p1= -10.190747 p2= -2.367558 >> NL step 0, |residual|_2 = 1.621380e-02 >> Line search: Using full step: fnorm 1.621379778276e-02 gnorm 6.976373804153e-05 >> NL step 1, |residual|_2 = 6.976374e-05 >> Line search: Using full step: fnorm 6.976373804153e-05 gnorm 4.000992847275e-07 >> NL step 2, |residual|_2 = 4.000993e-07 >> Line search: Using full step: fnorm 4.000992847275e-07 gnorm 1.621646014441e-08 >> NL step 3, |residual|_2 = 1.621646e-08 >> Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE iterations 3 >> >> Aside from the input values, the initial solution in both cases may differ very slightly. (Each case is one time step in a time-sequence simulation. The two cases behaved nearly identically up to the last time step before the step shown above, so their initial solutions may differ by a cumulative error but such error should be very small.) Is it possible that little difference in initial guess leads to different local minimum regions where the line search in Case 1 failed? >> >> Any comments will be greatly appreciated. 
>> >> Thanks, >> Dafang >> -- >> Dafang Wang, Ph.D >> Postdoctoral Fellow >> Institute of Computational Medicine >> Department of Biomedical Engineering >> Johns Hopkins University >> Hackerman Hall Room 218 >> Baltimore, MD, 21218 -- Dafang Wang, Ph.D Postdoctoral Fellow Institute of Computational Medicine Department of Biomedical Engineering Johns Hopkins University Hackerman Hall Room 218 Baltimore, MD, 21218 From prbrune at gmail.com Mon Mar 17 17:27:02 2014 From: prbrune at gmail.com (Peter Brune) Date: Mon, 17 Mar 2014 17:27:02 -0500 Subject: [petsc-users] When does DIVERGED_LINE_SEARCH Happen? In-Reply-To: <5327750A.5010004@jhu.edu> References: <5323B36F.2070305@jhu.edu> <3C53DDCF-455A-4A28-A998-28CC7D96CC7E@mcs.anl.gov> <5327750A.5010004@jhu.edu> Message-ID: This may be related to a bug we had reported before to petsc-maint: https://bitbucket.org/petsc/petsc/commits/ced04f9d467b04aa83a18d3f8875c7f72c17217a What version of PETSc are you running? Also, what happens if you set -snes_stol to zero? Thanks, - Peter On Mon, Mar 17, 2014 at 5:19 PM, Dafang Wang wrote: > Hi Barry, > > Thanks for your tips. I have read the webpage you mentioned many times > before, but still I have been stuck on the line-search problem for weeks. > > I cannot guarantee my Jacobian is correct but I believe an incorrect > Jacobian is very unlikely. My Jacobian-calculation code has been under test > for a year with both analytical and realistic models, and the results have > been good until recently when I ran a very realistic physical model. > > Also, I looked up the implementation of SNESSolve_NEWTONLS() in "ls.c". > According to the algorithm, when the function "SNESLineSearchApply()" does > not succeed, one may encounter two possible outcomes: > CONVERGED_SNORM_RELATIVE (if the search step is too small) or otherwise, > DIVERGED_LINE_SEARCH. Does this mean that both these two outcomes indicate > that the line search fails? > > I ask this question because my simulation encountered many > CONVERGED_SNORM_RELATIVE. I treated them as if my nonlinear system > converged, accepted the nonlinear solution, and then proceeded to the next > time step of my simulation. Apparently, such practice has worked well in > most cases, (even when I encountered suspicious DIVERGED_LINE_SEARCH > behaviors). However, I wonder if there are any potential pitfalls in my > practice such as missing a nonlinear solve divergence and taking a partial > solution as the correct solution. > > Thank you very much for your time and help. > > Best, > Dafang > > > On 03/15/2014 11:15 AM, Barry Smith wrote: > >> Failed line search are almost always due to an incorrect Jacobian. >> Please let us know if the suggestions at http://www.mcs.anl.gov/petsc/ >> documentation/faq.html#newton don't help. >> >> Barry >> >> On Mar 14, 2014, at 8:57 PM, Dafang Wang wrote: >> >> Hi, >>> >>> Does anyone know what the error code DIVERGED_LINE_SEARCH means in the >>> SNES nonlinear solve? Or what scenario would lead to this error code? >>> >>> Running a solid mechanics simulation, I found that the occurrence of >>> DIVERGED_LINE_SEARCH was very unpredictable and sensitive to the input >>> values to my nonlinear system, although my system should not be that >>> unstable. As shown by the two examples below, my system diverged in one >>> case and converged in the other, although the input values in these two >>> cases differed by only 1e-4, >>> >>> Moreover, the Newton steps in the two cases were very similar up to NL >>> step 1. 
Since then, however, Case 1 encountered a line-search divergence >>> whereas Case 2 converged successfully. This is my main confusion. (Note >>> that each residual vector contains 3e04 DOF, so when their L2 norms differ >>> within 1e-4, the two systems should be very close.) >>> >>> My simulation input consists of two scalar values (p1 and p2), each of >>> which acts as a constant pressure boundary condition. >>> >>> Case 1, diverge: >>> p1= -10.190869 p2= -2.367555 >>> NL step 0, |residual|_2 = 1.621402e-02 >>> Line search: Using full step: fnorm 1.621401550027e-02 gnorm >>> 7.022558235262e-05 >>> NL step 1, |residual|_2 = 7.022558e-05 >>> Line search: Using full step: fnorm 7.022558235262e-05 gnorm >>> 1.636418730611e-06 >>> NL step 2, |residual|_2 = 1.636419e-06 >>> Nonlinear solve did not converge due to DIVERGED_LINE_SEARCH iterations 2 >>> Case 2: converge: >>> p1= -10.190747 p2= -2.367558 >>> NL step 0, |residual|_2 = 1.621380e-02 >>> Line search: Using full step: fnorm 1.621379778276e-02 gnorm >>> 6.976373804153e-05 >>> NL step 1, |residual|_2 = 6.976374e-05 >>> Line search: Using full step: fnorm 6.976373804153e-05 gnorm >>> 4.000992847275e-07 >>> NL step 2, |residual|_2 = 4.000993e-07 >>> Line search: Using full step: fnorm 4.000992847275e-07 gnorm >>> 1.621646014441e-08 >>> NL step 3, |residual|_2 = 1.621646e-08 >>> Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE iterations 3 >>> >>> Aside from the input values, the initial solution in both cases may >>> differ very slightly. (Each case is one time step in a time-sequence >>> simulation. The two cases behaved nearly identically up to the last time >>> step before the step shown above, so their initial solutions may differ by >>> a cumulative error but such error should be very small.) Is it possible >>> that little difference in initial guess leads to different local minimum >>> regions where the line search in Case 1 failed? >>> >>> Any comments will be greatly appreciated. >>> >>> Thanks, >>> Dafang >>> -- >>> Dafang Wang, Ph.D >>> Postdoctoral Fellow >>> Institute of Computational Medicine >>> Department of Biomedical Engineering >>> Johns Hopkins University >>> Hackerman Hall Room 218 >>> Baltimore, MD, 21218 >>> >> > -- > Dafang Wang, Ph.D > Postdoctoral Fellow > Institute of Computational Medicine > Department of Biomedical Engineering > Johns Hopkins University > Hackerman Hall Room 218 > Baltimore, MD, 21218 > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dafang.wang at jhu.edu Mon Mar 17 17:37:26 2014 From: dafang.wang at jhu.edu (Dafang Wang) Date: Mon, 17 Mar 2014 18:37:26 -0400 Subject: [petsc-users] When does DIVERGED_LINE_SEARCH Happen? In-Reply-To: References: <5323B36F.2070305@jhu.edu> <3C53DDCF-455A-4A28-A998-28CC7D96CC7E@mcs.anl.gov> <5327750A.5010004@jhu.edu> Message-ID: <53277926.3060904@jhu.edu> Hi Peter, My version of PETSc (v3.4.3) does not contain the bug fix you mentioned: "+ ierr = SNESLineSearchSetNorms(linesearch,xnorm,fnorm,ynorm);CHKERRQ(ierr);" Would that be a problem? I typically used the default value of -snes_stol, never setting it to zero. I will let you know soon if you believe this is important. Cheers, Dafang On 03/17/2014 06:27 PM, Peter Brune wrote: > This may be related to a bug we had reported before to petsc-maint: > > https://bitbucket.org/petsc/petsc/commits/ced04f9d467b04aa83a18d3f8875c7f72c17217a > > What version of PETSc are you running? Also, what happens if you > set -snes_stol to zero? 
> > Thanks, > > - Peter > > > On Mon, Mar 17, 2014 at 5:19 PM, Dafang Wang > wrote: > > Hi Barry, > > Thanks for your tips. I have read the webpage you mentioned many > times before, but still I have been stuck on the line-search > problem for weeks. > > I cannot guarantee my Jacobian is correct but I believe an > incorrect Jacobian is very unlikely. My Jacobian-calculation code > has been under test for a year with both analytical and realistic > models, and the results have been good until recently when I ran a > very realistic physical model. > > Also, I looked up the implementation of SNESSolve_NEWTONLS() in > "ls.c". According to the algorithm, when the function > "SNESLineSearchApply()" does not succeed, one may encounter two > possible outcomes: CONVERGED_SNORM_RELATIVE (if the search step is > too small) or otherwise, DIVERGED_LINE_SEARCH. Does this mean that > both these two outcomes indicate that the line search fails? > > I ask this question because my simulation encountered many > CONVERGED_SNORM_RELATIVE. I treated them as if my nonlinear system > converged, accepted the nonlinear solution, and then proceeded to > the next time step of my simulation. Apparently, such practice has > worked well in most cases, (even when I encountered suspicious > DIVERGED_LINE_SEARCH behaviors). However, I wonder if there are > any potential pitfalls in my practice such as missing a nonlinear > solve divergence and taking a partial solution as the correct > solution. > > Thank you very much for your time and help. > > Best, > Dafang > > > On 03/15/2014 11:15 AM, Barry Smith wrote: > > Failed line search are almost always due to an incorrect > Jacobian. Please let us know if the suggestions at > http://www.mcs.anl.gov/petsc/documentation/faq.html#newton > don't help. > > Barry > > On Mar 14, 2014, at 8:57 PM, Dafang Wang > wrote: > > Hi, > > Does anyone know what the error code DIVERGED_LINE_SEARCH > means in the SNES nonlinear solve? Or what scenario would > lead to this error code? > > Running a solid mechanics simulation, I found that the > occurrence of DIVERGED_LINE_SEARCH was very unpredictable > and sensitive to the input values to my nonlinear system, > although my system should not be that unstable. As shown > by the two examples below, my system diverged in one case > and converged in the other, although the input values in > these two cases differed by only 1e-4, > > Moreover, the Newton steps in the two cases were very > similar up to NL step 1. Since then, however, Case 1 > encountered a line-search divergence whereas Case 2 > converged successfully. This is my main confusion. (Note > that each residual vector contains 3e04 DOF, so when their > L2 norms differ within 1e-4, the two systems should be > very close.) > > My simulation input consists of two scalar values (p1 and > p2), each of which acts as a constant pressure boundary > condition. 
> > Case 1, diverge: > p1= -10.190869 p2= -2.367555 > NL step 0, |residual|_2 = 1.621402e-02 > Line search: Using full step: fnorm > 1.621401550027e-02 gnorm 7.022558235262e-05 > NL step 1, |residual|_2 = 7.022558e-05 > Line search: Using full step: fnorm > 7.022558235262e-05 gnorm 1.636418730611e-06 > NL step 2, |residual|_2 = 1.636419e-06 > Nonlinear solve did not converge due to > DIVERGED_LINE_SEARCH iterations 2 > Case 2: converge: > p1= -10.190747 p2= -2.367558 > NL step 0, |residual|_2 = 1.621380e-02 > Line search: Using full step: fnorm > 1.621379778276e-02 gnorm 6.976373804153e-05 > NL step 1, |residual|_2 = 6.976374e-05 > Line search: Using full step: fnorm > 6.976373804153e-05 gnorm 4.000992847275e-07 > NL step 2, |residual|_2 = 4.000993e-07 > Line search: Using full step: fnorm > 4.000992847275e-07 gnorm 1.621646014441e-08 > NL step 3, |residual|_2 = 1.621646e-08 > Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE > iterations 3 > > Aside from the input values, the initial solution in both > cases may differ very slightly. (Each case is one time > step in a time-sequence simulation. The two cases behaved > nearly identically up to the last time step before the > step shown above, so their initial solutions may differ by > a cumulative error but such error should be very small.) > Is it possible that little difference in initial guess > leads to different local minimum regions where the line > search in Case 1 failed? > > Any comments will be greatly appreciated. > > Thanks, > Dafang > -- > Dafang Wang, Ph.D > Postdoctoral Fellow > Institute of Computational Medicine > Department of Biomedical Engineering > Johns Hopkins University > Hackerman Hall Room 218 > Baltimore, MD, 21218 > > > -- > Dafang Wang, Ph.D > Postdoctoral Fellow > Institute of Computational Medicine > Department of Biomedical Engineering > Johns Hopkins University > Hackerman Hall Room 218 > Baltimore, MD, 21218 > > -- Dafang Wang, Ph.D Postdoctoral Fellow Institute of Computational Medicine Department of Biomedical Engineering Johns Hopkins University Hackerman Hall Room 218 Baltimore, MD, 21218 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From zonexo at gmail.com Tue Mar 18 01:58:37 2014 From: zonexo at gmail.com (TAY wee-beng) Date: Tue, 18 Mar 2014 14:58:37 +0800 Subject: [petsc-users] How to use command line option for separate matrix In-Reply-To: References: <53270922.1090104@gmail.com> Message-ID: <5327EE9D.2080407@gmail.com> Hi Barry, My command line is : mpiexec -np 46 ./a.out -options_left -poisson_ksp_view -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg > log My result is : /*KSP Object:(poisson_) 46 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object:(poisson_) 46 MPI processes type: hypre HYPRE BoomerAMG preconditioning HYPRE BoomerAMG: Cycle type V HYPRE BoomerAMG: Maximum number of levels 25 HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 HYPRE BoomerAMG: Convergence tolerance PER hypre call 0 HYPRE BoomerAMG: Threshold for strong coupling 0.25 HYPRE BoomerAMG: Interpolation truncation factor 0 HYPRE BoomerAMG: Interpolation: max elements per row 0 HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 HYPRE BoomerAMG: Maximum row sums 0.9 HYPRE BoomerAMG: Sweeps down 1 HYPRE BoomerAMG: Sweeps up 1 HYPRE BoomerAMG: Sweeps on coarse 1 HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax on coarse Gaussian-elimination HYPRE BoomerAMG: Relax weight (all) 1 HYPRE BoomerAMG: Outer relax weight (all) 1 HYPRE BoomerAMG: Using CF-relaxation HYPRE BoomerAMG: Measure type local HYPRE BoomerAMG: Coarsen type Falgout HYPRE BoomerAMG: Interpolation type classical linear system matrix = precond matrix: Matrix Object: 46 MPI processes .... ... *//**//*#PETSc Option Table entries: -options_left -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg #End of PETSc Option Table entries There is one unused database option. It is: Option left: name:-poisson_pc_type_hypre value: boomeramg*/ It seems that it is using boomeramg but why does it say "one unused database option"? Did I do something wrong? Also if only my RHS of the Poisson eqn's changes, do I set the ksp and pc once at the start? E.g. : call KSPSetType(ksp,ksptype,ierr) ksptype=KSPGMRES call PCSetType(pc,'hypre',ierr) call PCHYPREGetType(pc,'boomeramg',ierr) or do I have to do it at each time step? Thank you Yours sincerely, TAY wee-beng On 18/3/2014 2:23 AM, Barry Smith wrote: > Yes. You can run with -poisson_ksp_view and -options_left to make sure the options you provide are actually used. > > Barry > > On Mar 17, 2014, at 9:39 AM, TAY wee-beng wrote: > >> Hi, >> >> I use >> >> call KSPSetOptionsPrefix(ksp_semi_xyz,"momentum_",ierr) >> >> and >> >> call KSPSetOptionsPrefix(ksp,"poisson_",ierr) >> >> so that I can choose separate ksp/pc options for my momentum and poisson equations through command line e.g. >> >> -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg >> >> In general, I need to use boomeramg as the preconditioner and gmres as the solver for my poisson eqn, separate from my momentum eqn, which has its own default pc and ksp. Is the above the correct way? >> >> Thanks! 
>> >> -- >> Yours sincerely, >> >> TAY wee-beng >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 18 02:07:29 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 18 Mar 2014 02:07:29 -0500 Subject: [petsc-users] How to use command line option for separate matrix In-Reply-To: <5327EE9D.2080407@gmail.com> References: <53270922.1090104@gmail.com> <5327EE9D.2080407@gmail.com> Message-ID: On Tue, Mar 18, 2014 at 1:58 AM, TAY wee-beng wrote: > Hi Barry, > > My command line is : > > mpiexec -np 46 ./a.out -options_left -poisson_ksp_view -poisson_ksp_type > gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg > log > ^^^^^^^^^^^^^^^^^^^^^^^^^^^ It should be -pisson_pc_hypre_type boomeramg, but BoomerAMG is the default. Matt My result is : > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > *KSP Object:(poisson_) 46 MPI processes type: gmres GMRES: > restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization > with no iterative refinement GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero tolerances: > relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning > using PRECONDITIONED norm type for convergence test PC Object:(poisson_) 46 > MPI processes type: hypre HYPRE BoomerAMG preconditioning HYPRE > BoomerAMG: Cycle type V HYPRE BoomerAMG: Maximum number of levels 25 > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0 HYPRE > BoomerAMG: Threshold for strong coupling 0.25 HYPRE BoomerAMG: > Interpolation truncation factor 0 HYPRE BoomerAMG: Interpolation: max > elements per row 0 HYPRE BoomerAMG: Number of levels of aggressive > coarsening 0 HYPRE BoomerAMG: Number of paths for aggressive coarsening > 1 HYPRE BoomerAMG: Maximum row sums 0.9 HYPRE BoomerAMG: Sweeps > down 1 HYPRE BoomerAMG: Sweeps up 1 HYPRE > BoomerAMG: Sweeps on coarse 1 HYPRE BoomerAMG: Relax down > symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax up > symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax on coarse > Gaussian-elimination HYPRE BoomerAMG: Relax weight (all) 1 > HYPRE BoomerAMG: Outer relax weight (all) 1 HYPRE BoomerAMG: Using > CF-relaxation HYPRE BoomerAMG: Measure type local HYPRE > BoomerAMG: Coarsen type Falgout HYPRE BoomerAMG: Interpolation > type classical linear system matrix = precond matrix: Matrix Object: > 46 MPI processes .... ... * > > > > > > > *#PETSc Option Table entries: -options_left -poisson_ksp_type gmres > -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg #End of PETSc > Option Table entries There is one unused database option. It is: Option > left: name:-poisson_pc_type_hypre value: boomeramg* > > It seems that it is using boomeramg but why does it say "one unused > database option"? > > Did I do something wrong? > > Also if only my RHS of the Poisson eqn's changes, do I set the ksp and pc > once at the start? E.g. : > > call KSPSetType(ksp,ksptype,ierr) > > ksptype=KSPGMRES > > call PCSetType(pc,'hypre',ierr) > > call PCHYPREGetType(pc,'boomeramg',ierr) > > or do I have to do it at each time step? > > Thank you > > Yours sincerely, > > TAY wee-beng > > On 18/3/2014 2:23 AM, Barry Smith wrote: > > Yes. You can run with -poisson_ksp_view and -options_left to make sure the options you provide are actually used. 
> > Barry > > On Mar 17, 2014, at 9:39 AM, TAY wee-beng wrote: > > > Hi, > > I use > > call KSPSetOptionsPrefix(ksp_semi_xyz,"momentum_",ierr) > > and > > call KSPSetOptionsPrefix(ksp,"poisson_",ierr) > > so that I can choose separate ksp/pc options for my momentum and poisson equations through command line e.g. > > -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg > > In general, I need to use boomeramg as the preconditioner and gmres as the solver for my poisson eqn, separate from my momentum eqn, which has its own default pc and ksp. Is the above the correct way? > > Thanks! > > -- > Yours sincerely, > > TAY wee-beng > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From zonexo at gmail.com Tue Mar 18 02:28:56 2014 From: zonexo at gmail.com (TAY wee-beng) Date: Tue, 18 Mar 2014 15:28:56 +0800 Subject: [petsc-users] How to use command line option for separate matrix In-Reply-To: References: <53270922.1090104@gmail.com> <5327EE9D.2080407@gmail.com> Message-ID: <5327F5B8.5020004@gmail.com> On 18/3/2014 3:07 PM, Matthew Knepley wrote: > On Tue, Mar 18, 2014 at 1:58 AM, TAY wee-beng > wrote: > > Hi Barry, > > My command line is : > > mpiexec -np 46 ./a.out -options_left -poisson_ksp_view > -poisson_ksp_type gmres -poisson_pc_type hypre > -poisson_pc_type_hypre boomeramg > log > > ^^^^^^^^^^^^^^^^^^^^^^^^^^^ > > It should be -pisson_pc_hypre_type boomeramg, but BoomerAMG is the > default. So in other words, -poisson_pc_type hypre is sufficient, is that so? Also if only my RHS of the Poisson eqn's changes at every timestep, do I set the ksp and pc only once at the start? E.g. : call KSPSetType(ksp,ksptype,ierr) ksptype=KSPGMRES call PCSetType(pc,'hypre',ierr) call PCHYPREGetType(pc,'boomeramg',ierr) or do I have to do it at every time step? 
> > Matt > > My result is : > > /*KSP Object:(poisson_) 46 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object:(poisson_) 46 MPI processes > type: hypre > HYPRE BoomerAMG preconditioning > HYPRE BoomerAMG: Cycle type V > HYPRE BoomerAMG: Maximum number of levels 25 > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0 > HYPRE BoomerAMG: Threshold for strong coupling 0.25 > HYPRE BoomerAMG: Interpolation truncation factor 0 > HYPRE BoomerAMG: Interpolation: max elements per row 0 > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > HYPRE BoomerAMG: Maximum row sums 0.9 > HYPRE BoomerAMG: Sweeps down 1 > HYPRE BoomerAMG: Sweeps up 1 > HYPRE BoomerAMG: Sweeps on coarse 1 > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > HYPRE BoomerAMG: Relax on coarse Gaussian-elimination > HYPRE BoomerAMG: Relax weight (all) 1 > HYPRE BoomerAMG: Outer relax weight (all) 1 > HYPRE BoomerAMG: Using CF-relaxation > HYPRE BoomerAMG: Measure type local > HYPRE BoomerAMG: Coarsen type Falgout > HYPRE BoomerAMG: Interpolation type classical > linear system matrix = precond matrix: > Matrix Object: 46 MPI processes > .... > > ... > *//*#PETSc Option Table entries: > -options_left > -poisson_ksp_type gmres > -poisson_pc_type hypre > -poisson_pc_type_hypre boomeramg > #End of PETSc Option Table entries > There is one unused database option. It is: > Option left: name:-poisson_pc_type_hypre value: boomeramg*/ > > It seems that it is using boomeramg but why does it say "one > unused database option"? > > Did I do something wrong? > > Also if only my RHS of the Poisson eqn's changes, do I set the ksp > and pc once at the start? E.g. : > > call KSPSetType(ksp,ksptype,ierr) > > ksptype=KSPGMRES > > call PCSetType(pc,'hypre',ierr) > > call PCHYPREGetType(pc,'boomeramg',ierr) > > or do I have to do it at each time step? > > Thank you > > Yours sincerely, > > TAY wee-beng > > On 18/3/2014 2:23 AM, Barry Smith wrote: >> Yes. You can run with -poisson_ksp_view and -options_left to make sure the options you provide are actually used. >> >> Barry >> >> On Mar 17, 2014, at 9:39 AM, TAY wee-beng wrote: >> >>> Hi, >>> >>> I use >>> >>> call KSPSetOptionsPrefix(ksp_semi_xyz,"momentum_",ierr) >>> >>> and >>> >>> call KSPSetOptionsPrefix(ksp,"poisson_",ierr) >>> >>> so that I can choose separate ksp/pc options for my momentum and poisson equations through command line e.g. >>> >>> -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg >>> >>> In general, I need to use boomeramg as the preconditioner and gmres as the solver for my poisson eqn, separate from my momentum eqn, which has its own default pc and ksp. Is the above the correct way? >>> >>> Thanks! >>> >>> -- >>> Yours sincerely, >>> >>> TAY wee-beng >>> > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From zonexo at gmail.com Tue Mar 18 02:55:27 2014 From: zonexo at gmail.com (TAY wee-beng) Date: Tue, 18 Mar 2014 15:55:27 +0800 Subject: [petsc-users] How to use command line option for separate matrix In-Reply-To: References: <53270922.1090104@gmail.com> <5327EE9D.2080407@gmail.com> Message-ID: <5327FBEF.2060007@gmail.com> Thank you Yours sincerely, TAY wee-beng On 18/3/2014 3:07 PM, Matthew Knepley wrote: > On Tue, Mar 18, 2014 at 1:58 AM, TAY wee-beng > wrote: > > Hi Barry, > > My command line is : > > mpiexec -np 46 ./a.out -options_left -poisson_ksp_view > -poisson_ksp_type gmres -poisson_pc_type hypre > -poisson_pc_type_hypre boomeramg > log > > ^^^^^^^^^^^^^^^^^^^^^^^^^^^ > > It should be -pisson_pc_hypre_type boomeramg, but BoomerAMG is the > default. > > Matt After using mpiexec -np 46 ./a-gmres.out -options_left -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_hypre_type boomeramg > log I still get: #PETSc Option Table entries: -options_left -poisson_ksp_type gmres -poisson_pc_hypre_type boomeramg -poisson_pc_type hypre #End of PETSc Option Table entries There is one unused database option. It is: Option left: name:-poisson_pc_hypre_type value: boomeramg Did I missed out something? ~ > > My result is : > > /*KSP Object:(poisson_) 46 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object:(poisson_) 46 MPI processes > type: hypre > HYPRE BoomerAMG preconditioning > HYPRE BoomerAMG: Cycle type V > HYPRE BoomerAMG: Maximum number of levels 25 > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0 > HYPRE BoomerAMG: Threshold for strong coupling 0.25 > HYPRE BoomerAMG: Interpolation truncation factor 0 > HYPRE BoomerAMG: Interpolation: max elements per row 0 > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > HYPRE BoomerAMG: Maximum row sums 0.9 > HYPRE BoomerAMG: Sweeps down 1 > HYPRE BoomerAMG: Sweeps up 1 > HYPRE BoomerAMG: Sweeps on coarse 1 > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > HYPRE BoomerAMG: Relax on coarse Gaussian-elimination > HYPRE BoomerAMG: Relax weight (all) 1 > HYPRE BoomerAMG: Outer relax weight (all) 1 > HYPRE BoomerAMG: Using CF-relaxation > HYPRE BoomerAMG: Measure type local > HYPRE BoomerAMG: Coarsen type Falgout > HYPRE BoomerAMG: Interpolation type classical > linear system matrix = precond matrix: > Matrix Object: 46 MPI processes > .... > > ... > *//*#PETSc Option Table entries: > -options_left > -poisson_ksp_type gmres > -poisson_pc_type hypre > -poisson_pc_type_hypre boomeramg > #End of PETSc Option Table entries > There is one unused database option. It is: > Option left: name:-poisson_pc_type_hypre value: boomeramg*/ > > It seems that it is using boomeramg but why does it say "one > unused database option"? > > Did I do something wrong? > > Also if only my RHS of the Poisson eqn's changes, do I set the ksp > and pc once at the start? E.g. 
: > > call KSPSetType(ksp,ksptype,ierr) > > ksptype=KSPGMRES > > call PCSetType(pc,'hypre',ierr) > > call PCHYPREGetType(pc,'boomeramg',ierr) > > or do I have to do it at each time step? > > Thank you > > Yours sincerely, > > TAY wee-beng > > On 18/3/2014 2:23 AM, Barry Smith wrote: >> Yes. You can run with -poisson_ksp_view and -options_left to make sure the options you provide are actually used. >> >> Barry >> >> On Mar 17, 2014, at 9:39 AM, TAY wee-beng wrote: >> >>> Hi, >>> >>> I use >>> >>> call KSPSetOptionsPrefix(ksp_semi_xyz,"momentum_",ierr) >>> >>> and >>> >>> call KSPSetOptionsPrefix(ksp,"poisson_",ierr) >>> >>> so that I can choose separate ksp/pc options for my momentum and poisson equations through command line e.g. >>> >>> -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg >>> >>> In general, I need to use boomeramg as the preconditioner and gmres as the solver for my poisson eqn, separate from my momentum eqn, which has its own default pc and ksp. Is the above the correct way? >>> >>> Thanks! >>> >>> -- >>> Yours sincerely, >>> >>> TAY wee-beng >>> > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 18 03:16:32 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 18 Mar 2014 03:16:32 -0500 Subject: [petsc-users] How to use command line option for separate matrix In-Reply-To: <5327F5B8.5020004@gmail.com> References: <53270922.1090104@gmail.com> <5327EE9D.2080407@gmail.com> <5327F5B8.5020004@gmail.com> Message-ID: On Tue, Mar 18, 2014 at 2:28 AM, TAY wee-beng wrote: > On 18/3/2014 3:07 PM, Matthew Knepley wrote: > > On Tue, Mar 18, 2014 at 1:58 AM, TAY wee-beng wrote: > >> Hi Barry, >> >> My command line is : >> >> mpiexec -np 46 ./a.out -options_left -poisson_ksp_view -poisson_ksp_type >> gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg > log >> > > > > ^^^^^^^^^^^^^^^^^^^^^^^^^^^ > > It should be -pisson_pc_hypre_type boomeramg, but BoomerAMG is the > default. > > So in other words, -poisson_pc_type hypre is sufficient, is that so? > > Also if only my RHS of the Poisson eqn's changes at every timestep, do I > set the ksp and pc only once at the start? E.g. : > Yes, only at the start. Matt > call KSPSetType(ksp,ksptype,ierr) > > ksptype=KSPGMRES > > call PCSetType(pc,'hypre',ierr) > > call PCHYPREGetType(pc,'boomeramg',ierr) > > or do I have to do it at every time step? 
> > > Matt > > My result is : >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> *KSP Object:(poisson_) 46 MPI processes type: gmres GMRES: >> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization >> with no iterative refinement GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero tolerances: >> relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning >> using PRECONDITIONED norm type for convergence test PC Object:(poisson_) 46 >> MPI processes type: hypre HYPRE BoomerAMG preconditioning HYPRE >> BoomerAMG: Cycle type V HYPRE BoomerAMG: Maximum number of levels 25 >> HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 >> HYPRE BoomerAMG: Convergence tolerance PER hypre call 0 HYPRE >> BoomerAMG: Threshold for strong coupling 0.25 HYPRE BoomerAMG: >> Interpolation truncation factor 0 HYPRE BoomerAMG: Interpolation: max >> elements per row 0 HYPRE BoomerAMG: Number of levels of aggressive >> coarsening 0 HYPRE BoomerAMG: Number of paths for aggressive coarsening >> 1 HYPRE BoomerAMG: Maximum row sums 0.9 HYPRE BoomerAMG: Sweeps >> down 1 HYPRE BoomerAMG: Sweeps up 1 HYPRE >> BoomerAMG: Sweeps on coarse 1 HYPRE BoomerAMG: Relax down >> symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax up >> symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax on coarse >> Gaussian-elimination HYPRE BoomerAMG: Relax weight (all) 1 >> HYPRE BoomerAMG: Outer relax weight (all) 1 HYPRE BoomerAMG: Using >> CF-relaxation HYPRE BoomerAMG: Measure type local HYPRE >> BoomerAMG: Coarsen type Falgout HYPRE BoomerAMG: Interpolation >> type classical linear system matrix = precond matrix: Matrix Object: >> 46 MPI processes .... ... * >> >> >> >> >> >> >> *#PETSc Option Table entries: -options_left -poisson_ksp_type gmres >> -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg #End of PETSc >> Option Table entries There is one unused database option. It is: Option >> left: name:-poisson_pc_type_hypre value: boomeramg* >> >> It seems that it is using boomeramg but why does it say "one unused >> database option"? >> >> Did I do something wrong? >> >> Also if only my RHS of the Poisson eqn's changes, do I set the ksp and pc >> once at the start? E.g. : >> >> call KSPSetType(ksp,ksptype,ierr) >> >> ksptype=KSPGMRES >> >> call PCSetType(pc,'hypre',ierr) >> >> call PCHYPREGetType(pc,'boomeramg',ierr) >> >> or do I have to do it at each time step? >> >> Thank you >> >> Yours sincerely, >> >> TAY wee-beng >> >> On 18/3/2014 2:23 AM, Barry Smith wrote: >> >> Yes. You can run with -poisson_ksp_view and -options_left to make sure the options you provide are actually used. >> >> Barry >> >> On Mar 17, 2014, at 9:39 AM, TAY wee-beng wrote: >> >> >> Hi, >> >> I use >> >> call KSPSetOptionsPrefix(ksp_semi_xyz,"momentum_",ierr) >> >> and >> >> call KSPSetOptionsPrefix(ksp,"poisson_",ierr) >> >> so that I can choose separate ksp/pc options for my momentum and poisson equations through command line e.g. >> >> -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg >> >> In general, I need to use boomeramg as the preconditioner and gmres as the solver for my poisson eqn, separate from my momentum eqn, which has its own default pc and ksp. Is the above the correct way? >> >> Thanks! 
>> >> -- >> Yours sincerely, >> >> TAY wee-beng >> >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 18 03:18:58 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 18 Mar 2014 03:18:58 -0500 Subject: [petsc-users] How to use command line option for separate matrix In-Reply-To: <5327FBEF.2060007@gmail.com> References: <53270922.1090104@gmail.com> <5327EE9D.2080407@gmail.com> <5327FBEF.2060007@gmail.com> Message-ID: On Tue, Mar 18, 2014 at 2:55 AM, TAY wee-beng wrote: > > Thank you > > Yours sincerely, > > TAY wee-beng > > On 18/3/2014 3:07 PM, Matthew Knepley wrote: > > On Tue, Mar 18, 2014 at 1:58 AM, TAY wee-beng wrote: > >> Hi Barry, >> >> My command line is : >> >> mpiexec -np 46 ./a.out -options_left -poisson_ksp_view -poisson_ksp_type >> gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg > log >> > > > > ^^^^^^^^^^^^^^^^^^^^^^^^^^^ > > It should be -pisson_pc_hypre_type boomeramg, but BoomerAMG is the > default. > > Matt > > > After using > > mpiexec -np 46 ./a-gmres.out -options_left -poisson_ksp_type gmres > -poisson_pc_type hypre -poisson_pc_hypre_type boomeramg > log > > I still get: > > > #PETSc Option Table entries: > -options_left > -poisson_ksp_type gmres > -poisson_pc_hypre_type boomeramg > -poisson_pc_type hypre > > #End of PETSc Option Table entries > There is one unused database option. It is: > Option left: name:-poisson_pc_hypre_type value: boomeramg > > Did I missed out something? > I think you are covering this up by API calls. 
You should only have KSPCreate() KSPSetFromOptions() KSPSetOperators() KSPSolve() Matt > ~ > > > My result is : >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> *KSP Object:(poisson_) 46 MPI processes type: gmres GMRES: >> restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization >> with no iterative refinement GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero tolerances: >> relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning >> using PRECONDITIONED norm type for convergence test PC Object:(poisson_) 46 >> MPI processes type: hypre HYPRE BoomerAMG preconditioning HYPRE >> BoomerAMG: Cycle type V HYPRE BoomerAMG: Maximum number of levels 25 >> HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 >> HYPRE BoomerAMG: Convergence tolerance PER hypre call 0 HYPRE >> BoomerAMG: Threshold for strong coupling 0.25 HYPRE BoomerAMG: >> Interpolation truncation factor 0 HYPRE BoomerAMG: Interpolation: max >> elements per row 0 HYPRE BoomerAMG: Number of levels of aggressive >> coarsening 0 HYPRE BoomerAMG: Number of paths for aggressive coarsening >> 1 HYPRE BoomerAMG: Maximum row sums 0.9 HYPRE BoomerAMG: Sweeps >> down 1 HYPRE BoomerAMG: Sweeps up 1 HYPRE >> BoomerAMG: Sweeps on coarse 1 HYPRE BoomerAMG: Relax down >> symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax up >> symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax on coarse >> Gaussian-elimination HYPRE BoomerAMG: Relax weight (all) 1 >> HYPRE BoomerAMG: Outer relax weight (all) 1 HYPRE BoomerAMG: Using >> CF-relaxation HYPRE BoomerAMG: Measure type local HYPRE >> BoomerAMG: Coarsen type Falgout HYPRE BoomerAMG: Interpolation >> type classical linear system matrix = precond matrix: Matrix Object: >> 46 MPI processes .... ... * >> >> >> >> >> >> >> *#PETSc Option Table entries: -options_left -poisson_ksp_type gmres >> -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg #End of PETSc >> Option Table entries There is one unused database option. It is: Option >> left: name:-poisson_pc_type_hypre value: boomeramg* >> >> It seems that it is using boomeramg but why does it say "one unused >> database option"? >> >> Did I do something wrong? >> >> Also if only my RHS of the Poisson eqn's changes, do I set the ksp and pc >> once at the start? E.g. : >> >> call KSPSetType(ksp,ksptype,ierr) >> >> ksptype=KSPGMRES >> >> call PCSetType(pc,'hypre',ierr) >> >> call PCHYPREGetType(pc,'boomeramg',ierr) >> >> or do I have to do it at each time step? >> >> Thank you >> >> Yours sincerely, >> >> TAY wee-beng >> >> On 18/3/2014 2:23 AM, Barry Smith wrote: >> >> Yes. You can run with -poisson_ksp_view and -options_left to make sure the options you provide are actually used. >> >> Barry >> >> On Mar 17, 2014, at 9:39 AM, TAY wee-beng wrote: >> >> >> Hi, >> >> I use >> >> call KSPSetOptionsPrefix(ksp_semi_xyz,"momentum_",ierr) >> >> and >> >> call KSPSetOptionsPrefix(ksp,"poisson_",ierr) >> >> so that I can choose separate ksp/pc options for my momentum and poisson equations through command line e.g. >> >> -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_type_hypre boomeramg >> >> In general, I need to use boomeramg as the preconditioner and gmres as the solver for my poisson eqn, separate from my momentum eqn, which has its own default pc and ksp. Is the above the correct way? >> >> Thanks! 
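[The prefix approach quoted just above is the right one; the leftover database option and the "covering this up by API calls" remark likely come from the hard-coded KSPSetType()/PCSetType()/PCHYPREGetType() calls (note that PCHYPREGetType() only queries the current type; the setter would be PCHYPRESetType(), but neither is needed once the options drive the solver). A minimal C sketch of the two-solver setup implied by Matt's KSPCreate()/KSPSetFromOptions()/KSPSetOperators()/KSPSolve() list, not taken from the thread: Amom and Ap are placeholder matrices and the 3.4-era KSPSetOperators() signature is assumed.

    #include <petscksp.h>
    KSP ksp_mom, ksp_poisson;

    KSPCreate(PETSC_COMM_WORLD, &ksp_mom);
    KSPSetOptionsPrefix(ksp_mom, "momentum_");
    KSPSetOperators(ksp_mom, Amom, Amom, SAME_NONZERO_PATTERN);
    KSPSetFromOptions(ksp_mom);          /* controlled only by -momentum_* options */

    KSPCreate(PETSC_COMM_WORLD, &ksp_poisson);
    KSPSetOptionsPrefix(ksp_poisson, "poisson_");
    KSPSetOperators(ksp_poisson, Ap, Ap, SAME_NONZERO_PATTERN);
    KSPSetFromOptions(ksp_poisson);      /* controlled only by -poisson_* options  */

With no KSPSetType()/PCSetType() calls left in the code, running with -poisson_ksp_type gmres -poisson_pc_type hypre -poisson_pc_hypre_type boomeramg -options_left should report no unused options, and -poisson_pc_hypre_type can even be dropped since, as Matt notes, BoomerAMG is the default hypre type.]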
>> >> -- >> Yours sincerely, >> >> TAY wee-beng >> >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From m.bahaa.eldein at gmail.com Tue Mar 18 07:44:05 2014 From: m.bahaa.eldein at gmail.com (Mohammad Bahaa) Date: Tue, 18 Mar 2014 14:44:05 +0200 Subject: [petsc-users] Using PETSC for a CFD solver In-Reply-To: References: Message-ID: The "PETSC_COMM_SELF" worked like a charm for me, thanks alot On Mon, Mar 10, 2014 at 4:50 PM, Matthew Knepley wrote: > On Mon, Mar 10, 2014 at 9:36 AM, Mohammad Bahaa wrote: > >> Hi, >> >> I'm pretty new to PETSC, so pardon me if the question is primitive >> somehow, I used *METIS *to partition my grid (represented by a system of >> linear equations Ax=b) to a number of sub-grids, say 4 sub-grids, with 4 >> different systems of linear Equations (A1x1=b1, A2x2=b2, ...), can anyone >> post an example showing how to solve these "n" sub-systems simultaneously, >> I've tried the following program, but it's not working correctly, as when I >> use *MatGetOwnershipRange *in each process I find that A1 ownership >> range is 1/4 the matrix size for the first process, while it should be all >> of it. >> > > I will answer this two ways. First, here is the "PETSc strategy" for doing > the same thing. > > 1) Write a code that assembles and solves the entire thing > > 2) Use -pc_type bjacobi -ksp_type preonly -sub_ksp_type want> -mat > > 3) You can use ParMetis inside PETSc with the MatPartitioning > > This will solve the individual systems with no coupling (this is what it > sounds like you want above). > > If you want to manage everything yourself, and you want to form individual > systems on every process, > just create the solvers using a smaller communicator. PETSC_COMM_SELF > means that every system > is serial. You can make smaller comms with MPI_Comm_split() if you want > smaller comms, but some > parallelism for each system. > > Thanks, > > Matt > > >> subroutine test_drive_2 >> >> implicit none >> >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! Include files >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! >> ! This program uses CPP for preprocessing, as indicated by the use of >> ! PETSc include files in the directory petsc/include/finclude. This >> ! convention enables use of the CPP preprocessor, which allows the use >> ! of the #include statements that define PETSc objects and variables. >> ! >> ! Use of the conventional Fortran include statements is also supported >> ! In this case, the PETsc include files are located in the directory >> ! petsc/include/foldinclude. >> ! >> ! Since one must be very careful to include each file no more than once >> ! in a Fortran routine, application programmers must exlicitly list >> ! each file needed for the various PETSc components within their >> ! program (unlike the C/C++ interface). >> ! >> ! See the Fortran section of the PETSc users manual for details. >> ! >> ! The following include statements are required for KSP Fortran programs: >> ! petscsys.h - base PETSc routines >> ! petscvec.h - vectors >> ! petscmat.h - matrices >> ! 
petscksp.h - Krylov subspace methods >> ! petscpc.h - preconditioners >> ! Other include statements may be needed if using additional PETSc >> ! routines in a Fortran program, e.g., >> ! petscviewer.h - viewers >> ! petscis.h - index sets >> ! >> >> ! main includes >> #include >> #include >> #include >> #include >> #include >> >> ! includes for F90 specific functions >> #include >> #include >> #include >> #include >> ! >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! Variable declarations >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! >> ! Variables: >> ! ksp - linear solver context >> ! ksp - Krylov subspace method context >> ! pc - preconditioner context >> ! x, b, u - approx solution, right-hand-side, exact solution vectors >> ! A - matrix that defines linear system >> ! its - iterations for convergence >> ! norm - norm of error in solution >> ! >> Vec x,b,u >> Mat A >> KSP ksp >> PC pc >> PetscReal norm,tol >> PetscErrorCode ierr >> PetscInt i,n,col(3),its,i1,i2,i3 >> PetscBool flg >> PetscMPIInt size,rank >> PetscScalar none,one,value(3), testa >> >> PetscScalar, pointer :: xx_v(:) >> PetscScalar, allocatable, dimension(:) :: myx >> PetscOffset i_x >> >> !real(4) :: myx(10), myu(10), myb(10) >> !real(8), allocatable, dimension(:) :: myx >> >> integer :: ic, nc, ncmax, nz, ncols, j >> integer :: fileunit, ione >> integer, allocatable, dimension(:,:) :: neighb, cols >> integer, allocatable, dimension(:) :: nnz, vcols >> real(8), allocatable, dimension(:,:) :: acoef, vals >> real(8), allocatable, dimension(:) :: ap, su, vvals >> character (len=100) :: rankstring, filename, folder >> >> real(8) :: atol, rtol, dtol >> integer :: mxit, istart, iend >> real(8) :: rvar, minvalx >> >> >> call PetscInitialize(PETSC_NULL_CHARACTER,ierr) >> call MPI_Comm_size(PETSC_COMM_WORLD,size,ierr) >> call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr) >> >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! Load the linear system >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> >> !mbs read data file for experimentation >> !if(rank.EQ.0) read(*,'(A)'), folder >> folder = '60' >> write(rankstring,'(I)'), rank >> filename = trim(adjustl(folder)) // '/linearsys_' // >> trim(adjustl(rankstring)) // '.txt' >> fileunit = 9000 + rank >> open(unit=fileunit, file=trim(filename)) >> >> read(fileunit,*), ncmax >> read(fileunit,*), nc >> read(fileunit,*), >> >> allocate( neighb(6,ncmax), acoef(6,ncmax), ap(ncmax), su(ncmax) ) >> allocate( nnz(0:ncmax-1), cols(6,0:ncmax-1), vals(6,0:ncmax-1) ) >> allocate( vcols(0:ncmax-1), vvals(0:ncmax-1) ) >> !allocate( xx_v(ncmax), myx(ncmax) ) >> !allocate( xx_v(0:ncmax), myx(0:ncmax) ) >> allocate( myx(ncmax) ) >> >> do i=1,nc >> read(fileunit,'(I)'), ic >> read(fileunit,'(6I)'), ( neighb(j,ic), j=1,6 ) >> read(fileunit,'(6F)'), ( acoef(j,ic), j=1,6 ) >> read(fileunit,'(2F)'), ap(ic), su(ic) >> read(fileunit,*), >> enddo >> >> close(fileunit) >> >> nz = 7 >> nnz = 0 >> do ic=0,nc-1 >> >> ! values for coefficient matrix (diagonal) >> nnz(ic) = nnz(ic) + 1 >> cols(nnz(ic),ic) = ic >> vals(nnz(ic),ic) = ap(ic+1) >> >> ! values for coefficient matrix (off diagonal) >> do j=1,6 >> if(neighb(j,ic+1).GT.0)then >> nnz(ic) = nnz(ic) + 1 >> cols(nnz(ic),ic) = neighb(j,ic+1) - 1 >> vals(nnz(ic),ic) = acoef(j,ic+1) >> endif >> enddo >> >> ! values for RHS >> vcols(ic) = ic >> vvals(ic) = su(ic+1) >> >> enddo >> >> ! 
add dummy values for remaining rows (if any) >> if(ncmax.GT.nc)then >> do ic=nc,ncmax-1 >> ! coeff matrix >> nnz(ic) = 1 >> cols(nnz(ic),ic) = ic >> vals(nnz(ic),ic) = 1.0 >> ! RHS >> vcols(ic) = ic >> vvals(ic) = 0.0 >> enddo >> endif >> >> print*, 'rank', rank, 'says nc is', nc >> >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! Beginning of program >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> >> !if (size .ne. 1) then >> ! call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr) >> ! if (rank .eq. 0) then >> ! write(6,*) 'This is a uniprocessor example only!' >> ! endif >> ! SETERRQ(PETSC_COMM_WORLD,1,' ',ierr) >> !endif >> >> ione = 1 >> none = -1.0 >> one = 1.0 >> n = ncmax >> i1 = 1 >> i2 = 2 >> i3 = 3 >> call PetscOptionsGetInt(PETSC_NULL_CHARACTER,'-n',n,flg,ierr) >> >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! Compute the matrix and right-hand-side vector that define >> ! the linear system, Ax = b. >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> >> ! Create matrix. When using MatCreate(), the matrix format can >> ! be specified at runtime. >> >> call MatCreate(PETSC_COMM_WORLD,A,ierr) >> !call MatCreateSeqAij(PETSC_COMM_SELF,A,ierr) >> call MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n,ierr) >> call MatSetFromOptions(A,ierr) >> call MatSetUp(A,ierr) >> >> call MatGetOwnershipRange(A,istart,iend,ierr) >> print*, rank, istart, iend >> >> ! Assemble matrix. >> ! - Note that MatSetValues() uses 0-based row and column numbers >> ! in Fortran as well as in C (as set here in the array "col"). >> >> ! value(1) = -1.0 >> ! value(2) = 2.0 >> ! value(3) = -1.0 >> ! do 50 i=1,n-2 >> ! col(1) = i-1 >> ! col(2) = i >> ! col(3) = i+1 >> ! call MatSetValues(A,i1,i,i3,col,value,INSERT_VALUES,ierr) >> !50 continue >> ! i = n - 1 >> ! col(1) = n - 2 >> ! col(2) = n - 1 >> ! call MatSetValues(A,i1,i,i2,col,value,INSERT_VALUES,ierr) >> ! i = 0 >> ! col(1) = 0 >> ! col(2) = 1 >> ! value(1) = 2.0 >> ! value(2) = -1.0 >> ! call MatSetValues(A,i1,i,i2,col,value,INSERT_VALUES,ierr) >> ! call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >> ! call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >> >> do ic=0,ncmax-1 >> call >> MatSetValues(A,ione,ic,nnz(ic),cols(1:nnz(ic),ic),vals(1:nnz(ic),ic),INSERT_VALUES,ierr) >> enddo >> >> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >> >> ! Create vectors. Note that we form 1 vector from scratch and >> ! then duplicate as needed. >> >> call VecCreate(PETSC_COMM_WORLD,x,ierr) >> !call VecCreateSeq(PETSC_COMM_SELF,x,ierr) >> call VecSetSizes(x,PETSC_DECIDE,n,ierr) >> call VecSetFromOptions(x,ierr) >> call VecDuplicate(x,b,ierr) >> call VecDuplicate(x,u,ierr) >> >> ! Set exact solution; then compute right-hand-side vector. >> >> call VecSet(u,one,ierr) >> call VecSet(x,one*10,ierr) >> !call MatMult(A,u,b,ierr) >> >> ! set source terms vector >> call VecSetValues(b,ncmax,vcols,vvals,INSERT_VALUES,ierr) >> call VecAssemblyBegin(b,ierr) >> call VecAssemblyEnd(b,ierr) >> >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! Create the linear solver and set various options >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> >> ! Create linear solver context >> >> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >> >> ! Set operators. Here the matrix that defines the linear system >> ! also serves as the preconditioning matrix. 
>> >> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >> >> ! Set linear solver defaults for this problem (optional). >> ! - By extracting the KSP and PC contexts from the KSP context, >> ! we can then directly directly call any KSP and PC routines >> ! to set various options. >> ! - The following four statements are optional; all of these >> ! parameters could alternatively be specified at runtime via >> ! KSPSetFromOptions(); >> >> call KSPGetPC(ksp,pc,ierr) >> call PCSetType(pc,PCJACOBI,ierr) >> atol = 1.d-12 >> rtol = 1.d-12 >> dtol = 1.d10 >> mxit = 100 >> ! call KSPSetTolerances(ksp,tol,PETSC_DEFAULT_DOUBLE_PRECISION, & >> ! & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >> >> call KSPSetTolerances(ksp,atol,rtol,dtol,mxit,ierr) >> >> ! Set runtime options, e.g., >> ! -ksp_type -pc_type -ksp_monitor -ksp_rtol >> ! These options will override those specified above as long as >> ! KSPSetFromOptions() is called _after_ any other customization >> ! routines. >> >> call KSPSetFromOptions(ksp,ierr) >> >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! Solve the linear system >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> >> call KSPSetType(ksp,KSPBCGS,ierr) >> >> call KSPSolve(ksp,b,x,ierr) >> >> ! View solver info; we could instead use the option -ksp_view >> >> !call KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD,ierr) >> >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> ! Check solution and clean up >> ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - >> >> call VecGetArrayF90(x,xx_v,ierr) >> !xx_v = 5.1d0 >> !do ic=1,ncmax >> ! myx(ic) = xx_v(ic) >> !enddo >> myx = xx_v >> !call VecGetArray(x,myx,i_x,ierr) >> !value = x_array(i_x + 1) >> !call VecRestoreArray(x,myx,i_x,ierr) >> !rvar = xx_v(3) >> call VecRestoreArrayF90(x,xx_v,ierr) >> ! >> !call VecGetArrayF90(b,xx_v,ierr) >> !myb = xx_v >> !call VecRestoreArrayF90(x,xx_v,ierr) >> ! >> !call VecView(x,PETSC_VIEWER_STDOUT_SELF) >> !call MatView(a,PETSC_VIEWER_STDOUT_SELF) >> >> !print*, 'rank', rank, 'says max x is', maxval(myx) >> ! print*, xx_v >> >> ! Check the error >> >> call MatMult(A,x,u,ierr) >> call VecAXPY(u,none,b,ierr) >> call VecNorm(u,NORM_2,norm,ierr) >> call KSPGetIterationNumber(ksp,its,ierr) >> >> if (norm .gt. 1.e-12) then >> write(6,100) norm,its >> else >> write(6,200) its >> endif >> 100 format('Norm of error = ',e11.4,', Iterations = ',i5) >> 200 format('Norm of error < 1.e-12,Iterations = ',i5) >> >> !call KSPGetSolution(ksp,myx) >> >> minvalx = 1.0e15 >> do ic=1,ncmax >> if(myx(ic).LT.minvalx) minvalx = myx(ic) >> enddo >> >> !write(*,300), rank, maxval(myx(1:nc)), minvalx >> >> if(rank.EQ.0) print*, myx >> >> 300 format('Rank ', I1, ' says max/min x are: ', F12.4, ' / ', F12.4) >> >> ! Free work space. All PETSc objects should be destroyed when they >> ! are no longer needed. >> >> call VecDestroy(x,ierr) >> call VecDestroy(u,ierr) >> call VecDestroy(b,ierr) >> call MatDestroy(A,ierr) >> call KSPDestroy(ksp,ierr) >> call PetscFinalize(ierr) >> >> continue >> >> end >> >> -- >> Mohamamd Bahaa ElDin >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Mohamamd Bahaa ElDin -------------- next part -------------- An HTML attachment was scrubbed... 
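[The program above builds its matrix and vectors on PETSC_COMM_WORLD; the fix that "worked like a charm" is to keep every object on PETSC_COMM_SELF so that each rank owns and solves its whole subsystem independently. A stripped-down C sketch of that per-rank serial solve, not taken from the thread: nloc, the assembly loops and the nonzero estimate 7 are placeholders, and the 3.4-era KSPSetOperators() signature is assumed.

    #include <petscksp.h>
    Mat A;  Vec x, b;  KSP ksp;

    MatCreateSeqAIJ(PETSC_COMM_SELF, nloc, nloc, 7, NULL, &A);
    /* ... MatSetValues()/MatAssemblyBegin()/MatAssemblyEnd() for this rank's system ... */
    VecCreateSeq(PETSC_COMM_SELF, nloc, &x);
    VecDuplicate(x, &b);
    /* ... VecSetValues()/VecAssemblyBegin()/VecAssemblyEnd() for this rank's RHS ... */

    KSPCreate(PETSC_COMM_SELF, &ksp);           /* one independent solver per rank */
    KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);

The alternative Matt mentions earlier in the thread is to assemble one global system and run it with -ksp_type preonly -pc_type bjacobi -sub_ksp_type gmres, which gives the same uncoupled block solves without managing communicators by hand.]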
URL: From m.bahaa.eldein at gmail.com Tue Mar 18 07:53:26 2014 From: m.bahaa.eldein at gmail.com (Mohammad Bahaa) Date: Tue, 18 Mar 2014 14:53:26 +0200 Subject: [petsc-users] Custom vector owenrship ranges Message-ID: I'm using "PETSC_COMM_SELF" communicator for running n serial independent processes, I need to sum up a certain vector from the n processes in one vector, however, vectors involved in each process vary in size, and I couldn't find any function to define custom ownership ranges, so assuming I have a 4 processes run with each computing an "x" vector as follows: 1. process (1) with x of length 51 2. process (2) with x of length 49 3. process (3) with x of length 52 4. process (4) with x of length 48 The processes sum up to 100 elements, when I define a vector "x_all" of size "100" with "PETSC_COMM_WORLD" communicator, the ownership ranges are equal, which isn't the case, how to customize them ? -- Mohamamd Bahaa ElDin -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 18 08:09:28 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 18 Mar 2014 08:09:28 -0500 Subject: [petsc-users] Custom vector owenrship ranges In-Reply-To: References: Message-ID: On Tue, Mar 18, 2014 at 7:53 AM, Mohammad Bahaa wrote: > I'm using "PETSC_COMM_SELF" communicator for running n serial independent > processes, I need to sum up a certain vector from the n processes in one > vector, however, vectors involved in each process vary in size, and I > couldn't find any function to define custom ownership ranges, so assuming I > have a 4 processes run with each computing an "x" vector as follows: > > 1. process (1) with x of length 51 > 2. process (2) with x of length 49 > 3. process (3) with x of length 52 > 4. process (4) with x of length 48 > Let your local length be n, so that on proc 3 n== 52. Then VecCreate(comm, &v); VecSetSizes(v, n, PETSC_DETERMINE); VecSetFromOptions(v); VecSum(v, &sum); You could also make a parallel Vec from your Seq vecs: VecGetArray(lv, &array); VecCreateMPIWithArray(comm, 1, n, PETSC_DETERMINE, array, &v); Thanks, Matt > The processes sum up to 100 elements, when I define a vector "x_all" of > size "100" with "PETSC_COMM_WORLD" communicator, the ownership ranges are > equal, which isn't the case, how to customize them ? > > -- > Mohamamd Bahaa ElDin > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
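[Both of Matt's suggestions above can be written out in a few lines. A C sketch, not taken from the thread: nlocal is each rank's own length (51/49/52/48 in the example) and xlocal is the existing sequential Vec.

    #include <petscvec.h>
    Vec          xall, xwrap;
    PetscScalar *array;

    /* Variant 1: a parallel Vec whose local size differs per rank */
    VecCreate(PETSC_COMM_WORLD, &xall);
    VecSetSizes(xall, nlocal, PETSC_DETERMINE);  /* global size is summed for you */
    VecSetFromOptions(xall);
    /* ... fill the local part with VecSetValues() or VecGetArray()/VecRestoreArray() ... */

    /* Variant 2: wrap the storage of an existing sequential Vec */
    VecGetArray(xlocal, &array);
    VecCreateMPIWithArray(PETSC_COMM_WORLD, 1, nlocal, PETSC_DETERMINE, array, &xwrap);
    /* ... use xwrap collectively, e.g. VecView(xwrap, ...) ... */
    VecDestroy(&xwrap);
    VecRestoreArray(xlocal, &array);

Either way, VecGetOwnershipRange() then reports ranges that follow the per-rank sizes you supplied rather than an even split of the global length.]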
URL: From m.bahaa.eldein at gmail.com Tue Mar 18 08:20:36 2014 From: m.bahaa.eldein at gmail.com (Mohammad Bahaa) Date: Tue, 18 Mar 2014 15:20:36 +0200 Subject: [petsc-users] Custom vector owenrship ranges In-Reply-To: References: Message-ID: Forgive me as my expression "sum up" was misguiding or misplaced, I didn't mean to literally sum the values in the vectors, I meant I want to put all values from each local vector into one global vector that can be accessed by all processes, "COMM_WORLD" communicator for instance On Tue, Mar 18, 2014 at 3:09 PM, Matthew Knepley wrote: > On Tue, Mar 18, 2014 at 7:53 AM, Mohammad Bahaa wrote: > >> I'm using "PETSC_COMM_SELF" communicator for running n serial >> independent processes, I need to sum up a certain vector from the n >> processes in one vector, however, vectors involved in each process vary in >> size, and I couldn't find any function to define custom ownership ranges, >> so assuming I have a 4 processes run with each computing an "x" vector as >> follows: >> >> 1. process (1) with x of length 51 >> 2. process (2) with x of length 49 >> 3. process (3) with x of length 52 >> 4. process (4) with x of length 48 >> > > Let your local length be n, so that on proc 3 n== 52. Then > > VecCreate(comm, &v); > VecSetSizes(v, n, PETSC_DETERMINE); > VecSetFromOptions(v); > > VecSum(v, &sum); > > You could also make a parallel Vec from your Seq vecs: > > VecGetArray(lv, &array); > VecCreateMPIWithArray(comm, 1, n, PETSC_DETERMINE, array, &v); > > Thanks, > > Matt > > >> The processes sum up to 100 elements, when I define a vector "x_all" of >> size "100" with "PETSC_COMM_WORLD" communicator, the ownership ranges >> are equal, which isn't the case, how to customize them ? >> >> -- >> Mohamamd Bahaa ElDin >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Mohamamd Bahaa ElDin -------------- next part -------------- An HTML attachment was scrubbed... URL: From m.bahaa.eldein at gmail.com Tue Mar 18 08:43:59 2014 From: m.bahaa.eldein at gmail.com (Mohammad Bahaa) Date: Tue, 18 Mar 2014 15:43:59 +0200 Subject: [petsc-users] Custom vector owenrship ranges In-Reply-To: References: Message-ID: the second approach of the MPI vector did it for me, thanks On Tue, Mar 18, 2014 at 3:20 PM, Mohammad Bahaa wrote: > Forgive me as my expression "sum up" was misguiding or misplaced, I didn't > mean to literally sum the values in the vectors, I meant I want to put all > values from each local vector into one global vector that can be accessed > by all processes, "COMM_WORLD" communicator for instance > > > On Tue, Mar 18, 2014 at 3:09 PM, Matthew Knepley wrote: > >> On Tue, Mar 18, 2014 at 7:53 AM, Mohammad Bahaa > > wrote: >> >>> I'm using "PETSC_COMM_SELF" communicator for running n serial >>> independent processes, I need to sum up a certain vector from the n >>> processes in one vector, however, vectors involved in each process vary in >>> size, and I couldn't find any function to define custom ownership ranges, >>> so assuming I have a 4 processes run with each computing an "x" vector as >>> follows: >>> >>> 1. process (1) with x of length 51 >>> 2. process (2) with x of length 49 >>> 3. process (3) with x of length 52 >>> 4. process (4) with x of length 48 >>> >> >> Let your local length be n, so that on proc 3 n== 52. 
Then >> >> VecCreate(comm, &v); >> VecSetSizes(v, n, PETSC_DETERMINE); >> VecSetFromOptions(v); >> >> VecSum(v, &sum); >> >> You could also make a parallel Vec from your Seq vecs: >> >> VecGetArray(lv, &array); >> VecCreateMPIWithArray(comm, 1, n, PETSC_DETERMINE, array, &v); >> >> Thanks, >> >> Matt >> >> >>> The processes sum up to 100 elements, when I define a vector "x_all" of >>> size "100" with "PETSC_COMM_WORLD" communicator, the ownership ranges >>> are equal, which isn't the case, how to customize them ? >>> >>> -- >>> Mohamamd Bahaa ElDin >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > Mohamamd Bahaa ElDin > -- Mohamamd Bahaa ElDin -------------- next part -------------- An HTML attachment was scrubbed... URL: From prbrune at gmail.com Tue Mar 18 09:15:38 2014 From: prbrune at gmail.com (Peter Brune) Date: Tue, 18 Mar 2014 09:15:38 -0500 Subject: [petsc-users] When does DIVERGED_LINE_SEARCH Happen? In-Reply-To: <53277926.3060904@jhu.edu> References: <5323B36F.2070305@jhu.edu> <3C53DDCF-455A-4A28-A998-28CC7D96CC7E@mcs.anl.gov> <5327750A.5010004@jhu.edu> <53277926.3060904@jhu.edu> Message-ID: Is there more output from the line search? What happens when you run with -snes_linesearch_monitor? I remember there being a reason that I didn't put this update in the maintenance branch. Let me figure out exactly why and get back to you. On Mon, Mar 17, 2014 at 5:37 PM, Dafang Wang wrote: > Hi Peter, > > My version of PETSc (v3.4.3) does not contain the bug fix you mentioned: > "+ ierr = > SNESLineSearchSetNorms(linesearch,xnorm,fnorm,ynorm);CHKERRQ(ierr);" > Would that be a problem? > I typically used the default value of -snes_stol, never setting it to > zero. I will let you know soon if you believe this is important. > > It would certainly be worth a try. - Peter > Cheers, > Dafang > > > On 03/17/2014 06:27 PM, Peter Brune wrote: > > This may be related to a bug we had reported before to petsc-maint: > > > https://bitbucket.org/petsc/petsc/commits/ced04f9d467b04aa83a18d3f8875c7f72c17217a > > What version of PETSc are you running? Also, what happens if you set > -snes_stol to zero? > > Thanks, > > - Peter > > > On Mon, Mar 17, 2014 at 5:19 PM, Dafang Wang wrote: > >> Hi Barry, >> >> Thanks for your tips. I have read the webpage you mentioned many times >> before, but still I have been stuck on the line-search problem for weeks. >> >> I cannot guarantee my Jacobian is correct but I believe an incorrect >> Jacobian is very unlikely. My Jacobian-calculation code has been under test >> for a year with both analytical and realistic models, and the results have >> been good until recently when I ran a very realistic physical model. >> >> Also, I looked up the implementation of SNESSolve_NEWTONLS() in "ls.c". >> According to the algorithm, when the function "SNESLineSearchApply()" does >> not succeed, one may encounter two possible outcomes: >> CONVERGED_SNORM_RELATIVE (if the search step is too small) or otherwise, >> DIVERGED_LINE_SEARCH. Does this mean that both these two outcomes indicate >> that the line search fails? >> >> I ask this question because my simulation encountered many >> CONVERGED_SNORM_RELATIVE. I treated them as if my nonlinear system >> converged, accepted the nonlinear solution, and then proceeded to the next >> time step of my simulation. 
Apparently, such practice has worked well in >> most cases, (even when I encountered suspicious DIVERGED_LINE_SEARCH >> behaviors). However, I wonder if there are any potential pitfalls in my >> practice such as missing a nonlinear solve divergence and taking a partial >> solution as the correct solution. >> >> Thank you very much for your time and help. >> >> Best, >> Dafang >> >> >> On 03/15/2014 11:15 AM, Barry Smith wrote: >> >>> Failed line search are almost always due to an incorrect Jacobian. >>> Please let us know if the suggestions at >>> http://www.mcs.anl.gov/petsc/documentation/faq.html#newton don't help. >>> >>> Barry >>> >>> On Mar 14, 2014, at 8:57 PM, Dafang Wang wrote: >>> >>> Hi, >>>> >>>> Does anyone know what the error code DIVERGED_LINE_SEARCH means in the >>>> SNES nonlinear solve? Or what scenario would lead to this error code? >>>> >>>> Running a solid mechanics simulation, I found that the occurrence of >>>> DIVERGED_LINE_SEARCH was very unpredictable and sensitive to the input >>>> values to my nonlinear system, although my system should not be that >>>> unstable. As shown by the two examples below, my system diverged in one >>>> case and converged in the other, although the input values in these two >>>> cases differed by only 1e-4, >>>> >>>> Moreover, the Newton steps in the two cases were very similar up to NL >>>> step 1. Since then, however, Case 1 encountered a line-search divergence >>>> whereas Case 2 converged successfully. This is my main confusion. (Note >>>> that each residual vector contains 3e04 DOF, so when their L2 norms differ >>>> within 1e-4, the two systems should be very close.) >>>> >>>> My simulation input consists of two scalar values (p1 and p2), each of >>>> which acts as a constant pressure boundary condition. >>>> >>>> Case 1, diverge: >>>> p1= -10.190869 p2= -2.367555 >>>> NL step 0, |residual|_2 = 1.621402e-02 >>>> Line search: Using full step: fnorm 1.621401550027e-02 gnorm >>>> 7.022558235262e-05 >>>> NL step 1, |residual|_2 = 7.022558e-05 >>>> Line search: Using full step: fnorm 7.022558235262e-05 gnorm >>>> 1.636418730611e-06 >>>> NL step 2, |residual|_2 = 1.636419e-06 >>>> Nonlinear solve did not converge due to DIVERGED_LINE_SEARCH iterations >>>> 2 >>>> Case 2: converge: >>>> p1= -10.190747 p2= -2.367558 >>>> NL step 0, |residual|_2 = 1.621380e-02 >>>> Line search: Using full step: fnorm 1.621379778276e-02 gnorm >>>> 6.976373804153e-05 >>>> NL step 1, |residual|_2 = 6.976374e-05 >>>> Line search: Using full step: fnorm 6.976373804153e-05 gnorm >>>> 4.000992847275e-07 >>>> NL step 2, |residual|_2 = 4.000993e-07 >>>> Line search: Using full step: fnorm 4.000992847275e-07 gnorm >>>> 1.621646014441e-08 >>>> NL step 3, |residual|_2 = 1.621646e-08 >>>> Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE iterations 3 >>>> >>>> Aside from the input values, the initial solution in both cases may >>>> differ very slightly. (Each case is one time step in a time-sequence >>>> simulation. The two cases behaved nearly identically up to the last time >>>> step before the step shown above, so their initial solutions may differ by >>>> a cumulative error but such error should be very small.) Is it possible >>>> that little difference in initial guess leads to different local minimum >>>> regions where the line search in Case 1 failed? >>>> >>>> Any comments will be greatly appreciated. 
>>>> >>>> Thanks, >>>> Dafang >>>> -- >>>> Dafang Wang, Ph.D >>>> Postdoctoral Fellow >>>> Institute of Computational Medicine >>>> Department of Biomedical Engineering >>>> Johns Hopkins University >>>> Hackerman Hall Room 218 >>>> Baltimore, MD, 21218 >>>> >>> >> -- >> Dafang Wang, Ph.D >> Postdoctoral Fellow >> Institute of Computational Medicine >> Department of Biomedical Engineering >> Johns Hopkins University >> Hackerman Hall Room 218 >> Baltimore, MD, 21218 >> > > > -- > Dafang Wang, Ph.D > Postdoctoral Fellow > Institute of Computational Medicine > Department of Biomedical Engineering > Johns Hopkins University > Hackerman Hall Room 218 > Baltimore, MD, 21218 > -------------- next part -------------- An HTML attachment was scrubbed... URL: From m.bahaa.eldein at gmail.com Tue Mar 18 11:00:58 2014 From: m.bahaa.eldein at gmail.com (Mohammad Bahaa) Date: Tue, 18 Mar 2014 18:00:58 +0200 Subject: [petsc-users] Custom vector owenrship ranges In-Reply-To: References: Message-ID: I used call VecCreateMPIWithArray(PETSC_COMM_WORLD,1,nc,ncall,myx,xall,ierr) however, when I use process 0 to write a file containing the combined values (the xall vector), the values seem not to be updated by some processes, eventhough I use PetscBarrier, in other words, values locally owned by processes 0 and 2 are ok, but those owned by process 1 & 3 aren't ! On Tue, Mar 18, 2014 at 3:43 PM, Mohammad Bahaa wrote: > the second approach of the MPI vector did it for me, thanks > > > On Tue, Mar 18, 2014 at 3:20 PM, Mohammad Bahaa wrote: > >> Forgive me as my expression "sum up" was misguiding or misplaced, I >> didn't mean to literally sum the values in the vectors, I meant I want to >> put all values from each local vector into one global vector that can be >> accessed by all processes, "COMM_WORLD" communicator for instance >> >> >> On Tue, Mar 18, 2014 at 3:09 PM, Matthew Knepley wrote: >> >>> On Tue, Mar 18, 2014 at 7:53 AM, Mohammad Bahaa < >>> m.bahaa.eldein at gmail.com> wrote: >>> >>>> I'm using "PETSC_COMM_SELF" communicator for running n serial >>>> independent processes, I need to sum up a certain vector from the n >>>> processes in one vector, however, vectors involved in each process vary in >>>> size, and I couldn't find any function to define custom ownership ranges, >>>> so assuming I have a 4 processes run with each computing an "x" vector as >>>> follows: >>>> >>>> 1. process (1) with x of length 51 >>>> 2. process (2) with x of length 49 >>>> 3. process (3) with x of length 52 >>>> 4. process (4) with x of length 48 >>>> >>> >>> Let your local length be n, so that on proc 3 n== 52. Then >>> >>> VecCreate(comm, &v); >>> VecSetSizes(v, n, PETSC_DETERMINE); >>> VecSetFromOptions(v); >>> >>> VecSum(v, &sum); >>> >>> You could also make a parallel Vec from your Seq vecs: >>> >>> VecGetArray(lv, &array); >>> VecCreateMPIWithArray(comm, 1, n, PETSC_DETERMINE, array, &v); >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> The processes sum up to 100 elements, when I define a vector "x_all" of >>>> size "100" with "PETSC_COMM_WORLD" communicator, the ownership ranges >>>> are equal, which isn't the case, how to customize them ? >>>> >>>> -- >>>> Mohamamd Bahaa ElDin >>>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. 
>>> -- Norbert Wiener >>> >> >> >> >> -- >> Mohamamd Bahaa ElDin >> > > > > -- > Mohamamd Bahaa ElDin > -- Mohamamd Bahaa ElDin -------------- next part -------------- An HTML attachment was scrubbed... URL: From dafang.wang at jhu.edu Tue Mar 18 12:04:44 2014 From: dafang.wang at jhu.edu (Dafang Wang) Date: Tue, 18 Mar 2014 13:04:44 -0400 Subject: [petsc-users] When does DIVERGED_LINE_SEARCH Happen? In-Reply-To: References: <5323B36F.2070305@jhu.edu> <3C53DDCF-455A-4A28-A998-28CC7D96CC7E@mcs.anl.gov> <5327750A.5010004@jhu.edu> <53277926.3060904@jhu.edu> Message-ID: <53287CAC.9060405@jhu.edu> Hi Peter, Running with "snes_linesearch_monitor" would give the following results which I presented in my original email: My simulation input consists of two scalar values (p1 and p2), each of which acts as a constant pressure boundary condition. Case 1, diverge: p1= -10.190869 p2= -2.367555 NL step 0, |residual|_2 = 1.621402e-02 Line search: Using full step: fnorm 1.621401550027e-02 gnorm 7.022558235262e-05 NL step 1, |residual|_2 = 7.022558e-05 Line search: Using full step: fnorm 7.022558235262e-05 gnorm 1.636418730611e-06 NL step 2, |residual|_2 = 1.636419e-06 Nonlinear solve did not converge due to DIVERGED_LINE_SEARCH iterations 2 ------------------------------------------------------------------------ Case 2: converge: p1= -10.190747 p2= -2.367558 NL step 0, |residual|_2 = 1.621380e-02 Line search: Using full step: fnorm 1.621379778276e-02 gnorm 6.976373804153e-05 NL step 1, |residual|_2 = 6.976374e-05 Line search: Using full step: fnorm 6.976373804153e-05 gnorm 4.000992847275e-07 NL step 2, |residual|_2 = 4.000993e-07 Line search: Using full step: fnorm 4.000992847275e-07 gnorm 1.621646014441e-08 NL step 3, |residual|_2 = 1.621646e-08 Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE iterations 3 ------------------------------------------------------------------------ Also, running with "-snes_stol=0" effectively suppressed the occurrence of CONVERGED_SNORM_RELATIVE. Instead, most nonlinear solves took more iterations and ended with CONVERGED_FNORM_RELATIVE at a smaller residual error. In some cases (roughly 10 out of 5000), the nonlinear solves failed with DIVERGED_LINE_SEARCH Cheers, Dafang On 03/18/2014 10:15 AM, Peter Brune wrote: > Is there more output from the line search? What happens when you run > with -snes_linesearch_monitor? I remember there being a reason that I > didn't put this update in the maintenance branch. Let me figure out > exactly why and get back to you. > > > On Mon, Mar 17, 2014 at 5:37 PM, Dafang Wang > wrote: > > Hi Peter, > > My version of PETSc (v3.4.3) does not contain the bug fix you > mentioned: > "+ ierr = > SNESLineSearchSetNorms(linesearch,xnorm,fnorm,ynorm);CHKERRQ(ierr);" > Would that be a problem? > > I typically used the default value of -snes_stol, never setting it > to zero. I will let you know soon if you believe this is important. > > > It would certainly be worth a try. > > - Peter > > Cheers, > Dafang > > > On 03/17/2014 06:27 PM, Peter Brune wrote: >> This may be related to a bug we had reported before to petsc-maint: >> >> https://bitbucket.org/petsc/petsc/commits/ced04f9d467b04aa83a18d3f8875c7f72c17217a >> >> What version of PETSc are you running? Also, what happens if >> you set -snes_stol to zero? >> >> Thanks, >> >> - Peter >> >> >> On Mon, Mar 17, 2014 at 5:19 PM, Dafang Wang > > wrote: >> >> Hi Barry, >> >> Thanks for your tips. 
I have read the webpage you mentioned >> many times before, but still I have been stuck on the >> line-search problem for weeks. >> >> I cannot guarantee my Jacobian is correct but I believe an >> incorrect Jacobian is very unlikely. My Jacobian-calculation >> code has been under test for a year with both analytical and >> realistic models, and the results have been good until >> recently when I ran a very realistic physical model. >> >> Also, I looked up the implementation of SNESSolve_NEWTONLS() >> in "ls.c". According to the algorithm, when the function >> "SNESLineSearchApply()" does not succeed, one may encounter >> two possible outcomes: CONVERGED_SNORM_RELATIVE (if the >> search step is too small) or otherwise, DIVERGED_LINE_SEARCH. >> Does this mean that both these two outcomes indicate that the >> line search fails? >> >> I ask this question because my simulation encountered many >> CONVERGED_SNORM_RELATIVE. I treated them as if my nonlinear >> system converged, accepted the nonlinear solution, and then >> proceeded to the next time step of my simulation. Apparently, >> such practice has worked well in most cases, (even when I >> encountered suspicious DIVERGED_LINE_SEARCH behaviors). >> However, I wonder if there are any potential pitfalls in my >> practice such as missing a nonlinear solve divergence and >> taking a partial solution as the correct solution. >> >> Thank you very much for your time and help. >> >> Best, >> Dafang >> >> >> On 03/15/2014 11:15 AM, Barry Smith wrote: >> >> Failed line search are almost always due to an >> incorrect Jacobian. Please let us know if the suggestions >> at >> http://www.mcs.anl.gov/petsc/documentation/faq.html#newton don't >> help. >> >> Barry >> >> On Mar 14, 2014, at 8:57 PM, Dafang Wang >> > wrote: >> >> Hi, >> >> Does anyone know what the error code >> DIVERGED_LINE_SEARCH means in the SNES nonlinear >> solve? Or what scenario would lead to this error code? >> >> Running a solid mechanics simulation, I found that >> the occurrence of DIVERGED_LINE_SEARCH was very >> unpredictable and sensitive to the input values to my >> nonlinear system, although my system should not be >> that unstable. As shown by the two examples below, my >> system diverged in one case and converged in the >> other, although the input values in these two cases >> differed by only 1e-4, >> >> Moreover, the Newton steps in the two cases were very >> similar up to NL step 1. Since then, however, Case 1 >> encountered a line-search divergence whereas Case 2 >> converged successfully. This is my main confusion. >> (Note that each residual vector contains 3e04 DOF, so >> when their L2 norms differ within 1e-4, the two >> systems should be very close.) >> >> My simulation input consists of two scalar values (p1 >> and p2), each of which acts as a constant pressure >> boundary condition. 
>> >> Case 1, diverge: >> p1= -10.190869 p2= -2.367555 >> NL step 0, |residual|_2 = 1.621402e-02 >> Line search: Using full step: fnorm >> 1.621401550027e-02 gnorm 7.022558235262e-05 >> NL step 1, |residual|_2 = 7.022558e-05 >> Line search: Using full step: fnorm >> 7.022558235262e-05 gnorm 1.636418730611e-06 >> NL step 2, |residual|_2 = 1.636419e-06 >> Nonlinear solve did not converge due to >> DIVERGED_LINE_SEARCH iterations 2 >> Case 2: converge: >> p1= -10.190747 p2= -2.367558 >> NL step 0, |residual|_2 = 1.621380e-02 >> Line search: Using full step: fnorm >> 1.621379778276e-02 gnorm 6.976373804153e-05 >> NL step 1, |residual|_2 = 6.976374e-05 >> Line search: Using full step: fnorm >> 6.976373804153e-05 gnorm 4.000992847275e-07 >> NL step 2, |residual|_2 = 4.000993e-07 >> Line search: Using full step: fnorm >> 4.000992847275e-07 gnorm 1.621646014441e-08 >> NL step 3, |residual|_2 = 1.621646e-08 >> Nonlinear solve converged due to >> CONVERGED_SNORM_RELATIVE iterations 3 >> >> Aside from the input values, the initial solution in >> both cases may differ very slightly. (Each case is >> one time step in a time-sequence simulation. The two >> cases behaved nearly identically up to the last time >> step before the step shown above, so their initial >> solutions may differ by a cumulative error but such >> error should be very small.) Is it possible that >> little difference in initial guess leads to different >> local minimum regions where the line search in Case 1 >> failed? >> >> Any comments will be greatly appreciated. >> >> Thanks, >> Dafang >> -- >> Dafang Wang, Ph.D >> Postdoctoral Fellow >> Institute of Computational Medicine >> Department of Biomedical Engineering >> Johns Hopkins University >> Hackerman Hall Room 218 >> Baltimore, MD, 21218 >> >> >> -- >> Dafang Wang, Ph.D >> Postdoctoral Fellow >> Institute of Computational Medicine >> Department of Biomedical Engineering >> Johns Hopkins University >> Hackerman Hall Room 218 >> Baltimore, MD, 21218 >> >> > > -- > Dafang Wang, Ph.D > Postdoctoral Fellow > Institute of Computational Medicine > Department of Biomedical Engineering > Johns Hopkins University > Hackerman Hall Room 218 > Baltimore, MD, 21218 > > -- Dafang Wang, Ph.D Postdoctoral Fellow Institute of Computational Medicine Department of Biomedical Engineering Johns Hopkins University Hackerman Hall Room 218 Baltimore, MD, 21218 -------------- next part -------------- An HTML attachment was scrubbed... URL: From asmund.ervik at ntnu.no Tue Mar 18 17:19:00 2014 From: asmund.ervik at ntnu.no (=?iso-8859-1?Q?=C5smund_Ervik?=) Date: Tue, 18 Mar 2014 22:19:00 +0000 Subject: [petsc-users] Writing solution data to file when using DMDA. Message-ID: <0E576811AB298343AC632BBCAAEFC37945BEFD17@WAREHOUSE08.win.ntnu.no> Dear PETSc users, I'm trying to wrap my head around parallel I/O. If I understand correctly, a decent way of doing this is having one rank (say 0) writing to disk, and the other ranks communicating their part of the solution to rank 0. Please correct me if I'm wrong here. I'm using DMDA to manage my domain decomposition. As a first step, I've been trying to create an array on rank 0 holding the entire global solution and then writing this to file by re-using some routines from our serial codes (the format is Tecplot ASCII). (I realize that neither this approach nor an ASCII format are good solutions in the end, but I have to start somewhere.) However, I haven't been able to find any DMDA routines that give me an array holding the entire global solution on rank 0. 
Are there any, or is this too much of a "dirty trick"? (For just 1 process there is no problem, the output files generated look good.) I'm also willing to try the VTK way of doing things, but I hit a problem when I tried that: even though I include "petscviewer.h" (also tried adding "petscviewerdef.h"), when I do call PetscViewerSetType(viewer,PETSCVIEWERVTK,ierr) my compiler complains that PETSCVIEWERVTK is undefined (has no implicit type). This is from Fortran90 using preprocessing macros to #include the files. I tried PETSCVIEWERASCII as well, same problem. This is with 3.4.3. Any hints on this? Also, there are many different examples and mailing list threads about VTK output. What is the currently recommended way of doing things? I need to output at least (u,v,w) as vector components of one field, together with a scalar field (p). These currently have separate DM's, since I only use PETSc to solve for p (the pressure). Best regards, ?smund From knepley at gmail.com Tue Mar 18 18:16:28 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 18 Mar 2014 18:16:28 -0500 Subject: [petsc-users] Custom vector owenrship ranges In-Reply-To: References: Message-ID: On Tue, Mar 18, 2014 at 11:00 AM, Mohammad Bahaa wrote: > I used > call VecCreateMPIWithArray(PETSC_COMM_WORLD,1,nc,ncall,myx,xall,ierr) > > however, when I use process 0 to write a file containing the combined > values (the xall vector), the values seem not to be updated by some > processes, eventhough I use PetscBarrier, in other words, values locally > owned by processes 0 and 2 are ok, but those owned by process 1 & 3 aren't ! > For collective writes, use VecView() or -vec_view Matt > On Tue, Mar 18, 2014 at 3:43 PM, Mohammad Bahaa wrote: > >> the second approach of the MPI vector did it for me, thanks >> >> >> On Tue, Mar 18, 2014 at 3:20 PM, Mohammad Bahaa > > wrote: >> >>> Forgive me as my expression "sum up" was misguiding or misplaced, I >>> didn't mean to literally sum the values in the vectors, I meant I want to >>> put all values from each local vector into one global vector that can be >>> accessed by all processes, "COMM_WORLD" communicator for instance >>> >>> >>> On Tue, Mar 18, 2014 at 3:09 PM, Matthew Knepley wrote: >>> >>>> On Tue, Mar 18, 2014 at 7:53 AM, Mohammad Bahaa < >>>> m.bahaa.eldein at gmail.com> wrote: >>>> >>>>> I'm using "PETSC_COMM_SELF" communicator for running n serial >>>>> independent processes, I need to sum up a certain vector from the n >>>>> processes in one vector, however, vectors involved in each process vary in >>>>> size, and I couldn't find any function to define custom ownership ranges, >>>>> so assuming I have a 4 processes run with each computing an "x" vector as >>>>> follows: >>>>> >>>>> 1. process (1) with x of length 51 >>>>> 2. process (2) with x of length 49 >>>>> 3. process (3) with x of length 52 >>>>> 4. process (4) with x of length 48 >>>>> >>>> >>>> Let your local length be n, so that on proc 3 n== 52. Then >>>> >>>> VecCreate(comm, &v); >>>> VecSetSizes(v, n, PETSC_DETERMINE); >>>> VecSetFromOptions(v); >>>> >>>> VecSum(v, &sum); >>>> >>>> You could also make a parallel Vec from your Seq vecs: >>>> >>>> VecGetArray(lv, &array); >>>> VecCreateMPIWithArray(comm, 1, n, PETSC_DETERMINE, array, &v); >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> The processes sum up to 100 elements, when I define a vector "x_all" >>>>> of size "100" with "PETSC_COMM_WORLD" communicator, the ownership >>>>> ranges are equal, which isn't the case, how to customize them ? 
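[For the VTK route asked about above, a hedged C sketch of what the write looks like once the viewer type resolves: VecView() on a Vec obtained from the DMDA gathers and writes collectively, which is also the mechanism behind the "use VecView() or -vec_view" advice above, so no hand-rolled rank-0 gather is needed. Not taken from the thread: da, the field name and the file name are placeholders, and the .vts structured-grid output assumes a PETSc version where the XML VTK writer is available for DMDA; older installs may only accept a legacy .vtk name.

    #include <petscdmda.h>
    Vec         p;
    PetscViewer viewer;

    DMCreateGlobalVector(da, &p);                     /* da is the pressure DMDA  */
    /* ... fill p ... */
    PetscObjectSetName((PetscObject)p, "pressure");   /* becomes the field name   */
    PetscViewerVTKOpen(PETSC_COMM_WORLD, "solution.vts", FILE_MODE_WRITE, &viewer);
    VecView(p, viewer);
    PetscViewerDestroy(&viewer);

Repeating the same calls with a 3-dof Vec from the velocity DMDA writes (u,v,w) as one vector field in its own file.]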
>>>>> >>>>> -- >>>>> Mohamamd Bahaa ElDin >>>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> Mohamamd Bahaa ElDin >>> >> >> >> >> -- >> Mohamamd Bahaa ElDin >> > > > > -- > Mohamamd Bahaa ElDin > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Mar 18 18:27:42 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 18 Mar 2014 18:27:42 -0500 Subject: [petsc-users] Writing solution data to file when using DMDA. In-Reply-To: <0E576811AB298343AC632BBCAAEFC37945BEFD17@WAREHOUSE08.win.ntnu.no> References: <0E576811AB298343AC632BBCAAEFC37945BEFD17@WAREHOUSE08.win.ntnu.no> Message-ID: <5A7CBF66-5E2D-4BC6-B116-C30452DDBF0E@mcs.anl.gov> On Mar 18, 2014, at 5:19 PM, ?smund Ervik wrote: > Dear PETSc users, > > I'm trying to wrap my head around parallel I/O. If I understand correctly, a decent way of doing this is having one rank (say 0) writing to disk, and the other ranks communicating their part of the solution to rank 0. Please correct me if I'm wrong here. > > I'm using DMDA to manage my domain decomposition. As a first step, I've been trying to create an array on rank 0 holding the entire global solution and then writing this to file by re-using some routines from our serial codes (the format is Tecplot ASCII). (I realize that neither this approach nor an ASCII format are good solutions in the end, but I have to start somewhere.) However, I haven't been able to find any DMDA routines that give me an array holding the entire global solution on rank 0. Are there any, or is this too much of a "dirty trick"? (For just 1 process there is no problem, the output files generated look good.) DMDACreateNatural() DMDAGlobalToNaturalBegin/End() VecScatterCreateToZero VecGetArray() on process 0 the final array is in the natural ordering, x direction first, y direction second, z direction third. > > I'm also willing to try the VTK way of doing things, but I hit a problem when I tried that: even though I include "petscviewer.h" (also tried adding "petscviewerdef.h"), when I do > call PetscViewerSetType(viewer,PETSCVIEWERVTK,ierr) > my compiler complains that PETSCVIEWERVTK is undefined (has no implicit type). This is from Fortran90 using preprocessing macros to #include the files. I tried PETSCVIEWERASCII as well, same problem. This is with 3.4.3. Any hints on this? Hmm, they are in petscviewerdef.h in 3.4.4 but anyways you can pass ?vtk? or ?ascii? as the type > > Also, there are many different examples and mailing list threads about VTK output. What is the currently recommended way of doing things? I need to output at least (u,v,w) as vector components of one field, together with a scalar field (p). These currently have separate DM's, since I only use PETSc to solve for p (the pressure). > > Best regards, > ?smund From jed at jedbrown.org Tue Mar 18 18:31:17 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 19 Mar 2014 00:31:17 +0100 Subject: [petsc-users] Writing solution data to file when using DMDA. 
In-Reply-To: <0E576811AB298343AC632BBCAAEFC37945BEFD17@WAREHOUSE08.win.ntnu.no> References: <0E576811AB298343AC632BBCAAEFC37945BEFD17@WAREHOUSE08.win.ntnu.no> Message-ID: <87a9cnm6d6.fsf@jedbrown.org> ?smund Ervik writes: > I'm also willing to try the VTK way of doing things, but I hit a > problem when I tried that: even though I include "petscviewer.h" (also > tried adding "petscviewerdef.h"), when I do > call PetscViewerSetType(viewer,PETSCVIEWERVTK,ierr) my compiler > complains that PETSCVIEWERVTK is undefined (has no implicit > type). Should be Is it actually included? Are you sure you have v3.4.3? $ git grep VIEWERVTK v3.4.3 include/finclude/petscviewerdef.h v3.4.3:include/finclude/petscviewerdef.h:#define PETSCVIEWERVTK 'vtk' > This is from Fortran90 using preprocessing macros to #include the > files. I tried PETSCVIEWERASCII as well, same problem. This is with > 3.4.3. Any hints on this? -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From mc0710 at gmail.com Tue Mar 18 19:45:58 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Tue, 18 Mar 2014 19:45:58 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> Message-ID: Hi, Is there anyway I can get the residual at the beginning of the time step? I tried TSSetPreStage and TSSetPreStep but they didn't work. Setting TSSetPreStage would crash the program even if there is nothing in the routine that I set and TSSetPreStep would give me the residual at the end of the previous time step. I want to take a look at the residual at the beginning of the new time step before the nonlinear solver starts. Thanks, Mani On Sat, Mar 15, 2014 at 9:03 PM, Barry Smith wrote: > > Yes but it is slightly complicated. You need to configure PETSc with > ?with-afterimage and run the program with -draw_save then it will save > each image. > > It will open an X window to display each image. You can run with > -x_virtual to skip the opening of X windows but then you need Xvfb to be > installed > > Barry > > > On Mar 15, 2014, at 6:29 PM, Mani Chandra wrote: > > > -snes_montior_residual is really cool. Anyway I could get it to spit out > png images instead of on screen visualization? > > > > > > On Sat, Mar 15, 2014 at 6:24 PM, Barry Smith wrote: > > > > > > You can write a custom monitor and set it with TSMonitorSet() > > > > This routine would call TSGetSNES() then SNESGetSolution() then call > SNESComputeFunction() then call VecView() on the result. > > > > But note that just because the residual is big somewhere doesn?t > mean the error need be. > > > > You could also run with -snes_monitor_residual to see how the > residual is being reduced inside the nonlinear solve (that is, what parts > of the residual are most stubborn). > > > > > > Barry > > > > > > On Mar 15, 2014, at 5:47 PM, Mani Chandra wrote: > > > > > Hi, > > > > > > Is there anyway I can VecView the residual after TS has completed an > implicit time step? I'd like to see where in my domain most of the errors > are coming from. I looked at TSMonitor but that doesn't seem to give me > access to the residual at the end of the current time step. > > > > > > Thanks, > > > Mani > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
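[Barry's TSMonitorSet() recipe above can be written as a short monitor. A hedged C sketch, not taken from the thread: the viewer choice is a simplification, and for implicit methods the SNES function evaluated here is the stage residual of the last solve, not necessarily the IFunction at the exact start of the next step.

    #include <petscts.h>

    /* View the SNES residual of the current solution after each step, following the
       TSGetSNES()/SNESGetSolution()/SNESComputeFunction()/VecView() recipe above. */
    static PetscErrorCode ResidualMonitor(TS ts, PetscInt step, PetscReal t, Vec u, void *ctx)
    {
      SNES           snes;
      Vec            r;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = TSGetSNES(ts, &snes);CHKERRQ(ierr);
      ierr = VecDuplicate(u, &r);CHKERRQ(ierr);
      ierr = SNESComputeFunction(snes, u, r);CHKERRQ(ierr);       /* residual at u       */
      ierr = VecView(r, PETSC_VIEWER_DRAW_WORLD);CHKERRQ(ierr);   /* or a binary viewer  */
      ierr = VecDestroy(&r);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

    /* registration, somewhere after TSCreate(): */
    TSMonitorSet(ts, ResidualMonitor, NULL, NULL);

With -draw_save (and a --with-afterimage build, as Barry notes above) each VecView() frame is also written out as an image file.]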
URL: From jed at jedbrown.org Tue Mar 18 19:48:03 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 19 Mar 2014 01:48:03 +0100 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> Message-ID: <87txavko8s.fsf@jedbrown.org> Mani Chandra writes: > Is there anyway I can get the residual at the beginning of the time step? I > tried TSSetPreStage and TSSetPreStep but they didn't work. Setting > TSSetPreStage would crash the program even if there is nothing in the > routine that I set and TSSetPreStep would give me the residual at the end > of the previous time step. I want to take a look at the residual at the > beginning of the new time step before the nonlinear solver starts. Residual of what? What are you going to do with it? -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From mc0710 at gmail.com Tue Mar 18 19:54:58 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Tue, 18 Mar 2014 19:54:58 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: <87txavko8s.fsf@jedbrown.org> References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> <87txavko8s.fsf@jedbrown.org> Message-ID: The residual of the system of equation that I feed into TS with TSSetIFunction. I have a system of equations and I want to probe where most of the residual is coming from. The reason is that after certain time of evolution, the initial residual at the beginning of each time step increases by orders of magnitude than what it used to be at the beginning of the time step at early times. For ex, say at t=200, the SNES norm at the beginning of a TS timestep with the theta method would be something like 1.0812 and at t = 300, it would be 2e5 at the beginning. SNES then has to work much harder to reach the abs norm levels and so I want to investigate what is happening. On Tue, Mar 18, 2014 at 7:48 PM, Jed Brown wrote: > Mani Chandra writes: > > > Is there anyway I can get the residual at the beginning of the time > step? I > > tried TSSetPreStage and TSSetPreStep but they didn't work. Setting > > TSSetPreStage would crash the program even if there is nothing in the > > routine that I set and TSSetPreStep would give me the residual at the end > > of the previous time step. I want to take a look at the residual at the > > beginning of the new time step before the nonlinear solver starts. > > Residual of what? What are you going to do with it? > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mc0710 at gmail.com Tue Mar 18 20:04:38 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Tue, 18 Mar 2014 20:04:38 -0500 Subject: [petsc-users] Drastic increase in memory usage between DMDA_STENCIL_BOX and DMDA_STENCIL_STAR Message-ID: Hi, I see a 4x increase in the memory usage when I change from DMDA_STENCIL_STAR to DMDA_STENCIL_BOX. Attached are the outputs of -log_summary which shows a huge increase in the matrix memory usage. Is this expected? On a different note, suppose I am running a serial calculation with no need to exchange data but I am using corner node information, do I need to use DMDA_STENCIL_BOX? Would the jacobian when computed using colored finite differences be correctly represented if I use corner information but still use DMDA_STENCIL_STAR? 
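For reference, the stencil type is fixed when the DMDA is created, and that same pattern is what the coloring for the finite-difference Jacobian is built from. A minimal sketch using the 3.4-series API, where the grid size and the single degree of freedom are placeholders rather than values taken from the run above:

#include <petscdmda.h>

/* Hypothetical 32x32x32 grid, 1 DOF per node, stencil width 1.
   In 3D a STAR stencil preallocates 7 nonzeros per row, a BOX stencil 27,
   and the number of colors needed for the finite-difference Jacobian
   grows accordingly. */
PetscErrorCode CreateGrid(MPI_Comm comm, DM *da)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = DMDACreate3d(comm,
                      DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                      DMDA_STENCIL_BOX,   /* or DMDA_STENCIL_STAR */
                      32, 32, 32,         /* global grid size (placeholder) */
                      PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                      1,                  /* degrees of freedom per node */
                      1,                  /* stencil width */
                      NULL, NULL, NULL, da); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Only the DMDA_STENCIL_* argument is relevant to the question; everything else in the call is illustrative.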
Thanks, Mani -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./grim on a arch-linux2-c-debug named Deathstar with 1 processor, by mc Tue Mar 18 19:56:44 2014 Using Petsc Development GIT revision: v3.4.3-3262-g255453a GIT Date: 2014-02-08 22:41:14 -0600 Max Max/Min Avg Total Time (sec): 9.532e+01 1.00000 9.532e+01 Objects: 2.650e+02 1.00000 2.650e+02 Flops: 2.300e+10 1.00000 2.300e+10 2.300e+10 Flops/sec: 2.413e+08 1.00000 2.413e+08 2.413e+08 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 2.600e+02 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 9.5319e+01 100.0% 2.2998e+10 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 2.590e+02 99.6% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecView 3 1.0 4.7265e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecDot 11 1.0 1.0761e-02 1.0 1.15e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1072 VecMDot 22 1.0 2.7508e-02 1.0 3.46e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1258 VecNorm 58 1.0 4.4957e-02 1.0 6.08e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1353 VecScale 33 1.0 1.4479e-02 1.0 1.73e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1195 VecCopy 236 1.0 2.5637e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 41 1.0 4.4386e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 212 1.0 2.0891e-01 1.0 2.22e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1064 VecAXPBYCZ 214 1.0 3.3057e-01 1.0 3.37e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1018 VecWAXPY 12 1.0 1.8075e-02 1.0 6.82e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 377 VecMAXPY 33 1.0 4.7938e-02 1.0 5.77e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1203 VecLoad 1 1.0 1.6348e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecReduceArith 22 1.0 1.6276e-02 1.0 2.31e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1417 VecReduceComm 11 1.0 1.1206e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 33 1.0 3.8849e-02 1.0 5.19e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1336 MatMult 33 1.0 6.2562e+00 1.0 6.84e+09 1.0 0.0e+00 0.0e+00 0.0e+00 7 30 0 0 0 7 30 0 0 0 1093 MatSolve 33 1.0 6.2317e+00 1.0 6.84e+09 1.0 0.0e+00 0.0e+00 0.0e+00 7 30 0 0 0 7 30 0 0 0 1097 MatLUFactorNum 1 1.0 3.3280e+01 1.0 8.55e+09 1.0 0.0e+00 0.0e+00 0.0e+00 35 37 0 0 0 35 37 0 0 0 257 MatILUFactorSym 1 1.0 5.1644e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatAssemblyBegin 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 2 1.0 2.8725e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 1 1.0 1.9073e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 1.1145e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00 0 0 0 0 2 0 0 0 0 2 0 MatZeroEntries 1 1.0 1.2959e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatFDColorCreate 1 1.0 2.9290e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 1 0 0 0 0 1 0 MatFDColorSetUp 1 1.0 5.2455e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+02 6 0 0 0 79 6 0 0 0 79 0 MatFDColorApply 1 1.0 1.4934e+01 1.0 5.25e+08 1.0 0.0e+00 0.0e+00 2.0e+00 16 2 0 0 1 16 2 0 0 1 35 MatFDColorFunc 200 1.0 1.3355e+01 1.0 3.15e+08 1.0 0.0e+00 0.0e+00 0.0e+00 14 1 0 0 0 14 1 0 0 0 24 TSStep 1 1.0 8.9164e+01 1.0 2.30e+10 1.0 0.0e+00 0.0e+00 2.4e+02 94100 0 0 92 94100 0 0 93 258 TSFunctionEval 213 1.0 1.3918e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 15 0 0 0 0 15 0 0 0 0 0 SNESSolve 1 1.0 
6.7756e+01 1.0 2.30e+10 1.0 0.0e+00 0.0e+00 2.3e+02 71100 0 0 88 71100 0 0 89 339 SNESFunctionEval 13 1.0 8.9281e-01 1.0 2.04e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 23 SNESJacobianEval 1 1.0 2.0316e+01 1.0 5.25e+08 1.0 0.0e+00 0.0e+00 2.1e+02 21 2 0 0 81 21 2 0 0 81 26 SNESLineSearch 11 1.0 3.0041e+00 1.0 2.36e+09 1.0 0.0e+00 0.0e+00 0.0e+00 3 10 0 0 0 3 10 0 0 0 787 KSPGMRESOrthog 22 1.0 5.4503e-02 1.0 6.92e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1270 KSPSetUp 11 1.0 7.4494e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 4 0 0 0 0 4 0 KSPSolve 11 1.0 4.4361e+01 1.0 2.01e+10 1.0 0.0e+00 0.0e+00 1.5e+01 47 87 0 0 6 47 87 0 0 6 453 PCSetUp 1 1.0 3.3808e+01 1.0 8.55e+09 1.0 0.0e+00 0.0e+00 5.0e+00 35 37 0 0 2 35 37 0 0 2 253 PCApply 33 1.0 6.2317e+00 1.0 6.84e+09 1.0 0.0e+00 0.0e+00 0.0e+00 7 30 0 0 0 7 30 0 0 0 1097 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 23 23 83920248 0 Vector Scatter 3 3 1956 0 Matrix 2 2 2509827340 0 Matrix FD Coloring 1 1 2289081428 0 Distributed Mesh 2 2 279712 0 Bipartite Graph 4 4 3264 0 Index Set 212 212 7242384 0 IS L to G Mapping 3 3 1788 0 TSAdapt 2 2 2384 0 TS 1 1 1240 0 DMTS 1 1 720 0 SNES 1 1 1348 0 SNESLineSearch 1 1 880 0 DMSNES 1 1 680 0 Krylov Solver 1 1 18376 0 DMKSP interface 1 1 664 0 Preconditioner 1 1 992 0 Viewer 5 4 2848 0 ======================================================================================================================== Average time to get PetscTime(): 1.90735e-07 #PETSc Option Table entries: -log_summary -snes_atol 1e-5 -snes_converged_reason -snes_lag_jacobian 100 -snes_lag_jacobian_persists TRUE -snes_max_it 101 -snes_monitor -snes_rtol 1e-50 -snes_stol 1e-50 -ts_dt 0.03 -ts_final_time 500. 
-ts_max_snes_failures -1 -ts_max_steps 1 -ts_monitor -ts_type theta #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure run at: Thu Feb 20 16:27:19 2014 Configure options: --prefix=/home/mc/Downloads/petsc_optimized/ --with-debugging=0 COPTFLAGS="-O3 -march=native" FOPTFLAGS="-O3 -qarch=native" --with-clean=1 --with-hdf5=1 --download-hdf5=yes ----------------------------------------- Libraries compiled on Thu Feb 20 16:27:19 2014 on Deathstar Machine characteristics: Linux-3.12.9-2-ARCH-x86_64-with-glibc2.2.5 Using PETSc directory: /home/mc/Downloads/petsc Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O3 -march=native ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: mpif90 -fPIC -Wall -Wno-unused-variable -Wno-unused-dummy-argument ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/mc/Downloads/petsc/arch-linux2-c-debug/include -I/home/mc/Downloads/petsc/include -I/home/mc/Downloads/petsc/include -I/home/mc/Downloads/petsc/arch-linux2-c-debug/include ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/home/mc/Downloads/petsc/arch-linux2-c-debug/lib -L/home/mc/Downloads/petsc/arch-linux2-c-debug/lib -lpetsc -llapack -lblas -lX11 -lpthread -Wl,-rpath,/home/mc/Downloads/petsc/arch-linux2-c-debug/lib -L/home/mc/Downloads/petsc/arch-linux2-c-debug/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/usr/lib/gcc/x86_64-unknown-linux-gnu/4.8.2 -L/usr/lib/gcc/x86_64-unknown-linux-gnu/4.8.2 -Wl,-rpath,/opt/intel/composerxe/compiler/lib/intel64 -L/opt/intel/composerxe/compiler/lib/intel64 -Wl,-rpath,/opt/intel/composerxe/ipp/lib/intel64 -L/opt/intel/composerxe/ipp/lib/intel64 -Wl,-rpath,/opt/intel/composerxe/mkl/lib/intel64 -L/opt/intel/composerxe/mkl/lib/intel64 -Wl,-rpath,/opt/intel/composerxe/tbb/lib/intel64/gcc4.4 -L/opt/intel/composerxe/tbb/lib/intel64/gcc4.4 -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -ldl -lmpi -lhwloc -lgcc_s -lpthread -ldl ----------------------------------------- -------------- next part -------------- ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./grim on a arch-linux2-c-debug named Deathstar with 1 processor, by mc Tue Mar 18 19:48:47 2014 Using Petsc Development GIT revision: v3.4.3-3262-g255453a GIT Date: 2014-02-08 22:41:14 -0600 Max Max/Min Avg Total Time (sec): 4.053e+01 1.00000 4.053e+01 Objects: 2.750e+02 1.00000 2.750e+02 Flops: 9.192e+09 1.00000 9.192e+09 9.192e+09 Flops/sec: 2.268e+08 1.00000 2.268e+08 2.268e+08 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 2.800e+02 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 4.0532e+01 100.0% 9.1921e+09 100.0% 0.000e+00 
0.0% 0.000e+00 0.0% 2.790e+02 99.6% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). %T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecView 3 1.0 4.3488e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecDot 11 1.0 1.0955e-02 1.0 1.15e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1053 VecMDot 36 1.0 5.5024e-02 1.0 8.18e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1486 VecNorm 72 1.0 5.6935e-02 1.0 7.55e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1326 VecScale 47 1.0 2.1471e-02 1.0 2.46e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1148 VecCopy 236 1.0 2.6282e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecSet 51 1.0 6.2247e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 212 1.0 2.1605e-01 1.0 2.22e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1029 VecAXPBYCZ 214 1.0 3.3499e-01 1.0 3.37e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 4 0 0 0 1 4 0 0 0 1005 VecWAXPY 12 1.0 1.8339e-02 1.0 6.82e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 372 VecMAXPY 47 1.0 8.2657e-02 1.0 1.20e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1446 VecLoad 1 1.0 1.6351e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecReduceArith 22 1.0 1.6958e-02 1.0 2.31e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1360 VecReduceComm 11 1.0 1.2875e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 47 1.0 5.7861e-02 1.0 7.39e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1278 MatMult 47 1.0 3.3177e+00 1.0 3.51e+09 1.0 0.0e+00 0.0e+00 0.0e+00 8 38 0 0 0 8 38 0 0 0 1057 MatSolve 47 1.0 3.2621e+00 1.0 3.51e+09 1.0 0.0e+00 0.0e+00 0.0e+00 8 38 0 0 0 8 38 0 0 0 1075 MatLUFactorNum 1 1.0 6.2700e+00 1.0 1.28e+09 1.0 0.0e+00 0.0e+00 0.0e+00 15 14 0 0 0 15 14 0 0 0 204 MatILUFactorSym 1 1.0 2.2177e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatAssemblyBegin 2 1.0 9.5367e-07 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 2 1.0 1.1208e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 1 1.0 1.9073e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 1.0976e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00 0 0 0 0 1 0 0 0 0 1 0 MatZeroEntries 1 1.0 4.7051e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 
0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatFDColorCreate 1 1.0 2.9862e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 1 0 0 0 0 1 0 MatFDColorSetUp 1 1.0 2.0944e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+02 5 0 0 0 73 5 0 0 0 73 0 MatFDColorApply 1 1.0 1.4396e+01 1.0 5.25e+08 1.0 0.0e+00 0.0e+00 2.0e+00 36 6 0 0 1 36 6 0 0 1 36 MatFDColorFunc 200 1.0 1.3555e+01 1.0 3.15e+08 1.0 0.0e+00 0.0e+00 0.0e+00 33 3 0 0 0 33 3 0 0 0 23 TSStep 1 1.0 3.4390e+01 1.0 9.19e+09 1.0 0.0e+00 0.0e+00 2.6e+02 85100 0 0 93 85100 0 0 93 267 TSFunctionEval 213 1.0 1.4103e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 35 0 0 0 0 35 0 0 0 0 0 SNESSolve 1 1.0 3.0864e+01 1.0 9.19e+09 1.0 0.0e+00 0.0e+00 2.5e+02 76100 0 0 89 76100 0 0 90 298 SNESFunctionEval 13 1.0 8.8251e-01 1.0 2.04e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 2 0 0 0 0 23 SNESJacobianEval 1 1.0 1.6544e+01 1.0 5.25e+08 1.0 0.0e+00 0.0e+00 2.1e+02 41 6 0 0 75 41 6 0 0 76 32 SNESLineSearch 11 1.0 1.6754e+00 1.0 9.05e+08 1.0 0.0e+00 0.0e+00 0.0e+00 4 10 0 0 0 4 10 0 0 0 540 KSPGMRESOrthog 36 1.0 1.1178e-01 1.0 1.64e+08 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 1463 KSPSetUp 11 1.0 7.5173e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 4 0 0 0 0 4 0 KSPSolve 11 1.0 1.2571e+01 1.0 7.76e+09 1.0 0.0e+00 0.0e+00 3.5e+01 31 84 0 0 12 31 84 0 0 13 617 PCSetUp 1 1.0 6.5028e+00 1.0 1.28e+09 1.0 0.0e+00 0.0e+00 5.0e+00 16 14 0 0 2 16 14 0 0 2 197 PCApply 47 1.0 3.2621e+00 1.0 3.51e+09 1.0 0.0e+00 0.0e+00 0.0e+00 8 38 0 0 0 8 38 0 0 0 1075 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 33 33 125878088 0 Vector Scatter 3 3 1956 0 Matrix 2 2 918033676 0 Matrix FD Coloring 1 1 829426772 0 Distributed Mesh 2 2 279712 0 Bipartite Graph 4 4 3264 0 Index Set 212 212 7242384 0 IS L to G Mapping 3 3 1788 0 TSAdapt 2 2 2384 0 TS 1 1 1240 0 DMTS 1 1 720 0 SNES 1 1 1348 0 SNESLineSearch 1 1 880 0 DMSNES 1 1 680 0 Krylov Solver 1 1 18376 0 DMKSP interface 1 1 664 0 Preconditioner 1 1 992 0 Viewer 5 4 2848 0 ======================================================================================================================== Average time to get PetscTime(): 9.53674e-08 #PETSc Option Table entries: -log_summary -snes_atol 1e-5 -snes_converged_reason -snes_lag_jacobian 100 -snes_lag_jacobian_persists TRUE -snes_max_it 101 -snes_monitor -snes_rtol 1e-50 -snes_stol 1e-50 -ts_dt 0.03 -ts_final_time 500. 
-ts_max_snes_failures -1 -ts_max_steps 1 -ts_monitor -ts_type theta #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure run at: Thu Feb 20 16:27:19 2014 Configure options: --prefix=/home/mc/Downloads/petsc_optimized/ --with-debugging=0 COPTFLAGS="-O3 -march=native" FOPTFLAGS="-O3 -qarch=native" --with-clean=1 --with-hdf5=1 --download-hdf5=yes ----------------------------------------- Libraries compiled on Thu Feb 20 16:27:19 2014 on Deathstar Machine characteristics: Linux-3.12.9-2-ARCH-x86_64-with-glibc2.2.5 Using PETSc directory: /home/mc/Downloads/petsc Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O3 -march=native ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: mpif90 -fPIC -Wall -Wno-unused-variable -Wno-unused-dummy-argument ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/mc/Downloads/petsc/arch-linux2-c-debug/include -I/home/mc/Downloads/petsc/include -I/home/mc/Downloads/petsc/include -I/home/mc/Downloads/petsc/arch-linux2-c-debug/include ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/home/mc/Downloads/petsc/arch-linux2-c-debug/lib -L/home/mc/Downloads/petsc/arch-linux2-c-debug/lib -lpetsc -llapack -lblas -lX11 -lpthread -Wl,-rpath,/home/mc/Downloads/petsc/arch-linux2-c-debug/lib -L/home/mc/Downloads/petsc/arch-linux2-c-debug/lib -lhdf5_fortran -lhdf5_hl -lhdf5 -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/usr/lib/gcc/x86_64-unknown-linux-gnu/4.8.2 -L/usr/lib/gcc/x86_64-unknown-linux-gnu/4.8.2 -Wl,-rpath,/opt/intel/composerxe/compiler/lib/intel64 -L/opt/intel/composerxe/compiler/lib/intel64 -Wl,-rpath,/opt/intel/composerxe/ipp/lib/intel64 -L/opt/intel/composerxe/ipp/lib/intel64 -Wl,-rpath,/opt/intel/composerxe/mkl/lib/intel64 -L/opt/intel/composerxe/mkl/lib/intel64 -Wl,-rpath,/opt/intel/composerxe/tbb/lib/intel64/gcc4.4 -L/opt/intel/composerxe/tbb/lib/intel64/gcc4.4 -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -ldl -lmpi -lhwloc -lgcc_s -lpthread -ldl ----------------------------------------- From prbrune at gmail.com Tue Mar 18 20:30:12 2014 From: prbrune at gmail.com (Peter Brune) Date: Tue, 18 Mar 2014 20:30:12 -0500 Subject: [petsc-users] Drastic increase in memory usage between DMDA_STENCIL_BOX and DMDA_STENCIL_STAR In-Reply-To: References: Message-ID: On Tue, Mar 18, 2014 at 8:04 PM, Mani Chandra wrote: > Hi, > > I see a 4x increase in the memory usage when I change from > DMDA_STENCIL_STAR to DMDA_STENCIL_BOX. Attached are the outputs of > -log_summary which shows a huge increase in the matrix memory usage. Is > this expected? > > Yup. In 2D your matrices balloon from 5 entries per row to 9. In 3D it's 7 to 27. The number of colors required to color the matrix will increase by a significant factor as well, explaining the situation in your log files by simple multiplication. > On a different note, suppose I am running a serial calculation with no > need to exchange data but I am using corner node information, do I need to > use DMDA_STENCIL_BOX? Would the jacobian when computed using colored finite > differences be correctly represented if I use corner information but still > use DMDA_STENCIL_STAR? 
> > If you're using diagonal corners in your function evaluation, then you need DMDA_STENCIL_BOX. Otherwise you're going to have columns of the Jacobian of the same color that aren't structurally orthogonal, and the finite difference Jacobian will be straightup wrong. - Peter > Thanks, > Mani > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Mar 18 21:11:24 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 18 Mar 2014 21:11:24 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> <87txavko8s.fsf@jedbrown.org> Message-ID: <6FE9F3FA-A207-41C5-8D20-CB294DD5CE88@mcs.anl.gov> On Mar 18, 2014, at 7:54 PM, Mani Chandra wrote: > The residual of the system of equation that I feed into TS with TSSetIFunction. I have a system of equations and I want to probe where most of the residual is coming from. The reason is that after certain time of evolution, the initial residual at the beginning of each time step increases by orders of magnitude than what it used to be at the beginning of the time step at early times. For ex, say at t=200, the SNES norm at the beginning of a TS timestep with the theta method would be something like 1.0812 and at t = 300, it would be 2e5 at the beginning. SNES then has to work much harder to reach the abs norm levels and so I want to investigate what is happening. So this is the initial residual in the nonlinear solve. -snes_monitor_residual works for plotting if it is 2d otherwise write your own custom SNES monitor routine and call TSGetSNES, SNESSetMonitor(). Barry > > > On Tue, Mar 18, 2014 at 7:48 PM, Jed Brown wrote: > Mani Chandra writes: > > > Is there anyway I can get the residual at the beginning of the time step? I > > tried TSSetPreStage and TSSetPreStep but they didn't work. Setting > > TSSetPreStage would crash the program even if there is nothing in the > > routine that I set and TSSetPreStep would give me the residual at the end > > of the previous time step. I want to take a look at the residual at the > > beginning of the new time step before the nonlinear solver starts. > > Residual of what? What are you going to do with it? > From bsmith at mcs.anl.gov Tue Mar 18 21:14:18 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 18 Mar 2014 21:14:18 -0500 Subject: [petsc-users] Drastic increase in memory usage between DMDA_STENCIL_BOX and DMDA_STENCIL_STAR In-Reply-To: References: Message-ID: <051B67D0-C3F0-48B7-ABB7-978E2004AEBA@mcs.anl.gov> On Mar 18, 2014, at 8:04 PM, Mani Chandra wrote: > Hi, > > I see a 4x increase in the memory usage when I change from DMDA_STENCIL_STAR to DMDA_STENCIL_BOX. Attached are the outputs of -log_summary which shows a huge increase in the matrix memory usage. Is this expected? > > On a different note, suppose I am running a serial calculation with no need to exchange data but I am using corner node information, How can you use corner node information if you are not exchanging it? You will be using incorrect data > do I need to use DMDA_STENCIL_BOX? Would the jacobian when computed using colored finite differences be correctly represented if I use corner information but still use DMDA_STENCIL_STAR? 
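On the exchange point above: with a DMDA, neighbour values (including the diagonal corners) should be read from the ghosted local vector, and the corner ghost entries are only communicated when the DMDA was created with DMDA_STENCIL_BOX. A minimal sketch, assuming a DMDA named da and a global vector Xglobal from the surrounding code (both names made up):

#include <petscdmda.h>

/* Fill a ghosted local vector before a residual routine reads
   neighbouring values; with DMDA_STENCIL_STAR the corner entries of
   the ghost region are never updated. */
static PetscErrorCode FillGhosts(DM da, Vec Xglobal, Vec *Xlocal)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = DMGetLocalVector(da, Xlocal); CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(da, Xglobal, INSERT_VALUES, *Xlocal); CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(da, Xglobal, INSERT_VALUES, *Xlocal); CHKERRQ(ierr);
  /* the caller reads with DMDAVecGetArray()/DMDAVecRestoreArray() and
     finally returns the vector with DMRestoreLocalVector() */
  PetscFunctionReturn(0);
}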
> > Thanks, > Mani > > > From mc0710 at gmail.com Tue Mar 18 22:25:50 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Tue, 18 Mar 2014 22:25:50 -0500 Subject: [petsc-users] Drastic increase in memory usage between DMDA_STENCIL_BOX and DMDA_STENCIL_STAR In-Reply-To: <051B67D0-C3F0-48B7-ABB7-978E2004AEBA@mcs.anl.gov> References: <051B67D0-C3F0-48B7-ABB7-978E2004AEBA@mcs.anl.gov> Message-ID: I meant to say that since its a serial computation, I have all the information on the same node and I am not exchanging data with another node. I wanted to know if DMDA_STENCIL_BOX would also change the jacobian structure. Looks like it does. On Tue, Mar 18, 2014 at 9:14 PM, Barry Smith wrote: > > On Mar 18, 2014, at 8:04 PM, Mani Chandra wrote: > > > Hi, > > > > I see a 4x increase in the memory usage when I change from > DMDA_STENCIL_STAR to DMDA_STENCIL_BOX. Attached are the outputs of > -log_summary which shows a huge increase in the matrix memory usage. Is > this expected? > > > > On a different note, suppose I am running a serial calculation with no > need to exchange data but I am using corner node information, > > How can you use corner node information if you are not exchanging it? > You will be using incorrect data > > > > do I need to use DMDA_STENCIL_BOX? Would the jacobian when computed > using colored finite differences be correctly represented if I use corner > information but still use DMDA_STENCIL_STAR? > > > > Thanks, > > Mani > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mc0710 at gmail.com Tue Mar 18 22:56:38 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Tue, 18 Mar 2014 22:56:38 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: <6FE9F3FA-A207-41C5-8D20-CB294DD5CE88@mcs.anl.gov> References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> <87txavko8s.fsf@jedbrown.org> <6FE9F3FA-A207-41C5-8D20-CB294DD5CE88@mcs.anl.gov> Message-ID: I can't find SNESSetMonitor in petsc-dev. I get the following error even after including petscsnes.h error: ?SNESSetMonitor? was not declared in this scope Moreover, there is no manpage for it on the petsc-dev SNES website: http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/SNES/index.html Has it been removed? On Tue, Mar 18, 2014 at 9:11 PM, Barry Smith wrote: > > On Mar 18, 2014, at 7:54 PM, Mani Chandra wrote: > > > The residual of the system of equation that I feed into TS with > TSSetIFunction. I have a system of equations and I want to probe where most > of the residual is coming from. The reason is that after certain time of > evolution, the initial residual at the beginning of each time step > increases by orders of magnitude than what it used to be at the beginning > of the time step at early times. For ex, say at t=200, the SNES norm at the > beginning of a TS timestep with the theta method would be something like > 1.0812 and at t = 300, it would be 2e5 at the beginning. SNES then has to > work much harder to reach the abs norm levels and so I want to investigate > what is happening. > > So this is the initial residual in the nonlinear solve. > -snes_monitor_residual works for plotting if it is 2d otherwise write your > own custom SNES monitor routine and call TSGetSNES, SNESSetMonitor(). > > Barry > > > > > > > On Tue, Mar 18, 2014 at 7:48 PM, Jed Brown wrote: > > Mani Chandra writes: > > > > > Is there anyway I can get the residual at the beginning of the time > step? 
I > > > tried TSSetPreStage and TSSetPreStep but they didn't work. Setting > > > TSSetPreStage would crash the program even if there is nothing in the > > > routine that I set and TSSetPreStep would give me the residual at the > end > > > of the previous time step. I want to take a look at the residual at the > > > beginning of the new time step before the nonlinear solver starts. > > > > Residual of what? What are you going to do with it? > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Mar 18 23:11:44 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 18 Mar 2014 23:11:44 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> <87txavko8s.fsf@jedbrown.org> <6FE9F3FA-A207-41C5-8D20-CB294DD5CE88@mcs.anl.gov> Message-ID: Sorry http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/SNES/SNESMonitorSet.html On Mar 18, 2014, at 10:56 PM, Mani Chandra wrote: > I can't find SNESSetMonitor in petsc-dev. I get the following error even after including petscsnes.h > > error: ?SNESSetMonitor? was not declared in this scope > > Moreover, there is no manpage for it on the petsc-dev SNES website: http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/SNES/index.html > > Has it been removed? > > > On Tue, Mar 18, 2014 at 9:11 PM, Barry Smith wrote: > > On Mar 18, 2014, at 7:54 PM, Mani Chandra wrote: > > > The residual of the system of equation that I feed into TS with TSSetIFunction. I have a system of equations and I want to probe where most of the residual is coming from. The reason is that after certain time of evolution, the initial residual at the beginning of each time step increases by orders of magnitude than what it used to be at the beginning of the time step at early times. For ex, say at t=200, the SNES norm at the beginning of a TS timestep with the theta method would be something like 1.0812 and at t = 300, it would be 2e5 at the beginning. SNES then has to work much harder to reach the abs norm levels and so I want to investigate what is happening. > > So this is the initial residual in the nonlinear solve. -snes_monitor_residual works for plotting if it is 2d otherwise write your own custom SNES monitor routine and call TSGetSNES, SNESSetMonitor(). > > Barry > > > > > > > On Tue, Mar 18, 2014 at 7:48 PM, Jed Brown wrote: > > Mani Chandra writes: > > > > > Is there anyway I can get the residual at the beginning of the time step? I > > > tried TSSetPreStage and TSSetPreStep but they didn't work. Setting > > > TSSetPreStage would crash the program even if there is nothing in the > > > routine that I set and TSSetPreStep would give me the residual at the end > > > of the previous time step. I want to take a look at the residual at the > > > beginning of the new time step before the nonlinear solver starts. > > > > Residual of what? What are you going to do with it? > > > > From mc0710 at gmail.com Tue Mar 18 23:14:33 2014 From: mc0710 at gmail.com (Mani Chandra) Date: Tue, 18 Mar 2014 23:14:33 -0500 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> <87txavko8s.fsf@jedbrown.org> <6FE9F3FA-A207-41C5-8D20-CB294DD5CE88@mcs.anl.gov> Message-ID: Ahh thanks. 
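To make the monitor suggestion concrete, a minimal sketch; the function name is made up, and it assumes the vector returned by SNESGetFunction() is the residual you want to inspect:

#include <petscts.h>

/* View the nonlinear residual at every SNES iteration so its spatial
   distribution can be inspected; "it" is the iteration number and
   "rnorm" the residual norm supplied by SNES. */
static PetscErrorCode MonitorResidual(SNES snes, PetscInt it, PetscReal rnorm, void *ctx)
{
  PetscErrorCode ierr;
  Vec            r;

  PetscFunctionBegin;
  ierr = SNESGetFunction(snes, &r, NULL, NULL); CHKERRQ(ierr);
  ierr = VecView(r, PETSC_VIEWER_DRAW_WORLD); CHKERRQ(ierr); /* or any other viewer */
  PetscFunctionReturn(0);
}

Registered once after the TS is created:

  SNES snes;
  ierr = TSGetSNES(ts, &snes); CHKERRQ(ierr);
  ierr = SNESMonitorSet(snes, MonitorResidual, NULL, NULL); CHKERRQ(ierr);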
I got confused cause there apparently is indeed a function called SNESSetMonitor in a (very) old version of petsc: http://www.mcs.anl.gov/petsc/petsc-2.3.1/docs/manualpages/SNES/SNESSetMonitor.html On Tue, Mar 18, 2014 at 11:11 PM, Barry Smith wrote: > > Sorry > http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/SNES/SNESMonitorSet.html > > > On Mar 18, 2014, at 10:56 PM, Mani Chandra wrote: > > > I can't find SNESSetMonitor in petsc-dev. I get the following error even > after including petscsnes.h > > > > error: ?SNESSetMonitor? was not declared in this scope > > > > Moreover, there is no manpage for it on the petsc-dev SNES website: > http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/SNES/index.html > > > > Has it been removed? > > > > > > On Tue, Mar 18, 2014 at 9:11 PM, Barry Smith wrote: > > > > On Mar 18, 2014, at 7:54 PM, Mani Chandra wrote: > > > > > The residual of the system of equation that I feed into TS with > TSSetIFunction. I have a system of equations and I want to probe where most > of the residual is coming from. The reason is that after certain time of > evolution, the initial residual at the beginning of each time step > increases by orders of magnitude than what it used to be at the beginning > of the time step at early times. For ex, say at t=200, the SNES norm at the > beginning of a TS timestep with the theta method would be something like > 1.0812 and at t = 300, it would be 2e5 at the beginning. SNES then has to > work much harder to reach the abs norm levels and so I want to investigate > what is happening. > > > > So this is the initial residual in the nonlinear solve. > -snes_monitor_residual works for plotting if it is 2d otherwise write your > own custom SNES monitor routine and call TSGetSNES, SNESSetMonitor(). > > > > Barry > > > > > > > > > > > On Tue, Mar 18, 2014 at 7:48 PM, Jed Brown wrote: > > > Mani Chandra writes: > > > > > > > Is there anyway I can get the residual at the beginning of the time > step? I > > > > tried TSSetPreStage and TSSetPreStep but they didn't work. Setting > > > > TSSetPreStage would crash the program even if there is nothing in the > > > > routine that I set and TSSetPreStep would give me the residual at > the end > > > > of the previous time step. I want to take a look at the residual at > the > > > > beginning of the new time step before the nonlinear solver starts. > > > > > > Residual of what? What are you going to do with it? > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Mar 19 01:01:08 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 19 Mar 2014 07:01:08 +0100 Subject: [petsc-users] Spatial distribution of the residual after an implicit TS timestep In-Reply-To: References: <9523A33B-3F7D-4C44-9DFF-361FF72546A0@mcs.anl.gov> <87txavko8s.fsf@jedbrown.org> <6FE9F3FA-A207-41C5-8D20-CB294DD5CE88@mcs.anl.gov> Message-ID: <87lhw6lobf.fsf@jedbrown.org> Mani Chandra writes: > I can't find SNESSetMonitor in petsc-dev. I get the following error even > after including petscsnes.h > > error: ?SNESSetMonitor? was not declared in this scope > > Moreover, there is no manpage for it on the petsc-dev SNES website: > http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/SNES/index.html > > Has it been removed? It has been spelled SNESMonitorSet since v2.3.3 (2007). http://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/SNES/SNESMonitorSet.html -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From oliver.browne at upm.es Wed Mar 19 12:03:59 2014 From: oliver.browne at upm.es (Oliver Browne) Date: Wed, 19 Mar 2014 18:03:59 +0100 Subject: [petsc-users] Fwd: Convergence problems Message-ID: <6bfbcf14407c675fa30093ea78ba3f02@upm.es> To Whom It May Concern, I am using petsc to solve an eigenvalue problem Jq = oq I am obtaining the Jacobian matrix, J, from solving F(x) = 0 and by numerically perturbing the solution to give J = dF(x)/dx = F(x + delta) + F(x) / delta (1) I use the PETSc to solve Ax=b in the Arnoldi iteration to build the Hessenberg matrix. For small problems (1000 DoF) I obtain accurate results in a timely fashion by using ILU(2) and GMRES. However, when I move to bigger problems (60000 Dof) I can't get the GMRES to converge. 5 GMRES steps take an HOUR (it is incredible!!!). I have tried modifying the levels for ILU but with no success and a few different preconditioners. I have also tried some of the MatGetOrdering to improve the stability. I believe that it is down to the structure of my matrix, because I have tried similar size matrices obtained from a different code which creates the analytical Jacobian (has a different topology), of which I can get a convergence to GMRES in about 3 mins. I have attached a figure showing the structure of the matrix for a smaller problem but obtained with equation (1) above. I would be very grateful if you could give me some advice in choosing a different preconditioner/solver etc or anything else which could help. Thanks in advance Olls -------------- next part -------------- A non-text attachment was scrubbed... Name: MatrixStructure.png Type: image/png Size: 60505 bytes Desc: not available URL: From jroman at dsic.upv.es Wed Mar 19 12:17:17 2014 From: jroman at dsic.upv.es (Jose E. Roman) Date: Wed, 19 Mar 2014 18:17:17 +0100 Subject: [petsc-users] Fwd: Convergence problems In-Reply-To: <6bfbcf14407c675fa30093ea78ba3f02@upm.es> References: <6bfbcf14407c675fa30093ea78ba3f02@upm.es> Message-ID: <25D37C2C-843D-41BC-8D57-E16DF8A196F3@dsic.upv.es> El 19/03/2014, a las 18:03, Oliver Browne escribi?: > To Whom It May Concern, > > I am using petsc to solve an eigenvalue problem > > Jq = oq > > I am obtaining the Jacobian matrix, J, from solving > > F(x) = 0 > > and by numerically perturbing the solution to give > > J = dF(x)/dx = F(x + delta) + F(x) / delta (1) > > I use the PETSc to solve Ax=b in the Arnoldi iteration to build the Hessenberg matrix. > > For small problems (1000 DoF) I obtain accurate results in a timely fashion by using ILU(2) and GMRES. > > However, when I move to bigger problems (60000 Dof) I can't get the GMRES to converge. 5 GMRES steps take an HOUR (it is incredible!!!). I have tried modifying the levels for ILU but with no success and a few different preconditioners. I have also tried some of the MatGetOrdering to improve the stability. I believe that it is down to the structure of my matrix, because I have tried similar size matrices obtained from a different code which creates the analytical Jacobian (has a different topology), of which I can get a convergence to GMRES in about 3 mins. I have attached a figure showing the structure of the matrix for a smaller problem but obtained with equation (1) above. > > I would be very grateful if you could give me some advice in choosing a different preconditioner/solver etc or anything else which could help. 
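As typed, equation (1) above appears to have lost a bracket and a minus sign; the usual forward-difference approximation of the Jacobian action on a vector v is

  J(x) v  ≈  ( F(x + delta*v) - F(x) ) / delta

with delta a small scalar, which is also what PETSc's matrix-free MATMFFD machinery implements.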
> > Thanks in advance > > Olls Are you using SLEPc for the eigenvalue problem? Which eigenvalues do you want? Are you using GMRES for the shift-and-invert solve? Jose From oliver.browne at upm.es Wed Mar 19 12:22:59 2014 From: oliver.browne at upm.es (Oliver Browne) Date: Wed, 19 Mar 2014 18:22:59 +0100 Subject: [petsc-users] Fwd: Convergence problems In-Reply-To: <25D37C2C-843D-41BC-8D57-E16DF8A196F3@dsic.upv.es> References: <6bfbcf14407c675fa30093ea78ba3f02@upm.es> <25D37C2C-843D-41BC-8D57-E16DF8A196F3@dsic.upv.es> Message-ID: <0e5791fd194ee5f4e2861a86851635ce@upm.es> The problem at the moment is building the Hessenberg matrix. Once I have that I find the eigenvalues with LAPACK. So I am solving x = (J^I)b with PETSc (ILU and GMRES) for the first column of the Hessenberg matrix and for the big problem (Dof = 64000) I can't get it to converge. Regards, --- Oliver Browne Ph.D. Student School of Aeronautics (E.T.S.I.A.) Universidad Polit?cnica de Madrid Tel No: (0034) 913 366 326 ext: 201 http://matap.dmae.upm.es/numat/index.php/people/mod-fellow-in?persona=oliver http://www.anade-itn.eu/index.php/people/fellow-information On 19-03-2014 18:17, Jose E. Roman wrote: > El 19/03/2014, a las 18:03, Oliver Browne escribi?: > >> To Whom It May Concern, >> >> I am using petsc to solve an eigenvalue problem >> >> Jq = oq >> >> I am obtaining the Jacobian matrix, J, from solving >> >> F(x) = 0 >> >> and by numerically perturbing the solution to give >> >> J = dF(x)/dx = F(x + delta) + F(x) / delta (1) >> >> I use the PETSc to solve Ax=b in the Arnoldi iteration to build the >> Hessenberg matrix. >> >> For small problems (1000 DoF) I obtain accurate results in a timely >> fashion by using ILU(2) and GMRES. >> >> However, when I move to bigger problems (60000 Dof) I can't get the >> GMRES to converge. 5 GMRES steps take an HOUR (it is incredible!!!). I >> have tried modifying the levels for ILU but with no success and a few >> different preconditioners. I have also tried some of the >> MatGetOrdering to improve the stability. I believe that it is down to >> the structure of my matrix, because I have tried similar size matrices >> obtained from a different code which creates the analytical Jacobian >> (has a different topology), of which I can get a convergence to GMRES >> in about 3 mins. I have attached a figure showing the structure of the >> matrix for a smaller problem but obtained with equation (1) above. >> >> I would be very grateful if you could give me some advice in choosing >> a different preconditioner/solver etc or anything else which could >> help. >> >> Thanks in advance >> >> Olls > > Are you using SLEPc for the eigenvalue problem? Which eigenvalues do > you want? Are you using GMRES for the shift-and-invert solve? > > Jose From asmund.ervik at ntnu.no Wed Mar 19 15:44:35 2014 From: asmund.ervik at ntnu.no (=?Windows-1252?Q?=C5smund_Ervik?=) Date: Wed, 19 Mar 2014 20:44:35 +0000 Subject: [petsc-users] Writing solution data to file when using DMDA. In-Reply-To: <5A7CBF66-5E2D-4BC6-B116-C30452DDBF0E@mcs.anl.gov> References: <0E576811AB298343AC632BBCAAEFC37945BEFD17@WAREHOUSE08.win.ntnu.no>, <5A7CBF66-5E2D-4BC6-B116-C30452DDBF0E@mcs.anl.gov> Message-ID: <0E576811AB298343AC632BBCAAEFC37945BF0070@WAREHOUSE08.win.ntnu.no> Hi Barry, Thanks for the hints. I couldn't find any "DMDACreateNatural", but after some grepping of the manual, I ended up with this: [... in init] DMDACreateNaturalVector() VecScatterCreateToZero() [... inside a loop, e.g. 
over time steps] DMDAGlobalToNaturalBegin/End() VecScatterBegin/End() VecGetArray() [... write to file] VecRestoreArray() Is this what you meant, or have I misunderstood anything? It seems to work fine both in sequential and parallel, the resulting plots look OK, but it has uncovered some race condition that I need to fix now. Regards, ?smund ________________________________________ Fra: Barry Smith [bsmith at mcs.anl.gov] Sendt: 19. mars 2014 00:27 Til: ?smund Ervik Kopi: petsc-users at mcs.anl.gov Emne: Re: [petsc-users] Writing solution data to file when using DMDA. On Mar 18, 2014, at 5:19 PM, ?smund Ervik wrote: > Dear PETSc users, > > I'm trying to wrap my head around parallel I/O. If I understand correctly, a decent way of doing this is having one rank (say 0) writing to disk, and the other ranks communicating their part of the solution to rank 0. Please correct me if I'm wrong here. > > I'm using DMDA to manage my domain decomposition. As a first step, I've been trying to create an array on rank 0 holding the entire global solution and then writing this to file by re-using some routines from our serial codes (the format is Tecplot ASCII). (I realize that neither this approach nor an ASCII format are good solutions in the end, but I have to start somewhere.) However, I haven't been able to find any DMDA routines that give me an array holding the entire global solution on rank 0. Are there any, or is this too much of a "dirty trick"? (For just 1 process there is no problem, the output files generated look good.) DMDACreateNatural() DMDAGlobalToNaturalBegin/End() VecScatterCreateToZero VecGetArray() on process 0 the final array is in the natural ordering, x direction first, y direction second, z direction third. > > I'm also willing to try the VTK way of doing things, but I hit a problem when I tried that: even though I include "petscviewer.h" (also tried adding "petscviewerdef.h"), when I do > call PetscViewerSetType(viewer,PETSCVIEWERVTK,ierr) > my compiler complains that PETSCVIEWERVTK is undefined (has no implicit type). This is from Fortran90 using preprocessing macros to #include the files. I tried PETSCVIEWERASCII as well, same problem. This is with 3.4.3. Any hints on this? Hmm, they are in petscviewerdef.h in 3.4.4 but anyways you can pass ?vtk? or ?ascii? as the type > > Also, there are many different examples and mailing list threads about VTK output. What is the currently recommended way of doing things? I need to output at least (u,v,w) as vector components of one field, together with a scalar field (p). These currently have separate DM's, since I only use PETSc to solve for p (the pressure). > > Best regards, > ?smund From bsmith at mcs.anl.gov Wed Mar 19 15:46:01 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 19 Mar 2014 15:46:01 -0500 Subject: [petsc-users] Writing solution data to file when using DMDA. In-Reply-To: <0E576811AB298343AC632BBCAAEFC37945BF0070@WAREHOUSE08.win.ntnu.no> References: <0E576811AB298343AC632BBCAAEFC37945BEFD17@WAREHOUSE08.win.ntnu.no>, <5A7CBF66-5E2D-4BC6-B116-C30452DDBF0E@mcs.anl.gov> <0E576811AB298343AC632BBCAAEFC37945BF0070@WAREHOUSE08.win.ntnu.no> Message-ID: <7009AD67-583C-419B-8797-CB4D0F3EFDE6@mcs.anl.gov> Yes, sorry. On Mar 19, 2014, at 3:44 PM, ?smund Ervik wrote: > Hi Barry, > > Thanks for the hints. I couldn't find any "DMDACreateNatural", but after some grepping of the manual, I ended up with this: > > [... in init] > DMDACreateNaturalVector() > VecScatterCreateToZero() > > [... inside a loop, e.g. 
over time steps] > DMDAGlobalToNaturalBegin/End() > VecScatterBegin/End() > VecGetArray() > [... write to file] > VecRestoreArray() > > Is this what you meant, or have I misunderstood anything? > > It seems to work fine both in sequential and parallel, the resulting plots look OK, but it has uncovered some race condition that I need to fix now. > > Regards, > ?smund > > ________________________________________ > Fra: Barry Smith [bsmith at mcs.anl.gov] > Sendt: 19. mars 2014 00:27 > Til: ?smund Ervik > Kopi: petsc-users at mcs.anl.gov > Emne: Re: [petsc-users] Writing solution data to file when using DMDA. > > On Mar 18, 2014, at 5:19 PM, ?smund Ervik wrote: > >> Dear PETSc users, >> >> I'm trying to wrap my head around parallel I/O. If I understand correctly, a decent way of doing this is having one rank (say 0) writing to disk, and the other ranks communicating their part of the solution to rank 0. Please correct me if I'm wrong here. >> >> I'm using DMDA to manage my domain decomposition. As a first step, I've been trying to create an array on rank 0 holding the entire global solution and then writing this to file by re-using some routines from our serial codes (the format is Tecplot ASCII). (I realize that neither this approach nor an ASCII format are good solutions in the end, but I have to start somewhere.) However, I haven't been able to find any DMDA routines that give me an array holding the entire global solution on rank 0. Are there any, or is this too much of a "dirty trick"? (For just 1 process there is no problem, the output files generated look good.) > > DMDACreateNatural() > DMDAGlobalToNaturalBegin/End() > VecScatterCreateToZero > VecGetArray() on process 0 > > the final array is in the natural ordering, x direction first, y direction second, z direction third. >> >> I'm also willing to try the VTK way of doing things, but I hit a problem when I tried that: even though I include "petscviewer.h" (also tried adding "petscviewerdef.h"), when I do >> call PetscViewerSetType(viewer,PETSCVIEWERVTK,ierr) >> my compiler complains that PETSCVIEWERVTK is undefined (has no implicit type). This is from Fortran90 using preprocessing macros to #include the files. I tried PETSCVIEWERASCII as well, same problem. This is with 3.4.3. Any hints on this? > > Hmm, they are in petscviewerdef.h in 3.4.4 but anyways you can pass ?vtk? or ?ascii? as the type >> >> Also, there are many different examples and mailing list threads about VTK output. What is the currently recommended way of doing things? I need to output at least (u,v,w) as vector components of one field, together with a scalar field (p). These currently have separate DM's, since I only use PETSc to solve for p (the pressure). >> >> Best regards, >> ?smund > From lu_qin_2000 at yahoo.com Wed Mar 19 16:05:02 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Wed, 19 Mar 2014 14:05:02 -0700 (PDT) Subject: [petsc-users] KSPSolve crash In-Reply-To: <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25DF@BUTKUS.anl.gov> References: <1394739814.87919.YahooMailNeo@web160201.mail.bf1.yahoo.com> <1394739932.48891.YahooMailNeo@web160206.mail.bf1.yahoo.com> <41A0D070-A1EA-41CC-9E8E-D49D96E8236C@mcs.anl.gov> <1394744574.24008.YahooMailNeo@web160203.mail.bf1.yahoo.com>, <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25DF@BUTKUS.anl.gov> Message-ID: <1395263102.95080.YahooMailNeo@web160205.mail.bf1.yahoo.com> I made a few more tests. The weird thing is that the case runs in 2 win-7 workstations but failed in other 2 win-7 workstations. 
At the failing workstations it did not print out any call stack when it crashes (I tried with both release and debug?PETSc libs), however, win-7 itself?popped out a window that says "A problem caused the program to stop working correctly. Windows will close the program and notify you if a solution is available." Does this mean the program was trying to thing such as writing to?reserved memory address? ? Unfortunately, I can only run the program from command line in the failing workstations since I am not allowd to install VS debuger in them. ? Thanks for any info, ? Qin ________________________________ From: "Balay, Satish" To: "Smith, Barry F." ; Qin Lu Cc: petsc-users Sent: Thursday, March 13, 2014 4:25 PM Subject: RE: [petsc-users] KSPSolve crash Also run Linux version in valgrind ________________________________ From: Barry Smith Sent: ?3/?13/?2014 2:22 PM To: Qin Lu Cc: petsc-users Subject: Re: [petsc-users] KSPSolve crash ?? Compile in debug mode and run in the debugger (visual studio has it built in). On Mar 13, 2014, at 4:02 PM, Qin Lu wrote: > Yes, it crashed without any printout. It always crashes at the first call to KSPSolve. Currently I only has a release version of petsc lib although my program has debug version. >? > Thanks, > Qin > > > ----- Original Message ----- > From: Barry Smith > To: Qin Lu > Cc: petsc-users > Sent: Thursday, March 13, 2014 3:57 PM > Subject: Re: [petsc-users] KSPSolve crash > > >?? How did it crash? Absolutely nothing printed to the screen, the program just ended? Please send any output. Since it worked on linux it is likely something specific to the windows machine like lack of memory, a compiler bug, ?.? Does it always crash at the same place, can you run it in the debugger on the windows machine and where does it end up? > >?? Barry > > > On Mar 13, 2014, at 2:45 PM, Qin Lu wrote: > >> I forget to mention: it crashed in Win-7 only, while it runs fine in Linux. >>?? >> Qin >> >> >> ----- Original Message ----- >> From: Qin Lu >> To: petsc-users >> Cc: >> Sent: Thursday, March 13, 2014 2:43 PM >> Subject: [petsc-users] KSPSolve crash >> >> PETSc team, >> >> I have a program using PETSc linear solver (using KSPBCG with PCILU level 0). But with one case it crashed inside KSPSolve without any error message. I have tested this program with many other cases successfully. >> >> Could you debug PETSc with the linear system? I can send you the matrix and rhs if you let me know where to upload them. >> >> Many thanks, >> Qin????? -------------- next part -------------- An HTML attachment was scrubbed... URL: From asmund.ervik at ntnu.no Wed Mar 19 16:06:33 2014 From: asmund.ervik at ntnu.no (=?iso-8859-1?Q?=C5smund_Ervik?=) Date: Wed, 19 Mar 2014 21:06:33 +0000 Subject: [petsc-users] Writing solution data to file when using DMDA. In-Reply-To: <87a9cnm6d6.fsf@jedbrown.org> References: <0E576811AB298343AC632BBCAAEFC37945BEFD17@WAREHOUSE08.win.ntnu.no>, <87a9cnm6d6.fsf@jedbrown.org> Message-ID: <0E576811AB298343AC632BBCAAEFC37945BF009B@WAREHOUSE08.win.ntnu.no> Hi Jed, Yeah, the actual include lines I use are of the form #include It looks like I figured it out in the end: my main file has many "use" statements at the beginning. One of the files that are "used" has all the #include stuff, and that entire module is public. Thus I can't #include stuff at the top of my main file, that would result in conflicting definitions. This works for 99% of things, so I can call e.g. DMDA routines in the main file and everything is fine. But certain things, e.g. 
constants like PETSCVIEWERVTK or macros like CHKERRQ don't get pulled in correctly. This is probably due to some Fortran preprocessing subtlety that I don't understand. What I can do short-term is put the stuff that needs PETSCVIEWERVTK and similar constants inside a function in some other module where I can #include stuff properly. For the long term, I guess it's "to the refactoring-mobile". Regards, ?smund ________________________________________ Fra: Jed Brown [jed at jedbrown.org] Sendt: 19. mars 2014 00:31 Til: ?smund Ervik; petsc-users at mcs.anl.gov Emne: Re: [petsc-users] Writing solution data to file when using DMDA. ?smund Ervik writes: > I'm also willing to try the VTK way of doing things, but I hit a > problem when I tried that: even though I include "petscviewer.h" (also > tried adding "petscviewerdef.h"), when I do > call PetscViewerSetType(viewer,PETSCVIEWERVTK,ierr) my compiler > complains that PETSCVIEWERVTK is undefined (has no implicit > type). Should be Is it actually included? Are you sure you have v3.4.3? $ git grep VIEWERVTK v3.4.3 include/finclude/petscviewerdef.h v3.4.3:include/finclude/petscviewerdef.h:#define PETSCVIEWERVTK 'vtk' > This is from Fortran90 using preprocessing macros to #include the > files. I tried PETSCVIEWERASCII as well, same problem. This is with > 3.4.3. Any hints on this? From bsmith at mcs.anl.gov Wed Mar 19 16:36:15 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 19 Mar 2014 16:36:15 -0500 Subject: [petsc-users] KSPSolve crash In-Reply-To: <1395263102.95080.YahooMailNeo@web160205.mail.bf1.yahoo.com> References: <1394739814.87919.YahooMailNeo@web160201.mail.bf1.yahoo.com> <1394739932.48891.YahooMailNeo@web160206.mail.bf1.yahoo.com> <41A0D070-A1EA-41CC-9E8E-D49D96E8236C@mcs.anl.gov> <1394744574.24008.YahooMailNeo@web160203.mail.bf1.yahoo.com>, <8F84BD8FD72DC64AA206CF6E3D7FC47413FD25DF@BUTKUS.anl.gov> <1395263102.95080.YahooMailNeo@web160205.mail.bf1.yahoo.com> Message-ID: <9E95630B-9E8D-480C-87E2-C79B38BF4C99@mcs.anl.gov> Best to run under valgrind to make sure there is not a memory access error that only kills the program sometimes. Barry On Mar 19, 2014, at 4:05 PM, Qin Lu wrote: > I made a few more tests. The weird thing is that the case runs in 2 win-7 workstations but failed in other 2 win-7 workstations. At the failing workstations it did not print out any call stack when it crashes (I tried with both release and debug PETSc libs), however, win-7 itself popped out a window that says "A problem caused the program to stop working correctly. Windows will close the program and notify you if a solution is available." Does this mean the program was trying to thing such as writing to reserved memory address? > > Unfortunately, I can only run the program from command line in the failing workstations since I am not allowd to install VS debuger in them. > > Thanks for any info, > > Qin > > From: "Balay, Satish" > To: "Smith, Barry F." ; Qin Lu > Cc: petsc-users > Sent: Thursday, March 13, 2014 4:25 PM > Subject: RE: [petsc-users] KSPSolve crash > > Also run Linux version in valgrind > From: Barry Smith > Sent: ?3/?13/?2014 2:22 PM > To: Qin Lu > Cc: petsc-users > Subject: Re: [petsc-users] KSPSolve crash > > > Compile in debug mode and run in the debugger (visual studio has it built in). > > On Mar 13, 2014, at 4:02 PM, Qin Lu wrote: > > > Yes, it crashed without any printout. It always crashes at the first call to KSPSolve. Currently I only has a release version of petsc lib although my program has debug version. 
> > > > Thanks, > > Qin > > > > > > ----- Original Message ----- > > From: Barry Smith > > To: Qin Lu > > Cc: petsc-users > > Sent: Thursday, March 13, 2014 3:57 PM > > Subject: Re: [petsc-users] KSPSolve crash > > > > > > How did it crash? Absolutely nothing printed to the screen, the program just ended? Please send any output. Since it worked on linux it is likely something specific to the windows machine like lack of memory, a compiler bug, ?. Does it always crash at the same place, can you run it in the debugger on the windows machine and where does it end up? > > > > Barry > > > > > > On Mar 13, 2014, at 2:45 PM, Qin Lu wrote: > > > >> I forget to mention: it crashed in Win-7 only, while it runs fine in Linux. > >> > >> Qin > >> > >> > >> ----- Original Message ----- > >> From: Qin Lu > >> To: petsc-users > >> Cc: > >> Sent: Thursday, March 13, 2014 2:43 PM > >> Subject: [petsc-users] KSPSolve crash > >> > >> PETSc team, > >> > >> I have a program using PETSc linear solver (using KSPBCG with PCILU level 0). But with one case it crashed inside KSPSolve without any error message. I have tested this program with many other cases successfully. > >> > >> Could you debug PETSc with the linear system? I can send you the matrix and rhs if you let me know where to upload them. > >> > >> Many thanks, > >> Qin From fd.kong at siat.ac.cn Wed Mar 19 16:49:36 2014 From: fd.kong at siat.ac.cn (Fande Kong) Date: Wed, 19 Mar 2014 15:49:36 -0600 Subject: [petsc-users] what kind of stuffs I forget to free? Message-ID: Hi, I run my code with options: -malloc_debug -malloc_dump, then the following messages are produced, but nothing happened in my code. I want to know what kind of objects I forgot freeing. Any suggestions? [0]Total space allocated 6864 bytes [ 0]256 bytes PetscSplitReductionCreate() line 91 in /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c [ 0]256 bytes PetscSplitReductionCreate() line 88 in /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c [ 0]512 bytes PetscSplitReductionCreate() line 87 in /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c [ 0]512 bytes PetscSplitReductionCreate() line 86 in /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c [ 0]80 bytes PetscSplitReductionCreate() line 81 in /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c [ 0]16 bytes PetscThreadCommReductionCreate() line 448 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c [ 0]512 bytes PetscThreadCommReductionCreate() line 440 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c [ 0]256 bytes PetscThreadCommReductionCreate() line 436 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c [ 0]1280 bytes PetscThreadCommReductionCreate() line 435 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c [ 0]32 bytes PetscThreadCommReductionCreate() line 432 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c [ 0]128 bytes PetscThreadCommWorldInitialize() line 1242 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c [ 0]2560 bytes PetscThreadCommWorldInitialize() line 1241 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c [ 0]32 bytes PetscThreadCommWorldInitialize() line 1233 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c [ 0]16 bytes PetscThreadCommSetAffinities() line 424 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c [ 0]48 bytes PetscThreadCommCreate() line 150 in 
/home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c [ 0]336 bytes PetscThreadCommCreate() line 146 in /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c [ 0]32 bytes PetscCommDuplicate() line 151 in /home/fdkong/math/petsc-3.4.1/src/sys/objects/tagm.c Fande, -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Mar 19 17:58:13 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 19 Mar 2014 23:58:13 +0100 Subject: [petsc-users] what kind of stuffs I forget to free? In-Reply-To: References: Message-ID: <8738idlrsq.fsf@jedbrown.org> Fande Kong writes: > I run my code with options: -malloc_debug -malloc_dump, then the following > messages are produced, but nothing happened in my code. I want to know what > kind of objects I forgot freeing. Any suggestions? Huh, those should show more stack. Before trying to think hard, can you run with valgrind? > [0]Total space allocated 6864 bytes > [ 0]256 bytes PetscSplitReductionCreate() line 91 in > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > [ 0]256 bytes PetscSplitReductionCreate() line 88 in > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > [ 0]512 bytes PetscSplitReductionCreate() line 87 in > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > [ 0]512 bytes PetscSplitReductionCreate() line 86 in > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > [ 0]80 bytes PetscSplitReductionCreate() line 81 in > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > [ 0]16 bytes PetscThreadCommReductionCreate() line 448 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > [ 0]512 bytes PetscThreadCommReductionCreate() line 440 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > [ 0]256 bytes PetscThreadCommReductionCreate() line 436 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > [ 0]1280 bytes PetscThreadCommReductionCreate() line 435 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > [ 0]32 bytes PetscThreadCommReductionCreate() line 432 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > [ 0]128 bytes PetscThreadCommWorldInitialize() line 1242 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > [ 0]2560 bytes PetscThreadCommWorldInitialize() line 1241 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > [ 0]32 bytes PetscThreadCommWorldInitialize() line 1233 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > [ 0]16 bytes PetscThreadCommSetAffinities() line 424 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > [ 0]48 bytes PetscThreadCommCreate() line 150 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > [ 0]336 bytes PetscThreadCommCreate() line 146 in > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > [ 0]32 bytes PetscCommDuplicate() line 151 in > /home/fdkong/math/petsc-3.4.1/src/sys/objects/tagm.c > > > Fande, -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From derek.gaston at inl.gov Wed Mar 19 18:27:46 2014 From: derek.gaston at inl.gov (Gaston, Derek R) Date: Wed, 19 Mar 2014 19:27:46 -0400 Subject: [petsc-users] PETSc-based Multiphysics Package Open Sourced! 
Message-ID: After six years of development, Idaho National Laboratory is proud to announce that the Multiphysics Object Oriented Simulation Environment (MOOSE) framework is now available as open source software! The main website can be found at http://mooseframework.com and the code is being developed on GitHub: https://github.com/idaholab/moose Built on top of PETSc and libMesh, MOOSE provides a modular, pluggable system aimed at accelerating the development of complex, multiphysics applications. Over the last six years it was licensed by over 60 institutions world-wide and has been utilized to create over 40 different multiphysics applications simulating everything from groundwater migration to nuclear reactors. A sample of some of the capability within MOOSE: * Fully-coupled, fully-implicit multiphysics solver * Dimension independent physics * Scalable hybrid parallelism (largest runs >100,000 CPU cores!) * Modular development simplifies code reuse * Built-in mesh adaptivity * Continuous and Discontinuous Galerkin (DG) (at the same time!) * Intuitive parallel multiscale solves * Dimension agnostic, parallel geometric search (for contact related applications) * Flexible, plugable graphical user interface * ~30 plugable interfaces allow specialization of every part of the solve For examples of what MOOSE-based applications can do see these Youtube videos: https://www.youtube.com/watch?v=V-2VfET8SNw https://www.youtube.com/watch?v=4xTfQxpGAI4 https://www.youtube.com/watch?v=0oz8FD3H52s -------------- next part -------------- An HTML attachment was scrubbed... URL: From vijay.m at gmail.com Wed Mar 19 19:39:14 2014 From: vijay.m at gmail.com (Vijay S. Mahadevan) Date: Wed, 19 Mar 2014 19:39:14 -0500 Subject: [petsc-users] Config option --download-suitesparse fails Message-ID: I think someone at UFL messed up the webspace that hosts Suitesparse. The locations are not recognized by the server. I can't even find a manual download link at their site. Google redirects me to the same 403 error page too. For the time being, is it possibly mirrored at one of the ANL systems for download ? Relevant error message with: ./configure PETSC_ARCH=standalone_packages --with-debugging=1 --with-pic=1 --with-mpi-dir=/usr/software/mpich-3.0.4 --with-blas-lapack-dir=/usr --download-hypre=yes --with-shared-libraries=1 --with-clanguage=C++ --download-suitesparse=yes .... =============================================================================== Trying to download http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz for SUITESPARSE =============================================================================== ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- file could not be opened successfully Downloaded package SuiteSparse from: http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz is not a tarball. [or installed python cannot process compressed files] * If you are behind a firewall - please fix your proxy and rerun ./configure For example at LANL you may need to set the environmental variable http_proxy (or HTTP_PROXY?) 
to http://proxyout.lanl.gov * Alternatively, you can download the above URL manually, to /yourselectedlocation/SuiteSparse-4.2.1.tar.gz and use the configure option: --download-suitesparse=/yourselectedlocation/SuiteSparse-4.2.1.tar.gz ******************************************************************************* Vijay From balay at mcs.anl.gov Wed Mar 19 19:54:47 2014 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 19 Mar 2014 19:54:47 -0500 Subject: [petsc-users] Config option --download-suitesparse fails In-Reply-To: References: Message-ID: I pushed a change to suitesparse.py to have a backup url at our ftp server. Shri - if you have a acopy of the suitesparse tarball - I can place it at our ftp site Satish On Wed, 19 Mar 2014, Vijay S. Mahadevan wrote: > I think someone at UFL messed up the webspace that hosts Suitesparse. > The locations are not recognized by the server. I can't even find a > manual download link at their site. Google redirects me to the same > 403 error page too. > > For the time being, is it possibly mirrored at one of the ANL systems > for download ? > > > Relevant error message with: > > ./configure PETSC_ARCH=standalone_packages --with-debugging=1 > --with-pic=1 --with-mpi-dir=/usr/software/mpich-3.0.4 > --with-blas-lapack-dir=/usr --download-hypre=yes > --with-shared-libraries=1 --with-clanguage=C++ > --download-suitesparse=yes > .... > =============================================================================== > > > Trying to download > http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz > for SUITESPARSE > > =============================================================================== > ******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log > for details): > ------------------------------------------------------------------------------- > file could not be opened successfully > Downloaded package SuiteSparse from: > http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz > is not a tarball. > [or installed python cannot process compressed files] > * If you are behind a firewall - please fix your proxy and rerun ./configure > For example at LANL you may need to set the environmental variable > http_proxy (or HTTP_PROXY?) to http://proxyout.lanl.gov > * Alternatively, you can download the above URL manually, to > /yourselectedlocation/SuiteSparse-4.2.1.tar.gz > and use the configure option: > --download-suitesparse=/yourselectedlocation/SuiteSparse-4.2.1.tar.gz > ******************************************************************************* > > Vijay > From balay at mcs.anl.gov Wed Mar 19 20:28:09 2014 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 19 Mar 2014 20:28:09 -0500 Subject: [petsc-users] Config option --download-suitesparse fails In-Reply-To: References: Message-ID: The tarball is now at our ftp server - so configure should work now. Satish On Wed, 19 Mar 2014, Satish Balay wrote: > I pushed a change to suitesparse.py to have a backup url at our ftp server. > > Shri - if you have a acopy of the suitesparse tarball - I can place it > at our ftp site > > Satish > > On Wed, 19 Mar 2014, Vijay S. Mahadevan wrote: > > > I think someone at UFL messed up the webspace that hosts Suitesparse. > > The locations are not recognized by the server. I can't even find a > > manual download link at their site. Google redirects me to the same > > 403 error page too. 
> > > > For the time being, is it possibly mirrored at one of the ANL systems > > for download ? > > > > > > Relevant error message with: > > > > ./configure PETSC_ARCH=standalone_packages --with-debugging=1 > > --with-pic=1 --with-mpi-dir=/usr/software/mpich-3.0.4 > > --with-blas-lapack-dir=/usr --download-hypre=yes > > --with-shared-libraries=1 --with-clanguage=C++ > > --download-suitesparse=yes > > .... > > =============================================================================== > > > > > > Trying to download > > http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz > > for SUITESPARSE > > > > =============================================================================== > > ******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log > > for details): > > ------------------------------------------------------------------------------- > > file could not be opened successfully > > Downloaded package SuiteSparse from: > > http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz > > is not a tarball. > > [or installed python cannot process compressed files] > > * If you are behind a firewall - please fix your proxy and rerun ./configure > > For example at LANL you may need to set the environmental variable > > http_proxy (or HTTP_PROXY?) to http://proxyout.lanl.gov > > * Alternatively, you can download the above URL manually, to > > /yourselectedlocation/SuiteSparse-4.2.1.tar.gz > > and use the configure option: > > --download-suitesparse=/yourselectedlocation/SuiteSparse-4.2.1.tar.gz > > ******************************************************************************* > > > > Vijay > > > > From vijay.m at gmail.com Wed Mar 19 20:39:37 2014 From: vijay.m at gmail.com (Vijay S. Mahadevan) Date: Wed, 19 Mar 2014 20:39:37 -0500 Subject: [petsc-users] Config option --download-suitesparse fails In-Reply-To: References: Message-ID: Thanks Satish. Will pull and try now. On Wed, Mar 19, 2014 at 8:28 PM, Satish Balay wrote: > The tarball is now at our ftp server - so configure should work now. > > Satish > > On Wed, 19 Mar 2014, Satish Balay wrote: > >> I pushed a change to suitesparse.py to have a backup url at our ftp server. >> >> Shri - if you have a acopy of the suitesparse tarball - I can place it >> at our ftp site >> >> Satish >> >> On Wed, 19 Mar 2014, Vijay S. Mahadevan wrote: >> >> > I think someone at UFL messed up the webspace that hosts Suitesparse. >> > The locations are not recognized by the server. I can't even find a >> > manual download link at their site. Google redirects me to the same >> > 403 error page too. >> > >> > For the time being, is it possibly mirrored at one of the ANL systems >> > for download ? >> > >> > >> > Relevant error message with: >> > >> > ./configure PETSC_ARCH=standalone_packages --with-debugging=1 >> > --with-pic=1 --with-mpi-dir=/usr/software/mpich-3.0.4 >> > --with-blas-lapack-dir=/usr --download-hypre=yes >> > --with-shared-libraries=1 --with-clanguage=C++ >> > --download-suitesparse=yes >> > .... 
>> > =============================================================================== >> > >> > >> > Trying to download >> > http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz >> > for SUITESPARSE >> > >> > =============================================================================== >> > ******************************************************************************* >> > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log >> > for details): >> > ------------------------------------------------------------------------------- >> > file could not be opened successfully >> > Downloaded package SuiteSparse from: >> > http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz >> > is not a tarball. >> > [or installed python cannot process compressed files] >> > * If you are behind a firewall - please fix your proxy and rerun ./configure >> > For example at LANL you may need to set the environmental variable >> > http_proxy (or HTTP_PROXY?) to http://proxyout.lanl.gov >> > * Alternatively, you can download the above URL manually, to >> > /yourselectedlocation/SuiteSparse-4.2.1.tar.gz >> > and use the configure option: >> > --download-suitesparse=/yourselectedlocation/SuiteSparse-4.2.1.tar.gz >> > ******************************************************************************* >> > >> > Vijay >> > >> >> > From vijay.m at gmail.com Wed Mar 19 21:24:06 2014 From: vijay.m at gmail.com (Vijay S. Mahadevan) Date: Wed, 19 Mar 2014 21:24:06 -0500 Subject: [petsc-users] Config option --download-suitesparse fails In-Reply-To: References: Message-ID: Resolved. Thanks again Satish. =============================================================================== Trying to download http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz for SUITESPARSE =============================================================================== =============================================================================== Trying to download http://ftp.mcs.anl.gov/pub/petsc/externalpackages/SuiteSparse-4.2.1.tar.gz for SUITESPARSE =============================================================================== =============================================================================== Compiling SuiteSparse; this may take several minutes =============================================================================== Vijay On Wed, Mar 19, 2014 at 8:39 PM, Vijay S. Mahadevan wrote: > Thanks Satish. Will pull and try now. > > On Wed, Mar 19, 2014 at 8:28 PM, Satish Balay wrote: >> The tarball is now at our ftp server - so configure should work now. >> >> Satish >> >> On Wed, 19 Mar 2014, Satish Balay wrote: >> >>> I pushed a change to suitesparse.py to have a backup url at our ftp server. >>> >>> Shri - if you have a acopy of the suitesparse tarball - I can place it >>> at our ftp site >>> >>> Satish >>> >>> On Wed, 19 Mar 2014, Vijay S. Mahadevan wrote: >>> >>> > I think someone at UFL messed up the webspace that hosts Suitesparse. >>> > The locations are not recognized by the server. I can't even find a >>> > manual download link at their site. Google redirects me to the same >>> > 403 error page too. >>> > >>> > For the time being, is it possibly mirrored at one of the ANL systems >>> > for download ? 
>>> > >>> > >>> > Relevant error message with: >>> > >>> > ./configure PETSC_ARCH=standalone_packages --with-debugging=1 >>> > --with-pic=1 --with-mpi-dir=/usr/software/mpich-3.0.4 >>> > --with-blas-lapack-dir=/usr --download-hypre=yes >>> > --with-shared-libraries=1 --with-clanguage=C++ >>> > --download-suitesparse=yes >>> > .... >>> > =============================================================================== >>> > >>> > >>> > Trying to download >>> > http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz >>> > for SUITESPARSE >>> > >>> > =============================================================================== >>> > ******************************************************************************* >>> > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log >>> > for details): >>> > ------------------------------------------------------------------------------- >>> > file could not be opened successfully >>> > Downloaded package SuiteSparse from: >>> > http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz >>> > is not a tarball. >>> > [or installed python cannot process compressed files] >>> > * If you are behind a firewall - please fix your proxy and rerun ./configure >>> > For example at LANL you may need to set the environmental variable >>> > http_proxy (or HTTP_PROXY?) to http://proxyout.lanl.gov >>> > * Alternatively, you can download the above URL manually, to >>> > /yourselectedlocation/SuiteSparse-4.2.1.tar.gz >>> > and use the configure option: >>> > --download-suitesparse=/yourselectedlocation/SuiteSparse-4.2.1.tar.gz >>> > ******************************************************************************* >>> > >>> > Vijay >>> > >>> >>> >> From mfadams at lbl.gov Thu Mar 20 06:17:58 2014 From: mfadams at lbl.gov (Mark Adams) Date: Thu, 20 Mar 2014 14:17:58 +0300 Subject: [petsc-users] cray build error Message-ID: I get this on Hopper with PETSC maint. Any ideas? -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 130233 bytes Desc: not available URL: From jed at jedbrown.org Thu Mar 20 06:23:03 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 20 Mar 2014 12:23:03 +0100 Subject: [petsc-users] cray build error In-Reply-To: References: Message-ID: <87ior9jeqw.fsf@jedbrown.org> Mark Adams writes: > I get this on Hopper with PETSC maint. Any ideas? The environment is broken: wrapper compiler is adding -lpspline and -lezcdf. Perhaps you need to unload modules or load other modules? 
| Executing: cc -o /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest.o | sh: | Possible ERROR while running linker: ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded | /usr/bin/ld: cannot find -lpspline | /usr/bin/ld: cannot find -lezcdf | collect2: error: ld returned 1 exit status | output: ret = 256 | _LMFILES_=/opt/modulefiles/modules/3.2.6.6:/usr/syscom/nsg/modulefiles/nsg/1.2.0:/opt/modulefiles/modules/3.2.6.7:/opt/cray/xt-asyncpe/default/modulefiles/xtpe-network-gemini:/opt/modulefiles/PrgEnv-gnu/4.2.34:/opt/cray/modulefiles/atp/1.7.0:/opt/modulefiles/xe-sysroot/4.2.34:/opt/cray/gem/modulefiles/switch/1.0-1.0402.45840.2.63.gem:/opt/cray/gem/modulefiles/shared-root/1.0-1.0402.46893.3.17.gem:/opt/cray/gem/modulefiles/pdsh/2.26-1.0402.45278.1.1.gem:/opt/cray/gem/modulefiles/nodehealth/5.1-1.0402.45895.3.76.gem:/opt/cray/gem/modulefiles/lbcd/2.1-1.0402.45245.1.2.gem:/opt/cray/gem/modulefiles/hosts/1.0-1.0402.45251.1.86.gem:/opt/cray/gem/modulefiles/configuration/1.0-1.0402.45284.1.2.gem:/opt/cray/modulefiles/ccm/2.2.0-1.0402.46086.4.120:/opt/cray/gem/modulefiles/audit/1.0.0-1.0402.45273.1.86.gem:/opt/cray/gem/modulefiles/rca/1.0.0-2.0402.47290.7.1.gem:/opt/cray/gem/modulefiles/csa/3.0.0-1_2.0402.45268.1.90.gem:/opt/cray/gem/modulefiles/job/1.5.5-0.1_2.0402.45272.1.5.gem:/opt/cray/gem/modulefiles/xpmem/0.1-2.0402.45248.1.5.gem:/opt/cray/gem/modulefiles/gni-headers/2.1-1.0402.7541.1.5.gem:/opt/cray/gem/modulefiles/dmapp/4.0.1-1.0402.7784.4.1.gem:/opt/cray/gem/modulefiles/pmi/4.0.1-1.0000.9753.86.3.gem:/opt/cray/gem/modulefiles/ugni/5.0-1.0402.7551.1.10.gem:/opt/cray/gem/modulefiles/udreg/2.3.2-1.0402.7546.1.5.gem:/opt/cray/modulefiles/cray-libsci/12.1.01:/opt/modulefiles/gcc/4.8.1:/opt/modulefiles/xt-asyncpe/5.23:/opt/modulefiles/eswrap/1.0.20-1.010102.662.0:/opt/cray/xt-asyncpe/default/modulefiles/craype-mc12:/opt/cray/modulefiles/cray-shmem/6.0.1:/opt/cray/modulefiles/cray-mpich/6.0.1:/opt/modulefiles/torque/4.2.3.h5_notcpretry:/opt/modulefiles/moab/7.2.3-r19-b121-SUSE11:/usr/common/usg/Modules/modulefiles/valgrind/3.8.1:/usr/common/usg/Modules/modulefiles/cmake/2.8.10.1:/usr/common/graphics/Modules/modulefiles/visit/2.7.0:/opt/cray/modulefiles/papi/5.1.2:/usr/common/usg/Modules/modulefiles/ipm/2.00:/usr/common/usg/Modules/modulefiles/allineatools/4.2-34404:/usr/common/usg/Modules/modulefiles/adios/1.2.1:/usr/common/usg/Modules/modulefiles/pspline/nersc1.0:/opt/cray/modulefiles/cray-hdf5/1.8.11:/opt/cray/modulefiles/cray-netcdf/4.3.0:/usr/common/usg/Modules/modulefiles/matlab/R2012a:/usr/common/usg/Modules/modulefiles/altd/1.0:/usr/common/usg/Modules/modulefiles/usg-default-modules/1.0 -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From PRaeth at drc.com Thu Mar 20 08:04:25 2014 From: PRaeth at drc.com (Raeth . 
Peter) Date: Thu, 20 Mar 2014 13:04:25 +0000 Subject: [petsc-users] undefined reference to `PetscMalloc1' Message-ID: <539FFE8B854A464BA19148E33BC0DAA4A5AB2B8F@exmb02.drc.com> Am trying to build the code in a PETsc tutorial: http://www.mcs.anl.gov/petsc/petsc-dev/src/ksp/ksp/examples/tutorials/ex7.c.html Got past the compile error regarding KSPSetOperators by adding the fourth parameter: ierr = KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER);CHKERRQ(ierr); Added the include file the man page said was needed: #include Using the following build command: gcc gmres_ex7.c -o gmres_ex7 -I $PET_HOME/pkgs/valgrind-3.6.1/include -I $PET_HOME/.unsupported/petsc-3.4.4/Installed/include -I $PET_HOME/.unsupported/mpich-3.0.3/Installed/include -L $PET_HOME/.unsupported/petsc-3.4.4/Installed/lib -l petsc Getting the following link error: undefined reference to `PetscMalloc1' Many thanks for any insights the group has to offer. Best, Peter. ________________________________ This electronic message transmission and any attachments that accompany it contain information from DRC? (Dynamics Research Corporation) or its subsidiaries, or the intended recipient, which is privileged, proprietary, business confidential, or otherwise protected from disclosure and is the exclusive property of DRC and/or the intended recipient. The information in this email is solely intended for the use of the individual or entity that is the intended recipient. If you are not the intended recipient, any use, dissemination, distribution, retention, or copying of this communication, attachments, or substance is prohibited. If you have received this electronic transmission in error, please immediately reply to the author via email that you received the message by mistake and also promptly and permanently delete this message and all copies of this email and any attachments. We thank you for your assistance and apologize for any inconvenience. From bsmith at mcs.anl.gov Thu Mar 20 08:17:01 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 20 Mar 2014 08:17:01 -0500 Subject: [petsc-users] undefined reference to `PetscMalloc1' In-Reply-To: <539FFE8B854A464BA19148E33BC0DAA4A5AB2B8F@exmb02.drc.com> References: <539FFE8B854A464BA19148E33BC0DAA4A5AB2B8F@exmb02.drc.com> Message-ID: <3B5B58B5-DFB7-404F-9D7A-9063767DF62E@mcs.anl.gov> On Mar 20, 2014, at 8:04 AM, Raeth . Peter wrote: > > Am trying to build the code in a PETsc tutorial: > > http://www.mcs.anl.gov/petsc/petsc-dev/src/ksp/ksp/examples/tutorials/ex7.c.html The examples in this directory, denoted by petsc-dev, work with the development version of PETSc. Sometimes they will be different than the examples in the release of PETSc http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex7.c.html denoted by petsc-currant. They shouldn?t be mixed together. So either use the PETSc release and the examples in it or the development version of PETSc and the examples in it. 
Barry > > > Got past the compile error regarding KSPSetOperators by adding the fourth parameter: > > ierr = KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER);CHKERRQ(ierr); > > > Added the include file the man page said was needed: > > #include > > > Using the following build command: > > gcc gmres_ex7.c -o gmres_ex7 -I $PET_HOME/pkgs/valgrind-3.6.1/include -I $PET_HOME/.unsupported/petsc-3.4.4/Installed/include -I $PET_HOME/.unsupported/mpich-3.0.3/Installed/include -L $PET_HOME/.unsupported/petsc-3.4.4/Installed/lib -l petsc > > > Getting the following link error: > > undefined reference to `PetscMalloc1' > > > Many thanks for any insights the group has to offer. > > > Best, > > Peter. > > > ________________________________ > This electronic message transmission and any attachments that accompany it contain information from DRC? (Dynamics Research Corporation) or its subsidiaries, or the intended recipient, which is privileged, proprietary, business confidential, or otherwise protected from disclosure and is the exclusive property of DRC and/or the intended recipient. The information in this email is solely intended for the use of the individual or entity that is the intended recipient. If you are not the intended recipient, any use, dissemination, distribution, retention, or copying of this communication, attachments, or substance is prohibited. If you have received this electronic transmission in error, please immediately reply to the author via email that you received the message by mistake and also promptly and permanently delete this message and all copies of this email and any attachments. We thank you for your assistance and apologize for any inconvenience. From PRaeth at drc.com Thu Mar 20 08:29:09 2014 From: PRaeth at drc.com (Raeth . Peter) Date: Thu, 20 Mar 2014 13:29:09 +0000 Subject: [petsc-users] undefined reference to `PetscMalloc1' In-Reply-To: <3B5B58B5-DFB7-404F-9D7A-9063767DF62E@mcs.anl.gov> References: <539FFE8B854A464BA19148E33BC0DAA4A5AB2B8F@exmb02.drc.com>, <3B5B58B5-DFB7-404F-9D7A-9063767DF62E@mcs.anl.gov> Message-ID: <539FFE8B854A464BA19148E33BC0DAA4A5AB2BD3@exmb02.drc.com> Oh my..... Thanks Barry. Did not realize there would be a difference. Your input is most appreciated. Best, Peter. ________________________________ This electronic message transmission and any attachments that accompany it contain information from DRC? (Dynamics Research Corporation) or its subsidiaries, or the intended recipient, which is privileged, proprietary, business confidential, or otherwise protected from disclosure and is the exclusive property of DRC and/or the intended recipient. The information in this email is solely intended for the use of the individual or entity that is the intended recipient. If you are not the intended recipient, any use, dissemination, distribution, retention, or copying of this communication, attachments, or substance is prohibited. If you have received this electronic transmission in error, please immediately reply to the author via email that you received the message by mistake and also promptly and permanently delete this message and all copies of this email and any attachments. We thank you for your assistance and apologize for any inconvenience. 
From mlohry at gmail.com Thu Mar 20 10:38:22 2014 From: mlohry at gmail.com (Mark Lohry) Date: Thu, 20 Mar 2014 11:38:22 -0400 Subject: [petsc-users] Compute norm of a single component of DMDAVec struct Message-ID: <532B0B6E.2010500@gmail.com> I'm using a struct for a multi-component PDE as suggested in the manual, like so: typedef struct { PetscScalar u,v,omega,temperature; } Node; Node **f,**u; DMDAVecGetArray(DM da,Vec local,&u); DMDAVecGetArray(DM da,Vec global,&f); Calling VecNorm(...) on these vectors gives a norm for the entire vector. If one wants separate norms for each component of the struct, i.e. Norm(u) or Norm(v), what's the right approach? Would I need to manually compute norms locally and then call an MPI reduce function, or is this ability built-in to PETSc somewhere? -Mark Lohry From prbrune at gmail.com Thu Mar 20 11:10:41 2014 From: prbrune at gmail.com (Peter Brune) Date: Thu, 20 Mar 2014 11:10:41 -0500 Subject: [petsc-users] Compute norm of a single component of DMDAVec struct In-Reply-To: <532B0B6E.2010500@gmail.com> References: <532B0B6E.2010500@gmail.com> Message-ID: You could use http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCreateFieldDecomposition.html to get DMDAs representing the layout of the individual components and DMCreateGlobalVector() on those DMDAs to get properly laid-out individual field vectors. Then, you would use the ISes given by this function to build VecScatters using http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecScatterCreate.html from the monolithic vector to the component ones. Then apply the scatter and do whatever you want with those vectors (take norms, etc.) Hope this helps. - Peter On Thu, Mar 20, 2014 at 10:38 AM, Mark Lohry wrote: > I'm using a struct for a multi-component PDE as suggested in the manual, > like so: > > typedef struct { > PetscScalar u,v,omega,temperature; > } Node; > Node **f,**u; > DMDAVecGetArray(DM da,Vec local,&u); > DMDAVecGetArray(DM da,Vec global,&f); > > > Calling VecNorm(...) on these vectors gives a norm for the entire vector. > If one wants separate norms for each component of the struct, i.e. Norm(u) > or Norm(v), what's the right approach? Would I need to manually compute > norms locally and then call an MPI reduce function, or is this ability > built-in to PETSc somewhere? > > > -Mark Lohry > -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.kramer at imperial.ac.uk Thu Mar 20 11:39:09 2014 From: s.kramer at imperial.ac.uk (Stephan Kramer) Date: Thu, 20 Mar 2014 16:39:09 +0000 Subject: [petsc-users] gamg failure with petsc-dev Message-ID: <532B19AD.50105@imperial.ac.uk> Hi guys, We have been having some problems with GAMG on petsc-dev (master) for cases that worked fine on petsc 3.4. We're solving a Stokes equation (just the velocity block) for a simple convection in a square box (isoviscous). The problem only occurs if we supply a near null space (via MatSetNearNullSpace) where we supply the usual (1,0) (0,1) and (-y,x) (near) null space vectors. If we supply those, the smoother complains that the diagonal of the A matrix at the first coarsened level contains a zero. If I dump out the prolongator from the finest to the first coarsened level it indeed contains a zero column at that same index. We're pretty confident that the fine level A matrix is correct (it solves fine with LU). 
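The near null space described above, two translations plus one rotation, is exactly what MatNullSpaceCreateRigidBody() can build from a vector of nodal coordinates. A minimal sketch, assuming a 2D velocity matrix A (block size normally set before preallocation and assembly) and a coordinate vector coords with block size 2; both names are hypothetical:

#include <petscmat.h>

/* Sketch: attach a rigid-body near null space to a 2D velocity operator.
   "A" and "coords" are hypothetical names for the assembled velocity
   matrix and the nodal coordinate vector (block size 2). */
PetscErrorCode SetVelocityNearNullSpace(Mat A, Vec coords)
{
  PetscErrorCode ierr;
  MatNullSpace   nearnull;

  ierr = MatSetBlockSize(A,2);CHKERRQ(ierr);                          /* 2 velocity components */
  ierr = MatNullSpaceCreateRigidBody(coords,&nearnull);CHKERRQ(ierr); /* (1,0), (0,1), (-y,x) */
  ierr = MatSetNearNullSpace(A,nearnull);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);
  return 0;
}

Whether a block size is attached to the matrix also matters to GAMG's aggregation; Jed asks about exactly that further down this digest.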
I've briefly spoken to Matt about this and he suggested trying to run with -pc_gamg_agg_nsmooths 0 (as the default changed from 3.4 -> dev) but that didn't make any difference, the dumped out prolongator still has zero columns, and it crashes in the same way. Do you have any further suggestions what to try and how to further debug this? Cheers Stephan From bsmith at mcs.anl.gov Thu Mar 20 12:03:01 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 20 Mar 2014 12:03:01 -0500 Subject: [petsc-users] Compute norm of a single component of DMDAVec struct In-Reply-To: <532B0B6E.2010500@gmail.com> References: <532B0B6E.2010500@gmail.com> Message-ID: You can use VecStrideNorm() if you wish only a particular one or VecStrideNormAll() if you wish all the norms. Barry On Mar 20, 2014, at 10:38 AM, Mark Lohry wrote: > I'm using a struct for a multi-component PDE as suggested in the manual, like so: > > typedef struct { > PetscScalar u,v,omega,temperature; > } Node; > Node **f,**u; > DMDAVecGetArray(DM da,Vec local,&u); > DMDAVecGetArray(DM da,Vec global,&f); > > > Calling VecNorm(...) on these vectors gives a norm for the entire vector. If one wants separate norms for each component of the struct, i.e. Norm(u) or Norm(v), what's the right approach? Would I need to manually compute norms locally and then call an MPI reduce function, or is this ability built-in to PETSc somewhere? > > > -Mark Lohry From fd.kong at siat.ac.cn Thu Mar 20 12:29:14 2014 From: fd.kong at siat.ac.cn (Fande Kong) Date: Thu, 20 Mar 2014 11:29:14 -0600 Subject: [petsc-users] what kind of stuffs I forget to free? In-Reply-To: <8738idlrsq.fsf@jedbrown.org> References: <8738idlrsq.fsf@jedbrown.org> Message-ID: Thanks, Jed, I tried valgrind, but it could not catch any noises. I used these options: mpirun -n 8 valgrind --tool=memcheck -q --num-callers=20 --log-file=valgrind.log.%p ./nonlinearElasticity3d -malloc off . I think I forget freeing something or petsc does. On Wed, Mar 19, 2014 at 4:58 PM, Jed Brown wrote: > Fande Kong writes: > > > I run my code with options: -malloc_debug -malloc_dump, then the > following > > messages are produced, but nothing happened in my code. I want to know > what > > kind of objects I forgot freeing. Any suggestions? > > Huh, those should show more stack. Before trying to think hard, can you > run with valgrind? 
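The cause identified further down in this digest is that the nonblocking MPI calls in the test program are never completed: every MPI_Isend/MPI_Irecv must eventually be matched by an MPI_Wait or MPI_Waitall, otherwise MPI keeps a reference to the duplicated communicator and the PETSc destructors never run, which is what -malloc_dump then reports. A minimal sketch of the completed pattern (the names and the one-integer exchange are only illustrative, following the test program attached later in the thread):

#include <petscsys.h>

/* Sketch: complete every nonblocking MPI call before PetscFinalize().
   Buffer, tag, and request names are hypothetical. */
PetscErrorCode ExchangeSketch(MPI_Comm comm, PetscMPIInt rank, PetscMPIInt size)
{
  PetscErrorCode ierr;
  PetscMPIInt    i, tag = 123;
  PetscInt       send = 0, recv = 10;
  MPI_Request    req;

  if (rank) {                                   /* receivers */
    ierr = MPI_Irecv(&recv,1,MPIU_INT,0,tag,comm,&req);CHKERRQ(ierr);
    ierr = MPI_Wait(&req,MPI_STATUS_IGNORE);CHKERRQ(ierr);
  } else {                                      /* rank 0 sends to everyone else */
    for (i=1; i<size; i++) {
      ierr = MPI_Isend(&send,1,MPIU_INT,i,tag,comm,&req);CHKERRQ(ierr);
      ierr = MPI_Wait(&req,MPI_STATUS_IGNORE);CHKERRQ(ierr); /* the sender must complete its requests too */
    }
  }
  return 0;
}

Collecting the send requests in an array and calling MPI_Waitall once after the loop is the more usual idiom; the point is only that no request is left pending, otherwise the PetscCommDuplicate/PetscThreadComm allocations quoted below are never freed.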
> > > [0]Total space allocated 6864 bytes > > [ 0]256 bytes PetscSplitReductionCreate() line 91 in > > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > > [ 0]256 bytes PetscSplitReductionCreate() line 88 in > > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > > [ 0]512 bytes PetscSplitReductionCreate() line 87 in > > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > > [ 0]512 bytes PetscSplitReductionCreate() line 86 in > > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > > [ 0]80 bytes PetscSplitReductionCreate() line 81 in > > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c > > [ 0]16 bytes PetscThreadCommReductionCreate() line 448 in > > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > > [ 0]512 bytes PetscThreadCommReductionCreate() line 440 in > > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > > [ 0]256 bytes PetscThreadCommReductionCreate() line 436 in > > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > > [ 0]1280 bytes PetscThreadCommReductionCreate() line 435 in > > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > > [ 0]32 bytes PetscThreadCommReductionCreate() line 432 in > > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c > > [ 0]128 bytes PetscThreadCommWorldInitialize() line 1242 in > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > > [ 0]2560 bytes PetscThreadCommWorldInitialize() line 1241 in > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > > [ 0]32 bytes PetscThreadCommWorldInitialize() line 1233 in > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > > [ 0]16 bytes PetscThreadCommSetAffinities() line 424 in > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > > [ 0]48 bytes PetscThreadCommCreate() line 150 in > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > > [ 0]336 bytes PetscThreadCommCreate() line 146 in > > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c > > [ 0]32 bytes PetscCommDuplicate() line 151 in > > /home/fdkong/math/petsc-3.4.1/src/sys/objects/tagm.c > > > > > > Fande, > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bhuntington2716 at gmail.com Thu Mar 20 14:32:03 2014 From: bhuntington2716 at gmail.com (Ben Huntington) Date: Thu, 20 Mar 2014 14:32:03 -0500 Subject: [petsc-users] PETSc and smoothed particle hydrodynamics Message-ID: Hi, I was just wondering if anyone has ever used (or knows of someone using) PETSc for a smoothed particle hydrodynamics code? Thanks, Ben -------------- next part -------------- An HTML attachment was scrubbed... URL: From dharmareddy84 at gmail.com Thu Mar 20 15:52:53 2014 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Thu, 20 Mar 2014 15:52:53 -0500 Subject: [petsc-users] SNESSetConvergenceTest In-Reply-To: <87a9detujh.fsf@jedbrown.org> References: <5FF817E1-8806-4965-8F76-B0504EF1F311@mcs.anl.gov> <82D00B91-6660-47AA-AC5B-F77CD33A2F99@mcs.anl.gov> <8761o2vp52.fsf@jedbrown.org> <87a9detujh.fsf@jedbrown.org> Message-ID: Hello Jed, Were you able to look into this issue ? Thanks Reddy On Tue, Feb 25, 2014 at 9:36 PM, Jed Brown wrote: > Dharmendar Reddy writes: >> Hello Jed, >> Sorry for the compilation issues, I do not have access >> to a machine with petsc and gfortran > 4.7 to fix the compile issues. 
>> Can you make the follwing changes to the code in precision_m module. >> REAL64 and REAL32 should be available via iso_fortran_env module, I >> do not know why it is complaining. >> >> module precision_m >> implicit none >> integer,parameter :: DP = kind(1.0D0) >> integer,parameter :: SP = kind(1.0E0) >> integer,parameter :: WP=DP >> integer,parameter :: MSL=100 ! MAX_STR_LENGTH >> end module precision_m > > The f2003 dialect is not supported by mpif.h in my build of mpich-3.1. > I'll try spinning up a new build of MPICH with -std=f2003 (hopefully its > configure can sort this out). Did I mention Fortran is not my favorite > language? > > > /opt/mpich/include/mpif.h:16.18: > Included at /home/jed/petsc/include/finclude/petscsys.h:11: > Included at /home/jed/petsc/include/finclude/petsc.h:7: > Included at /home/jed/petsc/include/finclude/petsc.h90:5: > Included at Solver.F90:160: > > CHARACTER*1 MPI_ARGVS_NULL(1,1) > 1 > Warning: Obsolescent feature: Old-style character length at (1) > /opt/mpich/include/mpif.h:17.18: > Included at /home/jed/petsc/include/finclude/petscsys.h:11: > Included at /home/jed/petsc/include/finclude/petsc.h:7: > Included at /home/jed/petsc/include/finclude/petsc.h90:5: > Included at Solver.F90:160: > > CHARACTER*1 MPI_ARGV_NULL(1) > 1 > Warning: Obsolescent feature: Old-style character length at (1) > /opt/mpich/include/mpif.h:528.16: > Included at /home/jed/petsc/include/finclude/petscsys.h:11: > Included at /home/jed/petsc/include/finclude/petsc.h:7: > Included at /home/jed/petsc/include/finclude/petsc.h90:5: > Included at Solver.F90:160: > > integer*8 MPI_DISPLACEMENT_CURRENT > 1 > Error: GNU Extension: Nonstandard type declaration INTEGER*8 at (1) > /opt/mpich/include/mpif.h:546.13: > Included at /home/jed/petsc/include/finclude/petscsys.h:11: > Included at /home/jed/petsc/include/finclude/petsc.h:7: > Included at /home/jed/petsc/include/finclude/petsc.h90:5: > Included at Solver.F90:160: > > REAL*8 MPI_WTIME, MPI_WTICK > 1 > Error: GNU Extension: Nonstandard type declaration REAL*8 at (1) From dave.mayhem23 at gmail.com Thu Mar 20 16:14:14 2014 From: dave.mayhem23 at gmail.com (Dave May) Date: Thu, 20 Mar 2014 22:14:14 +0100 Subject: [petsc-users] PETSc and smoothed particle hydrodynamics In-Reply-To: References: Message-ID: I've not seen anyone doing this before. For representing particle data in parallel, petsc objects (eg vecs) are not really suitable as they don't support dynamic resizing of the local or global size. Most SPH calculations are explicit. If you're doing incompressible flow and use a projection scheme which enforces the in compressibility via a variable coefficient poisson solve, petsc would be useful for that solve. Cheers Dave On Thursday, 20 March 2014, Ben Huntington wrote: > Hi, > > I was just wondering if anyone has ever used (or knows of someone using) > PETSc for a smoothed particle hydrodynamics code? > > Thanks, > Ben > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From lb2653 at columbia.edu Thu Mar 20 18:20:12 2014 From: lb2653 at columbia.edu (Luc Berger-Vergiat) Date: Thu, 20 Mar 2014 19:20:12 -0400 Subject: [petsc-users] 2 level schur Message-ID: <532B77AC.1060806@columbi.edu> Hi all, I am solving a four field problem using two Schur complements. 
Here are the arguments that I usually pass to PETSc to do it: -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_schur_precondition selfp -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type fieldsplit -fieldsplit_0_pc_fieldsplit_type schur -fieldsplit_0_pc_fieldsplit_schur_factorization_type full -fieldsplit_0_pc_fieldsplit_schur_precondition selfp -fieldsplit_0_fieldsplit_Field_2_fields 2 -fieldsplit_0_fieldsplit_Field_3_fields 3 -fieldsplit_0_fieldsplit_Field_2_ksp_type preonly -fieldsplit_0_fieldsplit_Field_2_pc_type ilu -fieldsplit_0_fieldsplit_Field_3_ksp_type preonly -fieldsplit_0_fieldsplit_Field_3_pc_type jacobi -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu -malloc_log mlog -log_summary time.log One issue with this is that when I change -fieldsplit_0_fieldsplit_Field_2_fields 2 to -fieldsplit_0_fieldsplit_Field_2_fields 3 it is ineffective, as if PETSc automatically assign IS 2 to Field 2 even though it is not what I want. Is there a way to pass the arguments correctly so that PETSc goes about switching the IS set of -fieldsplit_0_fieldsplit_Field_2 and -fieldsplit_0_fieldsplit_Field_3? This is crucial to me since I am using the selfp option and the matrix associated to IS 3 is diagonal. By assigning the fields correctly I can get an exact Schur preconditioner and hence very fast convergence. Right now my convergence is not optimal because of this. Thanks! Best, Luc -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Mar 20 20:01:23 2014 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 20 Mar 2014 20:01:23 -0500 Subject: [petsc-users] 2 level schur In-Reply-To: <532B77AC.1060806@columbi.edu> References: <532B77AC.1060806@columbi.edu> Message-ID: On Thu, Mar 20, 2014 at 6:20 PM, Luc Berger-Vergiat wrote: > Hi all, > I am solving a four field problem using two Schur complements. Here are > the arguments that I usually pass to PETSc to do it: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_schur_precondition selfp -pc_fieldsplit_0_fields 2,3 > -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type preonly > -fieldsplit_0_pc_type fieldsplit -fieldsplit_0_pc_fieldsplit_type schur > -fieldsplit_0_pc_fieldsplit_schur_factorization_type full > -fieldsplit_0_pc_fieldsplit_schur_precondition selfp > -fieldsplit_0_fieldsplit_Field_2_fields 2 > -fieldsplit_0_fieldsplit_Field_3_fields 3 > -fieldsplit_0_fieldsplit_Field_2_ksp_type preonly > -fieldsplit_0_fieldsplit_Field_2_pc_type ilu > -fieldsplit_0_fieldsplit_Field_3_ksp_type preonly > -fieldsplit_0_fieldsplit_Field_3_pc_type jacobi -fieldsplit_1_ksp_type > preonly -fieldsplit_1_pc_type lu -malloc_log mlog -log_summary time.log > > One issue with this is that when I change > -fieldsplit_0_fieldsplit_Field_2_fields 2 to > -fieldsplit_0_fieldsplit_Field_2_fields 3 it is ineffective, as if PETSc > automatically assign IS 2 to Field 2 even though it is not what I want. > Is there a way to pass the arguments correctly so that PETSc goes about > switching the IS set of -fieldsplit_0_fieldsplit_Field_2 and > -fieldsplit_0_fieldsplit_Field_3? > This is crucial to me since I am using the selfp option and the matrix > associated to IS 3 is diagonal. By assigning the fields correctly I can get > an exact Schur preconditioner and hence very fast convergence. 
Right now my > convergence is not optimal because of this. > I believe the inner Schur field statements should not be using the original numbering, but the inner numbering, after they have been reordered. Matt > Thanks! > > Best, > Luc > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From fd.kong at siat.ac.cn Thu Mar 20 21:18:19 2014 From: fd.kong at siat.ac.cn (Fande Kong) Date: Thu, 20 Mar 2014 20:18:19 -0600 Subject: [petsc-users] what kind of stuffs I forget to free? In-Reply-To: References: <8738idlrsq.fsf@jedbrown.org> Message-ID: Hi Jed, I finally figured out why. I think there is a bug in the PETSc. I write a very simple program that can reproduce the same results. Please check the attachment. Type " make all -f makefilet " to run this code. Could you know how to fix this issue? Thanks, On Thu, Mar 20, 2014 at 11:29 AM, Fande Kong wrote: > Thanks, Jed, > > I tried valgrind, but it could not catch any noises. I used these > options: mpirun -n 8 valgrind --tool=memcheck -q --num-callers=20 > --log-file=valgrind.log.%p ./nonlinearElasticity3d -malloc off . > > I think I forget freeing something or petsc does. > > > > > On Wed, Mar 19, 2014 at 4:58 PM, Jed Brown wrote: > >> Fande Kong writes: >> >> > I run my code with options: -malloc_debug -malloc_dump, then the >> following >> > messages are produced, but nothing happened in my code. I want to know >> what >> > kind of objects I forgot freeing. Any suggestions? >> >> Huh, those should show more stack. Before trying to think hard, can you >> run with valgrind? >> >> > [0]Total space allocated 6864 bytes >> > [ 0]256 bytes PetscSplitReductionCreate() line 91 in >> > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c >> > [ 0]256 bytes PetscSplitReductionCreate() line 88 in >> > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c >> > [ 0]512 bytes PetscSplitReductionCreate() line 87 in >> > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c >> > [ 0]512 bytes PetscSplitReductionCreate() line 86 in >> > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c >> > [ 0]80 bytes PetscSplitReductionCreate() line 81 in >> > /home/fdkong/math/petsc-3.4.1/src/vec/vec/utils/comb.c >> > [ 0]16 bytes PetscThreadCommReductionCreate() line 448 in >> > >> /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c >> > [ 0]512 bytes PetscThreadCommReductionCreate() line 440 in >> > >> /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c >> > [ 0]256 bytes PetscThreadCommReductionCreate() line 436 in >> > >> /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c >> > [ 0]1280 bytes PetscThreadCommReductionCreate() line 435 in >> > >> /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c >> > [ 0]32 bytes PetscThreadCommReductionCreate() line 432 in >> > >> /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcommred.c >> > [ 0]128 bytes PetscThreadCommWorldInitialize() line 1242 in >> > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c >> > [ 0]2560 bytes PetscThreadCommWorldInitialize() line 1241 in >> > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c >> > [ 0]32 bytes PetscThreadCommWorldInitialize() line 1233 in >> > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c >> > [ 
0]16 bytes PetscThreadCommSetAffinities() line 424 in
>> > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c
>> > [ 0]48 bytes PetscThreadCommCreate() line 150 in
>> > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c
>> > [ 0]336 bytes PetscThreadCommCreate() line 146 in
>> > /home/fdkong/math/petsc-3.4.1/src/sys/threadcomm/interface/threadcomm.c
>> > [ 0]32 bytes PetscCommDuplicate() line 151 in
>> > /home/fdkong/math/petsc-3.4.1/src/sys/objects/tagm.c
>> >
>> >
>> > Fande,
>> > >
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: commproblem.cpp
Type: text/x-c++src
Size: 1579 bytes
Desc: not available
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: makefilet
Type: application/octet-stream
Size: 499 bytes
Desc: not available
URL: 

From luchao at mail.iggcas.ac.cn Thu Mar 20 21:36:20 2014
From: luchao at mail.iggcas.ac.cn (=?GBK?B?wsCzrA==?=)
Date: Fri, 21 Mar 2014 10:36:20 +0800 (GMT+08:00)
Subject: [petsc-users] Preallocation Memory of Finite Element Method's Sparse Matrices
Message-ID: <1d56169.3f9ca.144e27f0c58.Coremail.luchao@mail.iggcas.ac.cn>

Yours faithfully:

The program src/ksp/ksp/examples/tutorials/ex3.c.html is about bilinear elements on the unit square for the Laplacian. After preallocation using "ierr = MatMPIAIJSetPreallocation(A,9,NULL,5,NULL);CHKERRQ(ierr); /* More than necessary */", the results of the commands "mpiexec -n 2 ./ex3" and "mpiexec -n 2 ./ex3" are "Norm of error 2.22327e-06 Iterations 6" and "Norm of error 3.12849e-07 Iterations 8". Both results are good!

However, if I use "mpiexec -n 4 ./ex3" or 5, 6, 7... processes, the error "[2]PETSC ERROR: New nonzero at (4,29) (here is for process 4, other positions for different processes) caused a malloc!" appears. For me this error is unbelievable, because first, the preallocation is more than necessary, so how can the new malloc appear? Second, the point with global number 4 originally has no neighbor vertices whose global number is 29! This error has tortured me for a long time. It seems meaningless, yet my recent 3d finite element method cannot be calculated with more processes owing to the new nonzero malloc, and this is why I want to use 4 or more processes to compute ex3.c.

Thank you for all previous assistance, and hope you have a good life!

Yours sincerely,

LV CHAO

2014/3/21
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jed at jedbrown.org Thu Mar 20 23:24:36 2014
From: jed at jedbrown.org (Jed Brown)
Date: Thu, 20 Mar 2014 21:24:36 -0700
Subject: [petsc-users] gamg failure with petsc-dev
In-Reply-To: <532B19AD.50105@imperial.ac.uk>
References: <532B19AD.50105@imperial.ac.uk>
Message-ID: <87lhw4yy9n.fsf@jedbrown.org>

Stephan Kramer writes:

> We have been having some problems with GAMG on petsc-dev (master) for
> cases that worked fine on petsc 3.4. We're solving a Stokes equation
> (just the velocity block) for a simple convection in a square box
> (isoviscous). The problem only occurs if we supply a near null space
> (via MatSetNearNullSpace) where we supply the usual (1,0) (0,1) and
> (-y,x) (near) null space vectors. If we supply those, the smoother
> complains that the diagonal of the A matrix at the first coarsened
> level contains a zero. If I dump out the prolongator from the finest
> to the first coarsened level it indeed contains a zero column at that
> same index.
We're pretty confident that the fine level A matrix is > correct (it solves fine with LU). I've briefly spoken to Matt about > this and he suggested trying to run with -pc_gamg_agg_nsmooths 0 (as > the default changed from 3.4 -> dev) but that didn't make any > difference, the dumped out prolongator still has zero columns, and it > crashes in the same way. Do you have any further suggestions what to > try and how to further debug this? Do you set the block size? Can you reproduce by modifying src/ksp/ksp/examples/tutorials/ex49.c (plane strain elasticity)? -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From jed at jedbrown.org Thu Mar 20 23:39:09 2014 From: jed at jedbrown.org (Jed Brown) Date: Thu, 20 Mar 2014 21:39:09 -0700 Subject: [petsc-users] what kind of stuffs I forget to free? In-Reply-To: References: <8738idlrsq.fsf@jedbrown.org> Message-ID: <87d2hgyxle.fsf@jedbrown.org> Fande Kong writes: > /* > * commproblem.cpp > * > * Created on: Mar 20, 2014 > * Author: fdkong > */ > > #include > #include > #include > > static char help[] = "Need helps.\n\n"; > > #undef __FUNCT__ > #define __FUNCT__ "main" > int main(int argc,char **argv) > { > Vec vec; > MPI_Request request; > MPI_Status status; > PetscMPIInt tag =123; > MPI_Comm comm; > PetscMPIInt rank, size; > PetscInt recv =10; > PetscInt send = 0; > DM dm; > PetscErrorCode ierr; > > PetscInitialize(&argc,&argv,(char *)0,help); > comm = PETSC_COMM_WORLD; > ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr); > ierr = MPI_Comm_size(comm, &size);CHKERRQ(ierr); > // create an object > ierr = VecCreate(comm, &vec);CHKERRQ(ierr); > // take a comm from that object > comm = ((PetscObject) vec)->comm; > // if we set comm back to PETSC_COMM_WORLD, the code should work fine > //comm = PETSC_COMM_WORLD; > // receive messages from rank 0 > if( rank!=0) > { > ierr = MPI_Irecv(&recv, 1, MPIU_INT, 0,tag, comm, &request);CHKERRQ(ierr); > } You have to wait on this request. MPI_Requests are not automatically collected at some point where you can prove that the operation has finished. The Wait should return immediately, but you have to call it. Otherwise MPI holds a reference to the communicator and will not call the destructors. > if(!rank) > { > //send messages to all others > for(PetscMPIInt i =1; i { > ierr = MPI_Isend(&send, 1, MPIU_INT, i, tag, comm, &request);CHKERRQ(ierr); > } > } > // rank 0 doest not need to wait, it could continue to do other things. It *does* need to wait eventually. > if(rank !=0) > { > ierr = MPI_Waitall(1,&request, &status);CHKERRQ(ierr); > } > ierr = VecDestroy(&vec);CHKERRQ(ierr); > ierr = PetscFinalize();CHKERRQ(ierr); > } -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From fd.kong at siat.ac.cn Fri Mar 21 00:44:53 2014 From: fd.kong at siat.ac.cn (Fande Kong) Date: Thu, 20 Mar 2014 23:44:53 -0600 Subject: [petsc-users] what kind of stuffs I forget to free? 
In-Reply-To: <87d2hgyxle.fsf@jedbrown.org> References: <8738idlrsq.fsf@jedbrown.org> <87d2hgyxle.fsf@jedbrown.org> Message-ID: Jed, Thanks, On Thu, Mar 20, 2014 at 10:39 PM, Jed Brown wrote: > Fande Kong writes: > > > /* > > * commproblem.cpp > > * > > * Created on: Mar 20, 2014 > > * Author: fdkong > > */ > > > > #include > > #include > > #include > > > > static char help[] = "Need helps.\n\n"; > > > > #undef __FUNCT__ > > #define __FUNCT__ "main" > > int main(int argc,char **argv) > > { > > Vec vec; > > MPI_Request request; > > MPI_Status status; > > PetscMPIInt tag =123; > > MPI_Comm comm; > > PetscMPIInt rank, size; > > PetscInt recv =10; > > PetscInt send = 0; > > DM dm; > > PetscErrorCode ierr; > > > > PetscInitialize(&argc,&argv,(char *)0,help); > > comm = PETSC_COMM_WORLD; > > ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr); > > ierr = MPI_Comm_size(comm, &size);CHKERRQ(ierr); > > // create an object > > ierr = VecCreate(comm, &vec);CHKERRQ(ierr); > > // take a comm from that object > > comm = ((PetscObject) vec)->comm; > > // if we set comm back to PETSC_COMM_WORLD, the code should work fine > > //comm = PETSC_COMM_WORLD; > > // receive messages from rank 0 > > if( rank!=0) > > { > > ierr = MPI_Irecv(&recv, 1, MPIU_INT, 0,tag, comm, > &request);CHKERRQ(ierr); > > } > > You have to wait on this request. MPI_Requests are not automatically > collected at some point where you can prove that the operation has > finished. The Wait should return immediately, but you have to call it. > Otherwise MPI holds a reference to the communicator and will not call > the destructors. > > > if(!rank) > > { > > //send messages to all others > > for(PetscMPIInt i =1; i > { > > ierr = MPI_Isend(&send, 1, MPIU_INT, i, tag, comm, > &request);CHKERRQ(ierr); > > } > > } > > // rank 0 doest not need to wait, it could continue to do other > things. > > It *does* need to wait eventually. > > > if(rank !=0) > > { > > ierr = MPI_Waitall(1,&request, &status);CHKERRQ(ierr); > > } > > ierr = VecDestroy(&vec);CHKERRQ(ierr); > > ierr = PetscFinalize();CHKERRQ(ierr); > > } > -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.kramer at imperial.ac.uk Fri Mar 21 06:34:46 2014 From: s.kramer at imperial.ac.uk (Stephan Kramer) Date: Fri, 21 Mar 2014 11:34:46 +0000 Subject: [petsc-users] gamg failure with petsc-dev In-Reply-To: <87lhw4yy9n.fsf@jedbrown.org> References: <532B19AD.50105@imperial.ac.uk> <87lhw4yy9n.fsf@jedbrown.org> Message-ID: <532C23D6.7000400@imperial.ac.uk> On 21/03/14 04:24, Jed Brown wrote: > Stephan Kramer writes: > >> We have been having some problems with GAMG on petsc-dev (master) for >> cases that worked fine on petsc 3.4. We're solving a Stokes equation >> (just the velocity block) for a simple convection in a square box >> (isoviscous). The problem only occurs if we supply a near null space >> (via MatSetNearNullSpace) where we supply the usual (1,0) (0,1) and >> (-y,x) (near) null space vectors. If we supply those, the smoother >> complains that the diagonal of the A matrix at the first coarsened >> level contains a zero. If I dump out the prolongator from the finest >> to the first coarsened level it indeed contains a zero column at that >> same index. We're pretty confident that the fine level A matrix is >> correct (it solves fine with LU). 
I've briefly spoken to Matt about >> this and he suggested trying to run with -pc_gamg_agg_nsmooths 0 (as >> the default changed from 3.4 -> dev) but that didn't make any >> difference, the dumped out prolongator still has zero columns, and it >> crashes in the same way. Do you have any further suggestions what to >> try and how to further debug this? > > Do you set the block size? Can you reproduce by modifying > src/ksp/ksp/examples/tutorials/ex49.c (plane strain elasticity)? > I don't set a block size, no. About ex49: Ah great, with master (just updated now) I get: [skramer at stommel]{/data/stephan/git/petsc/src/ksp/ksp/examples/tutorials}$ ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Arguments are incompatible [0]PETSC ERROR: Zero diagonal on row 1 [0]PETSC ERROR: See http://http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-3671-gbb161d1 GIT Date: 2014-03-21 01:14:15 +0000 [0]PETSC ERROR: ./ex49 on a linux-gnu-c-opt named stommel by skramer Fri Mar 21 11:25:55 2014 [0]PETSC ERROR: Configure options --download-fblaslapack=1 --download-blacs=1 --download-scalapack=1 --download-ptscotch=1 --download-mumps=1 --download-hypre=1 --download-suitesparse=1 --download-ml=1 [0]PETSC ERROR: #1 MatInvertDiagonal_SeqAIJ() line 1728 in /data/stephan/git/petsc/src/mat/impls/aij/seq/aij.c [0]PETSC ERROR: #2 MatSOR_SeqAIJ() line 1760 in /data/stephan/git/petsc/src/mat/impls/aij/seq/aij.c [0]PETSC ERROR: #3 MatSOR() line 3734 in /data/stephan/git/petsc/src/mat/interface/matrix.c [0]PETSC ERROR: #4 PCApply_SOR() line 35 in /data/stephan/git/petsc/src/ksp/pc/impls/sor/sor.c [0]PETSC ERROR: #5 PCApply() line 440 in /data/stephan/git/petsc/src/ksp/pc/interface/precon.c [0]PETSC ERROR: #6 KSP_PCApply() line 227 in /data/stephan/git/petsc/include/petsc-private/kspimpl.h [0]PETSC ERROR: #7 KSPSolve_Chebyshev() line 456 in /data/stephan/git/petsc/src/ksp/ksp/impls/cheby/cheby.c [0]PETSC ERROR: #8 KSPSolve() line 458 in /data/stephan/git/petsc/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #9 PCMGMCycle_Private() line 19 in /data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: #10 PCMGMCycle_Private() line 48 in /data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: #11 PCApply_MG() line 330 in /data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: #12 PCApply() line 440 in /data/stephan/git/petsc/src/ksp/pc/interface/precon.c [0]PETSC ERROR: #13 KSP_PCApply() line 227 in /data/stephan/git/petsc/include/petsc-private/kspimpl.h [0]PETSC ERROR: #14 KSPInitialResidual() line 63 in /data/stephan/git/petsc/src/ksp/ksp/interface/itres.c [0]PETSC ERROR: #15 KSPSolve_GMRES() line 234 in /data/stephan/git/petsc/src/ksp/ksp/impls/gmres/gmres.c [0]PETSC ERROR: #16 KSPSolve() line 458 in /data/stephan/git/petsc/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #17 solve_elasticity_2d() line 1053 in /data/stephan/git/petsc/src/ksp/ksp/examples/tutorials/ex49.c [0]PETSC ERROR: #18 main() line 1104 in /data/stephan/git/petsc/src/ksp/ksp/examples/tutorials/ex49.c [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- Which is the same error we were getting on our problem Cheers Stephan From luchao at mail.iggcas.ac.cn Fri Mar 21 09:11:53 2014 From: luchao at mail.iggcas.ac.cn (=?GBK?B?wsCzrA==?=) Date: Fri, 21 Mar 
2014 22:11:53 +0800 (GMT+08:00) Subject: [petsc-users] Preallocation Memory of Finite Element Method's Sparse Matrices Message-ID: <1d15832.41d6d.144e4fbd571.Coremail.luchao@mail.iggcas.ac.cn> Your faithfully: Last e-mail has some literal error, sorry~ program src/ksp/ksp/examples/tutorials/ex3.c.html is about Bilinear elements on the unit square for Laplacian. After preallocation using "ierr = MatMPIAIJSetPreallocation(A,9,NULL,5,NULL);CHKERRQ(ierr); /* More than necessary */", Results of commands of "mpiexec -n 2 ./ex3" and "mpiexec -n 3 ./ex3" are "Norm of error 2.22327e-06 Iterations 6" and "Norm of error 3.12849e-07 Iterations 8". Both results are good! However, if I use "mpiexec -n 4 ./ex3" or 5,6,7...precesses, error "[2]PETSC ERROR: New nonzero at (4,29) (here is for process 4, other positions for different processes) caused a malloc!" appear!. For me, this error is unbelievable, because first, the preallocation is more than necessary,how can the new malloc appear? Second, the global number 4 point originally has no neighbor vertices whose global number is 29! This error has tortured me for a long time. This error seems meaningless, however, my recent 3d finite element method cannot be caculated by more processes owing to the new nonzero malloc error! And this is why I want to use 4 or much more processes to compute ex3.c. Thank you for all previous assistence and hope you have a good life! your sincerely LV CHAO 2014/3/21 -------------- next part -------------- An HTML attachment was scrubbed... URL: From lb2653 at columbia.edu Fri Mar 21 09:37:55 2014 From: lb2653 at columbia.edu (Luc Berger-Vergiat) Date: Fri, 21 Mar 2014 10:37:55 -0400 Subject: [petsc-users] 2 level schur In-Reply-To: References: <532B77AC.1060806@columbi.edu> Message-ID: <532C4EC3.4090903@columbi.edu> Is there a way to now what the new numbering is? I am assuming that in y example since there are two fields only the numbers associated with them are 0 and 1 hence I tried: -fieldsplit_0_fieldsplit_Field_2_fields 1 -fieldsplit_0_fieldsplit_Field_3_fields 0 which did not work. As mentioned earlier, the following does not work either: -fieldsplit_0_fieldsplit_Field_2_fields 3 -fieldsplit_0_fieldsplit_Field_3_fields 2 and without too much expectation I also passed the following -fieldsplit_0_fieldsplit_Field_2_fields Field_3 -fieldsplit_0_fieldsplit_Field_3_fields Field_2 to no avail. By the way I attached the output from -ksp_view in case I might be doing something wrong? Best, Luc On 03/20/2014 09:01 PM, Matthew Knepley wrote: > On Thu, Mar 20, 2014 at 6:20 PM, Luc Berger-Vergiat > > wrote: > > Hi all, > I am solving a four field problem using two Schur complements. 
> Here are the arguments that I usually pass to PETSc to do it: > > -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_schur_precondition selfp > -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 > -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type > fieldsplit -fieldsplit_0_pc_fieldsplit_type schur > -fieldsplit_0_pc_fieldsplit_schur_factorization_type full > -fieldsplit_0_pc_fieldsplit_schur_precondition selfp > -fieldsplit_0_fieldsplit_Field_2_fields 2 > -fieldsplit_0_fieldsplit_Field_3_fields 3 > -fieldsplit_0_fieldsplit_Field_2_ksp_type preonly > -fieldsplit_0_fieldsplit_Field_2_pc_type ilu > -fieldsplit_0_fieldsplit_Field_3_ksp_type preonly > -fieldsplit_0_fieldsplit_Field_3_pc_type jacobi > -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu > -malloc_log mlog -log_summary time.log > > One issue with this is that when I change > -fieldsplit_0_fieldsplit_Field_2_fields 2 to > -fieldsplit_0_fieldsplit_Field_2_fields 3 it is ineffective, as if > PETSc automatically assign IS 2 to Field 2 even though it is not > what I want. > Is there a way to pass the arguments correctly so that PETSc goes > about switching the IS set of -fieldsplit_0_fieldsplit_Field_2 and > -fieldsplit_0_fieldsplit_Field_3? > This is crucial to me since I am using the selfp option and the > matrix associated to IS 3 is diagonal. By assigning the fields > correctly I can get an exact Schur preconditioner and hence very > fast convergence. Right now my convergence is not optimal because > of this. > > > I believe the inner Schur field statements should not be using the > original numbering, but the inner numbering, after they have been > reordered. > > Matt > > Thanks! > > Best, > Luc > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- KSP Object: 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-08, absolute=1e-16, divergence=1e+16 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses A00's diagonal's inverse Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses A00's diagonal's inverse Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: jacobi linear system matrix = precond matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: seqaij rows=1600, cols=1600 total: nonzeros=25600, allocated nonzeros=25600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot [INBLOCKS] matrix ordering: natural factor fill ratio given 1, needed 1 Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=400, cols=400 package used to perform factorization: petsc total: nonzeros=1600, allocated nonzeros=1600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 100 nodes, limit used is 5 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: schurcomplement rows=400, cols=400 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: seqaij rows=400, cols=400 total: nonzeros=1600, allocated nonzeros=1600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 100 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=400, cols=1600 total: nonzeros=6400, allocated nonzeros=6400 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 100 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: 
preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: jacobi linear system matrix = precond matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: seqaij rows=1600, cols=1600 total: nonzeros=25600, allocated nonzeros=25600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=1600, cols=400 total: nonzeros=6400, allocated nonzeros=6400 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 Mat Object: 1 MPI processes type: seqaij rows=400, cols=400 total: nonzeros=1600, allocated nonzeros=1600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 100 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=2000, cols=2000 total: nonzeros=40000, allocated nonzeros=40000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_1_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_1_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5, needed 2.62994 Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=330, cols=330 package used to perform factorization: petsc total: nonzeros=20098, allocated nonzeros=20098 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 106 nodes, limit used is 5 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_1_) 1 MPI processes type: schurcomplement rows=330, cols=330 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_1_) 1 MPI processes type: seqaij rows=330, cols=330 total: nonzeros=7642, allocated nonzeros=7642 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 121 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=330, cols=2000 total: nonzeros=22800, allocated nonzeros=22800 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 121 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_0_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses A00's diagonal's inverse Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI 
processes type: jacobi linear system matrix = precond matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: seqaij rows=1600, cols=1600 total: nonzeros=25600, allocated nonzeros=25600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot [INBLOCKS] matrix ordering: natural factor fill ratio given 1, needed 1 Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=400, cols=400 package used to perform factorization: petsc total: nonzeros=1600, allocated nonzeros=1600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 100 nodes, limit used is 5 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: schurcomplement rows=400, cols=400 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_0_fieldsplit_Field_3_) 1 MPI processes type: seqaij rows=400, cols=400 total: nonzeros=1600, allocated nonzeros=1600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 100 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=400, cols=1600 total: nonzeros=6400, allocated nonzeros=6400 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 100 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: jacobi linear system matrix = precond matrix: Mat Object: (fieldsplit_0_fieldsplit_Field_2_) 1 MPI processes type: seqaij rows=1600, cols=1600 total: nonzeros=25600, allocated nonzeros=25600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=1600, cols=400 total: nonzeros=6400, allocated nonzeros=6400 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 Mat Object: 1 MPI processes type: seqaij rows=400, cols=400 total: nonzeros=1600, allocated nonzeros=1600 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 100 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=2000, cols=2000 total: nonzeros=40000, allocated nonzeros=40000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=2000, cols=330 total: nonzeros=22800, allocated nonzeros=22800 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 400 nodes, limit used is 5 Mat Object: 1 MPI processes type: seqaij rows=330, cols=330 total: nonzeros=7642, allocated 
nonzeros=7642 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 121 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=2330, cols=2330 total: nonzeros=93242, allocated nonzeros=93242 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 521 nodes, limit used is 5 From MAJones2 at mdanderson.org Fri Mar 21 09:37:39 2014 From: MAJones2 at mdanderson.org (Jones,Martin Alexander) Date: Fri, 21 Mar 2014 14:37:39 +0000 Subject: [petsc-users] A modified ex12.c Message-ID: <8448FFCE4362914496BCEAF8BE810C13EFE944@DCPWPEXMBX02.mdanderson.edu> Does anyone know if the DMPlex solver can be run on GPU? Martin -------------- next part -------------- An HTML attachment was scrubbed... URL: From lu_qin_2000 at yahoo.com Fri Mar 21 09:45:53 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Fri, 21 Mar 2014 07:45:53 -0700 (PDT) Subject: [petsc-users] Building PETSc with Intel mpi Message-ID: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> Hello, ? I?was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got?the error below. The configure.log is attached. ? ******************************************************************************* ??????????????????? UNABLE to EXECUTE BINARIES for ./configure ------------------------------------------------------------------------------- Cannot run executables created with FC. If this machine uses a batch system to submit jobs you will need to configure using ./configure with the additional option? --with-batch. ?Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf ******************************************************************************* ? Thanks a lot for any suggestions abut the problem, ? Regards, Qin -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 1376960 bytes Desc: not available URL: From rupp at iue.tuwien.ac.at Fri Mar 21 10:08:30 2014 From: rupp at iue.tuwien.ac.at (Karl Rupp) Date: Fri, 21 Mar 2014 16:08:30 +0100 Subject: [petsc-users] A modified ex12.c In-Reply-To: <8448FFCE4362914496BCEAF8BE810C13EFE944@DCPWPEXMBX02.mdanderson.edu> References: <8448FFCE4362914496BCEAF8BE810C13EFE944@DCPWPEXMBX02.mdanderson.edu> Message-ID: <532C55EE.7050003@iue.tuwien.ac.at> Hi Martin, > Does anyone know if the DMPlex solver can be run on GPU? what are you looking for? DMPlex is not a 'solver'... There is some code for running FEM on GPUs, but if you need specific preconditioners and such, you may be better off using the full flexibility provided by the standard CPU-based implementations. Best regards, Karli From bsmith at mcs.anl.gov Fri Mar 21 10:11:49 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 21 Mar 2014 10:11:49 -0500 Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> Message-ID: <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf Did it make any difference? 
On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > Hello, > > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached. > > ******************************************************************************* > UNABLE to EXECUTE BINARIES for ./configure > ------------------------------------------------------------------------------- > Cannot run executables created with FC. If this machine uses a batch system > to submit jobs you will need to configure using ./configure with the additional option --with-batch. > Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > ******************************************************************************* > > Thanks a lot for any suggestions abut the problem, > > Regards, > Qin > > From torquil at gmail.com Fri Mar 21 10:51:11 2014 From: torquil at gmail.com (=?ISO-8859-1?Q?Torquil_Macdonald_S=F8rensen?=) Date: Fri, 21 Mar 2014 16:51:11 +0100 Subject: [petsc-users] MatCreateMPIAdj and mat/examples/tutorials/ex11.c Message-ID: <532C5FEF.7070102@gmail.com> Hi! In the documentation of MatCreateMPIAdj it says that the fifth argument "j" should be "sorted for each row": http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIAdj.html The same page links to an example: http://www.mcs.anl.gov/petsc/petsc-current/src/mat/examples/tutorials/ex11.c.html but in that example the entries in "jj", on lines 40 and 42, do not seem to be "sorted for each row". On rank 0, the column indices are 0, 1 and 2, which are sorted. But the second row correspond to the column indices 1, 3, 2, which are not given in increasing order. The same goes for the indices given in jj for rank 1 on line 42, corresponding to the second row on rank 1. Doesn't that conflict with the documentation? Best regards Torquil S?rensen From knepley at gmail.com Fri Mar 21 11:13:48 2014 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 21 Mar 2014 11:13:48 -0500 Subject: [petsc-users] 2 level schur In-Reply-To: <532C4EC3.4090903@columbi.edu> References: <532B77AC.1060806@columbi.edu> <532C4EC3.4090903@columbi.edu> Message-ID: On Fri, Mar 21, 2014 at 9:37 AM, Luc Berger-Vergiat wrote: > Is there a way to now what the new numbering is? > I am assuming that in y example since there are two fields only the > numbers associated with them are 0 and 1 hence I tried: > > -fieldsplit_0_fieldsplit_Field_2_fields 1 > -fieldsplit_0_fieldsplit_Field_3_fields 0 > > If its an inner fieldsplit, the numbering for options starts over again -fieldsplit_0_fieldsplit_Field_0_fields 1 -fieldsplit_0_fieldsplit_Field_1_fields 0 Thanks, Matt > which did not work. As mentioned earlier, the following does not work > either: > > -fieldsplit_0_fieldsplit_Field_2_fields 3 > -fieldsplit_0_fieldsplit_Field_3_fields 2 > > and without too much expectation I also passed the following > > -fieldsplit_0_fieldsplit_Field_2_fields Field_3 > -fieldsplit_0_fieldsplit_Field_3_fields Field_2 > > to no avail. > > By the way I attached the output from -ksp_view in case I might be doing > something wrong? > > Best, > Luc > > On 03/20/2014 09:01 PM, Matthew Knepley wrote: > > On Thu, Mar 20, 2014 at 6:20 PM, Luc Berger-Vergiat wrote: > >> Hi all, >> I am solving a four field problem using two Schur complements. 
Here are >> the arguments that I usually pass to PETSc to do it: >> >> -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur >> -pc_fieldsplit_schur_factorization_type full >> -pc_fieldsplit_schur_precondition selfp -pc_fieldsplit_0_fields 2,3 >> -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type preonly >> -fieldsplit_0_pc_type fieldsplit -fieldsplit_0_pc_fieldsplit_type schur >> -fieldsplit_0_pc_fieldsplit_schur_factorization_type full >> -fieldsplit_0_pc_fieldsplit_schur_precondition selfp >> -fieldsplit_0_fieldsplit_Field_2_fields 2 >> -fieldsplit_0_fieldsplit_Field_3_fields 3 >> -fieldsplit_0_fieldsplit_Field_2_ksp_type preonly >> -fieldsplit_0_fieldsplit_Field_2_pc_type ilu >> -fieldsplit_0_fieldsplit_Field_3_ksp_type preonly >> -fieldsplit_0_fieldsplit_Field_3_pc_type jacobi -fieldsplit_1_ksp_type >> preonly -fieldsplit_1_pc_type lu -malloc_log mlog -log_summary time.log >> >> One issue with this is that when I change >> -fieldsplit_0_fieldsplit_Field_2_fields 2 to >> -fieldsplit_0_fieldsplit_Field_2_fields 3 it is ineffective, as if PETSc >> automatically assign IS 2 to Field 2 even though it is not what I want. >> Is there a way to pass the arguments correctly so that PETSc goes about >> switching the IS set of -fieldsplit_0_fieldsplit_Field_2 and >> -fieldsplit_0_fieldsplit_Field_3? >> This is crucial to me since I am using the selfp option and the matrix >> associated to IS 3 is diagonal. By assigning the fields correctly I can get >> an exact Schur preconditioner and hence very fast convergence. Right now my >> convergence is not optimal because of this. >> > > I believe the inner Schur field statements should not be using the > original numbering, but the inner numbering, after they have been reordered. > > Matt > > >> Thanks! >> >> Best, >> Luc >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Fri Mar 21 14:15:13 2014 From: mfadams at lbl.gov (Mark Adams) Date: Fri, 21 Mar 2014 15:15:13 -0400 Subject: [petsc-users] cray build error In-Reply-To: <87ior9jeqw.fsf@jedbrown.org> References: <87ior9jeqw.fsf@jedbrown.org> Message-ID: Thanks, that go me further. I build here all the time (Hopper at NERSC) but I'm having a hard time here. I've cleaned everything but still get these errors. On Thu, Mar 20, 2014 at 7:23 AM, Jed Brown wrote: > Mark Adams writes: > > > I get this on Hopper with PETSC maint. Any ideas? > > The environment is broken: wrapper compiler is adding -lpspline and > -lezcdf. Perhaps you need to unload modules or load other modules? 
> > | Executing: cc -o > /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest > /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest.o > | sh: > | Possible ERROR while running linker: ModuleCmd_Switch.c(172):ERROR:152: > Module 'PrgEnv-pgi' is currently not loaded > | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > loaded > | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > loaded > | /usr/bin/ld: cannot find -lpspline > | /usr/bin/ld: cannot find -lezcdf > | collect2: error: ld returned 1 exit status > | output: ret = 256 > > | _LMFILES_=/opt/modulefiles/modules/3.2.6.6: > /usr/syscom/nsg/modulefiles/nsg/1.2.0:/opt/modulefiles/modules/3.2.6.7: > /opt/cray/xt-asyncpe/default/modulefiles/xtpe-network-gemini:/opt/modulefiles/PrgEnv-gnu/4.2.34:/opt/cray/modulefiles/atp/1.7.0:/opt/modulefiles/xe-sysroot/4.2.34:/opt/cray/gem/modulefiles/switch/1.0-1.0402.45840.2.63.gem:/opt/cray/gem/modulefiles/shared-root/1.0-1.0402.46893.3.17.gem:/opt/cray/gem/modulefiles/pdsh/2.26-1.0402.45278.1.1.gem:/opt/cray/gem/modulefiles/nodehealth/5.1-1.0402.45895.3.76.gem:/opt/cray/gem/modulefiles/lbcd/2.1-1.0402.45245.1.2.gem:/opt/cray/gem/modulefiles/hosts/1.0-1.0402.45251.1.86.gem:/opt/cray/gem/modulefiles/configuration/1.0-1.0402.45284.1.2.gem:/opt/cray/modulefiles/ccm/2.2.0-1.0402.46086.4.120:/opt/cray/gem/modulefiles/audit/1.0.0-1.0402.45273.1.86.gem:/opt/cray/gem/modulefiles/rca/1.0.0-2.0402.47290.7.1.gem:/opt/cray/gem/modulefiles/csa/3.0.0-1_2.0402.45268.1.90.gem:/opt/cray/gem/modulefiles/job/1.5.5-0.1_2.0402.45272.1.5.gem:/opt/cray/gem/modulefiles/xpmem/0.1-2.0402.45248.1.5.gem:/opt/cray/gem/modulefiles/gni-headers/2.1-1.0402.7541.1.5.gem:/opt/cray/gem/modulefiles/dmapp/4.0.1-1.0402.7784.4.1.gem:/opt/cray/gem/modulefiles/pmi/4.0.1-1.0000.9753.86.3.gem:/opt/cray/gem/modulefiles/ugni/5.0-1.0402.7551.1.10.gem:/opt/cray/gem/modulefiles/udreg/2.3.2-1.0402.7546.1.5.gem:/opt/cray/modulefiles/cray-libsci/12.1.01:/opt/modulefiles/gcc/4.8.1:/opt/modulefiles/xt-asyncpe/5.23:/opt/modulefiles/eswrap/1.0.20-1.010102.662.0:/opt/cray/xt-asyncpe/default/modulefiles/craype-mc12:/opt/cray/modulefiles/cray-shmem/6.0.1:/opt/cray/modulefiles/cray-mpich/6.0.1:/opt/modulefiles/torque/4.2.3.h5_notcpretry:/opt/modulefiles/moab/7.2.3-r19-b121-SUSE11:/usr/common/usg/Modules/modulefiles/valgrind/3.8.1:/usr/common/usg/Modules/modulefiles/cmake/2.8.10.1: > /usr/common/graphics/Modules/modulefiles/visit/2.7.0:/opt/cray/modulefiles/papi/5.1.2:/usr/common/usg/Modules/modulefiles/ipm/2.00:/usr/common/usg/Modules/modulefiles/allineatools/4.2-34404:/usr/common/usg/Modules/modulefiles/adios/1.2.1:/usr/common/usg/Modules/modulefiles/pspline/nersc1.0:/opt/cray/modulefiles/cray-hdf5/1.8.11:/opt/cray/modulefiles/cray-netcdf/4.3.0:/usr/common/usg/Modules/modulefiles/matlab/R2012a:/usr/common/usg/Modules/modulefiles/altd/1.0:/usr/common/usg/Modules/modulefiles/usg-default-modules/1.0 > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: configure.log Type: application/octet-stream Size: 1760977 bytes Desc: not available URL: From bsmith at mcs.anl.gov Fri Mar 21 15:08:18 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 21 Mar 2014 15:08:18 -0500 Subject: [petsc-users] MatCreateMPIAdj and mat/examples/tutorials/ex11.c In-Reply-To: <532C5FEF.7070102@gmail.com> References: <532C5FEF.7070102@gmail.com> Message-ID: <25C259FB-E428-460D-ACC8-A69424259349@mcs.anl.gov> Thanks. Now fixed in master. Barry On Mar 21, 2014, at 10:51 AM, Torquil Macdonald S?rensen wrote: > Hi! > > In the documentation of MatCreateMPIAdj it says that the fifth argument > "j" should be "sorted for each row": > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIAdj.html > > The same page links to an example: > > http://www.mcs.anl.gov/petsc/petsc-current/src/mat/examples/tutorials/ex11.c.html > > but in that example the entries in "jj", on lines 40 and 42, do not seem > to be "sorted for each row". > > On rank 0, the column indices are 0, 1 and 2, which are sorted. But the > second row correspond to the column indices 1, 3, 2, which are not given > in increasing order. The same goes for the indices given in jj for rank > 1 on line 42, corresponding to the second row on rank 1. > > Doesn't that conflict with the documentation? > > Best regards > Torquil S?rensen > From bsmith at mcs.anl.gov Fri Mar 21 15:15:44 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 21 Mar 2014 15:15:44 -0500 Subject: [petsc-users] Preallocation Memory of Finite Element Method's Sparse Matrices In-Reply-To: <1d15832.41d6d.144e4fbd571.Coremail.luchao@mail.iggcas.ac.cn> References: <1d15832.41d6d.144e4fbd571.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: Thank you for reporting this. It was our error. In fact 4 is not enough under certain circumstances; consider where each process has only a single degree of freedom (vertex) then it is coupled to 8 other vertices ALL on other processes. Thus we really need to use 8 instead of 4 as the maximum number of off process coupling. I have fixed this in master so it now runs on any number of processes. Barry On Mar 21, 2014, at 9:11 AM, ?? wrote: > > > Your faithfully: > > Last e-mail has some literal error, sorry~ > > program src/ksp/ksp/examples/tutorials/ex3.c.html is about Bilinear elements on the unit square for Laplacian. > > After preallocation using > > "ierr = MatMPIAIJSetPreallocation(A,9,NULL,5,NULL);CHKERRQ(ierr); /* More than necessary */", > > Results of commands of "mpiexec -n 2 ./ex3" and "mpiexec -n 3 ./ex3" are "Norm of error 2.22327e-06 Iterations 6" and "Norm of error 3.12849e-07 Iterations 8". Both results are good! > > However, if I use "mpiexec -n 4 ./ex3" or 5,6,7...precesses, error "[2]PETSC ERROR: New nonzero at (4,29) (here is for process 4, other positions for different processes) caused a malloc!" appear!. For me, this error is unbelievable, because first, the preallocation is more than necessary,how can the new malloc appear? Second, the global number 4 point originally has no neighbor vertices whose global number is 29! This error has tortured me for a long time. > > This error seems meaningless, however, my recent 3d finite element method cannot be caculated by more processes owing to the new nonzero malloc error! And this is why I want to use 4 or much more processes to compute ex3.c. > > Thank you for all previous assistence and hope you have a good life! 
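To make the counting issue above concrete: the robust alternative to a uniform over-estimate such as (9,5) is to count the couplings of every locally owned row from the element connectivity and pass per-row arrays. The sketch below is only an illustration under assumed inputs (the helper name and the ncols/cols connectivity arrays are not from ex3.c), and it is not the change that went into master.

#include <petscmat.h>

/* Sketch only: exact preallocation of a square AIJ matrix.  ncols[i] and
 * cols[i][k] are assumed to hold, for locally owned row i, the global column
 * indices that row couples to (diagonal entry included), as obtained from the
 * application's element connectivity. */
static PetscErrorCode PreallocateExact(Mat A, PetscInt nlocalrows, const PetscInt *ncols, PetscInt **cols)
{
  PetscInt       cstart, cend, i, k;
  PetscInt       *d_nnz, *o_nnz;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatGetOwnershipRangeColumn(A, &cstart, &cend);CHKERRQ(ierr);
  ierr = PetscMalloc(nlocalrows*sizeof(PetscInt), &d_nnz);CHKERRQ(ierr);
  ierr = PetscMalloc(nlocalrows*sizeof(PetscInt), &o_nnz);CHKERRQ(ierr);
  for (i = 0; i < nlocalrows; i++) {
    d_nnz[i] = o_nnz[i] = 0;
    for (k = 0; k < ncols[i]; k++) {
      if (cols[i][k] >= cstart && cols[i][k] < cend) d_nnz[i]++; /* column lives in the diagonal block */
      else                                           o_nnz[i]++; /* off-process column */
    }
  }
  ierr = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);CHKERRQ(ierr); /* ignored unless A is MPIAIJ */
  ierr = MatSeqAIJSetPreallocation(A, 0, d_nnz);CHKERRQ(ierr);           /* ignored unless A is SeqAIJ */
  ierr = PetscFree(d_nnz);CHKERRQ(ierr);
  ierr = PetscFree(o_nnz);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

With exact counts there is no "new nonzero caused a malloc" error on any number of processes, provided only the counted couplings are actually inserted.
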
> > your sin cerely > > LV CHAO > > 2014/3/21 > > > > From lb2653 at columbia.edu Fri Mar 21 15:33:59 2014 From: lb2653 at columbia.edu (Luc Berger-Vergiat) Date: Fri, 21 Mar 2014 16:33:59 -0400 Subject: [petsc-users] 2 level schur In-Reply-To: References: <532B77AC.1060806@columbi.edu> <532C4EC3.4090903@columbi.edu> Message-ID: <532CA237.4070304@columbi.edu> I hear you though that is not what petsc does. When I name the fields as you suggest: -fieldsplit_0_fieldsplit_Field_0_fields 1 -fieldsplit_0_fieldsplit_Field_1_fields 0 petsc ignores it and still call the fields -fieldsplit_0_fieldsplit_Field_2 -fieldsplit_0_fieldsplit_Field_3 But the automatic naming scheme is not really the issue. It would just be nice to be able to switch the two fields. I will try to change the order in which I pass the IS to the DM and see if I can go around the problem that way. Best, Luc On 03/21/2014 12:13 PM, Matthew Knepley wrote: > On Fri, Mar 21, 2014 at 9:37 AM, Luc Berger-Vergiat > > wrote: > > Is there a way to now what the new numbering is? > I am assuming that in y example since there are two fields only > the numbers associated with them are 0 and 1 hence I tried: > > -fieldsplit_0_fieldsplit_Field_2_fields 1 > -fieldsplit_0_fieldsplit_Field_3_fields 0 > > > If its an inner fieldsplit, the numbering for options starts over again > > -fieldsplit_0_fieldsplit_Field_0_fields 1 > -fieldsplit_0_fieldsplit_Field_1_fields 0 > > Thanks, > > Matt > > which did not work. As mentioned earlier, the following does not > work either: > > -fieldsplit_0_fieldsplit_Field_2_fields 3 > -fieldsplit_0_fieldsplit_Field_3_fields 2 > > and without too much expectation I also passed the following > > -fieldsplit_0_fieldsplit_Field_2_fields Field_3 > -fieldsplit_0_fieldsplit_Field_3_fields Field_2 > > to no avail. > > By the way I attached the output from -ksp_view in case I might be > doing something wrong? > > Best, > Luc > > On 03/20/2014 09:01 PM, Matthew Knepley wrote: >> On Thu, Mar 20, 2014 at 6:20 PM, Luc Berger-Vergiat >> > wrote: >> >> Hi all, >> I am solving a four field problem using two Schur >> complements. Here are the arguments that I usually pass to >> PETSc to do it: >> >> -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type >> schur -pc_fieldsplit_schur_factorization_type full >> -pc_fieldsplit_schur_precondition selfp >> -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1 >> -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type >> fieldsplit -fieldsplit_0_pc_fieldsplit_type schur >> -fieldsplit_0_pc_fieldsplit_schur_factorization_type full >> -fieldsplit_0_pc_fieldsplit_schur_precondition selfp >> -fieldsplit_0_fieldsplit_Field_2_fields 2 >> -fieldsplit_0_fieldsplit_Field_3_fields 3 >> -fieldsplit_0_fieldsplit_Field_2_ksp_type preonly >> -fieldsplit_0_fieldsplit_Field_2_pc_type ilu >> -fieldsplit_0_fieldsplit_Field_3_ksp_type preonly >> -fieldsplit_0_fieldsplit_Field_3_pc_type jacobi >> -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu >> -malloc_log mlog -log_summary time.log >> >> One issue with this is that when I change >> -fieldsplit_0_fieldsplit_Field_2_fields 2 to >> -fieldsplit_0_fieldsplit_Field_2_fields 3 it is ineffective, >> as if PETSc automatically assign IS 2 to Field 2 even though >> it is not what I want. >> Is there a way to pass the arguments correctly so that PETSc >> goes about switching the IS set of >> -fieldsplit_0_fieldsplit_Field_2 and >> -fieldsplit_0_fieldsplit_Field_3? 
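One way to take control of this, independent of the automatically generated Field_2/Field_3 names, is to hand the index sets to the relevant PCFIELDSPLIT directly: the order of the PCFieldSplitSetIS calls fixes which set becomes split 0 (the A00 block of the Schur factorization) and which becomes split 1 (the Schur complement), and the chosen names fix the option prefixes. The sketch below is only an illustration: the split names and the index sets isA/isB are placeholders, and the same calls can be made on the inner PC obtained via PCFieldSplitGetSubKSP rather than on the outer one.

#include <petscksp.h>

/* Sketch only: define a Schur fieldsplit programmatically.  Swapping the two
 * PCFieldSplitSetIS calls swaps which block is eliminated first, which is the
 * reordering asked about above.  isA and isB are assumed to be index sets the
 * application already owns, expressed in the numbering of the matrix this PC
 * operates on. */
static PetscErrorCode DefineSchurSplits(PC pc, IS isA, IS isB)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
  ierr = PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "a", isA);CHKERRQ(ierr); /* split 0: options use -fieldsplit_a_ */
  ierr = PCFieldSplitSetIS(pc, "b", isB);CHKERRQ(ierr); /* split 1: options use -fieldsplit_b_ */
  PetscFunctionReturn(0);
}
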
>> This is crucial to me since I am using the selfp option and >> the matrix associated to IS 3 is diagonal. By assigning the >> fields correctly I can get an exact Schur preconditioner and >> hence very fast convergence. Right now my convergence is not >> optimal because of this. >> >> >> I believe the inner Schur field statements should not be using >> the original numbering, but the inner numbering, after they have >> been reordered. >> >> Matt >> >> Thanks! >> >> Best, >> Luc >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to >> which their experiments lead. >> -- Norbert Wiener > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Mar 21 18:29:55 2014 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 21 Mar 2014 18:29:55 -0500 Subject: [petsc-users] 2 level schur In-Reply-To: <532CA237.4070304@columbi.edu> References: <532B77AC.1060806@columbi.edu> <532C4EC3.4090903@columbi.edu> <532CA237.4070304@columbi.edu> Message-ID: On Fri, Mar 21, 2014 at 3:33 PM, Luc Berger-Vergiat wrote: > I hear you though that is not what petsc does. > When I name the fields as you suggest: > > -fieldsplit_0_fieldsplit_Field_0_fields 1 > -fieldsplit_0_fieldsplit_Field_1_fields 0 > > I could be wrong. Is it possible to reproduce this bad behavior with SNES ex19, which has four fields? Thanks, Matt > petsc ignores it and still call the fields > > -fieldsplit_0_fieldsplit_Field_2 > -fieldsplit_0_fieldsplit_Field_3 > > But the automatic naming scheme is not really the issue. It would just be > nice to be able to switch the two fields. > I will try to change the order in which I pass the IS to the DM and see if > I can go around the problem that way. > > Best, > Luc > > On 03/21/2014 12:13 PM, Matthew Knepley wrote: > > On Fri, Mar 21, 2014 at 9:37 AM, Luc Berger-Vergiat wrote: > >> Is there a way to now what the new numbering is? >> I am assuming that in y example since there are two fields only the >> numbers associated with them are 0 and 1 hence I tried: >> >> -fieldsplit_0_fieldsplit_Field_2_fields 1 >> -fieldsplit_0_fieldsplit_Field_3_fields 0 >> >> > If its an inner fieldsplit, the numbering for options starts over again > > -fieldsplit_0_fieldsplit_Field_0_fields 1 > -fieldsplit_0_fieldsplit_Field_1_fields 0 > > Thanks, > > Matt > >> which did not work. As mentioned earlier, the following does not work >> either: >> >> -fieldsplit_0_fieldsplit_Field_2_fields 3 >> -fieldsplit_0_fieldsplit_Field_3_fields 2 >> >> and without too much expectation I also passed the following >> >> -fieldsplit_0_fieldsplit_Field_2_fields Field_3 >> -fieldsplit_0_fieldsplit_Field_3_fields Field_2 >> >> to no avail. >> >> By the way I attached the output from -ksp_view in case I might be doing >> something wrong? >> >> Best, >> Luc >> >> On 03/20/2014 09:01 PM, Matthew Knepley wrote: >> >> On Thu, Mar 20, 2014 at 6:20 PM, Luc Berger-Vergiat > > wrote: >> >>> Hi all, >>> I am solving a four field problem using two Schur complements. 
Here are >>> the arguments that I usually pass to PETSc to do it: >>> >>> -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur >>> -pc_fieldsplit_schur_factorization_type full >>> -pc_fieldsplit_schur_precondition selfp -pc_fieldsplit_0_fields 2,3 >>> -pc_fieldsplit_1_fields 0,1 -fieldsplit_0_ksp_type preonly >>> -fieldsplit_0_pc_type fieldsplit -fieldsplit_0_pc_fieldsplit_type schur >>> -fieldsplit_0_pc_fieldsplit_schur_factorization_type full >>> -fieldsplit_0_pc_fieldsplit_schur_precondition selfp >>> -fieldsplit_0_fieldsplit_Field_2_fields 2 >>> -fieldsplit_0_fieldsplit_Field_3_fields 3 >>> -fieldsplit_0_fieldsplit_Field_2_ksp_type preonly >>> -fieldsplit_0_fieldsplit_Field_2_pc_type ilu >>> -fieldsplit_0_fieldsplit_Field_3_ksp_type preonly >>> -fieldsplit_0_fieldsplit_Field_3_pc_type jacobi -fieldsplit_1_ksp_type >>> preonly -fieldsplit_1_pc_type lu -malloc_log mlog -log_summary time.log >>> >>> One issue with this is that when I change >>> -fieldsplit_0_fieldsplit_Field_2_fields 2 to >>> -fieldsplit_0_fieldsplit_Field_2_fields 3 it is ineffective, as if PETSc >>> automatically assign IS 2 to Field 2 even though it is not what I want. >>> Is there a way to pass the arguments correctly so that PETSc goes about >>> switching the IS set of -fieldsplit_0_fieldsplit_Field_2 and >>> -fieldsplit_0_fieldsplit_Field_3? >>> This is crucial to me since I am using the selfp option and the matrix >>> associated to IS 3 is diagonal. By assigning the fields correctly I can get >>> an exact Schur preconditioner and hence very fast convergence. Right now my >>> convergence is not optimal because of this. >>> >> >> I believe the inner Schur field statements should not be using the >> original numbering, but the inner numbering, after they have been reordered. >> >> Matt >> >> >>> Thanks! >>> >>> Best, >>> Luc >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Fri Mar 21 21:12:51 2014 From: jed at jedbrown.org (Jed Brown) Date: Fri, 21 Mar 2014 19:12:51 -0700 Subject: [petsc-users] Preallocation Memory of Finite Element Method's Sparse Matrices In-Reply-To: References: <1d15832.41d6d.144e4fbd571.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: <87pplfx9p8.fsf@jedbrown.org> Barry Smith writes: > Thank you for reporting this. It was our error. In fact 4 is not > enough under certain circumstances; consider where each process has > only a single degree of freedom (vertex) then it is coupled to 8 > other vertices ALL on other processes. Thus we really need to use 8 > instead of 4 as the maximum number of off process coupling. Note that _your_ code should generally not have this problem because you should use a non-pathological partition. -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From fd.kong at siat.ac.cn Fri Mar 21 21:49:29 2014 From: fd.kong at siat.ac.cn (Fande Kong) Date: Fri, 21 Mar 2014 20:49:29 -0600 Subject: [petsc-users] The same code randomly produce different residual norms Message-ID: Hi, I run the code src/snes/examples/tutorials/ex5.c with options: * mpirun -n 8 ./ex5 -pc_type mg \ -ksp_monitor\ -pc_mg_levels 3 \ -pc_mg_galerkin \ -da_grid_x 17 \ -da_grid_y 17 \ -mg_levels_ksp_norm_type unpreconditioned \ -snes_monitor \ -mg_levels_ksp_chebyshev_estimate_eigenvalues 0.5,1.1 \ -mg_levels_pc_type sor \ -pc_mg_type \* I run this script several times, but at each time, the residual norms have some tiny differences ( these should not happen). For example: Case 1: * 0 SNES Function norm 1.188788066192e+00 0 KSP Residual norm 1.573384253521e+00 1 KSP Residual norm 3.616321708396e-02 2 KSP Residual norm 2.780221563755e-04 3 KSP Residual norm 2.194662354037e-06 1 SNES Function norm 5.125240595190e-03 0 KSP Residual norm 1.217284338266e-01 1 KSP Residual norm 1.774247017346e-04 2 KSP Residual norm 2.557118591292e-06 3 KSP Residual norm 1.622173269367e-08 2 SNES Function norm 3.922335995111e-05 0 KSP Residual norm 9.960745924876e-04 1 KSP Residual norm 1.273916336665e-06 2 KSP Residual norm 1.571259383270e-08 3 KSP Residual norm 1.250266145356e-10 3 SNES Function norm 2.662898279023e-09 * Case 2: * 0 SNES Function norm 1.188788066192e+00 0 KSP Residual norm 1.573384253521e+00 1 KSP Residual norm 3.616321708396e-02 2 KSP Residual norm 2.780221563755e-04 3 KSP Residual norm 2.194662354037e-06 1 SNES Function norm 5.125240595190e-03 0 KSP Residual norm 1.217284338266e-01 1 KSP Residual norm 1.774247017347e-04 2 KSP Residual norm 2.557118591292e-06 3 KSP Residual norm 1.622173269367e-08 2 SNES Function norm 3.922335995108e-05 0 KSP Residual norm 9.960745924862e-04 1 KSP Residual norm 1.273916336654e-06 2 KSP Residual norm 1.571259383257e-08 3 KSP Residual norm 1.250266145346e-10 3 SNES Function norm 2.662898285038e-09 * These differences are marked by red color. So here, I was wondering there are any explanations why these happen? Thanks, Fande, -------------- next part -------------- An HTML attachment was scrubbed... URL: From lu_qin_2000 at yahoo.com Fri Mar 21 21:32:53 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Fri, 21 Mar 2014 19:32:53 -0700 (PDT) Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> Message-ID: <1395455573.34700.YahooMailNeo@web160204.mail.bf1.yahoo.com> Sourcing the .csh files of the compiler fixed the problem. Thanks! However, later it got another error (see the attached configure.log for details): ? ******************************************************************************* ???????? UNABLE to CONFIGURE with GIVEN OPTIONS??? (see configure.log for details): ------------------------------------------------------------------------------- Fortran error! mpi_init() could not be located! ******************************************************************************* It seems the configure did not?link the Intel MPI libs. I used --with-mpi-dir to specify the MPI directory, can configure get the correct Intel MPI lib names? If I have to specify the lib names (using --with-mpi-lib?), which libs should I specify? 
I saw a lot of libs under the directory, such as libmpi.a, libmpi_ipl64.a, libmpi_mt.a, etc. ? Thanks a lot, Qin ________________________________ From: Barry Smith To: Qin Lu Cc: petsc-users Sent: Friday, March 21, 2014 10:11 AM Subject: Re: [petsc-users] Building PETSc with Intel mpi ? Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf ? Did it make any difference? On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > Hello, >? > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached. >? > ******************************************************************************* >? ? ? ? ? ? ? ? ? ? UNABLE to EXECUTE BINARIES for ./configure > ------------------------------------------------------------------------------- > Cannot run executables created with FC. If this machine uses a batch system > to submit jobs you will need to configure using ./configure with the additional option? --with-batch. >? Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > ******************************************************************************* >? > Thanks a lot for any suggestions abut the problem, >? > Regards, > Qin >? > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 4157069 bytes Desc: not available URL: From lu_qin_2000 at yahoo.com Fri Mar 21 21:51:56 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Fri, 21 Mar 2014 19:51:56 -0700 (PDT) Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> Message-ID: <1395456716.21155.YahooMailNeo@web160205.mail.bf1.yahoo.com> Sourcing the .csh files of the compiler fixed the problem. Thanks! However, later it got another error (the attached configure.log?was shortened since it was too big to be sent by email, it should contain the error message, though): ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- Fortran error! mpi_init() could not be located! ******************************************************************************* It seems the configure did not link the Intel MPI libs. I used --with-mpi-dir to specify the MPI directory, can configure get the correct Intel MPI lib names? If I have to specify the lib names (using --with-mpi-lib?), which libs should I specify? I saw a lot of libs under the directory, such as libmpi.a, libmpi_ipl64.a, libmpi_mt.a, etc. Thanks a lot, Qin ________________________________ From: Barry Smith To: Qin Lu Cc: petsc-users Sent: Friday, March 21, 2014 10:11 AM Subject: Re: [petsc-users] Building PETSc with Intel mpi ? Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf ? Did it make any difference? On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > Hello, >? > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached. >? 
> ******************************************************************************* >? ? ? ? ? ? ? ? ? ? UNABLE to EXECUTE BINARIES for ./configure > ------------------------------------------------------------------------------- > Cannot run executables created with FC. If this machine uses a batch system > to submit jobs you will need to configure using ./configure with the additional option? --with-batch. >? Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > ******************************************************************************* >? > Thanks a lot for any suggestions abut the problem, >? > Regards, > Qin >? > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 994962 bytes Desc: not available URL: From jed at jedbrown.org Fri Mar 21 22:00:54 2014 From: jed at jedbrown.org (Jed Brown) Date: Fri, 21 Mar 2014 20:00:54 -0700 Subject: [petsc-users] The same code randomly produce different residual norms In-Reply-To: References: Message-ID: <87eh1vx7h5.fsf@jedbrown.org> Fande Kong writes: > * mpirun -n 8 ./ex5 -pc_type mg \ -ksp_monitor\ -pc_mg_levels 3 > \ -pc_mg_galerkin \ -da_grid_x 17 \ -da_grid_y 17 > \ -mg_levels_ksp_norm_type unpreconditioned \ -snes_monitor > \ -mg_levels_ksp_chebyshev_estimate_eigenvalues 0.5,1.1 > \ -mg_levels_pc_type sor \ -pc_mg_type \* > > > I run this script several times, but at each time, the residual norms have > some tiny differences ( these should not happen). For example: Some messages are unpacked in the order in which they are received, rather than doing a moderately expensive sorting. There are some options to make VecScatter and other components reproducible, but this configuration might use a feature for which such an option has not been implemented (-vecscatter_reproduce helps a little in my test, but is not sufficient). The difference you're seeing is harmless, but if you have an application for which this is required (either to reproduce rare events for re-analysis of an unstable dynamical system or due to programmatic requirements that misunderstand error analysis of finite-precision arithmetic), we can hunt down any such places and provide an option to make it reproducible. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From balay at mcs.anl.gov Sat Mar 22 00:09:08 2014 From: balay at mcs.anl.gov (Satish Balay) Date: Sat, 22 Mar 2014 00:09:08 -0500 Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: <1395455573.34700.YahooMailNeo@web160204.mail.bf1.yahoo.com> References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> <1395455573.34700.YahooMailNeo@web160204.mail.bf1.yahoo.com> Message-ID: > --with-mpi-dir=/apps/compilers/intel_2013/impi/4.1.0.024/intel64 --with-mpi-compilers=0 Does this mpi not come with mpicc/mpif90 wrappers? If they do - its best to use them. If not - its best to look at the docs for this compiler - and specify it with the appropriate --with-mpi-include --with-mpi-lib options, [instead of the above] Satish On Fri, 21 Mar 2014, Qin Lu wrote: > Sourcing the .csh files of the compiler fixed the problem. Thanks! 
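Going back to the finite-precision point made above: the discrepancies in the two residual histories are of the size that reordering a floating-point sum can produce. A tiny standalone illustration (plain C, not PETSc code) of how the same numbers added in a different order give different results:

#include <stdio.h>

/* Adding the same three numbers in two different orders, as can effectively
 * happen when messages are unpacked in the order they arrive. */
int main(void)
{
  double a = 1.0e16, b = -1.0e16, c = 1.0;
  printf("(a+b)+c = %.17g\n", (a + b) + c); /* prints 1: the cancellation happens first    */
  printf("a+(b+c) = %.17g\n", a + (b + c)); /* prints 0: the 1.0 is absorbed into -1.0e16 */
  return 0;
}

Neither answer is wrong; they differ by an amount within the rounding error of the computation, which is the sense in which the differences above are harmless.
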
However, later it got another error (see the attached configure.log for details): > ? > ******************************************************************************* > ???????? UNABLE to CONFIGURE with GIVEN OPTIONS??? (see configure.log for details): > ------------------------------------------------------------------------------- > Fortran error! mpi_init() could not be located! > ******************************************************************************* > > It seems the configure did not?link the Intel MPI libs. I used --with-mpi-dir to specify the MPI directory, can configure get the correct Intel MPI lib names? If I have to specify the lib names (using --with-mpi-lib?), which libs should I specify? I saw a lot of libs under the directory, such as libmpi.a, libmpi_ipl64.a, libmpi_mt.a, etc. > ? > Thanks a lot, > Qin > > > ________________________________ > From: Barry Smith > To: Qin Lu > Cc: petsc-users > Sent: Friday, March 21, 2014 10:11 AM > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > > ? Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > ? Did it make any difference? > > > On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > > > Hello, > >? > > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached. > >? > > ******************************************************************************* > >? ? ? ? ? ? ? ? ? ? UNABLE to EXECUTE BINARIES for ./configure > > ------------------------------------------------------------------------------- > > Cannot run executables created with FC. If this machine uses a batch system > > to submit jobs you will need to configure using ./configure with the additional option? --with-batch. > >? Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > ******************************************************************************* > >? > > Thanks a lot for any suggestions abut the problem, > >? > > Regards, > > Qin > > >? > > > From balay at mcs.anl.gov Sat Mar 22 00:12:37 2014 From: balay at mcs.anl.gov (Satish Balay) Date: Sat, 22 Mar 2014 00:12:37 -0500 Subject: [petsc-users] cray build error In-Reply-To: References: <87ior9jeqw.fsf@jedbrown.org> Message-ID: >>> Configure Options: --configModules=PETSc.Configure --optionsModule=PETSc.compilerOptions --COPTFLAGS="-O3 -ffast-math -funroll-loops" --CXXOPTFLAGS="-O3 -ffast-math -funroll-loops" --FOPTFLAGS="-O3 -ffast-math -funroll-loops" --download-parmetis --download-metis --download-hypre --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=0 --with-fc=ftn --with-fortranlib-autodetect=0 --with-shared-libraries=0 --download-mpich --with-x=0 --with-64-bit-indices PETSC_ARCH=arch-xe6-opt64 PETSC_DIR=/global/homes/m/madams/petsc_maint/ <<<<< --download-mpich on hopper looks weird. 
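(As Mark reports further down in this thread, the eventual fix was simply to drop --download-mpich: the cc/CC/ftn wrappers on a Cray XE6 such as Hopper already link against the system cray-mpich. A sketch of the intended configure line, keeping the other options from the list above and omitting only the MPI download, would be

    ./configure --with-cc=cc --with-cxx=CC --with-fc=ftn --with-debugging=0 \
        --COPTFLAGS="-O3 -ffast-math -funroll-loops" --FOPTFLAGS="-O3 -ffast-math -funroll-loops" \
        --download-parmetis --download-metis --download-hypre \
        --with-shared-libraries=0 --with-x=0 --with-64-bit-indices PETSC_ARCH=arch-xe6-opt64

with the clib/fortranlib autodetect flags left as in the original options.)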
>>>>>>>> ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded <<<<<< I see more module errors here.. Satish On Fri, 21 Mar 2014, Mark Adams wrote: > Thanks, that go me further. I build here all the time (Hopper at NERSC) > but I'm having a hard time here. I've cleaned everything but still get > these errors. > > > On Thu, Mar 20, 2014 at 7:23 AM, Jed Brown wrote: > > > Mark Adams writes: > > > > > I get this on Hopper with PETSC maint. Any ideas? > > > > The environment is broken: wrapper compiler is adding -lpspline and > > -lezcdf. Perhaps you need to unload modules or load other modules? > > > > | Executing: cc -o > > /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest > > /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest.o > > | sh: > > | Possible ERROR while running linker: ModuleCmd_Switch.c(172):ERROR:152: > > Module 'PrgEnv-pgi' is currently not loaded > > | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > > loaded > > | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > > loaded > > | /usr/bin/ld: cannot find -lpspline > > | /usr/bin/ld: cannot find -lezcdf > > | collect2: error: ld returned 1 exit status > > | output: ret = 256 > > > > | _LMFILES_=/opt/modulefiles/modules/3.2.6.6: > > /usr/syscom/nsg/modulefiles/nsg/1.2.0:/opt/modulefiles/modules/3.2.6.7: > > 
/opt/cray/xt-asyncpe/default/modulefiles/xtpe-network-gemini:/opt/modulefiles/PrgEnv-gnu/4.2.34:/opt/cray/modulefiles/atp/1.7.0:/opt/modulefiles/xe-sysroot/4.2.34:/opt/cray/gem/modulefiles/switch/1.0-1.0402.45840.2.63.gem:/opt/cray/gem/modulefiles/shared-root/1.0-1.0402.46893.3.17.gem:/opt/cray/gem/modulefiles/pdsh/2.26-1.0402.45278.1.1.gem:/opt/cray/gem/modulefiles/nodehealth/5.1-1.0402.45895.3.76.gem:/opt/cray/gem/modulefiles/lbcd/2.1-1.0402.45245.1.2.gem:/opt/cray/gem/modulefiles/hosts/1.0-1.0402.45251.1.86.gem:/opt/cray/gem/modulefiles/configuration/1.0-1.0402.45284.1.2.gem:/opt/cray/modulefiles/ccm/2.2.0-1.0402.46086.4.120:/opt/cray/gem/modulefiles/audit/1.0.0-1.0402.45273.1.86.gem:/opt/cray/gem/modulefiles/rca/1.0.0-2.0402.47290.7.1.gem:/opt/cray/gem/modulefiles/csa/3.0.0-1_2.0402.45268.1.90.gem:/opt/cray/gem/modulefiles/job/1.5.5-0.1_2.0402.45272.1.5.gem:/opt/cray/gem/modulefiles/xpmem/0.1-2.0402.45248.1.5.gem:/opt/cray/gem/modulefiles/gni-headers/2.1-1.0402.7541.1.5.gem:/opt/cray/gem/modulefiles/dmapp/4.0.1-1.0402.7784.4.1.gem:/opt/cray/gem/modulefiles/pmi/4.0.1-1.0000.9753.86.3.gem:/opt/cray/gem/modulefiles/ugni/5.0-1.0402.7551.1.10.gem:/opt/cray/gem/modulefiles/udreg/2.3.2-1.0402.7546.1.5.gem:/opt/cray/modulefiles/cray-libsci/12.1.01:/opt/modulefiles/gcc/4.8.1:/opt/modulefiles/xt-asyncpe/5.23:/opt/modulefiles/eswrap/1.0.20-1.010102.662.0:/opt/cray/xt-asyncpe/default/modulefiles/craype-mc12:/opt/cray/modulefiles/cray-shmem/6.0.1:/opt/cray/modulefiles/cray-mpich/6.0.1:/opt/modulefiles/torque/4.2.3.h5_notcpretry:/opt/modulefiles/moab/7.2.3-r19-b121-SUSE11:/usr/common/usg/Modules/modulefiles/valgrind/3.8.1:/usr/common/usg/Modules/modulefiles/cmake/2.8.10.1: > > /usr/common/graphics/Modules/modulefiles/visit/2.7.0:/opt/cray/modulefiles/papi/5.1.2:/usr/common/usg/Modules/modulefiles/ipm/2.00:/usr/common/usg/Modules/modulefiles/allineatools/4.2-34404:/usr/common/usg/Modules/modulefiles/adios/1.2.1:/usr/common/usg/Modules/modulefiles/pspline/nersc1.0:/opt/cray/modulefiles/cray-hdf5/1.8.11:/opt/cray/modulefiles/cray-netcdf/4.3.0:/usr/common/usg/Modules/modulefiles/matlab/R2012a:/usr/common/usg/Modules/modulefiles/altd/1.0:/usr/common/usg/Modules/modulefiles/usg-default-modules/1.0 > > > From bsmith at mcs.anl.gov Sat Mar 22 09:24:21 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 22 Mar 2014 09:24:21 -0500 Subject: [petsc-users] The same code randomly produce different residual norms In-Reply-To: References: Message-ID: <9D76ED2A-E503-4DC3-951C-7CB6CD1EAA74@mcs.anl.gov> All of these answers are equally correct, they are due to different round-offs due to different order of operations due to different communication patterns between the processes in each run. Barry On Mar 21, 2014, at 9:49 PM, Fande Kong wrote: > Hi, > > I run the code src/snes/examples/tutorials/ex5.c with options: > > mpirun -n 8 ./ex5 -pc_type mg \ > -ksp_monitor\ > -pc_mg_levels 3 \ > -pc_mg_galerkin \ > -da_grid_x 17 \ > -da_grid_y 17 \ > -mg_levels_ksp_norm_type unpreconditioned \ > -snes_monitor \ > -mg_levels_ksp_chebyshev_estimate_eigenvalues 0.5,1.1 \ > -mg_levels_pc_type sor \ > -pc_mg_type \ > > > I run this script several times, but at each time, the residual norms have some tiny differences ( these should not happen). 
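To make the round-off point concrete, here is a tiny standalone C illustration (not part of PETSc or of ex5.c): floating-point addition is not associative, so a parallel reduction whose summation order depends on message arrival order can legitimately change the last digits of a norm.

    #include <stdio.h>
    int main(void)
    {
      double a = 1.0e16, b = -1.0e16, c = 1.0;
      /* the same three numbers, summed in two different orders */
      printf("(a + b) + c = %g\n", (a + b) + c);   /* prints 1 */
      printf("a + (b + c) = %g\n", a + (b + c));   /* prints 0 */
      return 0;
    }

The two results differ even though the inputs are identical, which is what happens, on a much smaller scale, in the residual norms shown below.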
For example: > > Case 1: > > 0 SNES Function norm 1.188788066192e+00 > 0 KSP Residual norm 1.573384253521e+00 > 1 KSP Residual norm 3.616321708396e-02 > 2 KSP Residual norm 2.780221563755e-04 > 3 KSP Residual norm 2.194662354037e-06 > 1 SNES Function norm 5.125240595190e-03 > 0 KSP Residual norm 1.217284338266e-01 > 1 KSP Residual norm 1.774247017346e-04 > 2 KSP Residual norm 2.557118591292e-06 > 3 KSP Residual norm 1.622173269367e-08 > 2 SNES Function norm 3.922335995111e-05 > 0 KSP Residual norm 9.960745924876e-04 > 1 KSP Residual norm 1.273916336665e-06 > 2 KSP Residual norm 1.571259383270e-08 > 3 KSP Residual norm 1.250266145356e-10 > 3 SNES Function norm 2.662898279023e-09 > > > Case 2: > > > 0 SNES Function norm 1.188788066192e+00 > 0 KSP Residual norm 1.573384253521e+00 > 1 KSP Residual norm 3.616321708396e-02 > 2 KSP Residual norm 2.780221563755e-04 > 3 KSP Residual norm 2.194662354037e-06 > 1 SNES Function norm 5.125240595190e-03 > 0 KSP Residual norm 1.217284338266e-01 > 1 KSP Residual norm 1.774247017347e-04 > 2 KSP Residual norm 2.557118591292e-06 > 3 KSP Residual norm 1.622173269367e-08 > 2 SNES Function norm 3.922335995108e-05 > 0 KSP Residual norm 9.960745924862e-04 > 1 KSP Residual norm 1.273916336654e-06 > 2 KSP Residual norm 1.571259383257e-08 > 3 KSP Residual norm 1.250266145346e-10 > 3 SNES Function norm 2.662898285038e-09 > > These differences are marked by red color. So here, I was wondering there are any explanations why these happen? > > Thanks, > > Fande, > > From lu_qin_2000 at yahoo.com Sat Mar 22 10:15:52 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Sat, 22 Mar 2014 08:15:52 -0700 (PDT) Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> <1395455573.34700.YahooMailNeo@web160204.mail.bf1.yahoo.com> Message-ID: <1395501352.26228.YahooMailNeo@web160203.mail.bf1.yahoo.com> The Intel mpi's wrappers to Intel compilers are mpiicc and mpiifort (not mpicc and mpif90), can PETSc's configure automatically pick them? Or I have to specify them explicitly (--with-cc=mpiicc --with-fc=mpiifort --with-mpi-compilers=0)? Thanks, Qin ________________________________ From: Satish Balay To: Qin Lu Cc: Barry Smith ; petsc-users Sent: Saturday, March 22, 2014 12:09 AM Subject: Re: [petsc-users] Building PETSc with Intel mpi > --with-mpi-dir=/apps/compilers/intel_2013/impi/4.1.0.024/intel64 --with-mpi-compilers=0 Does this mpi not come with mpicc/mpif90 wrappers? If they do - its best to use them. If not - its best to look at the docs for this compiler - and specify it with the appropriate --with-mpi-include --with-mpi-lib options, [instead of the above] Satish On Fri, 21 Mar 2014, Qin Lu wrote: > Sourcing the .csh files of the compiler fixed the problem. Thanks! However, later it got another error (see the attached configure.log for details): > ? > ******************************************************************************* > ???????? UNABLE to CONFIGURE with GIVEN OPTIONS??? (see configure.log for details): > ------------------------------------------------------------------------------- > Fortran error! mpi_init() could not be located! > ******************************************************************************* > > It seems the configure did not?link the Intel MPI libs. I used --with-mpi-dir to specify the MPI directory, can configure get the correct Intel MPI lib names? 
If I have to specify the lib names (using --with-mpi-lib?), which libs should I specify? I saw a lot of libs under the directory, such as libmpi.a, libmpi_ipl64.a, libmpi_mt.a, etc. > ? > Thanks a lot, > Qin >? > > ________________________________ >? From: Barry Smith > To: Qin Lu > Cc: petsc-users > Sent: Friday, March 21, 2014 10:11 AM > Subject: Re: [petsc-users] Building PETSc with Intel mpi >? > > > ? Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > ? Did it make any difference? > > > On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > > > Hello, > >? > > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached. > >? > > ******************************************************************************* > >? ? ? ? ? ? ? ? ? ?? UNABLE to EXECUTE BINARIES for ./configure > > ------------------------------------------------------------------------------- > > Cannot run executables created with FC. If this machine uses a batch system > > to submit jobs you will need to configure using ./configure with the additional option? --with-batch. > >? Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > ******************************************************************************* > >? > > Thanks a lot for any suggestions abut the problem, > >? > > Regards, > > Qin > > >? > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Mar 22 11:46:35 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 22 Mar 2014 11:46:35 -0500 Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: <1395501352.26228.YahooMailNeo@web160203.mail.bf1.yahoo.com> References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> <1395455573.34700.YahooMailNeo@web160204.mail.bf1.yahoo.com> <1395501352.26228.YahooMailNeo@web160203.mail.bf1.yahoo.com> Message-ID: <501992A5-F022-4E1A-9EB5-42967F0C7FA8@mcs.anl.gov> On Mar 22, 2014, at 10:15 AM, Qin Lu wrote: > The Intel mpi's wrappers to Intel compilers are mpiicc and mpiifort (not mpicc and mpif90), can PETSc's configure automatically pick them? Or I have to specify them explicitly (--with-cc=mpiicc --with-fc=mpiifort --with-mpi-compilers=0)? List --with-cc=mpiicc --with-fc=mpiifort do not list --with-mpi-compilers=0 or ?with-mpi-dir or ?with-mpi-libs Barry > > Thanks, > Qin > > From: Satish Balay > To: Qin Lu > Cc: Barry Smith ; petsc-users > Sent: Saturday, March 22, 2014 12:09 AM > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > --with-mpi-dir=/apps/compilers/intel_2013/impi/4.1.0.024/intel64 --with-mpi-compilers=0 > > Does this mpi not come with mpicc/mpif90 wrappers? If they do - its best to use them. > > If not - its best to look at the docs for this compiler - and specify it with the appropriate > > --with-mpi-include --with-mpi-lib options, [instead of the above] > > Satish > > On Fri, 21 Mar 2014, Qin Lu wrote: > > > Sourcing the .csh files of the compiler fixed the problem. Thanks! 
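In other words, a minimal configure line along the lines Barry recommends is just

    ./configure --with-cc=mpiicc --with-fc=mpiifort

plus whatever unrelated options are wanted. The wrappers already know where the Intel MPI headers and libraries live, so none of --with-mpi-dir, --with-mpi-lib or --with-mpi-compilers=0 should be given on top of them.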
However, later it got another error (see the attached configure.log for details): > > > > ******************************************************************************* > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > > ------------------------------------------------------------------------------- > > Fortran error! mpi_init() could not be located! > > ******************************************************************************* > > > > It seems the configure did not link the Intel MPI libs. I used --with-mpi-dir to specify the MPI directory, can configure get the correct Intel MPI lib names? If I have to specify the lib names (using --with-mpi-lib?), which libs should I specify? I saw a lot of libs under the directory, such as libmpi.a, libmpi_ipl64.a, libmpi_mt.a, etc. > > > > Thanks a lot, > > Qin > > > > > > ________________________________ > > From: Barry Smith > > To: Qin Lu > > Cc: petsc-users > > Sent: Friday, March 21, 2014 10:11 AM > > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > > > > > > Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > > Did it make any difference? > > > > > > On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > > > > > Hello, > > > > > > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached. > > > > > > ******************************************************************************* > > > UNABLE to EXECUTE BINARIES for ./configure > > > ------------------------------------------------------------------------------- > > > Cannot run executables created with FC. If this machine uses a batch system > > > to submit jobs you will need to configure using ./configure with the additional option --with-batch. > > > Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > ******************************************************************************* > > > > > > Thanks a lot for any suggestions abut the problem, > > > > > > Regards, > > > Qin > > > > > > > > > > > > From lu_qin_2000 at yahoo.com Sat Mar 22 14:24:16 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Sat, 22 Mar 2014 12:24:16 -0700 (PDT) Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: <501992A5-F022-4E1A-9EB5-42967F0C7FA8@mcs.anl.gov> References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> <1395455573.34700.YahooMailNeo@web160204.mail.bf1.yahoo.com> <1395501352.26228.YahooMailNeo@web160203.mail.bf1.yahoo.com> <501992A5-F022-4E1A-9EB5-42967F0C7FA8@mcs.anl.gov> Message-ID: <1395516256.48871.YahooMailNeo@web160203.mail.bf1.yahoo.com> I tried what you suggested and got the following error (configure .log attached): ************** sh: ERROR while running executable: Could not execute "/tmp/petsc-cd5uJs/config.setCompilers/conftest": /tmp/petsc-cd5uJs/config.setCompilers/conftest: error while loading shared libraries: /apps/compilers/intel_2013/impi/4.1.0.024/intel64/lib/libmpi.so.4: requires glibc 2.5 or later dynamic linker ****************** ? What can I do about this? ? 
Thanks, Qin ________________________________ From: Barry Smith To: Qin Lu Cc: petsc-users Sent: Saturday, March 22, 2014 11:46 AM Subject: Re: [petsc-users] Building PETSc with Intel mpi On Mar 22, 2014, at 10:15 AM, Qin Lu wrote: > The Intel mpi's wrappers to Intel compilers are mpiicc and mpiifort (not mpicc and mpif90), can PETSc's configure automatically pick them? Or I have to specify them explicitly (--with-cc=mpiicc --with-fc=mpiifort --with-mpi-compilers=0)? ? List ? --with-cc=mpiicc --with-fc=mpiifort? ? do not list --with-mpi-compilers=0 or ?with-mpi-dir or ?with-mpi-libs ? Barry > > Thanks, > Qin > > From: Satish Balay > To: Qin Lu > Cc: Barry Smith ; petsc-users > Sent: Saturday, March 22, 2014 12:09 AM > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > --with-mpi-dir=/apps/compilers/intel_2013/impi/4.1.0.024/intel64 --with-mpi-compilers=0 > > Does this mpi not come with mpicc/mpif90 wrappers? If they do - its best to use them. > > If not - its best to look at the docs for this compiler - and specify it with the appropriate > > --with-mpi-include --with-mpi-lib options, [instead of the above] > > Satish > > On Fri, 21 Mar 2014, Qin Lu wrote: > > > Sourcing the .csh files of the compiler fixed the problem. Thanks! However, later it got another error (see the attached configure.log for details): > >? > > ******************************************************************************* > >? ? ? ? ? UNABLE to CONFIGURE with GIVEN OPTIONS? ? (see configure.log for details): > > ------------------------------------------------------------------------------- > > Fortran error! mpi_init() could not be located! > > ******************************************************************************* > > > > It seems the configure did not link the Intel MPI libs. I used --with-mpi-dir to specify the MPI directory, can configure get the correct Intel MPI lib names? If I have to specify the lib names (using --with-mpi-lib?), which libs should I specify? I saw a lot of libs under the directory, such as libmpi.a, libmpi_ipl64.a, libmpi_mt.a, etc. > >? > > Thanks a lot, > > Qin > >? > > > > ________________________________ > >? From: Barry Smith > > To: Qin Lu > > Cc: petsc-users > > Sent: Friday, March 21, 2014 10:11 AM > > Subject: Re: [petsc-users] Building PETSc with Intel mpi > >? > > > > > >? Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > >? Did it make any difference? > > > > > > On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > > > > > Hello, > > >? > > > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached. > > >? > > > ******************************************************************************* > > >? ? ? ? ? ? ? ? ? ? UNABLE to EXECUTE BINARIES for ./configure > > > ------------------------------------------------------------------------------- > > > Cannot run executables created with FC. If this machine uses a batch system > > > to submit jobs you will need to configure using ./configure with the additional option? --with-batch. > > >? Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > ******************************************************************************* > > >? > > > Thanks a lot for any suggestions abut the problem, > > >? > > > Regards, > > > Qin > > > > >? 
> > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 179361 bytes Desc: not available URL: From balay at mcs.anl.gov Sat Mar 22 17:24:02 2014 From: balay at mcs.anl.gov (Satish Balay) Date: Sat, 22 Mar 2014 17:24:02 -0500 Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: <1395516256.48871.YahooMailNeo@web160203.mail.bf1.yahoo.com> References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> <1395455573.34700.YahooMailNeo@web160204.mail.bf1.yahoo.com> <1395501352.26228.YahooMailNeo@web160203.mail.bf1.yahoo.com> <501992A5-F022-4E1A-9EB5-42967F0C7FA8@mcs.anl.gov> <1395516256.48871.YahooMailNeo@web160203.mail.bf1.yahoo.com> Message-ID: Is this MPI install functional and useable on this machine? Perhaps you should heck with the sysadmin on this machine on the correct way to use this MPI. Googling on this error gives: http://stackoverflow.com/questions/12075403/gcc-reduce-libc-required-version Perhaps you can try the additional conifgure option: LDFLAGS=-static or LDFLAGS="-Wl,--hash-style=both" Satish On Sat, 22 Mar 2014, Qin Lu wrote: > I tried what you suggested and got the following error (configure .log attached): > ************** > sh: > ERROR while running executable: Could not execute "/tmp/petsc-cd5uJs/config.setCompilers/conftest": > /tmp/petsc-cd5uJs/config.setCompilers/conftest: error while loading shared libraries: /apps/compilers/intel_2013/impi/4.1.0.024/intel64/lib/libmpi.so.4: requires glibc 2.5 or later dynamic linker > ****************** > ? > What can I do about this? > ? > Thanks, > Qin > > > ________________________________ > From: Barry Smith > To: Qin Lu > Cc: petsc-users > Sent: Saturday, March 22, 2014 11:46 AM > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > > On Mar 22, 2014, at 10:15 AM, Qin Lu wrote: > > > The Intel mpi's wrappers to Intel compilers are mpiicc and mpiifort (not mpicc and mpif90), can PETSc's configure automatically pick them? Or I have to specify them explicitly (--with-cc=mpiicc --with-fc=mpiifort --with-mpi-compilers=0)? > > ? List > > ? --with-cc=mpiicc --with-fc=mpiifort? > > ? do not list --with-mpi-compilers=0 or ?with-mpi-dir or ?with-mpi-libs > > ? Barry > > > > > > > > Thanks, > > Qin > > > > From: Satish Balay > > To: Qin Lu > > Cc: Barry Smith ; petsc-users > > Sent: Saturday, March 22, 2014 12:09 AM > > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > > > --with-mpi-dir=/apps/compilers/intel_2013/impi/4.1.0.024/intel64 --with-mpi-compilers=0 > > > > Does this mpi not come with mpicc/mpif90 wrappers? If they do - its best to use them. > > > > If not - its best to look at the docs for this compiler - and specify it with the appropriate > > > > --with-mpi-include --with-mpi-lib options, [instead of the above] > > > > Satish > > > > On Fri, 21 Mar 2014, Qin Lu wrote: > > > > > Sourcing the .csh files of the compiler fixed the problem. Thanks! However, later it got another error (see the attached configure.log for details): > > >? > > > ******************************************************************************* > > >? ? ? ? ? UNABLE to CONFIGURE with GIVEN OPTIONS? ? (see configure.log for details): > > > ------------------------------------------------------------------------------- > > > Fortran error! mpi_init() could not be located! 
> > > ******************************************************************************* > > > > > > It seems the configure did not link the Intel MPI libs. I used --with-mpi-dir to specify the MPI directory, can configure get the correct Intel MPI lib names? If I have to specify the lib names (using --with-mpi-lib?), which libs should I specify? I saw a lot of libs under the directory, such as libmpi.a, libmpi_ipl64.a, libmpi_mt.a, etc. > > >? > > > Thanks a lot, > > > Qin > > >? > > > > > > ________________________________ > > >? From: Barry Smith > > > To: Qin Lu > > > Cc: petsc-users > > > Sent: Friday, March 21, 2014 10:11 AM > > > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > >? > > > > > > > > >? Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > > > >? Did it make any difference? > > > > > > > > > On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > > > > > > > Hello, > > > >? > > > > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached. > > > >? > > > > ******************************************************************************* > > > >? ? ? ? ? ? ? ? ? ? UNABLE to EXECUTE BINARIES for ./configure > > > > ------------------------------------------------------------------------------- > > > > Cannot run executables created with FC. If this machine uses a batch system > > > > to submit jobs you will need to configure using ./configure with the additional option? --with-batch. > > > >? Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? > > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > > ******************************************************************************* > > > >? > > > > Thanks a lot for any suggestions abut the problem, > > > >? > > > > Regards, > > > > Qin > > > > > > >? > > > > > > > > > > > > From bsmith at mcs.anl.gov Sat Mar 22 17:30:13 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 22 Mar 2014 17:30:13 -0500 Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: <1395516256.48871.YahooMailNeo@web160203.mail.bf1.yahoo.com> References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> <1395455573.34700.YahooMailNeo@web160204.mail.bf1.yahoo.com> <1395501352.26228.YahooMailNeo@web160203.mail.bf1.yahoo.com> <501992A5-F022-4E1A-9EB5-42967F0C7FA8@mcs.anl.gov> <1395516256.48871.YahooMailNeo@web160203.mail.bf1.yahoo.com> Message-ID: <87FFE616-426C-499D-860A-5C6CF7D0302B@mcs.anl.gov> There is something wrong with your environment. It is compiling the trivial program int main() { ; return 0; } Pushing language C Popping language C sh: mpiicc -o /tmp/petsc-cd5uJs/config.setCompilers/conftest /tmp/petsc-cd5uJs/config.setCompilers/conftest.o Executing: mpiicc -o /tmp/petsc-cd5uJs/config.setCompilers/conftest /tmp/petsc-cd5uJs/config.setCompilers/conftest.o sh: Executing: /tmp/petsc-cd5uJs/config.setCompilers/conftest sh: /tmp/petsc-cd5uJs/config.setCompilers/conftest Executing: /tmp/petsc-cd5uJs/config.setCompilers/conftest sh: ERROR while running executable: Could not execute "/tmp/petsc-cd5uJs/config.setCompilers/conftest": /tmp/petsc-cd5uJs/config.setCompilers/conftest: error while loading shared libraries: /apps/compilers/intel_2013/impi/4.1.0.024/intel64/lib/libmpi.so.4: requires glibc 2.5 or later dynamic linker but the resulting program cannot be run. 
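A quick way to see what is failing at load time, independent of configure (standard Linux tools, not commands taken from this thread), is to inspect the test executable and the system glibc directly:

    ldd /tmp/petsc-cd5uJs/config.setCompilers/conftest    # lists the needed shared libraries and which cannot be resolved
    ldd --version                                         # first line reports the glibc version on GNU systems

(The conftest path is the one from the log; configure normally cleans these files up, so any freshly built executable linked with mpiicc can be inspected instead.)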
Can you compile and run a simple MPI program on this machine with mpiicc? Do the following:

  printf 'int main(){ return 0;}' > simple.c
  mpiicc -o simple simple.c
  ./simple

Does it work? On Mar 22, 2014, at 2:24 PM, Qin Lu wrote: > I tried what you suggested and got the following error (configure.log attached): > ************** > sh: > ERROR while running executable: Could not execute "/tmp/petsc-cd5uJs/config.setCompilers/conftest": > /tmp/petsc-cd5uJs/config.setCompilers/conftest: error while loading shared libraries: /apps/compilers/intel_2013/impi/4.1.0.024/intel64/lib/libmpi.so.4: requires glibc 2.5 or later dynamic linker > ****************** > > What can I do about this? > > Thanks, > Qin > > From: Barry Smith > To: Qin Lu > Cc: petsc-users > Sent: Saturday, March 22, 2014 11:46 AM > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > On Mar 22, 2014, at 10:15 AM, Qin Lu wrote: > > > The Intel mpi's wrappers to Intel compilers are mpiicc and mpiifort (not mpicc and mpif90), can PETSc's configure automatically pick them? Or I have to specify them explicitly (--with-cc=mpiicc --with-fc=mpiifort --with-mpi-compilers=0)? > > List > > --with-cc=mpiicc --with-fc=mpiifort > > do not list --with-mpi-compilers=0 or --with-mpi-dir or --with-mpi-lib > > Barry > > > > > > > > Thanks, > > Qin > > > > From: Satish Balay > > To: Qin Lu > > Cc: Barry Smith ; petsc-users > > Sent: Saturday, March 22, 2014 12:09 AM > > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > > > --with-mpi-dir=/apps/compilers/intel_2013/impi/4.1.0.024/intel64 --with-mpi-compilers=0 > > > > Does this mpi not come with mpicc/mpif90 wrappers? If they do - its best to use them. > > > > If not - its best to look at the docs for this compiler - and specify it with the appropriate > > > > --with-mpi-include --with-mpi-lib options, [instead of the above] > > > > Satish > > > > On Fri, 21 Mar 2014, Qin Lu wrote: > > > > > Sourcing the .csh files of the compiler fixed the problem. However, later it got another error (see the attached configure.log for details): > > > > > > ******************************************************************************* > > > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): > > > ------------------------------------------------------------------------------- > > > Fortran error! mpi_init() could not be located! > > > ******************************************************************************* > > > > > > It seems the configure did not link the Intel MPI libs. I used --with-mpi-dir to specify the MPI directory, can configure get the correct Intel MPI lib names? If I have to specify the lib names (using --with-mpi-lib?), which libs should I specify? I saw a lot of libs under the directory, such as libmpi.a, libmpi_ipl64.a, libmpi_mt.a, etc. > > > > > > Thanks a lot, > > > Qin > > > > > > > > > ________________________________ > > > From: Barry Smith > > > To: Qin Lu > > > Cc: petsc-users > > > Sent: Friday, March 21, 2014 10:11 AM > > > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > > > > > > > > > > Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > > > > Did it make any difference? > > > > > > > > > On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > > > > > > > Hello, > > > > > > > > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached.
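Barry's three commands above only check that mpiicc can produce an executable the dynamic linker will start; a slightly fuller sanity check that actually initializes MPI (a sketch, not part of the original exchange, and the launcher name may differ for Intel MPI) would be:

    /* simple_mpi.c */
    #include <mpi.h>
    #include <stdio.h>
    int main(int argc, char **argv)
    {
      int rank, size;
      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);
      printf("rank %d of %d is alive\n", rank, size);
      MPI_Finalize();
      return 0;
    }

    mpiicc -o simple_mpi simple_mpi.c
    mpirun -n 2 ./simple_mpi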
> > > > > > > > ******************************************************************************* > > > > UNABLE to EXECUTE BINARIES for ./configure > > > > ------------------------------------------------------------------------------- > > > > Cannot run executables created with FC. If this machine uses a batch system > > > > to submit jobs you will need to configure using ./configure with the additional option --with-batch. > > > > Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? > > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > > ******************************************************************************* > > > > > > > > Thanks a lot for any suggestions abut the problem, > > > > > > > > Regards, > > > > Qin > > > > > > > > > > > > > > > > > > > > > From venkateshgk.j at gmail.com Sun Mar 23 06:56:57 2014 From: venkateshgk.j at gmail.com (venkatesh g) Date: Sun, 23 Mar 2014 17:26:57 +0530 Subject: [petsc-users] reg: Eigenvectors of general complex eigen value problem Message-ID: Hi, I am using SLEPC example EX7.C, I am using complex binary matrices A and B written from matlab using Petscbinarywrite.m. I ran the example using "./ex7 -f1 A -f2 B -eps_type krylovschur -st_type sinvert -evecs out.mat -eps_smallest_magnitude" My plotted my eigenvectors using Petscbinaryread and comparing them to MATLAB's eigs command. It is wrong. It is right only if I use real matrices A and B. I compiled Petsc for complex scalars. How to get the correct eigenvectors for complex matrices, please note my eigenvalue is right but not my eigenvector. Any help is greatly appreciated! Kindly help me out. cheers, Venkatesh Sr. Research Fellow -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Sun Mar 23 07:46:17 2014 From: mfadams at lbl.gov (Mark Adams) Date: Sun, 23 Mar 2014 08:46:17 -0400 Subject: [petsc-users] cray build error In-Reply-To: References: <87ior9jeqw.fsf@jedbrown.org> Message-ID: Yes, I was told: Comments: 2014-03-21 14:44:49 - Harvey Wasserman (Additional comments) 1. If you want to eliminate the "ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not loaded" warnings, then you need to eliminate "module swap PrgEnv-pgi PrgEnv-gnu" from ~/.bashrc.ext . On Sat, Mar 22, 2014 at 1:12 AM, Satish Balay wrote: > >>> > Configure Options: --configModules=PETSc.Configure > --optionsModule=PETSc.compilerOptions --COPTFLAGS="-O3 -ffast-math > -funroll-loops" --CXXOPTFLAGS="-O3 -ffast-math -funroll-loops" > --FOPTFLAGS="-O3 -ffast-math -funroll-loops" --download-parmetis > --download-metis --download-hypre --with-cc=cc --with-clib-autodetect=0 > --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=0 --with-fc=ftn > --with-fortranlib-autodetect=0 --with-shared-libraries=0 --download-mpich > --with-x=0 --with-64-bit-indices PETSC_ARCH=arch-xe6-opt64 > PETSC_DIR=/global/homes/m/madams/petsc_maint/ > <<<<< > > --download-mpich on hopper looks weird. 
> >>>>>>>> > ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > loaded > ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > loaded > ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > loaded > ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > loaded > ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > loaded > ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > loaded > ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not > loaded > <<<<<< > > I see more module errors here.. > > Satish > > On Fri, 21 Mar 2014, Mark Adams wrote: > > > Thanks, that go me further. I build here all the time (Hopper at NERSC) > > but I'm having a hard time here. I've cleaned everything but still get > > these errors. > > > > > > On Thu, Mar 20, 2014 at 7:23 AM, Jed Brown wrote: > > > > > Mark Adams writes: > > > > > > > I get this on Hopper with PETSC maint. Any ideas? > > > > > > The environment is broken: wrapper compiler is adding -lpspline and > > > -lezcdf. Perhaps you need to unload modules or load other modules? > > > > > > | Executing: cc -o > > > /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest > > > > /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest.o > > > | sh: > > > | Possible ERROR while running linker: > ModuleCmd_Switch.c(172):ERROR:152: > > > Module 'PrgEnv-pgi' is currently not loaded > > > | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently > not > > > loaded > > > | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently > not > > > loaded > > > | /usr/bin/ld: cannot find -lpspline > > > | /usr/bin/ld: cannot find -lezcdf > > > | collect2: error: ld returned 1 exit status > > > | output: ret = 256 > > > > > > | _LMFILES_=/opt/modulefiles/modules/3.2.6.6: > > > /usr/syscom/nsg/modulefiles/nsg/1.2.0:/opt/modulefiles/modules/3.2.6.7 > : > > > > 
/opt/cray/xt-asyncpe/default/modulefiles/xtpe-network-gemini:/opt/modulefiles/PrgEnv-gnu/4.2.34:/opt/cray/modulefiles/atp/1.7.0:/opt/modulefiles/xe-sysroot/4.2.34:/opt/cray/gem/modulefiles/switch/1.0-1.0402.45840.2.63.gem:/opt/cray/gem/modulefiles/shared-root/1.0-1.0402.46893.3.17.gem:/opt/cray/gem/modulefiles/pdsh/2.26-1.0402.45278.1.1.gem:/opt/cray/gem/modulefiles/nodehealth/5.1-1.0402.45895.3.76.gem:/opt/cray/gem/modulefiles/lbcd/2.1-1.0402.45245.1.2.gem:/opt/cray/gem/modulefiles/hosts/1.0-1.0402.45251.1.86.gem:/opt/cray/gem/modulefiles/configuration/1.0-1.0402.45284.1.2.gem:/opt/cray/modulefiles/ccm/2.2.0-1.0402.46086.4.120:/opt/cray/gem/modulefiles/audit/1.0.0-1.0402.45273.1.86.gem:/opt/cray/gem/modulefiles/rca/1.0.0-2.0402.47290.7.1.gem:/opt/cray/gem/modulefiles/csa/3.0.0-1_2.0402.45268.1.90.gem:/opt/cray/gem/modulefiles/job/1.5.5-0.1_2.0402.45272.1.5.gem:/opt/cray/gem/modulefiles/xpmem/0.1-2.0402.45248.1.5.gem:/opt/cray/gem/modulefiles/gni-headers/2.1-1.0402.7541.1.5.gem:/opt/cray/gem/modulefiles/dmapp/4.0.1-1.0402.7784.4.1.gem:/opt/cray/gem/modulefiles/pmi/4.0.1-1.0000.9753.86.3.gem:/opt/cray/gem/modulefiles/ugni/5.0-1.0402.7551.1.10.gem:/opt/cray/gem/modulefiles/udreg/2.3.2-1.0402.7546.1.5.gem:/opt/cray/modulefiles/cray-libsci/12.1.01:/opt/modulefiles/gcc/4.8.1:/opt/modulefiles/xt-asyncpe/5.23:/opt/modulefiles/eswrap/1.0.20-1.010102.662.0:/opt/cray/xt-asyncpe/default/modulefiles/craype-mc12:/opt/cray/modulefiles/cray-shmem/6.0.1:/opt/cray/modulefiles/cray-mpich/6.0.1:/opt/modulefiles/torque/4.2.3.h5_notcpretry:/opt/modulefiles/moab/7.2.3-r19-b121-SUSE11:/usr/common/usg/Modules/modulefiles/valgrind/3.8.1:/usr/common/usg/Modules/modulefiles/cmake/ > 2.8.10.1: > > > > /usr/common/graphics/Modules/modulefiles/visit/2.7.0:/opt/cray/modulefiles/papi/5.1.2:/usr/common/usg/Modules/modulefiles/ipm/2.00:/usr/common/usg/Modules/modulefiles/allineatools/4.2-34404:/usr/common/usg/Modules/modulefiles/adios/1.2.1:/usr/common/usg/Modules/modulefiles/pspline/nersc1.0:/opt/cray/modulefiles/cray-hdf5/1.8.11:/opt/cray/modulefiles/cray-netcdf/4.3.0:/usr/common/usg/Modules/modulefiles/matlab/R2012a:/usr/common/usg/Modules/modulefiles/altd/1.0:/usr/common/usg/Modules/modulefiles/usg-default-modules/1.0 > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Sun Mar 23 09:30:11 2014 From: mfadams at lbl.gov (Mark Adams) Date: Sun, 23 Mar 2014 10:30:11 -0400 Subject: [petsc-users] cray build error In-Reply-To: References: <87ior9jeqw.fsf@jedbrown.org> Message-ID: I fixed this by not downloading MPI. I used to not need to specify MPI at all. But I seemed to when I use 'maint'. On Sun, Mar 23, 2014 at 8:46 AM, Mark Adams wrote: > Yes, I was told: > > Comments: > 2014-03-21 14:44:49 - Harvey Wasserman (Additional comments) > 1. If you want to eliminate the "ModuleCmd_Switch.c(172):ERROR:152: Module > 'PrgEnv-pgi' is currently not loaded" warnings, > then you need to eliminate "module swap PrgEnv-pgi PrgEnv-gnu" from > ~/.bashrc.ext . 
> > On Sat, Mar 22, 2014 at 1:12 AM, Satish Balay wrote: > >> >>> >> Configure Options: --configModules=PETSc.Configure >> --optionsModule=PETSc.compilerOptions --COPTFLAGS="-O3 -ffast-math >> -funroll-loops" --CXXOPTFLAGS="-O3 -ffast-math -funroll-loops" >> --FOPTFLAGS="-O3 -ffast-math -funroll-loops" --download-parmetis >> --download-metis --download-hypre --with-cc=cc --with-clib-autodetect=0 >> --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=0 --with-fc=ftn >> --with-fortranlib-autodetect=0 --with-shared-libraries=0 --download-mpich >> --with-x=0 --with-64-bit-indices PETSC_ARCH=arch-xe6-opt64 >> PETSC_DIR=/global/homes/m/madams/petsc_maint/ >> <<<<< >> >> --download-mpich on hopper looks weird. > > >> >>>>>>>> >> ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not >> loaded >> ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not >> loaded >> ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not >> loaded >> ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not >> loaded >> ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not >> loaded >> ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not >> loaded >> ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently not >> loaded >> <<<<<< >> >> I see more module errors here.. >> >> Satish >> >> On Fri, 21 Mar 2014, Mark Adams wrote: >> >> > Thanks, that go me further. I build here all the time (Hopper at NERSC) >> > but I'm having a hard time here. I've cleaned everything but still get >> > these errors. >> > >> > >> > On Thu, Mar 20, 2014 at 7:23 AM, Jed Brown wrote: >> > >> > > Mark Adams writes: >> > > >> > > > I get this on Hopper with PETSC maint. Any ideas? >> > > >> > > The environment is broken: wrapper compiler is adding -lpspline and >> > > -lezcdf. Perhaps you need to unload modules or load other modules? 
>> > > >> > > | Executing: cc -o >> > > /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest >> > > >> /scratch/scratchdirs/madams/petsc-pDazNU/config.setCompilers/conftest.o >> > > | sh: >> > > | Possible ERROR while running linker: >> ModuleCmd_Switch.c(172):ERROR:152: >> > > Module 'PrgEnv-pgi' is currently not loaded >> > > | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently >> not >> > > loaded >> > > | ModuleCmd_Switch.c(172):ERROR:152: Module 'PrgEnv-pgi' is currently >> not >> > > loaded >> > > | /usr/bin/ld: cannot find -lpspline >> > > | /usr/bin/ld: cannot find -lezcdf >> > > | collect2: error: ld returned 1 exit status >> > > | output: ret = 256 >> > > >> > > | _LMFILES_=/opt/modulefiles/modules/3.2.6.6: >> > > /usr/syscom/nsg/modulefiles/nsg/1.2.0:/opt/modulefiles/modules/ >> 3.2.6.7: >> > > >> /opt/cray/xt-asyncpe/default/modulefiles/xtpe-network-gemini:/opt/modulefiles/PrgEnv-gnu/4.2.34:/opt/cray/modulefiles/atp/1.7.0:/opt/modulefiles/xe-sysroot/4.2.34:/opt/cray/gem/modulefiles/switch/1.0-1.0402.45840.2.63.gem:/opt/cray/gem/modulefiles/shared-root/1.0-1.0402.46893.3.17.gem:/opt/cray/gem/modulefiles/pdsh/2.26-1.0402.45278.1.1.gem:/opt/cray/gem/modulefiles/nodehealth/5.1-1.0402.45895.3.76.gem:/opt/cray/gem/modulefiles/lbcd/2.1-1.0402.45245.1.2.gem:/opt/cray/gem/modulefiles/hosts/1.0-1.0402.45251.1.86.gem:/opt/cray/gem/modulefiles/configuration/1.0-1.0402.45284.1.2.gem:/opt/cray/modulefiles/ccm/2.2.0-1.0402.46086.4.120:/opt/cray/gem/modulefiles/audit/1.0.0-1.0402.45273.1.86.gem:/opt/cray/gem/modulefiles/rca/1.0.0-2.0402.47290.7.1.gem:/opt/cray/gem/modulefiles/csa/3.0.0-1_2.0402.45268.1.90.gem:/opt/cray/gem/modulefiles/job/1.5.5-0.1_2.0402.45272.1.5.gem:/opt/cray/gem/modulefiles/xpmem/0.1-2.0402.45248.1.5.gem:/opt/cray/gem/modulefiles/gni-headers/2.1-1.0402.7541.1.5.gem:/opt/cray/gem/modulefiles/dmapp/4.0.1-1.0402.7784.4.1.gem:/opt/cray/gem/modulefiles/pmi/4.0.1-1.0000.9753.86.3.gem:/opt/cray/gem/modulefiles/ugni/5.0-1.0402.7551.1.10.gem:/opt/cray/gem/modulefiles/udreg/2.3.2-1.0402.7546.1.5.gem:/opt/cray/modulefiles/cray-libsci/12.1.01:/opt/modulefiles/gcc/4.8.1:/opt/modulefiles/xt-asyncpe/5.23:/opt/modulefiles/eswrap/1.0.20-1.010102.662.0:/opt/cray/xt-asyncpe/default/modulefiles/craype-mc12:/opt/cray/modulefiles/cray-shmem/6.0.1:/opt/cray/modulefiles/cray-mpich/6.0.1:/opt/modulefiles/torque/4.2.3.h5_notcpretry:/opt/modulefiles/moab/7.2.3-r19-b121-SUSE11:/usr/common/usg/Modules/modulefiles/valgrind/3.8.1:/usr/common/usg/Modules/modulefiles/cmake/ >> 2.8.10.1: >> > > >> /usr/common/graphics/Modules/modulefiles/visit/2.7.0:/opt/cray/modulefiles/papi/5.1.2:/usr/common/usg/Modules/modulefiles/ipm/2.00:/usr/common/usg/Modules/modulefiles/allineatools/4.2-34404:/usr/common/usg/Modules/modulefiles/adios/1.2.1:/usr/common/usg/Modules/modulefiles/pspline/nersc1.0:/opt/cray/modulefiles/cray-hdf5/1.8.11:/opt/cray/modulefiles/cray-netcdf/4.3.0:/usr/common/usg/Modules/modulefiles/matlab/R2012a:/usr/common/usg/Modules/modulefiles/altd/1.0:/usr/common/usg/Modules/modulefiles/usg-default-modules/1.0 >> > > >> > >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Sun Mar 23 11:16:35 2014 From: jroman at dsic.upv.es (Jose E. 
Roman) Date: Sun, 23 Mar 2014 17:16:35 +0100 Subject: [petsc-users] reg: Eigenvectors of general complex eigen value problem In-Reply-To: References: Message-ID: El 23/03/2014, a las 12:56, venkatesh g escribi?: > Hi, > > I am using SLEPC example EX7.C, I am using complex binary matrices A and B written from matlab using Petscbinarywrite.m. > > I ran the example using "./ex7 -f1 A -f2 B -eps_type krylovschur -st_type sinvert -evecs out.mat -eps_smallest_magnitude" You should not use -st_type sinvert together with -eps_smallest_magnitude. Read the documentation. > > My plotted my eigenvectors using Petscbinaryread and comparing them to MATLAB's eigs command. > > It is wrong. It is right only if I use real matrices A and B. > > I compiled Petsc for complex scalars. How to get the correct eigenvectors for complex matrices, please note my eigenvalue is right but not my eigenvector. How do you know the eigenvectors are wrong? Did you check the residuals? If you want to compare the eigenvectors to the output of eigs, you may need to normalize them in the same way, for instance with v=v/max(v). Jose > > Any help is greatly appreciated! Kindly help me out. > > cheers, > Venkatesh > Sr. Research Fellow > > From venkateshgk.j at gmail.com Sun Mar 23 23:59:59 2014 From: venkateshgk.j at gmail.com (venkatesh g) Date: Mon, 24 Mar 2014 10:29:59 +0530 Subject: [petsc-users] reg: Eigenvectors of general complex eigen value problem In-Reply-To: References: Message-ID: Ok. I used -st_type sinvert only, I am comparing it eigs after normalization. Even now the eigenvectors differ. Whether I should use a different eps solver ? Pls let me know. On Sun, Mar 23, 2014 at 9:46 PM, Jose E. Roman wrote: > > El 23/03/2014, a las 12:56, venkatesh g escribi?: > > > Hi, > > > > I am using SLEPC example EX7.C, I am using complex binary matrices A and > B written from matlab using Petscbinarywrite.m. > > > > I ran the example using "./ex7 -f1 A -f2 B -eps_type krylovschur > -st_type sinvert -evecs out.mat -eps_smallest_magnitude" > > You should not use -st_type sinvert together with -eps_smallest_magnitude. > Read the documentation. > > > > > My plotted my eigenvectors using Petscbinaryread and comparing them to > MATLAB's eigs command. > > > > It is wrong. It is right only if I use real matrices A and B. > > > > I compiled Petsc for complex scalars. How to get the correct > eigenvectors for complex matrices, please note my eigenvalue is right but > not my eigenvector. > > How do you know the eigenvectors are wrong? Did you check the residuals? > If you want to compare the eigenvectors to the output of eigs, you may need > to normalize them in the same way, for instance with v=v/max(v). > > Jose > > > > > Any help is greatly appreciated! Kindly help me out. > > > > cheers, > > Venkatesh > > Sr. Research Fellow > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Mon Mar 24 03:54:14 2014 From: jroman at dsic.upv.es (Jose E. Roman) Date: Mon, 24 Mar 2014 09:54:14 +0100 Subject: [petsc-users] reg: Eigenvectors of general complex eigen value problem In-Reply-To: References: Message-ID: <40D5A7AA-A412-4C18-AD17-7AD9C5BAFB6A@dsic.upv.es> El 24/03/2014, a las 05:59, venkatesh g escribi?: > Ok. I used -st_type sinvert only, > > I am comparing it eigs after normalization. Even now the eigenvectors differ. > > Whether I should use a different eps solver ? > > Pls let me know. 
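Concretely, the two usual alternatives behind Jose's first comment are to ask for the smallest-magnitude eigenvalues without a spectral transform, or to combine shift-and-invert with a target; the spellings below are a sketch and should be checked against the SLEPc manual for the version in use:

    ./ex7 -f1 A -f2 B -eps_smallest_magnitude
    ./ex7 -f1 A -f2 B -st_type sinvert -eps_target 0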
> It works for me, so you should provide more details about how you are loading the eigenvectors in Matlab and comparing them. Note that the complex version of the example also writes the xi vector (which should not) so you will have zero vectors interleaved with the eigenvectors. If you want to avoid this, make the following change (line 179 of ex7.c): #if !defined(PETSC_USE_COMPLEX) if (!ishermitian) { ierr = VecView(xi,viewer);CHKERRQ(ierr); } #endif Jose From venkateshgk.j at gmail.com Mon Mar 24 04:28:37 2014 From: venkateshgk.j at gmail.com (venkatesh g) Date: Mon, 24 Mar 2014 14:58:37 +0530 Subject: [petsc-users] reg: Eigenvectors of general complex eigen value problem In-Reply-To: <40D5A7AA-A412-4C18-AD17-7AD9C5BAFB6A@dsic.upv.es> References: <40D5A7AA-A412-4C18-AD17-7AD9C5BAFB6A@dsic.upv.es> Message-ID: Ok. I am doing the following 1. Running "./ex7 -f1 A2 -f2 B2 -st_type sinvert -evecs VC" 2. In MATLAB I do "vc=PetscBinaryRead('VC');" The output is one Eigenvector in VC which is 800x1. And I did the normalization vc=vc/max(vc) and compared with the normalized original vector.. Also I must tell that if A2 and B2 are real, it works. I also changed the line 179 of ex7.c like u said. Pls let me know. On Mon, Mar 24, 2014 at 2:24 PM, Jose E. Roman wrote: > > El 24/03/2014, a las 05:59, venkatesh g escribi?: > > > Ok. I used -st_type sinvert only, > > > > I am comparing it eigs after normalization. Even now the eigenvectors > differ. > > > > Whether I should use a different eps solver ? > > > > Pls let me know. > > > > It works for me, so you should provide more details about how you are > loading the eigenvectors in Matlab and comparing them. > > Note that the complex version of the example also writes the xi vector > (which should not) so you will have zero vectors interleaved with the > eigenvectors. If you want to avoid this, make the following change (line > 179 of ex7.c): > > #if !defined(PETSC_USE_COMPLEX) > if (!ishermitian) { ierr = VecView(xi,viewer);CHKERRQ(ierr); } > #endif > > Jose > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Mon Mar 24 04:51:52 2014 From: jroman at dsic.upv.es (Jose E. Roman) Date: Mon, 24 Mar 2014 10:51:52 +0100 Subject: [petsc-users] reg: Eigenvectors of general complex eigen value problem In-Reply-To: References: <40D5A7AA-A412-4C18-AD17-7AD9C5BAFB6A@dsic.upv.es> Message-ID: <922D9AC1-9462-4B74-85A5-05C602ECDC61@dsic.upv.es> El 24/03/2014, a las 10:28, venkatesh g escribi?: > Ok. I am doing the following > > 1. Running "./ex7 -f1 A2 -f2 B2 -st_type sinvert -evecs VC" > 2. In MATLAB I do "vc=PetscBinaryRead('VC');" > > The output is one Eigenvector in VC which is 800x1. > > And I did the normalization vc=vc/max(vc) and compared with the normalized original vector.. Also I must tell that if A2 and B2 are real, it works. > > I also changed the line 179 of ex7.c like u said. > > Pls let me know. > You should do vc=PetscBinaryRead('VC','complex',true); If you want to load several eigenvectors, then vc=PetscBinaryRead('VC','complex',true,'cell',nconv); where nconv is the number of converged eigenpairs reported by SLEPc. In that case, the result is a cell array so vc{1} is the first eigenvector, vc{2} the second one, and so on. 
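A sketch of that comparison in MATLAB, following Jose's description (nconv, the use of the first eigenpair, and the normalization by the largest-magnitude entry are illustrative assumptions, since eigenvector ordering and scaling are not unique):

    vc = PetscBinaryRead('VC','complex',true,'cell',nconv);
    v = vc{1};                          % first eigenvector computed by SLEPc
    [~,k] = max(abs(v));  v = v/v(k);   % remove the arbitrary complex scaling
    [V,D] = eigs(A2,B2,nconv,'sm');     % reference eigenpairs from MATLAB
    w = V(:,1);
    [~,k] = max(abs(w));  w = w/w(k);
    norm(v-w)                           % small if the two eigenvectors agree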
Jose From venkateshgk.j at gmail.com Mon Mar 24 05:45:35 2014 From: venkateshgk.j at gmail.com (venkatesh g) Date: Mon, 24 Mar 2014 16:15:35 +0530 Subject: [petsc-users] reg: Eigenvectors of general complex eigen value problem In-Reply-To: <922D9AC1-9462-4B74-85A5-05C602ECDC61@dsic.upv.es> References: <40D5A7AA-A412-4C18-AD17-7AD9C5BAFB6A@dsic.upv.es> <922D9AC1-9462-4B74-85A5-05C602ECDC61@dsic.upv.es> Message-ID: Thanks a lot. I did what you told. It is working now like a charm. However, I have another query, which I will post separately. On Mon, Mar 24, 2014 at 3:21 PM, Jose E. Roman wrote: > > El 24/03/2014, a las 10:28, venkatesh g escribi?: > > > Ok. I am doing the following > > > > 1. Running "./ex7 -f1 A2 -f2 B2 -st_type sinvert -evecs VC" > > 2. In MATLAB I do "vc=PetscBinaryRead('VC');" > > > > The output is one Eigenvector in VC which is 800x1. > > > > And I did the normalization vc=vc/max(vc) and compared with the > normalized original vector.. Also I must tell that if A2 and B2 are real, > it works. > > > > I also changed the line 179 of ex7.c like u said. > > > > Pls let me know. > > > > You should do > > vc=PetscBinaryRead('VC','complex',true); > > If you want to load several eigenvectors, then > > vc=PetscBinaryRead('VC','complex',true,'cell',nconv); > > where nconv is the number of converged eigenpairs reported by SLEPc. In > that case, the result is a cell array so vc{1} is the first eigenvector, > vc{2} the second one, and so on. > > Jose > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.kramer at imperial.ac.uk Mon Mar 24 12:47:20 2014 From: s.kramer at imperial.ac.uk (Stephan Kramer) Date: Mon, 24 Mar 2014 17:47:20 +0000 Subject: [petsc-users] gamg failure with petsc-dev In-Reply-To: <532C23D6.7000400@imperial.ac.uk> References: <532B19AD.50105@imperial.ac.uk> <87lhw4yy9n.fsf@jedbrown.org> <532C23D6.7000400@imperial.ac.uk> Message-ID: <53306FA8.4040001@imperial.ac.uk> On 21/03/14 11:34, Stephan Kramer wrote: > On 21/03/14 04:24, Jed Brown wrote: >> Stephan Kramer writes: >> >>> We have been having some problems with GAMG on petsc-dev (master) for >>> cases that worked fine on petsc 3.4. We're solving a Stokes equation >>> (just the velocity block) for a simple convection in a square box >>> (isoviscous). The problem only occurs if we supply a near null space >>> (via MatSetNearNullSpace) where we supply the usual (1,0) (0,1) and >>> (-y,x) (near) null space vectors. If we supply those, the smoother >>> complains that the diagonal of the A matrix at the first coarsened >>> level contains a zero. If I dump out the prolongator from the finest >>> to the first coarsened level it indeed contains a zero column at that >>> same index. We're pretty confident that the fine level A matrix is >>> correct (it solves fine with LU). I've briefly spoken to Matt about >>> this and he suggested trying to run with -pc_gamg_agg_nsmooths 0 (as >>> the default changed from 3.4 -> dev) but that didn't make any >>> difference, the dumped out prolongator still has zero columns, and it >>> crashes in the same way. Do you have any further suggestions what to >>> try and how to further debug this? >> >> Do you set the block size? Can you reproduce by modifying >> src/ksp/ksp/examples/tutorials/ex49.c (plane strain elasticity)? >> > > I don't set a block size, no. 
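For readers following this GAMG thread: the rigid-body near null space Stephan mentions ((1,0), (0,1) and (-y,x) in 2D) is typically attached to the operator with something like the sketch below, where coords is assumed to be a Vec holding the nodal coordinates with the same block layout as the matrix:

    MatNullSpace nearnull;
    ierr = MatNullSpaceCreateRigidBody(coords,&nearnull);CHKERRQ(ierr);
    ierr = MatSetNearNullSpace(A,nearnull);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);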
About ex49: Ah great, with master (just updated now) I get: > > [skramer at stommel]{/data/stephan/git/petsc/src/ksp/ksp/examples/tutorials}$ ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Arguments are incompatible > [0]PETSC ERROR: Zero diagonal on row 1 > [0]PETSC ERROR: See http://http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-3671-gbb161d1 GIT Date: 2014-03-21 01:14:15 +0000 > [0]PETSC ERROR: ./ex49 on a linux-gnu-c-opt named stommel by skramer Fri Mar 21 11:25:55 2014 > [0]PETSC ERROR: Configure options --download-fblaslapack=1 --download-blacs=1 --download-scalapack=1 --download-ptscotch=1 --download-mumps=1 --download-hypre=1 --download-suitesparse=1 --download-ml=1 > [0]PETSC ERROR: #1 MatInvertDiagonal_SeqAIJ() line 1728 in /data/stephan/git/petsc/src/mat/impls/aij/seq/aij.c > [0]PETSC ERROR: #2 MatSOR_SeqAIJ() line 1760 in /data/stephan/git/petsc/src/mat/impls/aij/seq/aij.c > [0]PETSC ERROR: #3 MatSOR() line 3734 in /data/stephan/git/petsc/src/mat/interface/matrix.c > [0]PETSC ERROR: #4 PCApply_SOR() line 35 in /data/stephan/git/petsc/src/ksp/pc/impls/sor/sor.c > [0]PETSC ERROR: #5 PCApply() line 440 in /data/stephan/git/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: #6 KSP_PCApply() line 227 in /data/stephan/git/petsc/include/petsc-private/kspimpl.h > [0]PETSC ERROR: #7 KSPSolve_Chebyshev() line 456 in /data/stephan/git/petsc/src/ksp/ksp/impls/cheby/cheby.c > [0]PETSC ERROR: #8 KSPSolve() line 458 in /data/stephan/git/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: #9 PCMGMCycle_Private() line 19 in /data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: #10 PCMGMCycle_Private() line 48 in /data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: #11 PCApply_MG() line 330 in /data/stephan/git/petsc/src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: #12 PCApply() line 440 in /data/stephan/git/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: #13 KSP_PCApply() line 227 in /data/stephan/git/petsc/include/petsc-private/kspimpl.h > [0]PETSC ERROR: #14 KSPInitialResidual() line 63 in /data/stephan/git/petsc/src/ksp/ksp/interface/itres.c > [0]PETSC ERROR: #15 KSPSolve_GMRES() line 234 in /data/stephan/git/petsc/src/ksp/ksp/impls/gmres/gmres.c > [0]PETSC ERROR: #16 KSPSolve() line 458 in /data/stephan/git/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: #17 solve_elasticity_2d() line 1053 in /data/stephan/git/petsc/src/ksp/ksp/examples/tutorials/ex49.c > [0]PETSC ERROR: #18 main() line 1104 in /data/stephan/git/petsc/src/ksp/ksp/examples/tutorials/ex49.c > [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- > > Which is the same error we were getting on our problem > Cheers > Stephan > > Ok, I found out a bit more. The fact that the prolongator has zero columns appears to arise in petsc 3.4 as well. The only reason it wasn't flagged before is that the default for the smoother (not the aggregation smoother but the standard pre and post smoothing) changed from jacobi to sor. 
I can make the example work with the additional option: $ ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode -elas_mg_levels_1_pc_type jacobi Vice versa, if in petsc 3.4.4 I change ex49 to include the near nullspace (the /* constrain near-null space bit */) at the end, it works with jacobi (the default in 3.4) but it breaks with sor with the same error message as above. I'm not entirely sure why jacobi doesn't give an error with a zero on the diagonal, but the zero column also means that the related coarse dof doesn't actually affect the fine grid solution. I think (but I might be barking up the wrong tree here) that the zero columns appear because the aggregation method typically will have a few small aggregates that are not big enough to support the polynomials of the near null space (i.e. the polynomials restricted to an aggregate are not linearly independent). A solution would be to reduce the number of polynomials for these aggregates (only take the linearly independent). Obviously this has the down-side that the degrees of freedom per aggregate at the coarse level is no longer a constant making the administration more complicated. It would be nice to find a solution though as I've always been taught that jacobi is not a robust smoother for multigrid. Cheers Stephan From gbisht at lbl.gov Mon Mar 24 13:37:49 2014 From: gbisht at lbl.gov (Gautam Bisht) Date: Mon, 24 Mar 2014 11:37:49 -0700 Subject: [petsc-users] Local vecs of a DMComposite() in F90 code Message-ID: Hi, I'm trying to write a fortran code that uses TS with DMComposite for a multiphysics problem. How can I access local vectors created using a DMCreateGlobalVector() from a Fortran code? I tried DMCompositeGetLocalVectorsF90(), but I believe there isn't such a function. Also, what is the difference between creating vectors/matrices as nested versus letting DMComposite() handle the creation of vec/mat? Thanks, -Gautam. -------------- next part -------------- An HTML attachment was scrubbed... URL: From qiyuelu1 at gmail.com Mon Mar 24 14:54:26 2014 From: qiyuelu1 at gmail.com (Qiyue Lu) Date: Mon, 24 Mar 2014 14:54:26 -0500 Subject: [petsc-users] hypre-boomeramg precondition Message-ID: Dear All: I am solving a series of matrix which are from the same FEA model but have different number of meshes. In petsc, I use CR as the solver and hypre-boomeramg as the preconditioner. It works for a small system with 2 million DOFs. Also works for a system with 6 million DOFs. But always fail with an error ' memory out of range' for a 10 or 20 million system. Additionally, with solver CR + preconditioner bjacobi, Petsc could solve 10 and 20 million and even large systems easily. Also, with using top command in linux by monitoring the memory usage, I saw some peaks in the memory. So, I am wondering, is this because the memory usage of boomeramg, that larger systems always fail? Thanks Qiyue Lu -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Mar 24 15:00:25 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 24 Mar 2014 15:00:25 -0500 Subject: [petsc-users] hypre-boomeramg precondition In-Reply-To: References: Message-ID: <6E733C95-EF7E-4915-8563-532EAFF0C924@mcs.anl.gov> On Mar 24, 2014, at 2:54 PM, Qiyue Lu wrote: > Dear All: > > I am solving a series of matrix which are from the same FEA model but have different number of meshes. In petsc, I use CR as the solver and hypre-boomeramg as the preconditioner. It works for a small system with 2 million DOFs. 
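For reference, this solver/preconditioner pair is normally selected through the options database; an illustrative set of options (the exact invocation used for these runs is not shown) would be:

  -ksp_type cr -pc_type hypre -pc_hypre_type boomeramg -ksp_monitor

or, in code, KSPSetType(ksp, KSPCR) together with PCSetType(pc, PCHYPRE) and PCHYPRESetType(pc, "boomeramg").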
Also works for a system with 6 million DOFs. But always fail with an error ' memory out of range' for a 10 or 20 million system. > > Additionally, with solver CR + preconditioner bjacobi, Petsc could solve 10 and 20 million and even large systems easily. > > Also, with using top command in linux by monitoring the memory usage, I saw some peaks in the memory. > > So, I am wondering, is this because the memory usage of boomeramg, that larger systems always fail? BoomerAMG does require a good amount more memory than ?simpler? solvers. There is nothing one can do about it except use more MPI processes with more total memory for larger problems. Barry > > Thanks > > Qiyue Lu From lu_qin_2000 at yahoo.com Mon Mar 24 18:08:01 2014 From: lu_qin_2000 at yahoo.com (Qin Lu) Date: Mon, 24 Mar 2014 16:08:01 -0700 (PDT) Subject: [petsc-users] Building PETSc with Intel mpi In-Reply-To: <87FFE616-426C-499D-860A-5C6CF7D0302B@mcs.anl.gov> References: <1395413153.15393.YahooMailNeo@web160201.mail.bf1.yahoo.com> <7D85BAD9-6B3E-4B6A-ACC8-EBF5FC6D4ADA@mcs.anl.gov> <1395455573.34700.YahooMailNeo@web160204.mail.bf1.yahoo.com> <1395501352.26228.YahooMailNeo@web160203.mail.bf1.yahoo.com> <501992A5-F022-4E1A-9EB5-42967F0C7FA8@mcs.anl.gov> <1395516256.48871.YahooMailNeo@web160203.mail.bf1.yahoo.com> <87FFE616-426C-499D-860A-5C6CF7D0302B@mcs.anl.gov> Message-ID: <1395702481.42349.YahooMailNeo@web160205.mail.bf1.yahoo.com> It turns out the machine's Linux/glibc versions are too old. The configure passed the error after I switched to a new machine. ? Thanks a lot, Qin ________________________________ From: Barry Smith To: Qin Lu Cc: petsc-users Sent: Saturday, March 22, 2014 5:30 PM Subject: Re: [petsc-users] Building PETSc with Intel mpi ? There is something wrong with your environment. It is compiling the trivial program int main() { ; ? return 0; } ? ? ? ? ? ? ? ? ? ? ? ? Pushing language C ? ? ? ? ? ? ? ? ? ? ? ? Popping language C sh: mpiicc? -o /tmp/petsc-cd5uJs/config.setCompilers/conftest? ? /tmp/petsc-cd5uJs/config.setCompilers/conftest.o Executing: mpiicc? -o /tmp/petsc-cd5uJs/config.setCompilers/conftest? ? /tmp/petsc-cd5uJs/config.setCompilers/conftest.o sh: Executing: /tmp/petsc-cd5uJs/config.setCompilers/conftest sh: /tmp/petsc-cd5uJs/config.setCompilers/conftest Executing: /tmp/petsc-cd5uJs/config.setCompilers/conftest sh: ERROR while running executable: Could not execute "/tmp/petsc-cd5uJs/config.setCompilers/conftest": /tmp/petsc-cd5uJs/config.setCompilers/conftest: error while loading shared libraries: /apps/compilers/intel_2013/impi/4.1.0.024/intel64/lib/libmpi.so.4: requires glibc 2.5 or later dynamic linker but the resulting program cannot be run. Can you compile and run a simple MPI program on this machine with mpiicc ? Do the following printf ?int main(){ return 0;} > simple.c mpiicc -o simple simple.c ./simplec does it work? On Mar 22, 2014, at 2:24 PM, Qin Lu wrote: > I tried what you suggested and got the following error (configure .log attached): > ************** > sh: > ERROR while running executable: Could not execute "/tmp/petsc-cd5uJs/config.setCompilers/conftest": > /tmp/petsc-cd5uJs/config.setCompilers/conftest: error while loading shared libraries: /apps/compilers/intel_2013/impi/4.1.0.024/intel64/lib/libmpi.so.4: requires glibc 2.5 or later dynamic linker > ****************** >? > What can I do about this? >? 
> Thanks, > Qin > > From: Barry Smith > To: Qin Lu > Cc: petsc-users > Sent: Saturday, March 22, 2014 11:46 AM > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > On Mar 22, 2014, at 10:15 AM, Qin Lu wrote: > > > The Intel mpi's wrappers to Intel compilers are mpiicc and mpiifort (not mpicc and mpif90), can PETSc's configure automatically pick them? Or I have to specify them explicitly (--with-cc=mpiicc --with-fc=mpiifort --with-mpi-compilers=0)? > >? List > >? --with-cc=mpiicc --with-fc=mpiifort? > >? do not list --with-mpi-compilers=0 or ?with-mpi-dir or ?with-mpi-libs > >? Barry > > > > > > > > Thanks, > > Qin > > > > From: Satish Balay > > To: Qin Lu > > Cc: Barry Smith ; petsc-users > > Sent: Saturday, March 22, 2014 12:09 AM > > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > > > > --with-mpi-dir=/apps/compilers/intel_2013/impi/4.1.0.024/intel64 --with-mpi-compilers=0 > > > > Does this mpi not come with mpicc/mpif90 wrappers? If they do - its best to use them. > > > > If not - its best to look at the docs for this compiler - and specify it with the appropriate > > > > --with-mpi-include --with-mpi-lib options, [instead of the above] > > > > Satish > > > > On Fri, 21 Mar 2014, Qin Lu wrote: > > > > > Sourcing the .csh files of the compiler fixed the problem. Thanks! However, later it got another error (see the attached configure.log for details): > > >? > > > ******************************************************************************* > > >? ? ? ? ? UNABLE to CONFIGURE with GIVEN OPTIONS? ? (see configure.log for details): > > > ------------------------------------------------------------------------------- > > > Fortran error! mpi_init() could not be located! > > > ******************************************************************************* > > > > > > It seems the configure did not link the Intel MPI libs. I used --with-mpi-dir to specify the MPI directory, can configure get the correct Intel MPI lib names? If I have to specify the lib names (using --with-mpi-lib?), which libs should I specify? I saw a lot of libs under the directory, such as libmpi.a, libmpi_ipl64.a, libmpi_mt.a, etc. > > >? > > > Thanks a lot, > > > Qin > > >? > > > > > > ________________________________ > > >? From: Barry Smith > > > To: Qin Lu > > > Cc: petsc-users > > > Sent: Friday, March 21, 2014 10:11 AM > > > Subject: Re: [petsc-users] Building PETSc with Intel mpi > > >? > > > > > > > > >? Did you follow the directions here: http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > > > >? Did it make any difference? > > > > > > > > > On Mar 21, 2014, at 9:45 AM, Qin Lu wrote: > > > > > > > Hello, > > > >? > > > > I was trying to build PETSc-3.4.2 with Intel MPI using Intel-2013 compilers in Linux, but got the error below. The configure.log is attached. > > > >? > > > > ******************************************************************************* > > > >? ? ? ? ? ? ? ? ? ? UNABLE to EXECUTE BINARIES for ./configure > > > > ------------------------------------------------------------------------------- > > > > Cannot run executables created with FC. If this machine uses a batch system > > > > to submit jobs you will need to configure using ./configure with the additional option? --with-batch. > > > >? Otherwise there is problem with the compilers. Can you compile and run code with your C/C++ (and maybe Fortran) compilers? 
> > > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > > > ******************************************************************************* > > > >? > > > > Thanks a lot for any suggestions abut the problem, > > > >? > > > > Regards, > > > > Qin > > > > > > >? > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dkumar2 at vcu.edu Mon Mar 24 15:56:51 2014 From: dkumar2 at vcu.edu (Dinesh Kumar) Date: Mon, 24 Mar 2014 16:56:51 -0400 Subject: [petsc-users] vector of struct Message-ID: <53309C13.2040302@vcu.edu> Hi, I am trying to implement a 3-D Surface Registration code using PetSc. I want to create a vector of structures i.e. struct Point { double x, y, z; }; Then create a PetSc vector that stores array of "Points types". Can someone point me to the right direction. regards --dinesh From natacha.bereux at gmail.com Tue Mar 25 03:06:49 2014 From: natacha.bereux at gmail.com (Natacha BEREUX) Date: Tue, 25 Mar 2014 09:06:49 +0100 Subject: [petsc-users] Sparse QR factorization Message-ID: Dear all, I would like to compute the QR factorization of a rectangular sparse matrix. I found in PETSc documentation that PETSc is interfaced with some modules of SuiteSparse. I am wondering if there is a way to call SPQR to compute this factorization? Thanks a lot, Natacha -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 25 05:21:46 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 25 Mar 2014 05:21:46 -0500 Subject: [petsc-users] vector of struct In-Reply-To: <53309C13.2040302@vcu.edu> References: <53309C13.2040302@vcu.edu> Message-ID: On Mon, Mar 24, 2014 at 3:56 PM, Dinesh Kumar wrote: > Hi, > > I am trying to implement a 3-D Surface Registration code using PetSc. I > want to create a vector of structures i.e. > > struct Point { > double x, y, z; > }; > > Then create a PetSc vector that stores array of "Points types". Can > someone point me to the right direction. > Make a Vec with block size 3. Then you can always address it using Point pointers, and you can cast the output pointer to Point *. Matt > regards > --dinesh > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Mar 25 06:51:39 2014 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 25 Mar 2014 06:51:39 -0500 Subject: [petsc-users] vector of struct In-Reply-To: References: <53309C13.2040302@vcu.edu> Message-ID: On Mar 25, 2014, at 5:21 AM, Matthew Knepley wrote: > On Mon, Mar 24, 2014 at 3:56 PM, Dinesh Kumar wrote: > Hi, > > I am trying to implement a 3-D Surface Registration code using PetSc. I > want to create a vector of structures i.e. > > struct Point { > double x, y, z; > }; > > Then create a PetSc vector that stores array of "Points types". Can > someone point me to the right direction. > > Make a Vec with block size 3. Then you can always address it using Point pointers, > and you can cast the output pointer to Point *. Here means use either VecGetArray() or DMDAVecGetArray() to access the array inside the vector. Barry > > Matt > > regards > --dinesh > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
> -- Norbert Wiener From fischega at westinghouse.com Tue Mar 25 11:41:07 2014 From: fischega at westinghouse.com (Fischer, Greg A.) Date: Tue, 25 Mar 2014 12:41:07 -0400 Subject: [petsc-users] function that returns the coordinates in the DMDA group? Message-ID: Hello, The FAQ indicates: The MPI_Cart_create() first divides the mesh along the z direction, then the y, then the x. DMDA divides along the x, then y, then z. Is there a PETSc function call that returns the coordinates of the calling process in the DMDA group? Thanks, Greg -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 25 11:49:57 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 25 Mar 2014 11:49:57 -0500 Subject: [petsc-users] function that returns the coordinates in the DMDA group? In-Reply-To: References: Message-ID: On Tue, Mar 25, 2014 at 11:41 AM, Fischer, Greg A. < fischega at westinghouse.com> wrote: > Hello, > > > > The FAQ indicates: > > > > The MPI_Cart_create() first divides the mesh along the z > direction, then the y, then the x. DMDA divides along the x, then y, then z. > > > > Is there a PETSc function call that returns the coordinates of the calling > process in the DMDA group? > I am not sure I understand exactly what you want. Can you do a small 2D example, with 4 or 6 cells? Thanks, Matt > Thanks, > > Greg > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From fischega at westinghouse.com Tue Mar 25 12:01:30 2014 From: fischega at westinghouse.com (Fischer, Greg A.) Date: Tue, 25 Mar 2014 13:01:30 -0400 Subject: [petsc-users] function that returns the coordinates in the DMDA group? In-Reply-To: References: Message-ID: From: Matthew Knepley [mailto:knepley at gmail.com] Sent: Tuesday, March 25, 2014 12:50 PM To: Fischer, Greg A. Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] function that returns the coordinates in the DMDA group? On Tue, Mar 25, 2014 at 11:41 AM, Fischer, Greg A. > wrote: Hello, The FAQ indicates: The MPI_Cart_create() first divides the mesh along the z direction, then the y, then the x. DMDA divides along the x, then y, then z. Is there a PETSc function call that returns the coordinates of the calling process in the DMDA group? I am not sure I understand exactly what you want. Can you do a small 2D example, with 4 or 6 cells? In a 2D DMDA, my understanding is that the process ranks would be arranged as: 2 3 0 1 I would like to be able to call some function and have it return coordinates: [Rank 0] (x,y) = (0, 0) [Rank 1] (x,y) = (0, 1) [Rank 2] (x,y) = (1, 0) [Rank 3] (x,y) = (1, 1) Greg Thanks, Matt Thanks, Greg -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Mar 25 12:06:40 2014 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 25 Mar 2014 12:06:40 -0500 Subject: [petsc-users] function that returns the coordinates in the DMDA group? In-Reply-To: References: Message-ID: On Tue, Mar 25, 2014 at 12:01 PM, Fischer, Greg A. 
< fischega at westinghouse.com> wrote: > > > *From:* Matthew Knepley [mailto:knepley at gmail.com] > *Sent:* Tuesday, March 25, 2014 12:50 PM > *To:* Fischer, Greg A. > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] function that returns the coordinates in the > DMDA group? > > > > On Tue, Mar 25, 2014 at 11:41 AM, Fischer, Greg A. < > fischega at westinghouse.com> wrote: > > Hello, > > > > The FAQ indicates: > > > > The MPI_Cart_create() first divides the mesh along the z > direction, then the y, then the x. DMDA divides along the x, then y, then z. > > > > Is there a PETSc function call that returns the coordinates of the calling > process in the DMDA group? > > > > I am not sure I understand exactly what you want. Can you do a small 2D > example, with 4 or 6 cells? > > > > > > In a 2D DMDA, my understanding is that the process ranks would be arranged > as: > > > > 2 3 > > 0 1 > > > > I would like to be able to call some function and have it return > coordinates: > > > > [Rank 0] (x,y) = (0, 0) > > [Rank 1] (x,y) = (0, 1) > > [Rank 2] (x,y) = (1, 0) > > [Rank 3] (x,y) = (1, 1) > We do not have an API method for that. We give you the rank, (m,n,p) for the number of processors in each direction, and that it is numbered lexicographically. Thanks, Matt > Greg > > > > Thanks, > > > > Matt > > > > Thanks, > > Greg > > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From gbisht at lbl.gov Tue Mar 25 12:29:36 2014 From: gbisht at lbl.gov (Gautam Bisht) Date: Tue, 25 Mar 2014 10:29:36 -0700 Subject: [petsc-users] Local vecs of a DMComposite() in F90 code In-Reply-To: References: Message-ID: Hi, I was able to solve my problem by using DMCompositeGetLocalISs() and VecGetSubVector(). -Gautam. On Mon, Mar 24, 2014 at 11:37 AM, Gautam Bisht wrote: > Hi, > > I'm trying to write a fortran code that uses TS with DMComposite for a > multiphysics problem. How can I access local vectors created using a > DMCreateGlobalVector() from a Fortran code? I tried > DMCompositeGetLocalVectorsF90(), but I believe there isn't such a function. > > Also, what is the difference between creating vectors/matrices as nested > versus letting DMComposite() handle the creation of vec/mat? > > Thanks, > -Gautam. > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From m.bahaa.eldein at gmail.com Wed Mar 26 08:28:45 2014 From: m.bahaa.eldein at gmail.com (Mohammad Bahaa) Date: Wed, 26 Mar 2014 15:28:45 +0200 Subject: [petsc-users] Custom vector owenrship ranges In-Reply-To: References: Message-ID: Actually I tried your suggestion, and it works fine, but it's slightly different from what I need, since each process should have access to other processes' values, since there's some sort of interaction, so I need process 0 (for instance) to be capable of accessing (just reading) value out of its ownership range On Wed, Mar 19, 2014 at 1:16 AM, Matthew Knepley wrote: > On Tue, Mar 18, 2014 at 11:00 AM, Mohammad Bahaa > wrote: > >> I used >> call VecCreateMPIWithArray(PETSC_COMM_WORLD,1,nc,ncall,myx,xall,ierr) >> >> however, when I use process 0 to write a file containing the combined >> values (the xall vector), the values seem not to be updated by some >> processes, eventhough I use PetscBarrier, in other words, values locally >> owned by processes 0 and 2 are ok, but those owned by process 1 & 3 aren't ! >> > > For collective writes, use VecView() or -vec_view > > Matt > > >> On Tue, Mar 18, 2014 at 3:43 PM, Mohammad Bahaa > > wrote: >> >>> the second approach of the MPI vector did it for me, thanks >>> >>> >>> On Tue, Mar 18, 2014 at 3:20 PM, Mohammad Bahaa < >>> m.bahaa.eldein at gmail.com> wrote: >>> >>>> Forgive me as my expression "sum up" was misguiding or misplaced, I >>>> didn't mean to literally sum the values in the vectors, I meant I want to >>>> put all values from each local vector into one global vector that can be >>>> accessed by all processes, "COMM_WORLD" communicator for instance >>>> >>>> >>>> On Tue, Mar 18, 2014 at 3:09 PM, Matthew Knepley wrote: >>>> >>>>> On Tue, Mar 18, 2014 at 7:53 AM, Mohammad Bahaa < >>>>> m.bahaa.eldein at gmail.com> wrote: >>>>> >>>>>> I'm using "PETSC_COMM_SELF" communicator for running n serial >>>>>> independent processes, I need to sum up a certain vector from the n >>>>>> processes in one vector, however, vectors involved in each process vary in >>>>>> size, and I couldn't find any function to define custom ownership ranges, >>>>>> so assuming I have a 4 processes run with each computing an "x" vector as >>>>>> follows: >>>>>> >>>>>> 1. process (1) with x of length 51 >>>>>> 2. process (2) with x of length 49 >>>>>> 3. process (3) with x of length 52 >>>>>> 4. process (4) with x of length 48 >>>>>> >>>>> >>>>> Let your local length be n, so that on proc 3 n== 52. Then >>>>> >>>>> VecCreate(comm, &v); >>>>> VecSetSizes(v, n, PETSC_DETERMINE); >>>>> VecSetFromOptions(v); >>>>> >>>>> VecSum(v, &sum); >>>>> >>>>> You could also make a parallel Vec from your Seq vecs: >>>>> >>>>> VecGetArray(lv, &array); >>>>> VecCreateMPIWithArray(comm, 1, n, PETSC_DETERMINE, array, &v); >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> The processes sum up to 100 elements, when I define a vector "x_all" >>>>>> of size "100" with "PETSC_COMM_WORLD" communicator, the ownership >>>>>> ranges are equal, which isn't the case, how to customize them ? >>>>>> >>>>>> -- >>>>>> Mohamamd Bahaa ElDin >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. 
>>>>> -- Norbert Wiener >>>>> >>>> >>>> >>>> >>>> -- >>>> Mohamamd Bahaa ElDin >>>> >>> >>> >>> >>> -- >>> Mohamamd Bahaa ElDin >>> >> >> >> >> -- >> Mohamamd Bahaa ElDin >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Mohamamd Bahaa ElDin -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Mar 26 08:36:10 2014 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 26 Mar 2014 08:36:10 -0500 Subject: [petsc-users] Custom vector owenrship ranges In-Reply-To: References: Message-ID: On Wed, Mar 26, 2014 at 8:28 AM, Mohammad Bahaa wrote: > Actually I tried your suggestion, and it works fine, but it's slightly > different from what I need, since each process should have access to other > processes' values, since there's some sort of interaction, so I need > process 0 (for instance) to be capable of accessing (just reading) value > out of its ownership range > Random access defeats the purpose of parallel computing. If the access is structured, like a halo region, then you can use a VecScatter to map between Vecs with and without a halo. See the manual chapter on the DMDA object for a discussion of this in the case of structured meshes. Thanks, Matt > On Wed, Mar 19, 2014 at 1:16 AM, Matthew Knepley wrote: > >> On Tue, Mar 18, 2014 at 11:00 AM, Mohammad Bahaa < >> m.bahaa.eldein at gmail.com> wrote: >> >>> I used >>> call VecCreateMPIWithArray(PETSC_COMM_WORLD,1,nc,ncall,myx,xall,ierr) >>> >>> however, when I use process 0 to write a file containing the combined >>> values (the xall vector), the values seem not to be updated by some >>> processes, eventhough I use PetscBarrier, in other words, values locally >>> owned by processes 0 and 2 are ok, but those owned by process 1 & 3 aren't ! >>> >> >> For collective writes, use VecView() or -vec_view >> >> Matt >> >> >>> On Tue, Mar 18, 2014 at 3:43 PM, Mohammad Bahaa < >>> m.bahaa.eldein at gmail.com> wrote: >>> >>>> the second approach of the MPI vector did it for me, thanks >>>> >>>> >>>> On Tue, Mar 18, 2014 at 3:20 PM, Mohammad Bahaa < >>>> m.bahaa.eldein at gmail.com> wrote: >>>> >>>>> Forgive me as my expression "sum up" was misguiding or misplaced, I >>>>> didn't mean to literally sum the values in the vectors, I meant I want to >>>>> put all values from each local vector into one global vector that can be >>>>> accessed by all processes, "COMM_WORLD" communicator for instance >>>>> >>>>> >>>>> On Tue, Mar 18, 2014 at 3:09 PM, Matthew Knepley wrote: >>>>> >>>>>> On Tue, Mar 18, 2014 at 7:53 AM, Mohammad Bahaa < >>>>>> m.bahaa.eldein at gmail.com> wrote: >>>>>> >>>>>>> I'm using "PETSC_COMM_SELF" communicator for running n serial >>>>>>> independent processes, I need to sum up a certain vector from the n >>>>>>> processes in one vector, however, vectors involved in each process vary in >>>>>>> size, and I couldn't find any function to define custom ownership ranges, >>>>>>> so assuming I have a 4 processes run with each computing an "x" vector as >>>>>>> follows: >>>>>>> >>>>>>> 1. process (1) with x of length 51 >>>>>>> 2. process (2) with x of length 49 >>>>>>> 3. process (3) with x of length 52 >>>>>>> 4. process (4) with x of length 48 >>>>>>> >>>>>> >>>>>> Let your local length be n, so that on proc 3 n== 52. 
Then >>>>>> >>>>>> VecCreate(comm, &v); >>>>>> VecSetSizes(v, n, PETSC_DETERMINE); >>>>>> VecSetFromOptions(v); >>>>>> >>>>>> VecSum(v, &sum); >>>>>> >>>>>> You could also make a parallel Vec from your Seq vecs: >>>>>> >>>>>> VecGetArray(lv, &array); >>>>>> VecCreateMPIWithArray(comm, 1, n, PETSC_DETERMINE, array, &v); >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> The processes sum up to 100 elements, when I define a vector "x_all" >>>>>>> of size "100" with "PETSC_COMM_WORLD" communicator, the ownership >>>>>>> ranges are equal, which isn't the case, how to customize them ? >>>>>>> >>>>>>> -- >>>>>>> Mohamamd Bahaa ElDin >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> Mohamamd Bahaa ElDin >>>>> >>>> >>>> >>>> >>>> -- >>>> Mohamamd Bahaa ElDin >>>> >>> >>> >>> >>> -- >>> Mohamamd Bahaa ElDin >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > Mohamamd Bahaa ElDin > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From zonexo at gmail.com Wed Mar 26 09:59:25 2014 From: zonexo at gmail.com (TAY wee-beng) Date: Wed, 26 Mar 2014 22:59:25 +0800 Subject: [petsc-users] Insufficient memory when using GMRES + Boomeramg Message-ID: <5332EB4D.5060802@gmail.com> Hi, I am running a CFD solver. The Poisson eqn was originally solved using HYPRE's geometric multigrid. Recently, I tested it with Boomeramg as the preconditioner and GMRES as the ksp solver. There's a 20% increase in speed. However, when I increased the grid resolution, I got the out of memory error. Changing the solver back to HYPRE solved the problem. So does GMRES + Boomeramg used more memory than other solvers? Are there alternatives? Thank you. -- Yours sincerely, TAY wee-beng From knepley at gmail.com Wed Mar 26 10:22:28 2014 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 26 Mar 2014 10:22:28 -0500 Subject: [petsc-users] Insufficient memory when using GMRES + Boomeramg In-Reply-To: <5332EB4D.5060802@gmail.com> References: <5332EB4D.5060802@gmail.com> Message-ID: On Wed, Mar 26, 2014 at 9:59 AM, TAY wee-beng wrote: > Hi, > > I am running a CFD solver. The Poisson eqn was originally solved using > HYPRE's geometric multigrid. > Is this on a structured grid? Matt > Recently, I tested it with Boomeramg as the preconditioner and GMRES as > the ksp solver. There's a 20% increase in speed. > > However, when I increased the grid resolution, I got the out of memory > error. Changing the solver back to HYPRE solved the problem. > > So does GMRES + Boomeramg used more memory than other solvers? Are there > alternatives? > > Thank you. > > -- > Yours sincerely, > > TAY wee-beng > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From zonexo at gmail.com Wed Mar 26 10:42:56 2014 From: zonexo at gmail.com (TAY wee-beng) Date: Wed, 26 Mar 2014 23:42:56 +0800 Subject: [petsc-users] Insufficient memory when using GMRES + Boomeramg In-Reply-To: References: <5332EB4D.5060802@gmail.com> Message-ID: <5332F580.30306@gmail.com> On 26/3/2014 11:22 PM, Matthew Knepley wrote: > On Wed, Mar 26, 2014 at 9:59 AM, TAY wee-beng > wrote: > > Hi, > > I am running a CFD solver. The Poisson eqn was originally solved > using HYPRE's geometric multigrid. > > > Is this on a structured grid? Yes. > > Matt > > Recently, I tested it with Boomeramg as the preconditioner and > GMRES as the ksp solver. There's a 20% increase in speed. > > However, when I increased the grid resolution, I got the out of > memory error. Changing the solver back to HYPRE solved the problem. > > So does GMRES + Boomeramg used more memory than other solvers? > Are there alternatives? > > Thank you. > > -- > Yours sincerely, > > TAY wee-beng > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Mar 26 11:14:40 2014 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 26 Mar 2014 11:14:40 -0500 Subject: [petsc-users] Insufficient memory when using GMRES + Boomeramg In-Reply-To: <5332F580.30306@gmail.com> References: <5332EB4D.5060802@gmail.com> <5332F580.30306@gmail.com> Message-ID: On Wed, Mar 26, 2014 at 10:42 AM, TAY wee-beng wrote: > On 26/3/2014 11:22 PM, Matthew Knepley wrote: > > On Wed, Mar 26, 2014 at 9:59 AM, TAY wee-beng wrote: > >> Hi, >> >> I am running a CFD solver. The Poisson eqn was originally solved using >> HYPRE's geometric multigrid. >> > > Is this on a structured grid? > > > Yes. > Then you can replicate the Hypre structured MG with the PCMG, and it can be lighter memory than GAMG. You will need to code the problem in the style of SNES ex5, which is a Poisson for which geometric MG works from the command line. Thanks, Matt > Matt > > >> Recently, I tested it with Boomeramg as the preconditioner and GMRES as >> the ksp solver. There's a 20% increase in speed. >> >> However, when I increased the grid resolution, I got the out of memory >> error. Changing the solver back to HYPRE solved the problem. >> >> So does GMRES + Boomeramg used more memory than other solvers? Are there >> alternatives? >> >> Thank you. >> >> -- >> Yours sincerely, >> >> TAY wee-beng >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From pvsang002 at gmail.com Wed Mar 26 11:55:45 2014 From: pvsang002 at gmail.com (Sang pham van) Date: Wed, 26 Mar 2014 12:55:45 -0400 Subject: [petsc-users] ex42 question Message-ID: Hi Dave, I guess you are the one contributed the ex42 in KSP's examples. I want to modify the example to solve for stokes flow driven by volume force in 3D duct. Please help me to understand the code by answering the following questions: 1. 
Firstly, just for confirmation, the equations you're solving are: \nu * \nabla \cdot \nabla U - \nabla P = 0 and \nabla \cdot U = 0 where U = (Ux,Uy,Uz), \nu is variable viscosity? 2. Are U and P defined at all nodes? (I googled the Q1Q1 element, it looks like a box element with U and P defined at 8 corners). 3. Are nodes' coordinate defined though the DA coordinates? 4. How can I enforce noslip BC, and where should I plug in volume force? Thank you in advance. Regards, Sang -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Mar 26 13:38:27 2014 From: jed at jedbrown.org (Jed Brown) Date: Wed, 26 Mar 2014 14:38:27 -0400 Subject: [petsc-users] ex42 question In-Reply-To: References: Message-ID: <87eh1ost3w.fsf@jedbrown.org> Sang pham van writes: > Hi Dave, > I guess you are the one contributed the ex42 in KSP's examples. I want to > modify the example to solve for stokes flow driven by volume force in 3D > duct. Please help me to understand the code by answering the following > questions: > > 1. Firstly, just for confirmation, the equations you're solving are: > \nu * \nabla \cdot \nabla U - \nabla P = 0 and For variable viscosity, it must be formulated as in the example: \nabla\cdot (\nu D U) - \nabla P = 0 where D U = (\nabla U + (\nabla U)^T)/2 > \nabla \cdot U = 0 > > where U = (Ux,Uy,Uz), \nu is variable viscosity? > > 2. Are U and P defined at all nodes? (I googled the Q1Q1 element, it looks > like a box element with U and P defined at 8 corners). Yes. > 3. Are nodes' coordinate defined though the DA coordinates? Yes, though they are set to be uniform. > 4. How can I enforce noslip BC, and where should I plug in volume force? Enforce the Dirichlet condition for the entire node. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From zonexo at gmail.com Wed Mar 26 19:54:18 2014 From: zonexo at gmail.com (TAY wee-beng) Date: Thu, 27 Mar 2014 08:54:18 +0800 Subject: [petsc-users] Insufficient memory when using GMRES + Boomeramg In-Reply-To: References: <5332EB4D.5060802@gmail.com> <5332F580.30306@gmail.com> Message-ID: <533376BA.4080901@gmail.com> On 27/3/2014 12:14 AM, Matthew Knepley wrote: > On Wed, Mar 26, 2014 at 10:42 AM, TAY wee-beng > wrote: > > On 26/3/2014 11:22 PM, Matthew Knepley wrote: >> On Wed, Mar 26, 2014 at 9:59 AM, TAY wee-beng > > wrote: >> >> Hi, >> >> I am running a CFD solver. The Poisson eqn was originally >> solved using HYPRE's geometric multigrid. >> >> >> Is this on a structured grid? > > Yes. > > > Then you can replicate the Hypre structured MG with the PCMG, and it > can be lighter memory than GAMG. You > will need to code the problem in the style of SNES ex5, which is a > Poisson for which geometric MG works from > the command line. > > Thanks, > > Matt Hi Matt, First of all, is there an easier way out? Is it Boomeramg or GMRES which has a large memory requirement? Will changing e.g. GMRES to FGMRES or other ksp solvers solve the problem? Also, I switched from geometric multigrid (GMG) to Boomeramg because the latter is faster. If I use PCMG, am I going back to the GMG path, which was slower? Thanks! > >> Matt >> >> Recently, I tested it with Boomeramg as the preconditioner >> and GMRES as the ksp solver. There's a 20% increase in speed. >> >> However, when I increased the grid resolution, I got the out >> of memory error. Changing the solver back to HYPRE solved the >> problem. 
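For reference, the Krylov workspace of GMRES grows with the restart length, so two option-level ways to trim its memory (illustrative sketches, not options taken from this thread) are to shorten the restart or to switch to a short-recurrence method:

  -ksp_type gmres -ksp_gmres_restart 10 -pc_type hypre -pc_hypre_type boomeramg
  -ksp_type bcgs -pc_type hypre -pc_hypre_type boomeramg

That said, much of an AMG preconditioner's footprint sits in its coarse-grid and interpolation operators rather than in the Krylov vectors.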
>> >> So does GMRES + Boomeramg used more memory than other >> solvers? Are there alternatives? >> >> Thank you. >> >> -- >> Yours sincerely, >> >> TAY wee-beng >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to >> which their experiments lead. >> -- Norbert Wiener > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Mar 26 20:15:54 2014 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 26 Mar 2014 20:15:54 -0500 Subject: [petsc-users] Insufficient memory when using GMRES + Boomeramg In-Reply-To: <533376BA.4080901@gmail.com> References: <5332EB4D.5060802@gmail.com> <5332F580.30306@gmail.com> <533376BA.4080901@gmail.com> Message-ID: On Wed, Mar 26, 2014 at 7:54 PM, TAY wee-beng wrote: > On 27/3/2014 12:14 AM, Matthew Knepley wrote: > > On Wed, Mar 26, 2014 at 10:42 AM, TAY wee-beng wrote: > >> On 26/3/2014 11:22 PM, Matthew Knepley wrote: >> >> On Wed, Mar 26, 2014 at 9:59 AM, TAY wee-beng wrote: >> >>> Hi, >>> >>> I am running a CFD solver. The Poisson eqn was originally solved using >>> HYPRE's geometric multigrid. >>> >> >> Is this on a structured grid? >> >> >> Yes. >> > > Then you can replicate the Hypre structured MG with the PCMG, and it can > be lighter memory than GAMG. You > will need to code the problem in the style of SNES ex5, which is a Poisson > for which geometric MG works from > the command line. > > Thanks, > > Matt > > Hi Matt, > > First of all, is there an easier way out? Is it Boomeramg or GMRES which > has a large memory requirement? Will changing e.g. GMRES to FGMRES or other > ksp solvers solve the problem? > You could perhaps use a low-memory Krylov method, like BiCGStab. How big is your Krylov space? > Also, I switched from geometric multigrid (GMG) to Boomeramg because the > latter is faster. If I use PCMG, am I going back to the GMG path, which was > slower? > If multigrid is working correctly, it should take 10 iterates or so. I am guessing something was wrong with the GMG, like coarse BC. Thanks, Matt > Thanks! > > > >> Matt >> >> >>> Recently, I tested it with Boomeramg as the preconditioner and GMRES as >>> the ksp solver. There's a 20% increase in speed. >>> >>> However, when I increased the grid resolution, I got the out of memory >>> error. Changing the solver back to HYPRE solved the problem. >>> >>> So does GMRES + Boomeramg used more memory than other solvers? Are >>> there alternatives? >>> >>> Thank you. >>> >>> -- >>> Yours sincerely, >>> >>> TAY wee-beng >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mfadams at lbl.gov Fri Mar 28 07:50:43 2014 From: mfadams at lbl.gov (Mark Adams) Date: Fri, 28 Mar 2014 08:50:43 -0400 Subject: [petsc-users] GMRES error Message-ID: I get this error in the middle of a GMRES iteration. Any ideas? 0 SNES Function norm 6.067786990073e+17 Residual norms for fsa_ solve. 0 KSP Residual norm 6.067786990073e+17 1 KSP Residual norm 6.037192129591e+17 2 KSP Residual norm 5.852390318109e+17 3 KSP Residual norm 5.562063100790e+17 4 KSP Residual norm 5.284787094948e+17 5 KSP Residual norm 5.183753281867e+17 6 KSP Residual norm 4.813052731918e+17 7 KSP Residual norm 4.571658388437e+17 8 KSP Residual norm 4.326765203018e+17 9 KSP Residual norm 4.044786063878e+17 10 KSP Residual norm 3.852338418686e+17 11 KSP Residual norm 3.533696733608e+17 12 KSP Residual norm 3.242503119640e+17 13 KSP Residual norm 2.946010215248e+17 14 KSP Residual norm 2.642699328492e+17 15 KSP Residual norm 2.365537803535e+17 16 KSP Residual norm 1.841438363878e+17 17 KSP Residual norm 1.333683638359e+17 18 KSP Residual norm 7.325375319129e+16 19 KSP Residual norm 2.964254455676e+16 20 KSP Residual norm 1.216650547542e+16 [3]PETSC ERROR: --------------------- Error Message ------------------------------------ [3]PETSC ERROR: Invalid argument! [3]PETSC ERROR: Scalar value must be same on all processes, argument # 3!
[3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: Petsc Development GIT revision: a9e422f38a085385bc6ffdad704c9d6a22342ac9 GIT Date: 2013-12-31 01:26:24 -0700 [3]PETSC ERROR: See docs/changes/index.html for recent updates. [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [3]PETSC ERROR: See docs/index.html for manual pages. [3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: ../../XGC1_3/xgc2 on a arch-xc30-dbg named nid04306 by madams Fri Mar 28 05:34:54 2014 [3]PETSC ERROR: Libraries linked from /global/homes/m/madams/petsc_private/arch-xc30-dbg/lib [3]PETSC ERROR: Configure run at Tue Dec 31 14:53:42 2013 [3]PETSC ERROR: Configure options --COPTFLAGS="-g -no-ipo" --CXXOPTFLAGS="-g -no-ipo" --FOPTFLAGS="-g -no-ipo" --download-hypre --download-metis --download-parmetis --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=1 --with-fc=ftn --with-fortranlib-autodetect=0 --with-hdf5-dir=/opt/cray/hdf5-parallel/1.8.9/intel/120/ --with-shared-libraries=0 --with-x=0 LIBS=-lstdc++ PETSC_ARCH=arch-xc30-dbg [3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: VecMAXPY() line 1257 in /global/u2/m/madams/petsc_private/src/vec/vec/interface/rvector.c [3]PETSC ERROR: KSPGMRESBuildSoln() line 353 in /global/u2/m/madams/petsc_private/src/ksp/ksp/impls/gmres/gmres.c [3]PETSC ERROR: KSPGMRESCycle() line 211 in /global/u2/m/madams/petsc_private/src/ksp/ksp/impls/gmres/gmres.c [3]PETSC ERROR: KSPSolve_GMRES() line 235 in /global/u2/m/madams/petsc_private/src/ksp/ksp/impls/gmres/gmres.c [3]PETSC ERROR: KSPSolve() line 432 in /global/u2/m/madams/petsc_private/src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: SNESSolve_KSPONLY() line 44 in /global/u2/m/madams/petsc_private/src/snes/impls/ksponly/ksponly.c [3]PETSC ERROR: SNESSolve() line 3812 in /global/u2/m/madams/petsc_private/src/snes/interface/snes.c Rank 3 [Fri Mar 28 05:35:04 2014] [c6-2c1s4n2] application called MPI_Abort(MPI_COMM_WORLD, 62) - process 3 -------------- next part -------------- An HTML attachment was scrubbed... URL: From prbrune at gmail.com Fri Mar 28 08:10:47 2014 From: prbrune at gmail.com (Peter Brune) Date: Fri, 28 Mar 2014 08:10:47 -0500 Subject: [petsc-users] GMRES error In-Reply-To: References: Message-ID: That's what happens when you try to VecAXPY/MAXPY/whatever with a NaN or inf. We should really have a better error message for this. - Peter On Fri, Mar 28, 2014 at 7:50 AM, Mark Adams wrote: > I get this error in the middle of a GMRES iteration. Any ideas? > > 0 SNES Function norm 6.067786990073e+17 > > > Residual norms for fsa_ solve. 
> > > 0 KSP Residual norm 6.067786990073e+17 > > > 1 KSP Residual norm 6.037192129591e+17 > > > 2 KSP Residual norm 5.852390318109e+17 > > > 3 KSP Residual norm 5.562063100790e+17 > > > 4 KSP Residual norm 5.284787094948e+17 > > > 5 KSP Residual norm 5.183753281867e+17 > > > 6 KSP Residual norm 4.813052731918e+17 > > > 7 KSP Residual norm 4.571658388437e+17 > > > 8 KSP Residual norm 4.326765203018e+17 > > > 9 KSP Residual norm 4.044786063878e+17 > > > 10 KSP Residual norm 3.852338418686e+17 > > > 11 KSP Residual norm 3.533696733608e+17 > > > 12 KSP Residual norm 3.242503119640e+17 > > > 13 KSP Residual norm 2.946010215248e+17 > > > 14 KSP Residual norm 2.642699328492e+17 > > > 15 KSP Residual norm 2.365537803535e+17 > > > 16 KSP Residual norm 1.841438363878e+17 > > > 17 KSP Residual norm 1.333683638359e+17 > > > 18 KSP Residual norm 7.325375319129e+16 > > > 19 KSP Residual norm 2.964254455676e+16 > > > 20 KSP Residual norm 1.216650547542e+16 [3]PETSC > ERROR: --------------------- Error Message > ------------------------------------ > > [3]PETSC ERROR: Invalid argument! > > > [3]PETSC ERROR: Scalar value must be same on all processes, argument # 3! > > > [3]PETSC ERROR: > ------------------------------------------------------------------------ > > > [3]PETSC ERROR: Petsc Development GIT revision: > a9e422f38a085385bc6ffdad704c9d6a22342ac9 GIT Date: 2013-12-31 01:26:24 > -0700 > [3]PETSC ERROR: See docs/changes/index.html for recent updates. > > > [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > > [3]PETSC ERROR: See docs/index.html for manual pages. > > > [3]PETSC ERROR: > ------------------------------------------------------------------------ > > > [3]PETSC ERROR: ../../XGC1_3/xgc2 on a arch-xc30-dbg named nid04306 by > madams Fri Mar 28 05:34:54 2014 > > [3]PETSC ERROR: Libraries linked from > /global/homes/m/madams/petsc_private/arch-xc30-dbg/lib > > [3]PETSC ERROR: Configure run at Tue Dec 31 14:53:42 2013 > > > [3]PETSC ERROR: Configure options --COPTFLAGS="-g -no-ipo" > --CXXOPTFLAGS="-g -no-ipo" --FOPTFLAGS="-g -no-ipo" --download-hypre > --download-metis --download-parmetis --with-cc=cc --with-clib-autodetect=0 > --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=1 --with-fc=ftn > --with-fortranlib-autodetect=0 > --with-hdf5-dir=/opt/cray/hdf5-parallel/1.8.9/intel/120/ > --with-shared-libraries=0 --with-x=0 LIBS=-lstdc++ PETSC_ARCH=arch-xc30-dbg > > [3]PETSC ERROR: > ------------------------------------------------------------------------ > > > [3]PETSC ERROR: VecMAXPY() line 1257 in > /global/u2/m/madams/petsc_private/src/vec/vec/interface/rvector.c > > [3]PETSC ERROR: KSPGMRESBuildSoln() line 353 in > /global/u2/m/madams/petsc_private/src/ksp/ksp/impls/gmres/gmres.c > > [3]PETSC ERROR: KSPGMRESCycle() line 211 in > /global/u2/m/madams/petsc_private/src/ksp/ksp/impls/gmres/gmres.c > > [3]PETSC ERROR: KSPSolve_GMRES() line 235 in > /global/u2/m/madams/petsc_private/src/ksp/ksp/impls/gmres/gmres.c > > [3]PETSC ERROR: KSPSolve() line 432 in > /global/u2/m/madams/petsc_private/src/ksp/ksp/interface/itfunc.c > > [3]PETSC ERROR: SNESSolve_KSPONLY() line 44 in > /global/u2/m/madams/petsc_private/src/snes/impls/ksponly/ksponly.c > > [3]PETSC ERROR: SNESSolve() line 3812 in > /global/u2/m/madams/petsc_private/src/snes/interface/snes.c > > Rank 3 [Fri Mar 28 05:35:04 2014] [c6-2c1s4n2] application called > MPI_Abort(MPI_COMM_WORLD, 62) - process 3 > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From luchao at mail.iggcas.ac.cn Fri Mar 28 08:18:37 2014 From: luchao at mail.iggcas.ac.cn (=?GBK?B?wsCzrA==?=) Date: Fri, 28 Mar 2014 21:18:37 +0800 (GMT+08:00) Subject: [petsc-users] tests of MatMPIAIJSetPreallocationCSR Message-ID: <16421e2.af33.14508d79511.Coremail.luchao@mail.iggcas.ac.cn> Your faithfully: Can you tell me whether I need to caculate the exact diag and off-diag arrays related to the nonzeros numbers when I use subroutine MatMPIBAIJSetPreallocationCSR to allocate memory? I have read this website "http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatMPIAIJSetPreallocationCSR.html", but I don't understand it clealy. So can you tell me where can I find the example of MatMPIBAIJSetPreallocationCSR? Thanks! your sincerely LV CHAO 2014/3/28 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Mar 28 08:54:10 2014 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 28 Mar 2014 08:54:10 -0500 Subject: [petsc-users] tests of MatMPIAIJSetPreallocationCSR In-Reply-To: <16421e2.af33.14508d79511.Coremail.luchao@mail.iggcas.ac.cn> References: <16421e2.af33.14508d79511.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: On Fri, Mar 28, 2014 at 8:18 AM, ?? wrote: > > Your faithfully: > Can you tell me whether I need to caculate the exact diag and > off-diag arrays related to the nonzeros numbers when I use subroutine > MatMPIBAIJSetPreallocationCSR to allocate memory? > > I have read this website " > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatMPIAIJSetPreallocationCSR.html", > but I don't understand it clealy. > All preallocation routines behave in the same way. If you give to little memory, it will give an error unless you use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) Matt > So can you tell me where can I find the example of > MatMPIBAIJSetPreallocationCSR? > > Thanks! > > your sincerely > > LV CHAO > > 2014/3/28 > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Fri Mar 28 09:10:12 2014 From: jed at jedbrown.org (Jed Brown) Date: Fri, 28 Mar 2014 08:10:12 -0600 Subject: [petsc-users] tests of MatMPIAIJSetPreallocationCSR In-Reply-To: <16421e2.af33.14508d79511.Coremail.luchao@mail.iggcas.ac.cn> References: <16421e2.af33.14508d79511.Coremail.luchao@mail.iggcas.ac.cn> Message-ID: <87ob0qmn23.fsf@jedbrown.org> ?? writes: > Can you tell me whether I need to caculate the exact diag and > off-diag arrays related to the nonzeros numbers when I use > subroutine MatMPIBAIJSetPreallocationCSR to allocate memory? If you are asking this question, you should use the simpler MatXAIJSetPreallocation(). You can over-estimate, but do not under-estimate. > I have read this website > "http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatMPIAIJSetPreallocationCSR.html", > but I don't understand it clealy. So can you tell me where can I > find the example of MatMPIBAIJSetPreallocationCSR? -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From Vincent.De-Groof at uibk.ac.at Fri Mar 28 11:56:38 2014 From: Vincent.De-Groof at uibk.ac.at (De Groof, Vincent Frans Maria) Date: Fri, 28 Mar 2014 16:56:38 +0000 Subject: [petsc-users] Reusing search directions Message-ID: <17A78B9D13564547AC894B88C159674720378C21@XMBX4.uibk.ac.at> Hi all, for the reanalysis of a set of problems (think sensitivity study, reliability, ... ), I'd like to reuse (some) of the search directions of previous problems I solved. The idea is to orthogonolize the new search directions also to a set of user-defined vectors. This idea is not new, and while browsing through the mailing lists, there have been a few discussions on related topics. 1* Can I access/store the old search directions of an iterative solve. Maybe through KSP? 2* Is there an augmented conjugated gradient implementation available? I've read some discussions on whether or not to implement this since a multigrid approach can reach the same objective. But I don't know if it was implemented or not. thanks, Vincent -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Sat Mar 29 16:14:17 2014 From: mfadams at lbl.gov (Mark Adams) Date: Sat, 29 Mar 2014 17:14:17 -0400 Subject: [petsc-users] Petsc versions Message-ID: I have this code: #if PETSC_VERSION_GT(3,4,0) || !PETSC_VERSION_RELEASE ierr = KSPSetOperators(m_ksp,m_mat,m_mat);CHKERRQ(ierr); #else ierr = KSPSetOperators(m_ksp,m_mat,m_mat,SAME_NONZERO_PATTERN);CHKERRQ(ierr); #endif And it fails with a v3.4 at TACC (eg /opt/apps/intel13/mvapich2_1 _9/petsc/3.4/sandybridge/include). It fails with too few arguments in the first call. So my #ifdefs are not working correctly. I want the second branch. What would be the correct syntax for this? -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sat Mar 29 16:18:17 2014 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 29 Mar 2014 16:18:17 -0500 Subject: [petsc-users] Petsc versions In-Reply-To: References: Message-ID: On Sat, Mar 29, 2014 at 4:14 PM, Mark Adams wrote: > I have this code: > > #if PETSC_VERSION_GT(3,4,0) || !PETSC_VERSION_RELEASE > ierr = KSPSetOperators(m_ksp,m_mat,m_mat);CHKERRQ(ierr); > #else > ierr = > KSPSetOperators(m_ksp,m_mat,m_mat,SAME_NONZERO_PATTERN);CHKERRQ(ierr); > #endif > > And it fails with a v3.4 at TACC (eg /opt/apps/intel13/mvapich2_1 > _9/petsc/3.4/sandybridge/include). It fails with too few arguments in > the first call. So my #ifdefs are not working correctly. I want the > second branch. What would be the correct syntax for this? > Are you sure it is not 3.4.x? Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Sat Mar 29 16:35:01 2014 From: mfadams at lbl.gov (Mark Adams) Date: Sat, 29 Mar 2014 17:35:01 -0400 Subject: [petsc-users] Petsc versions In-Reply-To: References: Message-ID: > > > Are you sure it is not 3.4.x? > No. But I do not see a 3.4.1. Perhaps I should change #if PETSC_VERSION_GT(3,4,0) || !PETSC_VERSION_RELEASE to #if PETSC_VERSION_GE(3,5,0) || !PETSC_VERSION_RELEASE This interface change will not go into a release until 3.5 right? -------------- next part -------------- An HTML attachment was scrubbed... 
From jed at jedbrown.org  Sat Mar 29 16:38:33 2014
From: jed at jedbrown.org (Jed Brown)
Date: Sat, 29 Mar 2014 15:38:33 -0600
Subject: [petsc-users] Petsc versions
In-Reply-To: 
References: 
Message-ID: <87zjk8hehy.fsf@jedbrown.org>

Mark Adams writes:

>>
>>
>> Are you sure it is not 3.4.x?
>>
>
> No.  But I do not see a 3.4.1.
>
> Perhaps I should change
>
> #if PETSC_VERSION_GT(3,4,0) || !PETSC_VERSION_RELEASE
>
> to
>
> #if PETSC_VERSION_GE(3,5,0) || !PETSC_VERSION_RELEASE

Just use PETSC_VERSION_GE(3,5,0).  The comparison operators interpret
unreleased versions as +Infinity.

> This interface change will not go into a release until 3.5, right?

Correct.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL: 
From mfadams at lbl.gov  Sat Mar 29 16:52:30 2014
From: mfadams at lbl.gov (Mark Adams)
Date: Sat, 29 Mar 2014 17:52:30 -0400
Subject: [petsc-users] gamg failure with petsc-dev
In-Reply-To: <53306FA8.4040001@imperial.ac.uk>
References: <532B19AD.50105@imperial.ac.uk> <87lhw4yy9n.fsf@jedbrown.org>
 <532C23D6.7000400@imperial.ac.uk> <53306FA8.4040001@imperial.ac.uk>
Message-ID: 

Sorry for getting to this late.  I think you have basically figured it
out, but there are a few things:

1) You must set the block size of A (bs=2) for the null spaces to work and
for aggregation MG to work properly.  SA-AMG really does not make sense
unless you work at the vertex level, for which we need the block size.

2) You must be right that the zero column is because the aggregation
produced a singleton aggregate, and so the coarse grid is low rank.  This
is not catastrophic; it is like a fake BC equation.  The numerics just
have to work around it.  Jacobi does this.  I will fix SOR.

Mark


> Ok, I found out a bit more.  The fact that the prolongator has zero columns
> appears to arise in petsc 3.4 as well.  The only reason it wasn't flagged
> before is that the default for the smoother (not the aggregation smoother
> but the standard pre and post smoothing) changed from jacobi to sor.  I can
> make the example work with the additional option:
>
> $ ./ex49 -elas_pc_type gamg -mx 100 -my 100 -mat_no_inode
> -elas_mg_levels_1_pc_type jacobi
>
> Vice versa, if in petsc 3.4.4 I change ex49 to include the near nullspace
> (the /* constrain near-null space bit */) at the end, it works with jacobi
> (the default in 3.4) but it breaks with sor with the same error message as
> above.  I'm not entirely sure why jacobi doesn't give an error with a zero
> on the diagonal, but the zero column also means that the related coarse dof
> doesn't actually affect the fine grid solution.
>
> I think (but I might be barking up the wrong tree here) that the zero
> columns appear because the aggregation method typically will have a few
> small aggregates that are not big enough to support the polynomials of the
> near null space (i.e. the polynomials restricted to an aggregate are not
> linearly independent).  A solution would be to reduce the number of
> polynomials for these aggregates (only take the linearly independent ones).
> Obviously this has the down-side that the degrees of freedom per aggregate
> at the coarse level are no longer constant, making the administration more
> complicated.  It would be nice to find a solution though, as I've always
> been taught that jacobi is not a robust smoother for multigrid.
>
> Cheers
> Stephan
>
>
-------------- next part --------------
An HTML attachment was scrubbed...
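[Editor's note: a minimal sketch of the setup implied by Mark's two points
for the 2D elasticity case discussed here. The matrix A and the interlaced
coordinate vector coords (x,y per vertex) are assumed to already exist;
this is an illustration, not the actual ex49 code.]

/* Give smoothed aggregation the vertex block size and the rigid-body
   near null space before the solve. */
MatNullSpace nearnull;

ierr = MatSetBlockSize(A, 2);CHKERRQ(ierr);   /* 2 dof per vertex; normally set before preallocation */
ierr = MatNullSpaceCreateRigidBody(coords, &nearnull);CHKERRQ(ierr);
ierr = MatSetNearNullSpace(A, nearnull);CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);

Run with -pc_type gamg and, until the SOR fix lands, a Jacobi level smoother
(the -elas_mg_levels_1_pc_type jacobi option in Stephan's working command
above), so that the zero diagonals coming from singleton aggregates are
tolerated.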
URL: From salazardetroya at gmail.com Sun Mar 30 13:57:47 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Sun, 30 Mar 2014 13:57:47 -0500 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: Hello everybody I had a question about this example. In the petsc-dev next version, why don't we create a PetscSection in the function SetupSection, but we do it in the function SetupMaterialSection and in the function SetupSection of the petsc-current version. petsc-dev: #undef __FUNCT__ #define __FUNCT__ "SetupSection" PetscErrorCode SetupSection(DM dm, AppCtx *user) { DM cdm = dm; const PetscInt id = 1; PetscErrorCode ierr; PetscFunctionBeginUser; ierr = PetscObjectSetName((PetscObject) user->fe[0], "potential");CHKERRQ(ierr); while (cdm) { ierr = DMSetNumFields(cdm, 1);CHKERRQ(ierr); ierr = DMSetField(cdm, 0, (PetscObject) user->fe[0]);CHKERRQ(ierr); ierr = DMPlexAddBoundary(cdm, user->bcType == DIRICHLET, user->bcType == NEUMANN ? "boundary" : "marker", 0, user->exactFuncs[0], 1, &id, user);CHKERRQ(ierr); ierr = DMPlexGetCoarseDM(cdm, &cdm);CHKERRQ(ierr); } PetscFunctionReturn(0); } It seems that it adds the number of fields directly to the DM, and takes the number of components that were specified in SetupElementCommon, but what about the number of degrees of freedom? Why we added it for the MaterialSection but not for the regular Section. Thanks in advance Miguel On Sat, Mar 15, 2014 at 4:16 PM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > Thanks a lot. > > > On Sat, Mar 15, 2014 at 3:36 PM, Matthew Knepley wrote: > >> On Sat, Mar 15, 2014 at 3:31 PM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> Hello everybody >>> >>> I keep trying to understand this example. I don't have any problems with >>> this example when I run it like this: >>> >>> [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate >>> -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full >>> -show_solution >>> Number of SNES iterations = 5 >>> L_2 Error: 0.107289 >>> Solution >>> Vec Object: 1 MPI processes >>> type: seq >>> 0.484618 >>> >>> However, when I change the boundary conditions to Neumann, I get this >>> error. >>> >>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>> -show_solution >>> >> >> Here you set the order of the element used in bulk, but not on the >> boundary where you condition is, so it defaults to 0. 
In >> order to become more familiar, take a look at the tests that I run here: >> >> >> https://bitbucket.org/petsc/petsc/src/64715f0f033346c10c77b73cf58216d111db8789/config/builder.py?at=master#cl-216 >> >> Matt >> >> [0]PETSC ERROR: --------------------- Error Message >>> -------------------------------------------------------------- >>> [0]PETSC ERROR: Petsc has generated inconsistent data >>> [0]PETSC ERROR: Number of dual basis vectors 0 not equal to dimension 1 >>> [0]PETSC ERROR: See http:// >>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>> shooting. >>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>> GIT Date: 2014-03-04 10:53:30 -0600 >>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sat >>> Mar 15 14:28:05 2014 >>> [0]PETSC ERROR: Configure options --download-mpich >>> --download-scientificpython --download-triangle --download-ctetgen >>> --download-chaco --with-c2html=0 >>> [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in >>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>> [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in >>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>> [0]PETSC ERROR: #3 SetupElementCommon() line 474 in >>> /home/salaza11/workspace/PETSC/ex12.c >>> [0]PETSC ERROR: #4 SetupBdElement() line 559 in >>> /home/salaza11/workspace/PETSC/ex12.c >>> [0]PETSC ERROR: #5 main() line 755 in >>> /home/salaza11/workspace/PETSC/ex12.c >>> [0]PETSC ERROR: ----------------End of Error Message -------send entire >>> error message to petsc-maint at mcs.anl.gov---------- >>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>> [unset]: aborting job: >>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>> >>> I honestly do not know much about using dual spaces in a finite element >>> context. I have been trying to find some material that could help me >>> without much success. I tried to modify the dual space order with the >>> option -petscdualspace_order but I kept getting errors. In particular, I >>> got this when I set it to 1. >>> >>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>> -show_solution -petscdualspace_order 1 >>> [0]PETSC ERROR: PetscTrFreeDefault() called from PetscFESetUp_Basic() >>> line 2492 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>> [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted >>> (probably write past end of array) >>> [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483 in >>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>> [0]PETSC ERROR: --------------------- Error Message >>> -------------------------------------------------------------- >>> [0]PETSC ERROR: Memory corruption: >>> http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind >>> [0]PETSC ERROR: Corrupted memory >>> [0]PETSC ERROR: See http:// >>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>> shooting. 
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>> GIT Date: 2014-03-04 10:53:30 -0600 >>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 Sat >>> Mar 15 14:37:34 2014 >>> [0]PETSC ERROR: Configure options --download-mpich >>> --download-scientificpython --download-triangle --download-ctetgen >>> --download-chaco --with-c2html=0 >>> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in >>> /home/salaza11/petsc/src/sys/memory/mtr.c >>> [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in >>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>> [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in >>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>> [0]PETSC ERROR: #4 SetupElementCommon() line 482 in >>> /home/salaza11/workspace/PETSC/ex12.c >>> [0]PETSC ERROR: #5 SetupElement() line 506 in >>> /home/salaza11/workspace/PETSC/ex12.c >>> [0]PETSC ERROR: #6 main() line 754 in >>> /home/salaza11/workspace/PETSC/ex12.c >>> [0]PETSC ERROR: ----------------End of Error Message -------send entire >>> error message to petsc-maint at mcs.anl.gov---------- >>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>> [unset]: aborting job: >>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>> [salaza11 at maya PETSC]$ >>> >>> >>> Then again, I do not know much what I am doing given my ignorance with >>> respect to the dual spaces in FE. I apologize for that. My questions are: >>> >>> - Where could I find more resources in order to understand the PETSc >>> implementation of dual spaces for FE? >>> - Why does it run with Dirichlet but not with Neumann? >>> >>> Thanks in advance. >>> Miguel. >>> >>> >>> On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley wrote: >>> >>>> On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley wrote: >>>> >>>>> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < >>>>> salazardetroya at gmail.com> wrote: >>>>> >>>>>> I can run it now, thanks. Although if I run it with valgrind 3.5.0 >>>>>> (should I update to the last version?) I get some memory leaks related with >>>>>> the function DMPlexCreateBoxMesh. >>>>>> >>>>> >>>>> I will check it out. >>>>> >>>> >>>> This is now fixed. >>>> >>>> Thanks for finding it >>>> >>>> Matt >>>> >>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 >>>>>> -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>> ==9625== Memcheck, a memory error detector >>>>>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et >>>>>> al. 
>>>>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright >>>>>> info >>>>>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 >>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>> -dm_plex_print_fem 1 >>>>>> ==9625== >>>>>> Local function: >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> 0.25 >>>>>> 1 >>>>>> 0.25 >>>>>> 0.5 >>>>>> 1.25 >>>>>> 1 >>>>>> 1.25 >>>>>> 2 >>>>>> Initial guess >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0.5 >>>>>> L_2 Error: 0.111111 >>>>>> Residual: >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> Initial Residual >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> L_2 Residual: 0 >>>>>> Jacobian: >>>>>> Mat Object: 1 MPI processes >>>>>> type: seqaij >>>>>> row 0: (0, 4) >>>>>> Residual: >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> -2 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> 0 >>>>>> Au - b = Au + F(0) >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0 >>>>>> Linear L_2 Residual: 0 >>>>>> ==9625== >>>>>> ==9625== HEAP SUMMARY: >>>>>> ==9625== in use at exit: 288 bytes in 3 blocks >>>>>> ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 >>>>>> bytes allocated >>>>>> ==9625== >>>>>> ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 of >>>>>> 3 >>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>> ==9625== >>>>>> ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 of >>>>>> 3 >>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>> ==9625== >>>>>> ==9625== 144 bytes in 1 blocks are definitely lost in loss record 3 >>>>>> of 3 >>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >>>>>> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>> ==9625== >>>>>> ==9625== LEAK SUMMARY: >>>>>> ==9625== definitely lost: 288 bytes in 3 blocks >>>>>> ==9625== indirectly lost: 0 bytes in 0 blocks >>>>>> ==9625== possibly lost: 0 bytes in 0 blocks >>>>>> ==9625== still reachable: 0 bytes in 0 blocks >>>>>> ==9625== suppressed: 0 bytes in 0 blocks >>>>>> ==9625== >>>>>> ==9625== For counts of detected and suppressed errors, rerun with: -v >>>>>> ==9625== ERROR SUMMARY: 3 errors 
from 3 contexts (suppressed: 6 from >>>>>> 6) >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>>>>>> salazardetroya at gmail.com> wrote: >>>>>>> >>>>>>>> You are welcome, thanks for your help. >>>>>>>> >>>>>>> >>>>>>> Okay, I have rebuilt completely clean, and ex12 runs for me. Can you >>>>>>> try again after pulling? >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley >>>>>>> > wrote: >>>>>>>> >>>>>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> Thanks. This is what I get. >>>>>>>>>> >>>>>>>>> >>>>>>>>> Okay, this was broken by a new push to master/next in the last few >>>>>>>>> days. I have pushed a fix, >>>>>>>>> however next is currently broken due to a failure to check in a >>>>>>>>> file. This should be fixed shortly, >>>>>>>>> and then ex12 will work. I will mail you when its ready. >>>>>>>>> >>>>>>>>> Thanks for finding this, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> (gdb) cont >>>>>>>>>> Continuing. >>>>>>>>>> >>>>>>>>>> Program received signal SIGSEGV, Segmentation fault. >>>>>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>>>> X=0x168b5b0, >>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >>>>>>>>>> (gdb) where >>>>>>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>>>> X=0x168b5b0, >>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>>>>>> (snes=0x14e9450, >>>>>>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, >>>>>>>>>> ctx=0x1652300) >>>>>>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>>>>>>>> X=0x1622ad0, >>>>>>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>>>>>> at /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> This is what I get at gdb when I type 'where'. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> You have to type 'cont', and then when it fails you type 'where'. 
>>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>>>>>> /lib64/libc.so.6 >>>>>>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize (argc=0x7fff5cd8df2c, >>>>>>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with simplicial >>>>>>>>>>>> finite elements.\nWe solve the Poisson problem in a rectangular\ndomain, >>>>>>>>>>>> using a parallel unstructured mesh (DMPLEX) to discretize it.\n\n\n") >>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>>>>>>>> at >>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>>>>>> >>>>>>>>>>>> The rest of the gdb output is attached. I am a bit ignorant >>>>>>>>>>>> with gdb, I apologize for that. >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de Troya >>>>>>>>>>>>> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks for your response. Sorry I did not have the "next" >>>>>>>>>>>>>> version, but the "master" version. I still have an error though. I followed >>>>>>>>>>>>>> the steps given here ( >>>>>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the >>>>>>>>>>>>>> next version, I configured petsc as above and ran ex12 as above as well, >>>>>>>>>>>>>> getting this error: >>>>>>>>>>>>>> >>>>>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>> 1 >>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>> 1 >>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>> 2 >>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>> 0 >>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Okay, now run with -start_in_debugger, and give me a stack >>>>>>>>>>>>> trace using 'where'. 
>>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>>>> memory corruption errors >>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>> below >>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>>>>> not available, >>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>>>>> the function >>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for >>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>>>> salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in >>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Hi everybody >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>>>>>>>>> specifically run it with the command options: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> We need to start narrowing down differences, because it runs >>>>>>>>>>>>>>> for me and our nightly tests. So, first can >>>>>>>>>>>>>>> you confirm that you are using the latest 'next' branch? 
>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> And I get this output >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>> 3 >>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>>> below >>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start >>>>>>>>>>>>>>>> of the function >>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>>>>>> salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in unknown >>>>>>>>>>>>>>>> file >>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Probably my problems could be on my configuration. I attach >>>>>>>>>>>>>>>> the configure.log. I ran ./configure like this >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> If >>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> is for serial, any chance we can get the options to run >>>>>>>>>>>>>>>>>> in parallel? 
>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Regards >>>>>>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin Alexander >>>>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> These examples all seem to run excepting the >>>>>>>>>>>>>>>>>>>>> following command, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> This is a build problem, but it should affect all the >>>>>>>>>>>>>>>>>>>> runs. Is this reproducible? Can you send configure.log? MKL is the worst. >>>>>>>>>>>>>>>>>>>> If this >>>>>>>>>>>>>>>>>>>> persists, I would just switch to >>>>>>>>>>>>>>>>>>>> --download-f-blas-lapack. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks. I have some advice on options >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> --with-precision=single # I would not use this unless >>>>>>>>>>>>>>>>>>> you are doing something special, like CUDA >>>>>>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching to >>>>>>>>>>>>>>>>>>> C, the build is much faster >>>>>>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. 
I >>>>>>>>>>>>>>>>>>> would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>>>>>> or you can try and find that library. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Hi, This is the next error message after >>>>>>>>>>>>>>>>>>>>>> configuring and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try >>>>>>>>>>>>>>>>>>>>> running >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Let me know if those work. 
>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating >>>>>>>>>>>>>>>>>>>>>> Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X >>>>>>>>>>>>>>>>>>>>>> to find memory corruption errors >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line 531 >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line >>>>>>>>>>>>>>>>>>>>>> 63 /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 >>>>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version >>>>>>>>>>>>>>>>>>>>>>> you suggested. I think I need the triangle package to run this particular >>>>>>>>>>>>>>>>>>>>>>> case. Is there any thing else that appears wrong in what I have done from >>>>>>>>>>>>>>>>>>>>>>> the error messages below: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much >>>>>>>>>>>>>>>>>>>>>> easier to have triangle create them. >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error >>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for >>>>>>>>>>>>>>>>>>>>>>> this object type! >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external >>>>>>>>>>>>>>>>>>>>>>> package support. >>>>>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! 
>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 16:25:53 >>>>>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - >>>>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct >>>>>>>>>>>>>>>>>>>>>>>> entry. 
when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or >>>>>>>>>>>>>>>>>>>>>>>> directory >>>>>>>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to >>>>>>>>>>>>>>>>>>>>>>> work for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits >>>>>>>>>>>>>>>>>>>>>>>>>> and just did a 'make ex12.c' with the following error if this helps? 
: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR ( >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or PETSC_ARCH ( >>>>>>>>>>>>>>>>>>>>>>>>> linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you want >>>>>>>>>>>>>>>>>>>>>>>>> to use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> $ bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>>>>>> from 
> FIAT.reference_element import default_simplex
> ImportError: No module named FIAT.reference_element
>
> I have removed the requirement of generating the header file (its now all
> handled in C). I thought I changed the documentation everywhere (including
> the latest tutorial slides). Can you try running with 'master' (or 'next'),
> and point me toward the old docs?
>
>   Thanks,
>
>      Matt
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener

-- 
*Miguel Angel Salazar de Troya*
Graduate Research Assistant
Department of Mechanical Science and Engineering
University of Illinois at Urbana-Champaign
(217) 550-2360
salaza11 at illinois.edu
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From knepley at gmail.com  Sun Mar 30 14:01:17 2014
From: knepley at gmail.com (Matthew Knepley)
Date: Sun, 30 Mar 2014 14:01:17 -0500
Subject: [petsc-users] Problems running ex12.c
In-Reply-To:
References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu>
 <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu>
 <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu>
 <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu>
 <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu>
 <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu>
 <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu>
 <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu>
Message-ID:

On Sun, Mar 30, 2014 at 1:57 PM, Miguel Angel Salazar de Troya <
salazardetroya at gmail.com> wrote:

> Hello everybody
>
> I had a question about this example. In the petsc-dev 'next' version, why
> don't we create a PetscSection in the function SetupSection, while we do
> create one in the function SetupMaterialSection and in the SetupSection of
> the petsc-current version?
>

1) I wanted to try and make things more automatic for the user

2) I needed a way to automatically lay out data for coarser/finer grids in
unstructured MG

Thus, now when you set the PetscFE into the DM using DMSetField(), it will
automatically create the section on the first call to DMGetDefaultSection().

I do not have a similar provision now for materials, so you create your own
section. I think this is alright until we have some idea of a nicer
interface.

  Thanks,

     Matt

> petsc-dev:
>
> #undef __FUNCT__
> #define __FUNCT__ "SetupSection"
> PetscErrorCode SetupSection(DM dm, AppCtx *user)
> {
>   DM             cdm = dm;
>   const PetscInt id  = 1;
>   PetscErrorCode ierr;
>
>   PetscFunctionBeginUser;
>   ierr = PetscObjectSetName((PetscObject) user->fe[0], "potential");CHKERRQ(ierr);
>   while (cdm) {
>     ierr = DMSetNumFields(cdm, 1);CHKERRQ(ierr);
>     ierr = DMSetField(cdm, 0, (PetscObject) user->fe[0]);CHKERRQ(ierr);
>     ierr = DMPlexAddBoundary(cdm, user->bcType == DIRICHLET, user->bcType == NEUMANN ? "boundary" : "marker", 0, user->exactFuncs[0], 1, &id, user);CHKERRQ(ierr);
>     ierr = DMPlexGetCoarseDM(cdm, &cdm);CHKERRQ(ierr);
>   }
>   PetscFunctionReturn(0);
> }
>
> It seems that it adds the number of fields directly to the DM, and takes
> the number of components that were specified in SetupElementCommon, but
> what about the number of degrees of freedom? Why did we add it for the
> MaterialSection but not for the regular Section?
>
> Thanks in advance
> Miguel
>
>
> On Sat, Mar 15, 2014 at 4:16 PM, Miguel Angel Salazar de Troya <
> salazardetroya at gmail.com> wrote:
>
>> Thanks a lot.
>>
>>
>> On Sat, Mar 15, 2014 at 3:36 PM, Matthew Knepley wrote:
>>
>>> On Sat, Mar 15, 2014 at 3:31 PM, Miguel Angel Salazar de Troya <
>>> salazardetroya at gmail.com> wrote:
>>>
>>>> Hello everybody
>>>>
>>>> I keep trying to understand this example. I don't have any problems
>>>> with this example when I run it like this:
>>>>
>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate
>>>> -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full
>>>> -show_solution
>>>> Number of SNES iterations = 5
>>>> L_2 Error: 0.107289
>>>> Solution
>>>> Vec Object: 1 MPI processes
>>>>   type: seq
>>>> 0.484618
>>>>
>>>> However, when I change the boundary conditions to Neumann, I get this
>>>> error.
>>>> >>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>> -show_solution >>>> >>> >>> Here you set the order of the element used in bulk, but not on the >>> boundary where you condition is, so it defaults to 0. In >>> order to become more familiar, take a look at the tests that I run here: >>> >>> >>> https://bitbucket.org/petsc/petsc/src/64715f0f033346c10c77b73cf58216d111db8789/config/builder.py?at=master#cl-216 >>> >>> Matt >>> >>> [0]PETSC ERROR: --------------------- Error Message >>>> -------------------------------------------------------------- >>>> [0]PETSC ERROR: Petsc has generated inconsistent data >>>> [0]PETSC ERROR: Number of dual basis vectors 0 not equal to dimension 1 >>>> [0]PETSC ERROR: See http:// >>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>> shooting. >>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>>> GIT Date: 2014-03-04 10:53:30 -0600 >>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>> Sat Mar 15 14:28:05 2014 >>>> [0]PETSC ERROR: Configure options --download-mpich >>>> --download-scientificpython --download-triangle --download-ctetgen >>>> --download-chaco --with-c2html=0 >>>> [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in >>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>> [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in >>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>> [0]PETSC ERROR: #3 SetupElementCommon() line 474 in >>>> /home/salaza11/workspace/PETSC/ex12.c >>>> [0]PETSC ERROR: #4 SetupBdElement() line 559 in >>>> /home/salaza11/workspace/PETSC/ex12.c >>>> [0]PETSC ERROR: #5 main() line 755 in >>>> /home/salaza11/workspace/PETSC/ex12.c >>>> [0]PETSC ERROR: ----------------End of Error Message -------send entire >>>> error message to petsc-maint at mcs.anl.gov---------- >>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>> [unset]: aborting job: >>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>> >>>> I honestly do not know much about using dual spaces in a finite element >>>> context. I have been trying to find some material that could help me >>>> without much success. I tried to modify the dual space order with the >>>> option -petscdualspace_order but I kept getting errors. In particular, I >>>> got this when I set it to 1. >>>> >>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>> -show_solution -petscdualspace_order 1 >>>> [0]PETSC ERROR: PetscTrFreeDefault() called from PetscFESetUp_Basic() >>>> line 2492 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>> [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted >>>> (probably write past end of array) >>>> [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483 in >>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>> [0]PETSC ERROR: --------------------- Error Message >>>> -------------------------------------------------------------- >>>> [0]PETSC ERROR: Memory corruption: >>>> http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind >>>> [0]PETSC ERROR: Corrupted memory >>>> [0]PETSC ERROR: See http:// >>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>> shooting. 
>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>>> GIT Date: 2014-03-04 10:53:30 -0600 >>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>> Sat Mar 15 14:37:34 2014 >>>> [0]PETSC ERROR: Configure options --download-mpich >>>> --download-scientificpython --download-triangle --download-ctetgen >>>> --download-chaco --with-c2html=0 >>>> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in >>>> /home/salaza11/petsc/src/sys/memory/mtr.c >>>> [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in >>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>> [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in >>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>> [0]PETSC ERROR: #4 SetupElementCommon() line 482 in >>>> /home/salaza11/workspace/PETSC/ex12.c >>>> [0]PETSC ERROR: #5 SetupElement() line 506 in >>>> /home/salaza11/workspace/PETSC/ex12.c >>>> [0]PETSC ERROR: #6 main() line 754 in >>>> /home/salaza11/workspace/PETSC/ex12.c >>>> [0]PETSC ERROR: ----------------End of Error Message -------send entire >>>> error message to petsc-maint at mcs.anl.gov---------- >>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>> [unset]: aborting job: >>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>> [salaza11 at maya PETSC]$ >>>> >>>> >>>> Then again, I do not know much what I am doing given my ignorance with >>>> respect to the dual spaces in FE. I apologize for that. My questions are: >>>> >>>> - Where could I find more resources in order to understand the PETSc >>>> implementation of dual spaces for FE? >>>> - Why does it run with Dirichlet but not with Neumann? >>>> >>>> Thanks in advance. >>>> Miguel. >>>> >>>> >>>> On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley wrote: >>>> >>>>> On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley wrote: >>>>> >>>>>> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < >>>>>> salazardetroya at gmail.com> wrote: >>>>>> >>>>>>> I can run it now, thanks. Although if I run it with valgrind 3.5.0 >>>>>>> (should I update to the last version?) I get some memory leaks related with >>>>>>> the function DMPlexCreateBoxMesh. >>>>>>> >>>>>> >>>>>> I will check it out. >>>>>> >>>>> >>>>> This is now fixed. >>>>> >>>>> Thanks for finding it >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 >>>>>>> -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>> ==9625== Memcheck, a memory error detector >>>>>>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et >>>>>>> al. 
>>>>>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for >>>>>>> copyright info >>>>>>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 >>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>> -dm_plex_print_fem 1 >>>>>>> ==9625== >>>>>>> Local function: >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> 0.25 >>>>>>> 1 >>>>>>> 0.25 >>>>>>> 0.5 >>>>>>> 1.25 >>>>>>> 1 >>>>>>> 1.25 >>>>>>> 2 >>>>>>> Initial guess >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0.5 >>>>>>> L_2 Error: 0.111111 >>>>>>> Residual: >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> Initial Residual >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> L_2 Residual: 0 >>>>>>> Jacobian: >>>>>>> Mat Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> row 0: (0, 4) >>>>>>> Residual: >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> -2 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> 0 >>>>>>> Au - b = Au + F(0) >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0 >>>>>>> Linear L_2 Residual: 0 >>>>>>> ==9625== >>>>>>> ==9625== HEAP SUMMARY: >>>>>>> ==9625== in use at exit: 288 bytes in 3 blocks >>>>>>> ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 >>>>>>> bytes allocated >>>>>>> ==9625== >>>>>>> ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 >>>>>>> of 3 >>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>> ==9625== >>>>>>> ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 >>>>>>> of 3 >>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>> ==9625== >>>>>>> ==9625== 144 bytes in 1 blocks are definitely lost in loss record 3 >>>>>>> of 3 >>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >>>>>>> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>> ==9625== >>>>>>> ==9625== LEAK SUMMARY: >>>>>>> ==9625== definitely lost: 288 bytes in 3 blocks >>>>>>> ==9625== indirectly lost: 0 bytes in 0 blocks >>>>>>> ==9625== possibly lost: 0 bytes in 0 blocks >>>>>>> ==9625== still reachable: 0 bytes in 0 blocks >>>>>>> ==9625== suppressed: 0 bytes in 0 blocks >>>>>>> ==9625== >>>>>>> 
==9625== For counts of detected and suppressed errors, rerun with: -v >>>>>>> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 from >>>>>>> 6) >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>> >>>>>>>>> You are welcome, thanks for your help. >>>>>>>>> >>>>>>>> >>>>>>>> Okay, I have rebuilt completely clean, and ex12 runs for me. Can >>>>>>>> you try again after pulling? >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley < >>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> Thanks. This is what I get. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Okay, this was broken by a new push to master/next in the last >>>>>>>>>> few days. I have pushed a fix, >>>>>>>>>> however next is currently broken due to a failure to check in a >>>>>>>>>> file. This should be fixed shortly, >>>>>>>>>> and then ex12 will work. I will mail you when its ready. >>>>>>>>>> >>>>>>>>>> Thanks for finding this, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> (gdb) cont >>>>>>>>>>> Continuing. >>>>>>>>>>> >>>>>>>>>>> Program received signal SIGSEGV, Segmentation fault. >>>>>>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>>>>> X=0x168b5b0, >>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>> 882 ierr = PetscFEGetDimension(fe[f], &Nb);CHKERRQ(ierr); >>>>>>>>>>> (gdb) where >>>>>>>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM >>>>>>>>>>> (dm=0x159a180, X=0x168b5b0, >>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, str=0x7fffae6e7970, >>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>>>>>>> (snes=0x14e9450, >>>>>>>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, >>>>>>>>>>> ctx=0x1652300) >>>>>>>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>>>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>>>>>>>>> X=0x1622ad0, >>>>>>>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>>>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>>>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>>>>>>> at >>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya < >>>>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> This is what I get at gdb when I type 'where'. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> You have to type 'cont', and then when it fails you type >>>>>>>>>>>> 'where'. 
>>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>>>>>>> /lib64/libc.so.6 >>>>>>>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize >>>>>>>>>>>>> (argc=0x7fff5cd8df2c, >>>>>>>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with >>>>>>>>>>>>> simplicial finite elements.\nWe solve the Poisson problem in a >>>>>>>>>>>>> rectangular\ndomain, using a parallel unstructured mesh (DMPLEX) to >>>>>>>>>>>>> discretize it.\n\n\n") >>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>>>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>>>>>>>>> at >>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>>>>>>> >>>>>>>>>>>>> The rest of the gdb output is attached. I am a bit ignorant >>>>>>>>>>>>> with gdb, I apologize for that. >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de >>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks for your response. Sorry I did not have the "next" >>>>>>>>>>>>>>> version, but the "master" version. I still have an error though. I followed >>>>>>>>>>>>>>> the steps given here ( >>>>>>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the >>>>>>>>>>>>>>> next version, I configured petsc as above and ran ex12 as above as well, >>>>>>>>>>>>>>> getting this error: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Okay, now run with -start_in_debugger, and give me a stack >>>>>>>>>>>>>> trace using 'where'. 
>>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to find >>>>>>>>>>>>>>> memory corruption errors >>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>> below >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start >>>>>>>>>>>>>>> of the function >>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for >>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>>>>> salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in >>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hi everybody >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>>>>>>>>>> specifically run it with the command options: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> We need to start narrowing down differences, because it >>>>>>>>>>>>>>>> runs for me and our nightly tests. 
So, first can >>>>>>>>>>>>>>>> you confirm that you are using the latest 'next' branch? >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> And I get this output >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>> 3 >>>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>>>> below >>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start >>>>>>>>>>>>>>>>> of the function >>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>> updates. 
>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya >>>>>>>>>>>>>>>>> by salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Probably my problems could be on my configuration. I >>>>>>>>>>>>>>>>> attach the configure.log. I ran ./configure like this >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> If >>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> is for serial, any chance we can get the options to run >>>>>>>>>>>>>>>>>>> in parallel? 
>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Regards >>>>>>>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> These examples all seem to run excepting the >>>>>>>>>>>>>>>>>>>>>> following command, >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> This is a build problem, but it should affect all >>>>>>>>>>>>>>>>>>>>> the runs. Is this reproducible? Can you send configure.log? MKL is the >>>>>>>>>>>>>>>>>>>>> worst. If this >>>>>>>>>>>>>>>>>>>>> persists, I would just switch to >>>>>>>>>>>>>>>>>>>>> --download-f-blas-lapack. >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks. 
I have some advice on options >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> --with-precision=single # I would not use this unless >>>>>>>>>>>>>>>>>>>> you are doing something special, like CUDA >>>>>>>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching >>>>>>>>>>>>>>>>>>>> to C, the build is much faster >>>>>>>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. >>>>>>>>>>>>>>>>>>>> I would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>>>>>>> or you can try and find that library. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Hi, This is the next error message after >>>>>>>>>>>>>>>>>>>>>>> configuring and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try >>>>>>>>>>>>>>>>>>>>>> running >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Let me know if those work. 
>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: Floating >>>>>>>>>>>>>>>>>>>>>>> Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X >>>>>>>>>>>>>>>>>>>>>>> to find memory corruption errors >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line >>>>>>>>>>>>>>>>>>>>>>> 531 /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal line >>>>>>>>>>>>>>>>>>>>>>> 63 /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 17:38:33 >>>>>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version >>>>>>>>>>>>>>>>>>>>>>>> you suggested. I think I need the triangle package to run this particular >>>>>>>>>>>>>>>>>>>>>>>> case. Is there any thing else that appears wrong in what I have done from >>>>>>>>>>>>>>>>>>>>>>>> the error messages below: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much >>>>>>>>>>>>>>>>>>>>>>> easier to have triangle create them. >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error >>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for >>>>>>>>>>>>>>>>>>>>>>>> this object type! >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external >>>>>>>>>>>>>>>>>>>>>>>> package support. >>>>>>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! 
>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual >>>>>>>>>>>>>>>>>>>>>>>> pages. >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 >>>>>>>>>>>>>>>>>>>>>>>> 16:25:53 2014 >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - >>>>>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct >>>>>>>>>>>>>>>>>>>>>>>>> entry. 
when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file or >>>>>>>>>>>>>>>>>>>>>>>>> directory >>>>>>>>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to >>>>>>>>>>>>>>>>>>>>>>>> work for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits >>>>>>>>>>>>>>>>>>>>>>>>>>> and just did a 'make ex12.c' with the following error if this helps? 
: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR ( >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or PETSC_ARCH ( >>>>>>>>>>>>>>>>>>>>>>>>>> linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you >>>>>>>>>>>>>>>>>>>>>>>>>> want to use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto: >>>>>>>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by >>>>>>>>>>>>>>>>>>>>>>>>>>> running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> $ >>>>>>>>>>>>>>>>>>>>>>>>>>> bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order dim 1 laplacian >>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>>>>>>> 
"bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import >>>>>>>>>>>>>>>>>>>>>>>>>>> default_simplex >>>>>>>>>>>>>>>>>>>>>>>>>>> ImportError: No module named >>>>>>>>>>>>>>>>>>>>>>>>>>> FIAT.reference_element >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating the >>>>>>>>>>>>>>>>>>>>>>>>>>> header file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere >>>>>>>>>>>>>>>>>>>>>>>>>>> (including the latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward >>>>>>>>>>>>>>>>>>>>>>>>>>> the old docs? >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>>>> which their experiments lead. 
>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener
>>>>>>>>>>>>>>>>>>>>>>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed... 
URL: From salazardetroya at gmail.com Sun Mar 30 19:07:32 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Sun, 30 Mar 2014 19:07:32 -0500 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: Thanks for your response. Your help is really useful to me. The difference between the analytic and the field options are that for the field options the function is projected onto the function space defined for feAux right? What is the advantage of doing this? Also, for this field case I see that the function always has to be a vector. What if we wanted to implement a heterogeneous material in linear elasticity? Would we implement the constitutive tensor as a vector? It would not be very difficult I think, I just want to make sure it would be this way. Thanks in advance Miguel On Sun, Mar 30, 2014 at 2:01 PM, Matthew Knepley wrote: > On Sun, Mar 30, 2014 at 1:57 PM, Miguel Angel Salazar de Troya < > salazardetroya at gmail.com> wrote: > >> Hello everybody >> >> I had a question about this example. In the petsc-dev next version, why >> don't we create a PetscSection in the function SetupSection, but we do it >> in the function SetupMaterialSection and in the function SetupSection of >> the petsc-current version. >> > > 1) I wanted to try and make things more automatic for the user > > 2) I needed a way to automatically layout data for coarser/finer grids in > unstructured MG > > Thus, now when you set for PetscFE into the DM using DMSetField(), it will > automatically create > the section on the first call to DMGetDefaultSection(). > > I do not have a similar provision now for materials, so you create your > own section. I think this is > alright until we have some idea of a nicer interface. > > Thanks, > > Matt > > >> petsc-dev: >> >> #undef __FUNCT__ >> #define __FUNCT__ "SetupSection" >> PetscErrorCode SetupSection(DM dm, AppCtx *user) >> { >> DM cdm = dm; >> const PetscInt id = 1; >> PetscErrorCode ierr; >> >> PetscFunctionBeginUser; >> ierr = PetscObjectSetName((PetscObject) user->fe[0], >> "potential");CHKERRQ(ierr); >> while (cdm) { >> ierr = DMSetNumFields(cdm, 1);CHKERRQ(ierr); >> ierr = DMSetField(cdm, 0, (PetscObject) user->fe[0]);CHKERRQ(ierr); >> ierr = DMPlexAddBoundary(cdm, user->bcType == DIRICHLET, user->bcType >> == NEUMANN ? "boundary" : "marker", 0, user->exactFuncs[0], 1, &id, >> user);CHKERRQ(ierr); >> ierr = DMPlexGetCoarseDM(cdm, &cdm);CHKERRQ(ierr); >> } >> PetscFunctionReturn(0); >> } >> >> >> It seems that it adds the number of fields directly to the DM, and takes >> the number of components that were specified in SetupElementCommon, but >> what about the number of degrees of freedom? Why we added it for the >> MaterialSection but not for the regular Section. >> >> Thanks in advance >> Miguel >> >> >> On Sat, Mar 15, 2014 at 4:16 PM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> Thanks a lot. 
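[Editorial sketch, for concreteness] Below is a minimal sketch of the kind of material section Matt describes creating by hand (a SetupMaterialSection-style routine): a single cell-wise auxiliary field with numComp components per cell. The function name and the dmAux/numComp arguments are illustrative assumptions, not the ex12 source, and DMSetDefaultSection() is the name of the call in this era of PETSc (later releases rename it DMSetLocalSection()).

#include <petscdmplex.h>

/* Sketch only: one auxiliary field with numComp components stored on every
 * cell, e.g. the independent entries of a constitutive tensor for a
 * heterogeneous material. */
static PetscErrorCode SetupMaterialSectionSketch(DM dmAux, PetscInt numComp)
{
  PetscSection   s;
  PetscInt       cStart, cEnd, c;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PetscSectionCreate(PetscObjectComm((PetscObject) dmAux), &s);CHKERRQ(ierr);
  ierr = PetscSectionSetNumFields(s, 1);CHKERRQ(ierr);
  ierr = PetscSectionSetFieldComponents(s, 0, numComp);CHKERRQ(ierr);
  ierr = DMPlexGetHeightStratum(dmAux, 0, &cStart, &cEnd);CHKERRQ(ierr);  /* cells are height 0 in a DMPlex */
  ierr = PetscSectionSetChart(s, cStart, cEnd);CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {
    ierr = PetscSectionSetDof(s, c, numComp);CHKERRQ(ierr);
    ierr = PetscSectionSetFieldDof(s, c, 0, numComp);CHKERRQ(ierr);
  }
  ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
  ierr = DMSetDefaultSection(dmAux, s);CHKERRQ(ierr);  /* DMSetLocalSection() in later releases */
  ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

A local vector created afterwards with DMCreateLocalVector(dmAux, &coeff) then holds numComp values per cell to fill in, so a heterogeneous constitutive tensor would indeed be stored as a "vector" of its independent entries at each point, consistent with the observation above that the field option always projects a vector-valued function.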
>>> >>> >>> On Sat, Mar 15, 2014 at 3:36 PM, Matthew Knepley wrote: >>> >>>> On Sat, Mar 15, 2014 at 3:31 PM, Miguel Angel Salazar de Troya < >>>> salazardetroya at gmail.com> wrote: >>>> >>>>> Hello everybody >>>>> >>>>> I keep trying to understand this example. I don't have any problems >>>>> with this example when I run it like this: >>>>> >>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate >>>>> -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full >>>>> -show_solution >>>>> Number of SNES iterations = 5 >>>>> L_2 Error: 0.107289 >>>>> Solution >>>>> Vec Object: 1 MPI processes >>>>> type: seq >>>>> 0.484618 >>>>> >>>>> However, when I change the boundary conditions to Neumann, I get this >>>>> error. >>>>> >>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>>> -show_solution >>>>> >>>> >>>> Here you set the order of the element used in bulk, but not on the >>>> boundary where you condition is, so it defaults to 0. In >>>> order to become more familiar, take a look at the tests that I run here: >>>> >>>> >>>> https://bitbucket.org/petsc/petsc/src/64715f0f033346c10c77b73cf58216d111db8789/config/builder.py?at=master#cl-216 >>>> >>>> Matt >>>> >>>> [0]PETSC ERROR: --------------------- Error Message >>>>> -------------------------------------------------------------- >>>>> [0]PETSC ERROR: Petsc has generated inconsistent data >>>>> [0]PETSC ERROR: Number of dual basis vectors 0 not equal to dimension 1 >>>>> [0]PETSC ERROR: See http:// >>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>> shooting. >>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>>>> GIT Date: 2014-03-04 10:53:30 -0600 >>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>> Sat Mar 15 14:28:05 2014 >>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>> --download-chaco --with-c2html=0 >>>>> [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in >>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>> [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in >>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>> [0]PETSC ERROR: #3 SetupElementCommon() line 474 in >>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>> [0]PETSC ERROR: #4 SetupBdElement() line 559 in >>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>> [0]PETSC ERROR: #5 main() line 755 in >>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>> [0]PETSC ERROR: ----------------End of Error Message -------send >>>>> entire error message to petsc-maint at mcs.anl.gov---------- >>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>>> [unset]: aborting job: >>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>>> >>>>> I honestly do not know much about using dual spaces in a finite >>>>> element context. I have been trying to find some material that could help >>>>> me without much success. I tried to modify the dual space order with the >>>>> option -petscdualspace_order but I kept getting errors. In particular, I >>>>> got this when I set it to 1. 
>>>>> >>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>>> -show_solution -petscdualspace_order 1 >>>>> [0]PETSC ERROR: PetscTrFreeDefault() called from PetscFESetUp_Basic() >>>>> line 2492 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>> [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted >>>>> (probably write past end of array) >>>>> [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483 in >>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>> [0]PETSC ERROR: --------------------- Error Message >>>>> -------------------------------------------------------------- >>>>> [0]PETSC ERROR: Memory corruption: >>>>> http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind >>>>> [0]PETSC ERROR: Corrupted memory >>>>> [0]PETSC ERROR: See http:// >>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>> shooting. >>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>>>> GIT Date: 2014-03-04 10:53:30 -0600 >>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>> Sat Mar 15 14:37:34 2014 >>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>> --download-chaco --with-c2html=0 >>>>> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in >>>>> /home/salaza11/petsc/src/sys/memory/mtr.c >>>>> [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in >>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>> [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in >>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>> [0]PETSC ERROR: #4 SetupElementCommon() line 482 in >>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>> [0]PETSC ERROR: #5 SetupElement() line 506 in >>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>> [0]PETSC ERROR: #6 main() line 754 in >>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>> [0]PETSC ERROR: ----------------End of Error Message -------send >>>>> entire error message to petsc-maint at mcs.anl.gov---------- >>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>>> [unset]: aborting job: >>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>>> [salaza11 at maya PETSC]$ >>>>> >>>>> >>>>> Then again, I do not know much what I am doing given my ignorance with >>>>> respect to the dual spaces in FE. I apologize for that. My questions are: >>>>> >>>>> - Where could I find more resources in order to understand the PETSc >>>>> implementation of dual spaces for FE? >>>>> - Why does it run with Dirichlet but not with Neumann? >>>>> >>>>> Thanks in advance. >>>>> Miguel. >>>>> >>>>> >>>>> On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley wrote: >>>>> >>>>>> On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < >>>>>>> salazardetroya at gmail.com> wrote: >>>>>>> >>>>>>>> I can run it now, thanks. Although if I run it with valgrind 3.5.0 >>>>>>>> (should I update to the last version?) I get some memory leaks related with >>>>>>>> the function DMPlexCreateBoxMesh. >>>>>>>> >>>>>>> >>>>>>> I will check it out. >>>>>>> >>>>>> >>>>>> This is now fixed. 
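[Editorial note] To make Matt's point above concrete: -petscspace_order only sets the order of the bulk element, and in the Neumann run the boundary element where the condition lives is left at order 0, which appears to be what the "Number of dual basis vectors 0" message from SetupBdElement is complaining about. In the builder.py tests linked above the boundary element gets its own order; assuming it takes the bd_ options prefix used by SetupBdElement (please check the exact spelling against those tests), the run would look roughly like

./ex12 -run_type full -dim 2 -bc_type neumann -interpolate 1 -variable_coefficient nonlinear -petscspace_order 2 -bd_petscspace_order 2 -show_solution

Separately, -petscdualspace_order is best left at its default or kept equal to -petscspace_order; the corrupted-memory failure above is consistent with the mismatched 2/1 pair rather than with a bug in the dual space code itself.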
>>>>>> >>>>>> Thanks for finding it >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 >>>>>>>> -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>> ==9625== Memcheck, a memory error detector >>>>>>>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward >>>>>>>> et al. >>>>>>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for >>>>>>>> copyright info >>>>>>>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>> -dm_plex_print_fem 1 >>>>>>>> ==9625== >>>>>>>> Local function: >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> 0.25 >>>>>>>> 1 >>>>>>>> 0.25 >>>>>>>> 0.5 >>>>>>>> 1.25 >>>>>>>> 1 >>>>>>>> 1.25 >>>>>>>> 2 >>>>>>>> Initial guess >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0.5 >>>>>>>> L_2 Error: 0.111111 >>>>>>>> Residual: >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> Initial Residual >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> L_2 Residual: 0 >>>>>>>> Jacobian: >>>>>>>> Mat Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> row 0: (0, 4) >>>>>>>> Residual: >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> -2 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> 0 >>>>>>>> Au - b = Au + F(0) >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0 >>>>>>>> Linear L_2 Residual: 0 >>>>>>>> ==9625== >>>>>>>> ==9625== HEAP SUMMARY: >>>>>>>> ==9625== in use at exit: 288 bytes in 3 blocks >>>>>>>> ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 >>>>>>>> bytes allocated >>>>>>>> ==9625== >>>>>>>> ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 >>>>>>>> of 3 >>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >>>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>> ==9625== >>>>>>>> ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 >>>>>>>> of 3 >>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >>>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>> ==9625== >>>>>>>> ==9625== 144 bytes in 1 blocks are definitely lost in loss record 3 >>>>>>>> of 3 >>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >>>>>>>> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle 
(plex.c:3749) >>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>> ==9625== >>>>>>>> ==9625== LEAK SUMMARY: >>>>>>>> ==9625== definitely lost: 288 bytes in 3 blocks >>>>>>>> ==9625== indirectly lost: 0 bytes in 0 blocks >>>>>>>> ==9625== possibly lost: 0 bytes in 0 blocks >>>>>>>> ==9625== still reachable: 0 bytes in 0 blocks >>>>>>>> ==9625== suppressed: 0 bytes in 0 blocks >>>>>>>> ==9625== >>>>>>>> ==9625== For counts of detected and suppressed errors, rerun with: >>>>>>>> -v >>>>>>>> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 >>>>>>>> from 6) >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley wrote: >>>>>>>> >>>>>>>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> You are welcome, thanks for your help. >>>>>>>>>> >>>>>>>>> >>>>>>>>> Okay, I have rebuilt completely clean, and ex12 runs for me. Can >>>>>>>>> you try again after pulling? >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> Thanks. This is what I get. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Okay, this was broken by a new push to master/next in the last >>>>>>>>>>> few days. I have pushed a fix, >>>>>>>>>>> however next is currently broken due to a failure to check in a >>>>>>>>>>> file. This should be fixed shortly, >>>>>>>>>>> and then ex12 will work. I will mail you when its ready. >>>>>>>>>>> >>>>>>>>>>> Thanks for finding this, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> (gdb) cont >>>>>>>>>>>> Continuing. >>>>>>>>>>>> >>>>>>>>>>>> Program received signal SIGSEGV, Segmentation fault. 
>>>>>>>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>>>>>> X=0x168b5b0, >>>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, >>>>>>>>>>>> str=0x7fffae6e7970, >>>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>>> 882 ierr = PetscFEGetDimension(fe[f], >>>>>>>>>>>> &Nb);CHKERRQ(ierr); >>>>>>>>>>>> (gdb) where >>>>>>>>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM >>>>>>>>>>>> (dm=0x159a180, X=0x168b5b0, >>>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, >>>>>>>>>>>> str=0x7fffae6e7970, >>>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>>>>>>>> (snes=0x14e9450, >>>>>>>>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, >>>>>>>>>>>> ctx=0x1652300) >>>>>>>>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>>>>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>>>>>>>>>> X=0x1622ad0, >>>>>>>>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>>>>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>>>>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>>>>>>>> at >>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya >>>>>>>>>>>>> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> This is what I get at gdb when I type 'where'. >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> You have to type 'cont', and then when it fails you type >>>>>>>>>>>>> 'where'. >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>>>>>>>> /lib64/libc.so.6 >>>>>>>>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize >>>>>>>>>>>>>> (argc=0x7fff5cd8df2c, >>>>>>>>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with >>>>>>>>>>>>>> simplicial finite elements.\nWe solve the Poisson problem in a >>>>>>>>>>>>>> rectangular\ndomain, using a parallel unstructured mesh (DMPLEX) to >>>>>>>>>>>>>> discretize it.\n\n\n") >>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>>>>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>>>>>>>>>> at >>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>>>>>>>> >>>>>>>>>>>>>> The rest of the gdb output is attached. I am a bit ignorant >>>>>>>>>>>>>> with gdb, I apologize for that. >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley < >>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks for your response. 
Sorry I did not have the "next" >>>>>>>>>>>>>>>> version, but the "master" version. I still have an error though. I followed >>>>>>>>>>>>>>>> the steps given here ( >>>>>>>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain the >>>>>>>>>>>>>>>> next version, I configured petsc as above and ran ex12 as above as well, >>>>>>>>>>>>>>>> getting this error: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Okay, now run with -start_in_debugger, and give me a stack >>>>>>>>>>>>>>> trace using 'where'. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>>> below >>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start >>>>>>>>>>>>>>>> of the function >>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for >>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>>>>>>>>>> salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in >>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Hi everybody >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I am trying to run example ex12.c without much success. I >>>>>>>>>>>>>>>>>> specifically run it with the command options: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> We need to start narrowing down differences, because it >>>>>>>>>>>>>>>>> runs for me and our nightly tests. So, first can >>>>>>>>>>>>>>>>> you confirm that you are using the latest 'next' branch? 
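[Editorial note] Since several of the failures in this thread come down to which snapshot of the repository was built, Matt's question about the branch is easiest to answer from the source tree itself; these are ordinary git commands, nothing PETSc-specific:

cd $PETSC_DIR
git checkout next
git pull
git describe   # compare against the "Petsc Development GIT revision" line in the error banner

After updating, reconfigure and rebuild before rerunning ex12.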
>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 -bc_type >>>>>>>>>>>>>>>>>> dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> And I get this output >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>> 3 >>>>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: >>>>>>>>>>>>>>>>>> Segmentation Violation, probably memory access out of range >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>>>>> below >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>> updates. 
>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya >>>>>>>>>>>>>>>>>> by salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Probably my problems could be on my configuration. I >>>>>>>>>>>>>>>>>> attach the configure.log. I ran ./configure like this >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> If >>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> is for serial, any chance we can get the options to run >>>>>>>>>>>>>>>>>>>> in parallel? 
>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Regards >>>>>>>>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> These examples all seem to run excepting the >>>>>>>>>>>>>>>>>>>>>>> following command, >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> This is a build problem, but it should affect all >>>>>>>>>>>>>>>>>>>>>> the runs. Is this reproducible? Can you send configure.log? MKL is the >>>>>>>>>>>>>>>>>>>>>> worst. If this >>>>>>>>>>>>>>>>>>>>>> persists, I would just switch to >>>>>>>>>>>>>>>>>>>>>> --download-f-blas-lapack. >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks. 
I have some advice on options >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> --with-precision=single # I would not use this >>>>>>>>>>>>>>>>>>>>> unless you are doing something special, like CUDA >>>>>>>>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching >>>>>>>>>>>>>>>>>>>>> to C, the build is much faster >>>>>>>>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP symbols. >>>>>>>>>>>>>>>>>>>>> I would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>>>>>>>> or you can try and find that library. >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Hi, This is the next error message after >>>>>>>>>>>>>>>>>>>>>>>> configuring and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. Try >>>>>>>>>>>>>>>>>>>>>>> running >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Let me know if those work. 
>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: >>>>>>>>>>>>>>>>>>>>>>>> Floating Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS >>>>>>>>>>>>>>>>>>>>>>>> X to find memory corruption errors >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of >>>>>>>>>>>>>>>>>>>>>>>> the start of the function >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line >>>>>>>>>>>>>>>>>>>>>>>> 531 /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal >>>>>>>>>>>>>>>>>>>>>>>> line 63 /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual >>>>>>>>>>>>>>>>>>>>>>>> pages. 
>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 >>>>>>>>>>>>>>>>>>>>>>>> 17:38:33 2014 >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version >>>>>>>>>>>>>>>>>>>>>>>>> you suggested. I think I need the triangle package to run this particular >>>>>>>>>>>>>>>>>>>>>>>>> case. Is there any thing else that appears wrong in what I have done from >>>>>>>>>>>>>>>>>>>>>>>>> the error messages below: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like this: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its much >>>>>>>>>>>>>>>>>>>>>>>> easier to have triangle create them. >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error >>>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for >>>>>>>>>>>>>>>>>>>>>>>>> this object type! >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external >>>>>>>>>>>>>>>>>>>>>>>>> package support. >>>>>>>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! 
>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual >>>>>>>>>>>>>>>>>>>>>>>>> pages. >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 >>>>>>>>>>>>>>>>>>>>>>>>> 16:25:53 2014 >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) - >>>>>>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct >>>>>>>>>>>>>>>>>>>>>>>>>> entry. 
when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file >>>>>>>>>>>>>>>>>>>>>>>>>> or directory >>>>>>>>>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going to >>>>>>>>>>>>>>>>>>>>>>>>> work for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c sits >>>>>>>>>>>>>>>>>>>>>>>>>>>> and just did a 'make ex12.c' with the following error if this helps? 
: >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR ( >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or PETSC_ARCH ( >>>>>>>>>>>>>>>>>>>>>>>>>>> linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you >>>>>>>>>>>>>>>>>>>>>>>>>>> want to use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto: >>>>>>>>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by >>>>>>>>>>>>>>>>>>>>>>>>>>>> running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> $ >>>>>>>>>>>>>>>>>>>>>>>>>>>> bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order dim 1 laplacian >>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): 
>>>>>>>>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import >>>>>>>>>>>>>>>>>>>>>>>>>>>> default_simplex >>>>>>>>>>>>>>>>>>>>>>>>>>>> ImportError: No module named >>>>>>>>>>>>>>>>>>>>>>>>>>>> FIAT.reference_element >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating >>>>>>>>>>>>>>>>>>>>>>>>>>>> the header file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere >>>>>>>>>>>>>>>>>>>>>>>>>>>> (including the latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward >>>>>>>>>>>>>>>>>>>>>>>>>>>> the old docs? >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. 
>>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> *Miguel Angel Salazar de Troya* >>> Graduate Research Assistant >>> Department of Mechanical Science and Engineering >>> University of Illinois at Urbana-Champaign >>> (217) 550-2360 >>> salaza11 at illinois.edu >>> >>> >> >> >> -- >> *Miguel Angel Salazar de Troya* >> Graduate Research Assistant >> Department of Mechanical Science and Engineering >> University of Illinois at Urbana-Champaign >> (217) 550-2360 >> salaza11 at illinois.edu >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Mar 30 19:51:40 2014 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 30 Mar 2014 19:51:40 -0500 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Sun, Mar 30, 2014 at 7:07 PM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > Thanks for your response. Your help is really useful to me. > > The difference between the analytic and the field options are that for the > field options the function is projected onto the function space defined for > feAux right? What is the advantage of doing this? > If it is not purely a function of the coordinates, or you do not know that function, there is no option left. > Also, for this field case I see that the function always has to be a > vector. What if we wanted to implement a heterogeneous material in linear > elasticity? Would we implement the constitutive tensor as a vector? It > would not be very difficult I think, I just want to make sure it would be > this way. > Its not a vector, which indicates a particular behavior under coordinate transformations, but an array which can hold any data you want. Matt > Thanks in advance > Miguel > > > On Sun, Mar 30, 2014 at 2:01 PM, Matthew Knepley wrote: > >> On Sun, Mar 30, 2014 at 1:57 PM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> Hello everybody >>> >>> I had a question about this example. In the petsc-dev next version, why >>> don't we create a PetscSection in the function SetupSection, but we do it >>> in the function SetupMaterialSection and in the function SetupSection of >>> the petsc-current version. >>> >> >> 1) I wanted to try and make things more automatic for the user >> >> 2) I needed a way to automatically layout data for coarser/finer grids in >> unstructured MG >> >> Thus, now when you set for PetscFE into the DM using DMSetField(), it >> will automatically create >> the section on the first call to DMGetDefaultSection(). 
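A minimal sketch of that flow, mirroring the calls in the SetupSection code quoted just below and assuming the same headers as ex12.c; the function name and the pre-built DMPlex dm and PetscFE fe are illustrative placeholders, not taken from this thread:

PetscErrorCode SetupFieldSketch(DM dm, PetscFE fe)
{
  PetscSection   s;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* Tell the DM about the single field and hand it the discretization */
  ierr = DMSetNumFields(dm, 1);CHKERRQ(ierr);
  ierr = DMSetField(dm, 0, (PetscObject) fe);CHKERRQ(ierr);
  /* No PetscSectionCreate() is needed for the solution field: the
     default section is built lazily on this first call */
  ierr = DMGetDefaultSection(dm, &s);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}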
>> >> I do not have a similar provision now for materials, so you create your >> own section. I think this is >> alright until we have some idea of a nicer interface. >> >> Thanks, >> >> Matt >> >> >>> petsc-dev: >>> >>> #undef __FUNCT__ >>> #define __FUNCT__ "SetupSection" >>> PetscErrorCode SetupSection(DM dm, AppCtx *user) >>> { >>> DM cdm = dm; >>> const PetscInt id = 1; >>> PetscErrorCode ierr; >>> >>> PetscFunctionBeginUser; >>> ierr = PetscObjectSetName((PetscObject) user->fe[0], >>> "potential");CHKERRQ(ierr); >>> while (cdm) { >>> ierr = DMSetNumFields(cdm, 1);CHKERRQ(ierr); >>> ierr = DMSetField(cdm, 0, (PetscObject) user->fe[0]);CHKERRQ(ierr); >>> ierr = DMPlexAddBoundary(cdm, user->bcType == DIRICHLET, >>> user->bcType == NEUMANN ? "boundary" : "marker", 0, user->exactFuncs[0], 1, >>> &id, user);CHKERRQ(ierr); >>> ierr = DMPlexGetCoarseDM(cdm, &cdm);CHKERRQ(ierr); >>> } >>> PetscFunctionReturn(0); >>> } >>> >>> >>> It seems that it adds the number of fields directly to the DM, and takes >>> the number of components that were specified in SetupElementCommon, but >>> what about the number of degrees of freedom? Why we added it for the >>> MaterialSection but not for the regular Section. >>> >>> Thanks in advance >>> Miguel >>> >>> >>> On Sat, Mar 15, 2014 at 4:16 PM, Miguel Angel Salazar de Troya < >>> salazardetroya at gmail.com> wrote: >>> >>>> Thanks a lot. >>>> >>>> >>>> On Sat, Mar 15, 2014 at 3:36 PM, Matthew Knepley wrote: >>>> >>>>> On Sat, Mar 15, 2014 at 3:31 PM, Miguel Angel Salazar de Troya < >>>>> salazardetroya at gmail.com> wrote: >>>>> >>>>>> Hello everybody >>>>>> >>>>>> I keep trying to understand this example. I don't have any problems >>>>>> with this example when I run it like this: >>>>>> >>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate >>>>>> -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full >>>>>> -show_solution >>>>>> Number of SNES iterations = 5 >>>>>> L_2 Error: 0.107289 >>>>>> Solution >>>>>> Vec Object: 1 MPI processes >>>>>> type: seq >>>>>> 0.484618 >>>>>> >>>>>> However, when I change the boundary conditions to Neumann, I get this >>>>>> error. >>>>>> >>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>>>> -show_solution >>>>>> >>>>> >>>>> Here you set the order of the element used in bulk, but not on the >>>>> boundary where you condition is, so it defaults to 0. In >>>>> order to become more familiar, take a look at the tests that I run >>>>> here: >>>>> >>>>> >>>>> https://bitbucket.org/petsc/petsc/src/64715f0f033346c10c77b73cf58216d111db8789/config/builder.py?at=master#cl-216 >>>>> >>>>> Matt >>>>> >>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>> -------------------------------------------------------------- >>>>>> [0]PETSC ERROR: Petsc has generated inconsistent data >>>>>> [0]PETSC ERROR: Number of dual basis vectors 0 not equal to dimension >>>>>> 1 >>>>>> [0]PETSC ERROR: See http:// >>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>> shooting. 
>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>>>>> GIT Date: 2014-03-04 10:53:30 -0600 >>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>>> Sat Mar 15 14:28:05 2014 >>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>> --download-chaco --with-c2html=0 >>>>>> [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in >>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>> [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in >>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>> [0]PETSC ERROR: #3 SetupElementCommon() line 474 in >>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>> [0]PETSC ERROR: #4 SetupBdElement() line 559 in >>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>> [0]PETSC ERROR: #5 main() line 755 in >>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send >>>>>> entire error message to petsc-maint at mcs.anl.gov---------- >>>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>>>> [unset]: aborting job: >>>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>>>> >>>>>> I honestly do not know much about using dual spaces in a finite >>>>>> element context. I have been trying to find some material that could help >>>>>> me without much success. I tried to modify the dual space order with the >>>>>> option -petscdualspace_order but I kept getting errors. In particular, I >>>>>> got this when I set it to 1. >>>>>> >>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>>>> -show_solution -petscdualspace_order 1 >>>>>> [0]PETSC ERROR: PetscTrFreeDefault() called from PetscFESetUp_Basic() >>>>>> line 2492 in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>> [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted >>>>>> (probably write past end of array) >>>>>> [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483 in >>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>> -------------------------------------------------------------- >>>>>> [0]PETSC ERROR: Memory corruption: >>>>>> http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind >>>>>> [0]PETSC ERROR: Corrupted memory >>>>>> [0]PETSC ERROR: See http:// >>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>> shooting. 
>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>>>>> GIT Date: 2014-03-04 10:53:30 -0600 >>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>>> Sat Mar 15 14:37:34 2014 >>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>> --download-chaco --with-c2html=0 >>>>>> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in >>>>>> /home/salaza11/petsc/src/sys/memory/mtr.c >>>>>> [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in >>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>> [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in >>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>> [0]PETSC ERROR: #4 SetupElementCommon() line 482 in >>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>> [0]PETSC ERROR: #5 SetupElement() line 506 in >>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>> [0]PETSC ERROR: #6 main() line 754 in >>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send >>>>>> entire error message to petsc-maint at mcs.anl.gov---------- >>>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>>>> [unset]: aborting job: >>>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>>>> [salaza11 at maya PETSC]$ >>>>>> >>>>>> >>>>>> Then again, I do not know much what I am doing given my ignorance >>>>>> with respect to the dual spaces in FE. I apologize for that. My questions >>>>>> are: >>>>>> >>>>>> - Where could I find more resources in order to understand the PETSc >>>>>> implementation of dual spaces for FE? >>>>>> - Why does it run with Dirichlet but not with Neumann? >>>>>> >>>>>> Thanks in advance. >>>>>> Miguel. >>>>>> >>>>>> >>>>>> On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < >>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>> >>>>>>>>> I can run it now, thanks. Although if I run it with valgrind 3.5.0 >>>>>>>>> (should I update to the last version?) I get some memory leaks related with >>>>>>>>> the function DMPlexCreateBoxMesh. >>>>>>>>> >>>>>>>> >>>>>>>> I will check it out. >>>>>>>> >>>>>>> >>>>>>> This is now fixed. >>>>>>> >>>>>>> Thanks for finding it >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 >>>>>>>>> -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>> ==9625== Memcheck, a memory error detector >>>>>>>>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward >>>>>>>>> et al. 
>>>>>>>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for >>>>>>>>> copyright info >>>>>>>>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>> ==9625== >>>>>>>>> Local function: >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> 0.25 >>>>>>>>> 1 >>>>>>>>> 0.25 >>>>>>>>> 0.5 >>>>>>>>> 1.25 >>>>>>>>> 1 >>>>>>>>> 1.25 >>>>>>>>> 2 >>>>>>>>> Initial guess >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0.5 >>>>>>>>> L_2 Error: 0.111111 >>>>>>>>> Residual: >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> Initial Residual >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> L_2 Residual: 0 >>>>>>>>> Jacobian: >>>>>>>>> Mat Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> row 0: (0, 4) >>>>>>>>> Residual: >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> -2 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> 0 >>>>>>>>> Au - b = Au + F(0) >>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>> type: seq >>>>>>>>> 0 >>>>>>>>> Linear L_2 Residual: 0 >>>>>>>>> ==9625== >>>>>>>>> ==9625== HEAP SUMMARY: >>>>>>>>> ==9625== in use at exit: 288 bytes in 3 blocks >>>>>>>>> ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 >>>>>>>>> bytes allocated >>>>>>>>> ==9625== >>>>>>>>> ==9625== 48 bytes in 1 blocks are definitely lost in loss record 1 >>>>>>>>> of 3 >>>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>>> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >>>>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>>> ==9625== >>>>>>>>> ==9625== 96 bytes in 1 blocks are definitely lost in loss record 2 >>>>>>>>> of 3 >>>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>>> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >>>>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>>> ==9625== >>>>>>>>> ==9625== 144 bytes in 1 blocks are definitely lost in loss record >>>>>>>>> 3 of 3 >>>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>>> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >>>>>>>>> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >>>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>>> ==9625== >>>>>>>>> ==9625== LEAK SUMMARY: >>>>>>>>> ==9625== definitely lost: 288 bytes in 3 blocks >>>>>>>>> ==9625== indirectly 
lost: 0 bytes in 0 blocks >>>>>>>>> ==9625== possibly lost: 0 bytes in 0 blocks >>>>>>>>> ==9625== still reachable: 0 bytes in 0 blocks >>>>>>>>> ==9625== suppressed: 0 bytes in 0 blocks >>>>>>>>> ==9625== >>>>>>>>> ==9625== For counts of detected and suppressed errors, rerun with: >>>>>>>>> -v >>>>>>>>> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 >>>>>>>>> from 6) >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley >>>>>>>> > wrote: >>>>>>>>> >>>>>>>>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> You are welcome, thanks for your help. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Okay, I have rebuilt completely clean, and ex12 runs for me. Can >>>>>>>>>> you try again after pulling? >>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya < >>>>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Thanks. This is what I get. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Okay, this was broken by a new push to master/next in the last >>>>>>>>>>>> few days. I have pushed a fix, >>>>>>>>>>>> however next is currently broken due to a failure to check in a >>>>>>>>>>>> file. This should be fixed shortly, >>>>>>>>>>>> and then ex12 will work. I will mail you when its ready. >>>>>>>>>>>> >>>>>>>>>>>> Thanks for finding this, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> (gdb) cont >>>>>>>>>>>>> Continuing. >>>>>>>>>>>>> >>>>>>>>>>>>> Program received signal SIGSEGV, Segmentation fault. >>>>>>>>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>>>>>>> X=0x168b5b0, >>>>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, >>>>>>>>>>>>> str=0x7fffae6e7970, >>>>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>>>> 882 ierr = PetscFEGetDimension(fe[f], >>>>>>>>>>>>> &Nb);CHKERRQ(ierr); >>>>>>>>>>>>> (gdb) where >>>>>>>>>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM >>>>>>>>>>>>> (dm=0x159a180, X=0x168b5b0, >>>>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, >>>>>>>>>>>>> str=0x7fffae6e7970, >>>>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>>>>>>>>> (snes=0x14e9450, >>>>>>>>>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, >>>>>>>>>>>>> ctx=0x1652300) >>>>>>>>>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>>>>>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian (snes=0x14e9450, >>>>>>>>>>>>> X=0x1622ad0, >>>>>>>>>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>>>>>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>>>>>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>>>>>>>>> at >>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de Troya >>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> This is what I get at gdb when I type 'where'. 
>>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> You have to type 'cont', and then when it fails you type >>>>>>>>>>>>>> 'where'. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>>>>>>>>> /lib64/libc.so.6 >>>>>>>>>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>>>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>>>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>>>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private () >>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>>>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize >>>>>>>>>>>>>>> (argc=0x7fff5cd8df2c, >>>>>>>>>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>>>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with >>>>>>>>>>>>>>> simplicial finite elements.\nWe solve the Poisson problem in a >>>>>>>>>>>>>>> rectangular\ndomain, using a parallel unstructured mesh (DMPLEX) to >>>>>>>>>>>>>>> discretize it.\n\n\n") >>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>>>>>>>>> #6 0x0000000000408f2c in main (argc=15, argv=0x7fff5cd8f1f8) >>>>>>>>>>>>>>> at >>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> The rest of the gdb output is attached. I am a bit ignorant >>>>>>>>>>>>>>> with gdb, I apologize for that. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley < >>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks for your response. Sorry I did not have the "next" >>>>>>>>>>>>>>>>> version, but the "master" version. I still have an error though. 
I followed >>>>>>>>>>>>>>>>> the steps given here ( >>>>>>>>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain >>>>>>>>>>>>>>>>> the next version, I configured petsc as above and ran ex12 as above as >>>>>>>>>>>>>>>>> well, getting this error: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Okay, now run with -start_in_debugger, and give me a stack >>>>>>>>>>>>>>>> trace using 'where'. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>>>>>>>>> Violation, probably memory access out of range >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>>>> below >>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the start >>>>>>>>>>>>>>>>> of the function >>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for >>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya >>>>>>>>>>>>>>>>> by salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in >>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process >>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Hi everybody >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I am trying to run example ex12.c without much success. >>>>>>>>>>>>>>>>>>> I specifically run it with the command options: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> We need to start narrowing down differences, because it >>>>>>>>>>>>>>>>>> runs for me and our nightly tests. So, first can >>>>>>>>>>>>>>>>>> you confirm that you are using the latest 'next' branch? 
>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> And I get this output >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>> 3 >>>>>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: >>>>>>>>>>>>>>>>>>> Segmentation Violation, probably memory access out of range >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line >>>>>>>>>>>>>>>>>>> 2244 /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! 
>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya >>>>>>>>>>>>>>>>>>> by salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 2014 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Probably my problems could be on my configuration. I >>>>>>>>>>>>>>>>>>> attach the configure.log. I ran ./configure like this >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ./configure --download-mpich --download-scientificpython >>>>>>>>>>>>>>>>>>> --download-triangle --download-ctetgen --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> If >>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> is for serial, any chance we can get the options to >>>>>>>>>>>>>>>>>>>>> run in parallel? 
>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Regards >>>>>>>>>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running ex12.c >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> These examples all seem to run excepting the >>>>>>>>>>>>>>>>>>>>>>>> following command, >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit 0.0125 >>>>>>>>>>>>>>>>>>>>>>>> -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> This is a build problem, but it should affect all >>>>>>>>>>>>>>>>>>>>>>> the runs. Is this reproducible? Can you send configure.log? MKL is the >>>>>>>>>>>>>>>>>>>>>>> worst. If this >>>>>>>>>>>>>>>>>>>>>>> persists, I would just switch to >>>>>>>>>>>>>>>>>>>>>>> --download-f-blas-lapack. >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Thanks. 
I have some advice on options >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> --with-precision=single # I would not use this >>>>>>>>>>>>>>>>>>>>>> unless you are doing something special, like CUDA >>>>>>>>>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend switching >>>>>>>>>>>>>>>>>>>>>> to C, the build is much faster >>>>>>>>>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 --CXXFLAGS=-O0 >>>>>>>>>>>>>>>>>>>>>> --with-fc=0 >>>>>>>>>>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP >>>>>>>>>>>>>>>>>>>>>> symbols. I would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>>>>>>>>> or you can try and find that library. >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Hi, This is the next error message after >>>>>>>>>>>>>>>>>>>>>>>>> configuring and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. >>>>>>>>>>>>>>>>>>>>>>>> Try running >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> for a representative run. Then you could try 3D >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Let me know if those work. 
>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: >>>>>>>>>>>>>>>>>>>>>>>>> Floating Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS >>>>>>>>>>>>>>>>>>>>>>>>> X to find memory corruption errors >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given >>>>>>>>>>>>>>>>>>>>>>>>> in stack below >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack >>>>>>>>>>>>>>>>>>>>>>>>> Frames ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in >>>>>>>>>>>>>>>>>>>>>>>>> the stack are not available, >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of >>>>>>>>>>>>>>>>>>>>>>>>> the start of the function >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line >>>>>>>>>>>>>>>>>>>>>>>>> 531 /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal >>>>>>>>>>>>>>>>>>>>>>>>> line 63 /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error >>>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual >>>>>>>>>>>>>>>>>>>>>>>>> pages. 
>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 >>>>>>>>>>>>>>>>>>>>>>>>> 17:38:33 2014 >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev version >>>>>>>>>>>>>>>>>>>>>>>>>> you suggested. I think I need the triangle package to run this particular >>>>>>>>>>>>>>>>>>>>>>>>>> case. Is there any thing else that appears wrong in what I have done from >>>>>>>>>>>>>>>>>>>>>>>>>> the error messages below: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like >>>>>>>>>>>>>>>>>>>>>>>>> this: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its >>>>>>>>>>>>>>>>>>>>>>>>> much easier to have triangle create them. >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error >>>>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for >>>>>>>>>>>>>>>>>>>>>>>>>> this object type! >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external >>>>>>>>>>>>>>>>>>>>>>>>>> package support. 
>>>>>>>>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual >>>>>>>>>>>>>>>>>>>>>>>>>> pages. >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 >>>>>>>>>>>>>>>>>>>>>>>>>> 16:25:53 2014 >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 in >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) >>>>>>>>>>>>>>>>>>>>>>>>>> - process 0 >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct >>>>>>>>>>>>>>>>>>>>>>>>>>> entry. 
when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file >>>>>>>>>>>>>>>>>>>>>>>>>>> or directory >>>>>>>>>>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going >>>>>>>>>>>>>>>>>>>>>>>>>> to work for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> You built with PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>> > wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>> sits and just did a 'make ex12.c' with the following error if this helps? 
: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR ( >>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or PETSC_ARCH >>>>>>>>>>>>>>>>>>>>>>>>>>>> (linux-gnu-cxx-debug) environment variables >>>>>>>>>>>>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you >>>>>>>>>>>>>>>>>>>>>>>>>>>> want to use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by >>>>>>>>>>>>>>>>>>>>>>>>>>>>> running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ >>>>>>>>>>>>>>>>>>>>>>>>>>>>> bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order dim 1 laplacian >>>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 boundary 
src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import >>>>>>>>>>>>>>>>>>>>>>>>>>>>> default_simplex >>>>>>>>>>>>>>>>>>>>>>>>>>>>> ImportError: No module named >>>>>>>>>>>>>>>>>>>>>>>>>>>>> FIAT.reference_element >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating >>>>>>>>>>>>>>>>>>>>>>>>>>>>> the header file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere >>>>>>>>>>>>>>>>>>>>>>>>>>>>> (including the latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me toward >>>>>>>>>>>>>>>>>>>>>>>>>>>>> the old docs? >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted >>>>>>>>>>>>>>>>>>>>>>>>>>>>> before they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. >>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before >>>>>>>>>>>>>>>>>>>>>>>>> they begin their experiments is infinitely more interesting than any >>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead. 
>>>>> -- Norbert Wiener >>>>> >>>> >>>> >>>> >>>> -- >>>> *Miguel Angel Salazar de Troya* >>>> Graduate Research Assistant >>>> Department of Mechanical Science and Engineering >>>> University of Illinois at Urbana-Champaign >>>> (217) 550-2360 >>>> salaza11 at illinois.edu >>>> >>>> >>> >>> >>> -- >>> *Miguel Angel Salazar de Troya* >>> Graduate Research Assistant >>> Department of Mechanical Science and Engineering >>> University of Illinois at Urbana-Champaign >>> (217) 550-2360 >>> salaza11 at illinois.edu >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > *Miguel Angel Salazar de Troya* > Graduate Research Assistant > Department of Mechanical Science and Engineering > University of Illinois at Urbana-Champaign > (217) 550-2360 > salaza11 at illinois.edu > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mathisfriesdorf at gmail.com Mon Mar 31 11:49:40 2014 From: mathisfriesdorf at gmail.com (Mathis Friesdorf) Date: Mon, 31 Mar 2014 18:49:40 +0200 Subject: [petsc-users] Tensor product as matrix free method Message-ID: Hello everybody, for my Ph.D. in theoretical quantum mechanics, I am currently trying to integrate the Schroedinger equation (a linear partial differential equation). In my field, we are working with so called local spin chains, which mathematically speaking are described by tensor products of small vector spaces over several systems (let's say 20). The matrix corresponding to the differential equation is called Hamiltonian and can for typical systems be written as a sum over tensor products where it acts as the identity on most systems. It normally has the form *\sum Id \otimes Id ... Id \otimes M \otimes Id \otimes ...* where M takes different positions.I know how to explicitly construct the full matrix and insert it into Petsc, but for the interesting applications it is too large to be stored in the RAM. I would therefore like to implement it as a matrix free version. This should be possible using MatCreateMAIJ() and VecGetArray(), as the following very useful post points out http://lists.mcs.anl.gov/pipermail/petsc-users/2011-September/009992.html. I was wondering whether anybody already made progress with this, as I am still a bit lost on how to precisely proceed. These systems really are ubiquitous in theoretical quantum mechanics and I am sure it would be helpful to quite a lot of people who still shy away a bit from Petsc. Thanks already for your help and all the best, Mathis -------------- next part -------------- An HTML attachment was scrubbed... URL: From dmeiser at txcorp.com Mon Mar 31 12:10:49 2014 From: dmeiser at txcorp.com (Dominic Meiser) Date: Mon, 31 Mar 2014 11:10:49 -0600 Subject: [petsc-users] Tensor product as matrix free method In-Reply-To: References: Message-ID: <5339A199.7080203@txcorp.com> Have you considered using a matrix shell: http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateShell.html What operations do you need to support besides sparse matrix vector products? On 03/31/2014 10:49 AM, Mathis Friesdorf wrote: > Hello everybody, > > for my Ph.D. 
in theoretical quantum mechanics, I am currently trying > to integrate the Schroedinger equation (a linear partial differential > equation). In my field, we are working with so called local spin > chains, which mathematically speaking are described by tensor products > of small vector spaces over several systems (let's say 20). The matrix > corresponding to the differential equation is called Hamiltonian and > can for typical systems be written as a sum over tensor products where > it acts as the identity on most systems. It normally has the form > > /\sum Id \otimes Id ... Id \otimes M \otimes Id \otimes .../ > > where M takes different positions.I know how to explicitly construct > the full matrix and insert it into Petsc, but for the interesting > applications it is too large to be stored in the RAM. I would > therefore like to implement it as a matrix free version. > This should be possible using MatCreateMAIJ() and VecGetArray(), as > the following very useful post points out > http://lists.mcs.anl.gov/pipermail/petsc-users/2011-September/009992.html. > I was wondering whether anybody already made progress with this, as I > am still a bit lost on how to precisely proceed. These systems really > are ubiquitous in theoretical quantum mechanics and I am sure it would > be helpful to quite a lot of people who still shy away a bit from Petsc. > > Thanks already for your help and all the best, Mathis -- Dominic Meiser Tech-X Corporation 5621 Arapahoe Avenue Boulder, CO 80303 USA Telephone: 303-996-2036 Fax: 303-448-7756 www.txcorp.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Mar 31 12:10:40 2014 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 31 Mar 2014 12:10:40 -0500 Subject: [petsc-users] Tensor product as matrix free method In-Reply-To: References: Message-ID: On Mon, Mar 31, 2014 at 11:49 AM, Mathis Friesdorf < mathisfriesdorf at gmail.com> wrote: > Hello everybody, > > for my Ph.D. in theoretical quantum mechanics, I am currently trying to > integrate the Schroedinger equation (a linear partial differential > equation). In my field, we are working with so called local spin chains, > which mathematically speaking are described by tensor products of small > vector spaces over several systems (let's say 20). The matrix corresponding > to the differential equation is called Hamiltonian and can for typical > systems be written as a sum over tensor products where it acts as the > identity on most systems. It normally has the form > > *\sum Id \otimes Id ... Id \otimes M \otimes Id \otimes ...* > > where M takes different positions.I know how to explicitly construct the > full matrix and insert it into Petsc, but for the interesting applications > it is too large to be stored in the RAM. I would therefore like to > implement it as a matrix free version. > This should be possible using MatCreateMAIJ() and VecGetArray(), as the > following very useful post points out > http://lists.mcs.anl.gov/pipermail/petsc-users/2011-September/009992.html. > I was wondering whether anybody already made progress with this, as I am > still a bit lost on how to precisely proceed. These systems really are > ubiquitous in theoretical quantum mechanics and I am sure it would be > helpful to quite a lot of people who still shy away a bit from Petsc. > > Thanks already for your help and all the best, Mathis > 1) The first thing you could try is MatShell(). However, you would have to handle all the parallelism, which might be onerous. 
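A minimal serial sketch of that MatShell route, assuming spin-1/2 sites (local dimension 2) and a single-site operator M summed over the chain; all names here are illustrative, and the parallel data layout that makes this onerous is deliberately left out:

#include <petscmat.h>

typedef struct {
  PetscInt    nsites;    /* number of sites in the chain                  */
  PetscInt    d;         /* local Hilbert-space dimension, 2 for spin-1/2 */
  PetscScalar M[2][2];   /* the single-site operator M                    */
} HamCtx;

/* y = H x with H = \sum_k Id \otimes ... \otimes M \otimes ... \otimes Id,
   applied term by term so the d^nsites by d^nsites matrix is never formed */
static PetscErrorCode MatMult_Ham(Mat A, Vec x, Vec y)
{
  HamCtx            *ctx;
  const PetscScalar *xa;
  PetscScalar       *ya;
  PetscInt           N, k, i, j, l, r, left, right;
  PetscErrorCode     ierr;

  PetscFunctionBeginUser;
  ierr = MatShellGetContext(A, &ctx);CHKERRQ(ierr);
  ierr = VecGetSize(x, &N);CHKERRQ(ierr);
  ierr = VecGetArrayRead(x, &xa);CHKERRQ(ierr);
  ierr = VecGetArray(y, &ya);CHKERRQ(ierr);
  for (i = 0; i < N; ++i) ya[i] = 0.0;
  for (k = 0; k < ctx->nsites; ++k) {          /* the term acting on site k     */
    left  = 1; for (i = 0; i < k; ++i) left *= ctx->d;
    right = N / (left*ctx->d);                 /* d^(nsites-k-1) sites to the right */
    for (l = 0; l < left; ++l)
      for (r = 0; r < right; ++r)
        for (i = 0; i < ctx->d; ++i)
          for (j = 0; j < ctx->d; ++j)
            ya[(l*ctx->d + i)*right + r] += ctx->M[i][j]*xa[(l*ctx->d + j)*right + r];
  }
  ierr = VecRestoreArrayRead(x, &xa);CHKERRQ(ierr);
  ierr = VecRestoreArray(y, &ya);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

With a Mat H, a filled-in HamCtx ctx, and N = d^nsites, a sequential setup would then be

  MatCreateShell(PETSC_COMM_SELF, N, N, N, N, (void*)&ctx, &H);
  MatShellSetOperation(H, MATOP_MULT, (void (*)(void)) MatMult_Ham);

after which H can be handed to the time integrator like any assembled matrix.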
2) An alternative is to explore the new TAIJ matrices. This is definitely not for novice programmers, but it is a direct representation of a Kronecker product, and in addition is vectorized. Jed is the lead there, so maybe he can comment. Thanks, Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From salazardetroya at gmail.com Mon Mar 31 18:37:51 2014 From: salazardetroya at gmail.com (Miguel Angel Salazar de Troya) Date: Mon, 31 Mar 2014 18:37:51 -0500 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: Thanks for your response. Now I am trying to modify this example to include Dirichlet and Neumann conditions at the same time. I can see that inside of DMPlexCreateSquareBoundary there is an option ("-dm_plex_separate_marker") to just mark the top boundary with 1. I understand that only this side would have Dirichlet conditions that are described by the function bcFuncs in user.fem (the exact function in this example). However, when we run the Neumann condition, we fix all the boundary as Neumann condition with the function DMPlexAddBoundary, is this right? Could there be a way to just fix a certain boundary with the Neumann condition in this example? Would it be easier with an external library as Exodus II? On Sun, Mar 30, 2014 at 7:51 PM, Matthew Knepley wrote: > On Sun, Mar 30, 2014 at 7:07 PM, Miguel Angel Salazar de Troya < > salazardetroya at gmail.com> wrote: > >> Thanks for your response. Your help is really useful to me. >> >> The difference between the analytic and the field options are that for >> the field options the function is projected onto the function space defined >> for feAux right? What is the advantage of doing this? >> > > If it is not purely a function of the coordinates, or you do not know that > function, there is no option left. > > >> Also, for this field case I see that the function always has to be a >> vector. What if we wanted to implement a heterogeneous material in linear >> elasticity? Would we implement the constitutive tensor as a vector? It >> would not be very difficult I think, I just want to make sure it would be >> this way. >> > > Its not a vector, which indicates a particular behavior under coordinate > transformations, but an array > which can hold any data you want. > > Matt > > >> Thanks in advance >> Miguel >> >> >> On Sun, Mar 30, 2014 at 2:01 PM, Matthew Knepley wrote: >> >>> On Sun, Mar 30, 2014 at 1:57 PM, Miguel Angel Salazar de Troya < >>> salazardetroya at gmail.com> wrote: >>> >>>> Hello everybody >>>> >>>> I had a question about this example. 
In the petsc-dev next version, why >>>> don't we create a PetscSection in the function SetupSection, but we do it >>>> in the function SetupMaterialSection and in the function SetupSection of >>>> the petsc-current version. >>>> >>> >>> 1) I wanted to try and make things more automatic for the user >>> >>> 2) I needed a way to automatically layout data for coarser/finer grids >>> in unstructured MG >>> >>> Thus, now when you set for PetscFE into the DM using DMSetField(), it >>> will automatically create >>> the section on the first call to DMGetDefaultSection(). >>> >>> I do not have a similar provision now for materials, so you create your >>> own section. I think this is >>> alright until we have some idea of a nicer interface. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> petsc-dev: >>>> >>>> #undef __FUNCT__ >>>> #define __FUNCT__ "SetupSection" >>>> PetscErrorCode SetupSection(DM dm, AppCtx *user) >>>> { >>>> DM cdm = dm; >>>> const PetscInt id = 1; >>>> PetscErrorCode ierr; >>>> >>>> PetscFunctionBeginUser; >>>> ierr = PetscObjectSetName((PetscObject) user->fe[0], >>>> "potential");CHKERRQ(ierr); >>>> while (cdm) { >>>> ierr = DMSetNumFields(cdm, 1);CHKERRQ(ierr); >>>> ierr = DMSetField(cdm, 0, (PetscObject) user->fe[0]);CHKERRQ(ierr); >>>> ierr = DMPlexAddBoundary(cdm, user->bcType == DIRICHLET, >>>> user->bcType == NEUMANN ? "boundary" : "marker", 0, user->exactFuncs[0], 1, >>>> &id, user);CHKERRQ(ierr); >>>> ierr = DMPlexGetCoarseDM(cdm, &cdm);CHKERRQ(ierr); >>>> } >>>> PetscFunctionReturn(0); >>>> } >>>> >>>> >>>> It seems that it adds the number of fields directly to the DM, and >>>> takes the number of components that were specified in SetupElementCommon, >>>> but what about the number of degrees of freedom? Why we added it for the >>>> MaterialSection but not for the regular Section. >>>> >>>> Thanks in advance >>>> Miguel >>>> >>>> >>>> On Sat, Mar 15, 2014 at 4:16 PM, Miguel Angel Salazar de Troya < >>>> salazardetroya at gmail.com> wrote: >>>> >>>>> Thanks a lot. >>>>> >>>>> >>>>> On Sat, Mar 15, 2014 at 3:36 PM, Matthew Knepley wrote: >>>>> >>>>>> On Sat, Mar 15, 2014 at 3:31 PM, Miguel Angel Salazar de Troya < >>>>>> salazardetroya at gmail.com> wrote: >>>>>> >>>>>>> Hello everybody >>>>>>> >>>>>>> I keep trying to understand this example. I don't have any problems >>>>>>> with this example when I run it like this: >>>>>>> >>>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate >>>>>>> -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full >>>>>>> -show_solution >>>>>>> Number of SNES iterations = 5 >>>>>>> L_2 Error: 0.107289 >>>>>>> Solution >>>>>>> Vec Object: 1 MPI processes >>>>>>> type: seq >>>>>>> 0.484618 >>>>>>> >>>>>>> However, when I change the boundary conditions to Neumann, I get >>>>>>> this error. >>>>>>> >>>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>>>>> -show_solution >>>>>>> >>>>>> >>>>>> Here you set the order of the element used in bulk, but not on the >>>>>> boundary where you condition is, so it defaults to 0. 
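Presumably the missing piece is to give the boundary discretization an order as well; in the ex12 of this vintage the boundary PetscFE appears to be created under the "bd_" options prefix, so a run along the lines of

  ./ex12 -bc_type neumann -interpolate 1 -petscspace_order 2 -bd_petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full -show_solution

should avoid the zero-order boundary space behind that error, though the exact option name is best checked against the tests linked just below.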
In >>>>>> order to become more familiar, take a look at the tests that I run >>>>>> here: >>>>>> >>>>>> >>>>>> https://bitbucket.org/petsc/petsc/src/64715f0f033346c10c77b73cf58216d111db8789/config/builder.py?at=master#cl-216 >>>>>> >>>>>> Matt >>>>>> >>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>> -------------------------------------------------------------- >>>>>>> [0]PETSC ERROR: Petsc has generated inconsistent data >>>>>>> [0]PETSC ERROR: Number of dual basis vectors 0 not equal to >>>>>>> dimension 1 >>>>>>> [0]PETSC ERROR: See http:// >>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>>> shooting. >>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>>>>>> GIT Date: 2014-03-04 10:53:30 -0600 >>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>>>> Sat Mar 15 14:28:05 2014 >>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>> --download-chaco --with-c2html=0 >>>>>>> [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in >>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>> [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in >>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>> [0]PETSC ERROR: #3 SetupElementCommon() line 474 in >>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>> [0]PETSC ERROR: #4 SetupBdElement() line 559 in >>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>> [0]PETSC ERROR: #5 main() line 755 in >>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send >>>>>>> entire error message to petsc-maint at mcs.anl.gov---------- >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>>>>> [unset]: aborting job: >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>>>>> >>>>>>> I honestly do not know much about using dual spaces in a finite >>>>>>> element context. I have been trying to find some material that could help >>>>>>> me without much success. I tried to modify the dual space order with the >>>>>>> option -petscdualspace_order but I kept getting errors. In particular, I >>>>>>> got this when I set it to 1. >>>>>>> >>>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>>>>> -show_solution -petscdualspace_order 1 >>>>>>> [0]PETSC ERROR: PetscTrFreeDefault() called from >>>>>>> PetscFESetUp_Basic() line 2492 in >>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>> [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted >>>>>>> (probably write past end of array) >>>>>>> [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483 in >>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>> -------------------------------------------------------------- >>>>>>> [0]PETSC ERROR: Memory corruption: >>>>>>> http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind >>>>>>> [0]PETSC ERROR: Corrupted memory >>>>>>> [0]PETSC ERROR: See http:// >>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>>> shooting. 
>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-4776-gb18359b >>>>>>> GIT Date: 2014-03-04 10:53:30 -0600 >>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by salaza11 >>>>>>> Sat Mar 15 14:37:34 2014 >>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>> --download-chaco --with-c2html=0 >>>>>>> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in >>>>>>> /home/salaza11/petsc/src/sys/memory/mtr.c >>>>>>> [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in >>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>> [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in >>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>> [0]PETSC ERROR: #4 SetupElementCommon() line 482 in >>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>> [0]PETSC ERROR: #5 SetupElement() line 506 in >>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>> [0]PETSC ERROR: #6 main() line 754 in >>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send >>>>>>> entire error message to petsc-maint at mcs.anl.gov---------- >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>>>>> [unset]: aborting job: >>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>>>>> [salaza11 at maya PETSC]$ >>>>>>> >>>>>>> >>>>>>> Then again, I do not know much what I am doing given my ignorance >>>>>>> with respect to the dual spaces in FE. I apologize for that. My questions >>>>>>> are: >>>>>>> >>>>>>> - Where could I find more resources in order to understand the PETSc >>>>>>> implementation of dual spaces for FE? >>>>>>> - Why does it run with Dirichlet but not with Neumann? >>>>>>> >>>>>>> Thanks in advance. >>>>>>> Miguel. >>>>>>> >>>>>>> >>>>>>> On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley >>>>>>> > wrote: >>>>>>>> >>>>>>>>> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < >>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> I can run it now, thanks. Although if I run it with valgrind >>>>>>>>>> 3.5.0 (should I update to the last version?) I get some memory leaks >>>>>>>>>> related with the function DMPlexCreateBoxMesh. >>>>>>>>>> >>>>>>>>> >>>>>>>>> I will check it out. >>>>>>>>> >>>>>>>> >>>>>>>> This is now fixed. >>>>>>>> >>>>>>>> Thanks for finding it >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 >>>>>>>>>> -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>> ==9625== Memcheck, a memory error detector >>>>>>>>>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward >>>>>>>>>> et al. 
>>>>>>>>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for >>>>>>>>>> copyright info >>>>>>>>>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>> ==9625== >>>>>>>>>> Local function: >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> 0.25 >>>>>>>>>> 1 >>>>>>>>>> 0.25 >>>>>>>>>> 0.5 >>>>>>>>>> 1.25 >>>>>>>>>> 1 >>>>>>>>>> 1.25 >>>>>>>>>> 2 >>>>>>>>>> Initial guess >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0.5 >>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>> Residual: >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> Initial Residual >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> L_2 Residual: 0 >>>>>>>>>> Jacobian: >>>>>>>>>> Mat Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> row 0: (0, 4) >>>>>>>>>> Residual: >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> -2 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> 0 >>>>>>>>>> Au - b = Au + F(0) >>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>> type: seq >>>>>>>>>> 0 >>>>>>>>>> Linear L_2 Residual: 0 >>>>>>>>>> ==9625== >>>>>>>>>> ==9625== HEAP SUMMARY: >>>>>>>>>> ==9625== in use at exit: 288 bytes in 3 blocks >>>>>>>>>> ==9625== total heap usage: 2,484 allocs, 2,481 frees, 1,009,287 >>>>>>>>>> bytes allocated >>>>>>>>>> ==9625== >>>>>>>>>> ==9625== 48 bytes in 1 blocks are definitely lost in loss record >>>>>>>>>> 1 of 3 >>>>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>>>> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >>>>>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>>>> ==9625== >>>>>>>>>> ==9625== 96 bytes in 1 blocks are definitely lost in loss record >>>>>>>>>> 2 of 3 >>>>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>>>> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >>>>>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>>>> ==9625== >>>>>>>>>> ==9625== 144 bytes in 1 blocks are definitely lost in loss record >>>>>>>>>> 3 of 3 >>>>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>>>> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >>>>>>>>>> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >>>>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>>>> ==9625== >>>>>>>>>> ==9625== 
LEAK SUMMARY: >>>>>>>>>> ==9625== definitely lost: 288 bytes in 3 blocks >>>>>>>>>> ==9625== indirectly lost: 0 bytes in 0 blocks >>>>>>>>>> ==9625== possibly lost: 0 bytes in 0 blocks >>>>>>>>>> ==9625== still reachable: 0 bytes in 0 blocks >>>>>>>>>> ==9625== suppressed: 0 bytes in 0 blocks >>>>>>>>>> ==9625== >>>>>>>>>> ==9625== For counts of detected and suppressed errors, rerun >>>>>>>>>> with: -v >>>>>>>>>> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 >>>>>>>>>> from 6) >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> You are welcome, thanks for your help. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Okay, I have rebuilt completely clean, and ex12 runs for me. Can >>>>>>>>>>> you try again after pulling? >>>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya >>>>>>>>>>>>> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks. This is what I get. >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Okay, this was broken by a new push to master/next in the last >>>>>>>>>>>>> few days. I have pushed a fix, >>>>>>>>>>>>> however next is currently broken due to a failure to check in >>>>>>>>>>>>> a file. This should be fixed shortly, >>>>>>>>>>>>> and then ex12 will work. I will mail you when its ready. >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks for finding this, >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> (gdb) cont >>>>>>>>>>>>>> Continuing. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Program received signal SIGSEGV, Segmentation fault. 
>>>>>>>>>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM (dm=0x159a180, >>>>>>>>>>>>>> X=0x168b5b0, >>>>>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, >>>>>>>>>>>>>> str=0x7fffae6e7970, >>>>>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>>>>> 882 ierr = PetscFEGetDimension(fe[f], >>>>>>>>>>>>>> &Nb);CHKERRQ(ierr); >>>>>>>>>>>>>> (gdb) where >>>>>>>>>>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM >>>>>>>>>>>>>> (dm=0x159a180, X=0x168b5b0, >>>>>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, >>>>>>>>>>>>>> str=0x7fffae6e7970, >>>>>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>>>>>>>>>> (snes=0x14e9450, >>>>>>>>>>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, >>>>>>>>>>>>>> ctx=0x1652300) >>>>>>>>>>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>>>>>>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian >>>>>>>>>>>>>> (snes=0x14e9450, X=0x1622ad0, >>>>>>>>>>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>>>>>>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>>>>>>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>>>>>>>>>> at >>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley < >>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> This is what I get at gdb when I type 'where'. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> You have to type 'cont', and then when it fails you type >>>>>>>>>>>>>>> 'where'. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>>>>>>>>>> /lib64/libc.so.6 >>>>>>>>>>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>>>>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>>>>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>>>>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private >>>>>>>>>>>>>>>> () >>>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>>>>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize >>>>>>>>>>>>>>>> (argc=0x7fff5cd8df2c, >>>>>>>>>>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>>>>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with >>>>>>>>>>>>>>>> simplicial finite elements.\nWe solve the Poisson problem in a >>>>>>>>>>>>>>>> rectangular\ndomain, using a parallel unstructured mesh (DMPLEX) to >>>>>>>>>>>>>>>> discretize it.\n\n\n") >>>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>>>>>>>>>> #6 0x0000000000408f2c in main (argc=15, >>>>>>>>>>>>>>>> argv=0x7fff5cd8f1f8) >>>>>>>>>>>>>>>> at >>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> The rest of the gdb output is attached. I am a bit ignorant >>>>>>>>>>>>>>>> with gdb, I apologize for that. 
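A compact recap of the debugger sequence being asked for here, using the run from this thread (-start_in_debugger is the stock PETSc option; cont and where are plain gdb commands):

  ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial -dm_plex_print_fem 1 -start_in_debugger
  (gdb) cont
  (gdb) where

where cont lets the program run until the SEGV is raised and where then prints the stack at the point of failure.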
>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley < >>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks for your response. Sorry I did not have the "next" >>>>>>>>>>>>>>>>>> version, but the "master" version. I still have an error though. I followed >>>>>>>>>>>>>>>>>> the steps given here ( >>>>>>>>>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain >>>>>>>>>>>>>>>>>> the next version, I configured petsc as above and ran ex12 as above as >>>>>>>>>>>>>>>>>> well, getting this error: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Okay, now run with -start_in_debugger, and give me a stack >>>>>>>>>>>>>>>>> trace using 'where'. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: >>>>>>>>>>>>>>>>>> Segmentation Violation, probably memory access out of range >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in stack >>>>>>>>>>>>>>>>>> below >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack >>>>>>>>>>>>>>>>>> are not available, >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line 2244 >>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for >>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya >>>>>>>>>>>>>>>>>> by salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in >>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Hi everybody >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> I am trying to run example ex12.c without much success. >>>>>>>>>>>>>>>>>>>> I specifically run it with the command options: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> We need to start narrowing down differences, because it >>>>>>>>>>>>>>>>>>> runs for me and our nightly tests. So, first can >>>>>>>>>>>>>>>>>>> you confirm that you are using the latest 'next' branch? 
>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> And I get this output >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>>> 3 >>>>>>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: >>>>>>>>>>>>>>>>>>>> Segmentation Violation, probably memory access out of range >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line 94 >>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line >>>>>>>>>>>>>>>>>>>> 2244 /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! 
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named >>>>>>>>>>>>>>>>>>>> maya by salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 >>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Probably my problems could be on my configuration. I >>>>>>>>>>>>>>>>>>>> attach the configure.log. I ran ./configure like this >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> ./configure --download-mpich >>>>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> If >>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> is for serial, any chance we can get the options to >>>>>>>>>>>>>>>>>>>>>> run in parallel? 
>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Regards >>>>>>>>>>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> These examples all seem to run excepting the >>>>>>>>>>>>>>>>>>>>>>>>> following command, >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> This is a build problem, but it should affect all >>>>>>>>>>>>>>>>>>>>>>>> the runs. Is this reproducible? Can you send configure.log? MKL is the >>>>>>>>>>>>>>>>>>>>>>>> worst. If this >>>>>>>>>>>>>>>>>>>>>>>> persists, I would just switch to >>>>>>>>>>>>>>>>>>>>>>>> --download-f-blas-lapack. >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Thanks. 
I have some advice on options >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> --with-precision=single # I would not use this >>>>>>>>>>>>>>>>>>>>>>> unless you are doing something special, like CUDA >>>>>>>>>>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend >>>>>>>>>>>>>>>>>>>>>>> switching to C, the build is much faster >>>>>>>>>>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 >>>>>>>>>>>>>>>>>>>>>>> --CXXFLAGS=-O0 --with-fc=0 >>>>>>>>>>>>>>>>>>>>>>> --with-etags=1 # This is unnecessary >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP >>>>>>>>>>>>>>>>>>>>>>> symbols. I would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>>>>>>>>>> or you can try and find that library. >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Hi, This is the next error message after >>>>>>>>>>>>>>>>>>>>>>>>>> configuring and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. >>>>>>>>>>>>>>>>>>>>>>>>> Try running >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> for a representative run. 
Then you could try 3D >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type dirichlet >>>>>>>>>>>>>>>>>>>>>>>>> -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: >>>>>>>>>>>>>>>>>>>>>>>>>> Floating Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac >>>>>>>>>>>>>>>>>>>>>>>>>> OS X to find memory corruption errors >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given >>>>>>>>>>>>>>>>>>>>>>>>>> in stack below >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack >>>>>>>>>>>>>>>>>>>>>>>>>> Frames ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in >>>>>>>>>>>>>>>>>>>>>>>>>> the stack are not available, >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of >>>>>>>>>>>>>>>>>>>>>>>>>> the start of the function >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM line >>>>>>>>>>>>>>>>>>>>>>>>>> 531 /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal >>>>>>>>>>>>>>>>>>>>>>>>>> line 63 /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line 2076 >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error >>>>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! 
>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual >>>>>>>>>>>>>>>>>>>>>>>>>> pages. >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>>> named maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 >>>>>>>>>>>>>>>>>>>>>>>>>> 17:38:33 2014 >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 >>>>>>>>>>>>>>>>>>>>>>>>>> in unknown file >>>>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) >>>>>>>>>>>>>>>>>>>>>>>>>> - process 0 >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev >>>>>>>>>>>>>>>>>>>>>>>>>>> version you suggested. I think I need the triangle package to run this >>>>>>>>>>>>>>>>>>>>>>>>>>> particular case. Is there any thing else that appears wrong in what I have >>>>>>>>>>>>>>>>>>>>>>>>>>> done from the error messages below: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Great! Its running. You can reconfigure like >>>>>>>>>>>>>>>>>>>>>>>>>> this: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> and then rerun. 
You can load meshes, but its >>>>>>>>>>>>>>>>>>>>>>>>>> much easier to have triangle create them. >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error >>>>>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation >>>>>>>>>>>>>>>>>>>>>>>>>>> for this object type! >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external >>>>>>>>>>>>>>>>>>>>>>>>>>> package support. >>>>>>>>>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints >>>>>>>>>>>>>>>>>>>>>>>>>>> about trouble shooting. >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual >>>>>>>>>>>>>>>>>>>>>>>>>>> pages. >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a >>>>>>>>>>>>>>>>>>>>>>>>>>> arch-linux2-cxx-debug named maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 >>>>>>>>>>>>>>>>>>>>>>>>>>> 16:25:53 2014 >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 >>>>>>>>>>>>>>>>>>>>>>>>>>> in /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 56) >>>>>>>>>>>>>>>>>>>>>>>>>>> - process 0 >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>> 
*Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the correct >>>>>>>>>>>>>>>>>>>>>>>>>>>> entry. when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such file >>>>>>>>>>>>>>>>>>>>>>>>>>>> or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>> compilation terminated. >>>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going >>>>>>>>>>>>>>>>>>>>>>>>>>> to work for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>> > wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks! 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> You built with >>>>>>>>>>>>>>>>>>>>>>>>>>>> PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander < >>>>>>>>>>>>>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> sits and just did a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Stop. >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR ( >>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or >>>>>>>>>>>>>>>>>>>>>>>>>>>>> PETSC_ARCH (linux-gnu-cxx-debug) environment >>>>>>>>>>>>>>>>>>>>>>>>>>>>> variables >>>>>>>>>>>>>>>>>>>>>>>>>>>>> do not match what you built. 
Please send >>>>>>>>>>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you >>>>>>>>>>>>>>>>>>>>>>>>>>>>> want to use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> running ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order dim 1 laplacian >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> default_simplex >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ImportError: No module named >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FIAT.reference_element >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the header file (its now all handled in C). I thought >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> (including the latest tutorial slides). Can you try running >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> toward the old docs? 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead.
>>> -- Norbert Wiener >>> >> >> >> >> -- >> *Miguel Angel Salazar de Troya* >> Graduate Research Assistant >> Department of Mechanical Science and Engineering >> University of Illinois at Urbana-Champaign >> (217) 550-2360 >> salaza11 at illinois.edu >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- *Miguel Angel Salazar de Troya* Graduate Research Assistant Department of Mechanical Science and Engineering University of Illinois at Urbana-Champaign (217) 550-2360 salaza11 at illinois.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Mar 31 19:08:24 2014 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 31 Mar 2014 19:08:24 -0500 Subject: [petsc-users] Problems running ex12.c In-Reply-To: References: <8448FFCE4362914496BCEAF8BE810C13EF1D34@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D5E@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D7C@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1D8F@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DA3@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1DBF@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E22@DCPWPEXMBX02.mdanderson.edu> <8448FFCE4362914496BCEAF8BE810C13EF1E50@DCPWPEXMBX02.mdanderson.edu> Message-ID: On Mon, Mar 31, 2014 at 6:37 PM, Miguel Angel Salazar de Troya < salazardetroya at gmail.com> wrote: > Thanks for your response. Now I am trying to modify this example to > include Dirichlet and Neumann conditions at the same time. > > I can see that inside of DMPlexCreateSquareBoundary there is an option > ("-dm_plex_separate_marker") to just mark the top boundary with 1. I > understand that only this side would have Dirichlet conditions that are > described by the function bcFuncs in user.fem (the exact function in this > example). However, when we run the Neumann condition, we fix all the > boundary as Neumann condition with the function DMPlexAddBoundary, is this > right? > Right about the shortcoming, but wrong about the source. DMPlexAddBoundary() takes an argument that is the marker value for the given label, so you can select boundaries. However, DMPlexComputeResidualFEM() currently hardcodes the boundary name ("boundary") and the marker value (1). I wrote this when we had no boundary representation in PETSc. Now that we have DMAddBoundary(), we can loop over the Neumann boundaries. I have put this on my todo list. If you are motivated, you can do it first and I will help. Thanks, Matt > Could there be a way to just fix a certain boundary with the Neumann > condition in this example? Would it be easier with an external library as > Exodus II? > > > On Sun, Mar 30, 2014 at 7:51 PM, Matthew Knepley wrote: > >> On Sun, Mar 30, 2014 at 7:07 PM, Miguel Angel Salazar de Troya < >> salazardetroya at gmail.com> wrote: >> >>> Thanks for your response. Your help is really useful to me. >>> >>> The difference between the analytic and the field options are that for >>> the field options the function is projected onto the function space defined >>> for feAux right? What is the advantage of doing this? >>> >> >> If it is not purely a function of the coordinates, or you do not know >> that function, there is no option left. >> >> >>> Also, for this field case I see that the function always has to be a >>> vector. 
What if we wanted to implement a heterogeneous material in linear >>> elasticity? Would we implement the constitutive tensor as a vector? It >>> would not be very difficult I think, I just want to make sure it would be >>> this way. >>> >> >> Its not a vector, which indicates a particular behavior under coordinate >> transformations, but an array >> which can hold any data you want. >> >> Matt >> >> >>> Thanks in advance >>> Miguel >>> >>> >>> On Sun, Mar 30, 2014 at 2:01 PM, Matthew Knepley wrote: >>> >>>> On Sun, Mar 30, 2014 at 1:57 PM, Miguel Angel Salazar de Troya < >>>> salazardetroya at gmail.com> wrote: >>>> >>>>> Hello everybody >>>>> >>>>> I had a question about this example. In the petsc-dev next version, >>>>> why don't we create a PetscSection in the function SetupSection, but we do >>>>> it in the function SetupMaterialSection and in the function SetupSection of >>>>> the petsc-current version. >>>>> >>>> >>>> 1) I wanted to try and make things more automatic for the user >>>> >>>> 2) I needed a way to automatically layout data for coarser/finer grids >>>> in unstructured MG >>>> >>>> Thus, now when you set for PetscFE into the DM using DMSetField(), it >>>> will automatically create >>>> the section on the first call to DMGetDefaultSection(). >>>> >>>> I do not have a similar provision now for materials, so you create your >>>> own section. I think this is >>>> alright until we have some idea of a nicer interface. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> petsc-dev: >>>>> >>>>> #undef __FUNCT__ >>>>> #define __FUNCT__ "SetupSection" >>>>> PetscErrorCode SetupSection(DM dm, AppCtx *user) >>>>> { >>>>> DM cdm = dm; >>>>> const PetscInt id = 1; >>>>> PetscErrorCode ierr; >>>>> >>>>> PetscFunctionBeginUser; >>>>> ierr = PetscObjectSetName((PetscObject) user->fe[0], >>>>> "potential");CHKERRQ(ierr); >>>>> while (cdm) { >>>>> ierr = DMSetNumFields(cdm, 1);CHKERRQ(ierr); >>>>> ierr = DMSetField(cdm, 0, (PetscObject) user->fe[0]);CHKERRQ(ierr); >>>>> ierr = DMPlexAddBoundary(cdm, user->bcType == DIRICHLET, >>>>> user->bcType == NEUMANN ? "boundary" : "marker", 0, user->exactFuncs[0], 1, >>>>> &id, user);CHKERRQ(ierr); >>>>> ierr = DMPlexGetCoarseDM(cdm, &cdm);CHKERRQ(ierr); >>>>> } >>>>> PetscFunctionReturn(0); >>>>> } >>>>> >>>>> >>>>> It seems that it adds the number of fields directly to the DM, and >>>>> takes the number of components that were specified in SetupElementCommon, >>>>> but what about the number of degrees of freedom? Why we added it for the >>>>> MaterialSection but not for the regular Section. >>>>> >>>>> Thanks in advance >>>>> Miguel >>>>> >>>>> >>>>> On Sat, Mar 15, 2014 at 4:16 PM, Miguel Angel Salazar de Troya < >>>>> salazardetroya at gmail.com> wrote: >>>>> >>>>>> Thanks a lot. >>>>>> >>>>>> >>>>>> On Sat, Mar 15, 2014 at 3:36 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Sat, Mar 15, 2014 at 3:31 PM, Miguel Angel Salazar de Troya < >>>>>>> salazardetroya at gmail.com> wrote: >>>>>>> >>>>>>>> Hello everybody >>>>>>>> >>>>>>>> I keep trying to understand this example. 
I don't have any problems >>>>>>>> with this example when I run it like this: >>>>>>>> >>>>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate >>>>>>>> -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full >>>>>>>> -show_solution >>>>>>>> Number of SNES iterations = 5 >>>>>>>> L_2 Error: 0.107289 >>>>>>>> Solution >>>>>>>> Vec Object: 1 MPI processes >>>>>>>> type: seq >>>>>>>> 0.484618 >>>>>>>> >>>>>>>> However, when I change the boundary conditions to Neumann, I get >>>>>>>> this error. >>>>>>>> >>>>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>>>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>>>>>> -show_solution >>>>>>>> >>>>>>> >>>>>>> Here you set the order of the element used in bulk, but not on the >>>>>>> boundary where you condition is, so it defaults to 0. In >>>>>>> order to become more familiar, take a look at the tests that I run >>>>>>> here: >>>>>>> >>>>>>> >>>>>>> https://bitbucket.org/petsc/petsc/src/64715f0f033346c10c77b73cf58216d111db8789/config/builder.py?at=master#cl-216 >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>> -------------------------------------------------------------- >>>>>>>> [0]PETSC ERROR: Petsc has generated inconsistent data >>>>>>>> [0]PETSC ERROR: Number of dual basis vectors 0 not equal to >>>>>>>> dimension 1 >>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>>>> shooting. >>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>> v3.4.3-4776-gb18359b GIT Date: 2014-03-04 10:53:30 -0600 >>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>> salaza11 Sat Mar 15 14:28:05 2014 >>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>> --download-chaco --with-c2html=0 >>>>>>>> [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in >>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>>> [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in >>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>>> [0]PETSC ERROR: #3 SetupElementCommon() line 474 in >>>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>>> [0]PETSC ERROR: #4 SetupBdElement() line 559 in >>>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>>> [0]PETSC ERROR: #5 main() line 755 in >>>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send >>>>>>>> entire error message to petsc-maint at mcs.anl.gov---------- >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>>>>>> [unset]: aborting job: >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0 >>>>>>>> >>>>>>>> I honestly do not know much about using dual spaces in a finite >>>>>>>> element context. I have been trying to find some material that could help >>>>>>>> me without much success. I tried to modify the dual space order with the >>>>>>>> option -petscdualspace_order but I kept getting errors. In particular, I >>>>>>>> got this when I set it to 1. 
>>>>>>>> >>>>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1 >>>>>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full >>>>>>>> -show_solution -petscdualspace_order 1 >>>>>>>> [0]PETSC ERROR: PetscTrFreeDefault() called from >>>>>>>> PetscFESetUp_Basic() line 2492 in >>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>>> [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted >>>>>>>> (probably write past end of array) >>>>>>>> [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483 >>>>>>>> in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>> -------------------------------------------------------------- >>>>>>>> [0]PETSC ERROR: Memory corruption: >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind >>>>>>>> [0]PETSC ERROR: Corrupted memory >>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble >>>>>>>> shooting. >>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>> v3.4.3-4776-gb18359b GIT Date: 2014-03-04 10:53:30 -0600 >>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by >>>>>>>> salaza11 Sat Mar 15 14:37:34 2014 >>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>> --download-chaco --with-c2html=0 >>>>>>>> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in >>>>>>>> /home/salaza11/petsc/src/sys/memory/mtr.c >>>>>>>> [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in >>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>>> [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in >>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c >>>>>>>> [0]PETSC ERROR: #4 SetupElementCommon() line 482 in >>>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>>> [0]PETSC ERROR: #5 SetupElement() line 506 in >>>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>>> [0]PETSC ERROR: #6 main() line 754 in >>>>>>>> /home/salaza11/workspace/PETSC/ex12.c >>>>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send >>>>>>>> entire error message to petsc-maint at mcs.anl.gov---------- >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>>>>>> [unset]: aborting job: >>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>>>>>>> [salaza11 at maya PETSC]$ >>>>>>>> >>>>>>>> >>>>>>>> Then again, I do not know much what I am doing given my ignorance >>>>>>>> with respect to the dual spaces in FE. I apologize for that. My questions >>>>>>>> are: >>>>>>>> >>>>>>>> - Where could I find more resources in order to understand the >>>>>>>> PETSc implementation of dual spaces for FE? >>>>>>>> - Why does it run with Dirichlet but not with Neumann? >>>>>>>> >>>>>>>> Thanks in advance. >>>>>>>> Miguel. >>>>>>>> >>>>>>>> >>>>>>>> On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley >>>>>>> > wrote: >>>>>>>> >>>>>>>>> On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley < >>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya < >>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> I can run it now, thanks. Although if I run it with valgrind >>>>>>>>>>> 3.5.0 (should I update to the last version?) I get some memory leaks >>>>>>>>>>> related with the function DMPlexCreateBoxMesh. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> I will check it out. 
>>>>>>>>>> >>>>>>>>> >>>>>>>>> This is now fixed. >>>>>>>>> >>>>>>>>> Thanks for finding it >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12 >>>>>>>>>>> -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>> ==9625== Memcheck, a memory error detector >>>>>>>>>>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian >>>>>>>>>>> Seward et al. >>>>>>>>>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for >>>>>>>>>>> copyright info >>>>>>>>>>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>> ==9625== >>>>>>>>>>> Local function: >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> 0.25 >>>>>>>>>>> 1 >>>>>>>>>>> 0.25 >>>>>>>>>>> 0.5 >>>>>>>>>>> 1.25 >>>>>>>>>>> 1 >>>>>>>>>>> 1.25 >>>>>>>>>>> 2 >>>>>>>>>>> Initial guess >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0.5 >>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>> Residual: >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> Initial Residual >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>> Jacobian: >>>>>>>>>>> Mat Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> row 0: (0, 4) >>>>>>>>>>> Residual: >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> -2 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> 0 >>>>>>>>>>> Au - b = Au + F(0) >>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>> type: seq >>>>>>>>>>> 0 >>>>>>>>>>> Linear L_2 Residual: 0 >>>>>>>>>>> ==9625== >>>>>>>>>>> ==9625== HEAP SUMMARY: >>>>>>>>>>> ==9625== in use at exit: 288 bytes in 3 blocks >>>>>>>>>>> ==9625== total heap usage: 2,484 allocs, 2,481 frees, >>>>>>>>>>> 1,009,287 bytes allocated >>>>>>>>>>> ==9625== >>>>>>>>>>> ==9625== 48 bytes in 1 blocks are definitely lost in loss record >>>>>>>>>>> 1 of 3 >>>>>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>>>>> ==9625== by 0x5D8D4E1: writepoly (triangle.c:12012) >>>>>>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>>>>> ==9625== >>>>>>>>>>> ==9625== 96 bytes in 1 blocks are definitely lost in loss record >>>>>>>>>>> 2 of 3 >>>>>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>>>>> ==9625== by 0x5D8D485: writepoly (triangle.c:12004) >>>>>>>>>>> ==9625== by 0x5D8FAAC: triangulate (triangle.c:13167) >>>>>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>>>>> ==9625== by 0x408D3D: main 
(ex12.c:651) >>>>>>>>>>> ==9625== >>>>>>>>>>> ==9625== 144 bytes in 1 blocks are definitely lost in loss >>>>>>>>>>> record 3 of 3 >>>>>>>>>>> ==9625== at 0x4A05E46: malloc (vg_replace_malloc.c:195) >>>>>>>>>>> ==9625== by 0x5D8CD20: writenodes (triangle.c:11718) >>>>>>>>>>> ==9625== by 0x5D8F9DE: triangulate (triangle.c:13132) >>>>>>>>>>> ==9625== by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749) >>>>>>>>>>> ==9625== by 0x56B5EE4: DMPlexGenerate (plex.c:4503) >>>>>>>>>>> ==9625== by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668) >>>>>>>>>>> ==9625== by 0x4051FA: CreateMesh (ex12.c:341) >>>>>>>>>>> ==9625== by 0x408D3D: main (ex12.c:651) >>>>>>>>>>> ==9625== >>>>>>>>>>> ==9625== LEAK SUMMARY: >>>>>>>>>>> ==9625== definitely lost: 288 bytes in 3 blocks >>>>>>>>>>> ==9625== indirectly lost: 0 bytes in 0 blocks >>>>>>>>>>> ==9625== possibly lost: 0 bytes in 0 blocks >>>>>>>>>>> ==9625== still reachable: 0 bytes in 0 blocks >>>>>>>>>>> ==9625== suppressed: 0 bytes in 0 blocks >>>>>>>>>>> ==9625== >>>>>>>>>>> ==9625== For counts of detected and suppressed errors, rerun >>>>>>>>>>> with: -v >>>>>>>>>>> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6 >>>>>>>>>>> from 6) >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya < >>>>>>>>>>>> salazardetroya at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> You are welcome, thanks for your help. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Okay, I have rebuilt completely clean, and ex12 runs for me. >>>>>>>>>>>> Can you try again after pulling? >>>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de Troya >>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks. This is what I get. >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Okay, this was broken by a new push to master/next in the >>>>>>>>>>>>>> last few days. I have pushed a fix, >>>>>>>>>>>>>> however next is currently broken due to a failure to check in >>>>>>>>>>>>>> a file. This should be fixed shortly, >>>>>>>>>>>>>> and then ex12 will work. I will mail you when its ready. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks for finding this, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> (gdb) cont >>>>>>>>>>>>>>> Continuing. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Program received signal SIGSEGV, Segmentation fault. 
>>>>>>>>>>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM >>>>>>>>>>>>>>> (dm=0x159a180, X=0x168b5b0, >>>>>>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, >>>>>>>>>>>>>>> str=0x7fffae6e7970, >>>>>>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>>>>>> 882 ierr = PetscFEGetDimension(fe[f], >>>>>>>>>>>>>>> &Nb);CHKERRQ(ierr); >>>>>>>>>>>>>>> (gdb) where >>>>>>>>>>>>>>> #0 0x00007fd6811bea7b in DMPlexComputeJacobianFEM >>>>>>>>>>>>>>> (dm=0x159a180, X=0x168b5b0, >>>>>>>>>>>>>>> Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88, >>>>>>>>>>>>>>> str=0x7fffae6e7970, >>>>>>>>>>>>>>> user=0x7fd6811be509) >>>>>>>>>>>>>>> at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882 >>>>>>>>>>>>>>> #1 0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal >>>>>>>>>>>>>>> (snes=0x14e9450, >>>>>>>>>>>>>>> X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88, >>>>>>>>>>>>>>> ctx=0x1652300) >>>>>>>>>>>>>>> at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102 >>>>>>>>>>>>>>> #2 0x00007fd6814cc609 in SNESComputeJacobian >>>>>>>>>>>>>>> (snes=0x14e9450, X=0x1622ad0, >>>>>>>>>>>>>>> A=0x7fffae6e8a88, B=0x7fffae6e8a88) >>>>>>>>>>>>>>> at /home/salaza11/petsc/src/snes/interface/snes.c:2245 >>>>>>>>>>>>>>> #3 0x000000000040af72 in main (argc=15, argv=0x7fffae6e8bc8) >>>>>>>>>>>>>>> at >>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley < >>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> This is what I get at gdb when I type 'where'. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> You have to type 'cont', and then when it fails you type >>>>>>>>>>>>>>>> 'where'. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> #0 0x000000310e0aa860 in __nanosleep_nocancel () from >>>>>>>>>>>>>>>>> /lib64/libc.so.6 >>>>>>>>>>>>>>>>> #1 0x000000310e0aa70f in sleep () from /lib64/libc.so.6 >>>>>>>>>>>>>>>>> #2 0x00007fd83a00a8be in PetscSleep (s=10) >>>>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/utils/psleep.c:52 >>>>>>>>>>>>>>>>> #3 0x00007fd83a06f331 in PetscAttachDebugger () >>>>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/error/adebug.c:397 >>>>>>>>>>>>>>>>> #4 0x00007fd83a0af1d2 in PetscOptionsCheckInitial_Private >>>>>>>>>>>>>>>>> () >>>>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/init.c:444 >>>>>>>>>>>>>>>>> #5 0x00007fd83a0b6448 in PetscInitialize >>>>>>>>>>>>>>>>> (argc=0x7fff5cd8df2c, >>>>>>>>>>>>>>>>> args=0x7fff5cd8df20, file=0x0, >>>>>>>>>>>>>>>>> help=0x60ce40 "Poisson Problem in 2d and 3d with >>>>>>>>>>>>>>>>> simplicial finite elements.\nWe solve the Poisson problem in a >>>>>>>>>>>>>>>>> rectangular\ndomain, using a parallel unstructured mesh (DMPLEX) to >>>>>>>>>>>>>>>>> discretize it.\n\n\n") >>>>>>>>>>>>>>>>> at /home/salaza11/petsc/src/sys/objects/pinit.c:876 >>>>>>>>>>>>>>>>> #6 0x0000000000408f2c in main (argc=15, >>>>>>>>>>>>>>>>> argv=0x7fff5cd8f1f8) >>>>>>>>>>>>>>>>> at >>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> The rest of the gdb output is attached. I am a bit >>>>>>>>>>>>>>>>> ignorant with gdb, I apologize for that. 
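[Editorial sketch, collected from the exchange above: the debugging recipe is to rerun the failing command with -start_in_debugger, continue in gdb until the crash, then print the stack. The command line is the one used earlier in this thread; the frames are abbreviated from the backtrace posted above, and the addresses, full argument lists, and exact line numbers will differ with the PETSc revision and build.]

    # reproduce the crash under the debugger (same options as the failing run, plus -start_in_debugger)
    ./ex12 -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 \
        -petscspace_order 1 -show_initial -dm_plex_print_fem 1 -start_in_debugger

    # in the gdb session that PETSc opens:
    (gdb) cont                 <-- run until the SIGSEGV is raised
    Program received signal SIGSEGV, Segmentation fault.
    (gdb) where                <-- print the stack trace to post to the list
    #0  DMPlexComputeJacobianFEM (...) at src/dm/impls/plex/plexfem.c:882
    #1  SNESComputeJacobian_DMLocal (...) at src/snes/utils/dmlocalsnes.c:102
    #2  SNESComputeJacobian (...) at src/snes/interface/snes.c:2245
    #3  main (...) at src/snes/examples/tutorials/ex12.c:784
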
>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley < >>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks for your response. Sorry I did not have the >>>>>>>>>>>>>>>>>>> "next" version, but the "master" version. I still have an error though. I >>>>>>>>>>>>>>>>>>> followed the steps given here ( >>>>>>>>>>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain >>>>>>>>>>>>>>>>>>> the next version, I configured petsc as above and ran ex12 as above as >>>>>>>>>>>>>>>>>>> well, getting this error: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test >>>>>>>>>>>>>>>>>>> -refinement_limit 0.0 -bc_type dirichlet -interpolate 0 >>>>>>>>>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>> 0.25 >>>>>>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>> 1.25 >>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>> 0.5 >>>>>>>>>>>>>>>>>>> L_2 Error: 0.111111 >>>>>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Okay, now run with -start_in_debugger, and give me a >>>>>>>>>>>>>>>>>> stack trace using 'where'. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: >>>>>>>>>>>>>>>>>>> Segmentation Violation, probably memory access out of range >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X to >>>>>>>>>>>>>>>>>>> find memory corruption errors >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. 
>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871 >>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line >>>>>>>>>>>>>>>>>>> 94 /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line >>>>>>>>>>>>>>>>>>> 2244 /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>> -------------------------------------------------------------- >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See http:// >>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for >>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>> v3.4.3-4705-gfb6b3bc GIT Date: 2014-03-03 08:23:43 -0600 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya >>>>>>>>>>>>>>>>>>> by salaza11 Mon Mar 3 11:49:15 2014 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in >>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley < >>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar de >>>>>>>>>>>>>>>>>>>> Troya wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Hi everybody >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I am trying to run example ex12.c without much >>>>>>>>>>>>>>>>>>>>> success. I specifically run it with the command options: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> We need to start narrowing down differences, because it >>>>>>>>>>>>>>>>>>>> runs for me and our nightly tests. So, first can >>>>>>>>>>>>>>>>>>>> you confirm that you are using the latest 'next' branch? 
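One way to confirm that, assuming PETSc was obtained as a git clone as described on the wiki page linked above, is a quick check along these lines:

  cd $PETSC_DIR
  git branch        # the checked-out branch is marked with a *
  git checkout next
  git pull
  git describe      # should correspond to the revision in the error banner, e.g. v3.4.3-4705-gfb6b3bc

After pulling new commits, PETSc needs to be reconfigured and rebuilt before rerunning ex12.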
>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> And I get this output >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>>>> 1 >>>>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>>>> 2 >>>>>>>>>>>>>>>>>>>>> 3 >>>>>>>>>>>>>>>>>>>>> Initial guess >>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>>>> L_2 Error: 0.625 >>>>>>>>>>>>>>>>>>>>> Residual: >>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>>> 0 >>>>>>>>>>>>>>>>>>>>> Initial Residual >>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seq >>>>>>>>>>>>>>>>>>>>> L_2 Residual: 0 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: >>>>>>>>>>>>>>>>>>>>> Segmentation Violation, probably memory access out of range >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac OS X >>>>>>>>>>>>>>>>>>>>> to find memory corruption errors >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in >>>>>>>>>>>>>>>>>>>>> stack below >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the >>>>>>>>>>>>>>>>>>>>> stack are not available, >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of the >>>>>>>>>>>>>>>>>>>>> start of the function >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867 >>>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line >>>>>>>>>>>>>>>>>>>>> 94 /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line >>>>>>>>>>>>>>>>>>>>> 2244 /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203 >>>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! 
>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>> v3.4.3-3453-g0a94005 GIT Date: 2014-03-02 13:12:04 -0600 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about >>>>>>>>>>>>>>>>>>>>> trouble shooting. >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named >>>>>>>>>>>>>>>>>>>>> maya by salaza11 Sun Mar 2 17:00:09 2014 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar 2 16:46:51 >>>>>>>>>>>>>>>>>>>>> 2014 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich >>>>>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in >>>>>>>>>>>>>>>>>>>>> unknown file >>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>> [unset]: aborting job: >>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - >>>>>>>>>>>>>>>>>>>>> process 0 >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Probably my problems could be on my configuration. I >>>>>>>>>>>>>>>>>>>>> attach the configure.log. I ran ./configure like this >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> ./configure --download-mpich >>>>>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen >>>>>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0 >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks a lot in advance. >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra < >>>>>>>>>>>>>>>>>>>>>> yelkhamra at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> If >>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> is for serial, any chance we can get the options to >>>>>>>>>>>>>>>>>>>>>>> run in parallel? 
>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Just use mpiexec -n >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Regards >>>>>>>>>>>>>>>>>>>>>>> Yaakoub El Khamra >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley < >>>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM >>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:00 AM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> These examples all seem to run excepting the >>>>>>>>>>>>>>>>>>>>>>>>>> following command, >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> I get the following ouput: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>>>>> Local function: >>>>>>>>>>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error: >>>>>>>>>>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined >>>>>>>>>>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> This is a build problem, but it should affect >>>>>>>>>>>>>>>>>>>>>>>>> all the runs. Is this reproducible? Can you send configure.log? MKL is the >>>>>>>>>>>>>>>>>>>>>>>>> worst. If this >>>>>>>>>>>>>>>>>>>>>>>>> persists, I would just switch to >>>>>>>>>>>>>>>>>>>>>>>>> --download-f-blas-lapack. >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Thanks. 
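To make the 'just use mpiexec -n' answer above concrete, a minimal sketch of a parallel run of the same case, where the process count of 4 is purely illustrative, would be

  mpiexec -n 4 ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 \
      -variable_coefficient field -interpolate 1 -petscspace_order 2 \
      -show_initial -dm_plex_print_fem

With a --download-mpich build, the matching mpiexec is the one installed under $PETSC_DIR/$PETSC_ARCH/bin.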
I have some advice on options >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> --with-precision=single # I would not use this >>>>>>>>>>>>>>>>>>>>>>>> unless you are doing something special, like CUDA >>>>>>>>>>>>>>>>>>>>>>>> --with-clanguage=C++ # I would recommend >>>>>>>>>>>>>>>>>>>>>>>> switching to C, the build is much faster >>>>>>>>>>>>>>>>>>>>>>>> --with-mpi-dir=/usr --with-mpi4py=0 >>>>>>>>>>>>>>>>>>>>>>>> --with-shared-libraries --CFLAGS=-O0 >>>>>>>>>>>>>>>>>>>>>>>> --CXXFLAGS=-O0 --with-fc=0 >>>>>>>>>>>>>>>>>>>>>>>> --with-etags=1 # This is >>>>>>>>>>>>>>>>>>>>>>>> unnecessary >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>> # Here is the problem, see below >>>>>>>>>>>>>>>>>>>>>>>> --download-metis >>>>>>>>>>>>>>>>>>>>>>>> --download-fiat=yes --download-generator >>>>>>>>>>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP >>>>>>>>>>>>>>>>>>>>>>>> symbols. I would recommend switching to --download-f2cblaslapack, >>>>>>>>>>>>>>>>>>>>>>>> or you can try and find that library. >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM >>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 5:43 PM, Jones,Martin >>>>>>>>>>>>>>>>>>>>>>>>>> Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Hi, This is the next error message after >>>>>>>>>>>>>>>>>>>>>>>>>>> configuring and building with the triangle package when trying to run ex12 >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> This is my fault for bad defaults. I will fix. >>>>>>>>>>>>>>>>>>>>>>>>>> Try running >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -refinement_limit 0.0 >>>>>>>>>>>>>>>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial >>>>>>>>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1 >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> for a representative run. 
Then you could try 3D >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit >>>>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field -interpolate 1 -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> or a full run >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type >>>>>>>>>>>>>>>>>>>>>>>>>> dirichlet -interpolate -petscspace_order 1 >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> ex12 -refinement_limit 0.01 -bc_type >>>>>>>>>>>>>>>>>>>>>>>>>> dirichlet -interpolate -petscspace_order 2 >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Let me know if those work. >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> ./ex12 >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE: >>>>>>>>>>>>>>>>>>>>>>>>>>> Floating Point Exception,probably divide by zero >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see >>>>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSCERROR: or try >>>>>>>>>>>>>>>>>>>>>>>>>>> http://valgrind.org on GNU/linux and Apple Mac >>>>>>>>>>>>>>>>>>>>>>>>>>> OS X to find memory corruption errors >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given >>>>>>>>>>>>>>>>>>>>>>>>>>> in stack below >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Stack >>>>>>>>>>>>>>>>>>>>>>>>>>> Frames ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in >>>>>>>>>>>>>>>>>>>>>>>>>>> the stack are not available, >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: INSTEAD the line number of >>>>>>>>>>>>>>>>>>>>>>>>>>> the start of the function >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: is given. >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM >>>>>>>>>>>>>>>>>>>>>>>>>>> line 531 /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal >>>>>>>>>>>>>>>>>>>>>>>>>>> line 63 /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line 2088 >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line >>>>>>>>>>>>>>>>>>>>>>>>>>> 2076 /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765 >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error >>>>>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received! 
>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints >>>>>>>>>>>>>>>>>>>>>>>>>>> about trouble shooting. >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual >>>>>>>>>>>>>>>>>>>>>>>>>>> pages. >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a >>>>>>>>>>>>>>>>>>>>>>>>>>> arch-linux2-cxx-debug named maeda by mjonesa Thu Jan 16 17:41:23 2014 >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 >>>>>>>>>>>>>>>>>>>>>>>>>>> 17:38:33 2014 >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local >>>>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>>>> --download-triangle >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 >>>>>>>>>>>>>>>>>>>>>>>>>>> in unknown file >>>>>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) >>>>>>>>>>>>>>>>>>>>>>>>>>> - process 0 >>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM >>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:33 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Hi, I have downloaded and built the dev >>>>>>>>>>>>>>>>>>>>>>>>>>>> version you suggested. I think I need the triangle package to run this >>>>>>>>>>>>>>>>>>>>>>>>>>>> particular case. Is there any thing else that appears wrong in what I have >>>>>>>>>>>>>>>>>>>>>>>>>>>> done from the error messages below: >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Great! Its running. 
You can reconfigure like >>>>>>>>>>>>>>>>>>>>>>>>>>> this: >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> and then rebuild >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> make >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> and then rerun. You can load meshes, but its >>>>>>>>>>>>>>>>>>>>>>>>>>> much easier to have triangle create them. >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks for being patient, >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error >>>>>>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation >>>>>>>>>>>>>>>>>>>>>>>>>>>> for this object type! >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external >>>>>>>>>>>>>>>>>>>>>>>>>>>> package support. >>>>>>>>>>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.! >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7 GIT Date: 2014-01-15 20:33:42 -0600 >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for >>>>>>>>>>>>>>>>>>>>>>>>>>>> recent updates. >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints >>>>>>>>>>>>>>>>>>>>>>>>>>>> about trouble shooting. >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual >>>>>>>>>>>>>>>>>>>>>>>>>>>> pages. 
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a >>>>>>>>>>>>>>>>>>>>>>>>>>>> arch-linux2-cxx-debug named maeda by mjonesa Thu Jan 16 16:28:20 2014 >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16 >>>>>>>>>>>>>>>>>>>>>>>>>>>> 16:25:53 2014 >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0 >>>>>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]" >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in >>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600 >>>>>>>>>>>>>>>>>>>>>>>>>>>> in /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in >>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in >>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, >>>>>>>>>>>>>>>>>>>>>>>>>>>> 56) - process 0 >>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 4:05 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>> > wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hi. I changed the ENV variable to the >>>>>>>>>>>>>>>>>>>>>>>>>>>>> correct entry. when I type make ex12 I get this: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12 >>>>>>>>>>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings >>>>>>>>>>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g -fPIC >>>>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include >>>>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include >>>>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni >>>>>>>>>>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such >>>>>>>>>>>>>>>>>>>>>>>>>>>>> file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>>> compilation terminated. 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1 >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated. >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Yes, this relates to my 3). This is not going >>>>>>>>>>>>>>>>>>>>>>>>>>>> to work for you with the release. Please see the link I sent. >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running >>>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:55 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander < >>>>>>>>>>>>>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks! >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> You built with >>>>>>>>>>>>>>>>>>>>>>>>>>>>> PETSC_ARCH=arch-linux2-cxx-debug >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------ >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> running ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 3:11 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander < >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Now I went to the directory where ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> sits and just did a 'make ex12.c' with the following error if this helps? : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$ >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'. >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Stop. 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1) You would type 'make ex12' >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 2) Either you PETSC_DIR ( >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> PETSC_ARCH (linux-gnu-cxx-debug) environment >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> variables >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> do not match what you built. Please send >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> configure.log and make.log >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3) Since it was only recently added, if you >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> want to use the FEM functionality, you must use the development version: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com] >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> running ex12.c >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM, >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander < >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello To Whom it Concerns, >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h' >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but getting the following error: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order dim 1 laplacian >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 boundary src/snes/examples/tutorials/ex12.h >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last): >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> File >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> from FIAT.reference_element import >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> default_simplex >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ImportError: No module named >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FIAT.reference_element >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of generating >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the header file (its now all handled in C). 
I thought I changed the documentation everywhere (including the latest tutorial slides). Can you try running with 'master' (or 'next'), and point me toward the old docs?

Thanks,

     Matt
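Returning to point 2) above, the missing petscvariables/petscrules files come from a PETSC_DIR/PETSC_ARCH pair that does not name the configuration that was actually built. A minimal sketch of the fix, assuming the built arch really is arch-linux2-cxx-debug as reported earlier in the thread:

  export PETSC_DIR=/home/mjonesa/PETSc/petsc-3.4.3
  export PETSC_ARCH=arch-linux2-cxx-debug
  cd $PETSC_DIR/src/snes/examples/tutorials
  make ex12

Both variables have to point at the same configuration before 'make ex12' can find the generated conf files; and per point 3), the FEM path of ex12 still requires the development version rather than the 3.4.3 release.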
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener